Directory of Open Access Journals (Sweden)
Sadreyev Ruslan I
2004-08-01
Full Text Available Abstract Background Profile-based analysis of multiple sequence alignments (MSAs) allows for accurate comparison of protein families. Here, we address the problems of detecting statistically confident dissimilarities between (1) an MSA position and a set of predicted residue frequencies, and (2) two MSA positions. These problems are important for (i) evaluation and optimization of methods predicting residue occurrence at protein positions; (ii) detection of potentially misaligned regions in automatically produced alignments and their further refinement; and (iii) detection of sites that determine functional or structural specificity in two related families. Results For problems (1) and (2), we propose analytical estimates of the P-value and apply them to the detection of significant positional dissimilarities in various experimental situations. (a) We compare structure-based predictions of residue propensities at a protein position to the actual residue frequencies in the MSA of homologs. (b) We evaluate our method by its ability to detect erroneous position matches produced by an automatic sequence aligner. (c) We compare MSA positions that correspond to residues aligned by automatic structure aligners. (d) We compare MSA positions that are aligned by high-quality manual superposition of structures. Detected dissimilarities reveal shortcomings of the automatic methods for residue frequency prediction and alignment construction. For the high-quality structural alignments, the dissimilarities suggest sites of potential functional or structural importance. Conclusion The proposed computational method is of significant potential value for the analysis of protein families.
Statistically significant relational data mining :
Energy Technology Data Exchange (ETDEWEB)
Berry, Jonathan W.; Leung, Vitus Joseph; Phillips, Cynthia Ann; Pinar, Ali; Robinson, David Gerald; Berger-Wolf, Tanya; Bhowmick, Sanjukta; Casleton, Emily; Kaiser, Mark; Nordman, Daniel J.; Wilson, Alyson G.
2014-02-01
This report summarizes the work performed under the project "Statistically significant relational data mining." The goal of the project was to add more statistical rigor to the fairly ad hoc area of data mining on graphs. Our goal was to develop better algorithms and better ways to evaluate algorithm quality. We concentrated on algorithms for community detection, approximate pattern matching, and graph similarity measures. Approximate pattern matching involves finding an instance of a relatively small pattern, expressed with tolerance, in a large graph of data observed with uncertainty. This report gathers the abstracts and references for the eight refereed publications that have appeared as part of this work. We then archive three pieces of research that have not yet been published. The first is theoretical and experimental evidence that a popular statistical measure for comparison of community assignments favors over-resolved communities over approximations to a ground truth. The second is a set of statistically motivated methods for measuring the quality of an approximate match of a small pattern in a large graph. The third is a new probabilistic random graph model. Statisticians favor these models for graph analysis. The new local structure graph model overcomes some of the issues with popular models such as exponential random graph models and latent variable models.
Statistical Significance for Hierarchical Clustering
Kimes, Patrick K.; Liu, Yufeng; Hayes, D. Neil; Marron, J. S.
2017-01-01
Summary Cluster analysis has proved to be an invaluable tool for the exploratory and unsupervised analysis of high dimensional datasets. Among methods for clustering, hierarchical approaches have enjoyed substantial popularity in genomics and other fields for their ability to simultaneously uncover multiple layers of clustering structure. A critical and challenging question in cluster analysis is whether the identified clusters represent important underlying structure or are artifacts of natural sampling variation. Few approaches have been proposed for addressing this problem in the context of hierarchical clustering, for which the problem is further complicated by the natural tree structure of the partition, and the multiplicity of tests required to parse the layers of nested clusters. In this paper, we propose a Monte Carlo based approach for testing statistical significance in hierarchical clustering which addresses these issues. The approach is implemented as a sequential testing procedure guaranteeing control of the family-wise error rate. Theoretical justification is provided for our approach, and its power to detect true clustering structure is illustrated through several simulation studies and applications to two cancer gene expression datasets. PMID:28099990
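The Monte Carlo logic the abstract describes can be sketched for a single two-way split of 1-D data (a toy stand-in for one node of the hierarchy; the cluster-index statistic and the Gaussian null below are illustrative assumptions, not the authors' exact procedure):

```python
import random
import statistics

def cluster_index(values):
    """Within-cluster sum of squares of the best two-way split of sorted
    1-D data, divided by total sum of squares (smaller = stronger clustering)."""
    xs = sorted(values)
    total_mean = statistics.fmean(xs)
    total_ss = sum((x - total_mean) ** 2 for x in xs)
    best = float("inf")
    for k in range(1, len(xs)):  # split into xs[:k] and xs[k:]
        left, right = xs[:k], xs[k:]
        ml, mr = statistics.fmean(left), statistics.fmean(right)
        within = sum((x - ml) ** 2 for x in left) + sum((x - mr) ** 2 for x in right)
        best = min(best, within)
    return best / total_ss

def monte_carlo_p(values, n_sim=200, rng=random):
    """P-value: fraction of Gaussian null datasets whose cluster index is
    as small as (or smaller than) the observed one."""
    obs = cluster_index(values)
    hits = sum(
        cluster_index([rng.gauss(0, 1) for _ in values]) <= obs
        for _ in range(n_sim)
    )
    return (hits + 1) / (n_sim + 1)  # add-one correction keeps p > 0
```

Two well-separated groups yield a small p-value; data drawn from a single Gaussian do not.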
Statistical significance versus clinical relevance.
van Rijn, Marieke H C; Bech, Anneke; Bouyer, Jean; van den Brand, Jan A J G
2017-04-01
In March this year, the American Statistical Association (ASA) posted a statement on the correct use of P-values, in response to a growing concern that the P-value is commonly misused and misinterpreted. We aim to translate these warnings given by the ASA into a language more easily understood by clinicians and researchers without a deep background in statistics. Moreover, we intend to illustrate the limitations of P-values, even when used and interpreted correctly, and bring more attention to the clinical relevance of study findings using two recently reported studies as examples. We argue that P-values are often misinterpreted. A common mistake is saying that P < 0.05 means that the null hypothesis is false, and P ≥ 0.05 means that the null hypothesis is true. The correct interpretation of a P-value of 0.05 is that if the null hypothesis were indeed true, a similar or more extreme result would occur 5% of the time upon repeating the study in a similar sample. In other words, the P-value informs about the likelihood of the data given the null hypothesis and not the other way around. A possible alternative related to the P-value is the confidence interval (CI). It provides more information on the magnitude of an effect and the imprecision with which that effect was estimated. However, there is no magic bullet to replace P-values and stop erroneous interpretation of scientific results. Scientists and readers alike should make themselves familiar with the correct, nuanced interpretation of statistical tests, P-values and CIs. © The Author 2017. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
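The interpretation above — under a true null hypothesis, a result at least this extreme occurs about 5% of the time at the 0.05 threshold — can be checked by simulation (a generic z-test sketch, not tied to the studies the article discusses):

```python
import math
import random

def two_sided_p(sample):
    """Two-sided z-test p-value for mean = 0, known sd = 1."""
    n = len(sample)
    z = (sum(sample) / n) * math.sqrt(n)
    return math.erfc(abs(z) / math.sqrt(2))  # P(|Z| >= z) for standard normal

random.seed(42)
n_rep, n_obs = 4000, 30
# Simulate many studies in which the null hypothesis is exactly true.
false_positives = sum(
    two_sided_p([random.gauss(0, 1) for _ in range(n_obs)]) < 0.05
    for _ in range(n_rep)
)
rate = false_positives / n_rep  # hovers around the nominal 0.05
```

The observed rate fluctuates around 5%, illustrating that P < 0.05 describes the data given the null, not the probability that the null is false.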
Statistical significance of cis-regulatory modules
Directory of Open Access Journals (Sweden)
Smith Andrew D
2007-01-01
Full Text Available Abstract Background It is becoming increasingly important for researchers to be able to scan through large genomic regions for transcription factor binding sites or clusters of binding sites forming cis-regulatory modules. Correspondingly, there has been a push to develop algorithms for the rapid detection and assessment of cis-regulatory modules. While various algorithms for this purpose have been introduced, most are not well suited for rapid, genome scale scanning. Results We introduce methods designed for the detection and statistical evaluation of cis-regulatory modules, modeled as either clusters of individual binding sites or as combinations of sites with constrained organization. In order to determine the statistical significance of module sites, we first need a method to determine the statistical significance of single transcription factor binding site matches. We introduce a straightforward method of estimating the statistical significance of single site matches using a database of known promoters to produce data structures that can be used to estimate p-values for binding site matches. We next introduce a technique to calculate the statistical significance of the arrangement of binding sites within a module using a max-gap model. If the module scanned for has defined organizational parameters, the probability of the module is corrected to account for organizational constraints. The statistical significance of single site matches and the architecture of sites within the module can be combined to provide an overall estimation of statistical significance of cis-regulatory module sites. Conclusion The methods introduced in this paper allow for the detection and statistical evaluation of single transcription factor binding sites and cis-regulatory modules. The features described are implemented in the Search Tool for Occurrences of Regulatory Motifs (STORM and MODSTORM software.
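The single-site significance idea — score matches against an empirical distribution built from background sequence — can be sketched as follows (the toy log-odds scores and random background below stand in for a real position weight matrix and promoter database):

```python
import random

# Hypothetical position-specific log-odds scores for a 4-bp motif:
# one preferred base per position, all other bases penalized.
MOTIF = [{"A": 2.0}, {"C": 2.0}, {"G": 2.0}, {"T": 2.0}]
DEFAULT = -1.0  # log-odds for any non-preferred base

def score(window):
    return sum(col.get(base, DEFAULT) for col, base in zip(MOTIF, window))

def empirical_p(site_score, background):
    """Fraction of background windows scoring at least as high as the match:
    an empirical p-value for a single binding-site hit."""
    w = len(MOTIF)
    scores = [score(background[i:i + w]) for i in range(len(background) - w + 1)]
    hits = sum(s >= site_score for s in scores)
    return (hits + 1) / (len(scores) + 1)

random.seed(7)
background = "".join(random.choice("ACGT") for _ in range(5000))
p_perfect = empirical_p(score("ACGT"), background)  # strong match -> small p
```

In the paper's scheme the background comes from known promoters rather than uniform random sequence, and module-level significance additionally folds in the max-gap arrangement probability.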
The thresholds for statistical and clinical significance
DEFF Research Database (Denmark)
Jakobsen, Janus Christian; Gluud, Christian; Winkel, Per
2014-01-01
BACKGROUND: Thresholds for statistical significance are insufficiently demonstrated by 95% confidence intervals or P-values when assessing results from randomised clinical trials. First, a P-value only shows the probability of getting a result assuming that the null hypothesis is true and does not reflect the probability of getting a result assuming an alternative hypothesis to the null hypothesis is true. Second, a confidence interval or a P-value showing significance may be caused by multiplicity. Third, statistical significance does not necessarily result in clinical significance. Therefore… of the probability that a given trial result is compatible with a 'null' effect (corresponding to the P-value) divided by the probability that the trial result is compatible with the intervention effect hypothesised in the sample size calculation; (3) adjust the confidence intervals and the statistical significance…
The insignificance of statistical significance testing
Johnson, Douglas H.
1999-01-01
Despite their use in scientific journals such as The Journal of Wildlife Management, statistical hypothesis tests add very little value to the products of research. Indeed, they frequently confuse the interpretation of data. This paper describes how statistical hypothesis tests are often viewed, and then contrasts that interpretation with the correct one. I discuss the arbitrariness of P-values, conclusions that the null hypothesis is true, power analysis, and distinctions between statistical and biological significance. Statistical hypothesis testing, in which the null hypothesis about the properties of a population is almost always known a priori to be false, is contrasted with scientific hypothesis testing, which examines a credible null hypothesis about phenomena in nature. More meaningful alternatives are briefly outlined, including estimation and confidence intervals for determining the importance of factors, decision theory for guiding actions in the face of uncertainty, and Bayesian approaches to hypothesis testing and other statistical practices.
Swiss solar power statistics 2007 - Significant expansion
International Nuclear Information System (INIS)
Hostettler, T.
2008-01-01
This article presents and discusses the 2007 statistics for solar power in Switzerland. A significant number of new installations is noted, as are the high production figures from newer installations. The basics behind the compilation of the Swiss solar power statistics are briefly reviewed and an overview for the period 1989 to 2007 is presented, including figures on the number of photovoltaic plants in service and installed peak power. Typical production figures in kilowatt-hours (kWh) per installed kilowatt-peak (kWp) are presented and discussed for installations of various sizes. Increased production after inverter replacement in older installations is noted. Finally, the general political situation in Switzerland as far as solar power is concerned is briefly discussed, as are international developments.
Significant Statistics: Viewed with a Contextual Lens
Tait-McCutcheon, Sandi
2010-01-01
This paper examines the pedagogical and organisational changes three lead teachers made to their statistics teaching and learning programs. The lead teachers posed the research question: What would the effect of contextually integrating statistical investigations and literacies into other curriculum areas be on student achievement? By finding the…
Test for the statistical significance of differences between ROC curves
International Nuclear Information System (INIS)
Metz, C.E.; Kronman, H.B.
1979-01-01
A test for the statistical significance of observed differences between two measured Receiver Operating Characteristic (ROC) curves has been designed and evaluated. The set of observer response data for each ROC curve is assumed to be independent and to arise from a ROC curve having a form which, in the absence of statistical fluctuations in the response data, graphs as a straight line on double normal-deviate axes. To test the significance of an apparent difference between two measured ROC curves, maximum likelihood estimates of the two parameters of each curve and the associated parameter variances and covariance are calculated from the corresponding set of observer response data. An approximate Chi-square statistic with two degrees of freedom is then constructed from the differences between the parameters estimated for each ROC curve and from the variances and covariances of these estimates. This statistic is known to be truly Chi-square distributed only in the limit of large numbers of trials in the observer performance experiments. Performance of the statistic for data arising from a limited number of experimental trials was evaluated. Independent sets of rating scale data arising from the same underlying ROC curve were paired, and the fraction of differences found (falsely) significant was compared to the significance level, α, used with the test. Although test performance was found to be somewhat dependent on both the number of trials in the data and the position of the underlying ROC curve in the ROC space, the results for various significance levels showed the test to be reliable under practical experimental conditions.
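Under the binormal model, the test reduces to a quadratic form in the two parameter differences. A minimal sketch (the parameter estimates and covariances below are made-up numbers; exp(-x/2) is the exact chi-square survival function for 2 degrees of freedom):

```python
import math

def chi2_two_param_test(d, cov):
    """Chi-square statistic d^T cov^{-1} d for a 2-vector of parameter
    differences, and its p-value for 2 degrees of freedom."""
    (a, b), (c, e) = cov  # 2x2 covariance matrix of the difference
    det = a * e - b * c
    inv = ((e / det, -b / det), (-c / det, a / det))
    x = sum(d[i] * inv[i][j] * d[j] for i in range(2) for j in range(2))
    p = math.exp(-x / 2)  # survival function of chi-square, df = 2
    return x, p

# Hypothetical differences between the binormal (a, b) estimates of two
# ROC curves, with the summed variances/covariances of the two fits.
diff = (0.2, 0.1)
cov_sum = ((0.01, 0.0), (0.0, 0.01))
chi2, p = chi2_two_param_test(diff, cov_sum)  # chi2 = 5.0, p = exp(-2.5)
```

Identical parameter estimates give chi2 = 0 and p = 1, as expected.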
A tutorial on hunting statistical significance by chasing N
Directory of Open Access Journals (Sweden)
Denes Szucs
2016-09-01
Full Text Available There is increasing concern about the replicability of studies in psychology and cognitive neuroscience. Hidden data dredging (also called p-hacking) is a major contributor to this crisis because it substantially increases Type I error, resulting in a much larger proportion of false positive findings than the usually expected 5%. In order to build better intuition to avoid, detect and criticise some typical problems, here I systematically illustrate the large impact of some easy-to-implement and therefore perhaps frequent data-dredging techniques on boosting false positive findings. I illustrate several forms of two special cases of data dredging. First, researchers may violate the data collection stopping rules of null hypothesis significance testing by repeatedly checking for statistical significance with various numbers of participants. Second, researchers may group participants post hoc along potential but unplanned independent grouping variables. The first approach 'hacks' the number of participants in studies; the second approach 'hacks' the number of variables in the analysis. I demonstrate the high number of false positive findings generated by these techniques with data from true null distributions. I also illustrate that it is extremely easy to introduce strong bias into data by very mild selection and re-testing. Similar, usually undocumented data dredging steps can easily lead to having 20-50% or more false positives.
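The first form of dredging — repeatedly testing while collecting data — is easy to reproduce. A sketch under simple assumptions (true null, z-test with known variance, peeking every 10 participants up to a maximum of 100):

```python
import math
import random

def p_value(sample):
    """Two-sided z-test p-value for mean = 0, known sd = 1."""
    z = (sum(sample) / len(sample)) * math.sqrt(len(sample))
    return math.erfc(abs(z) / math.sqrt(2))

def run_study(peek):
    data = []
    for _ in range(10):               # recruit up to 100 participants
        data.extend(random.gauss(0, 1) for _ in range(10))
        if peek and p_value(data) < 0.05:
            return True               # stop early, declare "significance"
    return p_value(data) < 0.05       # fixed-N test at the planned end

random.seed(0)
n_sim = 1000
rate_peek = sum(run_study(peek=True) for _ in range(n_sim)) / n_sim
rate_fixed = sum(run_study(peek=False) for _ in range(n_sim)) / n_sim
# rate_fixed stays near the nominal 5%; rate_peek is inflated severalfold
```

Optional stopping inflates the Type I error well above the nominal level even though every individual test is computed correctly.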
Testing the Difference of Correlated Agreement Coefficients for Statistical Significance
Gwet, Kilem L.
2016-01-01
This article addresses the problem of testing the difference between two correlated agreement coefficients for statistical significance. A number of authors have proposed methods for testing the difference between two correlated kappa coefficients, which require either the use of resampling methods or the use of advanced statistical modeling…
Significance levels for studies with correlated test statistics.
Shi, Jianxin; Levinson, Douglas F; Whittemore, Alice S
2008-07-01
When testing large numbers of null hypotheses, one needs to assess the evidence against the global null hypothesis that none of the hypotheses is false. Such evidence typically is based on the test statistic of the largest magnitude, whose statistical significance is evaluated by permuting the sample units to simulate its null distribution. Efron (2007) has noted that correlation among the test statistics can induce substantial interstudy variation in the shapes of their histograms, which may cause misleading tail counts. Here, we show that permutation-based estimates of the overall significance level also can be misleading when the test statistics are correlated. We propose that such estimates be conditioned on a simple measure of the spread of the observed histogram, and we provide a method for obtaining conditional significance levels. We justify this conditioning using the conditionality principle described by Cox and Hinkley (1974). Application of the method to gene expression data illustrates the circumstances when conditional significance levels are needed.
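The permutation scheme described — take the statistic of largest magnitude and permute sample units to simulate its null distribution — can be sketched as follows (a toy mean-difference statistic stands in for whatever test statistic a real study uses):

```python
import random

def max_abs_diff(data, labels):
    """Largest absolute mean difference between groups, across all features."""
    best = 0.0
    for feature in data:  # data: one list of per-sample values per feature
        g0 = [x for x, l in zip(feature, labels) if l == 0]
        g1 = [x for x, l in zip(feature, labels) if l == 1]
        best = max(best, abs(sum(g0) / len(g0) - sum(g1) / len(g1)))
    return best

def global_p(data, labels, n_perm=500, rng=random):
    """Global p-value: how often a label permutation produces a max
    statistic at least as large as the observed one."""
    obs = max_abs_diff(data, labels)
    hits = 0
    for _ in range(n_perm):
        perm = labels[:]
        rng.shuffle(perm)  # permute the sample units
        hits += max_abs_diff(data, perm) >= obs
    return (hits + 1) / (n_perm + 1)

random.seed(3)
labels = [0] * 6 + [1] * 6
# 20 null features plus one feature with a strong group effect.
data = [[random.gauss(0, 1) for _ in range(12)] for _ in range(20)]
data.append([random.gauss(5 * l, 1) for l in labels])
p = global_p(data, labels)  # small: the global null is clearly false
```

The paper's point is that when the per-feature statistics are correlated, this unconditional permutation p-value can mislead, motivating conditioning on the spread of the observed histogram.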
Caveats for using statistical significance tests in research assessments
DEFF Research Database (Denmark)
Schneider, Jesper Wiborg
2013-01-01
This article raises concerns about the advantages of using statistical significance tests in research assessments as has recently been suggested in the debate about proper normalization procedures for citation indicators by Opthof and Leydesdorff (2010). Statistical significance tests are highly controversial and numerous criticisms have been leveled against their use. Based on examples from articles by proponents of the use of statistical significance tests in research assessments, we address some of the numerous problems with such tests. The issues specifically discussed are the ritual practice… We argue that applying statistical significance tests and mechanically adhering to their results are highly problematic and detrimental to critical thinking. We claim that the use of such tests does not provide any advantages in relation to deciding whether differences between citation indicators…
On detection and assessment of statistical significance of Genomic Islands
Directory of Open Access Journals (Sweden)
Chaudhuri Probal
2008-04-01
Full Text Available Abstract Background Many of the available methods for detecting Genomic Islands (GIs) in prokaryotic genomes use markers such as transposons, proximal tRNAs, flanking repeats etc., or they use other supervised techniques requiring training datasets. Most of these methods are primarily based on the biases in GC content or codon and amino acid usage of the islands. However, these methods either do not use any formal statistical test of significance or use statistical tests for which the critical values and the P-values are not adequately justified. We propose a method, which is unsupervised in nature and uses Monte-Carlo statistical tests based on randomly selected segments of a chromosome. Such tests are supported by precise statistical distribution theory, and consequently, the resulting P-values are quite reliable for making the decision. Results Our algorithm (named Design-Island, an acronym for Detection of Statistically Significant Genomic Island) runs in two phases. Some 'putative GIs' are identified in the first phase, and those are refined into smaller segments containing horizontally acquired genes in the refinement phase. This method is applied to the Salmonella typhi CT18 genome, leading to the discovery of several new pathogenicity, antibiotic resistance and metabolic islands that were missed by earlier methods. Many of these islands contain mobile genetic elements like phage-mediated genes, transposons, integrase and IS elements, confirming their horizontal acquirement. Conclusion The proposed method is based on statistical tests supported by precise distribution theory and reliable P-values along with a technique for visualizing statistically significant islands. The performance of our method is better than many other well known methods in terms of their sensitivity and accuracy, and in terms of specificity, it is comparable to other methods.
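The core Monte-Carlo idea — compare a composition statistic of a candidate segment with the same statistic on randomly selected segments of the chromosome — can be sketched for GC content (a toy genome with one planted GC-rich island; Design-Island's actual statistics are richer than this):

```python
import random

def gc_fraction(seq):
    return sum(base in "GC" for base in seq) / len(seq)

def segment_p(genome, start, length, n_samples=200, rng=random):
    """Monte Carlo p-value: how often a randomly selected same-length
    segment is at least as GC-rich as the candidate segment."""
    obs = gc_fraction(genome[start:start + length])
    hits = 0
    for _ in range(n_samples):
        i = rng.randrange(len(genome) - length)
        hits += gc_fraction(genome[i:i + length]) >= obs
    return (hits + 1) / (n_samples + 1)

random.seed(11)
backbone = "".join(random.choice("ACGT") for _ in range(20000))
island = "".join(random.choices("GCAT", weights=[4, 4, 1, 1], k=1000))
genome = backbone[:10000] + island + backbone[10000:]
p = segment_p(genome, 10000, 1000)  # the planted island looks significant
```

Because the null segments are drawn from the chromosome itself, the resulting p-value is calibrated to that genome's own compositional background, which is the property the abstract emphasizes.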
Your Chi-Square Test Is Statistically Significant: Now What?
Sharpe, Donald
2015-01-01
Applied researchers have employed chi-square tests for more than one hundred years. This paper addresses the question of how one should follow a statistically significant chi-square test result in order to determine the source of that result. Four approaches were evaluated: calculating residuals, comparing cells, ransacking, and partitioning. Data…
Statistical Significance and Effect Size: Two Sides of a Coin.
Fan, Xitao
This paper suggests that statistical significance testing and effect size are two sides of the same coin; they complement each other, but do not substitute for one another. Good research practice requires that both should be taken into consideration to make sound quantitative decisions. A Monte Carlo simulation experiment was conducted, and a…
Reporting effect sizes as a supplement to statistical significance ...
African Journals Online (AJOL)
The purpose of the article is to review the statistical significance reporting practices in reading instruction studies and to provide guidelines for when to calculate and report effect sizes in educational research. A review of six readily accessible (online) and accredited journals publishing research on reading instruction ...
Increasing the statistical significance of entanglement detection in experiments.
Jungnitsch, Bastian; Niekamp, Sönke; Kleinmann, Matthias; Gühne, Otfried; Lu, He; Gao, Wei-Bo; Chen, Yu-Ao; Chen, Zeng-Bing; Pan, Jian-Wei
2010-05-28
Entanglement is often verified by a violation of an inequality like a Bell inequality or an entanglement witness. Considerable effort has been devoted to the optimization of such inequalities in order to obtain a high violation. We demonstrate theoretically and experimentally that such an optimization does not necessarily lead to a better entanglement test, if the statistical error is taken into account. Theoretically, we show for different error models that reducing the violation of an inequality can improve the significance. Experimentally, we observe this phenomenon in a four-photon experiment, testing the Mermin and Ardehali inequality for different levels of noise. Furthermore, we provide a way to develop entanglement tests with high statistical significance.
PA positioning significantly reduces testicular dose during sacroiliac joint radiography
Energy Technology Data Exchange (ETDEWEB)
Mekis, Nejc [Faculty of Health Sciences, University of Ljubljana (Slovenia); Mc Entee, Mark F., E-mail: mark.mcentee@ucd.i [School of Medicine and Medical Science, University College Dublin 4 (Ireland); Stegnar, Peter [Jozef Stefan International Postgraduate School, Ljubljana (Slovenia)
2010-11-15
Radiation dose to the testes in the antero-posterior (AP) and postero-anterior (PA) projections of the sacroiliac joint (SIJ) was measured with and without a scrotal shield. Entrance surface dose, the dose received by the testicles and the dose area product (DAP) were used. DAP measurements revealed the dose received by the phantom in the PA position is 12.6% lower than the AP (p ≤ 0.009) with no statistically significant reduction in image quality (p ≤ 0.483). The dose received by the testes in the PA projection in SIJ imaging is 93.1% lower than the AP projection when not using protection (p ≤ 0.020) and 94.9% lower with protection (p ≤ 0.019). The dose received by the testicles was not changed by the use of a scrotal shield in the AP position (p ≤ 0.559), but was lowered by its use in the PA (p ≤ 0.058). Use of the PA projection in SIJ imaging significantly lowers the dose received by the testes compared to the AP projection without significant loss of image quality.
PA positioning significantly reduces testicular dose during sacroiliac joint radiography
International Nuclear Information System (INIS)
Mekis, Nejc; Mc Entee, Mark F.; Stegnar, Peter
2010-01-01
Radiation dose to the testes in the antero-posterior (AP) and postero-anterior (PA) projections of the sacroiliac joint (SIJ) was measured with and without a scrotal shield. Entrance surface dose, the dose received by the testicles and the dose area product (DAP) were used. DAP measurements revealed the dose received by the phantom in the PA position is 12.6% lower than the AP (p ≤ 0.009) with no statistically significant reduction in image quality (p ≤ 0.483). The dose received by the testes in the PA projection in SIJ imaging is 93.1% lower than the AP projection when not using protection (p ≤ 0.020) and 94.9% lower with protection (p ≤ 0.019). The dose received by the testicles was not changed by the use of a scrotal shield in the AP position (p ≤ 0.559), but was lowered by its use in the PA (p ≤ 0.058). Use of the PA projection in SIJ imaging significantly lowers the dose received by the testes compared to the AP projection without significant loss of image quality.
Directory of Open Access Journals (Sweden)
Priya Ranganathan
2015-01-01
Full Text Available In the second part of a series on pitfalls in statistical analysis, we look at various ways in which a statistically significant study result can be expressed. We debunk some of the myths regarding the 'P' value, explain the importance of 'confidence intervals' and clarify the importance of including both values in a paper.
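The confidence-interval point can be made concrete with a small sketch (z-based interval for a mean, made-up data; the CI conveys magnitude and precision where the P-value alone does not):

```python
import math

def mean_ci_p(sample, null_mean=0.0):
    """Sample mean, 95% z-based CI, and two-sided p-value versus null_mean.
    Uses the sample sd as an approximation to the population sd."""
    n = len(sample)
    mean = sum(sample) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in sample) / (n - 1))
    se = sd / math.sqrt(n)
    z = abs(mean - null_mean) / se
    p = math.erfc(z / math.sqrt(2))
    return mean, (mean - 1.96 * se, mean + 1.96 * se), p

# A small effect measured precisely: highly "significant", yet the narrow
# CI shows the effect itself is tiny.
sample = [0.1 + 0.01 * ((i % 7) - 3) for i in range(200)]
mean, (lo, hi), p = mean_ci_p(sample)
```

Here P is minuscule, but the interval makes clear that the estimated effect is around 0.1 with little room for it to be meaningfully larger, which is exactly the clinical-relevance distinction the series draws.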
Ranganathan, Priya; Pramesh, C. S.; Buyse, Marc
2015-01-01
In the second part of a series on pitfalls in statistical analysis, we look at various ways in which a statistically significant study result can be expressed. We debunk some of the myths regarding the ‘P’ value, explain the importance of ‘confidence intervals’ and clarify the importance of including both values in a paper PMID:25878958
Statistical significance of epidemiological data. Seminar: Evaluation of epidemiological studies
International Nuclear Information System (INIS)
Weber, K.H.
1993-01-01
In stochastic damages, the numbers of events, e.g. the persons who are affected by or have died of cancer, and thus the relative frequencies (incidence or mortality) are binomially distributed random variables. Their statistical fluctuations can be characterized by confidence intervals. For epidemiologic questions, especially for the analysis of stochastic damages in the low dose range, the following issues are interesting: - Is a sample (a group of persons) with a definite observed damage frequency part of the whole population? - Is an observed frequency difference between two groups of persons random or statistically significant? - Is an observed increase or decrease of the frequencies with increasing dose random or statistically significant and how large is the regression coefficient (= risk coefficient) in this case? These problems can be solved by statistical tests. So-called distribution-free tests and tests which are not bound to the supposition of normal distribution are of particular interest, such as: - χ²-independence test (test in contingency tables); - Fisher-Yates test; - trend test according to Cochran; - rank correlation test given by Spearman. These tests are explained in terms of selected epidemiologic data, e.g. of leukaemia clusters, of the cancer mortality of the Japanese A-bomb survivors, especially in the low dose range, as well as of the sample of the cancer mortality in the high background area in Yangjiang (China). (orig.) [de]
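The first test in the list, the χ²-independence test, can be sketched for a 2×2 contingency table (pure Python; for 1 degree of freedom the p-value has the closed form erfc(√(x/2))):

```python
import math

def chi2_2x2(table):
    """Pearson chi-square test of independence for a 2x2 table.
    Returns the statistic and its p-value (1 degree of freedom)."""
    (a, b), (c, d) = table
    row = (a + b, c + d)
    col = (a + c, b + d)
    n = a + b + c + d
    chi2 = 0.0
    for i, obs_row in enumerate(table):
        for j, obs in enumerate(obs_row):
            exp = row[i] * col[j] / n       # expected count under independence
            chi2 += (obs - exp) ** 2 / exp
    p = math.erfc(math.sqrt(chi2 / 2))      # chi-square survival, df = 1
    return chi2, p

# Hypothetical exposed/affected counts for two groups of persons.
chi2, p = chi2_2x2(((20, 10), (10, 20)))    # chi2 = 20/3, p < 0.05
```

A perfectly balanced table gives chi2 = 0 and p = 1, i.e. no evidence against independence of group and outcome.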
Systematic reviews of anesthesiologic interventions reported as statistically significant
DEFF Research Database (Denmark)
Imberger, Georgina; Gluud, Christian; Boylan, John
2015-01-01
…statistically significant meta-analyses of anesthesiologic interventions, we used TSA to estimate power and imprecision in the context of sparse data and repeated updates. METHODS: We conducted a search to identify all systematic reviews with meta-analyses that investigated an intervention that may… From 11,870 titles, we found 682 systematic reviews that investigated anesthesiologic interventions. In the 50 sampled meta-analyses, the median number of trials included was 8 (interquartile range [IQR], 5-14), the median number of participants was 964 (IQR, 523-1736), and the median number…
Increasing the statistical significance of entanglement detection in experiments
Energy Technology Data Exchange (ETDEWEB)
Jungnitsch, Bastian; Niekamp, Soenke; Kleinmann, Matthias; Guehne, Otfried [Institut fuer Quantenoptik und Quanteninformation, Innsbruck (Austria); Lu, He; Gao, Wei-Bo; Chen, Zeng-Bing [Hefei National Laboratory for Physical Sciences at Microscale and Department of Modern Physics, University of Science and Technology of China, Hefei (China); Chen, Yu-Ao; Pan, Jian-Wei [Hefei National Laboratory for Physical Sciences at Microscale and Department of Modern Physics, University of Science and Technology of China, Hefei (China); Physikalisches Institut, Universitaet Heidelberg (Germany)
2010-07-01
Entanglement is often verified by a violation of an inequality like a Bell inequality or an entanglement witness. Considerable effort has been devoted to the optimization of such inequalities in order to obtain a high violation. We demonstrate theoretically and experimentally that such an optimization does not necessarily lead to a better entanglement test, if the statistical error is taken into account. Theoretically, we show for different error models that reducing the violation of an inequality can improve the significance. We show this to be the case for an error model in which the variance of an observable is interpreted as its error and for the standard error model in photonic experiments. Specifically, we demonstrate that the Mermin inequality yields a Bell test which is statistically more significant than the Ardehali inequality in the case of a photonic four-qubit state that is close to a GHZ state. Experimentally, we observe this phenomenon in a four-photon experiment, testing the above inequalities for different levels of noise.
Sibling Competition & Growth Tradeoffs. Biological vs. Statistical Significance.
Kramer, Karen L; Veile, Amanda; Otárola-Castillo, Erik
2016-01-01
Early childhood growth has many downstream effects on future health and reproduction and is an important measure of offspring quality. While a tradeoff between family size and child growth outcomes is theoretically predicted in high-fertility societies, empirical evidence is mixed. This is often attributed to phenotypic variation in parental condition. However, inconsistent study results may also arise because family size confounds the potentially differential effects that older and younger siblings can have on young children's growth. Additionally, inconsistent results might reflect that the biological significance associated with different growth trajectories is poorly understood. This paper addresses these concerns by tracking children's monthly gains in height and weight from weaning to age five in a high fertility Maya community. We predict that: 1) as an aggregate measure family size will not have a major impact on child growth during the post weaning period; 2) competition from young siblings will negatively impact child growth during the post weaning period; 3) however because of their economic value, older siblings will have a negligible effect on young children's growth. Accounting for parental condition, we use linear mixed models to evaluate the effects that family size, younger and older siblings have on children's growth. Congruent with our expectations, it is younger siblings who have the most detrimental effect on children's growth. While we find statistical evidence of a quantity/quality tradeoff effect, the biological significance of these results is negligible in early childhood. Our findings help to resolve why quantity/quality studies have had inconsistent results by showing that sibling competition varies with sibling age composition, not just family size, and that biological significance is distinct from statistical significance.
Sibling Competition & Growth Tradeoffs. Biological vs. Statistical Significance.
Directory of Open Access Journals (Sweden)
Karen L Kramer
Full Text Available Early childhood growth has many downstream effects on future health and reproduction and is an important measure of offspring quality. While a tradeoff between family size and child growth outcomes is theoretically predicted in high-fertility societies, empirical evidence is mixed. This is often attributed to phenotypic variation in parental condition. However, inconsistent study results may also arise because family size confounds the potentially differential effects that older and younger siblings can have on young children's growth. Additionally, inconsistent results might reflect that the biological significance associated with different growth trajectories is poorly understood. This paper addresses these concerns by tracking children's monthly gains in height and weight from weaning to age five in a high fertility Maya community. We predict that: 1) as an aggregate measure family size will not have a major impact on child growth during the post weaning period; 2) competition from young siblings will negatively impact child growth during the post weaning period; 3) however because of their economic value, older siblings will have a negligible effect on young children's growth. Accounting for parental condition, we use linear mixed models to evaluate the effects that family size, younger and older siblings have on children's growth. Congruent with our expectations, it is younger siblings who have the most detrimental effect on children's growth. While we find statistical evidence of a quantity/quality tradeoff effect, the biological significance of these results is negligible in early childhood. Our findings help to resolve why quantity/quality studies have had inconsistent results by showing that sibling competition varies with sibling age composition, not just family size, and that biological significance is distinct from statistical significance.
Comparative analysis of positive and negative attitudes toward statistics
Ghulami, Hassan Rahnaward; Ab Hamid, Mohd Rashid; Zakaria, Roslinazairimah
2015-02-01
Many statistics lecturers and statistics education researchers are interested in their students' attitudes toward statistics during a statistics course. A positive attitude toward statistics is vital because it encourages students to take an interest in the course and to master its core content, whereas students with negative attitudes toward statistics may feel depressed, especially in group assignments, are at risk of failure, are often highly emotional, and struggle to move forward. This study therefore investigates students' attitudes towards learning statistics. Six latent constructs were used to measure students' attitudes toward learning statistics: affect, cognitive competence, value, difficulty, interest, and effort. The questionnaire was adopted and adapted from the reliable and validated Survey of Attitudes Towards Statistics (SATS) instrument. The study was conducted among undergraduate engineering students at Universiti Malaysia Pahang (UMP); the respondents were students from different faculties taking the applied statistics course. The analysis found the questionnaire acceptable, and the proposed relationships among the constructs were investigated. Students showed full effort to master the statistics course, found the course enjoyable, were confident in their intellectual capacity, and held more positive than negative attitudes towards learning statistics. In conclusion, positive attitudes were mostly exhibited in the affect, cognitive competence, value, interest, and effort constructs, while negative attitudes were mostly exhibited in the difficulty construct.
Van Aert, R.C.M.; Van Assen, M.A.L.M.
2018-01-01
The unrealistically high rate of positive results within psychology has increased the attention to replication research. However, researchers who conduct a replication and want to statistically combine the results of their replication with a statistically significant original study encounter
Conducting tests for statistically significant differences using forest inventory data
James A. Westfall; Scott A. Pugh; John W. Coulston
2013-01-01
Many forest inventory and monitoring programs are based on a sample of ground plots from which estimates of forest resources are derived. In addition to evaluating metrics such as number of trees or amount of cubic wood volume, it is often desirable to make comparisons between resource attributes. To properly conduct statistical tests for differences, it is imperative...
Detecting Statistically Significant Communities of Triangle Motifs in Undirected Networks
2016-04-26
Systems, Statistics & Management Science, University of Alabama, USA. DISTRIBUTION A: Distribution approved for public release. (Report front matter only; the contents list a summary and an application to real networks, including the 2012 FBS football schedule network.)
Health significance and statistical uncertainty. The value of P-value.
Consonni, Dario; Bertazzi, Pier Alberto
2017-10-27
The P-value is widely used as a summary statistic of scientific results. Unfortunately, there is a widespread tendency to dichotomize its value into "P<0.05" ("statistically significant") and "P>0.05" ("statistically not significant"), with the former implying a "positive" result and the latter a "negative" one. To show the unsuitability of such an approach when evaluating the effects of environmental and occupational risk factors. We provide examples of distorted use of P-value and of the negative consequences for science and public health of such a black-and-white vision. The rigid interpretation of the P-value as a dichotomy favors the confusion between health relevance and statistical significance, discourages thoughtful thinking, and diverts attention from what really matters, the health significance. A much better way to express and communicate scientific results involves reporting effect estimates (e.g., risks, risk ratios or risk differences) and their confidence intervals (CI), which summarize and convey both health significance and statistical uncertainty. Unfortunately, many researchers do not consider the whole interval of the CI but only examine whether it includes the null value, thereby degrading this procedure to the same P-value dichotomy (statistically significant or not). In reporting statistical results of scientific research, present effect estimates with their confidence intervals and do not qualify the P-value as "significant" or "not significant".
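The CI-first reporting recommended above can be made concrete with a small stdlib-only sketch (the cohort counts are invented for illustration): a risk ratio's point estimate and its Katz log-scale confidence interval convey the magnitude and the uncertainty that a bare "P>0.05" verdict discards.

```python
import math

def risk_ratio_ci(a, n1, c, n0, z=1.96):
    """Risk ratio with a 95% CI (Katz log method).
    a/n1 = events/total in the exposed group, c/n0 in the unexposed."""
    rr = (a / n1) / (c / n0)
    se = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n0)
    return rr, math.exp(math.log(rr) - z * se), math.exp(math.log(rr) + z * se)

# Hypothetical cohort: 30/100 cases among exposed vs 20/100 among unexposed.
rr, lo, hi = risk_ratio_ci(30, 100, 20, 100)
print(f"RR = {rr:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```

Here the interval crosses 1, so a dichotomizer would call the result "negative"; the estimate itself still points to a 50% excess risk measured with wide uncertainty, which is the health-relevant message.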
Sierevelt, Inger N.; van Oldenrijk, Jakob; Poolman, Rudolf W.
2007-01-01
In this paper we describe several issues that influence the reporting of statistical significance in relation to clinical importance, since misinterpretation of p values is a common issue in orthopaedic literature. Orthopaedic research is tormented by the risks of false-positive (type I error) and
After statistics reform : Should we still teach significance testing?
A. Hak (Tony)
2014-01-01
In the longer term null hypothesis significance testing (NHST) will disappear because p-values are not informative and not replicable. Should we continue to teach in the future the procedures of then-abolished routines (i.e., NHST)? Three arguments are discussed for not teaching NHST in
Zhang, Zhang
2012-03-22
Background: Genetic mutation, selective pressure for translational efficiency and accuracy, level of gene expression, and protein function through natural selection are all believed to lead to codon usage bias (CUB). Therefore, informative measurement of CUB is of fundamental importance to making inferences regarding gene function and genome evolution. However, extant measures of CUB have not fully accounted for the quantitative effect of background nucleotide composition and have not statistically evaluated the significance of CUB in sequence analysis. Results: Here we propose a novel measure--Codon Deviation Coefficient (CDC)--that provides an informative measurement of CUB and its statistical significance without requiring any prior knowledge. Unlike previous measures, CDC estimates CUB by accounting for background nucleotide compositions tailored to codon positions and adopts bootstrapping to assess the statistical significance of CUB for any given sequence. We evaluate CDC by examining its effectiveness on simulated sequences and empirical data and show that CDC outperforms extant measures by achieving a more informative estimation of CUB and its statistical significance. Conclusions: As validated by both simulated and empirical data, CDC provides a highly informative quantification of CUB and its statistical significance, useful for determining comparative magnitudes and patterns of biased codon usage for genes or genomes with diverse sequence compositions. 2012 Zhang et al; licensee BioMed Central Ltd.
Directory of Open Access Journals (Sweden)
Zhang Zhang
2012-03-01
Full Text Available Abstract Background Genetic mutation, selective pressure for translational efficiency and accuracy, level of gene expression, and protein function through natural selection are all believed to lead to codon usage bias (CUB). Therefore, informative measurement of CUB is of fundamental importance to making inferences regarding gene function and genome evolution. However, extant measures of CUB have not fully accounted for the quantitative effect of background nucleotide composition and have not statistically evaluated the significance of CUB in sequence analysis. Results Here we propose a novel measure--Codon Deviation Coefficient (CDC)--that provides an informative measurement of CUB and its statistical significance without requiring any prior knowledge. Unlike previous measures, CDC estimates CUB by accounting for background nucleotide compositions tailored to codon positions and adopts bootstrapping to assess the statistical significance of CUB for any given sequence. We evaluate CDC by examining its effectiveness on simulated sequences and empirical data and show that CDC outperforms extant measures by achieving a more informative estimation of CUB and its statistical significance. Conclusions As validated by both simulated and empirical data, CDC provides a highly informative quantification of CUB and its statistical significance, useful for determining comparative magnitudes and patterns of biased codon usage for genes or genomes with diverse sequence compositions.
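The CDC formula itself is not reproduced here, but the abstract's two key ingredients, a composition-aware expectation and a resampling p-value, can be sketched in stdlib Python. The L1 deviation statistic and the toy sequence below are illustrative stand-ins, not the paper's exact measure.

```python
import random
from collections import Counter
from itertools import product

def usage_deviation(codons, expected):
    """L1 distance between observed codon frequencies and the
    frequencies expected from nucleotide composition alone."""
    n = len(codons)
    obs = Counter(codons)
    return sum(abs(obs.get(c, 0) / n - p) for c, p in expected.items())

def codon_bias_p(seq, n_boot=500, seed=1):
    """Monte Carlo p-value for codon-usage bias, in the spirit of CDC:
    the null model draws codons from the sequence's own nucleotide
    composition, so composition alone is not mistaken for bias."""
    rng = random.Random(seed)
    codons = [seq[i:i + 3] for i in range(0, len(seq) - len(seq) % 3, 3)]
    freq = Counter(seq)
    p_nt = {b: freq[b] / len(seq) for b in "ACGT"}
    expected = {"".join(t): p_nt[t[0]] * p_nt[t[1]] * p_nt[t[2]]
                for t in product("ACGT", repeat=3)}
    observed = usage_deviation(codons, expected)
    hits = 0
    for _ in range(n_boot):
        null = ["".join(rng.choices("ACGT", weights=[p_nt[b] for b in "ACGT"], k=3))
                for _ in codons]
        hits += usage_deviation(null, expected) >= observed
    return observed, (hits + 1) / (n_boot + 1)

# A sequence using one codon exclusively is strongly biased even after
# accounting for its A/T/G-rich nucleotide composition.
dev, p = codon_bias_p("ATG" * 60)
print(f"deviation = {dev:.2f}, p = {p:.4f}")
```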
Wilkinson, Michael
2014-03-01
Decisions about support for predictions of theories in light of data are made using statistical inference. The dominant approach in sport and exercise science is the Neyman-Pearson (N-P) significance-testing approach. When applied correctly it provides a reliable procedure for making dichotomous decisions for accepting or rejecting zero-effect null hypotheses with known and controlled long-run error rates. Type I and type II error rates must be specified in advance and the latter controlled by conducting an a priori sample size calculation. The N-P approach does not provide the probability of hypotheses or indicate the strength of support for hypotheses in light of data, yet many scientists believe it does. Outcomes of analyses allow conclusions only about the existence of non-zero effects, and provide no information about the likely size of true effects or their practical/clinical value. Bayesian inference can show how much support data provide for different hypotheses, and how personal convictions should be altered in light of data, but the approach is complicated by formulating probability distributions about prior subjective estimates of population effects. A pragmatic solution is magnitude-based inference, which allows scientists to estimate the true magnitude of population effects and how likely they are to exceed an effect magnitude of practical/clinical importance, thereby integrating elements of subjective Bayesian-style thinking. While this approach is gaining acceptance, progress might be hastened if scientists appreciate the shortcomings of traditional N-P null hypothesis significance testing.
Statistical convergence of a non-positive approximation process
International Nuclear Information System (INIS)
Agratini, Octavian
2011-01-01
Highlights: → A general class of approximation processes is introduced. → The A-statistical convergence is studied. → Applications in quantum calculus are delivered. - Abstract: Starting from a general sequence of linear and positive operators of discrete type, we associate its r-th order generalization. This construction involves high-order derivatives of a signal and loses the positivity property. Considering that the initial approximation process is A-statistically uniformly convergent, we prove that this property is inherited by the new sequence. Our result also includes information about the uniform convergence. Two applications in q-calculus are presented: we study q-analogues both of the Meyer-Koenig and Zeller operators and of the Stancu operators.
Statistical analysis of RHIC beam position monitors performance
Calaga, R.; Tomás, R.
2004-04-01
A detailed statistical analysis of beam position monitors (BPM) performance at RHIC is a critical factor in improving regular operations and future runs. Robust identification of malfunctioning BPMs plays an important role in any orbit or turn-by-turn analysis. Singular value decomposition and Fourier transform methods, which have evolved as powerful numerical techniques in signal processing, will aid in such identification from BPM data. This is the first attempt at RHIC to use a large set of data to statistically enhance the capability of these two techniques and determine BPM performance. A comparison from run 2003 data shows striking agreement between the two methods and hence can be used to improve BPM functioning at RHIC and possibly other accelerators.
Statistical analysis of RHIC beam position monitors performance
Directory of Open Access Journals (Sweden)
R. Calaga
2004-04-01
Full Text Available A detailed statistical analysis of beam position monitor (BPM) performance at RHIC is a critical factor in improving regular operations and future runs. Robust identification of malfunctioning BPMs plays an important role in any orbit or turn-by-turn analysis. Singular value decomposition and Fourier transform methods, which have evolved as powerful numerical techniques in signal processing, will aid in such identification from BPM data. This is the first attempt at RHIC to use a large set of data to statistically enhance the capability of these two techniques and determine BPM performance. A comparison from run 2003 data shows striking agreement between the two methods and hence can be used to improve BPM functioning at RHIC and possibly other accelerators.
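A minimal numpy sketch of the SVD side of this idea (the lattice size, tune, and noise figures are all invented): coherent betatron motion occupies the two leading singular modes, so a BPM with anomalously large residual power outside those modes is a malfunction candidate.

```python
import numpy as np

rng = np.random.default_rng(0)
n_turns, n_bpms, bad = 1024, 12, 7   # hypothetical ring with one faulty BPM

# Betatron-like motion: one tune line seen by every BPM with its own
# phase, plus small electronic noise.
turns = np.arange(n_turns)
phases = rng.uniform(0, 2 * np.pi, n_bpms)
data = np.cos(2 * np.pi * 0.31 * turns[:, None] + phases[None, :])
data += 0.01 * rng.standard_normal((n_turns, n_bpms))
data[:, bad] = 0.5 * rng.standard_normal(n_turns)   # faulty BPM: pure noise

# SVD of the centred data: coherent motion is captured by the two
# leading modes, so per-BPM residual power outside them flags faults.
U, s, Vt = np.linalg.svd(data - data.mean(axis=0), full_matrices=False)
residual = (s[2:, None] ** 2 * Vt[2:] ** 2).sum(axis=0)
suspect = int(np.argmax(residual))
print("suspected faulty BPM index:", suspect)
```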
Understanding the Sampling Distribution and Its Use in Testing Statistical Significance.
Breunig, Nancy A.
Despite the increasing criticism of statistical significance testing by researchers, particularly in the publication of the 1994 American Psychological Association's style manual, statistical significance test results are still popular in journal articles. For this reason, it remains important to understand the logic of inferential statistics. A…
Significance of Waterway Navigation Positioning Systems On Ship's Manoeuvring Safety
Galor, W.
The main goal of navigation is to lead the ship to the point of destination safety and efficiently. Various factors may affect ship realisating this process. The ship movement on waterway are mainly limited by water area dimensions (surface and depth). These limitations cause the requirement to realise the proper of ship movement trajectory. In case when this re requirement cant't fulfil then marine accident may happend. This fact is unwanted event caused losses of human health and life, damage or loss of cargo and ship, pollution of natural environment, damage of port structures or blocking the port of its ports and lost of salvage operation. These losses in same cases can be catas- trophical especially while e.i. crude oil spilling could be place. To realise of safety navigation process is needed to embrace the ship's movement trajectory by waterways area. The ship's trajectory is described by manoeuvring lane as a surface of water area which is require to realise of safety ship movement. Many conditions affect to ship manoeuvring line. The main are following: positioning accuracy, ship's manoeuvring features and phenomena's of shore and ship's bulk common affecting. The accuracy of positioning system is most important. This system depends on coast navigation mark- ing which can range many kinds of technical realisation. Mainly used systems based on lights (line), radionavigation (local system or GPS, DGPS), or radars. If accuracy of positiong is higer, then safety of navigation is growing. This article presents these problems exemplifying with approaching channel to ports situated on West Pomera- nian water region.
"What If" Analyses: Ways to Interpret Statistical Significance Test Results Using EXCEL or "R"
Ozturk, Elif
2012-01-01
The present paper aims to review two motivations to conduct "what if" analyses using Excel and "R" to understand the statistical significance tests through the sample size context. "What if" analyses can be used to teach students what statistical significance tests really do and in applied research either prospectively to estimate what sample size…
Fidalgo, Angel M.; Alavi, Seyed Mohammad; Amirian, Seyed Mohammad Reza
2014-01-01
This study examines three controversial aspects in differential item functioning (DIF) detection by logistic regression (LR) models: first, the relative effectiveness of different analytical strategies for detecting DIF; second, the suitability of the Wald statistic for determining the statistical significance of the parameters of interest; and…
Brouwer, D.; Meijer, R.R.; Zevalkink, D.J.
2013-01-01
Several researchers have emphasized that item response theory (IRT)-based methods should be preferred over classical approaches in measuring change for individual patients. In the present study we discuss and evaluate the use of IRT-based statistics to measure statistical significant individual
Xu, Kuan-Man
2006-01-01
A new method is proposed to compare statistical differences between summary histograms, which are the histograms summed over a large ensemble of individual histograms. It consists of choosing a distance statistic for measuring the difference between summary histograms and using a bootstrap procedure to calculate the statistical significance level. Bootstrapping is an approach to statistical inference that makes few assumptions about the underlying probability distribution that describes the data. Three distance statistics are compared in this study. They are the Euclidean distance, the Jeffries-Matusita distance and the Kuiper distance. The data used in testing the bootstrap method are satellite measurements of cloud systems called cloud objects. Each cloud object is defined as a contiguous region/patch composed of individual footprints or fields of view. A histogram of measured values over footprints is generated for each parameter of each cloud object and then summary histograms are accumulated over all individual histograms in a given cloud-object size category. The results of statistical hypothesis tests using all three distances as test statistics are generally similar, indicating the validity of the proposed method. The Euclidean distance is determined to be most suitable after comparing the statistical tests of several parameters with distinct probability distributions among three cloud-object size categories. Impacts on the statistical significance levels resulting from differences in the total lengths of satellite footprint data between two size categories are also discussed.
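A stdlib-only sketch of the proposed procedure, using the Euclidean distance the abstract found most suitable. The samples and bin edges are synthetic; real use would compare summary histograms accumulated over cloud-object footprints.

```python
import random

def hist(values, edges):
    """Normalized histogram of values over the given bin edges."""
    counts = [0] * (len(edges) - 1)
    for v in values:
        for i in range(len(edges) - 1):
            if edges[i] <= v < edges[i + 1]:
                counts[i] += 1
                break
    return [c / len(values) for c in counts]

def euclidean(h1, h2):
    return sum((a - b) ** 2 for a, b in zip(h1, h2)) ** 0.5

def histogram_test(x, y, edges, n_boot=500, seed=0):
    """Bootstrap test for a difference between two histograms: under
    the null hypothesis both samples come from the pooled data, so
    resampled distances approximate the null distribution."""
    rng = random.Random(seed)
    observed = euclidean(hist(x, edges), hist(y, edges))
    pool = x + y
    hits = 0
    for _ in range(n_boot):
        bx = rng.choices(pool, k=len(x))
        by = rng.choices(pool, k=len(y))
        hits += euclidean(hist(bx, edges), hist(by, edges)) >= observed
    return observed, (hits + 1) / (n_boot + 1)

# Two clearly different samples: the test should reject the null.
g = random.Random(1)
x = [g.gauss(0.0, 1.0) for _ in range(200)]
y = [g.gauss(1.5, 1.0) for _ in range(200)]
edges = [-4.0 + 0.5 * i for i in range(17)]
d, p = histogram_test(x, y, edges)
print(f"distance = {d:.3f}, p = {p:.4f}")
```

Swapping `euclidean` for a Jeffries-Matusita or Kuiper distance changes only the test statistic, which is why the abstract can compare the three within one bootstrap framework.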
DEFF Research Database (Denmark)
Engsted, Tom
I comment on the controversy between McCloskey & Ziliak and Hoover & Siegler on statistical versus economic significance, in the March 2008 issue of the Journal of Economic Methodology. I argue that while McCloskey & Ziliak are right in emphasizing 'real error', i.e. non-sampling error that cannot be eliminated through specification testing, they fail to acknowledge those areas in economics, e.g. rational expectations macroeconomics and asset pricing, where researchers clearly distinguish between statistical and economic significance and where statistical testing plays a relatively minor role in model...
Singer, Meromit; Engström, Alexander; Schönhuth, Alexander; Pachter, Lior
2011-09-23
Recent experimental and computational work confirms that CpGs can be unmethylated inside coding exons, thereby showing that codons may be subjected to both genomic and epigenomic constraint. It is therefore of interest to identify coding CpG islands (CCGIs) that are regions inside exons enriched for CpGs. The difficulty in identifying such islands is that coding exons exhibit sequence biases determined by codon usage and constraints that must be taken into account. We present a method for finding CCGIs that showcases a novel approach we have developed for identifying regions of interest that are significant (with respect to a Markov chain) for the counts of any pattern. Our method begins with the exact computation of tail probabilities for the number of CpGs in all regions contained in coding exons, and then applies a greedy algorithm for selecting islands from among the regions. We show that the greedy algorithm provably optimizes a biologically motivated criterion for selecting islands while controlling the false discovery rate. We applied this approach to the human genome (hg18) and annotated CpG islands in coding exons. The statistical criterion we apply to evaluating islands reduces the number of false positives in existing annotations, while our approach to defining islands reveals significant numbers of undiscovered CCGIs in coding exons. Many of these appear to be examples of functional epigenetic specialization in coding exons.
Statistical basis for positive identification in forensic anthropology.
Steadman, Dawnie Wolfe; Adams, Bradley J; Konigsberg, Lyle W
2006-09-01
Forensic scientists are often expected to present the likelihood of DNA identifications in US courts based on comparative population data, yet forensic anthropologists tend not to quantify the strength of an osteological identification. Because forensic anthropologists are trained first and foremost as physical anthropologists, they emphasize estimation problems at the expense of evidentiary problems, but this approach must be reexamined. In this paper, the statistical bases for presenting osteological and dental evidence are outlined, using a forensic case as a motivating example. A brief overview of Bayesian statistics is provided, and methods to calculate likelihood ratios for five aspects of the biological profile are demonstrated. This paper emphasizes the definition of appropriate reference samples and of the "population at large," and points out the conceptual differences between them. Several databases are introduced for both reference information and to characterize the "population at large," and new data are compiled to calculate the frequency of specific characters, such as age or fractures, within the "population at large." Despite small individual likelihood ratios for age, sex, and stature in the case example, the power of this approach is that, assuming each likelihood ratio is independent, the product rule can be applied. In this particular example, it is over three million times more likely to obtain the observed osteological and dental data if the identification is correct than if the identification is incorrect. This likelihood ratio is a convincing statistic that can support the forensic anthropologist's opinion on personal identity in court. 2006 Wiley-Liss, Inc.
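The product rule at the heart of this argument is a one-liner once the per-trait likelihood ratios are in hand. The values below are invented for illustration, not taken from the paper's case.

```python
import math

# Hypothetical likelihood ratios for independent traits, each the ratio
# P(observed evidence | correct identification) /
# P(observed evidence | incorrect identification).
lrs = {"age": 4.2, "sex": 2.0, "stature": 3.1, "dental": 150.0, "trauma": 900.0}

# Product rule: assuming the traits are independent, the ratios multiply.
combined = math.prod(lrs.values())

# Posterior odds = prior odds x combined LR (Bayes' rule in odds form);
# the prior here is an assumed figure for illustration only.
prior_odds = 1 / 1000
posterior_odds = prior_odds * combined
print(f"combined LR = {combined:,.0f}; posterior odds = {posterior_odds:,.1f}:1")
```

Note that the independence assumption does the heavy lifting: correlated traits (e.g. sex and stature) would require joint reference frequencies rather than a simple product.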
Computational and Statistical Aspects of Determining Ship’s Position
Directory of Open Access Journals (Sweden)
Drapella Antoni
2017-12-01
Full Text Available In its mathematical essence, the task of determining a ship's position coordinates is to minimize an appropriately defined goal function. This paper proposes to use the conjugate gradient method for this purpose. The reason is that the calculations can be performed in a few seconds, because Microsoft and Apache have implemented the conjugate gradient method as a tool called the Solver and embedded this tool in their widely used spreadsheets, Excel and OpenOffice Calc, respectively. Further, this paper shows how to precisely assess the errors of a ship's position coordinates with the Monte Carlo method, employing the Solver.
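The same two-stage scheme can be sketched outside a spreadsheet. The beacon layout, the ranging error, and the Gauss-Newton inner solver are all illustrative choices here; the paper itself relies on the Solver's conjugate-gradient search over the same kind of goal function.

```python
import numpy as np

# Hypothetical fix: noisy ranges to three shore marks at known positions.
beacons = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
true_pos = np.array([3.0, 4.0])
rng = np.random.default_rng(0)
sigma = 0.05                      # assumed ranging error, same units as positions
ranges = np.linalg.norm(beacons - true_pos, axis=1) + rng.normal(0, sigma, 3)

def solve(r, p0):
    """Minimize the sum of squared range residuals by Gauss-Newton
    iterations (a stand-in for the Solver's conjugate-gradient search)."""
    p = np.asarray(p0, dtype=float)
    for _ in range(50):
        diff = p - beacons
        dist = np.linalg.norm(diff, axis=1)
        # Linearize: Jacobian rows are unit vectors from beacon to ship.
        step, *_ = np.linalg.lstsq(diff / dist[:, None], dist - r, rcond=None)
        p -= step
        if np.linalg.norm(step) < 1e-10:
            break
    return p

fix = solve(ranges, [1.0, 1.0])

# Monte Carlo error assessment: re-solve with resampled ranging noise
# and take the spread of the fixes as the coordinate error estimate.
fixes = np.array([solve(ranges + rng.normal(0, sigma, 3), fix) for _ in range(300)])
print("fix:", fix.round(3), "coordinate sd:", fixes.std(axis=0).round(3))
```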
Directory of Open Access Journals (Sweden)
Melissa Coulson
2010-07-01
Full Text Available A statistically significant result and a non-significant result may differ little, although significance status may tempt an interpretation of difference. Two studies are reported that compared interpretation of such results presented using null hypothesis significance testing (NHST) or confidence intervals (CIs). Authors of articles published in psychology, behavioural neuroscience, and medical journals were asked, via email, to interpret two fictitious studies that found similar results, one statistically significant and the other non-significant. Responses from 330 authors varied greatly, but interpretation was generally poor, whether results were presented as CIs or using NHST. However, when interpreting CIs, respondents who mentioned NHST were 60% likely to conclude, unjustifiably, that the two results conflicted, whereas those who interpreted CIs without reference to NHST were 95% likely to conclude, justifiably, that the two results were consistent. Findings were generally similar for all three disciplines. An email survey of academic psychologists confirmed that CIs elicit better interpretations if NHST is not invoked. Improved statistical inference can result from encouragement of meta-analytic thinking and use of CIs but, for full benefit, such highly desirable statistical reform also requires that researchers interpret CIs without recourse to NHST.
Farrell, Mary Beth
2018-06-01
This article is the second part of a continuing education series reviewing basic statistics that nuclear medicine and molecular imaging technologists should understand. In this article, the statistics for evaluating interpretation accuracy, significance, and variance are discussed. Throughout the article, actual statistics are pulled from the published literature. We begin by explaining 2 methods for quantifying interpretive accuracy: interreader and intrareader reliability. Agreement among readers can be expressed simply as a percentage. However, the Cohen κ-statistic is a more robust measure of agreement that accounts for chance. The higher the κ-statistic is, the higher is the agreement between readers. When 3 or more readers are being compared, the Fleiss κ-statistic is used. Significance testing determines whether the difference between 2 conditions or interventions is meaningful. Statistical significance is usually expressed using a number called a probability (P) value. Calculation of the P value is beyond the scope of this review. However, knowing how to interpret P values is important for understanding the scientific literature. Generally, a P value of less than 0.05 is considered significant and indicates that the results of the experiment are due to more than just chance. Variance, standard deviation (SD), confidence interval, and standard error (SE) explain the dispersion of data around a mean of a sample drawn from a population. SD is commonly reported in the literature. A small SD indicates that there is not much variation in the sample data. Many biologic measurements fall into what is referred to as a normal distribution taking the shape of a bell curve. In a normal distribution, 68% of the data will fall within 1 SD, 95% will fall within 2 SDs, and 99.7% will fall within 3 SDs. Confidence interval defines the range of possible values within which the population parameter is likely to lie and gives an idea of the precision of the statistic being
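The Cohen κ-statistic described above is straightforward to compute by hand; the two readers' positive/negative calls below are invented for illustration.

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Cohen's kappa: (observed agreement - chance agreement)
    divided by (1 - chance agreement)."""
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n
    c1, c2 = Counter(r1), Counter(r2)
    pe = sum(c1[k] * c2[k] for k in c1) / n ** 2
    return (po - pe) / (1 - pe)

# Hypothetical scan calls by two readers.
reader1 = ["pos", "pos", "neg", "neg", "neg", "pos", "neg", "neg", "pos", "neg"]
reader2 = ["pos", "pos", "neg", "neg", "pos", "pos", "neg", "neg", "neg", "neg"]
agree = sum(a == b for a, b in zip(reader1, reader2)) / len(reader1)
kappa = cohens_kappa(reader1, reader2)
print(f"raw agreement = {agree:.0%}, kappa = {kappa:.2f}")
```

The example shows why κ matters: raw agreement is 80%, but after discounting the agreement expected by chance, κ is only about 0.58.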
Moore, Sarah J; Herst, Patries M; Louwe, Robert J W
2018-05-01
A remarkable improvement in patient positioning was observed after the implementation of various process changes aiming to increase the consistency of patient positioning throughout the radiotherapy treatment chain. However, no tool was available to describe these changes over time in a standardised way. This study reports on the feasibility of Statistical Process Control (SPC) to highlight changes in patient positioning accuracy and facilitate correlation of these changes with the underlying process changes. Metrics were designed to quantify the systematic and random patient deformation as input for the SPC charts. These metrics were based on data obtained from multiple local ROI matches for 191 patients who were treated for head-and-neck cancer during the period 2011-2016. SPC highlighted a significant improvement in patient positioning that coincided with multiple intentional process changes. The observed improvements could be described as a combination of a reduction in outliers and a systematic improvement in the patient positioning accuracy of all patients. SPC is able to track changes in the reproducibility of patient positioning in head-and-neck radiation oncology, and distinguish between systematic and random process changes. Identification of process changes underlying these trends requires additional statistical analysis and seems only possible when the changes do not overlap in time. Copyright © 2018 Elsevier B.V. All rights reserved.
Statistical significance of trends in monthly heavy precipitation over the US
Mahajan, Salil
2011-05-11
Trends in monthly heavy precipitation, defined by a return period of one year, are assessed for statistical significance in observations and Global Climate Model (GCM) simulations over the contiguous United States using Monte Carlo non-parametric and parametric bootstrapping techniques. The results from the two Monte Carlo approaches are found to be similar to each other, and also to the traditional non-parametric Kendall's τ test, implying the robustness of the approach. Two different observational data-sets are employed to test for trends in monthly heavy precipitation and are found to exhibit consistent results. Both data-sets demonstrate upward trends, one of which is found to be statistically significant at the 95% confidence level. Upward trends similar to observations are observed in some climate model simulations of the twentieth century, but their statistical significance is marginal. For projections of the twenty-first century, a statistically significant upward trend is observed in most of the climate models analyzed. The change in the simulated precipitation variance appears to be more important in the twenty-first century projections than changes in the mean precipitation. Stochastic fluctuations of the climate system are found to dominate monthly heavy precipitation, as some GCM simulations show a downward trend even in the twenty-first century projections when the greenhouse gas forcings are strong. © 2011 Springer-Verlag.
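A stdlib sketch of the resampling logic, using a permutation variant: shuffling the series is one simple way to generate a trend-free null distribution for the fitted slope. The synthetic series stands in for observed monthly maxima.

```python
import random

def slope(y):
    """Least-squares slope of y against its time index."""
    n = len(y)
    xm, ym = (n - 1) / 2, sum(y) / n
    num = sum((i - xm) * (v - ym) for i, v in enumerate(y))
    return num / sum((i - xm) ** 2 for i in range(n))

def trend_p(y, n_shuffle=2000, seed=0):
    """Monte Carlo significance of a trend: shuffling the series
    destroys any real trend, so shuffled slopes sample the null."""
    rng = random.Random(seed)
    obs = abs(slope(y))
    hits = 0
    for _ in range(n_shuffle):
        perm = y[:]
        rng.shuffle(perm)
        hits += abs(slope(perm)) >= obs
    return obs, (hits + 1) / (n_shuffle + 1)

# Synthetic "monthly heavy precipitation" series with an upward drift.
g = random.Random(7)
series = [0.05 * t + g.gauss(0, 1) for t in range(60)]
b, p = trend_p(series)
print(f"|slope| = {b:.3f}, p = {p:.4f}")
```

A parametric variant would instead resample residuals from a fitted distribution, which is the other bootstrap flavour the abstract compares.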
Interpreting Statistical Significance Test Results: A Proposed New "What If" Method.
Kieffer, Kevin M.; Thompson, Bruce
As the 1994 publication manual of the American Psychological Association emphasized, "p" values are affected by sample size. As a result, it can be helpful to interpret the results of statistical significance tests in a sample-size context by conducting so-called "what if" analyses. However, these methods can be inaccurate…
Recent Literature on Whether Statistical Significance Tests Should or Should Not Be Banned.
Deegear, James
This paper summarizes the literature regarding statistical significance testing, with an emphasis on recent literature in various disciplines and on literature exploring why researchers have demonstrably failed to be influenced by the American Psychological Association publication manual's encouragement to report effect sizes. Also considered are…
Linting, Marielle; van Os, Bart Jan; Meulman, Jacqueline J.
2011-01-01
In this paper, the statistical significance of the contribution of variables to the principal components in principal components analysis (PCA) is assessed nonparametrically by the use of permutation tests. We compare a new strategy to a strategy used in previous research consisting of permuting the columns (variables) of a data matrix…
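The column-permutation strategy for judging a variable's contribution to a principal component can be sketched as follows; the toy data and the single-column permutation scheme are illustrative assumptions, not the authors' exact procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: variables 0 and 1 share a latent factor; variable 2 is pure noise.
n = 300
latent = rng.normal(size=n)
X = np.column_stack([
    latent + 0.3 * rng.normal(size=n),
    latent + 0.3 * rng.normal(size=n),
    rng.normal(size=n),
])

def first_pc_loadings(M):
    # Right singular vector of the centered matrix = PC1 loadings.
    M = M - M.mean(axis=0)
    _, _, vt = np.linalg.svd(M, full_matrices=False)
    return np.abs(vt[0])

obs = first_pc_loadings(X)

# Permutation null for variable 2: shuffling its column only breaks its
# association with the other variables while keeping its marginal distribution.
n_perm = 500
null = np.empty(n_perm)
for i in range(n_perm):
    Xp = X.copy()
    Xp[:, 2] = rng.permutation(Xp[:, 2])
    null[i] = first_pc_loadings(Xp)[2]

p_value = (1 + np.sum(null >= obs[2])) / (1 + n_perm)
print(obs, p_value)
```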
van Tulder, M.W.; Malmivaara, A.; Hayden, J.; Koes, B.
2007-01-01
STUDY DESIGN. Critical appraisal of the literature. OBJECTIVES. The objective of this study was to assess whether results of back pain trials are statistically significant and clinically important. SUMMARY OF BACKGROUND DATA. There seems to be a discrepancy between conclusions reported by authors and
P-Value, a true test of statistical significance? a cautionary note ...
African Journals Online (AJOL)
While it was not the intention of the founders of significance testing and hypothesis testing to have the two ideas intertwined as if they were complementary, the inconvenient marriage of the two practices into one coherent, convenient, incontrovertible and misinterpreted practice has dotted our standard statistics textbooks and ...
DEFF Research Database (Denmark)
Jones, Allan; Sommerlund, Bo
2007-01-01
The uses of null hypothesis significance testing (NHST) and statistical power analysis within psychological research are critically discussed. The article looks at the problems of relying solely on NHST when dealing with small and large sample sizes. The use of power analysis in estimating the potential error introduced by small and large samples is advocated. Power analysis is not recommended as a replacement for NHST but as an additional source of information about the phenomena under investigation. Moreover, the importance of conceptual analysis in relation to statistical analysis of hypothesis…
Prat, Aleix; Cheang, Maggie Chon U.; Martín, Miguel; Parker, Joel S.; Carrasco, Eva; Caballero, Rosalía; Tyldesley, Scott; Gelmon, Karen; Bernard, Philip S.; Nielsen, Torsten O.; Perou, Charles M.
2013-01-01
Purpose Current immunohistochemical (IHC)-based definitions of luminal A and B breast cancers are imperfect when compared with multigene expression-based assays. In this study, we sought to improve the IHC subtyping by examining the pathologic and gene expression characteristics of genomically defined luminal A and B subtypes. Patients and Methods Gene expression and pathologic features were collected from primary tumors across five independent cohorts: British Columbia Cancer Agency (BCCA) tamoxifen-treated only, Grupo Español de Investigación en Cáncer de Mama 9906 trial, BCCA no systemic treatment cohort, PAM50 microarray training data set, and a combined publicly available microarray data set. Optimal cutoffs of percentage of progesterone receptor (PR)-positive tumor cells to predict survival were derived and independently tested. Multivariable Cox models were used to test the prognostic significance. Results Clinicopathologic comparisons among luminal A and B subtypes consistently identified higher rates of PR positivity, human epidermal growth factor receptor 2 (HER2) negativity, and histologic grade 1 in luminal A tumors. Quantitative PR gene and protein expression were also found to be significantly higher in luminal A tumors. An empiric cutoff of more than 20% of PR-positive tumor cells was statistically chosen and proved significant for predicting survival differences within IHC-defined luminal A tumors independently of endocrine therapy administration. Finally, no additional prognostic value within hormonal receptor (HR)-positive/HER2-negative disease was observed with the use of the IHC4 score when intrinsic IHC-based subtypes were used that included the more than 20% PR-positive tumor cells and vice versa. Conclusion Semiquantitative IHC expression of PR adds prognostic value within the current IHC-based luminal A definition by improving the identification of good outcome breast cancers. The new proposed IHC-based definition of luminal A
DEFF Research Database (Denmark)
Jakobsen, Janus Christian; Wetterslev, Jorn; Winkel, Per
2014-01-01
BACKGROUND: Thresholds for statistical significance when assessing meta-analysis results are insufficiently demonstrated by traditional 95% confidence intervals and P-values. Assessment of intervention effects in systematic reviews with meta-analysis deserves greater rigour. METHODS: Methodologies for assessing statistical and clinical significance of intervention effects in systematic reviews were considered. Balancing simplicity and comprehensiveness, an operational procedure was developed, based mainly on The Cochrane Collaboration methodology and the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) guidelines. RESULTS: We propose an eight-step procedure for better validation of meta-analytic results in systematic reviews: (1) Obtain the 95% confidence intervals and the P-values from both fixed-effect and random-effects meta-analyses and report the most…
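Step (1) of the proposed procedure, obtaining both fixed-effect and random-effects estimates, can be sketched with inverse-variance weighting and the DerSimonian-Laird τ² estimator; the effect sizes below are invented log odds ratios, not data from any review:

```python
import math

# Hypothetical trial effect estimates (log odds ratios) and standard errors.
effects = [-0.6, 0.1, -0.8, -0.2, 0.3]
ses = [0.15, 0.20, 0.25, 0.18, 0.30]

w = [1 / s**2 for s in ses]                      # inverse-variance weights
fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
se_fixed = math.sqrt(1 / sum(w))

# DerSimonian-Laird between-trial variance tau^2 for the random-effects model.
q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
df = len(effects) - 1
c = sum(w) - sum(wi**2 for wi in w) / sum(w)
tau2 = max(0.0, (q - df) / c)

w_re = [1 / (s**2 + tau2) for s in ses]
random_eff = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
se_re = math.sqrt(1 / sum(w_re))

for name, est, se in [("fixed", fixed, se_fixed), ("random", random_eff, se_re)]:
    lo, hi = est - 1.96 * se, est + 1.96 * se
    print(f"{name}: {est:.3f} [{lo:.3f}, {hi:.3f}]")
```

With heterogeneous trials the random-effects interval is wider, which is exactly why the procedure asks for both.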
Testing statistical significance scores of sequence comparison methods with structure similarity
Directory of Open Access Journals (Sweden)
Leunissen Jack AM
2006-10-01
Full Text Available Abstract Background In the past years the Smith-Waterman sequence comparison algorithm has gained popularity due to improved implementations and rapidly increasing computing power. However, the quality and sensitivity of a database search is not only determined by the algorithm but also by the statistical significance testing for an alignment. The e-value is the most commonly used statistical validation method for sequence database searching. The CluSTr database and the Protein World database have been created using an alternative statistical significance test: a Z-score based on Monte-Carlo statistics. Several papers have described the superiority of the Z-score as compared to the e-value, using simulated data. We were interested in whether this could be validated when applied to existing, evolutionarily related protein sequences. Results All experiments are performed on the ASTRAL SCOP database. The Smith-Waterman sequence comparison algorithm with both e-value and Z-score statistics is evaluated, using ROC, CVE and AP measures. The BLAST and FASTA algorithms are used as reference. We find that two out of three Smith-Waterman implementations with e-value are better at predicting structural similarities between proteins than the Smith-Waterman implementation with Z-score. SSEARCH especially has very high scores. Conclusion The compute-intensive Z-score does not have a clear advantage over the e-value. The Smith-Waterman implementations give generally better results than their heuristic counterparts. We recommend using the SSEARCH algorithm combined with e-values for pairwise sequence comparisons.
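The Monte Carlo Z-score idea can be illustrated with a toy scorer; `best_ungapped_score` below is a deliberately simplified stand-in for Smith-Waterman (best ungapped diagonal match run), and the sequences are invented:

```python
import random

def best_ungapped_score(a, b):
    # Toy stand-in for Smith-Waterman: longest run of matches on any diagonal.
    best = 0
    for off in range(-len(b) + 1, len(a)):
        run = 0
        for i in range(len(a)):
            j = i - off
            if 0 <= j < len(b):
                run = run + 1 if a[i] == b[j] else 0
                best = max(best, run)
    return best

random.seed(1)
a = "MKTAYIAKQRQISFVKSHFSRQ"
b = "MKTAYIAKQRNISFVKSHFSRQ"   # one substitution relative to a

obs = best_ungapped_score(a, b)

# Monte Carlo null: shuffle one sequence, preserving its composition.
null = []
for _ in range(200):
    shuffled = "".join(random.sample(b, len(b)))
    null.append(best_ungapped_score(a, shuffled))

mean = sum(null) / len(null)
var = sum((s - mean) ** 2 for s in null) / (len(null) - 1)
z = (obs - mean) / var ** 0.5
print(obs, round(z, 1))
```

The Z-score costs hundreds of extra alignments per pair, which is the "compute-intensive" trade-off the abstract weighs against the e-value.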
Diedrich, Alice; Schlegl, Sandra; Greetfeld, Martin; Fumi, Markus; Voderholzer, Ulrich
2018-03-01
This study examines the statistical and clinical significance of symptom changes during an intensive inpatient treatment program with a strong psychotherapeutic focus for individuals with severe bulimia nervosa. 295 consecutively admitted bulimic patients were administered the Structured Interview for Anorexic and Bulimic Syndromes-Self-Rating (SIAB-S), the Eating Disorder Inventory-2 (EDI-2), the Brief Symptom Inventory (BSI), and the Beck Depression Inventory-II (BDI-II) at treatment intake and discharge. Results indicated statistically significant symptom reductions with large effect sizes regarding severity of binge eating and compensatory behavior (SIAB-S), overall eating disorder symptom severity (EDI-2), overall psychopathology (BSI), and depressive symptom severity (BDI-II) even when controlling for antidepressant medication. The majority of patients showed either reliable (EDI-2: 33.7%, BSI: 34.8%, BDI-II: 18.1%) or even clinically significant symptom changes (EDI-2: 43.2%, BSI: 33.9%, BDI-II: 56.9%). Patients with clinically significant improvement were less distressed at intake and less likely to suffer from a comorbid borderline personality disorder when compared with those who did not improve to a clinically significant extent. Findings indicate that intensive psychotherapeutic inpatient treatment may be effective in about 75% of severely affected bulimic patients. For the remaining non-responding patients, inpatient treatment might be improved through an even stronger focus on the reduction of comorbid borderline personality traits.
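The "reliable" versus "clinically significant" distinction reported here is commonly operationalized with the Jacobson-Truax reliable change index; a sketch with invented scores and an assumed clinical cutoff (not values from this study):

```python
import math

def reliable_change_index(pre, post, sd_pre, reliability):
    # Jacobson & Truax (1991): standard error of measurement, then of the
    # difference between two measurements.
    se_measure = sd_pre * math.sqrt(1 - reliability)
    se_diff = math.sqrt(2) * se_measure
    return (pre - post) / se_diff

# Hypothetical EDI-2-like scores (illustrative numbers, not from the study).
pre, post = 95.0, 60.0
sd_pre, reliability = 20.0, 0.90
cutoff_clinical = 70.0   # assumed clinical/functional cutoff

rci = reliable_change_index(pre, post, sd_pre, reliability)
reliable = abs(rci) > 1.96
clinically_significant = reliable and post < cutoff_clinical
print(round(rci, 2), reliable, clinically_significant)
```

A change is "reliable" when it exceeds measurement error (|RCI| > 1.96) and "clinically significant" when the patient additionally crosses the clinical cutoff, matching the two categories reported in the abstract.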
Ji, Jun; Ling, Jeffrey; Jiang, Helen; Wen, Qiaojun; Whitin, John C; Tian, Lu; Cohen, Harvey J; Ling, Xuefeng B
2013-03-23
Mass spectrometry (MS) has evolved to become the primary high-throughput tool for proteomics-based biomarker discovery. Multiple challenges in protein MS data analysis remain: large-scale and complex data set management; MS peak identification and indexing; and high-dimensional peak differential analysis with the false discovery rate (FDR) arising from concurrent statistical tests. "Turnkey" solutions are needed for biomarker investigations to rapidly process MS data sets and identify statistically significant peaks for subsequent validation. Here we present an efficient and effective solution, which provides experimental biologists easy access to "cloud" computing capabilities to analyze MS data. The web portal can be accessed at http://transmed.stanford.edu/ssa/. The presented web application supports online uploading and analysis of large-scale MS data with a simple user interface. This bioinformatic tool will facilitate the discovery of potential protein biomarkers using MS.
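Peak differential analysis with FDR control is typically done with a procedure such as Benjamini-Hochberg; a self-contained sketch with invented p-values (the abstract does not specify which FDR procedure the portal uses):

```python
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Return a boolean mask of p-values rejected at FDR level alpha."""
    p = np.asarray(pvals)
    m = len(p)
    order = np.argsort(p)
    thresholds = alpha * np.arange(1, m + 1) / m   # step-up thresholds i/m * alpha
    below = p[order] <= thresholds
    rejected = np.zeros(m, dtype=bool)
    if below.any():
        k = np.nonzero(below)[0].max()   # largest rank passing its threshold
        rejected[order[: k + 1]] = True
    return rejected

# Hypothetical peak p-values: three strong signals among noise.
pvals = [0.0001, 0.0004, 0.0019, 0.04, 0.1, 0.2, 0.37, 0.55, 0.78, 0.91]
mask = benjamini_hochberg(pvals, alpha=0.05)
print(mask.sum())
```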
Gaskin, Cadeyrn J; Happell, Brenda
2014-05-01
improvement. Most importantly, researchers should abandon the misleading practice of interpreting the results from inferential tests based solely on whether they are statistically significant (or not) and, instead, focus on reporting and interpreting effect sizes, confidence intervals, and significance levels. Nursing researchers also need to conduct and report a priori power analyses, and to address the issue of Type I experiment-wise error inflation in their studies. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
Statistical significance estimation of a signal within the GooFit framework on GPUs
Directory of Open Access Journals (Sweden)
Cristella Leonardo
2017-01-01
Full Text Available In order to test the computing capabilities of GPUs with respect to traditional CPU cores, a high-statistics toy Monte Carlo technique has been implemented both in ROOT/RooFit and in the GooFit framework with the purpose of estimating the statistical significance of the structure observed by CMS close to the kinematical boundary of the J/ψϕ invariant mass in the three-body decay B+ → J/ψϕK+. GooFit is an open data-analysis tool under development that interfaces ROOT/RooFit to the CUDA platform on NVIDIA GPUs. The optimized GooFit application running on GPUs hosted by servers in the Bari Tier2 provides striking speed-up performances with respect to the RooFit application parallelised on multiple CPUs by means of the PROOF-Lite tool. The considerable resulting speed-up, evident when comparing concurrent GooFit processes allowed by the CUDA Multi Process Service and a RooFit/PROOF-Lite process with multiple CPU workers, is presented and discussed in detail. By means of GooFit it has also been possible to explore the behaviour of a likelihood ratio test statistic in different situations in which Wilks' theorem may or may not apply because its regularity conditions are not satisfied.
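The role of Wilks' theorem in such toy Monte Carlo studies can be shown in a regular case, where the empirical null distribution of the likelihood-ratio statistic should match a χ² distribution; everything below is a generic sketch, unrelated to the GooFit/CMS analysis itself:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Toy-MC check of Wilks' theorem in a regular case: test mu = 0 vs free mu
# for Gaussian data with known sigma = 1. The LRT statistic is n * xbar^2,
# which should follow chi2 with 1 degree of freedom under the null.
n, n_toys = 50, 5000
lrt = np.array([n * rng.normal(0, 1, n).mean() ** 2 for _ in range(n_toys)])

# Empirical tail probability at the chi2(1) 95th percentile should be ~0.05.
crit = stats.chi2.ppf(0.95, df=1)
tail = np.mean(lrt > crit)
print(round(tail, 3))
```

Near a kinematical boundary the regularity conditions fail and the asymptotic χ² no longer holds, which is why the abstract's high-statistics toy approach is needed there.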
International Nuclear Information System (INIS)
DUDEK, J; SZPAK, B; FORNAL, B; PORQUET, M-G
2011-01-01
In this and the follow-up article we briefly discuss what we believe represents one of the most serious problems in contemporary nuclear structure: the question of the statistical significance of parametrizations of nuclear microscopic Hamiltonians and the implied predictive power of the underlying theories. In the present Part I, we introduce the main lines of reasoning of the so-called Inverse Problem Theory, an important sub-field of contemporary applied mathematics, illustrated here with the example of the Nuclear Mean-Field Approach.
Directory of Open Access Journals (Sweden)
Niangang Jiao
2018-05-01
Full Text Available With the increasing demand for high-resolution remote sensing images for mapping and monitoring the Earth’s environment, geometric positioning accuracy improvement plays a significant role in the image preprocessing step. Based on statistical learning theory, we propose a new method to improve the geometric positioning accuracy without ground control points (GCPs). Multi-temporal images from the ZY-3 satellite are tested, and the bias-compensated rational function model (RFM) is applied as the block adjustment model in our experiment. An easy and stable weighting strategy and the fast iterative shrinkage-thresholding algorithm (FISTA), which is widely used in the field of compressive sensing, are improved and utilized to define and solve the normal equation matrix. Then, the residual errors after traditional block adjustment are acquired and tested with the newly proposed inherent error compensation model based on statistical learning theory. The final results indicate that the geometric positioning accuracy of ZY-3 satellite imagery can be improved greatly with our proposed method.
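FISTA itself is compact; below is a generic sketch of the accelerated scheme on a lasso problem with synthetic data — the standard textbook algorithm, not the authors' modified weighting or block-adjustment code:

```python
import numpy as np

rng = np.random.default_rng(3)

# FISTA for the lasso: min_x 0.5*||Ax - b||^2 + lam * ||x||_1
A = rng.normal(size=(80, 40))
x_true = np.zeros(40)
x_true[:5] = [3, -2, 1.5, -1, 2]          # sparse ground truth
b = A @ x_true + 0.01 * rng.normal(size=80)

lam = 0.1
L = np.linalg.norm(A, 2) ** 2             # Lipschitz constant of the gradient

def soft(v, t):
    # Proximal operator of the l1 norm (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = y = np.zeros(40)
t = 1.0
for _ in range(500):
    x_new = soft(y - A.T @ (A @ y - b) / L, lam / L)   # proximal gradient step
    t_new = (1 + np.sqrt(1 + 4 * t**2)) / 2
    y = x_new + ((t - 1) / t_new) * (x_new - x)        # momentum/extrapolation
    x, t = x_new, t_new

print(np.round(x[:5], 1))
```

The extrapolation step is what upgrades the O(1/k) rate of plain proximal gradient descent to O(1/k²), which is the appeal of FISTA for large normal-equation systems.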
A Note on Comparing the Power of Test Statistics at Low Significance Levels.
Morris, Nathan; Elston, Robert
2011-01-01
It is an obvious fact that the power of a test statistic is dependent upon the significance (alpha) level at which the test is performed. It is perhaps a less obvious fact that the relative performance of two statistics in terms of power is also a function of the alpha level. Through numerous personal discussions, we have noted that even some competent statisticians have the mistaken intuition that relative power comparisons at traditional levels such as α = 0.05 will be roughly similar to relative power comparisons at very low levels, such as α = 5 × 10⁻⁸, which is commonly used in genome-wide association studies. In this brief note, we demonstrate that this notion is in fact quite wrong, especially with respect to comparing tests with differing degrees of freedom. In fact, at very low alpha levels the cost of additional degrees of freedom is often comparatively low. Thus we recommend that statisticians exercise caution when interpreting the results of power comparison studies which use alpha levels that will not be used in practice.
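The dependence of power on α can be checked numerically with noncentral χ² distributions; a sketch comparing a 1-df and a 2-df test at the same noncentrality (the value 30 is an arbitrary choice for illustration):

```python
from scipy import stats

def power(df, nc, alpha):
    # Power of a chi-square test: P(noncentral chi2 > central critical value).
    crit = stats.chi2.ppf(1 - alpha, df)
    return stats.ncx2.sf(crit, df, nc)

# Same noncentrality for both tests: the 1-df test captures all of the
# signal, the 2-df test "spends" a degree of freedom on a direction
# carrying none, so its power is lower at any alpha.
nc = 30.0
for alpha in (0.05, 5e-8):
    p1, p2 = power(1, nc, alpha), power(2, nc, alpha)
    print(alpha, round(p1, 4), round(p2, 4))
```

Tabulating the two columns across α levels makes the note's point concrete: the power gap between the tests is not constant in α, so comparisons at 0.05 need not transfer to 5 × 10⁻⁸.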
Energy Technology Data Exchange (ETDEWEB)
Crow, C.J.
1985-01-01
Middle Ordovician age Chickamauga Group carbonates crop out along the Birmingham and Murphrees Valley anticlines in central Alabama. The macrofossil contents on exposed surfaces of seven bioherms have been counted to determine their various paleontologic characteristics. Twelve groups of organisms are present in these bioherms. Dominant organisms include bryozoans, algae, brachiopods, sponges, pelmatozoans, stromatoporoids and corals. Minor accessory fauna include predators, scavengers and grazers such as gastropods, ostracods, trilobites, cephalopods and pelecypods. Vertical and horizontal niche zonation has been detected for some of the bioherm dwelling fauna. No one bioherm of those studied exhibits all 12 groups of organisms; rather, individual bioherms display various subsets of the total diversity. Statistical treatment (G-test) of the diversity data indicates a lack of statistical homogeneity of the bioherms, both within and between localities. Between-locality population heterogeneity can be ascribed to differences in biologic responses to such gross environmental factors as water depth and clarity, and energy levels. At any one locality, gross aspects of the paleoenvironments are assumed to have been more uniform. Significant differences among bioherms at any one locality may have resulted from patchy distribution of species populations, differential preservation and other factors.
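The G-test used here is the log-likelihood-ratio variant of the χ² test of homogeneity; a sketch with invented counts (SciPy exposes it through `chi2_contingency` with `lambda_="log-likelihood"`):

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical counts of four organism groups at two bioherm localities
# (illustrative numbers only, not data from the study).
table = np.array([
    [30, 12,  8, 5],   # locality A: bryozoans, algae, brachiopods, sponges
    [14, 25, 10, 9],   # locality B
])

# lambda_="log-likelihood" selects the G statistic instead of Pearson's X^2.
g, p, dof, expected = chi2_contingency(table, lambda_="log-likelihood")
print(round(g, 2), round(p, 4), dof)
```

A small p-value here corresponds to the "lack of statistical homogeneity" between localities that the abstract reports.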
Kellerer-Pirklbauer, Andreas
2016-04-01
Longer data series (e.g. >10 a) of ground temperatures in alpine regions are helpful to improve the understanding regarding the effects of present climate change on the distribution and thermal characteristics of seasonal frost- and permafrost-affected areas. Beginning in 2004 - and more intensively since 2006 - a permafrost and seasonal frost monitoring network was established in Central and Eastern Austria by the University of Graz. This network consists of c. 60 ground temperature (surface and near-surface) monitoring sites which are located at 1922-3002 m a.s.l., at latitude 46°55'-47°22'N and at longitude 12°44'-14°41'E. These data allow conclusions about general ground thermal conditions, potential permafrost occurrence, trends during the observation period, and regional patterns of changes. Calculations and analyses of several different temperature-related parameters were accomplished. At an annual scale a region-wide statistically significant warming during the observation period was revealed by e.g. an increase in mean annual temperature values (mean, maximum) or the significant lowering of the surface frost number (F+). At a seasonal scale no significant trend of any temperature-related parameter was in most cases revealed for spring (MAM) and autumn (SON). Winter (DJF) shows only a weak warming. In contrast, the summer (JJA) season reveals in general a significant warming as confirmed by several different temperature-related parameters such as e.g. mean seasonal temperature, number of thawing degree days, number of freezing degree days, or days without night frost. On a monthly basis August shows the statistically most robust and strongest warming of all months, although regional differences occur. Despite the fact that the general ground temperature warming during the last decade is confirmed by the field data in the study region, complications in trend analyses arise by temperature anomalies (e.g. warm winter 2006/07) or substantial variations in the winter
The Health Significance of Positive Emotions in Adulthood and Later Life.
Ong, Anthony D; Mroczek, Daniel K; Riffin, Catherine
2011-08-01
A growing body of literature supports a link between positive emotions and health in older adults. In this article, we review evidence of the effects of positive emotions on downstream biological processes and meaningful clinical endpoints, such as adult morbidity and mortality. We then present relevant predictions from lifespan theories that suggest changes in cognition and motivation may play an important role in explaining how positive emotions are well maintained in old age, despite pervasive declines in cognitive processes. We conclude by discussing how the application of psychological theory can inform greater understanding of the adaptive significance of positive emotions in adulthood and later life.
Cucherat, Michel; Laporte, Silvy
2017-09-01
The use of statistical tests is central to the clinical trial. At the statistical level, obtaining a P value below the significance threshold provides information about the plausibility of the existence of a treatment effect. Even with "P < 0.05", however, the risk that the result is a false positive can be very high. This is the case if the power is low, if there is an inflation of the alpha risk, or if the result is exploratory or a chance discovery. This possibility is important to take into consideration when interpreting the results of clinical trials, in order to avoid pushing ahead results that are significant in appearance but are likely to be actually false positives. Copyright © 2017 Société française de pharmacologie et de thérapeutique. Published by Elsevier Masson SAS. All rights reserved.
Papageorgiou, Spyridon N; Kloukos, Dimitrios; Petridis, Haralampos; Pandis, Nikolaos
2015-10-01
To assess the hypothesis that there is excessive reporting of statistically significant studies published in prosthodontic and implantology journals, which could indicate selective publication. The last 30 issues of 9 journals in prosthodontics and implant dentistry were hand-searched for articles with statistical analyses. The percentages of significant and non-significant results were tabulated by parameter of interest. Univariable/multivariable logistic regression analyses were applied to identify possible predictors of reporting statistically significant findings. The results of this study were compared with similar studies in dentistry with random-effects meta-analyses. Of the 2323 included studies, 71% reported statistically significant results, with the significant results ranging from 47% to 86%. Multivariable modeling identified that geographical area and involvement of a statistician were predictors of statistically significant results. Compared to interventional studies, the odds that in vitro and observational studies would report statistically significant results were increased by 1.20 times (OR: 2.20, 95% CI: 1.66-2.92) and 0.35 times (OR: 1.35, 95% CI: 1.05-1.73), respectively. The probability of statistically significant results from randomized controlled trials was significantly lower compared to various study designs (difference: 30%, 95% CI: 11-49%). Likewise the probability of statistically significant results in prosthodontics and implant dentistry was lower compared to other dental specialties, but this result did not reach statistical significance (P>0.05). The majority of studies identified in the fields of prosthodontics and implant dentistry presented statistically significant results. The same trend existed in publications of other specialties in dentistry. Copyright © 2015 Elsevier Ltd. All rights reserved.
Indirectional statistics and the significance of an asymmetry discovered by Birch
International Nuclear Information System (INIS)
Kendall, D.G.; Young, G.A.
1984-01-01
Birch (1982, Nature, 298, 451) reported an apparent 'statistical asymmetry of the Universe'. The authors here develop 'indirectional analysis' as a technique for investigating statistical effects of this kind and conclude that the reported effect (whatever may be its origin) is strongly supported by the observations. The estimated pole of the asymmetry is at RA 13h 30m, Dec. −37°. The angular error in its estimation is unlikely to exceed 20-30°. (author)
Directory of Open Access Journals (Sweden)
Mashhood Ahmed Sheikh
2017-08-01
mediate the association between childhood adversity and ADS in adulthood. However, when education was excluded as a mediator-response confounding variable, the indirect effect of childhood adversity on ADS in adulthood was statistically significant (p < 0.05). This study shows that a careful inclusion of potential confounding variables is important when assessing mediation.
Directory of Open Access Journals (Sweden)
Vujović Svetlana R.
2013-01-01
Full Text Available This paper illustrates the utility of multivariate statistical techniques for the analysis and interpretation of water quality data sets and the identification of pollution sources/factors, with a view to getting better information about the water quality and the design of a monitoring network for effective management of water resources. Multivariate statistical techniques, such as factor analysis (FA)/principal component analysis (PCA) and cluster analysis (CA), were applied for the evaluation of variations and for the interpretation of a water quality data set of natural water bodies obtained during the 2010 year of monitoring of 13 parameters at 33 different sites. FA/PCA attempts to explain the correlations between the observations in terms of the underlying factors, which are not directly observable. Factor analysis is applied to physico-chemical parameters of natural water bodies with the aim of classification and data summation as well as segmentation of heterogeneous data sets into smaller homogeneous subsets. Factor loadings were categorized as strong and moderate, corresponding to absolute loading values of >0.75 and 0.75-0.50, respectively. Four principal factors were obtained with eigenvalues >1, explaining more than 78% of the total variance in the water data sets, which is adequate to give good prior information regarding data structure. Each factor that is significantly related to specific variables represents a different dimension of water quality. The first factor, F1, accounts for 28% of the total variance and represents the hydrochemical dimension of water quality. The second factor, F2, accounts for 18% of the total variance and may be taken as a factor of water eutrophication. The third factor, F3, accounts for 17% of the total variance and represents the influence of point sources of pollution on water quality. The fourth factor, F4, accounts for 13% of the total variance and may be taken as an ecological dimension of water quality. Cluster analysis (CA) is an
Energy Technology Data Exchange (ETDEWEB)
Fhager, V
2000-01-01
In order to make correct predictions of the second moment of statistical nuclear variables, such as the number of fissions and the number of thermalized neutrons, the dependence of the energy distribution of the source particles on their number should be considered. It has been pointed out recently that neglecting this number dependence in accelerator driven systems might result in bad estimates of the second moment, and this paper contains qualitative and quantitative estimates of the size of these effects. We walk towards the requested results in two steps. First, models of the number dependent energy distributions of the neutrons that are ejected in the spallation reactions are constructed, both by simple assumptions and by extracting energy distributions of spallation neutrons from a high-energy particle transport code. Then, the second moment of nuclear variables in a sub-critical reactor, into which spallation neutrons are injected, is calculated. The results from second moment calculations using number dependent energy distributions for the source neutrons are compared to those where only the average energy distribution is used. Two physical models are employed to simulate the neutron transport in the reactor. One is analytical, treating only slowing down of neutrons by elastic scattering in the core material. For this model, equations are written down and solved for the second moment of thermalized neutrons that include the distribution of energy of the spallation neutrons. The other model utilizes Monte Carlo methods for tracking the source neutrons as they travel inside the reactor material. Fast and thermal fission reactions are considered, as well as neutron capture and elastic scattering, and the second moment of the number of fissions, the number of neutrons that leaked out of the system, etc. are calculated. Both models use a cylindrical core with a homogenous mixture of core material. Our results indicate that the number dependence of the energy
DEFF Research Database (Denmark)
Madsen, Tobias
2017-01-01
In the present thesis I develop, implement and apply statistical methods for detecting genomic elements implicated in cancer development and progression. This is done in two separate bodies of work. The first uses the somatic mutation burden to distinguish cancer driver mutations from passenger mutations…
Evaluation of significance of positive familial history in prevalence of hypertension in Isfahan
Directory of Open Access Journals (Sweden)
Tavassoli A
1997-09-01
Full Text Available Hypertension is one of the most important modifiable risk factors of vascular heart disease. Control of hypertension in different age groups has a significant effect upon the control and prevention of vascular heart disease. A familial pattern is observed in the distribution of blood pressure in different societies. Family history of hypertension has a profound effect on the future risk of developing hypertension. The blood pressure of approximately 8150 inhabitants of Isfahan aged above 18 years was measured during 1993-94. Blood pressure measurements were performed according to the standards set by the WHO, i.e., on two separate occasions, in the sitting position, and from both arms. A questionnaire was completed consisting of 26 questions, including questions regarding history of hypertension in first- and second-degree relatives. Cases with a blood pressure of 140/90 mmHg or more were referred to the Cardiovascular Research Center of Isfahan for further evaluation. Mean systolic and diastolic blood pressure was higher in cases with a positive family history of hypertension. In this study, 37.4% of the men with hypertension and 45.4% of hypertensive women had a positive history of hypertension in first-degree relatives. The association between positive family history and hypertension was not significant in men (P=0.62), but it was significant in women (P=0.000). This difference was less pronounced in the older age groups, which could be explained by the illiteracy of most of the older cases and their ignorance of the existence of hypertension in family members. After correcting for the effects of confounding factors, it appears that positive family history has a stronger association with the development of hypertension in women. Moreover, positive family history is a strong prognostic factor in the likelihood of hypertension in the children of affected cases. These findings emphasize the importance of routine blood pressure measurement in children and
Statistical Analysis and Evaluation of the Depth of the Ruts on Lithuanian State Significance Roads
Directory of Open Access Journals (Sweden)
Erinijus Getautis
2011-04-01
Full Text Available The aim of this work is to gather information about rut depth on the national flexible-pavement roads, to determine its statistical dispersion indices, and to assess its compliance with the required standards. An analysis of scientific work on the appearance of ruts in asphalt and their influence on driving is presented in this work. Dynamical models of rut formation in asphalt are presented as well. Experimental data on rut depth dispersion on the Lithuanian national highway Vilnius – Kaunas are presented. Conclusions are formulated and presented. Article in Lithuanian
Directory of Open Access Journals (Sweden)
Dominic Beaulieu-Prévost
2006-03-01
Full Text Available For the last 50 years of research in quantitative social sciences, the empirical evaluation of scientific hypotheses has been based on whether or not the null hypothesis is rejected. However, more than 300 articles have demonstrated that this method is problematic. In summary, null hypothesis testing (NHT) is unfalsifiable, its results depend directly on sample size, and the null hypothesis is both improbable and not plausible. Consequently, alternatives to NHT such as confidence intervals (CI) and measures of effect size are starting to be used in scientific publications. The purpose of this article is, first, to provide the conceptual tools necessary to implement an approach based on confidence intervals, and second, to briefly demonstrate why such an approach is an interesting alternative to an approach based on NHT. As demonstrated in the article, the proposed CI approach avoids most problems related to a NHT approach and can often improve the scientific and contextual relevance of the statistical interpretations by testing range hypotheses instead of a point hypothesis and by defining the minimal value of a substantial effect. The main advantage of such a CI approach is that it replaces the notion of statistical power with an easily interpretable three-value logic (probable presence of a substantial effect, probable absence of a substantial effect, and probabilistic undetermination). The demonstration includes a complete example.
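The three-value logic described in this abstract can be sketched in a few lines. This is an illustrative sketch, not code from the article: the 1.96 critical value (a normal approximation) and the parameter `delta`, the smallest effect considered substantial, are assumptions chosen for the example.

```python
import math
import statistics as stats

def classify_effect(sample, delta, z=1.96):
    """Three-value logic from a confidence interval (illustrative sketch).

    delta: smallest effect size considered substantial (chosen a priori).
    Returns 'present' if the whole CI lies beyond +/-delta,
    'absent' if the CI lies entirely within (-delta, +delta),
    and 'undetermined' otherwise.
    """
    n = len(sample)
    mean = stats.fmean(sample)
    se = stats.stdev(sample) / math.sqrt(n)
    lo, hi = mean - z * se, mean + z * se
    if lo > delta or hi < -delta:
        return "present"       # probable presence of a substantial effect
    if -delta < lo and hi < delta:
        return "absent"        # probable absence of a substantial effect
    return "undetermined"      # CI straddles the substantiality threshold
```

Unlike a point-null test, the verdict depends on where the whole interval falls relative to the substantiality threshold, not on whether zero is excluded.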
Statistical determination of significant curved I-girder bridge seismic response parameters
Seo, Junwon
2013-06-01
Curved steel bridges are commonly used at interchanges in transportation networks, and more of these structures continue to be designed and built in the United States. Though the use of these bridges continues to increase in locations that experience high seismicity, the effects of curvature and other parameters on their seismic behavior have been neglected in current risk assessment tools. These tools can evaluate the seismic vulnerability of a transportation network using fragility curves. One critical component of fragility curve development for curved steel bridges is the completion of sensitivity analyses that help identify influential parameters related to their seismic response. In this study, an accessible inventory of existing curved steel girder bridges located primarily in the Mid-Atlantic United States (MAUS) was used to establish statistical characteristics used as inputs for a seismic sensitivity study. Critical seismic response quantities were captured using 3D nonlinear finite element models. Influential parameters were identified from these quantities using statistical tools incorporating a Plackett-Burman experimental design (PBD), including Pareto optimal plots and prediction profiler techniques. The findings revealed that the influential parameters included the number of spans, radius of curvature, maximum span length, girder spacing, and cross-frame spacing. These parameters showed varying levels of influence on the critical bridge response.
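A Plackett-Burman screening design of the kind used in this study estimates the main effects of many two-level factors from very few runs. The sketch below is a generic illustration, not the study's actual design or bridge data: it builds the standard 8-run design for up to 7 factors from cyclic shifts of a generator row and computes each factor's main effect as the difference of response means at the high and low levels.

```python
# Standard generator row for the 8-run Plackett-Burman design (7 factors).
GEN = [1, 1, 1, -1, 1, -1, -1]

def pb8_design():
    """8-run design: 7 cyclic shifts of the generator plus an all-minus row."""
    rows = [GEN[-i:] + GEN[:-i] for i in range(7)]
    rows.append([-1] * 7)
    return rows

def main_effects(design, y):
    """Main effect of each factor: mean response at +1 minus mean at -1."""
    k = len(design[0])
    effects = []
    for j in range(k):
        plus = [yi for row, yi in zip(design, y) if row[j] == 1]
        minus = [yi for row, yi in zip(design, y) if row[j] == -1]
        effects.append(sum(plus) / len(plus) - sum(minus) / len(minus))
    return effects
```

Because the columns are mutually orthogonal and balanced, a response that depends linearly on a few factors yields their effects exactly, while inactive factors screen out with zero effect.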
Directory of Open Access Journals (Sweden)
Osamu Watanabe
2011-05-01
Full Text Available The human visual system can acquire the statistical structure in temporal sequences of object feature changes, such as changes in shape, color, and their combination. Here we investigate whether statistical learning for spatial position and shape changes operates separately or not. It is known that the visual system processes these two types of information separately; spatial information is processed in the parietal cortex, whereas object shapes and colors are detected in the temporal pathway, and we subsequently perceive the bound information from the two streams. We examined whether statistical learning operates before or after binding of the shape and the spatial information by using the “re-paired triplet” paradigm proposed by Turk-Browne, Isola, Scholl, and Treat (2008). The result showed that observers acquired combined sequences of shape and position changes, but no statistical information in the individual sequences was obtained. This finding suggests that visual statistical learning works after binding of the temporal sequences of shapes and spatial structures and would operate in the higher-order visual system; this is consistent with recent ERP (Abla & Okanoya, 2009) and fMRI (Turk-Browne, Scholl, Chun, & Johnson, 2009) studies.
Directory of Open Access Journals (Sweden)
Jane R Allison
Full Text Available Evolutionary arms races between pathogens and their hosts may be manifested as selection for rapid evolutionary change of key genes, and are sometimes detectable through sequence-level analyses. In the case of protein-coding genes, such analyses frequently predict that specific codons are under positive selection. However, detecting positive selection can be non-trivial, and false positive predictions are a common concern in such analyses. It is therefore helpful to place such predictions within a structural and functional context. Here, we focus on the p19 protein from tombusviruses. P19 is a homodimer that sequesters siRNAs, thereby preventing the host RNAi machinery from shutting down viral infection. Sequence analysis of the p19 gene is complicated by the fact that it is constrained at the sequence level by overprinting of a viral movement protein gene. Using homology modeling, in silico mutation and molecular dynamics simulations, we assess how non-synonymous changes to two residues involved in forming the dimer interface (one invariant, and one predicted to be under positive selection) impact molecular function. Interestingly, we find that both observed variation and potential variation (where a non-synonymous change to p19 would be synonymous for the overprinted movement protein) do not significantly impact protein structure or RNA binding. Consequently, while several methods identify residues at the dimer interface as being under positive selection, MD results suggest they are functionally indistinguishable from a site that is free to vary. Our analyses serve as a caveat to using sequence-level analyses in isolation to detect and assess positive selection, and emphasize the importance of also accounting for how non-synonymous changes impact structure and function.
Directory of Open Access Journals (Sweden)
Kuroda Takeshi
2012-09-01
Full Text Available Abstract Background The kidney is a major target organ in systemic amyloidosis, which often produces renal manifestations including proteinuria and elevated serum creatinine (Cr). The correlation between the amount of amyloid deposits and clinical parameters is not known. The aim of this study was to clarify the correlation between the amyloid-occupied area in renal biopsy specimens and clinical parameters. Methods Fifty-eight patients with an established diagnosis of AL amyloidosis participated in the study. All patients showed amyloid deposits in renal biopsies. We retrospectively investigated the correlation between clinical data and the amyloid-occupied area in whole renal biopsy specimens. Results The area occupied by amyloid was less than 10% in 57 of the 58 patients, and under 2% in 40. For statistical analyses, %amyloid-positive areas were transformed to common logarithmic values (Log10%amyloid). Cr showed a significant correlation with Log10%amyloid, and estimated glomerular filtration rate (eGFR) showed a significant negative correlation. Patient age, creatinine clearance (Ccr), blood urea nitrogen, and urinary protein were not significantly correlated with Log10%amyloid. The correlation with other clinical factors, such as sex and serum concentrations of total protein, albumin, immunoglobulins, and complement, was evaluated; none of these factors significantly correlated with Log10%amyloid. According to sex- and age-adjusted multiple linear regression analysis, Log10%amyloid had a significant positive association with Cr and a significant negative association with eGFR. Conclusion There is a significant association between the amyloid-positive area in renal tissue and renal function, especially Cr and eGFR. The levels of Cr and eGFR may be markers of the amount of amyloid in renal tissue.
Reporting instructions significantly impact false positive rates when reading chest radiographs
Energy Technology Data Exchange (ETDEWEB)
Robinson, John W.; Brennan, Patrick C.; Mello-Thoms, Claudia; Lewis, Sarah J. [The University of Sydney, Medical Image Optimisation and Perception Group, Discipline of Medical Radiation Sciences, Faculty of Health Sciences, Lidcombe, NSW (Australia)
2016-10-15
To determine the impact of specific reporting tasks on the performance of radiologists when reading chest radiographs. Ten experienced radiologists read a set of 40 postero-anterior (PA) chest radiographs: 21 nodule free and 19 with a proven solitary nodule. There were two reporting conditions: an unframed task (UFT) to report any abnormality and a framed task (FT) reporting only lung nodule/s. Jackknife free-response operating characteristic (JAFROC) figure of merit (FOM), specificity, location sensitivity and number of true positive (TP), false positive (FP), true negative (TN) and false negative (FN) decisions were used for analysis. JAFROC FOM for tasks showed a significant reduction in performance for framed tasks (P = 0.006) and an associated decrease in specificity (P = 0.011) but no alteration to the location sensitivity score. There was a significant increase in number of FP decisions made during framed versus unframed tasks for nodule-containing (P = 0.005) and nodule-free (P = 0.011) chest radiographs. No significant differences in TP were recorded. Radiologists report more FP decisions when given specific reporting instructions to search for nodules on chest radiographs. The relevance of clinical history supplied to radiologists is called into question and may induce a negative effect. (orig.)
Reporting instructions significantly impact false positive rates when reading chest radiographs
International Nuclear Information System (INIS)
Robinson, John W.; Brennan, Patrick C.; Mello-Thoms, Claudia; Lewis, Sarah J.
2016-01-01
To determine the impact of specific reporting tasks on the performance of radiologists when reading chest radiographs. Ten experienced radiologists read a set of 40 postero-anterior (PA) chest radiographs: 21 nodule free and 19 with a proven solitary nodule. There were two reporting conditions: an unframed task (UFT) to report any abnormality and a framed task (FT) reporting only lung nodule/s. Jackknife free-response operating characteristic (JAFROC) figure of merit (FOM), specificity, location sensitivity and number of true positive (TP), false positive (FP), true negative (TN) and false negative (FN) decisions were used for analysis. JAFROC FOM for tasks showed a significant reduction in performance for framed tasks (P = 0.006) and an associated decrease in specificity (P = 0.011) but no alteration to the location sensitivity score. There was a significant increase in number of FP decisions made during framed versus unframed tasks for nodule-containing (P = 0.005) and nodule-free (P = 0.011) chest radiographs. No significant differences in TP were recorded. Radiologists report more FP decisions when given specific reporting instructions to search for nodules on chest radiographs. The relevance of clinical history supplied to radiologists is called into question and may induce a negative effect. (orig.)
The SACE Review Panel's Final Report: Significant Flaws in the Analysis of Statistical Data
Gregory, Kelvin
2006-01-01
The South Australian Certificate of Education (SACE) is a credential and formal qualification within the Australian Qualifications Framework. A recent review of the SACE outlined a number of recommendations for significant changes to this certificate. These recommendations were the result of a process that began with the review panel…
Holtzman, Jessica N; Miller, Shefali; Hooshmand, Farnaz; Wang, Po W; Chang, Kiki D; Hill, Shelley J; Rasgon, Natalie L; Ketter, Terence A
2015-07-01
The strengths and limitations of considering childhood- and adolescent-onset bipolar disorder (BD) separately versus together remain to be established. We assessed this issue. BD patients referred to the Stanford Bipolar Disorder Clinic during 2000-2011 were assessed with the Systematic Treatment Enhancement Program for BD Affective Disorders Evaluation. Patients with childhood- and adolescent-onset were compared to those with adult-onset for 7 unfavorable bipolar illness characteristics with replicated associations with early onset. Among 502 BD outpatients, those with childhood- and adolescent- (13-18 years, N=218) onset had significantly higher rates for 4/7 unfavorable illness characteristics, including lifetime comorbid anxiety disorder, at least ten lifetime mood episodes, lifetime alcohol use disorder, and prior suicide attempt, than those with adult-onset (>18 years, N=174). Childhood- but not adolescent-onset BD patients also had significantly higher rates of first-degree relative with mood disorder, lifetime substance use disorder, and rapid cycling in the prior year. Patients with pooled childhood/adolescent-onset compared to adult-onset had significantly higher rates for 5/7 of these unfavorable illness characteristics, while patients with childhood- compared to adolescent-onset had significantly higher rates for 4/7. A Caucasian, insured, suburban, low-substance-abuse, American specialty-clinic-referred sample limits generalizability. Onset age is based on retrospective recall. Childhood- compared to adolescent-onset BD was more robustly related to unfavorable bipolar illness characteristics, so pooling these groups attenuated such relationships. Further study is warranted to determine the extent to which adolescent-onset BD represents an intermediate phenotype between childhood- and adult-onset BD. Copyright © 2015 Elsevier B.V. All rights reserved.
Massey, J. L.
1976-01-01
The very low error probability obtained with long error-correcting codes results in a very small number of observed errors in simulation studies of practical size and renders the usual confidence interval techniques inapplicable to the observed error probability. A natural extension of the notion of a 'confidence interval' is made and applied to such determinations of error probability by simulation. An example is included to show the surprisingly great significance of as few as two decoding errors in a very large number of decoding trials.
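Massey's point can be made concrete with the Poisson approximation commonly used for rare decoding errors. The sketch below is an illustrative reconstruction, not the paper's method: given k observed errors in n independent trials, it finds an upper confidence bound on the error probability by bisection. With k = 0 it reproduces the familiar "rule of three" bound of roughly 3/n, and even k = 2 errors already pin the bound near 6.3/n.

```python
import math

def poisson_upper_bound(k, n, alpha=0.05):
    """Upper confidence bound on error probability p, given k observed
    errors in n independent trials (Poisson approximation, valid when
    p is small and n is large). Solves P(X <= k; mean = n*p) = alpha
    for p by bisection."""
    def tail(p):
        m = n * p
        return sum(math.exp(-m) * m**i / math.factorial(i) for i in range(k + 1))
    lo, hi = 0.0, 1.0
    for _ in range(100):          # bisect to ~2**-100 precision
        mid = (lo + hi) / 2
        if tail(mid) > alpha:     # tail still too heavy: true bound is larger
            lo = mid
        else:
            hi = mid
    return hi
```

For k = 0 the equation reduces to exp(-n*p) = alpha, so the 95% bound is ln(20)/n ≈ 3/n, which is why even zero observed errors carry quantitative information.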
Extra-nodal extension is a significant prognostic factor in lymph node positive breast cancer.
Directory of Open Access Journals (Sweden)
Sura Aziz
Full Text Available Presence of lymph node (LN) metastasis is a strong prognostic factor in breast cancer, whereas the importance of extra-nodal extension and other nodal tumor features has not yet been fully recognized. Here, we examined microscopic features of lymph node metastases and their prognostic value in a population-based cohort of node positive breast cancer (n = 218), as part of the prospective Norwegian Breast Cancer Screening Program NBCSP (1996-2009). Sections were reviewed for the largest metastatic tumor diameter (TD-MET), nodal afferent and efferent vascular invasion (AVI and EVI), extra-nodal extension (ENE), number of ENE foci, as well as circumferential (CD-ENE) and perpendicular (PD-ENE) diameter of extra-nodal growth. Number of positive lymph nodes, EVI, and PD-ENE were significantly increased with larger primary tumor (PT) diameter. Univariate survival analysis showed that several features of nodal metastases were associated with disease-free survival (DFS) or breast cancer specific survival (BCSS). Multivariate analysis demonstrated an independent prognostic value of PD-ENE (with 3 mm as cut-off value) in predicting DFS and BCSS, along with number of positive nodes and histologic grade of the primary tumor (for DFS: P = 0.01, P = 0.02, P = 0.01, respectively; for BCSS: P = 0.02, P = 0.008, P = 0.02, respectively). To conclude, the extent of ENE by its perpendicular diameter was independently prognostic and should be considered in line with nodal tumor burden in treatment decisions of node positive breast cancer.
Jeon, Eun-Ju; Park, Yong-Soo; Park, Shi-Nae; Park, Kyoung-Ho; Kim, Dong-Hyun; Nam, In-Chul; Chang, Ki-Hong
2013-01-01
Orthostatic dizziness (OD) and positional dizziness (PD) are common conditions in the dizziness clinic, yet the two conditions are not clearly separated. We aimed to evaluate the clinical significance of simple OD and of OD combined with PD for the diagnosis of benign paroxysmal positional vertigo (BPPV) and orthostatic intolerance (OI). Patients presenting with OD (n=102) were divided into two groups according to their symptoms: group PO, presenting with PD as well as OD; and group O, presenting with OD only. A thorough medical history, physical examination, and vestibular function tests were performed to identify the etiology of the dizziness. Orthostatic vital sign measurement (OVSM) was used to diagnose OI. The majority of patients were in group PO (87.3%). BPPV was the most common cause of OD for all patients (36.3%) and for group PO (37.1%), while OI was the most common etiology for group O (38.5%). A total of 17 (16.7%) OI patients were identified by the OVSM test. Orthostatic hypotension (n=10) was most frequently found, followed by orthostatic hypertension (n=5) and orthostatic tachycardia (n=2). Group O showed a significantly higher percentage of OI (38.5%) than group PO (13.5%) (P=0.039). It is suggested that orthostatic testing such as OVSM or the head-up tilt table test should be performed as an initial work-up for patients with simple OD. Positional tests for BPPV should be considered an essential diagnostic test for patients with OD, even when their dizziness is not associated with PD. Copyright © 2013 Elsevier Inc. All rights reserved.
Hayslett, H T
1991-01-01
Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the
Saha, Ranajit; Pan, Sudip; Chattaraj, Pratim K
2016-11-05
The validity of the maximum hardness principle (MHP) is tested in the cases of 50 chemical reactions, most of which are organic in nature and exhibit anomeric effect. To explore the effect of the level of theory on the validity of MHP in an exothermic reaction, B3LYP/6-311++G(2df,3pd) and LC-BLYP/6-311++G(2df,3pd) (def2-QZVP for iodine and mercury) levels are employed. Different approximations like the geometric mean of hardness and combined hardness are considered in case there are multiple reactants and/or products. It is observed that, based on the geometric mean of hardness, while 82% of the studied reactions obey the MHP at the B3LYP level, 84% of the reactions follow this rule at the LC-BLYP level. Most of the reactions possess the hardest species on the product side. A 50% null hypothesis is rejected at a 1% level of significance.
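The geometric-mean criterion used in this study to handle multiple reactants or products can be sketched directly. The hardness values in the example below are hypothetical placeholders, not values from the paper; the check simply compares the geometric mean hardness of the product side against the reactant side.

```python
import math

def geometric_mean(values):
    """Geometric mean computed via logs to avoid overflow for many terms."""
    return math.exp(sum(math.log(v) for v in values) / len(values))

def obeys_mhp(reactant_hardness, product_hardness):
    """Maximum hardness principle check (sketch): the reaction obeys MHP,
    under the geometric-mean approximation, if the geometric mean hardness
    of the products exceeds that of the reactants. Hardness values (eV)
    passed in here are hypothetical, not taken from the paper."""
    return geometric_mean(product_hardness) > geometric_mean(reactant_hardness)
```

Applying such a check across a reaction set and counting the fraction of True results is what yields percentages like the 82% (B3LYP) and 84% (LC-BLYP) reported in the abstract.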
International Nuclear Information System (INIS)
Park, Myung Hwan; Seo, Jeong Min; Choi, Byeong Gi; Shin, Eun Hyeok; Song, Gi Won
2011-01-01
This study statistically analyzed differences in the stability of the respiratory period according to patient position and the use of a positioning device, in order to assess the tendency and usefulness of such devices. Respiratory signals were recorded for 20 minutes in each of the supine and prone positions in 11 subjects. For 7 patients whose respiratory period was poorly maintained in the prone position, respiratory signals were also recorded while a belly board was used, to analyze changes in respiration and stability before and after use of the device. A stable, well-maintained respiratory period was observed in 54.5% of cases in the supine position, better than the 36.4% in the prone position. Of the 7 patients with poorly maintained respiratory periods in the prone position who used belly boards, 6 patients (85%) showed a significantly different pattern of respiratory period maintenance after use, and 4 patients showed a significantly improved stability of the respiratory period. This study may contribute to the maintenance of the respiratory period and respiratory stability, as the optimal position for maintaining respiration and the use of a device such as a belly board can be decided through statistical analysis of respiratory signals, even though patient position and device use are ultimately determined by the beam arrangement, the treatment site, the target location, and the treatment plan.
Okazaki, Hisanori; Ishimura, Eiji; Okuno, Senji; Norimine, Kyoko; Yamakawa, Kenjiro; Yamakawa, Tomoyuki; Shoji, Shigeichi; Nishizawa, Yoshiki; Inaba, Masaaki
2013-01-01
Serum magnesium (Mg) levels have been associated with muscle performance in the general population. We hypothesized that serum Mg would be associated with muscle quality in hemodialysis patients. A total of 310 patients were examined (age: 58 ± 12 years, hemodialysis duration: 6.4 ± 6.0 years, 60.6% men, and 36.1% diabetics). Arm lean mass was measured by dual energy X-ray absorptiometry (DXA) on the dominant side. Arm muscle quality was defined as the ratio of the handgrip strength to the arm lean mass of the same side (kg/kg). Serum Mg was 1.15 ± 0.16 mmol/L (2.8 ± 0.4 mg/dL), being higher than the reference range of normal subjects. There was a significant negative correlation between muscle quality and age (r = -0.326, p<0.0001) and duration of hemodialysis (r = -0.253, p<0.0001). The muscle quality of the diabetics was significantly lower than that of the non-diabetics (p<0.001). There was a significant, positive correlation between muscle quality and serum Mg (r = 0.118, p<0.05), but not serum calcium or phosphate. In multiple regression analysis, age, gender, hemodialysis duration, diabetes, and serum Mg (β = 0.129, p<0.05) were significantly and independently associated with muscle quality (R² = 0.298, p<0.0001). These results demonstrated that a lower serum Mg concentration was significantly associated with poor muscle quality in hemodialysis patients. Further studies are needed to explore the mechanism by which lower serum Mg affects muscle quality.
Kim, Kyungmi; Johnson, Marcia K
2015-04-01
Well-being and subjective experience of a coherent world depend on our sense of 'self' and relations between the self and the environment (e.g. people, objects and ideas). The ventromedial prefrontal cortex (vMPFC) is involved in self-related processing, and disrupted vMPFC activity is associated with disruptions of emotional/social functioning (e.g. depression and autism). Clarifying precise function(s) of vMPFC in self-related processing is an area of active investigation. In this study, we sought to more specifically characterize the function of vMPFC in self-related processing, focusing on two alternative accounts: (i) assignment of positive subjective value to self-related information and (ii) assignment of personal significance to self-related information. During functional magnetic resonance imaging (fMRI), participants imagined owning objects associated with either their perceived ingroup or outgroup. We found that for ingroup-associated objects, vMPFC showed greater activity for objects with increased than decreased post-ownership preference. In contrast, for outgroup-associated objects, vMPFC showed greater activity for objects with decreased than increased post-ownership preference. Our findings support the idea that the function of vMPFC in self-related processing may not be to represent/evaluate the 'positivity' or absolute preference of self-related information but to assign personal significance to it based on its meaning/function for the self. © The Author (2014). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.
Hashim, Muhammad Jawad
2010-09-01
Post-hoc secondary data analysis with no prespecified hypotheses has been discouraged by textbook authors and journal editors alike. Unfortunately no single term describes this phenomenon succinctly. I would like to coin the term "sigsearch" to define this practice and bring it within the teaching lexicon of statistics courses. Sigsearch would include any unplanned, post-hoc search for statistical significance using multiple comparisons of subgroups. It would also include data analysis with outcomes other than the prespecified primary outcome measure of a study as well as secondary data analyses of earlier research.
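The false-positive inflation that sigsearch produces is easy to demonstrate by simulation. The sketch below is illustrative, with arbitrary subgroup counts and sample sizes: each simulated "study" runs 20 pure-noise subgroup comparisons at the nominal 5% level and counts how often at least one comes out "significant"; the family-wise rate lands near 1 − 0.95²⁰ ≈ 64% rather than 5%.

```python
import random
import statistics as stats

def subgroup_false_positive_rate(n_subgroups=20, n_per_group=30,
                                 n_sims=1000, z_crit=1.96, seed=1):
    """Simulate 'sigsearch': run many two-sample z-tests on pure-noise
    subgroups per study and return the fraction of studies in which at
    least one comparison reaches |z| > z_crit."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sims):
        found = False
        for _ in range(n_subgroups):
            a = [rng.gauss(0, 1) for _ in range(n_per_group)]
            b = [rng.gauss(0, 1) for _ in range(n_per_group)]
            se = ((stats.variance(a) + stats.variance(b)) / n_per_group) ** 0.5
            zstat = (stats.fmean(a) - stats.fmean(b)) / se
            if abs(zstat) > z_crit:
                found = True
                break
        if found:
            hits += 1
    return hits / n_sims
```

The simulation makes the editorial point quantitative: with 20 unplanned comparisons, finding "something significant" in null data is the expected outcome, not the exception.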
Simulation of statistical systems with not necessarily real and positive probabilities
International Nuclear Information System (INIS)
Kalkreuter, T.
1991-01-01
A new method to determine expectation values of observables in statistical systems with not necessarily real and positive probabilities is proposed. It is tested in a numerical study of the two-dimensional O(3)-symmetric nonlinear σ-model with Symanzik's one-loop improved lattice action. This model is simulated as a polymer system with field-dependent activities which can be made positive definite or indefinite by adjusting additive constants of the action. For a system with indefinite activities the new proposal is found to work. It is also verified that local observables are not affected by far-away polymers with indefinite activities when the system has no long-range order. (orig.)
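A standard way to handle weights that are not positive, in the spirit of the reweighting problem this abstract addresses, is to sample from the absolute values |w| and fold the sign into the observable: ⟨O⟩ = ⟨O·sign⟩_|w| / ⟨sign⟩_|w|. The toy system below is invented for illustration (it is not the σ-model or the polymer representation of the paper) and is small enough to check the Monte Carlo estimate against exact enumeration.

```python
import random

# Toy system: a few states with signed, unnormalized weights w(x).
states  = [0, 1, 2, 3]
weights = [2.0, -0.5, 1.5, -0.25]   # hypothetical signed weights

def observable(x):
    return x * x

# Exact expectation value by direct enumeration over the signed weights.
exact = sum(w * observable(x) for x, w in zip(states, weights)) / sum(weights)

def reweighted_estimate(n_samples=200000, seed=7):
    """Sample from |w| and fold the sign into the observable:
    <O> = <O*sign>_|w| / <sign>_|w|."""
    rng = random.Random(seed)
    absw = [abs(w) for w in weights]
    xs = rng.choices(states, weights=absw, k=n_samples)
    num = den = 0.0
    for x in xs:
        s = 1.0 if weights[x] > 0 else -1.0
        num += s * observable(x)
        den += s
    return num / den
```

The caveat familiar from sign-problem literature applies: when the average sign ⟨sign⟩_|w| is close to zero, the denominator is noisy and the statistical error of the ratio blows up.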
DEFF Research Database (Denmark)
Serviss, Jason T.; Gådin, Jesper R.; Eriksson, Per
2017-01-01
…, e.g. genes in a specific pathway, alone can separate samples into these established classes. Despite this, the evaluation of class separations is often subjective and performed via visualization. Here we present the ClusterSignificance package: a set of tools designed to assess the statistical significance of class separations downstream of dimensionality reduction algorithms. In addition, we demonstrate the design and utility of the ClusterSignificance package and utilize it to determine the importance of long non-coding RNA expression in the identity of multiple hematological malignancies.
International Nuclear Information System (INIS)
2005-01-01
For the years 2004 and 2005 the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics published in the Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004). The applied energy units and conversion coefficients are shown in the back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO₂ emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes, precautionary stock fees and oil pollution fees
Directory of Open Access Journals (Sweden)
Pawel Miotla
2017-01-01
Full Text Available Aim. Urinary tract infection (UTI) is considered one of the most common bacterial infections in women. The aim of this study was to investigate the types of uropathogens present, as well as the degree of antimicrobial drug resistance, among premenopausal (n=2748) and postmenopausal (n=1705) women with uncomplicated UTI. Methods. Urinary samples (n=4453) collected from women with UTI were analyzed in terms of uropathogens present. Samples were considered positive if bacterial growth was ≥10⁵ colony-forming units (CFU)/mL. Susceptibility and resistance testing for commonly used antibiotics was subsequently assessed. Results. The most common uropathogens cultured from urine samples were Escherichia coli (65.5%), followed by Enterococcus faecalis (12.2%), Klebsiella pneumoniae (4.7%), and Proteus mirabilis (4.2%). Resistance to ampicillin exceeded 40%, independently of menopausal status. Of note, resistance to ciprofloxacin exceeded 25% among postmenopausal patients. Moreover, resistance of all uropathogens to commonly used antimicrobials was significantly higher in postmenopausal women. Conclusion. Due to the high resistance rate, ampicillin, ciprofloxacin, and the trimethoprim/sulfamethoxazole combination should be avoided in treating postmenopausal women affected by UTI unless indicated by an initial urine culture report. Finally, cephalexin and cefuroxime are promising alternatives as initial treatment in postmenopausal women.
Significance and management of positive surgical margins at the time of radical prostatectomy.
Silberstein, Jonathan L; Eastham, James A
2014-10-01
Positive surgical margins (PSM) at the time of radical prostatectomy (RP) result in an increased risk of biochemical recurrence (BCR) and secondary treatment. We review current literature with a focus on stratifying the characteristics of the PSM that may define its significance, the impact of modern imaging and surgical approaches in avoidance of PSM, and management strategies when PSM do occur. We performed a review of the available literature to identify factors associated with PSM and their management. PSM have been repeatedly demonstrated to be associated with an increased risk of BCR following RP. The specific characteristics (size, number, location, Gleason score at the margin) of the PSM may influence the risk of recurrence. Novel imaging and surgical approaches are being investigated and may allow for reductions of PSM in the future. The use of adjuvant treatment for a PSM remains controversial and should be decided on an individual basis after a discussion about the risks and benefits. The goal of RP is complete resection of the tumor. PSM are associated with increased risk of BCR and secondary treatments. Of the risk factors associated with BCR after RP, a PSM is directly influenced by surgical technique.
Significance and management of positive surgical margins at the time of radical prostatectomy
Directory of Open Access Journals (Sweden)
Jonathan L. Silberstein
2014-01-01
Full Text Available Positive surgical margins (PSM) at the time of radical prostatectomy (RP) result in an increased risk of biochemical recurrence (BCR) and secondary treatment. We review current literature with a focus on stratifying the characteristics of the PSM that may define its significance, the impact of modern imaging and surgical approaches in avoidance of PSM, and management strategies when PSM do occur. We performed a review of the available literature to identify factors associated with PSM and their management. PSM have been repeatedly demonstrated to be associated with an increased risk of BCR following RP. The specific characteristics (size, number, location, Gleason score at the margin) of the PSM may influence the risk of recurrence. Novel imaging and surgical approaches are being investigated and may allow for reductions of PSM in the future. The use of adjuvant treatment for a PSM remains controversial and should be decided on an individual basis after a discussion about the risks and benefits. The goal of RP is complete resection of the tumor. PSM are associated with increased risk of BCR and secondary treatments. Of the risk factors associated with BCR after RP, a PSM is directly influenced by surgical technique.
International Nuclear Information System (INIS)
2001-01-01
For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
2000-01-01
For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
1999-01-01
For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products
The Health Significance of Positive Emotions in Adulthood and Later Life
Ong, Anthony D.; Mroczek, Daniel K.; Riffin, Catherine
2011-01-01
A growing body of literature supports a link between positive emotions and health in older adults. In this article, we review evidence of the effects of positive emotions on downstream biological processes and meaningful clinical endpoints, such as adult morbidity and mortality. We then present relevant predictions from lifespan theories that suggest changes in cognition and motivation may play an important role in explaining how positive emotions are well maintained in old age, despite perva...
Kim, Sung-Min; Choi, Yosoon
2017-06-18
To develop appropriate measures to prevent soil contamination in abandoned mining areas, an understanding of the spatial variation of the potentially toxic trace elements (PTEs) in the soil is necessary. For the purpose of effective soil sampling, this study uses hot spot analysis, which calculates a z-score based on the Getis-Ord Gi* statistic to identify a statistically significant hot spot sample. To constitute a statistically significant hot spot, a feature with a high value should also be surrounded by other features with high values. Using relatively cost- and time-effective portable X-ray fluorescence (PXRF) analysis, sufficient input data are acquired from the Busan abandoned mine and used for hot spot analysis. To calibrate the PXRF data, which have a relatively low accuracy, the PXRF analysis data are transformed using the inductively coupled plasma atomic emission spectrometry (ICP-AES) data. The transformed PXRF data of the Busan abandoned mine are classified into four groups according to their normalized content and z-scores: high content with a high z-score (HH), high content with a low z-score (HL), low content with a high z-score (LH), and low content with a low z-score (LL). The HL and LH cases may be due to measurement errors. Additional or complementary surveys are required for the areas surrounding these suspect samples or for significant hot spot areas. The soil sampling is conducted according to a four-phase procedure in which the hot spot analysis and proposed group classification method are employed to support the development of a sampling plan for the following phase. Overall, 30, 50, 80, and 100 samples are investigated and analyzed in phases 1-4, respectively. The method implemented in this case study may be utilized in the field for the assessment of statistically significant soil contamination and the identification of areas for which an additional survey is required.
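The Getis-Ord Gi* calculation and the HH/HL/LH/LL grouping described above can be sketched directly from the statistic's definition. This is a minimal illustration with binary distance-based weights and synthetic coordinates and contents, not the study's implementation:

```python
import math

def getis_ord_gi_star(coords, values, dist):
    """Gi* z-score per sample, with binary weights: w_ij = 1 when sample j
    lies within 'dist' of sample i (the point itself included)."""
    n = len(values)
    xbar = sum(values) / n
    s = math.sqrt(sum(v * v for v in values) / n - xbar ** 2)
    zscores = []
    for xi, yi in coords:
        w = [1.0 if math.hypot(xi - xj, yi - yj) <= dist else 0.0
             for (xj, yj) in coords]
        sw = sum(w)
        num = sum(wi * v for wi, v in zip(w, values)) - xbar * sw
        den = s * math.sqrt((n * sum(wi * wi for wi in w) - sw ** 2) / (n - 1))
        zscores.append(num / den)
    return zscores

def classify(value, z, value_cut, z_cut=1.96):
    """HH/HL/LH/LL grouping of normalized content versus hot-spot z-score."""
    return ("H" if value >= value_cut else "L") + ("H" if z >= z_cut else "L")

# Synthetic survey: a tight cluster of high contents plus isolated low points
coords = [(0, 0), (0, 1), (1, 0), (1, 1), (10, 10), (20, 20), (30, 30), (40, 40)]
values = [100.0, 100.0, 100.0, 100.0, 1.0, 1.0, 1.0, 1.0]
z = getis_ord_gi_star(coords, values, dist=1.5)
```

Clustered high-content samples receive z-scores above 1.96 (HH), while isolated points do not, which is the behavior the four-group classification exploits.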
Directory of Open Access Journals (Sweden)
Sung-Min Kim
2017-06-01
Full Text Available To develop appropriate measures to prevent soil contamination in abandoned mining areas, an understanding of the spatial variation of the potentially toxic trace elements (PTEs) in the soil is necessary. For the purpose of effective soil sampling, this study uses hot spot analysis, which calculates a z-score based on the Getis-Ord Gi* statistic to identify a statistically significant hot spot sample. To constitute a statistically significant hot spot, a feature with a high value should also be surrounded by other features with high values. Using relatively cost- and time-effective portable X-ray fluorescence (PXRF) analysis, sufficient input data are acquired from the Busan abandoned mine and used for hot spot analysis. To calibrate the PXRF data, which have a relatively low accuracy, the PXRF analysis data are transformed using the inductively coupled plasma atomic emission spectrometry (ICP-AES) data. The transformed PXRF data of the Busan abandoned mine are classified into four groups according to their normalized content and z-scores: high content with a high z-score (HH), high content with a low z-score (HL), low content with a high z-score (LH), and low content with a low z-score (LL). The HL and LH cases may be due to measurement errors. Additional or complementary surveys are required for the areas surrounding these suspect samples or for significant hot spot areas. The soil sampling is conducted according to a four-phase procedure in which the hot spot analysis and proposed group classification method are employed to support the development of a sampling plan for the following phase. Overall, 30, 50, 80, and 100 samples are investigated and analyzed in phases 1–4, respectively. The method implemented in this case study may be utilized in the field for the assessment of statistically significant soil contamination and the identification of areas for which an additional survey is required.
Schoolwide Positive Behavior Supports and Students with Significant Disabilities: Where Are We?
Kurth, Jennifer A.; Enyart, Matt
2016-01-01
Although the number of schools implementing schoolwide positive behavior supports (SWPBS) has increased dramatically, the inclusion of students with severe disabilities in these efforts remains negligible. This article describes the evolution of positive behavior intervention and supports into the SWPBS approach used in many schools today,…
Yokoyama, Shozo; Takenaka, Naomi
2005-04-01
Red-green color vision is strongly suspected to enhance the survival of its possessors. Despite being red-green color blind, however, many species have successfully competed in nature, which brings into question the evolutionary advantage of achieving red-green color vision. Here, we propose a new method of identifying positive selection at individual amino acid sites with the premise that if positive Darwinian selection has driven the evolution of the protein under consideration, then it should be found mostly at the branches in the phylogenetic tree where its function had changed. The statistical and molecular methods have been applied to 29 visual pigments with the wavelengths of maximal absorption at approximately 510-540 nm (green- or middle wavelength-sensitive [MWS] pigments) and at approximately 560 nm (red- or long wavelength-sensitive [LWS] pigments), which are sampled from a diverse range of vertebrate species. The results show that the MWS pigments are positively selected through amino acid replacements S180A, Y277F, and T285A and that the LWS pigments have been subjected to strong evolutionary conservation. The fact that these positively selected M/LWS pigments are found not only in animals with red-green color vision but also in those with red-green color blindness strongly suggests that both red-green color vision and color blindness have undergone adaptive evolution independently in different species.
International Nuclear Information System (INIS)
2003-01-01
For the year 2002, part of the figures shown in the tables of the Energy Review are preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supply and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Excise taxes, precautionary stock fees and oil pollution fees on energy products
International Nuclear Information System (INIS)
2004-01-01
For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Excise taxes, precautionary stock fees and oil pollution fees
International Nuclear Information System (INIS)
2000-01-01
For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 2000, Energy exports by recipient country in January-June 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products
Directory of Open Access Journals (Sweden)
E. A. Tatokchin
2017-01-01
Full Text Available The development of modern educational technologies, driven by the broad introduction of computer testing and the growth of distance education, makes it necessary to revise the methods by which students are examined. This work shows the need to move toward mathematical criteria for assessing knowledge that are free of subjectivity. The article reviews the problems that arise in this task and proposes approaches for solving them. The greatest attention is paid to the problem of objectively transforming an expert's raw ratings into the scale of student assessments. The discussion concludes that the solution lies in creating specialized intelligent systems. The basis for constructing such an intelligent system is a mathematical model of a self-organizing, nonequilibrium dissipative system, namely a group of students. The article assumes that the dissipative character of the system is maintained by the expert's constant influx of new test items, and the nonequilibrium character by the individual psychological characteristics of the students in the group. As a result, the system should self-organize into stable patterns, which make it possible, relying on large amounts of data, to obtain a statistically significant assessment of each student. To justify the proposed approach, the work presents a statistical analysis of the test results of a large sample of students (>90). The conclusions from this statistical analysis made it possible to develop an intelligent system for the statistically significant examination of student performance. It is based on a data clustering algorithm (k-means) over three key parameters. It is shown that this approach allows an objective, dynamic expert evaluation to be created.
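The clustering step can be illustrated with a minimal Lloyd's k-means sketch over three hypothetical per-student parameters (the article's actual parameters and data are not given here; the deterministic farthest-point seeding is an illustrative choice, not the article's):

```python
def kmeans(points, k, iters=20):
    """Minimal Lloyd's algorithm with deterministic farthest-point seeding."""
    def d2(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))

    # Seed: first point, then repeatedly the point farthest from all centers
    centers = [points[0]]
    while len(centers) < k:
        centers.append(max(points, key=lambda p: min(d2(p, c) for c in centers)))

    for _ in range(iters):
        labels = [min(range(k), key=lambda j: d2(p, centers[j])) for p in points]
        for j in range(k):
            members = [p for p, lab in zip(points, labels) if lab == j]
            if members:
                centers[j] = tuple(sum(dim) / len(members) for dim in zip(*members))
    return [min(range(k), key=lambda j: d2(p, centers[j])) for p in points]

# Hypothetical students: three test-score parameters each
scores = [(90, 85, 88), (92, 80, 90), (50, 55, 52),
          (48, 60, 45), (70, 72, 68), (71, 69, 74)]
labels = kmeans(scores, 3)
```

With three well-separated groups, the algorithm assigns each pair of similar students the same cluster label, which is the grouping the assessment system builds on.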
Alves, Gelio; Wang, Guanghui; Ogurtsov, Aleksey Y; Drake, Steven K; Gucek, Marjan; Sacks, David B; Yu, Yi-Kuo
2018-06-05
Rapid and accurate identification and classification of microorganisms is of paramount importance to public health and safety. With the advance of mass spectrometry (MS) technology, the speed of identification can be greatly improved. However, the increasing number of sequenced microbes is complicating correct microbial identification, even in a simple sample, because of the large number of candidates present. To properly disentangle candidate microbes in samples containing one or more microbes, one needs to go beyond apparent morphology or simple "fingerprinting"; to correctly prioritize the candidate microbes, one needs accurate statistical significance in microbial identification. We meet these challenges by using peptide-centric representations of microbes to better separate them and by augmenting our earlier analysis method that yields accurate statistical significance. Here, we present an updated analysis workflow that uses tandem MS (MS/MS) spectra for microbial identification or classification. We have demonstrated, using 226 publicly available MS/MS data files (each containing from 2500 to nearly 100,000 MS/MS spectra) and 4000 additional MS/MS data files, that the updated workflow can correctly identify multiple microbes at the genus, and often the species, level for samples containing more than one microbe. We have also shown that the proposed workflow computes accurate statistical significances, i.e., E-values for identified peptides and unified E-values for identified microbes. Our updated analysis workflow, MiCId, freely available software for Microorganism Classification and Identification, can be downloaded at https://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads.html .
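The relationship between a P-value and an E-value referred to above follows the generic multiple-candidate argument; the sketch below is that textbook relationship, not MiCId's actual computation:

```python
import math

def e_value(p, n_candidates):
    """Expected number of candidates scoring at least this well by chance:
    a per-candidate P-value multiplied by the number of candidates tried."""
    return p * n_candidates

def p_from_e(e):
    """Probability of at least one chance match when the expected number of
    chance matches is e, under a Poisson model: 1 - exp(-e)."""
    return 1.0 - math.exp(-e)
```

A per-peptide P-value of 1e-6 that looks impressive in isolation corresponds to E = 0.1 when 100,000 candidate peptides are searched, which is why candidate prioritization needs E-values rather than raw P-values.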
Weighted A-statistical convergence for sequences of positive linear operators.
Mohiuddine, S A; Alotaibi, Abdullah; Hazarika, Bipan
2014-01-01
We introduce the notion of weighted A-statistical convergence of a sequence, where A represents the nonnegative regular matrix. We also prove the Korovkin approximation theorem by using the notion of weighted A-statistical convergence. Further, we give a rate of weighted A-statistical convergence and apply the classical Bernstein polynomial to construct an illustrative example in support of our result.
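The classical Bernstein polynomial used in the illustrative example can be written directly from its definition; a minimal sketch (ordinary, unweighted evaluation):

```python
from math import comb

def bernstein(f, n, x):
    """n-th Bernstein polynomial of f evaluated at x in [0, 1]:
    B_n(f, x) = sum_k f(k/n) * C(n, k) * x^k * (1 - x)^(n - k)."""
    return sum(f(k / n) * comb(n, k) * x ** k * (1 - x) ** (n - k)
               for k in range(n + 1))
```

Korovkin-type theorems rest on convergence for the test functions 1, t, and t^2: the operator reproduces 1 and t exactly, while B_n applied to t^2 gives t^2 + t(1 - t)/n, which converges as n grows.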
Perneger, Thomas V; Combescure, Christophe
2017-07-01
Published P-values provide a window into the global enterprise of medical research. The aim of this study was to use the distribution of published P-values to estimate the relative frequencies of null and alternative hypotheses and to seek irregularities suggestive of publication bias. This cross-sectional study included P-values published in 120 medical research articles in 2016 (30 each from the BMJ, JAMA, Lancet, and New England Journal of Medicine). The observed distribution of P-values was compared with the expected distributions under the null hypothesis (i.e., uniform between 0 and 1) and the alternative hypothesis (strictly decreasing from 0 to 1). P-values were categorized according to conventional levels of statistical significance and in one-percent intervals. Among 4,158 recorded P-values, 26.1% were highly significant. Notable irregularities included P-values equal to 1 and about twice as many P-values less than 0.05 as more than 0.05. The latter finding was seen in both randomized trials and observational studies, and in most types of analyses, excepting heterogeneity tests and interaction tests. Under plausible assumptions, we estimate that about half of the tested hypotheses were null and the other half were alternative. This analysis suggests that statistical tests published in medical journals are not a random sample of null and alternative hypotheses but that selective reporting is prevalent. In particular, significant results are about twice as likely to be reported as nonsignificant results. Copyright © 2017 Elsevier Inc. All rights reserved.
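The two reference distributions (uniform under the null, strictly decreasing under the alternative) are easy to reproduce by simulation; a sketch assuming a simple two-sided z-test and an arbitrary effect size of 3:

```python
import math
import random

def simulate_p(n, effect, seed=1):
    """Two-sided z-test P-values for draws from N(effect, 1); effect = 0
    simulates the null hypothesis, effect > 0 an alternative."""
    rng = random.Random(seed)
    return [math.erfc(abs(rng.gauss(effect, 1)) / math.sqrt(2))
            for _ in range(n)]

null_p = simulate_p(10000, 0.0)   # approximately uniform on [0, 1]
alt_p = simulate_p(10000, 3.0)    # concentrated near 0
frac_null = sum(p < 0.05 for p in null_p) / len(null_p)  # near alpha = 0.05
frac_alt = sum(p < 0.05 for p in alt_p) / len(alt_p)     # the test's power
```

Mixing the two populations in different proportions, then comparing against a published P-value distribution, is the essence of estimating the null/alternative split.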
Directory of Open Access Journals (Sweden)
Leitner Dietmar
2005-04-01
Full Text Available Abstract Background A reliable prediction of the Xaa-Pro peptide bond conformation would be a useful tool for many protein structure calculation methods. We have analyzed the Protein Data Bank and show that the combined use of sequential and structural information has a predictive value for the assessment of the cis versus trans peptide bond conformation of Xaa-Pro within proteins. For the analysis of the data sets, different statistical methods, such as the calculation of the Chou-Fasman parameters and occurrence matrices, were used. Furthermore, we analyzed the relationship between the relative solvent accessibility and the relative occurrence of prolines in the cis and in the trans conformation. Results One of the main results of the statistical investigations is the ranking of the secondary structure and sequence information with respect to the prediction of the Xaa-Pro peptide bond conformation. We observed a significant impact of secondary structure information on the occurrence of the Xaa-Pro peptide bond conformation, while the sequence information of amino acids neighboring proline is of little predictive value for the conformation of this bond. Conclusion In this work, we present an extensive analysis of the occurrence of the cis and trans proline conformation in proteins. Based on the data set, we derived patterns and rules for a possible prediction of the proline conformation. Upon adoption of the Chou-Fasman parameters, we are able to derive statistically relevant correlations between the secondary structure of amino acid fragments and the Xaa-Pro peptide bond conformation.
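A Chou-Fasman-style parameter is simply a conformation-specific relative frequency divided by the overall relative frequency, with values above 1 indicating over-representation. The counts below are hypothetical, for illustration only:

```python
def propensity(count_in_state, total_in_state, count_overall, total_overall):
    """Chou-Fasman-style propensity: the frequency of a feature within one
    conformational state relative to its frequency in the whole data set."""
    return (count_in_state / total_in_state) / (count_overall / total_overall)

# Hypothetical: a feature seen at 30 of 200 cis-proline sites,
# but only 500 times among 10,000 sites overall
p_cis = propensity(30, 200, 500, 10000)
```

A value of 3.0 here would mean the feature occurs three times more often before cis-prolines than expected by chance, the kind of pattern the derived prediction rules exploit.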
Investor Outlook: Significance of the Positive LCA2 Gene Therapy Phase III Results.
Schimmer, Joshua; Breazzano, Steven
2015-12-01
Spark Therapeutics recently reported positive phase III results for SPK-RPE65 targeting the treatment of visual impairment caused by RPE65 gene mutations (often referred to as Leber congenital amaurosis type 2, or LCA2, but may include other retinal disorders), marking an important inflection point for the field of gene therapy. The results highlight the ability to successfully design and execute a randomized trial of a gene therapy and also reinforce the potentially predictive nature of early preclinical and clinical data. The results are expected to pave the way for the first approved gene therapy product in the United States and should sustain investor interest and confidence in gene therapy for many approaches, including retina targeting and beyond.
Emde, B.; Huse, M.; Hermsdorf, J.; Kaierle, S.; Wesling, V.; Overmeyer, L.; Kozakov, R.; Uhrlandt, D.
As an energy-preserving variant of laser hybrid welding, laser-assisted arc welding uses laser powers of less than 1 kW. Recent studies have shown that the electrical conductivity of a TIG welding arc changes within the arc in case of a resonant interaction between laser radiation and argon atoms. This paper presents investigations on how to control the position of the arc root on the workpiece by means of the resonant interaction. Furthermore, the influence on the welding result is demonstrated. The welding tests were carried out on a cooled copper plate and steel samples with resonant and non-resonant laser radiation. Moreover, an analysis of the weld seam is presented.
Bundschuh, Mirco; Newman, Michael C; Zubrod, Jochen P; Seitz, Frank; Rosenfeldt, Ricki R; Schulz, Ralf
2015-03-01
We argued recently that the positive predictive value (PPV) and the negative predictive value (NPV) are valuable metrics to include during null hypothesis significance testing: they inform the researcher about the probability of statistically significant and non-significant test outcomes actually being true. Although commonly misunderstood, a reported p value estimates only the probability of obtaining the results, or more extreme results, if the null hypothesis of no effect were true. Calculations of the more informative PPV and NPV require an a priori estimate of the probability (R). The present document discusses challenges of estimating R.
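With R read as the prior probability that the alternative hypothesis is true, PPV and NPV follow from Bayes' rule applied to the test's alpha and power; a sketch (the authors' exact formulation of R may differ):

```python
def ppv_npv(alpha, power, r):
    """PPV and NPV of a significance test, given the type I error rate alpha,
    the power (1 - beta), and the prior probability r that the alternative
    hypothesis is true."""
    beta = 1.0 - power
    ppv = power * r / (power * r + alpha * (1.0 - r))
    npv = (1.0 - alpha) * (1.0 - r) / ((1.0 - alpha) * (1.0 - r) + beta * r)
    return ppv, npv

# Even-odds prior, conventional alpha, 80% power
ppv, npv = ppv_npv(alpha=0.05, power=0.8, r=0.5)
```

With these conventional inputs the PPV is roughly 0.94 and the NPV roughly 0.83, illustrating the point that a significant result and a non-significant result carry different probabilities of being true.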
Malietzis, George; Lee, Gui Han; Bernardo, David; Blakemore, Alexandra I F; Knight, Stella C; Moorghen, Morgan; Al-Hassi, Hafid O; Jenkins, John T
2015-07-01
The host local immune response (LIR) to cancer is a determinant of cancer outcome. Regulation of this local response is largely achieved through chemokine synthesis from the tumor microenvironment, such as C-Chemokine-Receptor-7 (CCR7). We examined the LIR, measured as CCR7 expression, in colorectal cancers (CRC) and explored relationships with body composition (BC) and survival. A study of paraffin-embedded tissue specimens was carried out in 116 patients with non-metastatic CRC. CCR7 expression was determined by immunohistochemistry. Analysis of computed tomography scans was used to calculate BC parameters. Survival analyses and multivariate regression models were used. High CCR7(+) cell density within the tumor stroma and at the margin was significantly associated with increased age, the presence of lymphovascular invasion, higher tumor stage, lymph node metastasis, high Klintrup-Makinen immune score, and myosteatosis. High CCR7(+) cell density in the tumor margin was significantly associated with shorter disease-free (DFS) and overall survival (OS) (P < 0.001). This was also significantly associated with shorter survival in multivariate analysis (HR = 8.87; 95%CI [2.51-31.3]; P < 0.01 for OS and HR = 4.72; 95%CI [1.24-12.9]; P = 0.02 for DFS). Our results suggest that a specific immune microenvironment may be associated with an altered host BC and tumor behavior, and that CCR7 may serve as a novel prognostic biomarker. © 2015 Wiley Periodicals, Inc.
Weighted A-Statistical Convergence for Sequences of Positive Linear Operators
Directory of Open Access Journals (Sweden)
S. A. Mohiuddine
2014-01-01
Full Text Available We introduce the notion of weighted A-statistical convergence of a sequence, where A represents the nonnegative regular matrix. We also prove the Korovkin approximation theorem by using the notion of weighted A-statistical convergence. Further, we give a rate of weighted A-statistical convergence and apply the classical Bernstein polynomial to construct an illustrative example in support of our result.
Kossobokov, V.G.; Romashkova, L.L.; Keilis-Borok, V. I.; Healy, J.H.
1999-01-01
Algorithms M8 and MSc (i.e., the Mendocino Scenario) were used in a real-time intermediate-term research prediction of the strongest earthquakes in the Circum-Pacific seismic belt. Predictions are made by M8 first. Then, the areas of alarm are reduced by MSc, at the cost that some earthquakes are missed in the second approximation of prediction. In 1992-1997, five earthquakes of magnitude 8 and above occurred in the test area: all of them were predicted by M8, and MSc identified correctly the locations of four of them. The space-time volume of the alarms is 36% and 18%, respectively, when estimated with a normalized product measure of the empirical distribution of epicenters and uniform time. The statistical significance of the achieved results is beyond 99% both for M8 and MSc. For magnitude 7.5+, 10 out of 19 earthquakes were predicted by M8 in 40%, and five were predicted by M8-MSc in 13%, of the total volume considered. This implies a significance level of 81% for M8 and 92% for M8-MSc. The lower significance levels might result from a global change in seismic regime in 1993-1996, when the rate of the largest events doubled and all of them became exclusively normal or reversed faults. The predictions are fully reproducible; the algorithms M8 and MSc in complete formal definitions were published before we started our experiment [Keilis-Borok, V.I., Kossobokov, V.G., 1990. Premonitory activation of seismic flow: Algorithm M8. Phys. Earth Planet. Inter. 61, 73-83; Kossobokov, V.G., Keilis-Borok, V.I., Smith, S.W., 1990. Localization of intermediate-term earthquake prediction. J. Geophys. Res. 95, 19763-19772; Healy, J.H., Kossobokov, V.G., Dewey, J.W., 1992. A test to evaluate the earthquake prediction algorithm, M8. U.S. Geol. Surv. OFR 92-401]. M8 is available from the IASPEI Software Library [Healy, J.H., Keilis-Borok, V.I., Lee, W.H.K. (Eds.), 1997. Algorithms for Earthquake Statistics and Prediction, Vol. 6. IASPEI Software Library]. © 1999 Elsevier
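The quoted confidence levels are consistent with a simple binomial model in which each earthquake independently falls inside an alarm with probability equal to the alarms' space-time fraction; a sketch (the paper's actual significance estimate may be more elaborate):

```python
from math import comb

def chance_success_prob(n_events, n_hits, alarm_fraction):
    """P(at least n_hits of n_events land in alarms by chance), assuming each
    event independently hits an alarm with probability alarm_fraction."""
    q = alarm_fraction
    return sum(comb(n_events, i) * q ** i * (1 - q) ** (n_events - i)
               for i in range(n_hits, n_events + 1))

# Figures from the abstract: M8 hit 5 of 5 with 36% alarm volume,
# MSc hit 4 of 5 with 18% alarm volume
p_m8 = chance_success_prob(5, 5, 0.36)
p_msc = chance_success_prob(5, 4, 0.18)
```

Both chance probabilities come out below 0.01, matching the abstract's statement that the significance is beyond 99% for M8 and MSc.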
Ensuring Positiveness of the Scaled Difference Chi-square Test Statistic.
Satorra, Albert; Bentler, Peter M
2010-06-01
A scaled difference test statistic T̃d that can be computed from the standard software output of structural equation models (SEM) by hand calculations was proposed in Satorra and Bentler (2001). The statistic T̃d is asymptotically equivalent to the scaled difference test statistic T̄d introduced in Satorra (2000), which requires more involved computations beyond the standard output of SEM software. The test statistic T̃d has been widely used in practice, but in some applications it is negative due to negativity of its associated scaling correction. Using the implicit function theorem, this note develops an improved scaling correction leading to a new scaled difference statistic that avoids negative chi-square values.
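The hand-computable statistic from Satorra and Bentler (2001) combines the two models' chi-squares, degrees of freedom, and scaling corrections; a sketch with hypothetical fit results (note that the pooled correction cd in the denominator is precisely the quantity that can turn negative, motivating this note's improved correction):

```python
def scaled_difference(t0, d0, c0, t1, d1, c1):
    """Satorra-Bentler (2001) scaled chi-square difference test.
    (t0, d0, c0): chi-square, df, and scaling correction of the more
    restricted (nested) model; (t1, d1, c1): the less restricted model.
    Returns (scaled difference statistic, difference in df)."""
    cd = (d0 * c0 - d1 * c1) / (d0 - d1)  # pooled scaling correction
    return (t0 - t1) / cd, d0 - d1

# Hypothetical output from two nested SEM fits
stat, df = scaled_difference(t0=120.0, d0=40, c0=1.2, t1=90.0, d1=35, c1=1.1)
```

All six inputs appear in standard robust-fit output, which is why the 2001 statistic could be computed by hand, unlike the Satorra (2000) statistic it approximates.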
International Nuclear Information System (INIS)
Hennig, E.M.; Norwegian Radium Hospital, Oslo; Kvinnsland, S.; Holm, R.; Nesland, J.M.
1999-01-01
Human papillomavirus (HPV) 16 has previously been found in 19/41 breast carcinomas (46%) in women with a history of HPV 16 positive CIN III lesions. There was no significant difference in the distribution of histological subtypes, mean or median tumour diameter, or number of regional lymph node metastases between the HPV positive and HPV negative breast carcinoma groups. P53, p21 and c-erbB-2 proteins were analyzed by immunohistochemistry in the HPV 16 positive and HPV negative breast carcinomas. There was a significant difference in p53 and p21 protein immunoreactivity between HPV 16 positive and HPV negative breast carcinomas (p=0.0091 and p=0.0040), with significantly less detectable p53 and p21 protein immunoreactivity in the HPV 16 positive cases. There was also a significant difference in the coexpression of p53/p21 between the HPV 16 positive and HPV 16 negative breast carcinomas (p=0.002). No significant difference was found in immunostaining for c-erbB-2 protein in the two groups (p=0.15), or for the coexpression of p53/c-erbB-2 (p=0.19). The significantly lower expression of p53 and p21 proteins in HPV 16 positive than in HPV 16 negative breast carcinomas supports the hypothesis that wild-type p53 protein is inactivated and degraded by HPV 16 E6 and that p53 mutation is not necessary for transformation in the HPV 16 positive cases. (orig.)
Directory of Open Access Journals (Sweden)
Lutz Bornmann
Full Text Available Using the InCites tool of Thomson Reuters, this study compares normalized citation impact values calculated for China, Japan, France, Germany, the United States, and the UK over the period from 1981 to 2010. InCites offers a unique opportunity to study the normalized citation impact of countries using (i) a long publication window (1981 to 2010), (ii) a differentiation into broad or narrower subject areas, and (iii) statistical procedures that allow an insightful investigation of national citation trends across the years. Using four broad categories, our results show significantly increasing trends in citation impact values for France, the UK, and especially Germany across the last thirty years in all areas. The citation impact of papers from China is still at a relatively low level (mostly below the world average), but the country follows an increasing trend line. The USA exhibits a stable pattern of high citation impact values across the years. With small impact differences between the publication years, the US trend is increasing in engineering and technology but decreasing in medical and health sciences as well as in agricultural sciences. Similar to the USA, Japan follows increasing as well as decreasing trends in different subject areas, but the variability across the years is small. In most years, papers from Japan perform below or approximately at the world average in each subject area.
International Nuclear Information System (INIS)
Shakespeare, T.P.; Mukherjee, R.K.; Gebski, V.J.
2003-01-01
Confidence levels, clinical significance curves, and risk-benefit contours are tools that improve the analysis of clinical studies and minimize misinterpretation of published results; however, no software has been available for their calculation. The objective was to develop software to help clinicians utilize these tools. Excel 2000 spreadsheets were designed using only built-in functions, without macros. The workbook was protected and encrypted so that users can modify only input cells. The workbook has four spreadsheets for use in studies comparing two patient groups. Sheet 1 comprises instructions and graphic examples for use. Sheet 2 allows the user to input the main study results (e.g. survival rates) into a 2-by-2 table. Confidence intervals (95%), the p-value and the confidence level for Treatment A being better than Treatment B are automatically generated. An additional input cell allows the user to determine the confidence associated with a specified level of benefit. For example, if the user wishes to know the confidence that Treatment A is at least 10% better than B, 10% is entered. Sheet 2 automatically displays clinical significance curves, graphically illustrating confidence levels for all possible benefits of one treatment over the other. Sheet 3 allows input of toxicity data and calculates the confidence that one treatment is more toxic than the other. It also determines the confidence that the relative toxicity of the most effective arm does not exceed user-defined tolerability. Sheet 4 automatically calculates risk-benefit contours, displaying the confidence associated with a specified scenario of minimum benefit and maximum risk of one treatment arm over the other. The spreadsheet is freely downloadable at www.ontumor.com/professional/statistics.htm. A simple, self-explanatory, freely available spreadsheet calculator was developed using Excel 2000. The incorporated decision-making tools can be used for data analysis and improve the reporting of results of any clinical study.
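The abstract does not give the spreadsheet's exact formulas, but the "confidence that Treatment A is at least x% better than B" idea can be sketched with a normal approximation to the difference of two proportions; the function name and the example counts are hypothetical.

```python
from math import erf, sqrt

def confidence_of_benefit(sA, nA, sB, nB, delta=0.0):
    """One-sided confidence that treatment A's success rate exceeds
    B's by at least `delta`, via a normal approximation to the
    difference of two independent proportions.  Illustrative only:
    the actual workbook's formulas are not specified in the abstract."""
    pA, pB = sA / nA, sB / nB
    se = sqrt(pA * (1 - pA) / nA + pB * (1 - pB) / nB)  # standard error of pA - pB
    z = (pA - pB - delta) / se
    return 0.5 * (1 + erf(z / sqrt(2)))                 # standard normal CDF at z

# Confidence that A beats B at all, and by at least 10 percentage points
c0 = confidence_of_benefit(60, 100, 45, 100, delta=0.0)
c10 = confidence_of_benefit(60, 100, 45, 100, delta=0.10)
```

Evaluating the same quantity over a grid of `delta` values yields the clinical significance curve described above.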
International Nuclear Information System (INIS)
Tumur, Odgerel; Soon, Kean; Brown, Fraser; Mykytowycz, Marcus
2013-01-01
The aims of our study were to evaluate the effect of application of the Adaptive Statistical Iterative Reconstruction (ASIR) algorithm on the radiation dose of coronary computed tomography angiography (CCTA) and its effects on image quality of CCTA, and to evaluate the effects of various patient and CT scanning factors on the radiation dose of CCTA. This was a retrospective study that included 347 consecutive patients who underwent CCTA at a tertiary university teaching hospital between 1 July 2009 and 20 September 2011. Analysis was performed comparing patient demographics, scan characteristics, radiation dose and image quality in two groups of patients in whom conventional Filtered Back Projection (FBP) or ASIR was used for image reconstruction. There were 238 patients in the FBP group and 109 patients in the ASIR group. There was no difference between the groups in the use of prospective gating, scan length or tube voltage. In the ASIR group, a significantly lower tube current was used compared with the FBP group: 550 mA (450-600) vs. 650 mA (500-711.25) (median (interquartile range)), respectively, P<0.001. There was a 27% effective radiation dose reduction in the ASIR group compared with the FBP group: 4.29 mSv (2.84-6.02) vs. 5.84 mSv (3.88-8.39) (median (interquartile range)), respectively, P<0.001. Although ASIR was associated with increased image noise compared with FBP (39.93 ± 10.22 vs. 37.63 ± 18.79 (mean ± standard deviation), respectively, P<0.001), it did not affect the signal intensity, signal-to-noise ratio, contrast-to-noise ratio or the diagnostic quality of CCTA. Application of ASIR reduces the radiation dose of CCTA without affecting image quality.
International Nuclear Information System (INIS)
Zhukhlistov, A.A.; Avilov, A.S.; Ferraris, D.; Zvyagin, B.B.; Plotnikov, V.P.
1997-01-01
The method of improved automatic electron diffractometry for measuring and recording the intensities of two-dimensionally distributed reflections of texture-type electron diffraction patterns has been used for the analysis of the brucite Mg(OH)₂ structure. The experimental accuracy of the measured intensities proved to be sufficient for studying fine structural details of the statistical distribution of hydrogen atoms over three structure positions located around the threefold axis of the brucite structure.
Directory of Open Access Journals (Sweden)
Manisha Jain
2009-01-01
Full Text Available Background: The presence of anti-HBc IgG in the absence of HBsAg is usually indicative of a past self-limiting HBV infection, but it is frequently associated with co-infection with HCV, which can worsen the existing status of chronic liver disease (CLD). Objectives: The present study was planned to evaluate the significance of isolated anti-HBc IgG positivity in patients with CLD and to look for the presence of HCV co-infection in such patients. Methods: Clinical profiles and biochemical tests were obtained for all 77 CLD cases included in the study. Blood samples were taken from these patients and tested by commercially available EIA for the presence of HBsAg, anti-HBc IgG, anti-HBs and anti-HCV. HBV DNA was detected by amplifying the surface region in all cases. Results: Isolated anti-HBc IgG positivity, defined as the presence of anti-HBc IgG in the absence of any other serological markers of HBV infection, was detected in 28 patients. Out of 64 patients positive for anti-HBc IgG, 36 had markers of HBV: either HBsAg, HBV DNA or anti-HBs, alone or in combination. There was a significant association between isolated anti-HBc IgG positivity and HCV co-infection. Conclusion: Anti-HBc IgG should be tested in all patients with CLD, as it is frequently the only marker of HBV infection in such patients, and they should be monitored closely as such patients can develop CLD.
Diagnostic and statistical manual-5: Position paper of the Indian Psychiatric Society
Jacob, K. S.; Kallivayalil, R. A.; Mallik, A. K.; Gupta, N.; Trivedi, J. K.; Gangadhar, B. N.; Praveenlal, K.; Vahia, V.; Rao, T. S. Sathyanarayana
2013-01-01
The development of the Diagnostic and Statistical Manual-5 (DSM-5) has been an exhaustive and elaborate exercise involving the review of DSM-IV categories, identifying new evidence and ideas, field testing, and revising issues so that it is based on the best available evidence. This report of the Task Force of the Indian Psychiatric Society examines the current draft of the DSM-5 and discusses the implications from an Indian perspective. It highlights the issues related to the use of universal categories applied across diverse cultures. It reiterates the evidence for mental disorders commonly seen in India. It emphasizes the need for caution when clinical categories useful to specialists are employed in the contexts of primary care and in community settings. While the DSM-5 is essentially for the membership of the American Psychiatric Association, its impact will be felt far beyond the boundaries of psychiatry and of the United States of America. However, its atheoretical approach, despite its pretensions, pushes a purely biomedical agenda to the exclusion of other approaches to mental health and illness. Nevertheless, the DSM-5 should serve a gate-keeping function, which intends to set minimum standards. It is a work in progress and will continue to evolve with the generation of new evidence. For the DSM-5 to be relevant and useful across cultures and countries, it needs to be broad-based and consider social and cultural contexts, issues, and phenomena. Convergence and compatibility with the International Classification of Diseases-11 is a worthy goal. While the phenomenal effort of the DSM-5 revision is commendable, psychiatry should continue to strive for a more holistic understanding of mental health, illness, and disease. PMID:23441009
Directory of Open Access Journals (Sweden)
Justin London
2010-01-01
Full Text Available In “National Metrical Types in Nineteenth Century Art Song” Leigh Van Handel gives a sympathetic critique of William Rothstein’s claim that in western classical music of the late 18th and 19th centuries there are discernable differences in the phrasing and metrical practice of German versus French and Italian composers. This commentary (a examines just what Rothstein means in terms of his proposed metrical typology, (b questions Van Handel on how she has applied it to a purely melodic framework, (c amplifies Van Handel’s critique of Rothstein, and then (d concludes with a rumination on the reach of quantitative (i.e., statistically-driven versus qualitative claims regarding such things as “national metrical types.”
Yang, Zheng Rong; Bullifent, Helen L.; Moore, Karen; Paszkiewicz, Konrad; Saint, Richard J.; Southern, Stephanie J.; Champion, Olivia L.; Senior, Nicola J.; Sarkar-Tyson, Mitali; Oyston, Petra C. F.; Atkins, Timothy P.; Titball, Richard W.
2017-02-01
Massively parallel sequencing technology coupled with saturation mutagenesis has provided new and global insights into gene functions and roles. At a simplistic level, the frequency of mutations within genes can indicate the degree of essentiality. However, this approach neglects to take account of the positional significance of mutations - the function of a gene is less likely to be disrupted by a mutation close to the distal ends. Therefore, a systematic bioinformatics approach to improve the reliability of essential gene identification is desirable. We report here a parametric model which introduces a novel mutation feature together with a noise trimming approach to predict the biological significance of Tn5 mutations. We show improved performance of essential gene prediction in the bacterium Yersinia pestis, the causative agent of plague. This method would have broad applicability to other organisms and to the identification of genes which are essential for competitiveness or survival under a broad range of stresses.
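The positional idea described above can be illustrated with a toy stand-in for the paper's parametric model: down-weight transposon insertions that fall near a gene's distal ends before scoring insertion density. The function, the hard edge cut-off, and the example data are all hypothetical.

```python
def essentiality_signal(insertion_positions, gene_length, edge_frac=0.1):
    """Count Tn5 insertions in a gene, ignoring those near the distal
    ends, where a mutation is less likely to abolish gene function.
    A toy stand-in for the paper's parametric model: weight 0 in the
    outer `edge_frac` of the gene, weight 1 elsewhere."""
    lo, hi = edge_frac * gene_length, (1 - edge_frac) * gene_length
    weighted = sum(1 for p in insertion_positions if lo <= p <= hi)
    return weighted / gene_length  # edge-trimmed insertions per bp

# A gene hit only near its distal end has zero trimmed density, so it
# looks more "essential" than a naive insertion count would suggest.
dense_middle = essentiality_signal([100, 400, 500, 600, 900], 1000)
edge_only = essentiality_signal([950, 960, 990], 1000)
```

The actual method replaces the hard cut-off with a parametric positional feature and a noise-trimming step; this sketch only conveys why position matters.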
Zhang, Xuemei; Yang, Xiaoyan; Wang, Hongbin; Li, Qinchun; Wang, Haijuan; Li, Yanyan
2017-04-01
A pot experiment was conducted to compare the content of endogenous trans-zeatin (Z), plant arsenic (As) uptake and physiological indices in the fronds of the As-hyperaccumulator (Pteris cretica var. nervosa) and a non-hyperaccumulator (Pteris ensiformis). Furthermore, a stepwise regression method was used to study the relationship among the determined indices, and the time-course effect of the main indices was also investigated under 100 mg/kg As stress over time. In the 100-200 mg/kg As treatments, plant height showed no significant difference and endogenous Z content significantly increased in P. cretica var. nervosa compared to the control, but a significant decrease in height and endogenous Z was observed in P. ensiformis. The concentrations of As(III) and As(V) increased significantly in the fronds of both plants, but this increase was much higher in P. cretica var. nervosa. Compared to the control, the contents of chlorophyll and soluble protein were significantly increased in P. cretica var. nervosa but decreased in P. ensiformis in the 200 mg/kg As treatment. A significant positive correlation was found between the contents of endogenous Z and total As in P. cretica var. nervosa, but no such correlation was found in P. ensiformis. Additionally, in the time-course experiment, a peak value of each index appeared on the 43rd day in both plants, except for chlorophyll in P. ensiformis, and this value was significantly higher in P. cretica var. nervosa than in P. ensiformis. In conclusion, a higher endogenous Z content contributed to As accumulation of P. cretica var. nervosa under As stress. Copyright © 2016 Elsevier Inc. All rights reserved.
Taberna, Miren; Inglehart, Ronald C; Pickard, Robert K L; Fakhry, Carole; Agrawal, Amit; Katz, Mira L; Gillison, Maura L
2017-04-01
Sexual behavior and oral human papillomavirus (HPV) infection are risk factors for oral squamous cell carcinoma (OSCC). The effects of OSCC diagnosis and treatment on subsequent relationship stress and sexual behavior are unknown. Incident cases of HPV-positive or HPV-negative OSCC in patients who had a partnered relationship and partners of patients with oropharyngeal cancer were eligible for a study in which surveys were administered at diagnosis and at the 6-month follow-up time point to assess relationship distress, HPV transmission and concerns about health consequences, and sexual behavior. The frequency distributions of responses, stratified by tumor HPV status, were compared at baseline and follow-up. In total, 262 patients with OSCC and 81 partners were enrolled. Among the patients, 142 (54.2%) had HPV-positive OSCC, and 120 (45.8%) had HPV-negative OSCC. Relationship distress was infrequently reported, and 69% of patients felt that their relationship had strengthened since the cancer diagnosis. Both HPV-positive patients (25%) and their partners (14%) reported feelings of guilt or responsibility for the diagnosis of an HPV-caused cancer. Concern over sexual, but not nonsexual, HPV transmission to partners was reported by 50%. Significant declines in the frequency of vaginal and oral sexual behaviors were reported at follow-up, regardless of tumor HPV status. From baseline to 6 months, there were significant increases in abstinence from vaginal sex (from 10% to 34%) and from oral sex (from 25% to 80%), regardless of tumor HPV status. Sexual behavior is an important quality-of-life outcome to assess within clinical trials. Cancer 2017;123:1156-1165. © 2017 American Cancer Society.
Fang, Yongxiang; Wit, Ernst
2008-01-01
Fisher’s combined probability test is the most commonly used method to test the overall significance of a set of independent p-values. However, Fisher’s statistic is clearly more sensitive to smaller p-values than to larger ones, and a single small p-value may overrule the other p-values and decide the test result. This is, in some cases, viewed as a flaw. In order to overcome this flaw and improve the power of the test, the joint tail probability of a set of p-values is proposed as a ...
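Fisher's method itself is standard: X = −2 Σ ln pᵢ is chi-square with 2k degrees of freedom under the global null, and for even degrees of freedom the chi-square survival function has a closed form, so no external library is needed. The example p-values are illustrative; they show the sensitivity to a single tiny p-value that the abstract calls a flaw.

```python
from math import exp, factorial, log

def fisher_combined(pvalues):
    """Fisher's combined probability test.

    Returns (X, p) where X = -2 * sum(ln p_i) and p is the survival
    function of a chi-square with 2k degrees of freedom at X, using
    the closed form valid for even degrees of freedom:
        P(X2_{2k} > x) = exp(-x/2) * sum_{i<k} (x/2)^i / i!
    """
    k = len(pvalues)
    X = -2.0 * sum(log(p) for p in pvalues)
    half = X / 2.0
    sf = exp(-half) * sum(half**i / factorial(i) for i in range(k))
    return X, sf

# One tiny p-value dominates three unremarkable ones: the combined
# test is highly significant even though 0.5, 0.6, 0.7 carry no signal.
X, p = fisher_combined([1e-6, 0.5, 0.6, 0.7])
```

By contrast, four moderate p-values like `[0.5] * 4` combine to a clearly non-significant result, which is the asymmetry the proposed joint-tail-probability statistic is meant to address.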
Directory of Open Access Journals (Sweden)
Anita Lindmark
Full Text Available When profiling hospital performance, quality indicators are commonly evaluated through hospital-specific adjusted means with confidence intervals. When identifying deviations from a norm, large hospitals can have statistically significant results even for clinically irrelevant deviations, while important deviations in small hospitals can remain undiscovered. We have used data from the Swedish Stroke Register (Riksstroke) to illustrate the properties of a benchmarking method that integrates considerations of both clinical relevance and level of statistical significance. The performance measure used was case-mix adjusted risk of death or dependency in activities of daily living within 3 months after stroke. A hospital was labeled as having outlying performance if its case-mix adjusted risk exceeded a benchmark value with a specified statistical confidence level. The benchmark was expressed relative to the population risk and should reflect the clinically relevant deviation that is to be detected. A simulation study based on Riksstroke patient data from 2008-2009 was performed to investigate the effect of the choice of the statistical confidence level and benchmark value on the diagnostic properties of the method. Simulations were based on 18,309 patients in 76 hospitals. The widely used setting, comparing 95% confidence intervals to the national average, resulted in low sensitivity (0.252) and high specificity (0.991). There were large variations in sensitivity and specificity for different requirements of statistical confidence. Lowering statistical confidence improved sensitivity with a relatively smaller loss of specificity. Variations due to different benchmark values were smaller, especially for sensitivity. This allows the choice of a clinically relevant benchmark to be driven by clinical factors without major concerns about sufficiently reliable evidence. The study emphasizes the importance of combining clinical relevance and level of statistical confidence.
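The flagging rule described above can be sketched as a one-sided test against a relative benchmark; the exact estimator and benchmark parametrisation of the paper may differ, and the function name, standard errors and the 25%-relative benchmark are assumptions for illustration.

```python
from statistics import NormalDist

def outlying(adj_risk, se, pop_risk, rel_benchmark=1.25, confidence=0.95):
    """Flag a hospital as having outlying performance if its case-mix
    adjusted risk exceeds the benchmark (a clinically relevant relative
    deviation from the population risk) with the given one-sided
    statistical confidence.  Illustrative reading of the method."""
    benchmark = rel_benchmark * pop_risk
    z = NormalDist().inv_cdf(confidence)
    lower_bound = adj_risk - z * se  # one-sided lower confidence bound
    return lower_bound > benchmark

# A 30% adjusted risk vs. a 20% population risk and a 25%-relative benchmark
flag_high = outlying(adj_risk=0.30, se=0.02, pop_risk=0.20)
flag_ok = outlying(adj_risk=0.26, se=0.03, pop_risk=0.20)
```

Lowering `confidence` or `rel_benchmark` trades specificity for sensitivity, which is exactly the trade-off the simulation study quantifies.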
Haagmans, G. G.; Verhagen, S.; Voûte, R. L.; Verbree, E.
2017-09-01
Since GPS tends to fail for indoor positioning purposes, alternative methods like indoor positioning systems (IPS) based on Bluetooth low energy (BLE) are developing rapidly. Generally, IPS are deployed in environments filled with obstacles such as furniture, walls, people and electronics that influence signal propagation. The major factor influencing system performance and positioning quality is the geometry of the beacons. The geometry is limited by the infrastructure that can be deployed (number of beacons, base stations and tags), which leads to the following challenge: given a limited number of beacons, where should they be placed in a specified indoor environment such that the geometry contributes to optimal positioning results? This paper proposes a statistical model that selects the optimal configuration satisfying the user requirements in terms of precision. The model requires the definition of a chosen 3D space (in our case 7 × 10 × 6 m), the number of beacons, possible user tag locations and a performance threshold (e.g. required precision). For any given set of beacon and receiver locations, the precision and the internal and external reliability can be determined beforehand. As validation, the modeled precision has been compared with observed precision results. The measurements were performed with an IPS of BlooLoc at a chosen set of user tag locations for a given geometric configuration. Eventually, the model is able to select the optimal geometric configuration out of millions of possible configurations based on a performance threshold (e.g. required precision).
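The geometry term that such a model evaluates per candidate layout can be sketched with a classical dilution-of-precision computation for a range-based fix; BLE systems typically work from RSSI rather than clean ranges, so this is a simplified stand-in, and the room, receiver and beacon coordinates are hypothetical.

```python
import numpy as np

def pdop(receiver, beacons):
    """Position dilution of precision for a range-based least-squares fix.

    Rows of the design matrix A are unit vectors from the receiver to
    each beacon; the position covariance is sigma^2 * inv(A.T @ A), so
    sqrt(trace(inv(A.T @ A))) is the geometry-only precision factor.
    """
    A = np.array([np.asarray(b, float) - receiver for b in beacons])
    A /= np.linalg.norm(A, axis=1, keepdims=True)  # unit direction vectors
    Q = np.linalg.inv(A.T @ A)                     # cofactor matrix of the position
    return float(np.sqrt(np.trace(Q)))

rx = np.array([3.5, 5.0, 1.5])  # user tag inside a 7 x 10 x 6 m room
spread = [(0, 0, 0), (7, 0, 0), (0, 10, 0), (7, 10, 6)]
clustered = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 1)]

# Well-spread beacons surround the tag and give a smaller (better)
# PDOP than beacons clustered in one corner.
pdop_spread = pdop(rx, spread)
pdop_clustered = pdop(rx, clustered)
```

Evaluating such a figure of merit over all candidate tag locations is one way to rank the millions of possible beacon configurations mentioned above.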
International Nuclear Information System (INIS)
Nikolics, K.; Szonyi, E.; Ramachandran, J.
1988-01-01
Photoreactive derivatives of GnRH and its analogues were prepared by incorporating the 2-nitro-4(5)-azidophenylsulfenyl [2,4(5)-NAPS] group into amino acid residues at position 1, 3, 6, or 8 of the decapeptide sequence. Modification of Trp³ by the 2,4-NAPS group led to a complete loss of the luteinizing hormone (LH) releasing as well as LH-release-inhibiting activity of the peptide. The [D-Lys(2,4-NAPS)]⁶ analog was a very potent agonist that, after covalent attachment by photoaffinity labeling, caused prolonged LH secretion at a submaximal rate. [Orn(2,4-NAPS)]⁸-GnRH, a full agonist with a relative potency of 7% of GnRH, caused prolonged maximal LH release from cultured pituitary cells after photoaffinity labeling. In contrast, [Orn(2,5-NAPS)]⁸-GnRH, although equipotent with the 2,4-NAPS isomer in terms of LH releasing ability, was unable to cause prolonged LH release after photoaffinity labeling. Thus, [Orn(2,4-NAPS)]⁸-GnRH is a very effective photolabeling ligand of the functionally significant pituitary GnRH receptor. Based on this compound, a pituitary peptidase resistant derivative, D-Phe⁶,[Orn(2,4-NAPS)]⁸-GnRH-(1-9)-ethylamide, was synthesized. This derivative showed high-affinity binding to pituitary membranes with a K_d comparable to those of other GnRH analogues. A radioiodinated form of this peptide was used for pituitary GnRH-receptor labeling. It labeled 59- and 57-kDa proteins in rat and 58- and 56-kDa proteins in bovine pituitary membrane preparations, respectively. The peptide also labeled pituitary GnRH receptors in the solubilized state and therefore appears to be a suitable ligand for the isolation and further characterization of the receptor.
International Nuclear Information System (INIS)
Wang, S; Chao, C; Chang, J
2014-01-01
Purpose: This study investigates the calibration error of detector sensitivity for MapCheck due to inaccurate positioning of the device, which is not taken into account by the current commercial iterative calibration algorithm. We hypothesize that the calibration is more vulnerable to positioning error for flattening filter free (FFF) beams than for conventional flattened beams. Methods: MapCheck2 was calibrated with 10 MV conventional and FFF beams, with careful alignment and with 1 cm positioning error during calibration, respectively. Open fields of 37 cm × 37 cm were delivered to gauge the impact of the resultant calibration errors. The local calibration error was modeled as a detector-independent multiplication factor, with which the propagation error was estimated for positioning errors from 1 mm to 1 cm. The calibrated sensitivities, without positioning error, were compared between the conventional and FFF beams to evaluate the dependence on beam type. Results: The 1 cm positioning error leads to 0.39% and 5.24% local calibration error in the conventional and FFF beams, respectively. After propagating to the edges of MapCheck, the calibration errors become 6.5% and 57.7%, respectively. The propagation error increases almost linearly with the positioning error. The difference of sensitivities between the conventional and FFF beams was small (0.11 ± 0.49%). Conclusion: The results demonstrate that the positioning error is not handled by the current commercial calibration algorithm of MapCheck. In particular, the calibration errors for the FFF beams are ~9 times greater than those for the conventional beams with identical positioning error, and a small 1 mm positioning error might lead to up to 8% calibration error. Since the sensitivities are only slightly dependent on beam type and the conventional beam is less affected by positioning error, it is advisable to cross-check the sensitivities between the conventional and FFF beams to detect such positioning errors.
International Nuclear Information System (INIS)
Kobyakov, Dmitriy Sergeevich; Avdalyan, Ashot Merudzhanovich; Lazarev, Aleksandr Fedorovich; Lushnikova, Elena Leonidovna; Nepomnyashchikh, Lev Moiseevich
2014-01-01
To evaluate the relation between argyrophilic nucleolar organizer region (AgNOR)-associated proteins and clinicopathological parameters and survival in non-small-cell lung cancer (NSCLC). A total of 207 surgical specimens diagnosed as NSCLC were included in this study. Double-staining procedures were performed using antigen Ki-67 (clone MIB-1) and silver nitrate by immunohistochemical and AgNOR-staining methods. The AgNOR area in MIB-1-positive cells of NSCLC is related to clinicopathological parameters under the TNM (tumor, node, and metastasis) system. The survival of patients with small AgNOR area in MIB-1-positive cells is better than that of patients with large AgNOR area. Molecular, biological (AgNOR area in MIB-1-positive cells), and clinicopathological (greatest tumor dimension, metastases to regional lymph nodes, histology, and differentiation) parameters are independent prognostic factors of NSCLC. The AgNOR area in MIB-1-positive cells is related to clinicopathological parameters and survival in NSCLC
Liu, Wei; Ding, Jinhui
2018-04-01
The application of the principle of the intention-to-treat (ITT) to the analysis of clinical trials is challenged in the presence of missing outcome data. The consequences of stopping an assigned treatment in a withdrawn subject are unknown. It is difficult to make a single assumption about missing mechanisms for all clinical trials because there are complicated reactions in the human body to drugs due to the presence of complex biological networks, leading to data missing randomly or non-randomly. Currently there is no statistical method that can tell whether a difference between two treatments in the ITT population of a randomized clinical trial with missing data is significant at a pre-specified level. Making no assumptions about the missing mechanisms, we propose a generalized complete-case (GCC) analysis based on the data of completers. An evaluation of the impact of missing data on the ITT analysis reveals that a statistically significant GCC result implies a significant treatment effect in the ITT population at a pre-specified significance level unless, relative to the comparator, the test drug is poisonous to the non-completers as documented in their medical records. Applications of the GCC analysis are illustrated using literature data, and its properties and limits are discussed.
cMiCE: a high resolution animal PET using continuous LSO with a statistics based positioning scheme
Joung Jin Hun; Lewellen, T K
2002-01-01
Objective: Detector designs for small animal scanners are currently dominated by discrete crystal implementations. However, given the small crystal cross-sections required to obtain very high resolution, discrete designs are typically expensive, have low packing fraction and reduced light collection, and are labor intensive to build. To overcome these limitations we have investigated the feasibility of using a continuous miniature crystal element (cMiCE) detector module for high resolution small animal PET applications. Methods: The detector module consists of a single continuous slab of LSO, 25 × 25 mm² in exposed cross-section and 4 mm thick, coupled directly to a PS-PMT (Hamamatsu R5900-00-C12). The large-area surfaces of the crystal were polished and painted with TiO₂, and the short surfaces were left unpolished and painted black. Further, a new statistics based positioning (SBP) algorithm has been implemented to address the linearity and edge-effect artifacts that are inherent with conventional Anger-style positioning.
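Statistics-based positioning of this kind is, at its core, a maximum-likelihood lookup: the event position is the calibrated point whose stored per-channel light-response statistics best explain the observed PMT signals. The following toy sketch assumes Gaussian channel statistics; the lookup table, the two calibrated positions and the four channels are hypothetical, not the cMiCE module's actual calibration.

```python
from math import log

def sbp_position(observed, lut):
    """Statistics-based positioning: return the calibrated position
    whose stored light-response statistics (per-channel mean, sigma)
    give the observed PMT signals the highest Gaussian log-likelihood.
    A toy illustration of the SBP approach."""
    def loglik(means_sigmas):
        return -sum(log(s) + 0.5 * ((o - m) / s) ** 2
                    for o, (m, s) in zip(observed, means_sigmas))
    return max(lut, key=lambda pos: loglik(lut[pos]))

# Two calibrated positions (mm), four hypothetical PMT channels: (mean, sigma)
lut = {
    (5, 5): [(100, 10), (80, 9), (60, 8), (40, 7)],
    (20, 5): [(40, 7), (60, 8), (80, 9), (100, 10)],
}
event = [95, 83, 57, 44]  # signals resembling light from position (5, 5)
```

Unlike Anger centroiding, the likelihood comparison uses the full shape of the calibrated response, which is what mitigates the edge artifacts mentioned above.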
Di Florio, Adriano
2017-10-01
In order to test the computing capabilities of GPUs with respect to traditional CPU cores, a high-statistics toy Monte Carlo technique has been implemented in both the ROOT/RooFit and GooFit frameworks, with the purpose of estimating the statistical significance of the structure observed by CMS close to the kinematical boundary of the J/ψϕ invariant mass in the three-body decay B+ → J/ψϕK+. GooFit is an open-source data analysis tool under development that interfaces ROOT/RooFit to the CUDA platform on nVidia GPUs. The optimized GooFit application running on GPUs hosted by servers in the Bari Tier2 provides striking speed-up performances with respect to the RooFit application parallelised on multiple CPUs by means of the PROOF-Lite tool. The considerable resulting speed-up, evident when comparing concurrent GooFit processes allowed by the CUDA Multi Process Service with a RooFit/PROOF-Lite process with multiple CPU workers, is presented and discussed in detail. By means of GooFit it has also been possible to explore the behaviour of a likelihood ratio test statistic in different situations in which the Wilks theorem may or may not apply because its regularity conditions are not satisfied.
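The toy Monte Carlo approach to significance can be illustrated in miniature. In this sketch the null distribution of the test statistic is deliberately taken to be chi-squared with one degree of freedom (a case where Wilks' theorem holds), purely to show the mechanics; in the paper's setting the whole point is to map the distribution when Wilks' conditions may fail:

```python
import random

def toy_mc_pvalue(observed_q, n_toys=10000, seed=1):
    """Estimate a p-value by throwing toy pseudo-experiments under the
    null. Illustrative assumption: the null statistic is chi2(1), i.e.
    the square of a standard normal variate."""
    rng = random.Random(seed)
    exceed = sum(rng.gauss(0, 1) ** 2 >= observed_q for _ in range(n_toys))
    return exceed / n_toys
```

For observed_q near 3.84 (the 5% point of chi2(1)), the toy estimate should land near 0.05.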
Mirosław Mrozkowiak; Hanna Żukowska
2015-01-01
Mrozkowiak Mirosław, Żukowska Hanna. Znaczenie Dobrego Krzesła, jako elementu szkolnego i domowego środowiska ucznia, w profilaktyce zaburzeń statyki postawy ciała = The significance of Good Chair as part of children’s school and home environment in the preventive treatment of body statistics distortions. Journal of Education, Health and Sport. 2015;5(7):179-215. ISSN 2391-8306. DOI 10.5281/zenodo.19832 http://ojs.ukw.edu.pl/index.php/johs/article/view/2015%3B5%287%29%3A179-215 https:...
Fang, Yongxiang; Wit, Ernst
2008-01-01
Fisher’s combined probability test is the most commonly used method to test the overall significance of a set of independent p-values. However, Fisher’s statistic is obviously more sensitive to smaller p-values than to larger ones, so a single small p-value may overrule the other p-values and decide the test result. This is, in some cases, viewed as a flaw. In order to overcome this flaw and improve the power of the test, the joint tail probability of a set of p-values is proposed as a ...
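Fisher's method itself is compact enough to sketch in a few lines: the statistic X = -2·Σ ln pᵢ follows chi-squared with 2k degrees of freedom under the global null, and for even degrees of freedom the chi-square tail has the closed form P(X > x) = e^(-x/2) · Σ_{i<k} (x/2)^i / i!, so no statistics library is needed. The third assertion below exhibits the sensitivity the abstract criticises: one tiny p-value dominates two unremarkable ones:

```python
import math

def fisher_combined_pvalue(pvalues):
    """Fisher's combined probability test for k independent p-values,
    using the closed-form chi2(2k) survival function."""
    k = len(pvalues)
    x = -2.0 * sum(math.log(p) for p in pvalues)
    half = x / 2.0
    return math.exp(-half) * sum(half ** i / math.factorial(i) for i in range(k))
```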
Geall, A J; Eaton, M A; Baker, T; Catterall, C; Blagbrough, I S
1999-10-15
We have quantified the effects of the regiochemical distribution of positive charges along the polyamine moiety in lipopolyamines for DNA molecular recognition. High affinity binding leads to charge neutralisation, DNA condensation and ultimately to lipofection. Binding affinities for calf thymus DNA were determined using an ethidium bromide displacement assay and condensation was detected by changes in turbidity using light scattering. The in vitro transfection competence of cholesterol polyamine carbamates was measured in CHO cells. In the design of DNA condensing and transfecting agents for non-viral gene therapy, the interrelationship of ammonium ions, not just their number, must be considered.
Mi, Tian; Merlin, Jerlin Camilus; Deverasetty, Sandeep; Gryk, Michael R; Bill, Travis J; Brooks, Andrew W; Lee, Logan Y; Rathnayake, Viraj; Ross, Christian A; Sargeant, David P; Strong, Christy L; Watts, Paula; Rajasekaran, Sanguthevar; Schiller, Martin R
2012-01-01
Minimotif Miner (MnM, available at http://minimotifminer.org or http://mnm.engr.uconn.edu) is an online database for identifying new minimotifs in protein queries. Minimotifs are short contiguous peptide sequences that have a known function in at least one protein. Here we report the third release of the MnM database, which has now grown 60-fold to approximately 300,000 minimotifs. Since short minimotifs are by their nature not very complex, we also summarize a new set of false-positive filters and linear regression scoring that vastly enhance minimotif prediction accuracy on a test data set. This online database can be used to predict new functions in proteins and causes of disease.
cMiCE: a high resolution animal PET using continuous LSO with a statistics based positioning scheme
International Nuclear Information System (INIS)
Joung Jinhun; Miyaoka, R.S.; Lewellen, T.K.
2002-01-01
Objective: Detector designs for small animal scanners are currently dominated by discrete crystal implementations. However, given the small crystal cross-sections required to obtain very high resolution, discrete designs are typically expensive, have low packing fraction, reduced light collection, and are labor intensive to build. To overcome these limitations we have investigated the feasibility of using a continuous miniature crystal element (cMiCE) detector module for high resolution small animal PET applications. Methods: The detector module consists of a single continuous slab of LSO, 25×25 mm² in exposed cross-section and 4 mm thick, coupled directly to a PS-PMT (Hamamatsu R5900-00-C12). The large area surfaces of the crystal were polished and painted with TiO₂ and the short surfaces were left unpolished and painted black. Further, a new statistics based positioning (SBP) algorithm has been implemented to address linearity and edge effect artifacts that are inherent with conventional Anger style positioning schemes. To characterize the light response function (LRF) of the detector, data were collected on a coarse grid using a highly collimated coincidence setup. The LRF was then estimated using cubic spline interpolation. Detector performance has been evaluated for both SBP and Anger based decoding using measured data and Monte Carlo simulations. Results: Using the SBP scheme, edge artifacts were successfully handled. Simulation results show that the useful field of view (UFOV) was extended to ∼22×22 mm² with an average point spread function of ∼0.5 mm full width at half maximum (FWHM PSF). For the same detector with Anger decoding the UFOV of the detector was ∼16×16 mm² with an average FWHM PSF of ∼0.9 mm. Experimental results yielded similar differences between FOV and resolution performance. FWHM PSF for the SBP and Anger based methods was 1.4 and 2.0 mm respectively, uncorrected for source size, with a 1 mm diameter point source. Conclusion
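The idea behind statistics-based positioning can be sketched as a maximum-likelihood lookup against the characterized light response function: pick the position whose expected channel response best explains the observed PMT signals. The Gaussian noise model and all names below are illustrative assumptions, not the authors' exact estimator:

```python
import math

def sbp_estimate(event, lrf_mean, lrf_var):
    """Statistics-based positioning sketch. lrf_mean[pos][ch] and
    lrf_var[pos][ch] come from a characterization scan; `event` is the
    list of observed channel signals. Returns the most likely position."""
    def loglik(pos):
        return -sum(
            (event[ch] - lrf_mean[pos][ch]) ** 2 / (2 * lrf_var[pos][ch])
            + 0.5 * math.log(2 * math.pi * lrf_var[pos][ch])
            for ch in range(len(event))
        )
    return max(lrf_mean, key=loglik)
```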
Garces, E. L.; Garces, M. A.; Christe, A.
2017-12-01
The RedVox infrasound recorder app uses microphones and barometers in smartphones to record infrasound, low-frequency sound below the threshold of human hearing. We study a device's metadata, which include position, latency, the differences between the device's internal times and the server times, and the machine time, searching for patterns and possible errors or discontinuities in these scaled parameters. We highlight metadata variability through scaled multivariate displays (histograms, distribution curves, scatter plots), all created and organized through software development in Python. This project is helpful in ascertaining variability and honing the accuracy of smartphones, aiding the emergence of portable devices as viable geophysical data collection instruments. It can also improve the app and cloud service by increasing efficiency and accuracy, making it possible to better document and foresee drastic natural events like tsunamis, earthquakes, volcanic eruptions, storms, rocket launches, and meteor impacts; recorded data can later be used for studies and analysis by a variety of professions. We expect our final results to produce insight into how to counteract problematic issues in data mining and improve accuracy in smartphone data collection. By eliminating lurking variables and minimizing the effect of confounding variables, we hope to discover efficient processes to reduce superfluous precision, unnecessary errors, and data artifacts. These methods should conceivably be transferable to other areas of software development, data analytics, and statistics-based experiments, contributing a precedent of smartphone metadata studies from geophysical rather than societal data. The results should facilitate the rise of civilian-accessible, hand-held, data-gathering mobile sensor networks and yield more straightforward data mining techniques.
Xia, Li C; Ai, Dongmei; Cram, Jacob A; Liang, Xiaoyi; Fuhrman, Jed A; Sun, Fengzhu
2015-09-21
Local trend (i.e. shape) analysis of time series data reveals co-changing patterns in the dynamics of biological systems. However, slow permutation procedures to evaluate the statistical significance of local trend scores have limited its application to high-throughput time series data, e.g. data from studies based on next-generation sequencing technology. By extending the theory for the tail probability of the range of a sum of Markovian random variables, we propose formulae for approximating the statistical significance of local trend scores. Using simulations and real data, we show that the approximate p-value is close to that obtained using a large number of permutations (starting at time points >20 with no delay and >30 with delay of at most three time steps), in that the non-zero decimals of the p-values obtained by the approximation and the permutations are mostly the same when the approximate p-value is less than 0.05. In addition, the approximate p-value is slightly larger than that based on permutations, making hypothesis testing based on the approximate p-value conservative. The approximation enables efficient calculation of p-values for pairwise local trend analysis, making large-scale all-versus-all comparisons possible. We also propose a hybrid approach that integrates the approximation and permutations to obtain accurate p-values for significantly associated pairs. We further demonstrate its use with the analysis of the Plymouth Marine Laboratory (PML) microbial community time series from high-throughput sequencing data and found interesting organism co-occurrence dynamic patterns. The software tool is integrated into the eLSA software package, which now provides accelerated local trend and similarity analysis pipelines for time series data. The package is freely available from the eLSA website: http://bitbucket.org/charade/elsa.
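The slow baseline that the analytic formulae replace is a plain permutation test, sketched generically below; the score function is a placeholder argument, not the eLSA local trend score:

```python
import random

def permutation_pvalue(score_fn, x, y, n_perm=2000, seed=7):
    """One-sided permutation p-value for an association score between two
    series: shuffle y, recompute the score, count exceedances. Uses the
    standard (count+1)/(n_perm+1) estimator so p is never exactly zero."""
    rng = random.Random(seed)
    observed = score_fn(x, y)
    y_perm = list(y)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(y_perm)
        if score_fn(x, y_perm) >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)
```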
Maric, Marija; de Haan, Else; Hogendoorn, Sanne M; Wolters, Lidewij H; Huizenga, Hilde M
2015-03-01
Single-case experimental designs are useful methods in clinical research practice to investigate individual client progress. Their proliferation might have been hampered by methodological challenges such as the difficulty applying existing statistical procedures. In this article, we describe a data-analytic method to analyze univariate (i.e., one symptom) single-case data using the common package SPSS. This method can help the clinical researcher to investigate whether an intervention works as compared with a baseline period or another intervention type, and to determine whether symptom improvement is clinically significant. First, we describe the statistical method in a conceptual way and show how it can be implemented in SPSS. Simulation studies were performed to determine the number of observation points required per intervention phase. Second, to illustrate this method and its implications, we present a case study of an adolescent with anxiety disorders treated with cognitive-behavioral therapy techniques in an outpatient psychotherapy clinic, whose symptoms were regularly assessed before each session. We provide a description of the data analyses and results of this case study. Finally, we discuss the advantages and shortcomings of the proposed method. Copyright © 2014. Published by Elsevier Ltd.
Null, Cynthia H.
2009-01-01
In June 2004, the Space Flight Leadership Council (SFLC) assigned an action jointly to the NASA Engineering and Safety Center (NESC) and the External Tank (ET) project to characterize the available dataset [of defect sizes from dissections of foam], identify resultant limitations to statistical treatment of ET as-built foam as part of the overall thermal protection system (TPS) certification, and report to the Program Requirements Change Board (PRCB) and SFLC in September 2004. The NESC statistics team was formed to assist the ET statistics group in August 2004. The NESC's conclusions are presented in this report.
Pardo-Igúzquiza, Eulogio; Rodríguez-Tovar, Francisco J.
2012-12-01
Many spectral analysis techniques have been designed assuming sequences taken with a constant sampling interval. However, there are empirical time series in the geosciences (sediment cores, fossil abundance data, isotope analysis, …) that do not follow regular sampling because of missing data, gapped data, random sampling or incomplete sequences, among other reasons. In general, interpolating an uneven series in order to obtain a succession with a constant sampling interval alters the spectral content of the series. In such cases it is preferable to follow an approach that works with the uneven data directly, avoiding the need for an explicit interpolation step. The Lomb-Scargle periodogram is a popular choice in such circumstances, as there are programs available in the public domain for its computation. One new computer program for spectral analysis improves the standard Lomb-Scargle periodogram approach in two ways: (1) it explicitly adjusts the statistical significance for any bias introduced by variance-reduction smoothing, and (2) it uses a permutation test to evaluate confidence levels, which is better suited than parametric methods when neighbouring frequencies are highly correlated. Another novel program for cross-spectral analysis offers the advantage of estimating the Lomb-Scargle cross-periodogram of two uneven time series defined on the same interval, and it evaluates the confidence levels of the estimated cross-spectra by a non-parametric, computer-intensive permutation test. Thus, the cross-spectrum, the squared coherence spectrum, the phase spectrum, and the Monte Carlo statistical significance of the cross-spectrum and the squared-coherence spectrum can be obtained. Both programs are written in ANSI Fortran 77, in view of its simplicity and compatibility. The program code is in the public domain, provided on the website of the journal (http://www.iamg.org/index.php/publisher/articleview/frmArticleID/112/). Different examples (with simulated and
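The core estimator these programs build on can be written directly from the classical Lomb-Scargle formula, with the time offset τ chosen per frequency so that the sine and cosine terms decouple. This sketch omits the smoothing-bias adjustment and permutation machinery that the programs add:

```python
import math

def lomb_scargle(t, x, freqs):
    """Classic (unnormalised) Lomb-Scargle periodogram for an unevenly
    sampled series x(t), evaluated at the given frequencies."""
    xm = sum(x) / len(x)
    xc = [v - xm for v in x]          # mean-centred data
    power = []
    for f in freqs:
        w = 2 * math.pi * f
        # tan(2*w*tau) = sum(sin 2wt) / sum(cos 2wt)
        tau = math.atan2(sum(math.sin(2 * w * tj) for tj in t),
                         sum(math.cos(2 * w * tj) for tj in t)) / (2 * w)
        c = [math.cos(w * (tj - tau)) for tj in t]
        s = [math.sin(w * (tj - tau)) for tj in t]
        xc_c = sum(a * b for a, b in zip(xc, c))
        xc_s = sum(a * b for a, b in zip(xc, s))
        power.append(0.5 * (xc_c ** 2 / sum(v * v for v in c)
                            + xc_s ** 2 / sum(v * v for v in s)))
    return power
```

A pure sinusoid sampled at jittered times should yield its largest power at the true frequency.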
Kim, Yee Suk; Lee, Sungin; Zong, Nansu; Kahng, Jimin
2017-01-01
The present study aimed to investigate differences in prognosis based on human papillomavirus (HPV) infection, persistent infection and genotype variations for patients exhibiting atypical squamous cells of undetermined significance (ASCUS) in their initial Papanicolaou (PAP) test results. A latent Dirichlet allocation (LDA)-based tool was developed that may offer a facilitated means of communication to be employed during patient-doctor consultations. The present study assessed 491 patients (139 HPV-positive and 352 HPV-negative cases) with a PAP test result of ASCUS with a follow-up period ≥2 years. Patients underwent PAP and HPV DNA chip tests between January 2006 and January 2009. The HPV-positive subjects were followed up with at least 2 instances of PAP and HPV DNA chip tests. The most common genotypes observed were HPV-16 (25.9%, 36/139), HPV-52 (14.4%, 20/139), HPV-58 (13.7%, 19/139), HPV-56 (11.5%, 16/139), HPV-51 (9.4%, 13/139) and HPV-18 (8.6%, 12/139). A total of 33.3% (12/36) patients positive for HPV-16 had cervical intraepithelial neoplasia (CIN)2 or a worse result, which was significantly higher than the prevalence of CIN2 of 1.8% (8/455) in patients negative for HPV-16 (Paged ≥51 years (38.7%) than in those aged ≤50 years (20.4%; P=0.036). Progression from persistent infection to CIN2 or worse (19/34, 55.9%) was higher than clearance (0/105, 0.0%; Page and long infection period with a clinical progression of CIN2 or worse. Therefore, LDA results may be presented as explanatory evidence during time-constrained patient-doctor consultations in order to deliver information regarding the patient's status. PMID:28587376
Dubois, Albertine; Hérard, Anne-Sophie; Delatour, Benoît; Hantraye, Philippe; Bonvento, Gilles; Dhenain, Marc; Delzescaux, Thierry
2010-06-01
Biomarkers and technologies similar to those used in humans are essential for the follow-up of Alzheimer's disease (AD) animal models, particularly for the clarification of mechanisms and the screening and validation of new candidate treatments. In humans, changes in brain metabolism can be detected by 1-deoxy-2-[(18)F] fluoro-D-glucose PET (FDG-PET) and assessed in a user-independent manner with dedicated software, such as Statistical Parametric Mapping (SPM). FDG-PET can be carried out in small animals, but its resolution is low as compared to the size of rodent brain structures. In mouse models of AD, changes in cerebral glucose utilization are usually detected by [(14)C]-2-deoxyglucose (2DG) autoradiography, but this requires prior manual outlining of regions of interest (ROI) on selected sections. Here, we evaluate the feasibility of applying the SPM method to 3D autoradiographic data sets mapping brain metabolic activity in a transgenic mouse model of AD. We report the preliminary results obtained with 4 APP/PS1 (64+/-1 weeks) and 3 PS1 (65+/-2 weeks) mice. We also describe new procedures for the acquisition and use of "blockface" photographs and provide the first demonstration of their value for the 3D reconstruction and spatial normalization of post mortem mouse brain volumes. Despite this limited sample size, our results appear to be meaningful, consistent, and more comprehensive than findings from previously published studies based on conventional ROI-based methods. The establishment of statistical significance at the voxel level, rather than with a user-defined ROI, makes it possible to detect more reliably subtle differences in geometrically complex regions, such as the hippocampus. Our approach is generic and could be easily applied to other biomarkers and extended to other species and applications. Copyright 2010 Elsevier Inc. All rights reserved.
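The voxel-level contrast at the heart of an SPM-style group comparison is simply a two-sample t statistic computed independently at every voxel of the spatially normalised volumes. A toy sketch on flattened volumes, without the smoothing and multiple-comparison correction a real SPM analysis applies:

```python
def voxelwise_t(group_a, group_b):
    """Pooled-variance two-sample t statistic at each voxel; each volume
    is a flat list of voxel values, all volumes spatially normalised."""
    na, nb = len(group_a), len(group_b)
    tmap = []
    for v in range(len(group_a[0])):
        a = [vol[v] for vol in group_a]
        b = [vol[v] for vol in group_b]
        ma, mb = sum(a) / na, sum(b) / nb
        va = sum((x - ma) ** 2 for x in a) / (na - 1)
        vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
        sp = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled var
        tmap.append((ma - mb) / (sp * (1 / na + 1 / nb)) ** 0.5)
    return tmap
```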
International Nuclear Information System (INIS)
Saeki, A.; Tagawa, S.; Kozawa, T.; Yoshida, Y.
2003-01-01
Time-dependent behaviors of the radical cation in n-dodecane in the presence of the cation scavenger triethylamine at high concentration were measured with a subpicosecond pulse radiolysis system. A significant reduction of the initial yield in the optical density was observed. This reduction could not be explained by the first-order rate constant. Therefore, we assumed that this phenomenon occurs due to the adjacent effect of the solute molecules. We approached this effect with a statistical model and the configurational-bias Monte Carlo method. In both methods, we supposed a condition in which the cation site in the radical cation is delocalized and will be scavenged rapidly, within the time resolution, if a solute molecule is adjacent to any site of the solvent. In addition to the adjacent effect, the fact that a large part of the solvent molecules is excluded by the solute molecules, especially at high concentration, was taken into consideration. First, we formulated this effect with a statistical model. In addition to the above assumption, this model is based on the following assumptions: the effects of molecular shape, conformation and interactions among molecules were ignored, and the aggregation of the solute molecules was treated randomly. As a result, the formula showed good agreement with the experimental data. Second, as another approach, we adopted the configurational-bias Monte Carlo simulation to reproduce the liquid system. The OPLS model was used to describe the intermolecular and intramolecular potentials. The adjacent effect estimated by this method corresponded to the experimental data with a threshold of 0.5 nm. This value is close to a typical reaction radius. The average number of adjacent solvent molecules and the distribution of the number of aggregated solutes were also collected from the position data
Energy Technology Data Exchange (ETDEWEB)
Dai Kubicky, Charlotte, E-mail: charlottedai@gmail.com [Department of Radiation Medicine and Knight Cancer Institute, Oregon Health and Science University, Portland, Oregon (United States); Mongoue-Tchokote, Solange [Biostatistics Shared Resource, Knight Cancer Institute, Oregon Health and Science University, Portland, Oregon (United States)
2013-04-01
Purpose: To determine whether patients with 1, 2, or 3 positive lymph nodes (LNs) have similar survival outcomes. Methods and Materials: We analyzed the Surveillance, Epidemiology, and End Results registry of breast cancer patients diagnosed between 1990 and 2003. We identified 10,415 women with T1-2N1M0 breast cancer who were treated with mastectomy with no adjuvant radiation, with at least 10 LNs examined and 6 months of follow-up. The Kaplan-Meier method and log–rank test were used for survival analysis. Multivariate analysis was performed using the Cox proportional hazard model. Results: Median follow-up was 92 months. Ten-year overall survival (OS) and cause-specific survival (CSS) were progressively worse with increasing number of positive LNs. Survival rates were 70%, 64%, and 60% (OS), and 82%, 76%, and 72% (CSS) for 1, 2, and 3 positive LNs, respectively. Pairwise log–rank test P values were <.001 (1 vs 2 positive LNs), <.001 (1 vs 3 positive LNs), and .002 (2 vs 3 positive LNs). Multivariate analysis showed that number of positive LNs was a significant predictor of OS and CSS. Hazard ratios increased with the number of positive LNs. In addition, age, primary tumor size, grade, estrogen receptor and progesterone receptor status, race, and year of diagnosis were significant prognostic factors. Conclusions: Our study suggests that patients with 1, 2, and 3 positive LNs have distinct survival outcomes, with increasing number of positive LNs associated with worse OS and CSS. The conventional grouping of 1-3 positive LNs needs to be reconsidered.
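The Kaplan-Meier estimate used for the OS and CSS curves above is a textbook product-limit estimator and can be sketched compactly (this is the generic estimator, not the registry analysis itself):

```python
from itertools import groupby

def kaplan_meier(times, events):
    """Product-limit survival estimate. `times` are follow-up times,
    `events` are 1 for death and 0 for censoring. Returns (t, S(t))
    pairs at each distinct event time."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    s = 1.0
    curve = []
    for t, grp in groupby(data, key=lambda te: te[0]):
        grp = list(grp)
        deaths = sum(e for _, e in grp)
        if deaths:
            s *= (at_risk - deaths) / at_risk
            curve.append((t, s))
        at_risk -= len(grp)   # deaths and censorings both leave the risk set
    return curve
```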
Maric, M.; de Haan, M.; Hogendoorn, S.M.; Wolters, L.H.; Huizenga, H.M.
2015-01-01
Single-case experimental designs are useful methods in clinical research practice to investigate individual client progress. Their proliferation might have been hampered by methodological challenges such as the difficulty applying existing statistical procedures. In this article, we describe a
Maric, Marija; de Haan, Else; Hogendoorn, Sanne M.; Wolters, Lidewij H.; Huizenga, Hilde M.
2015-01-01
Single-case experimental designs are useful methods in clinical research practice to investigate individual client progress. Their proliferation might have been hampered by methodological challenges such as the difficulty applying existing statistical procedures. In this article, we describe a
Energy Technology Data Exchange (ETDEWEB)
Zhang, Pei, E-mail: pei.zhang@desy.de [School of Physics and Astronomy, The University of Manchester, Oxford Road, Manchester M13 9PL (United Kingdom); Deutsches Elektronen-Synchrotron (DESY), Notkestraße 85, D-22607 Hamburg (Germany); Cockcroft Institute of Science and Technology, Daresbury WA4 4AD (United Kingdom); Baboi, Nicoleta [Deutsches Elektronen-Synchrotron (DESY), Notkestraße 85, D-22607 Hamburg (Germany); Jones, Roger M. [School of Physics and Astronomy, The University of Manchester, Oxford Road, Manchester M13 9PL (United Kingdom); Cockcroft Institute of Science and Technology, Daresbury WA4 4AD (United Kingdom)
2014-01-11
Beam-excited higher order modes (HOMs) can be used to provide beam diagnostics. Here we focus on 3.9 GHz superconducting accelerating cavities. In particular we study dipole mode excitation and its application to beam position determinations. In order to extract beam position information, linear regression can be used. Due to the large number of sampling points in the waveforms, statistical methods are used to effectively reduce the dimension of the system, such as singular value decomposition (SVD) and k-means clustering. These are compared with direct linear regression (DLR) on the entire waveforms. A cross-validation technique is used to study the sample-independent precision of the position predictions given by these three methods. An RMS prediction error in the beam position of approximately 50 μm can be achieved by DLR and SVD, while k-means clustering suggests 70 μm.
Zhang, P; Jones, R M
2014-01-01
Beam-excited higher order modes (HOMs) can be used to provide beam diagnostics. Here we focus on 3.9 GHz superconducting accelerating cavities. In particular we study dipole mode excitation and its application to beam position determinations. In order to extract beam position information, linear regression can be used. Due to the large number of sampling points in the waveforms, statistical methods are used to effectively reduce the dimension of the system, such as singular value decomposition (SVD) and k-means clustering. These are compared with direct linear regression (DLR) on the entire waveforms. A cross-validation technique is used to study the sample-independent precision of the position predictions given by these three methods. An RMS prediction error in the beam position of approximately 50 micron can be achieved by DLR and SVD, while k-means clustering suggests 70 micron.
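The pipeline of dimension reduction followed by linear regression can be illustrated with a rank-1 "poor man's SVD" computed by power iteration: compress each waveform to its score along the leading principal direction, then regress position on that score. The single-component reduction and all names are illustrative simplifications of the paper's SVD approach:

```python
def top_component(rows, iters=200):
    """Leading principal direction of centred row vectors via power
    iteration on C^T C (without forming the matrix explicitly)."""
    d = len(rows[0])
    means = [sum(r[j] for r in rows) / len(rows) for j in range(d)]
    c = [[r[j] - means[j] for j in range(d)] for r in rows]
    v = [1.0] * d
    for _ in range(iters):
        proj = [sum(ci[j] * v[j] for j in range(d)) for ci in c]   # C v
        w = [sum(p * ci[j] for p, ci in zip(proj, c)) for j in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return means, v

def fit_position(rows, positions):
    """Regress position on the 1-D SVD score of each waveform; returns a
    predictor for new waveforms."""
    means, v = top_component(rows)
    scores = [sum((r[j] - means[j]) * v[j] for j in range(len(v))) for r in rows]
    sm = sum(scores) / len(scores)
    pm = sum(positions) / len(positions)
    slope = (sum((s - sm) * (p - pm) for s, p in zip(scores, positions))
             / sum((s - sm) ** 2 for s in scores))
    intercept = pm - slope * sm
    return lambda r: intercept + slope * sum(
        (r[j] - means[j]) * v[j] for j in range(len(v)))
```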
Energy Technology Data Exchange (ETDEWEB)
Tamada, Masao; Kasai, Noboru; Seko, Noriaki; Hasegawa, Shin; Takeda, Hayato; Katakai, Akio; Sugo, Takanobu [Japan Atomic Energy Research Inst., Takasaki, Gunma (Japan). Takasaki Radiation Chemistry Research Establishment; Kawabata, Yukiya [Ebara Reseach Co., Ltd., Fujisawa, Kanagawa (Japan); Onuma, Kenji [Mitsubishi Materials Corp., Tokyo (Japan)
2001-11-01
A positioning and monitoring system for the marine component of equipment for recovering significant metals from seawater with adsorbents was designed and assembled to avoid unexpected drifting accidents. The system, mounted on the float part of the marine component, obtains positioning data from GPS satellites and sends them to the Takasaki and Mutsu establishments through satellite communication, where the position data are shown on computer displays. In a characterization test of 20 days at sea, 262 data points were obtained at 2-hour intervals. Twice the distance root mean square (2DRMS) was 223.7 m. To improve this performance, three new functions were added to the firmware: raising the positioning resolution in longitude and latitude from 0.001 to 0.00001 degrees; removing reflections of the GPS signal from the sea surface; and averaging the three positioning data that remain after the maximum and minimum are omitted from each set of five consecutive fixes. The improved system shows a 2DRMS positioning of 15.5 m. This performance is sufficient to prevent the marine component from drifting accidents. (author)
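The averaging rule described above (drop the maximum and minimum of five consecutive fixes, average the remaining three) and the 2DRMS figure of merit are easy to state in code; function names here are my own:

```python
import math

def smoothed_fix(fixes):
    """Per-axis robust mean of five (lat, lon) fixes: drop max and min,
    average the remaining three."""
    def robust_mean(vals):
        return (sum(vals) - max(vals) - min(vals)) / (len(vals) - 2)
    return (robust_mean([f[0] for f in fixes]),
            robust_mean([f[1] for f in fixes]))

def two_drms(errors_m):
    """2DRMS: twice the root-mean-square of horizontal position errors."""
    return 2 * math.sqrt(sum(e * e for e in errors_m) / len(errors_m))
```

A single wild fix (e.g. a sea-surface reflection) is discarded rather than averaged in.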
Kordy, Faisal; Richardson, Susan E; Stephens, Derek; Lam, Ray; Jamieson, Frances; Kitai, Ian
2015-01-01
In countries with low rates of tuberculosis (TB), yields of gastric aspirates (GAs) for Mycobacterium tuberculosis (MTB) culture are low. The significance of non-tuberculous mycobacteria (NTM) isolated from GAs is uncertain. We reviewed clinical, microbiologic and radiologic data for children who underwent GA between 1999 and 2011 at Sick Kids, Toronto. Radiologic features of cases were compared with those of age-matched controls. 785 GAs were obtained from 285 patients, of whom 20 (7%) had positive MTB cultures; in 15 patients the GA was the only positive culture for MTB. Of 15 culture-positive patients who underwent exactly 3 GAs, MTB was isolated from the first lavage in 10 (67%), only from the second in 3 (20%) and only from the third in 2 (13%). On univariate analysis, miliary disease and intrathoracic lymphadenopathy were associated with a positive GA MTB culture. On multiple conditional logistic regression analysis, adenopathy remained significant (OR 10.2 [95% CI 2.0-51.4], p = 0.005). Twelve patients had NTM isolated, most commonly M. avium complex: none had evidence of invasive NTM disease during a median follow-up of 12 months. Causal pathogens different from the GA NTM culture were isolated from biopsies or bronchoalveolar lavage in 3 patients. GAs continue to be important for TB diagnosis in children. Three GAs have a better yield than one. Those with miliary or disseminated TB and intrathoracic lymphadenopathy have the highest yields. NTM isolates from GAs are likely unimportant and can be clinically misleading.
Binny, Diana; Lancaster, Craig M; Trapp, Jamie V; Crowe, Scott B
2017-09-01
This study utilizes process control techniques to identify action limits for TomoTherapy couch positioning quality assurance tests. A test was introduced to monitor the accuracy of the applied couch offset detection in the TomoTherapy Hi-Art treatment system using the TQA "Step-Wedge Helical" module and MVCT detector. Individual X-charts, process capability (cp), probability (P), and acceptability (cpk) indices were used to monitor 4 years of couch IEC offset data to detect systematic and random errors in couch positional accuracy at different action levels. Process capability tests were also performed on the retrospective data to define tolerances based on user-specified levels. A second study was carried out whereby physical couch offsets were applied using the TQA module and the MVCT detector was used to detect the observed variations. Random and systematic variations were observed for the SPC-based upper and lower control limits, and investigations were carried out to maintain the ongoing stability of the process over a 4-year and a three-monthly period. Local trend analysis showed mean variations of up to ±0.5 mm in the three-monthly analysis period for all IEC offset measurements. Variations were also observed between detected and applied offsets using the MVCT detector in the second study, largely in the vertical direction, and actions were taken to remediate this error. Based on the results, it was recommended that imaging shifts in each coordinate direction be applied only after assessing the machine for applied-versus-detected test results using the step helical module. User-specified tolerance levels of at least ±2 mm were recommended for a test frequency of once every 3 months to improve couch positional accuracy. SPC enables detection of systematic variations prior to reaching machine tolerance levels. Couch encoding system recalibrations reduced variations to user-specified levels and a monitoring period of 3 months using SPC facilitated in detecting
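The individuals X-chart limits and the acceptability index cpk used in this kind of SPC monitoring follow standard formulae; below is a minimal sketch with the conventional Shewhart constant 2.66 for moving ranges of size 2 and user-specified spec limits (e.g. ±2 mm), not the authors' exact configuration:

```python
def xchart_limits(values):
    """Individuals X-chart control limits: centre line +/- 2.66 * mean
    moving range (the standard constant for n=2 moving ranges)."""
    mean = sum(values) / len(values)
    mrs = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(mrs) / len(mrs)
    return mean - 2.66 * mr_bar, mean + 2.66 * mr_bar

def cpk(values, lsl, usl):
    """Process acceptability index against lower/upper spec limits."""
    mean = sum(values) / len(values)
    sd = (sum((v - mean) ** 2 for v in values) / (len(values) - 1)) ** 0.5
    return min(usl - mean, mean - lsl) / (3 * sd)
```

A cpk above ~1.33 is conventionally read as a capable process within the spec band.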
Ohlendorf, D; Riegel, M; Lin Chung, T; Kopp, S
2013-01-01
The purpose of this study was to investigate the effects on postural stability of two different lower jaw positions held in place by splints, with the eyes open and closed. Postural stability in 21 healthy adult volunteers was investigated using two different sets of occlusal conditions, with the lower jaw at rest and the eyes either open or closed. Two occlusal splints (a standard splint and a DPS splint) were used to maintain the lower jaw position. Balance behaviour was recorded using a balance platform. In a comparison of habitual occlusion with the two occlusal splints, the posturographic balance values with the eyes open fell by between 7-9% and those for weight distribution with the eyes closed by between 22-26% (with greater improvement achieved with the DPS splint), so that the variability in the range of fluctuations was reduced. Positioning accuracy deteriorated with the wearing of a splint, by between 13% with the DPS splint and 30% with the standard splint. Gender-specific differences of minor importance in relation to positioning accuracy were recorded, with significant differences in the female participants (P≤0.00). An occlusal change in the stomatognathic system impacts postural stability. Balance deficits seem to correlate with deteriorated body sway, which, according to the results, can be improved by a myocentric bite position using a DPS splint. This is more the case with the eyes closed than with the eyes open.
Sareen, Nishtha; Baber, Usman; Kezbor, Safwan; Sayseng, Sonny; Aquino, Melissa; Mehran, Roxana; Sweeny, Joseph; Barman, Nitin; Kini, Annapoorna; Sharma, Samin K
2017-04-07
Coronary revascularisation based upon physiological evaluation of lesions improves clinical outcomes. Angiographic or visual stenosis assessment alone is insufficient in predicting haemodynamic stenosis severity by fractional flow reserve (FFR) and therefore cannot be used to guide revascularisation, particularly in the intermediate lesion subset; we therefore formulated a scoring system. Of 1,023 consecutive lesions (883 patients), 314 (31%) were haemodynamically significant. Characteristics associated with FFR ≤0.8 include male gender, higher SYNTAX score, lesions ≥20 mm, stenosis >50%, bifurcation, calcification, absence of tortuosity and smaller reference diameter. A user-friendly integer score was developed from the five variables demonstrating the strongest association. On prospective validation (in 279 distinct lesions), increasing values of the score correlated well with increasing haemodynamic significance (C-statistic 0.85). We identified several clinical and angiographic characteristics and formulated a scoring system to guide the approach to intermediate lesions. This may translate into cost savings. Larger studies with prospective validation are required to confirm our results.
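The C-statistic reported for the validation cohort is the probability that a haemodynamically significant lesion receives a higher score than a non-significant one. A minimal sketch of that computation, using made-up scores and outcomes rather than the study's data:

```python
def c_statistic(scores, labels):
    """Concordance (C-statistic): the probability that a randomly chosen
    positive case (label 1) scores higher than a randomly chosen negative
    case (label 0). Ties count as half-concordant."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    pairs = len(pos) * len(neg)
    concordant = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return concordant / pairs

# Hypothetical integer risk scores vs. FFR <= 0.8 outcome (1 = significant)
scores = [5, 4, 4, 3, 2, 2, 1, 0]
labels = [1, 1, 0, 1, 0, 0, 0, 0]
```

A value of 0.5 means the score is no better than chance; the 0.85 quoted in the abstract indicates strong discrimination.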
Mushtaq, Ammara; Chen, Derrick J; Strand, Gregory J; Dylla, Brenda L; Cole, Nicolynn C; Mandrekar, Jayawant; Patel, Robin
2016-07-01
With the advent of matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS), most Gram-positive rods (GPRs) are readily identified; however, their clinical relevance in blood cultures remains unclear. Herein, we assessed the clinical significance of GPRs isolated from blood and identified in the era of MALDI-TOF MS. A retrospective chart review of patients presenting to the Mayo Clinic, Rochester, MN, from January 1, 2013, to October 13, 2015, was performed. Any episode of a positive blood culture for a GPR was included. We assessed the number of bottles positive for a given isolate, time to positivity of blood cultures, patient age, medical history, interpretation of culture results by the healthcare team and whether infectious diseases consultation was obtained. We also evaluated the susceptibility profiles of a larger collection of GPRs tested in the clinical microbiology laboratory of the Mayo Clinic, Rochester, MN from January 1, 2013, to October 31, 2015. There were a total of 246 GPRs isolated from the blood of 181 patients during the study period. 56% (n = 101) were deemed contaminants by the healthcare team and were not treated; 33% (n = 59) were clinically determined to represent true bacteremia and were treated; and 8% (n = 14) were considered of uncertain significance, with patients prescribed treatment regardless. Patient characteristics associated with an isolate being treated on univariate analysis included younger age (P = 0.02), identification to the species level (P = 0.02), higher number of positive blood culture sets (P < 0.0001), lower time to positivity (P < 0.0001), immunosuppression (P = 0.03), and recommendation made by an infectious disease consultant (P = 0.0005). On multivariable analysis, infectious diseases consultation (P = 0.03), higher number of positive blood culture sets (P = 0.0005) and lower time to positivity (P = 0.03) were associated with an isolate being treated. 100, 83, 48 and 34% of GPRs
Directory of Open Access Journals (Sweden)
Hickethier T
2018-05-01
Full Text Available Tilman Hickethier,1,* Kamal Mammadov,1,* Bettina Baeßler,1 Thorsten Lichtenstein,1 Jochen Hinkelbein,2 Lucy Smith,3 Patrick Sven Plum,4 Seung-Hun Chon,4 David Maintz,1 De-Hua Chang1 1Department of Radiology, University Hospital of Cologne, Cologne, Germany; 2Department of Anesthesiology and Intensive Care Medicine, University Hospital of Cologne, Cologne, Germany; 3Faculty of Medicine, Memorial University of Newfoundland, St. John’s, Canada; 4Department of General, Visceral and Cancer Surgery, University Hospital of Cologne, Cologne, Germany *These authors contributed equally to this work Background: The study was conducted to compare examination time and artifact vulnerability of whole-body computed tomographies (wbCTs) for trauma patients using conventional or optimized patient positioning. Patients and methods: Examination time was measured in 100 patients scanned with the conventional protocol (Group A: arms positioned alongside the body for head and neck imaging and over the head for trunk imaging) and 100 patients scanned with the optimized protocol (Group B: arms flexed on a chest pillow without repositioning). Additionally, the influence of the two scanning protocols on image quality in the most relevant body regions was assessed by two blinded readers. Results: Total wbCT duration was about 35% or 3:46 min shorter in B than in A. Artifacts in the aorta (27% vs 6%), liver (40% vs 8%) and spleen (27% vs 5%) occurred significantly more often in B than in A. No incident of non-diagnostic image quality was reported, and no significant differences for lungs and spine were found. Conclusion: An optimized wbCT positioning protocol for trauma patients allows a significant reduction of examination time while still maintaining diagnostic image quality. Keywords: CT scan, polytrauma, acute care, time requirement, positioning
Sy, Bui Tien; Ratsch, Boris A.; Toan, Nguyen Linh; Song, Le Huu; Wollboldt, Christian; Bryniok, Agnes; Nguyen, Hung Minh; Luong, Hoang Van; Velavan, Thirumalaisamy P.; Wedemeyer, Heiner; Kremsner, Peter G.; Bock, C.-Thomas
2013-01-01
Background Hepatitis D virus (HDV) infection is considered to cause more severe hepatitis than hepatitis B virus (HBV) monoinfection. With more than 9.5 million HBV-infected people, Vietnam faces an enormous health burden. The prevalence of HDV in Vietnamese HBsAg-positive patients is speculative. Therefore, we assessed the prevalence of HDV in Vietnamese patients, determined the HDV-genotype distribution and compared the findings with the clinical outcome. Methods 266 sera of well-characterized HBsAg-positive patients in Northern Vietnam were analysed for the presence of HDV using newly developed HDV-specific RT-PCRs. Sequencing and phylogenetic analysis were performed for HDV-genotyping. Results The HDV-genome prevalence observed in the Vietnamese HBsAg-positive patients was high at 15.4%, while patients with acute hepatitis showed 43.3%. Phylogenetic analysis demonstrated a predominance of HDV-genotype 1 clustering in an Asian clade, while HDV-genotype 2 could also be detected. The serum aminotransferase levels (AST, ALT) as well as total and direct bilirubin were significantly elevated in HDV-positive individuals (p<0.05). HDV loads were mainly low (<300 to 4×10⁸ HDV copies/ml). Of note, higher HDV loads were mainly found in HBV-genotype mix samples in contrast to single HBV infections. In HBV/HDV coinfections, HBV loads were significantly higher in HBV-genotype C than in HBV-genotype A samples (p<0.05). Conclusion HDV prevalence is high in Vietnamese individuals, especially in patients with acute hepatitis B. HDV replication activity showed an HBV-genotype dependency and could be associated with elevated liver parameters. Besides serological assays, molecular tests are recommended for diagnosis of HDV. Finally, the high prevalence of HBV and HDV prompts the urgent need for HBV vaccination coverage. PMID:24205106
Directory of Open Access Journals (Sweden)
Garth L. Nicolson
2012-03-01
Full Text Available Background: An open-label 8-week preliminary study was conducted in a small number of patients to determine if a combination oral supplement containing a mixture of phosphoglycolipids, coenzyme Q10, microencapsulated NADH and other nutrients could affect fatigue levels in long-term, Western blot-positive, multi-symptom ‘chronic Lyme disease’ patients (also called ‘post-treatment Lyme disease’ or ‘post-Lyme syndrome’) with intractable fatigue. Methods: The subjects in this study were 6 males (mean age = 45.1 ± 12.4 years) and 10 females (mean age = 54.6 ± 7.4 years) with ‘chronic Lyme disease’ (determined by multiple symptoms and positive Western blot analysis) who had been symptomatic with chronic fatigue for an average of 12.7 ± 6.6 years. They had been seen by multiple physicians (13.3 ± 7.6) and had used many other remedies, supplements and drugs (14.4 ± 7.4) without fatigue relief. Fatigue was monitored at 0, 7, 30 and 60 days using a validated instrument, the Piper Fatigue Scale. Results: Patients in this preliminary study responded to the combination test supplement, showing a 26% reduction in overall fatigue by the end of the 8-week trial (p < 0.0003). Analysis of subcategories of fatigue indicated that there were significant improvements in the ability to complete tasks and activities as well as significant improvements in mood and cognitive abilities. Regression analysis of the data indicated that reductions in fatigue were consistent and occurred with a high degree of confidence (R² = 0.998). Functional Foods in Health and Disease 2012, 2(3):35-47. Conclusions: The combination supplement was a safe and effective method to significantly reduce intractable fatigue in long-term patients with Western blot-positive ‘chronic Lyme disease.’
Directory of Open Access Journals (Sweden)
LI Qiang
2016-07-01
Full Text Available Objective To investigate the independent predictive factors for significant liver histological changes (SLHCs) in patients with HBeAg-positive high-viral-load chronic hepatitis B virus (HBV) infection and a normal alanine aminotransferase (ALT) level. Methods A retrospective analysis was performed on the clinical data of 116 previously untreated patients with HBeAg-positive high-viral-load (HBV DNA ≥10⁵ copies/ml) chronic HBV infection and a normal ALT level (<50 U/L) who were hospitalized in Shanghai Public Health Clinical Center Affiliated to Fudan University from June 2013 to August 2015. SLHCs were defined as inflammation ≥G2 and/or fibrosis ≥S2. The t-test or Mann-Whitney U rank sum test was used for comparison of continuous data between groups, and the chi-square test was used for comparison of categorical data between groups. Univariate and multivariate regression analyses were used to determine independent predictive factors for SLHCs. Results Of all 116 patients, 47 (40.5%) had SLHCs. The multivariate analysis showed that age (OR=2.828, P<0.05), ALT (OR=1.011, P<0.05), and gamma-glutamyl transpeptidase (GGT) (OR=1.089, P<0.05) were independent predictors of SLHCs in patients with HBeAg-positive high-viral-load chronic HBV infection and a normal ALT level. The patients aged ≤30 years had a significantly lower incidence rate of SLHCs than those aged >30 years (21.6% vs 49.4%, χ²=6.42, P=0.015), the patients with ALT ≤30 U/L had a significantly lower incidence rate of SLHCs than those with 30 U/L<ALT≤50 U/L (17.6% vs 50.0%, χ²=19.86, P<0.001), and the patients with GGT ≤40 U/L had a significantly lower incidence rate of SLHCs than those with GGT >40 U/L (28.8% vs 66.7%, χ²=28.63, P<0.001). Conclusion In patients with HBeAg-positive high-viral-load chronic HBV infection and a normal ALT level, those aged >30 years, with ALT >30 U/L and GGT >40 U/L, tend to develop SLHCs and need liver biopsy.
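The chi-square comparisons in the abstract above are Pearson tests on 2×2 incidence tables. A minimal sketch of that test (the counts below are invented for illustration, not the study's data, so the statistic will not match the reported χ² values):

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 contingency table
    [[a, b], [c, d]], without continuity correction."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical table: rows = age <=30 / age >30, cols = SLHCs yes / no
chi2 = chi_square_2x2(8, 29, 39, 40)
```

With 1 degree of freedom, a statistic above 3.84 corresponds to P < 0.05, which is the threshold the abstract's group comparisons are judged against.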
International Nuclear Information System (INIS)
Daziano, C.
2010-01-01
Statistical analysis of trace elements in these volcanic rocks allowed two independent populations within the same geochemical environment to be distinguished. Each component has a variable homogeneity index, resulting in dissimilar average values that reveal intratelluric geochemical phenomena. On the other hand, the inhomogeneities observed in these rocks, as reflected in their petrochemical characters, could be exacerbated by the remote and dispersed locations of their outcrops, by their relations with the enclosing rocks, and by the ranges of compositional variation due to differences in relative age.
Vrazo, Alexandra C; Firth, Jacqueline; Amzel, Anouk; Sedillo, Rebecca; Ryan, Julia; Phelps, B Ryan
2018-02-01
Despite the success of Prevention of Mother-to-Child Transmission of HIV (PMTCT) programmes, low uptake of services and poor retention pose a formidable challenge to achieving the elimination of vertical HIV transmission in low- and middle-income countries. This systematic review summarises interventions that demonstrate statistically significant improvements in service uptake and retention of HIV-positive pregnant and breastfeeding women and their infants along the PMTCT cascade. Databases were systematically searched for peer-reviewed studies. Outcomes of interest included uptake of services, such as antiretroviral therapy (ART) initiation and early infant diagnostic testing, and retention of HIV-positive pregnant and breastfeeding women and their infants. Interventions that led to statistically significant outcomes were included and mapped to the PMTCT cascade. An eight-item assessment tool assessed study rigour, and the review was registered with PROSPERO (CRD42017063816). Of 686 citations reviewed, 11 articles met inclusion criteria. Ten studies detailed maternal outcomes and seven studies detailed infant outcomes in PMTCT programmes. Interventions to increase access to antenatal care (ANC) and ART services (n = 4) and those using lay cadres (n = 3) were most common. Other interventions included quality improvement (n = 2), mHealth (n = 1), and counselling (n = 1). One study described interventions in an Option B+ programme. Limitations included lack of HIV testing and counselling and viral load monitoring outcomes, small sample sizes, geographical location, and non-randomized assignment and selection of participants. Interventions including ANC/ART integration, family-centred approaches, and the use of lay healthcare providers are demonstrably effective in increasing service uptake and retention of HIV-positive mothers and their infants in PMTCT programmes. Future studies should include control groups and assess whether interventions developed in the context of earlier 'Options' remain effective in Option B+ programmes.
International Nuclear Information System (INIS)
Leissner, Philippe; Verjat, Thibault; Bachelot, Thomas; Paye, Malick; Krause, Alexander; Puisieux, Alain; Mougin, Bruno
2006-01-01
One of the most thoroughly studied systems with respect to its prognostic relevance in patients with breast cancer is the plasminogen activation system, which comprises, among others, the urokinase Plasminogen Activator (uPA) and its main inhibitor, Plasminogen Activator Inhibitor-1 (PAI-1). In this study, we investigated the prognostic value of uPA and PAI-1 at the mRNA level in lymph node- and hormone receptor-positive breast cancer. The study included a retrospective series of 87 patients with hormone receptor-positive and axillary lymph node-positive breast cancer. All patients received radiotherapy, adjuvant anthracycline-based chemotherapy and five years of tamoxifen treatment. The median patient age was 54 and the median follow-up time was 79 months. Distant relapse occurred in 30 patients and 22 patients died from breast cancer during follow-up. We investigated the prognostic value of uPA and PAI-1 at the mRNA level as measured by real-time quantitative RT-PCR. uPA and PAI-1 gene expression was not found to be correlated with any of the established clinical and pathological factors. Metastasis-free Survival (MFS) and Breast Cancer specific Survival (BCS) were significantly shorter in patients expressing high levels of PAI-1 mRNA (p < 0.0001; p < 0.0001, respectively). In Cox multivariate analysis, the level of PAI-1 mRNA appeared to be the strongest prognostic factor for MFS (Hazard Ratio (HR) = 10.12; p = 0.0002) and for BCS (HR = 13.17; p = 0.0003). Furthermore, uPA gene expression was significantly associated with neither MFS (p = 0.41) nor BCS (p = 0.19). In a Cox multivariate regression analysis, uPA expression did not demonstrate significant independent prognostic value. These findings indicate that high PAI-1 mRNA expression represents a strong and independent unfavorable prognostic factor for the development of metastases and for breast cancer-specific survival in a population of hormone receptor- and lymph node-positive breast cancer patients.
Directory of Open Access Journals (Sweden)
Krause Alexander
2006-08-01
Full Text Available Abstract Background One of the most thoroughly studied systems with respect to its prognostic relevance in patients with breast cancer is the plasminogen activation system, which comprises, among others, the urokinase Plasminogen Activator (uPA) and its main inhibitor, Plasminogen Activator Inhibitor-1 (PAI-1). In this study, we investigated the prognostic value of uPA and PAI-1 at the mRNA level in lymph node- and hormone receptor-positive breast cancer. Methods The study included a retrospective series of 87 patients with hormone receptor-positive and axillary lymph node-positive breast cancer. All patients received radiotherapy, adjuvant anthracycline-based chemotherapy and five years of tamoxifen treatment. The median patient age was 54 and the median follow-up time was 79 months. Distant relapse occurred in 30 patients and 22 patients died from breast cancer during follow-up. We investigated the prognostic value of uPA and PAI-1 at the mRNA level as measured by real-time quantitative RT-PCR. Results uPA and PAI-1 gene expression was not found to be correlated with any of the established clinical and pathological factors. Metastasis-free Survival (MFS) and Breast Cancer specific Survival (BCS) were significantly shorter in patients expressing high levels of PAI-1 mRNA (p < 0.0001 and p < 0.0001, respectively). In Cox multivariate analysis, the level of PAI-1 mRNA appeared to be the strongest prognostic factor for MFS (Hazard Ratio (HR) = 10.12; p = 0.0002) and for BCS (HR = 13.17; p = 0.0003). Furthermore, uPA gene expression was significantly associated with neither MFS (p = 0.41) nor BCS (p = 0.19). In a Cox multivariate regression analysis, uPA expression did not demonstrate significant independent prognostic value. Conclusion These findings indicate that high PAI-1 mRNA expression represents a strong and independent unfavorable prognostic factor for the development of metastases and for breast cancer specific survival in a population of hormone receptor- and lymph node-positive breast cancer patients.
Ge, Hao; Wu, Pingping; Qian, Hong; Xie, Xiaoliang Sunney
2018-03-01
Within an isogenic population, even in the same extracellular environment, individual cells can exhibit various phenotypic states. The exact role of stochastic gene-state switching in regulating the transition among these phenotypic states in a single cell is not fully understood, especially in the presence of positive feedback. Recent high-precision single-cell measurements showed that, at least in bacteria, switching in gene states is slow relative to the typical rates of active transcription and translation. Hence, using the lac operon as an archetype in this regime of slow operon-state switching, we present a fluctuating-rate model for this classical gene regulation module, incorporating the more realistic operon-state switching mechanism that was recently elucidated. We found that the positive feedback mechanism induces bistability (referred to as deterministic bistability), and that the parameter range for its occurrence is significantly broadened by stochastic operon-state switching. We further show that in the absence of positive feedback, operon-state switching must be extremely slow to trigger bistability by itself. However, in the presence of positive feedback, which stabilizes the induced state, the relatively slow operon-state switching kinetics within the physiological region are sufficient to stabilize the uninduced state, together generating a broadened parameter region of bistability (referred to as stochastic bistability). We illustrate the opposite phenotype-transition rate dependence upon the operon-state switching rates in the two types of bistability, with the aid of a recently proposed rate formula for fluctuating-rate models. The rate formula also predicts a maximal transition rate in the intermediate region of operon-state switching, which is validated by numerical simulations in our model. Overall, our findings suggest a biological function of transcriptional "variations" among genetically identical cells, for the emergence of bistability and phenotype transitions.
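The deterministic bistability discussed above can be illustrated with a generic positive-feedback module: leaky synthesis plus Hill-type autoactivation, minus first-order decay. The rate constants below are arbitrary choices for illustration, not parameters from the lac operon model in the abstract:

```python
def dxdt(x, basal=0.05, vmax=1.0, K=0.5, n=4, decay=1.0):
    """Rate of change of protein level x: leaky synthesis plus
    Hill-type positive feedback, minus first-order degradation."""
    return basal + vmax * x**n / (K**n + x**n) - decay * x

def steady_states(f, lo=0.0, hi=2.0, steps=2000):
    """Locate roots of f on [lo, hi] by a sign-change scan followed
    by bisection refinement."""
    xs = [lo + (hi - lo) * i / steps for i in range(steps + 1)]
    roots = []
    for a, b in zip(xs, xs[1:]):
        if f(a) * f(b) < 0:
            for _ in range(60):  # bisection
                m = 0.5 * (a + b)
                if f(a) * f(m) <= 0:
                    b = m
                else:
                    a = m
            roots.append(0.5 * (a + b))
    return roots

# Three fixed points (low stable, unstable threshold, high stable)
# signal deterministic bistability for this parameter set.
fixed_points = steady_states(dxdt)
```

Shrinking the feedback strength `vmax` collapses the system to a single fixed point, which is the sense in which positive feedback creates (and slow gene-state switching broadens) the bistable parameter region.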
Directory of Open Access Journals (Sweden)
Shimpei Ogawa
2009-04-01
Full Text Available A discussion of the significance of 18F-fluorodeoxyglucose positron emission tomography (FDG-PET) in the identification of diseases of the appendix is presented based on two cases falsely positive for FDG accumulation. Both cases were palpable for a tumor in the lower right abdominal region, and a prominently enlarged appendix was depicted by CT. The patients underwent ileocecal resection based on a strong suspicion of appendiceal cancer rather than appendicitis, since abnormal accumulation exhibiting maximum standardized uptake values (SUVs) of 7.27 and 17.11, respectively, was observed at the same site on FDG-PET examination; as there were no malignant findings histologically, the patients were diagnosed with appendicitis. FDG specifically accumulates not only in malignant tumors but also in conditions such as acute or chronic inflammation, abscesses and lymphadenitis, and although identification based on SUVs has been reported, the two cases reported here were both false positives exhibiting high maximum SUVs. At present, the significance of FDG-PET in the identification of diseases of the appendix is somewhat limited and there are restrictions on its application, but various research is being conducted with the aim of improving diagnostic accuracy, and it is hoped that additional studies will be conducted in the future.
Directory of Open Access Journals (Sweden)
Yohei Fujita
Full Text Available PURPOSE: We investigated whether serum interleukin-8 (IL-8) reflects the tumor microenvironment and has prognostic value in patients with oral squamous cell carcinoma (OSCC). EXPERIMENTAL DESIGN: Fifty OSCC patients who received radical resection of their tumor(s) were enrolled. Preoperative sera were measured for IL-8 by ELISA. Expression of IL-8 and the infiltration of immune cells in tumor tissues were analyzed by immunohistochemical staining of surgical specimens. RESULTS: We found that disease-free survival (DFS) was significantly longer in the Stage I/II OSCC patients with low serum IL-8 levels compared to those with high levels (p = 0.001). The tumor expression of IL-8, i.e., IL-8(T), and the density of CD163-positive cells in the tumor invasive front, i.e., CD163(IF), were correlated with the serum IL-8 level (p = 0.033 and p = 0.038, respectively), and they were associated with poor clinical outcome (p = 0.007 and p = 0.002, respectively) in DFS in all patients. A multivariate analysis revealed that N status, IL-8(T) and CD163(IF) significantly affected the DFS of the patients. Further analysis suggested that the combination of N status with serum IL-8, IL-8(T) or CD163(IF) may be a new criterion for discriminating between OSCC patients at high and low risk for tumor relapse. Interestingly, the in vitro experiments demonstrated that IL-8 enhanced generation of CD163-positive M2 macrophages from peripheral blood monocytes, and that these cells produced IL-10. CONCLUSIONS: These findings indicate that IL-8 may be involved in poor clinical outcomes via generation of CD163-positive M2 macrophages, and that these factors, in addition to N status, may have prognostic value in patients with resectable OSCC.
Yin, Ke; Dou, Xiaomin; Ren, Fei; Chan, Wei-Ping; Chang, Victor Wei-Chung
2018-02-15
Bottom ashes generated from municipal solid waste incineration have gained increasing popularity as alternative construction materials; however, they contain elevated levels of heavy metals, posing a challenge to their unrestricted use. Different leaching methods have been developed to quantify the leaching potential of incineration bottom ash (IBA) and to guide its environmentally friendly application. Yet IBA applications are diverse and the in situ environment is always complicated, which challenges regulation. In this study, leaching tests were conducted using batch and column leaching methods with seawater, as opposed to deionized water, to unveil the metal leaching potential of IBA subjected to a salty environment, which is commonly encountered when using IBA in land reclamation yet not well understood. Statistical analysis for the different leaching methods suggested disparate performance between seawater and deionized water, primarily ascribed to ionic strength. Impacts of the leachant are metal-specific, depend on the leaching method, and are a function of the intrinsic characteristics of the bottom ash. Leaching performance was further compared on additional perspectives, e.g. leaching approach and liquid-to-solid ratio, indicating complex leaching potentials dominated by combined geochemistry. It is necessary to develop application-oriented leaching methods with corresponding leaching criteria to preclude discrimination between different applications, e.g. terrestrial applications vs land reclamation.
Boonen, Bert; Schotanus, Martijn G M; Kerens, Bart; Hulsmans, Frans-Jan; Tuinebreijer, Wim E; Kort, Nanne P
2017-09-01
To assess whether there is a significant difference between the alignment of the individual femoral and tibial components (in the frontal, sagittal and horizontal planes) as calculated pre-operatively (digital plan) and the alignment actually achieved in vivo with the use of patient-specific positioning guides (PSPGs) for TKA. It was hypothesised that there would be no difference between post-operative implant position and the pre-operative digital plan. Twenty-six patients were included in this non-inferiority trial. Software permitted matching of the pre-operative MRI scan (and therefore the calculated prosthesis position) to a pre-operative CT scan and then to a post-operative full-leg CT scan to determine deviations from pre-operative planning in all three anatomical planes. For the femoral component, mean absolute deviations from planning were 1.8° (SD 1.3), 2.5° (SD 1.6) and 1.6° (SD 1.4) in the frontal, sagittal and transverse planes, respectively. For the tibial component, mean absolute deviations from planning were 1.7° (SD 1.2), 1.7° (SD 1.5) and 3.2° (SD 3.6) in the frontal, sagittal and transverse planes, respectively. The absolute mean deviation from the planned mechanical axis was 1.9°. The a priori specified null hypothesis for equivalence testing (a difference from planning of more than 3° in either direction) was rejected, indicating that the achieved component position was equivalent to the digital plan in all planes, except for tibial rotation in the transverse plane. Possible explanations for outliers are discussed and highlight the importance of adequately training surgeons before they start using PSPGs in their day-to-day practice. Prospective cohort study, Level II.
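Equivalence against a ±3° margin is conventionally assessed with two one-sided tests (TOST). The sketch below uses a normal approximation rather than the study's actual statistics, and the per-patient deviation values are invented for illustration:

```python
import math
import statistics

def tost_equivalent(deviations, margin=3.0, z_crit=1.645):
    """Two one-sided tests (normal approximation): declare equivalence
    only if the mean deviation is significantly above -margin AND
    significantly below +margin at the one-sided 5% level."""
    n = len(deviations)
    mean = statistics.fmean(deviations)
    se = statistics.stdev(deviations) / math.sqrt(n)
    above_lower = (mean + margin) / se > z_crit
    below_upper = (margin - mean) / se > z_crit
    return above_lower and below_upper

# Invented per-patient angular deviations from the digital plan (degrees)
frontal = [1.8, -0.5, 1.2, 0.3, -1.1, 0.9, 1.5, -0.2]
tibial_rotation = [3.2, 7.0, -1.0, 6.5, 0.2, 3.5]
```

Note how a mean near the margin with large spread (as for tibial rotation in the abstract, mean 3.2°, SD 3.6°) fails equivalence even though tighter planes pass.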
Li, Ying; Yang, Le; Liu, Chang; Hou, Qinghua; Jin, Peng; Lu, Xing
2018-05-29
Endohedral metallofullerenes (EMFs) containing actinides are rather intriguing due to potential 5f-orbital participation in the metal-metal or metal-cage bonding. In this work, density functional theory calculations first characterized the structure of the recently synthesized ThC74 as Th@D3h(14246)-C74. We found that the thorium atom adopts an unusual off-axis position inside the cage due to the small metal ion size and the requirement of a large coordination number, a phenomenon further extended to other Th-based EMFs. Significantly, besides the strong metal-cage electrostatic attractions, topological and orbital analyses revealed that all the investigated Th-based EMFs exhibit obvious covalent interactions between metal and cage, with substantial contribution from the Th 5f orbitals. Encapsulation by fullerenes is thus proposed as a practical pathway toward f-orbital covalency for thorium. Interestingly, the anomalous internal position of Th leads to a novel three-dimensional metal trajectory at elevated temperatures in the D3h-C74 cavity, as elucidated by static computations and molecular dynamics simulations.
Hsu, Tung-Shin; McPherron, R. L.
2002-11-01
An outstanding problem in magnetospheric physics is deciding whether substorms are always triggered by external changes in the interplanetary magnetic field (IMF) or solar wind plasma, or whether they sometimes occur spontaneously. Over the past decade, arguments have been made on both sides of this issue. In fact, there is considerable evidence that some substorms are triggered. However, equally persuasive examples of substorms with no obvious trigger have been found. Because of conflicting views on this subject, further work is required to determine whether there is a physical relation between IMF triggers and substorm onset. In the work reported here a list of substorm onsets was created using two independent substorm signatures: sudden changes in the slope of the AL index and the start of a Pi 2 pulsation burst. Possible IMF triggers were determined from ISEE-2 observations. With the ISEE spacecraft near local noon immediately upstream of the bow shock, there can be little question about the propagation delay to the magnetopause or whether a particular IMF feature hits the subsolar magnetopause. Thus it eliminates the objections that the calculated arrival time is subject to a large error or that the solar wind monitor missed a potential trigger incident at the subsolar point. Using a less familiar technique, the statistics of point processes, we find that the time delays between substorm onsets and the propagated arrival times of IMF triggers are clustered around zero. We estimate for independent processes that the probability of this clustering by chance alone is about 10⁻¹¹. If we take into account the requirement that the IMF must have been southward prior to the onset, then the probability of clustering is higher, ~10⁻⁵, but still extremely unlikely. Thus it is not possible to ascribe the apparent relation between IMF northward turnings and substorm onset to coincidence.
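The chance probability quoted above can be estimated with a simple binomial null model: if trigger arrival times were unrelated to onsets, each delay would fall in the near-zero coincidence window with some small probability p. A sketch of that tail calculation (the n, k and p values are illustrative, not the study's):

```python
from math import comb

def binomial_tail(n, k, p):
    """P(X >= k) for X ~ Binomial(n, p): the chance that at least k of
    n independent delays land in the coincidence window by accident."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Illustrative: 30 of 100 onset-trigger delays inside a window covering
# only 5% of the allowed delay range
p_chance = binomial_tail(100, 30, 0.05)
```

Even this crude model yields vanishingly small probabilities for strong clustering, which is the sense in which the abstract rules out coincidence.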
Wakefield, Jerome C; First, Michael B
2012-02-01
The Diagnostic and Statistical Manual of Mental Disorders (DSM) definition of mental disorder requires that symptoms be caused by a dysfunction in the individual; when dysfunction is absent, symptoms represent normal-range distress or eccentricity and, if diagnosed as a mental disorder, are false positives. We hypothesized that because of psychiatry's lack of direct laboratory tests to distinguish dysfunction from normal-range distress, the context in which symptoms occur (eg, lack of imminent danger in a panic attack) is often essential to determining whether symptoms are caused by a dysfunction. If this is right, then the DSM diagnostic criteria should include many contextual criteria added to symptom syndromes to prevent dysfunction false positives. Despite their potential importance, such contextual criteria have not been previously reviewed. We, thus, systematically reviewed DSM categories to establish the extent of such uses of contextual criteria and created a typology of such uses. Of 111 sampled categories, 68 (61%) used context to prevent dysfunction false positives. Contextual criteria fell into 7 types: (1) exclusion of specific false-positive scenarios; (2) requiring that patients experience preconditions for normal responses (eg, requiring that individuals experience adequate sexual stimulation before being diagnosed with sexual dysfunctions); (3) requiring that symptoms be disproportionate relative to circumstances; (4) for childhood disorders, requiring that symptoms be developmentally inappropriate; (5) requiring that symptoms occur in multiple contexts; (6) requiring a substantial discrepancy between beliefs and reality; and (7) a residual category. Most DSM categories include contextual criteria to eliminate false-positive diagnoses and increase validity of descriptive criteria. Future revisions should systematically evaluate each category's need for contextual criteria.
Harris, Joshua D; Brand, Jefferson C; Cote, Mark P; Dhawan, Aman
2017-08-01
Within the health care environment, there has been a recent and appropriate trend towards emphasizing the value of care provision. Reduced cost and higher quality improve the value of care. Quality is a challenging, heterogeneous, variably defined concept. At the core of quality is the patient's outcome, quantified by a vast assortment of subjective and objective outcome measures. There has been a recent evolution towards evidence-based medicine in health care, clearly elucidating the role of high-quality evidence across groups of patients and studies. Synthetic studies, such as systematic reviews and meta-analyses, sit at the top of the evidence-based medicine hierarchy. Thus, these investigations may be the best potential source of guidance for diagnostic, therapeutic, prognostic, and economic medical decision making. Systematic reviews critically appraise and synthesize the best available evidence to provide a conclusion statement (a "take-home point") in response to a specific, answerable clinical question. A meta-analysis uses statistical methods to quantitatively combine data from individual studies. Meta-analyses should be performed on homogeneous studies of high methodological quality and high-level evidence (Level I or II, i.e., randomized studies) to minimize bias from confounding variables. When the literature is known to be inadequate, or a recent systematic review has already demonstrated insufficient data, a new systematic review adds nothing meaningful to the literature. PROSPERO registration and the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines assist authors in the design and conduct of systematic reviews and should always be used. Complete transparency in the conduct of the review permits reproducibility and improves the fidelity of the conclusions. Pooling of data from overly dissimilar investigations should be avoided. This particularly applies to Level IV evidence, that is, noncomparative investigations
Wilson, Robert M.
2001-01-01
Since 1750, the number of cataclysmic volcanic eruptions (volcanic explosivity index (VEI)>=4) per decade spans 2-11, with 96 percent located in the tropics and extra-tropical Northern Hemisphere. A two-point moving average of the volcanic time series has higher values since the 1860's than before, being 8.00 in the 1910's (the highest value) and 6.50 in the 1980's, the highest since the 1910's peak. Because of the usual behavior of the first difference of the two-point moving averages, one infers that its value for the 1990's will measure approximately 6.50 +/- 1, implying that approximately 7 +/- 4 cataclysmic volcanic eruptions should be expected during the present decade (2000-2009). Because cataclysmic volcanic eruptions (especially those having VEI>=5) nearly always have been associated with short-term episodes of global cooling, the occurrence of even one might confuse our ability to assess the effects of global warming. Poisson probability distributions reveal that the probability of one or more events with a VEI>=4 within the next ten years is >99 percent. It is approximately 49 percent for an event with a VEI>=5, and 18 percent for an event with a VEI>=6. Hence, the likelihood that a climatically significant volcanic eruption will occur within the next ten years appears reasonably high.
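The quoted probabilities follow from the standard Poisson identity P(N ≥ 1) = 1 − exp(−λT). A minimal sketch, where the per-year rates are rough values back-solved from the abstract's decadal figures rather than taken from the paper:

```python
import math

def prob_at_least_one(rate_per_year: float, years: float) -> float:
    """P(N >= 1) for a Poisson process: 1 - exp(-lambda * T)."""
    return 1.0 - math.exp(-rate_per_year * years)

decade = 10.0
# Assumed rates (events/year), chosen only to illustrate the calculation:
p_vei4 = prob_at_least_one(0.7, decade)    # ~7 VEI>=4 events expected per decade
p_vei5 = prob_at_least_one(0.067, decade)  # rarer VEI>=5 events
p_vei6 = prob_at_least_one(0.02, decade)   # rarest VEI>=6 events
```

With a rate of about 0.7 events per year, the chance of at least one VEI ≥ 4 eruption in a decade is 1 − e⁻⁷, i.e., greater than 99 percent, consistent with the abstract.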
Energy Technology Data Exchange (ETDEWEB)
Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il
2017-05-15
The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
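One way to make the "harmonic intensity" concrete is to simulate the process by a time change: inverting the cumulative intensity Λ(t) = c·ln(t/a) maps unit-rate Poisson arrivals S_k to points t_k = a·exp(S_k/c). This is a standard inhomogeneous-Poisson construction sketched under that assumption, not code from the paper:

```python
import math
import random

def harmonic_poisson(c: float, a: float, b: float, rng: random.Random):
    """Sample points of a Poisson process with intensity c/t on [a, b].

    Time change: Lambda(t) = c*ln(t/a), so unit-rate arrivals S_k map to
    t_k = a*exp(S_k/c). The expected number of points is c*ln(b/a).
    """
    points, s = [], 0.0
    while True:
        s += rng.expovariate(1.0)      # next arrival of a unit-rate process
        t = a * math.exp(s / c)
        if t > b:
            return points
        points.append(t)

rng = random.Random(42)
runs = [len(harmonic_poisson(10.0, 1.0, math.e ** 2, rng)) for _ in range(2000)]
mean_count = sum(runs) / len(runs)     # should be close to 10 * ln(e^2) = 20
```

Rescaling every point by a constant k yields a process with the same law on [ka, kb], which mirrors the scale invariance the abstract highlights.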
Rodriguez, Erika F; Reynolds, Jordan P; Jenkins, Sarah M; Winter, Stephanie M; Henry, Michael R; Nassar, Aziza
2012-01-01
Atypical squamous cells of undetermined significance (ASC-US) is a broad diagnostic category that can be attributed to human papillomavirus (HPV) infection, malignant neoplasia, and reactive conditions. We evaluated our institutional experience with ASC-US in women who were positive for high-risk HPV (HRHPV+) by the Digene hybrid capture method from 2005-2009 to identify the risk of progression to squamous intraepithelial lesion (SIL) and cervical intraepithelial neoplasia (CIN) in association with age. We reviewed cytologic and follow-up surgical pathology reports for all available specimens. Progression was defined as a diagnosis of at least CIN I on follow-up biopsy or resection, or SIL on cytology. We identified 2613 cases, and follow-up was available in 1839 (70.4%). Of these, 74.2% had just one follow-up, 16.2% had a total of 2 follow-ups, 5.3% had a total of 3 follow-ups, and the remaining had as many as 6 follow-ups. Among the 1839 patients, 69.4% were age 30 or younger, 16.0% were 31 to 40, 9.0% were 41 to 50, and 5.6% were 51 or older. Among these, 25-30% progressed to dysplasia. The risk of progression varied by age (p=0.04) and was lowest among women aged 41-50. Our findings highlight the importance of continued cytologic follow-up in women with HRHPV+ ASC-US in order to detect progression of disease, although the risk of progression is age dependent.
Energy Technology Data Exchange (ETDEWEB)
Seki, Masahiro; Kikuchi, Tomiichi [Fukushima Medical Coll., Matsuoka (Japan)
1995-06-01
In order to estimate the relationship between the position of dorsal root ganglia (DRG) and radicular symptoms, an anatomical study was done on 81 cadavers, and a clinical study with MRI was done on 20 cases of lumbar disc herniation and 20 of lumbar spondylosis with L5 radiculopathy. The position of the DRG is not related to the occurrence of radicular symptoms in disc herniation, while in lumbar spondylosis proximally placed DRG are related to both unilateral and bilateral occurrence of radicular symptoms. Unilateral occurrence of radicular symptoms is influenced by the tissues surrounding the nerve root, rather than by the position of the DRG. (author).
Understanding Statistics - Cancer Statistics
Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.
Directory of Open Access Journals (Sweden)
Fuminori Sakurai
2016-01-01
Full Text Available Circulating tumor cells (CTCs) are promising biomarkers in several cancers, and thus methods and apparatuses for their detection and quantification in the blood have been actively pursued. A novel CTC detection system using a green fluorescent protein (GFP)-expressing conditionally replicating adenovirus (Ad) (rAd-GFP) was recently developed; however, there is concern about the production of false-positive cells (GFP-positive normal blood cells) when using rAd-GFP, particularly at high titers. In addition, CTCs lacking or expressing low levels of coxsackievirus-adenovirus receptor (CAR) cannot be detected by rAd-GFP, because rAd-GFP is constructed based on Ad serotype 5, which recognizes CAR. In order to suppress the production of false-positive cells, sequences perfectly complementary to the blood cell-specific microRNA miR-142-3p were incorporated into the 3′-untranslated region of the E1B and GFP genes. In addition, the fiber protein was replaced with that of Ad serotype 35, which recognizes human CD46, creating rAdF35-142T-GFP. rAdF35-142T-GFP efficiently labeled not only CAR-positive tumor cells but also CAR-negative tumor cells with GFP. The numbers of false-positive cells were dramatically lower for rAdF35-142T-GFP than for rAd-GFP. CTCs in the blood of cancer patients were detected by rAdF35-142T-GFP with a large reduction in false-positive cells.
DEFF Research Database (Denmark)
Christensen, Niels Juel; Schultz-Larsen, K
1994-01-01
in a comprehensive medical examination. INTERVENTIONS. Plasma NA and A were measured in blood samples collected after the subjects had rested in the supine position for 15 min. The subjects have now been followed for 7 years. MAIN OUTCOME MEASURES. Seven years later, 115 men and 63 women had died. RESULTS. Cox...... of physical working capacity was included in the Cox regression analysis, both plasma NA and plasma A became insignificant, whereas a strong positive correlation appeared between physical working capacity and survival (P
Zuiderveld, Elise G; den Hartog, Laurens; Vissink, Arjan; Raghoebar, Gerry M; Meijer, Henny J A
2014-01-01
This study assessed whether buccopalatal implant position, biotype, platform switching, and pre-implant bone augmentation affects the level of the midbuccal mucosa (MBM). Ninety patients with a single-tooth implant in the esthetic zone were included. The level of the MBM was measured on photographs
Relationship of condylar position to disc position and morphology
Energy Technology Data Exchange (ETDEWEB)
Incesu, L.; Taskaya-Yilmaz, N. E-mail: nergizy@omu.edu.tr; Oeguetcen-Toller, M.; Uzun, E
2004-09-01
Introduction/objective: The purpose of this study was to assess whether condylar position, as depicted by magnetic resonance imaging, was an indicator of disc morphology and position. Methods and materials: One hundred and twenty-two TMJs of 61 patients with temporomandibular joint disorder were examined. Condylar position, disc deformity, and degree of anterior disc displacement were evaluated using magnetic resonance imaging. Results and discussion: A posterior condylar position was found to be the main feature of temporomandibular joints with slight and moderate anterior disc displacement. No statistically significant association was found between condylar position and reducing versus nonreducing disc positions. On the other hand, a superior disc position was found to be statistically significant for a centric condylar position. Conclusion: It was concluded that a posterior condylar position could indicate anterior disc displacement, whereas there was no relation between the position of the condyle and disc deformity.
International Nuclear Information System (INIS)
Hoffman, Rex; Welton, Mark L.; Klencke, Barbara; Weinberg, Vivian; Krieg, Richard
1999-01-01
Purpose: To assess the outcome and tolerance of HIV-positive patients with anal cancer to standard therapy based on their pretreatment CD4 count. Methods and Materials: Between 1991 and 1997, 17 HIV-positive patients with anal cancer and documented pretreatment CD4 counts were treated at the University of California, San Francisco or its affiliated hospitals with either concurrent chemotherapy and radiation or radiation alone. The outcome and complications of treatment were correlated with the patients' pretreatment CD4 count. Results: Disease for all 9 patients with pretreatment CD4 counts ≥ 200 was controlled with chemoradiation. Although four required a treatment break of 2 weeks because of toxicity, none required hospitalization. Of the 8 patients with pretreatment CD4 counts < 200, 4 experienced decreased counts, intractable diarrhea, or moist desquamation requiring hospitalization. Additionally, 4 of these 8 ultimately required a colostomy either for a therapy-related complication or for salvage. Nevertheless, 6/7 in this group who received concurrent chemotherapy and radiation had their disease controlled, whereas the patient treated with radiation alone failed and required a colostomy for salvage. Conclusion: Patients with CD4 ≥ 200 had excellent disease control with acceptable morbidity. Patients with CD4 < 200 had markedly increased morbidity; however, disease was ultimately controlled in 7/8 patients
De Vreese, L P; Gomiero, T; Uberti, M; De Bastiani, E; Weger, E; Mantesso, U; Marangoni, A
2015-04-01
(a) A psychometric validation of an Italian version of the Alzheimer's Functional Assessment Tool scale (AFAST-I), designed for informant-based assessment of the degree of impairment and of assistance required in seven basic daily activities in adult/elderly people with intellectual disabilities (ID) and (suspected) dementia; (b) a pilot analysis of its clinical significance with traditional statistical procedures and with an artificial neural network. AFAST-I was administered to the professional caregivers of 61 adults/seniors with ID with a mean age (± SD) of 53.4 (± 7.7) years (36% with Down syndrome). Internal consistency (Cronbach's α coefficient), inter/intra-rater reliabilities (intra-class coefficients, ICC) and concurrent, convergent and discriminant validity (Pearson's r coefficients) were computed. Clinical significance was probed by analysing the relationships among AFAST-I scores and the Sum of Cognitive Scores (SCS) and the Sum of Social Scores (SOS) of the Dementia Questionnaire for Persons with Intellectual Disabilities (DMR-I) after standardisation of their raw scores in equivalent scores (ES). An adaptive artificial system (AutoContractive Maps, AutoCM) was applied to all the variables recorded in the study sample, aimed at uncovering which variable occupies a central position and supports the entire network made up of the remaining variables interconnected among themselves with different weights. AFAST-I shows a high level of internal homogeneity with a Cronbach's α coefficient of 0.92. Inter-rater and intra-rater reliabilities were also excellent with ICC correlations of 0.96 and 0.93, respectively. The results of the analyses of the different AFAST-I validities all go in the expected direction: concurrent validity (r=-0.87 with ADL); convergent validity (r=0.63 with SCS; r=0.61 with SOS); discriminant validity (r=0.21 with the frequency of occurrence of dementia-related Behavioral Excesses of the Assessment for Adults with Developmental
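Cronbach's α, the internal-consistency statistic the validation relies on, is straightforward to compute directly: α = k/(k−1) · (1 − Σ var(item_i)/var(total)). A minimal sketch on hypothetical item scores (the seven-item, six-subject matrix below is invented for illustration and is not AFAST-I data):

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha from a list of item-score columns (one list per item)."""
    k = len(items)
    item_var_sum = sum(statistics.variance(col) for col in items)
    totals = [sum(scores) for scores in zip(*items)]   # per-subject total score
    return (k / (k - 1)) * (1.0 - item_var_sum / statistics.variance(totals))

# Hypothetical 0-3 impairment ratings: 7 items (rows) x 6 subjects (columns).
items = [
    [0, 1, 2, 2, 3, 3],
    [0, 1, 1, 2, 3, 3],
    [1, 1, 2, 2, 2, 3],
    [0, 0, 2, 2, 3, 3],
    [0, 1, 2, 3, 3, 3],
    [0, 1, 1, 2, 2, 3],
    [1, 1, 2, 2, 3, 3],
]
alpha = cronbach_alpha(items)   # high value: items rank subjects consistently
```

Sample variances are used consistently in numerator and denominator, so the n−1 factors cancel; identical item columns give exactly α = 1.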
This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.
Diamantopoulos, Panagiotis T; Sofotasiou, Maria; Georgoussi, Zafiroula; Giannakopoulou, Nefeli; Papadopoulou, Vasiliki; Galanopoulos, Athanasios; Kontandreopoulou, Elina; Zervakis, Panagiotis; Pallaki, Paschalina; Kalala, Fani; Kyrtsonis, Marie-Christine; Dimitrakopoulou, Aglaia; Vassilakopoulos, Theodoros; Angelopoulou, Maria; Spanakis, Nikolaos; Viniou, Nora-Athina
2016-09-01
Signal transducer and activator of transcription (STAT) proteins have been intensively studied in hematologic malignancies, and the efficacy of agents against STATs in lymphomas is already under research. We investigated the expression of total STAT5 and STAT5b in peripheral blood samples of patients with chronic lymphocytic leukemia (CLL) in correlation with the presence of Epstein-Barr Virus (EBV) and its major oncoprotein (latent membrane protein 1, LMP1). The EBV load was measured in the peripheral blood by real-time PCR for the BXLF1 gene and the levels of LMP1 by PCR and ELISA. Western blotting was performed for total STAT5 and STAT5b in protein extracts. STAT5b was only expressed in patients (not in healthy subjects) and STAT5 but particularly STAT5b expression was correlated with the presence of the virus (77.3% vs. 51.2%, P = 0.006 for STAT5b) and to the expression of LMP1 (58.3% vs. 21.6%, P = 0.011 for STAT5b). Moreover, the expression of STAT5b and the presence of EBV and LMP1 were strongly negatively correlated with the overall survival of the patients (log-rank test P = 0.011, 0.015, 0.006, respectively). Double positive (for EBV and STAT5b) patients had the lowest overall survival (log-rank test P = 0.013). This is the first report of a survival disadvantage of EBV+ patients with CLL, and the first time that STAT5b expression is correlated with survival. The correlation of STAT5 expression with the presence of the virus, along with our survival correlations defines a subgroup of patients with CLL that may benefit from anti-STAT agents. © 2016 The Authors. Cancer Medicine published by John Wiley & Sons Ltd.
Fekete, Z; Muntean, A; Irimie, A; Hica, S; Resiga, L; Todor, N; Nagy, V
2013-01-01
The aim of this study was to analyze the characteristics of patients with rectal cancer operated with a microscopic positive margin (R1) and thus avoid these situations or adapt treatment in these particular cases. We reviewed all the pathology data of resected specimens from patients with rectal or recto-sigmoid cancer operated with curative intent at the Institute of Oncology "Prof. Dr. Ion Chiricuta" between 2000-2011 (763 patients in 12 years) and the pathology files of patients from other institutions referred for adjuvant treatment to our hospital (318 patients). We included patients with anterior resection, Hartmann's procedure and abdomino-perineal resection, but we excluded patients with local excision and patients with R2/R1 at first, but R0 after re-resection (56 patients). We have identified 31 patients with R1, but had to exclude one case from analysis because this patient was lost to follow-up. With surgery alone the local relapse (LR) was unavoidable. In the neoadjuvant chemoradiation (CRT) group 85.7% of the patients did not develop LR despite of R1. In the adjuvant CRT cohort 50% of the patients were LR-free at 2 years after conventional radiotherapy (p<0.01). Based on these results it is concluded that a clear resection margin is extremely important for the local control of rectal cancer, because it cannot be always compensated by adjuvant CRT. In R1 cases neoadjuvant CRT seems to offer better prognosis than adjuvant CRT. To avoid R1 and its consequences a good quality control of total mesorectal excision (TME) is needed and CRT should be done before and not after surgery. R1 after primary surgery needs to be compensated by re-resection if possible, otherwise probably high dose radiotherapy with chemotherapy is needed.
Energy Technology Data Exchange (ETDEWEB)
Kyle, Jennifer E. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Casey, Cameron P. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Stratton, Kelly G. [National Security Directorate, Pacific Northwest National Laboratory, Richland WA USA; Zink, Erika M. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Kim, Young-Mo [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Zheng, Xueyun [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Monroe, Matthew E. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Weitz, Karl K. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Bloodsworth, Kent J. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Orton, Daniel J. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Ibrahim, Yehia M. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Moore, Ronald J. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Lee, Christine G. [Department of Medicine, Bone and Mineral Unit, Oregon Health and Science University, Portland OR USA; Research Service, Portland Veterans Affairs Medical Center, Portland OR USA; Pedersen, Catherine [Department of Medicine, Bone and Mineral Unit, Oregon Health and Science University, Portland OR USA; Orwoll, Eric [Department of Medicine, Bone and Mineral Unit, Oregon Health and Science University, Portland OR USA; Smith, Richard D. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Burnum-Johnson, Kristin E. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Baker, Erin S. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA
2017-02-05
The use of dried blood spots (DBS) has many advantages over traditional plasma and serum samples, such as the smaller blood volume required, storage at room temperature, and the ability to sample in remote locations. However, understanding the robustness of different analytes in DBS samples is essential, especially in older samples collected for longitudinal studies. Here we analyzed DBS samples collected in 2000-2001 and stored at room temperature, and compared them to matched serum samples stored at -80°C, to determine if they could be effectively used as specific time points in a longitudinal study following metabolic disease. Four hundred small molecules were identified in both the serum and DBS samples using gas chromatography-mass spectrometry (GC-MS), liquid chromatography-MS (LC-MS), and LC-ion mobility spectrometry-MS (LC-IMS-MS). The identified polar metabolites overlapped well between the sample types, though only one statistically significant polar metabolite in a case-control study was conserved, indicating that degradation occurring in the DBS samples affects quantitation. Differences in the lipid identifications indicated that some oxidation occurs in the DBS samples. However, thirty-six statistically significant lipids correlated in both sample types, indicating that lipid quantitation was more stable across the sample types.
Directory of Open Access Journals (Sweden)
Madelaine Sarria Castro
2004-05-01
Full Text Available OBJECTIVES: To characterize the use of conventional tests of statistical significance, and the current trends in their use, in three biomedical journals of the Spanish-speaking world. METHODS: All descriptive or explanatory original articles published during the five-year period 1996-2000 were reviewed in three journals: Revista Cubana de Medicina General Integral, Revista Panamericana de Salud Pública/Pan American Journal of Public Health, and Medicina Clínica. RESULTS: In the three journals examined, several objectionable features were detected in the use of hypothesis tests based on P-values, together with a scarce presence of the newer approaches proposed in their place: confidence intervals (CIs) and Bayesian inference. The main findings were: a minimal presence of CIs, whether as a complement to significance tests or as the sole statistical resource; mention of sample size as a possible explanation of the results; a predominance of rigid alpha values; a lack of uniformity in the presentation of results; and improper allusion in the research conclusions to the results of hypothesis tests. CONCLUSIONS: The results reflect a lack of compliance by authors and editors with accepted norms on the use of tests of statistical significance, and suggest that the routine, uncritical use of these tests continues to occupy an important place in the biomedical literature of the Spanish-speaking world.
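The practice the review recommends, reporting a confidence interval alongside (or instead of) a bare P-value, is easy to illustrate with the normal approximation for a proportion. The numbers below (30/100 responders, H0: p = 0.20) are invented for illustration and are not data from the reviewed journals:

```python
import math

def prop_ci(successes: int, n: int, z: float = 1.96):
    """Normal-approximation 95% confidence interval for a proportion."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p - half, p + half

def two_sided_p_from_z(z: float) -> float:
    """Two-sided p-value for a z statistic, via the error function."""
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))

# Hypothetical study: 30 of 100 patients respond; null hypothesis p0 = 0.20.
p_hat, p0, n = 0.30, 0.20, 100
z = (p_hat - p0) / math.sqrt(p0 * (1 - p0) / n)   # z = 2.5
p_value = two_sided_p_from_z(z)                   # ~0.012: "significant"
lo, hi = prop_ci(30, 100)                          # CI shows the effect size
```

The interval (roughly 0.21 to 0.39) conveys both the direction and the precision of the estimate, which a lone "p &lt; 0.05" statement conceals.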
Doctrinal gnosis in Islam: Position and significance
Directory of Open Access Journals (Sweden)
Halilović Tehran
2014-01-01
Full Text Available Empirical, rational and intuitive methods in scientific considerations formulate various scientific disciplines. Doctrinal gnosis and philosophy are two rational-demonstrative doctrines that consider ontological questions. Although the basis of their Gnostic experience is intuitive knowledge, Gnostics describe and explain their experience in a rational way, with a view to making the content of such Gnostic knowledge in their scientific work available to the general public. In that way, doctrinal gnosis is established which, like philosophy, uses general terms. To get a clear picture of the essence of doctrinal gnosis in Islam, in this paper we will try to present the main characteristics of doctrinal gnosis and its relationship with similar disciplines. On this front, in addition to the importance of clarifying the differences between doctrinal gnosis and other aspects of the Gnostic knowledge in Islam, such as transmitted gnosis and practical gnosis, special attention will have to be paid to a clear analysis of the differences between doctrinal gnosis and philosophy.
International Nuclear Information System (INIS)
Lim, Gyeong Hui
2008-03-01
This book consists of 15 chapters, covering: the basic concepts and meaning of statistical thermodynamics; Maxwell-Boltzmann statistics; ensembles; thermodynamic functions and fluctuations; statistical dynamics of independent-particle systems; ideal molecular systems; chemical equilibrium and chemical reaction rates in ideal gas mixtures; classical statistical thermodynamics; the ideal lattice model; lattice statistics and nonideal lattice models; imperfect gas theory of liquids; the theory of solutions; the statistical thermodynamics of interfaces; the statistical thermodynamics of macromolecular systems; and quantum statistics
International Nuclear Information System (INIS)
Kay, L.
1992-01-01
If two optical images of the same scene are obtained using a charged-coupled device (CCD), a third image (called the lesser image) may be formed in computer memory by taking the lesser of the two counts in each pixel. The process may be used to remove, or greatly reduce, the effect of spurious events such as cosmic rays. A complete statistical theory of the lesser image is given for the general case, thereby facilitating recovery of the true image from the lesser image. (author)
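The construction itself is simple to state in code: the lesser image is the per-pixel minimum of the two exposures, which discards any spike present in only one frame. A minimal pure-Python sketch (the paper's statistical theory for recovering the true image from the lesser image is not reproduced here):

```python
def lesser_image(frame_a, frame_b):
    """Per-pixel minimum of two exposures of the same scene.

    A cosmic-ray hit raises a pixel count in one frame only, so taking the
    lesser of the two counts removes it.
    """
    return [[min(p, q) for p, q in zip(row_a, row_b)]
            for row_a, row_b in zip(frame_a, frame_b)]

frame_a = [[100, 100, 100],
           [100, 5000, 100]]   # spurious cosmic-ray event at pixel (1, 1)
frame_b = [[100, 100, 100],
           [100, 100, 100]]
clean = lesser_image(frame_a, frame_b)
```

Because the minimum of two noisy counts is biased below the true value, the recovered image must be corrected statistically, which is exactly what the paper's theory of the lesser image provides.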
[Comment on] Statistical discrimination
Chinn, Douglas
In the December 8, 1981, issue of Eos, a news item reported the conclusion of a National Research Council study that sexual discrimination against women with Ph.D.'s exists in the field of geophysics. Basically, the item reported that even when allowances are made for motherhood the percentage of female Ph.D.'s holding high university and corporate positions is significantly lower than the percentage of male Ph.D.'s holding the same types of positions. The sexual discrimination conclusion, based only on these statistics, assumes that there are no basic psychological differences between men and women that might cause different populations in the employment group studied. Therefore, the reasoning goes, after taking into account possible effects from differences related to anatomy, such as women stopping their careers in order to bear and raise children, the statistical distributions of positions held by male and female Ph.D.'s ought to be very similar to one another. Any significant differences between the distributions must be caused primarily by sexual discrimination.
Energy Technology Data Exchange (ETDEWEB)
Guo, Junfeng; Newell, John D. [Departments of Radiology and Biomedical Engineering, University of Iowa, Iowa City, Iowa 52242 (United States); Wang, Chao; Chan, Kung-Sik [Department of Statistics and Actuarial Science, University of Iowa, Iowa City, Iowa 52242 (United States); Jin, Dakai; Saha, Punam K. [Department of Electrical and Computer Engineering, University of Iowa, Iowa City, Iowa 52242 (United States); Sieren, Jered P. [Department of Radiology, University of Iowa, Iowa City, Iowa 52242 (United States); Barr, R. G. [Departments of Medicine and Epidemiology, Columbia University Medical Center, New York, New York 10032 (United States); Han, MeiLan K. [Department of Medicine, Division of Pulmonary and Critical Care Medicine, University of Michigan, Ann Arbor, Michigan 48109 (United States); Kazerooni, Ella [Department of Radiology, University of Michigan, Ann Arbor, Michigan 48109 (United States); Cooper, Christopher B. [Department of Medicine, University of California, Los Angeles, California 90095 (United States); Couper, David [Department of Biostatistics, University of North Carolina, Chapel Hill, North Carolina 27599 (United States); Hoffman, Eric A., E-mail: eric-hoffman@uiowa.edu [Departments of Radiology, Medicine and Biomedical Engineering, University of Iowa, Iowa City, Iowa 52242 (United States)
2016-05-15
Pestman, Wiebe R
2009-01-01
This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.
Sadovskii, Michael V
2012-01-01
This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics, all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity, and the modern theory of critical phenomena. Beyond that, attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.
Goodman, Joseph W
2015-01-01
This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems. This book covers a variety of statistical problems in optics, including both theory and applications. The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced i
Szulc, Stefan
1965-01-01
Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field. Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then
Petocz, Peter; Sowey, Eric
2012-01-01
The term "data snooping" refers to the practice of choosing which statistical analyses to apply to a set of data after having first looked at those data. Data snooping contradicts a fundamental precept of applied statistics, that the scheme of analysis is to be planned in advance. In this column, the authors shall elucidate the…
Petocz, Peter; Sowey, Eric
2008-01-01
In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…
Glaz, Joseph
2009-01-01
Suitable for graduate students and researchers in applied probability and statistics, as well as for scientists in biology, computer science, pharmaceutical science and medicine, this title brings together a collection of chapters illustrating the depth and diversity of theory, methods and applications in the area of scan statistics.
Lyons, L.
2016-01-01
Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.
Nick, Todd G
2007-01-01
Statistics is defined by the Medical Subject Headings (MeSH) thesaurus as the science and art of collecting, summarizing, and analyzing data that are subject to random variation. The two broad categories of summarizing and analyzing data are referred to as descriptive and inferential statistics. This chapter considers the science and art of summarizing data where descriptive statistics and graphics are used to display data. In this chapter, we discuss the fundamentals of descriptive statistics, including describing qualitative and quantitative variables. For describing quantitative variables, measures of location and spread, for example the standard deviation, are presented along with graphical presentations. We also discuss distributions of statistics, for example the variance, as well as the use of transformations. The concepts in this chapter are useful for uncovering patterns within the data and for effectively presenting the results of a project.
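The fundamentals summarized here, measures of location and spread for a quantitative variable, can be sketched with Python's standard library. A minimal sketch with hypothetical sample data (not taken from the chapter):

```python
import statistics

# Hypothetical sample of a quantitative variable
data = [4.2, 5.1, 3.8, 6.0, 5.5, 4.9, 5.0]

mean = statistics.mean(data)      # location: arithmetic mean
median = statistics.median(data)  # location: robust to outliers
sd = statistics.stdev(data)       # spread: sample standard deviation
var = statistics.variance(data)   # spread: sample variance (sd squared)

print(round(mean, 3), median, round(sd, 3))
```

The mean and standard deviation summarize roughly symmetric data well; the median is preferred when the distribution is skewed or contains outliers.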
Blakemore, J S
1962-01-01
Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co
Wannier, Gregory Hugh
1966-01-01
Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics.Designed for
Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James
2014-01-01
Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.
Tellinghuisen, Joel
2008-01-01
The method of least squares is probably the most powerful data analysis tool available to scientists. Toward a fuller appreciation of that power, this work begins with an elementary review of statistics fundamentals, and then progressively increases in sophistication as the coverage is extended to the theory and practice of linear and nonlinear least squares. The results are illustrated in application to data analysis problems important in the life sciences. The review of fundamentals includes the role of sampling and its connection to probability distributions, the Central Limit Theorem, and the importance of finite variance. Linear least squares are presented using matrix notation, and the significance of the key probability distributions-Gaussian, chi-square, and t-is illustrated with Monte Carlo calculations. The meaning of correlation is discussed, including its role in the propagation of error. When the data themselves are correlated, special methods are needed for the fitting, as they are also when fitting with constraints. Nonlinear fitting gives rise to nonnormal parameter distributions, but the 10% Rule of Thumb suggests that such problems will be insignificant when the parameter is sufficiently well determined. Illustrations include calibration with linear and nonlinear response functions, the dangers inherent in fitting inverted data (e.g., Lineweaver-Burk equation), an analysis of the reliability of the van't Hoff analysis, the problem of correlated data in the Guggenheim method, and the optimization of isothermal titration calorimetry procedures using the variance-covariance matrix for experiment design. The work concludes with illustrations on assessing and presenting results.
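For the linear case described above, the unweighted straight-line fit y = a + b·x has a closed-form solution to the normal equations. A minimal sketch (illustrative only, with hypothetical noise-free data; this is not the author's code):

```python
def linear_least_squares(xs, ys):
    """Unweighted least-squares fit of y = a + b*x.

    Returns (a, b) minimizing the sum of squared residuals,
    via the closed-form solution of the normal equations.
    """
    n = len(xs)
    sx = sum(xs)
    sy = sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    d = n * sxx - sx * sx          # determinant of the normal equations
    b = (n * sxy - sx * sy) / d    # slope
    a = (sy - b * sx) / n          # intercept
    return a, b

# Exact recovery on noise-free data generated from y = 2 + 0.5*x
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [2.0 + 0.5 * x for x in xs]
print(linear_least_squares(xs, ys))  # (2.0, 0.5)
```

With noisy data the same formulas apply; the parameter uncertainties then follow from the residual variance and the inverse of the normal-equation matrix, as in the variance-covariance treatment the abstract describes.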
Energy Technology Data Exchange (ETDEWEB)
Wendelberger, Laura Jean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-08-08
In large datasets, it is time-consuming or even impossible to pick out interesting images. Our proposed solution is to find statistics to quantify the information in each image and use those to identify and pick out images of interest.
Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...
U.S. Department of Health & Human Services — The CMS Center for Strategic Planning produces an annual CMS Statistics reference booklet that provides a quick reference for summary information about health...
Allegheny County / City of Pittsburgh / Western PA Regional Data Center — Data about the usage of the WPRDC site and its various datasets, obtained by combining Google Analytics statistics with information from the WPRDC's data portal.
Serdobolskii, Vadim Ivanovich
2007-01-01
This monograph presents the mathematical theory of statistical models described by an essentially large number of unknown parameters, comparable with the sample size or even much larger. In this sense, the proposed theory can be called "essentially multiparametric". It is developed on the basis of the Kolmogorov asymptotic approach, in which the sample size increases along with the number of unknown parameters. This theory opens a way to the solution of central problems of multivariate statistics, which until now have not been solved. Traditional statistical methods based on the idea of an infinite sampling often break down in the solution of real problems, and, depending on the data, can be inefficient, unstable and even not applicable. In this situation, practical statisticians are forced to use various heuristic methods in the hope that they will find a satisfactory solution. The mathematical theory developed in this book presents a regular technique for implementing new, more efficient versions of statistical procedures. ...
DEFF Research Database (Denmark)
Tryggestad, Kjell
2004-01-01
The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved...
MacKenzie, Dana
2004-01-01
The drawbacks of using 19th-century mathematics in physics and astronomy are illustrated. To continue expanding our knowledge of the cosmos, scientists will have to come to terms with modern statistics. Some researchers have deliberately started importing techniques that are used in medical research. However, physicists need to identify the brand of statistics suitable for them and choose between the Bayesian and frequentist approaches. (Edited abstract).
2009-04-01
The Global Positioning System (GPS), which provides positioning, navigation, and timing data to users worldwide, has become essential to U.S. national security and a key tool in an expanding array of public service and commercial applications at home...
Directory of Open Access Journals (Sweden)
Sreeram V Ramagopalan
2015-04-01
Full Text Available Background: We and others have shown a significant proportion of interventional trials registered on ClinicalTrials.gov have their primary outcomes altered after the listed study start and completion dates. The objectives of this study were to investigate whether changes made to primary outcomes are associated with the likelihood of reporting a statistically significant primary outcome on ClinicalTrials.gov. Methods: A cross-sectional analysis of all interventional clinical trials registered on ClinicalTrials.gov as of 20 November 2014 was performed. The main outcome was any change made to the initially listed primary outcome and the time of the change in relation to the trial start and end date. Findings: 13,238 completed interventional trials were registered with ClinicalTrials.gov that also had study results posted on the website. 2555 (19.3% had one or more statistically significant primary outcomes. Statistical analysis showed that registration year, funding source and primary outcome change after trial completion were associated with reporting a statistically significant primary outcome. Conclusions: Funding source and primary outcome change after trial completion are associated with a statistically significant primary outcome report on clinicaltrials.gov.
Nasution, Inggrita Gusti Sari; Muchtar, Yasmin Chairunnisa
2013-01-01
This research studies the factors which influence the business success of the small business 'processed rotan'. The data employed in the study are primary data from the period of July to August 2013, comprising 30 research observations obtained through the census method. The method of analysis used in the study is multiple linear regression. The results of the analysis show that the factors of labor, innovation and promotion have a positive and significant influence on the business success of small busine...
Goodman, J. W.
This book is based on the thesis that some training in the area of statistical optics should be included as a standard part of any advanced optics curriculum. Random variables are discussed, taking into account definitions of probability and random variables, distribution functions and density functions, an extension to two or more random variables, statistical averages, transformations of random variables, sums of real random variables, Gaussian random variables, complex-valued random variables, and random phasor sums. Other subjects examined are related to random processes, some first-order properties of light waves, the coherence of optical waves, some problems involving high-order coherence, effects of partial coherence on imaging systems, imaging in the presence of randomly inhomogeneous media, and fundamental limits in photoelectric detection of light. Attention is given to deterministic versus statistical phenomena and models, the Fourier transform, and the fourth-order moment of the spectrum of a detected speckle image.
Schwabl, Franz
2006-01-01
The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechan...
Jana, Madhusudan
2015-01-01
Statistical mechanics is self-sufficient, written in a lucid manner, keeping in mind the examination system of the universities. The need to study this subject and its relation to thermodynamics is discussed in detail. Starting from the Liouville theorem, statistical mechanics is gradually developed thoroughly. All three types of statistical distribution functions are derived separately, with their domains of application and limitations. The non-interacting ideal Bose gas and Fermi gas are discussed thoroughly. Properties of liquid He-II and the corresponding models have been depicted. White dwarfs and condensed matter physics, transport phenomena - thermal and electrical conductivity, Hall effect, magnetoresistance, viscosity, diffusion, etc. - are discussed. A basic understanding of the Ising model is given to explain phase transitions. The book ends with a detailed coverage of the method of ensembles (namely microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar...
Guénault, Tony
2007-01-01
In this revised and enlarged second edition of an established text Tony Guénault provides a clear and refreshingly readable introduction to statistical physics, an essential component of any first degree in physics. The treatment itself is self-contained and concentrates on an understanding of the physical ideas, without requiring a high level of mathematical sophistication. A straightforward quantum approach to statistical averaging is adopted from the outset (easier, the author believes, than the classical approach). The initial part of the book is geared towards explaining the equilibrium properties of a simple isolated assembly of particles. Thus, several important topics, for example an ideal spin-½ solid, can be discussed at an early stage. The treatment of gases gives full coverage to Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein statistics. Towards the end of the book the student is introduced to a wider viewpoint and new chapters are included on chemical thermodynamics, interactions in, for exam...
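The three kinds of statistics named in this summary differ only by a constant in the denominator of the mean occupation number. These are the standard textbook results, stated here for orientation rather than quoted from this particular book:

```latex
% Mean occupation number of a single-particle state of energy \epsilon,
% at chemical potential \mu and temperature T (k_B = Boltzmann constant):
\bar{n}(\epsilon) = \frac{1}{e^{(\epsilon - \mu)/k_B T} + a},
\qquad
a = \begin{cases}
 +1 & \text{Fermi--Dirac} \\
 -1 & \text{Bose--Einstein} \\
 \phantom{+}0 & \text{Maxwell--Boltzmann (classical limit)}
\end{cases}
```

In the limit \( e^{(\epsilon-\mu)/k_B T} \gg 1 \) all three expressions coincide, which is why classical Maxwell-Boltzmann statistics suffices for dilute gases at high temperature.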
Mandl, Franz
1988-01-01
The Manchester Physics Series General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips Department of Physics and Astronomy, University of Manchester Properties of Matter B. H. Flowers and E. Mendoza Optics Second Edition F. G. Smith and J. H. Thomson Statistical Physics Second Edition F. Mandl Electromagnetism Second Edition I. S. Grant and W. R. Phillips Statistics R. J. Barlow Solid State Physics Second Edition J. R. Hook and H. E. Hall Quantum Mechanics F. Mandl Particle Physics Second Edition B. R. Martin and G. Shaw The Physics of Stars Second Edition A. C. Phillips Computing for Scient
Rohatgi, Vijay K
2003-01-01
Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth
Levine-Wissing, Robin
2012-01-01
All Access for the AP® Statistics Exam Book + Web + Mobile Everything you need to prepare for the Advanced Placement® exam, in a study system built around you! There are many different ways to prepare for an Advanced Placement® exam. What's best for you depends on how much time you have to study and how comfortable you are with the subject matter. To score your highest, you need a system that can be customized to fit you: your schedule, your learning style, and your current level of knowledge. This book, and the online tools that come with it, will help you personalize your AP® Statistics prep
Davidson, Norman
2003-01-01
Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses. Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody
Scandolera, Amandine; Hubert, Jane; Humeau, Anne; Lambert, Carole; De Bizemont, Audrey; Winkel, Chris; Kaouas, Abdelmajid; Renault, Jean-Hugues; Reynaud, Romain
2018-01-01
The aim of the present study was to investigate the neuro-soothing activity of a water-soluble hydrolysate obtained from the red microalgae Rhodosorus marinus Geitler (Stylonemataceae). Transcriptomic analysis performed on ≈100 genes related to skin biological functions firstly revealed that the crude Rhodosorus marinus extract was able to significantly negatively modulate specific genes involved in pro-inflammation (interleukin 1α encoding gene, IL1A) and pain detection related to tissue inflammation (nerve growth factor NGF and its receptor NGFR). An in vitro model of normal human keratinocytes was then used to evaluate the ability of the Rhodosorus marinus extract to control the release of neuro-inflammation mediators under phorbol myristate acetate (PMA)-induced inflammatory conditions. The extract incorporated at 1% and 3% significantly inhibited the release of IL-1α and NGF secretion. These results were confirmed in a co-culture system of reconstructed human epithelium and normal human epidermal keratinocytes on which a cream formulated with the Rhodosorus marinus extract at 1% and 3% was topically applied after systemic induction of neuro-inflammation. Finally, an in vitro model of normal human astrocytes was developed for the evaluation of transient receptor potential vanilloid 1 (TRPV1) receptor modulation, mimicking pain sensing related to neuro-inflammation as observed in sensitive skins. Treatment with the Rhodosorus marinus extract at 1% and 3% significantly decreased PMA-mediated TRPV1 over-expression. In parallel with these biological experiments, the crude Rhodosorus marinus extract was fractionated by centrifugal partition chromatography (CPC) and chemically profiled by a recently developed 13C NMR-based dereplication method. The CPC-generated fractions as well as pure metabolites were tested again in vitro in an attempt to identify the biologically active constituents involved in the neuro-soothing activity of the Rhodosorus marinus extract
Directory of Open Access Journals (Sweden)
Amandine Scandolera
2018-03-01
Full Text Available The aim of the present study was to investigate the neuro-soothing activity of a water-soluble hydrolysate obtained from the red microalgae Rhodosorus marinus Geitler (Stylonemataceae). Transcriptomic analysis performed on ≈100 genes related to skin biological functions firstly revealed that the crude Rhodosorus marinus extract was able to significantly negatively modulate specific genes involved in pro-inflammation (interleukin 1α encoding gene, IL1A) and pain detection related to tissue inflammation (nerve growth factor NGF and its receptor NGFR). An in vitro model of normal human keratinocytes was then used to evaluate the ability of the Rhodosorus marinus extract to control the release of neuro-inflammation mediators under phorbol myristate acetate (PMA)-induced inflammatory conditions. The extract incorporated at 1% and 3% significantly inhibited the release of IL-1α and NGF secretion. These results were confirmed in a co-culture system of reconstructed human epithelium and normal human epidermal keratinocytes on which a cream formulated with the Rhodosorus marinus extract at 1% and 3% was topically applied after systemic induction of neuro-inflammation. Finally, an in vitro model of normal human astrocytes was developed for the evaluation of transient receptor potential vanilloid 1 (TRPV1) receptor modulation, mimicking pain sensing related to neuro-inflammation as observed in sensitive skins. Treatment with the Rhodosorus marinus extract at 1% and 3% significantly decreased PMA-mediated TRPV1 over-expression. In parallel with these biological experiments, the crude Rhodosorus marinus extract was fractionated by centrifugal partition chromatography (CPC) and chemically profiled by a recently developed 13C NMR-based dereplication method. The CPC-generated fractions as well as pure metabolites were tested again in vitro in an attempt to identify the biologically active constituents involved in the neuro-soothing activity of the Rhodosorus
Indian Academy of Sciences (India)
inference and finite population sampling. Sudhakar Kunte. Elements of statistical computing are discussed in this series. ... which captain gets an option to decide whether to field first or bat first ... may of course not be fair, in the sense that the team which wins ... describe two methods of drawing a random number between 0.
Schrödinger, Erwin
1952-01-01
Nobel Laureate's brilliant attempt to develop a simple, unified standard method of dealing with all cases of statistical thermodynamics - classical, quantum, Bose-Einstein, Fermi-Dirac, and more. The work also includes discussions of the Nernst theorem, Planck's oscillator, fluctuations, the n-particle problem, the problem of radiation, and much more.
Statistics: a Bayesian perspective
National Research Council Canada - National Science Library
Berry, Donald A
1996-01-01
...: it is the only introductory textbook based on Bayesian ideas, it combines concepts and methods, it presents statistics as a means of integrating data into the significant process, it develops ideas...
Attitudes toward statistics in medical postgraduates: measuring, evaluating and monitoring.
Zhang, Yuhai; Shang, Lei; Wang, Rui; Zhao, Qinbo; Li, Chanjuan; Xu, Yongyong; Su, Haixia
2012-11-23
In medical training, statistics is considered a very difficult course to learn and teach. Current studies have found that students' attitudes toward statistics can influence their learning process. Measuring, evaluating and monitoring the changes of students' attitudes toward statistics are important. Few studies have focused on the attitudes of postgraduates, especially medical postgraduates. Our purpose was to understand current attitudes regarding statistics held by medical postgraduates and explore their effects on students' achievement. We also wanted to explore the influencing factors and the sources of these attitudes and monitor their changes after a systematic statistics course. A total of 539 medical postgraduates enrolled in a systematic statistics course completed the pre-form of the Survey of Attitudes Toward Statistics -28 scale, and 83 postgraduates were selected randomly from among them to complete the post-form scale after the course. Most medical postgraduates held positive attitudes toward statistics, but they thought statistics was a very difficult subject. The attitudes mainly came from experiences in a former statistical or mathematical class. Age, level of statistical education, research experience, specialty and mathematics basis may influence postgraduate attitudes toward statistics. There were significant positive correlations between course achievement and attitudes toward statistics. In general, student attitudes showed negative changes after completing a statistics course. The importance of student attitudes toward statistics must be recognized in medical postgraduate training. To make sure all students have a positive learning environment, statistics teachers should measure their students' attitudes and monitor their change of status during a course. Some necessary assistance should be offered for those students who develop negative attitudes.
Attitudes toward statistics in medical postgraduates: measuring, evaluating and monitoring
2012-01-01
Background In medical training, statistics is considered a very difficult course to learn and teach. Current studies have found that students’ attitudes toward statistics can influence their learning process. Measuring, evaluating and monitoring the changes of students’ attitudes toward statistics are important. Few studies have focused on the attitudes of postgraduates, especially medical postgraduates. Our purpose was to understand current attitudes regarding statistics held by medical postgraduates and explore their effects on students’ achievement. We also wanted to explore the influencing factors and the sources of these attitudes and monitor their changes after a systematic statistics course. Methods A total of 539 medical postgraduates enrolled in a systematic statistics course completed the pre-form of the Survey of Attitudes Toward Statistics −28 scale, and 83 postgraduates were selected randomly from among them to complete the post-form scale after the course. Results Most medical postgraduates held positive attitudes toward statistics, but they thought statistics was a very difficult subject. The attitudes mainly came from experiences in a former statistical or mathematical class. Age, level of statistical education, research experience, specialty and mathematics basis may influence postgraduate attitudes toward statistics. There were significant positive correlations between course achievement and attitudes toward statistics. In general, student attitudes showed negative changes after completing a statistics course. Conclusions The importance of student attitudes toward statistics must be recognized in medical postgraduate training. To make sure all students have a positive learning environment, statistics teachers should measure their students’ attitudes and monitor their change of status during a course. Some necessary assistance should be offered for those students who develop negative attitudes. PMID:23173770
International Nuclear Information System (INIS)
Anon.
1994-01-01
For the years 1992 and 1993, some of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the annually issued publication Energiatilastot - Energy Statistics, which also includes historical time series over a longer period. The tables and figures shown in this publication are: Changes in the volume of GNP and energy consumption; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Price development of principal oil products; Fuel prices for power production; Total energy consumption by source; Electricity supply; Energy imports by country of origin in 1993; Energy exports by recipient country in 1993; Consumer prices of liquid fuels; Consumer prices of hard coal and natural gas, prices of indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer; and Excise taxes and turnover taxes included in consumer prices of some energy sources
Goodman, Joseph W.
2000-07-01
The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research
Pivato, Marcus
2013-01-01
We show that, in a sufficiently large population satisfying certain statistical regularities, it is often possible to accurately estimate the utilitarian social welfare function, even if we only have very noisy data about individual utility functions and interpersonal utility comparisons. In particular, we show that it is often possible to identify an optimal or close-to-optimal utilitarian social choice using voting rules such as the Borda rule, approval voting, relative utilitarianism, or a...
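A toy simulation in the spirit of this result (the three alternatives, their mean utilities, and the noise level below are invented for illustration; this is a sketch, not Pivato's construction) shows the Borda winner tracking the utilitarian optimum once the population is large:

```python
import random

random.seed(3)

alternatives = ["A", "B", "C"]
# Hypothetical true mean utilities; "B" is the utilitarian optimum.
true_util = {"A": 0.50, "B": 0.55, "C": 0.45}

borda = {a: 0 for a in alternatives}
welfare = {a: 0.0 for a in alternatives}
for _ in range(5000):  # a large population of voters
    # Each voter reports only a very noisy version of the true utilities.
    noisy = {a: true_util[a] + random.gauss(0, 0.3) for a in alternatives}
    for a in alternatives:
        welfare[a] += noisy[a]
    # Borda: 2 points for the top choice, 1 for the middle, 0 for the last.
    ranking = sorted(alternatives, key=noisy.get, reverse=True)
    for points, a in enumerate(reversed(ranking)):
        borda[a] += points

utilitarian_winner = max(welfare, key=welfare.get)
borda_winner = max(borda, key=borda.get)
print(utilitarian_winner, borda_winner)
```

Although each individual report is dominated by noise, the two selection rules almost always agree on the same alternative as the number of voters grows, which is the statistical regularity the abstract appeals to.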
Natrella, Mary Gibbons
1963-01-01
Formulated to assist scientists and engineers engaged in army ordnance research and development programs, this well-known and highly regarded handbook is a ready reference for advanced undergraduate and graduate students as well as for professionals seeking engineering information and quantitative data for designing, developing, constructing, and testing equipment. Topics include characterizing and comparing the measured performance of a material, product, or process; general considerations in planning experiments; statistical techniques for analyzing extreme-value data; use of transformations
Intervention for Maltreating Fathers: Statistically and Clinically Significant Change
Scott, Katreena L.; Lishak, Vicky
2012-01-01
Objective: Fathers are seldom the focus of efforts to address child maltreatment and little is currently known about the effectiveness of intervention for this population. To address this gap, we examined the efficacy of a community-based group treatment program for fathers who had abused or neglected their children or exposed their children to…
The questioned p value: clinical, practical and statistical significance
Directory of Open Access Journals (Sweden)
Rosa Jiménez-Paneque
2016-09-01
Full Text Available Abstract The use of the p-value and statistical significance has been under question from the early 1980s to the present day. Much has been discussed on the subject within statistics and its applications, particularly in epidemiology and public health. The p-value and its equivalent, statistical significance, are moreover difficult concepts to grasp for the many health professionals involved in some way in research applied to their areas of work. Nevertheless, their meaning should be clear in intuitive terms even though they rest on theoretical concepts from the field of mathematical statistics. This article attempts to present the p-value as a concept that applies to everyday life and is therefore intuitively simple, but whose proper use cannot be separated from theoretical and methodological elements of intrinsic complexity. The reasons behind the criticisms levelled at the p-value and its use in isolation are also explained intuitively, chief among them the need to distinguish statistical significance from clinical significance, and some of the remedies proposed for these problems are mentioned. The article closes by alluding to the current tendency to vindicate its use, appealing to the convenience of employing it in certain situations, and to the recent statement of the American Statistical Association on the matter.
Sibling Competition & Growth Tradeoffs. Biological vs. Statistical Significance
Kramer, Karen L.; Veile, Amanda; Otárola-Castillo, Erik
2016-01-01
Early childhood growth has many downstream effects on future health and reproduction and is an important measure of offspring quality. While a tradeoff between family size and child growth outcomes is theoretically predicted in high-fertility societies, empirical evidence is mixed. This is often attributed to phenotypic variation in parental condition. However, inconsistent study results may also arise because family size confounds the potentially differential effects that older and younger s...
International Nuclear Information System (INIS)
Anon.
1989-01-01
World data from the United Nations' latest Energy Statistics Yearbook, first published in our last issue, are completed here. The 1984-86 data were revised and 1987 data added for world commercial energy production and consumption; world natural gas plant liquids production; world LP-gas production, imports, exports, and consumption; world residual fuel oil production, imports, exports, and consumption; world lignite production, imports, exports, and consumption; world peat production and consumption; world electricity production, imports, exports, and consumption (Table 80); and world nuclear electric power production
International Nuclear Information System (INIS)
Paglia, Gianluca; Rohl, Andrew L.; Gale, Julian D.; Buckley, Craig E.
2005-01-01
We have performed an extensive computational study of γ-Al2O3, beginning with the geometric analysis of approximately 1.47 billion spinel-based structural candidates, followed by derivative method energy minimization calculations of approximately 122 000 structures. Optimization of the spinel-based structural models demonstrated that structures exhibiting nonspinel site occupancy after simulation were more energetically favorable, as suggested in other computational studies. More importantly, none of the spinel structures exhibited simulated diffraction patterns that were characteristic of γ-Al2O3. This suggests that cations of γ-Al2O3 are not exclusively held in spinel positions, that the spinel model of γ-Al2O3 does not accurately reflect its structure, and that a representative structure cannot be achieved from molecular modeling when the spinel representation is used as the starting structure. The latter two of these three findings are extremely important when trying to accurately model the structure. A second set of starting models was generated with a large number of cations occupying c symmetry positions, based on the findings from recent experiments. Optimization of the new c symmetry-based structural models resulted in simulated diffraction patterns that were characteristic of γ-Al2O3. The modeling, conducted using supercells, yields a more accurate and complete determination of the defect structure of γ-Al2O3 than can be achieved with current experimental techniques. The results show that on average over 40% of the cations in the structure occupy nonspinel positions, and approximately two-thirds of these occupy c symmetry positions. The structures exhibit variable occupancy in the site positions that follow local symmetry exclusion rules. This variation was predominantly represented by a migration of cations away from a symmetry positions to other tetrahedral site positions during optimization which were found not to affect the
Statistics 101 for Radiologists.
Anvari, Arash; Halpern, Elkan F; Samir, Anthony E
2015-10-01
Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
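The diagnostic-accuracy measures discussed in the review can be sketched with a toy 2×2 table; the counts below are invented for illustration, not taken from the review:

```python
# Hypothetical counts for one diagnostic test against a gold standard.
tp, fn = 90, 10    # diseased patients: test positive / test negative
fp, tn = 30, 170   # healthy patients: test positive / test negative

sensitivity = tp / (tp + fn)            # P(test+ | disease)
specificity = tn / (tn + fp)            # P(test- | no disease)
accuracy = (tp + tn) / (tp + fn + fp + tn)

# Likelihood ratios fold sensitivity and specificity into a single
# number that updates the pre-test odds of disease.
lr_positive = sensitivity / (1 - specificity)
lr_negative = (1 - sensitivity) / specificity

print(sensitivity, specificity, accuracy, lr_positive, lr_negative)
```

The same four counts also drive the review's other measures: varying the test's positivity threshold and re-computing sensitivity against (1 − specificity) traces out the receiver operating characteristic curve.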
Neave, Henry R
2012-01-01
This book, designed for students taking a basic introductory course in statistical analysis, is far more than just a book of tables. Each table is accompanied by a careful but concise explanation and useful worked examples. Requiring little mathematical background, Elementary Statistics Tables is thus not just a reference book but a positive and user-friendly teaching and learning aid. The new edition contains a new and comprehensive "teach-yourself" section on a simple but powerful approach, now well-known in parts of industry but less so in academia, to analysing and interpreting process data
National Statistical Commission and Indian Official Statistics*
Indian Academy of Sciences (India)
IAS Admin
a good collection of official statistics of that time. With more .... statistical agencies and institutions to provide details of statistical activities .... ing several training programmes. .... ful completion of Indian Statistical Service examinations, the.
Müller-Kirsten, Harald J W
2013-01-01
Statistics links microscopic and macroscopic phenomena, and requires for this reason a large number of microscopic elements like atoms. The results are values of maximum probability or of averaging. This introduction to statistical physics concentrates on the basic principles, and attempts to explain these in simple terms supplemented by numerous examples. These basic principles include the difference between classical and quantum statistics, a priori probabilities as related to degeneracies, the vital aspect of indistinguishability as compared with distinguishability in classical physics, the differences between conserved and non-conserved elements, the different ways of counting arrangements in the three statistics (Maxwell-Boltzmann, Fermi-Dirac, Bose-Einstein), the difference between maximization of the number of arrangements of elements, and averaging in the Darwin-Fowler method. Significant applications to solids, radiation and electrons in metals are treated in separate chapters, as well as Bose-Eins...
Directory of Open Access Journals (Sweden)
Joachim I. Krueger
2018-04-01
Full Text Available The practice of Significance Testing (ST remains widespread in psychological science despite continual criticism of its flaws and abuses. Using simulation experiments, we address four concerns about ST and for two of these we compare ST’s performance with prominent alternatives. We find the following: First, the 'p' values delivered by ST predict the posterior probability of the tested hypothesis well under many research conditions. Second, low 'p' values support inductive inferences because they are most likely to occur when the tested hypothesis is false. Third, 'p' values track likelihood ratios without raising the uncertainties of relative inference. Fourth, 'p' values predict the replicability of research findings better than confidence intervals do. Given these results, we conclude that 'p' values may be used judiciously as a heuristic tool for inductive inference. Yet, 'p' values cannot bear the full burden of inference. We encourage researchers to be flexible in their selection and use of statistical methods.
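The first two findings can be reproduced in miniature with a simulation (the one-sample z-test, sample size, effect size, and trial counts below are my assumptions for illustration, not the authors' actual experimental design):

```python
import math
import random

random.seed(7)

def p_value(sample, mu0=0.0, sigma=1.0):
    """Two-sided z-test p-value for H0: mean == mu0, sigma known."""
    n = len(sample)
    z = (sum(sample) / n - mu0) / (sigma / math.sqrt(n))
    # Standard normal tail probability via the error function.
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

def low_p_rate(true_mean, trials=2000, n=20, alpha=0.05):
    """Fraction of simulated studies with p < alpha."""
    hits = sum(
        p_value([random.gauss(true_mean, 1.0) for _ in range(n)]) < alpha
        for _ in range(trials)
    )
    return hits / trials

# When H0 is true, p < .05 occurs about 5% of the time; when H0 is
# false (true mean 0.8 here), low p values are far more frequent --
# which is why a low p supports an inductive inference against H0.
print(low_p_rate(0.0), low_p_rate(0.8))
```

The gap between the two rates is the sense in which low p values "are most likely to occur when the tested hypothesis is false".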
Can a significance test be genuinely Bayesian?
Pereira, Carlos A. de B.; Stern, Julio Michael; Wechsler, Sergio
2008-01-01
The Full Bayesian Significance Test, FBST, is extensively reviewed. Its test statistic, a genuine Bayesian measure of evidence, is discussed in detail. Its behavior in some problems of statistical inference like testing for independence in contingency tables is discussed.
Kobau, Rosemarie; Cui, Wanjun; Zack, Matthew M
2017-07-01
Healthy People 2020, a national health promotion initiative, calls for increasing the proportion of U.S. adults who self-report good or better health. The Patient-Reported Outcomes Measurement Information System (PROMIS) Global Health Scale (GHS) was identified as a reliable and valid set of items of self-reported physical and mental health to monitor these two domains across the decade. The purpose of this study was to examine the percentage of adults with an epilepsy history who met the Healthy People 2020 target for self-reported good or better health and to compare these percentages to adults with history of other common chronic conditions. Using the 2010 National Health Interview Survey, we compared and estimated the age-standardized prevalence of reporting good or better physical and mental health among adults with five selected chronic conditions including epilepsy, diabetes, heart disease, cancer, and hypertension. We examined response patterns for physical and mental health scale among adults with these five conditions. The percentages of adults with epilepsy who reported good or better physical health (52%) or mental health (54%) were significantly below the Healthy People 2020 target estimate of 80% for both outcomes. Significantly smaller percentages of adults with an epilepsy history reported good or better physical health than adults with heart disease, cancer, or hypertension. Significantly smaller percentages of adults with an epilepsy history reported good or better mental health than adults with all other four conditions. Health and social service providers can implement and enhance existing evidence-based clinical interventions and public health programs and strategies shown to improve outcomes in epilepsy. These estimates can be used to assess improvements in the Healthy People 2020 Health-Related Quality of Life and Well-Being Objective throughout the decade. Published by Elsevier Inc.
Whither Statistics Education Research?
Watson, Jane
2016-01-01
This year marks the 25th anniversary of the publication of a "National Statement on Mathematics for Australian Schools", which was the first curriculum statement this country had including "Chance and Data" as a significant component. It is hence an opportune time to survey the history of the related statistics education…
Behavioral investment strategy matters: a statistical arbitrage approach
Sun, David; Tsai, Shih-Chuan; Wang, Wei
2011-01-01
In this study, we employ a statistical arbitrage approach to demonstrate that momentum investment strategies tend to work better over periods longer than six months, a result that differs from findings in the past literature. Compared with standard parametric tests, the statistical arbitrage method shows more clearly that momentum strategies work only over longer formation and holding periods. They also yield significantly positive returns in an up market, but negative yet insignificant returns in a down...
Maccone, C.
This paper provides the statistical generalization of the Fermi paradox. The statistics of habitable planets may be based on a set of ten (and possibly more) astrobiological requirements first pointed out by Stephen H. Dole in his book Habitable planets for man (1964). The statistical generalization of the original, by now too simplistic, Dole equation is obtained by replacing a product of ten positive numbers with a product of ten positive random variables. This is denoted the SEH, an acronym standing for “Statistical Equation for Habitables”. The proof in this paper is based on the Central Limit Theorem (CLT) of statistics, stating that the sum of any number of independent random variables, each of which may be arbitrarily distributed, approaches a Gaussian (i.e. normal) random variable (Lyapunov form of the CLT). It is then shown that: 1. The new random variable NHab, yielding the number of habitables (i.e. habitable planets) in the Galaxy, follows the log-normal distribution. By construction, the mean value of this log-normal distribution is the total number of habitable planets as given by the statistical Dole equation. 2. The ten (or more) astrobiological factors are now positive random variables. The probability distribution of each random variable may be arbitrary. The CLT in the so-called Lyapunov or Lindeberg forms (that both do not assume the factors to be identically distributed) allows for that. In other words, the CLT "translates" into the SEH by allowing an arbitrary probability distribution for each factor. This is both astrobiologically realistic and useful for any further investigations. 3. By applying the SEH it is shown that the (average) distance between any two nearby habitable planets in the Galaxy is inversely proportional to the cubic root of NHab. This distance is denoted by the new random variable D. The relevant probability density function is derived, which was named the "Maccone distribution" by Paul Davies in
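The SEH's central claim — that a product of many independent, arbitrarily distributed positive factors is approximately log-normal — can be checked numerically (the uniform factors below are an arbitrary illustrative choice, not Dole's ten actual requirements):

```python
import math
import random
import statistics

random.seed(1)

def skewness(xs):
    """Standardized third central moment (0 for a symmetric law)."""
    m, s = statistics.fmean(xs), statistics.stdev(xs)
    return statistics.fmean([((x - m) / s) ** 3 for x in xs])

def n_hab():
    """Product of ten positive random factors (illustrative uniforms)."""
    product = 1.0
    for _ in range(10):
        product *= random.uniform(0.1, 1.0)
    return product

samples = [n_hab() for _ in range(20000)]
logs = [math.log(x) for x in samples]

# If the product is roughly log-normal, its log is roughly normal:
# the raw samples are strongly right-skewed, the logs nearly symmetric.
print(skewness(samples), skewness(logs))
```

Taking logs turns the product into a sum, so the CLT applies to the log exactly as the paper argues; any other positive factor distributions would do as well.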
Worry, Intolerance of Uncertainty, and Statistics Anxiety
Williams, Amanda S.
2013-01-01
Statistics anxiety is a problem for most graduate students. This study investigates the relationship between intolerance of uncertainty, worry, and statistics anxiety. Intolerance of uncertainty was significantly related to worry, and worry was significantly related to three types of statistics anxiety. Six types of statistics anxiety were…
Childhood Cancer Statistics – Graphs and Infographics: Number of Diagnoses; Incidence Rates ...
Energy Technology Data Exchange (ETDEWEB)
NONE
2010-07-01
The IEA produced its first handy, pocket-sized summary of key energy data in 1997. This new edition responds to the enormously positive reaction to the book since then. Key World Energy Statistics produced by the IEA contains timely, clearly-presented data on supply, transformation and consumption of all major energy sources. The interested businessman, journalist or student will have at his or her fingertips the annual Canadian production of coal, the electricity consumption in Thailand, the price of diesel oil in Spain and thousands of other useful energy facts. It exists in different formats to suit our readers' requirements.
Suonio, S; Simpanen, A L; Olkkonen, H; Haring, P
1976-02-01
The placental blood flow was assessed by the 99mTc accumulation method in 10 normal pregnancies in the left lateral recumbent position, achieved with a 15-degree wedge, and in the supine position. The postural change caused a 17% decrease in the mean placental accumulation rate, which was not statistically significant. Ten patients were moved from the left lateral recumbent position to the upright position, which caused a statistically significant 23% decrease in the mean accumulation rate. Other haemodynamic variables studied were the maternal heart rate and the systolic and diastolic blood pressures. The clinical significance of the haemodynamic changes produced by alterations in posture is briefly discussed.
Statistics for experimentalists
Cooper, B E
2014-01-01
Statistics for Experimentalists aims to provide experimental scientists with a working knowledge of statistical methods and search approaches to the analysis of data. The book first elaborates on probability and continuous probability distributions. Discussions focus on properties of continuous random variables and normal variables, independence of two random variables, central moments of a continuous distribution, prediction from a normal distribution, binomial probabilities, and multiplication of probabilities and independence. The text then examines estimation and tests of significance. Topics include estimators and estimates, expected values, minimum variance linear unbiased estimators, sufficient estimators, methods of maximum likelihood and least squares, and the test of significance method. The manuscript ponders on distribution-free tests, Poisson process and counting problems, correlation and function fitting, balanced incomplete randomized block designs and the analysis of covariance, and experiment...
MQSA National Statistics ... but should level off with time. Archived Scorecard Statistics: 2018, 2017, 2016 ...
State Transportation Statistics 2014
2014-12-15
The Bureau of Transportation Statistics (BTS) presents State Transportation Statistics 2014, a statistical profile of transportation in the 50 states and the District of Columbia. This is the 12th annual edition of State Transportation Statistics, a ...
Significance evaluation in factor graphs
DEFF Research Database (Denmark)
Madsen, Tobias; Hobolth, Asger; Jensen, Jens Ledet
2017-01-01
in genomics and the multiple-testing issues accompanying them, accurate significance evaluation is of great importance. We here address the problem of evaluating the statistical significance of observations from factor graph models. Results Two novel numerical approximations for evaluating statistical significance are presented: first, a method using importance sampling; second, a saddlepoint-approximation-based method. We develop algorithms to efficiently compute the approximations and compare them to naive sampling and the normal approximation. The individual merits of the methods are analysed both from ... Conclusions The applicability of saddlepoint approximation and importance sampling is demonstrated on known models in the factor graph framework. Using the two methods we can substantially reduce computational cost without compromising accuracy. This contribution allows analyses of large datasets ...
Renyi statistics in equilibrium statistical mechanics
International Nuclear Information System (INIS)
Parvan, A.S.; Biro, T.S.
2010-01-01
The Renyi statistics in the canonical and microcanonical ensembles is examined both in general and in particular for the ideal gas. In the microcanonical ensemble the Renyi statistics is equivalent to the Boltzmann-Gibbs statistics. By the exact analytical results for the ideal gas, it is shown that in the canonical ensemble, taking the thermodynamic limit, the Renyi statistics is also equivalent to the Boltzmann-Gibbs statistics. Furthermore it satisfies the requirements of the equilibrium thermodynamics, i.e. the thermodynamical potential of the statistical ensemble is a homogeneous function of first degree of its extensive variables of state. We conclude that the Renyi statistics arrives at the same thermodynamical relations, as those stemming from the Boltzmann-Gibbs statistics in this limit.
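For reference (a standard definition, not quoted from the abstract), the Rényi entropy and the q → 1 limit in which it reduces to the Boltzmann-Gibbs (Shannon) form are:

```latex
S_q^{\mathrm{R}} = \frac{1}{1-q}\,\ln \sum_i p_i^{\,q},
\qquad
\lim_{q \to 1} S_q^{\mathrm{R}} = -\sum_i p_i \ln p_i = S^{\mathrm{BG}}
```

The limit follows from L'Hôpital's rule together with the normalization $\sum_i p_i = 1$, which makes the equivalence of the two statistics in the appropriate limits plausible.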
Sampling, Probability Models and Statistical Reasoning: Statistical Inference
Indian Academy of Sciences (India)
Sampling, Probability Models and Statistical Reasoning: Statistical Inference. Mohan Delampady, V R Padmawar. General Article, Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp 49-58.
Statistics Using Just One Formula
Rosenthal, Jeffrey S.
2018-01-01
This article advocates that introductory statistics be taught by basing all calculations on a single simple margin-of-error formula and deriving all of the standard introductory statistical concepts (confidence intervals, significance tests, comparisons of means and proportions, etc) from that one formula. It is argued that this approach will…
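The abstract does not reproduce the formula itself; as an illustration of the general idea (my assumption about what such a single formula could look like, not necessarily the article's), the conservative 95% margin of error for a proportion, 1.96·√(0.25/n) ≈ 1/√n, already yields both confidence intervals and significance tests:

```python
import math

def margin_of_error(n):
    """Conservative 95% margin of error for a proportion over n trials."""
    return 1.96 * math.sqrt(0.25 / n)

def confidence_interval(p_hat, n):
    """95% confidence interval for an observed proportion p_hat."""
    moe = margin_of_error(n)
    return (p_hat - moe, p_hat + moe)

def differs_significantly(p_hat, p0, n):
    """Significance test recast via the same formula: is the
    hypothesized proportion p0 outside the interval?"""
    lo, hi = confidence_interval(p_hat, n)
    return not (lo <= p0 <= hi)

# 540 heads in 1000 tosses: the interval excludes 0.5, so the coin
# differs significantly from fair at roughly the 5% level.
print(confidence_interval(0.54, 1000), differs_significantly(0.54, 0.5, 1000))
```

Comparisons of two means or proportions follow the same pattern, with the margin of error applied to the difference, which is the pedagogical economy the article argues for.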
Statistics Anxiety among Postgraduate Students
Koh, Denise; Zawi, Mohd Khairi
2014-01-01
Most postgraduate programmes, that have research components, require students to take at least one course of research statistics. Not all postgraduate programmes are science based, there are a significant number of postgraduate students who are from the social sciences that will be taking statistics courses, as they try to complete their…
Osman, Habab; Alahmadi, Maryam; Bakhiet, Salaheldin; Lynn, Richard
2016-12-01
Data are reported showing statistically significant negative correlations between intelligence and family size, position, and income in a sample of 604 adolescent girls in Saudi Arabia. There were no statistically significant correlations or associations between whether the mother or father were deceased or both parents were alive, and whether the parents were living together or were divorced. © The Author(s) 2016.
Official statistics and Big Data
Directory of Open Access Journals (Sweden)
Peter Struijs
2014-07-01
Full Text Available The rise of Big Data changes the context in which organisations producing official statistics operate. Big Data provides opportunities, but in order to make optimal use of Big Data, a number of challenges have to be addressed. This stimulates increased collaboration between National Statistical Institutes, Big Data holders, businesses and universities. In time, this may lead to a shift in the role of statistical institutes in the provision of high-quality and impartial statistical information to society. In this paper, the changes in context, the opportunities, the challenges and the way to collaborate are addressed. The collaboration between the various stakeholders will involve each partner building on and contributing different strengths. For national statistical offices, traditional strengths include, on the one hand, the ability to collect data and combine data sources with statistical products and, on the other hand, their focus on quality, transparency and sound methodology. In the Big Data era of competing and multiplying data sources, they continue to have a unique knowledge of official statistical production methods. And their impartiality and respect for privacy as enshrined in law uniquely position them as a trusted third party. Based on this, they may advise on the quality and validity of information of various sources. By thus positioning themselves, they will be able to play their role as key information providers in a changing society.
International Nuclear Information System (INIS)
Venkataraman, G.
1992-01-01
Treating the radiation gas as a classical gas, Einstein derived Planck's law of radiation by considering the dynamic equilibrium between atoms and radiation. Dissatisfied with this treatment, S.N. Bose derived Planck's law in another, original way. He treated the problem in full generality: he counted how many cells were available for the photon gas in phase space and distributed the photons into these cells. This manner of distribution contained three radically new ideas: the indistinguishability of particles, the spin of the photon (with only two possible orientations) and the non-conservation of photon number. This gave rise to a new discipline of quantum statistical mechanics. The physics underlying Bose's discovery, its significance and its role in the development of the concept of the ideal gas, the spin-statistics theorem and spin particles are described. The book has been written in simple and direct language in an informal style, aiming to stimulate the curiosity of the reader. (M.G.B.)
Social Security Administration — The Position Information Data Asset provides the ability to search for active SSA position descriptions using various search criteria. An individual may search by PD...
Statistical modeling of Earth's plasmasphere
Veibell, Victoir
The behavior of plasma near Earth's geosynchronous orbit is of vital importance to both satellite operators and magnetosphere modelers because it has a significant influence on energy transport, ion composition, and induced currents. The system is highly complex in both time and space, making the forecasting of extreme space weather events difficult. This dissertation examines the behavior and statistical properties of plasma mass density near geosynchronous orbit by using both linear and nonlinear models, as well as epoch analyses, in an attempt to better understand the physical processes that precipitate and drive its variations. It is shown that while equatorial mass density does vary significantly on an hourly timescale when a drop in the disturbance storm time index (Dst) is observed, it does not vary significantly between the day of a Dst event onset and the day immediately following. It is also shown that increases in equatorial mass density were not, on average, preceded or followed by any significant change in the examined solar wind or geomagnetic variables, including Dst, despite prior results that considered a few selected events and found a notable influence. It is verified that equatorial mass density and solar activity (via the F10.7 index) have a strong correlation, which is stronger over longer timescales such as 27 days than over an hourly timescale. It is then shown that this connection seems to affect the behavior of equatorial mass density most during periods of strong solar activity, leading to large mass density reactions to Dst drops for high values of F10.7. It is also shown that equatorial mass density behaves differently before and after events based on the value of F10.7 at the onset of an equatorial mass density event or a Dst event, and that a southward interplanetary magnetic field at onset leads to slowed mass density growth after event onset. These behavioral differences provide insight into how solar and geomagnetic
Peterson, Christopher
2009-01-01
Positive psychology is a deliberate correction to the focus of psychology on problems. Positive psychology does not deny the difficulties that people may experience but does suggest that sole attention to disorder leads to an incomplete view of the human condition. Positive psychologists concern themselves with four major topics: (1) positive…
Induction position for spinal anaesthesia: Sitting versus lateral position
International Nuclear Information System (INIS)
Shahzad, K.; Afshan, G.
2013-01-01
Objective: To compare the effect of induction position on block characteristics (sensory and motor) and haemodynamic stability in elderly patients receiving isobaric bupivacaine. Patient comfort was also assessed. Methods: The randomized single-blinded study was conducted at the Aga Khan University Hospital, Karachi, from September 2007 to August 2008. A total of 70 patients aged >60 years of both genders were included. Spinal anaesthesia was performed in either the sitting or the lateral position according to random allocation. Sensory and motor block, heart rate, and systolic and diastolic blood pressure were recorded for 20 minutes. SPSS 16 was used for statistical analysis. Results: There was no significant difference in the haemodynamic variables (heart rate, systolic and diastolic blood pressure). The onset of anaesthesia was faster in the sitting group (4.5 minutes vs 5.4 minutes). Motor block characteristics were similar in both groups. The majority of patients who reported being 'very comfortable' with the induction position belonged to the lateral group. Conclusion: Both sitting and lateral positions have similar effects on sensory and motor blockade and haemodynamic stability. However, patients generally found the lateral position very comfortable. (author)
Mannings, Robin
2008-01-01
This groundbreaking resource offers a practical, in-depth understanding of Ubiquitous Positioning - positioning systems that identify the location and position of people, vehicles and objects in time and space in the digitized networked economy. The future and growth of ubiquitous positioning will be fueled by the convergence of many other areas of technology, from mobile telematics, Internet technology, and location systems, to sensing systems, geographic information systems, and the semantic web. This first-of-its-kind volume explores ubiquitous positioning from a convergence perspective, of
Statistical and theoretical research
International Nuclear Information System (INIS)
Anon.
1983-01-01
Significant accomplishments include the creation of field designs to detect population impacts, new census procedures for small mammals, and methods for designing studies to determine where, and in what quantity, a contaminant is present over certain landscapes. A book describing these statistical methods is currently being written and will apply to a variety of environmental contaminants, including radionuclides. PNL scientists also have devised an analytical method for predicting the success of field experiments on wild populations. Two highlights of current research are the discoveries that populations of free-roaming horse herds can double in four years and that grizzly bear populations may be substantially smaller than once thought. As stray horses become a public nuisance at DOE and other large Federal sites, it is important to determine their number. Similar statistical theory can be readily applied to other situations where wild animals are a problem of concern to other government agencies. Another book, on statistical aspects of radionuclide studies, is written specifically for researchers in radioecology
Savage, Leonard J
1972-01-01
Classic analysis of the foundations of statistics and development of personal probability, one of the greatest controversies in modern statistical thought. Revised edition. Calculus, probability, statistics, and Boolean algebra are recommended.
State Transportation Statistics 2010
2011-09-14
The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2010, a statistical profile of transportation in the 50 states and the District of Col...
State Transportation Statistics 2012
2013-08-15
The Bureau of Transportation Statistics (BTS), a part of the U.S. Department of Transportation's (USDOT) Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2012, a statistical profile of transportation ...
Adrenal Gland Tumors: Statistics
Approved by the Cancer.Net Editorial Board, 03/ ... primary adrenal gland tumor is very uncommon. Exact statistics are not available for this type of tumor ...
State transportation statistics 2009
2009-01-01
The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2009, a statistical profile of transportation in the 50 states and the District ...
State Transportation Statistics 2011
2012-08-08
The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2011, a statistical profile of transportation in the 50 states and the District of Col...
Neuroendocrine Tumor: Statistics
Approved by the Cancer.Net Editorial Board, 01/ ... the body. It is important to remember that statistics on the survival rates for people with a ...
State Transportation Statistics 2013
2014-09-19
The Bureau of Transportation Statistics (BTS), a part of the U.S. Department of Transportation's (USDOT) Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2013, a statistical profile of transportatio...
BTS statistical standards manual
2005-10-01
The Bureau of Transportation Statistics (BTS), like other federal statistical agencies, establishes professional standards to guide the methods and procedures for the collection, processing, storage, and presentation of statistical data. Standards an...
International Nuclear Information System (INIS)
Robinson, M.T.
1993-01-01
The MARLOWE program was used to study the statistics of sputtering using the example of 1- to 100-keV Au atoms normally incident on static (001) and (111) Au crystals. The yield of sputtered atoms was examined as a function of the impact point of the incident particles ('ions') on the target surfaces. There were variations on two scales. The effects of the axial and planar channeling of the ions could be traced, the details depending on the orientation of the target and the energies of the ions. Locally, the sputtering yield was very sensitive to the impact point, small changes in position often producing large changes in yield. The results strongly indicate that the sputtering yield is a random ('chaotic') function of the impact point
DEFF Research Database (Denmark)
Halkier, Bente; Keller, Margit
2014-01-01
This article analyses the ways in which media discourses become a part of contested consumption activities. We apply a positioning perspective with practice theory to focus on how practitioners relate to media discourse as a symbolic resource in their everyday practices. A typology of performance positionings emerges based on empirical examples of research in parent–children consumption. Positionings are flexible discursive fixations of the relationship between the performances of the practitioner, other practitioners, media discourse and consumption activities. The basic positioning types are the practice maintenance and the practice change position, with different sorts of adapting in between. Media discourse can become a resource for a resistant position against social control or for an appropriating position in favour of space for action. Regardless of the current relation to a particular media…
Significance of thymic scintigraphy
International Nuclear Information System (INIS)
Baba, Hiromi; Oshiumi, Yoshihiko; Nakayama, Chikashi; Morita, Kazunori; Koga, Ichinari
1978-01-01
Thymic scintigraphy with 67Ga-citrate and 75Se-methionine was performed on 6 cases of thymoma and 5 cases of myasthenia gravis. Scans were positive in 5 of the 6 cases of thymoma; all patients with malignant thymoma were positive. Among the cases of myasthenia gravis, scintigrams revealed 2 thymomas and 1 hyperplasia in patients in whom no thymic mass had been suspected. Thymic scintigraphy is a useful examination when dealing with myasthenia gravis. (auth.)
Medical Statistics – Mathematics or Oracle? Farewell Lecture
Directory of Open Access Journals (Sweden)
Gaus, Wilhelm
2005-06-01
Full Text Available Certainty is rare in medicine. This is a direct consequence of the individuality of each and every human being and the reason why we need medical statistics. However, statistics have their pitfalls, too. Fig. 1 shows that the suicide rate peaks in youth, while in Fig. 2 the rate is highest in midlife and Fig. 3 in old age. Which of these contradictory messages is right? After an introduction to the principles of statistical testing, this lecture examines the probability with which statistical test results are correct. For this purpose the level of significance and the power of the test are compared with the sensitivity and specificity of a diagnostic procedure. The probability of obtaining correct statistical test results is the same as that for the positive and negative correctness of a diagnostic procedure and therefore depends on prevalence. The focus then shifts to the problem of multiple statistical testing. The lecture demonstrates that for each data set of reasonable size at least one test result proves to be significant - even if the data set is produced by a random number generator. It is extremely important that a hypothesis is generated independently from the data used for its testing. These considerations enable us to understand the gradation of "lame excuses, lies and statistics" and the difference between pure truth and the full truth. Finally, two historical oracles are cited.
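The multiple-testing point in the lecture above is easy to verify numerically. The sketch below (standard-library Python; all numbers are illustrative) exploits the fact that under a true null hypothesis a p-value is uniformly distributed on [0, 1], so a "study" that runs 20 independent tests at alpha = 0.05 finds at least one spurious significant result about 64% of the time:

```python
import random

random.seed(42)

ALPHA = 0.05
N_TESTS = 20        # tests performed on one data set of pure noise
N_STUDIES = 10_000  # simulated "studies"

# Under the null, each p-value is uniform, so the chance that a study
# sees at least one p < ALPHA is 1 - (1 - ALPHA)**N_TESTS.
studies_with_hit = sum(
    any(random.random() < ALPHA for _ in range(N_TESTS))
    for _ in range(N_STUDIES)
)

observed = studies_with_hit / N_STUDIES
expected = 1 - (1 - ALPHA) ** N_TESTS  # about 0.64
print(round(observed, 3), round(expected, 3))
```

The simulated fraction lands close to the analytic 64%, which is exactly why a hypothesis must be generated independently from the data used to test it.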
Preventing statistical errors in scientific journals.
Nuijten, M.B.
2016-01-01
There is evidence for a high prevalence of statistical reporting errors in psychology and other scientific fields. These errors display a systematic preference for statistically significant results, distorting the scientific literature. There are several possible causes for this systematic error
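One common class of reporting errors, a p-value inconsistent with its own test statistic, can be caught by simple recomputation. A minimal sketch for z statistics only, using the standard-library error function; the reported pairs below are hypothetical, not drawn from the cited study:

```python
import math

def two_sided_p_from_z(z: float) -> float:
    """Two-sided p-value of a z statistic under the standard normal."""
    return math.erfc(abs(z) / math.sqrt(2.0))

# Hypothetical reported results: (reported z, reported p).
reported = [
    (1.96, 0.05),  # consistent: recomputed p is ~.050
    (2.10, 0.01),  # inconsistent: recomputed p is ~.036
]

for z, p_reported in reported:
    p_check = two_sided_p_from_z(z)
    flag = "ok" if abs(p_check - p_reported) < 0.005 else "MISMATCH"
    print(f"z={z:.2f} reported p={p_reported:.3f} recomputed p={p_check:.3f} {flag}")
```

Tools in this spirit (recomputing p from the reported statistic and degrees of freedom) are what make the prevalence of such errors measurable at scale.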
Statistics in Schools. Educate your students about the value and everyday use of statistics. The Statistics in Schools program provides resources for teaching and learning with real-life data. Explore the site for standards-aligned, classroom-ready activities.
Transport Statistics - Transport - UNECE
Directory of Open Access Journals (Sweden)
Laura Badenes-Ribera
2018-06-01
Full Text Available Introduction: Publications arguing against the null hypothesis significance testing (NHST) procedure and in favor of good statistical practices have increased. The most frequently mentioned alternatives to NHST are effect size statistics (ES), confidence intervals (CIs), and meta-analyses. A recent survey conducted in Spain found that academic psychologists have poor knowledge about effect size statistics, confidence intervals, and graphic displays for meta-analyses, which might lead to a misinterpretation of the results. In addition, it also found that, although the use of ES is becoming generalized, the same thing is not true for CIs. Finally, academics with greater knowledge about ES statistics presented a profile closer to good statistical practice and research design. Our main purpose was to analyze the extension of these results to a different geographical area through a replication study. Methods: For this purpose, we elaborated an on-line survey that included the same items as the original research, and we asked academic psychologists to indicate their level of knowledge about ES, their CIs, and meta-analyses, and how they use them. The sample consisted of 159 Italian academic psychologists (54.09% women, mean age of 47.65 years). The mean number of years in the position of professor was 12.90 (SD = 10.21). Results: As in the original research, the results showed that, although the use of effect size estimates is becoming generalized, an under-reporting of CIs for ES persists. The most frequent ES statistics mentioned were Cohen's d and R2/η2, which can have outliers or show non-normality or violate statistical assumptions. In addition, academics showed poor knowledge about meta-analytic displays (e.g., forest plot and funnel plot) and quality checklists for studies. Finally, academics with higher-level knowledge about ES statistics seem to have a profile closer to good statistical practices. Conclusions: Changing statistical practice is not
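As a concrete illustration of the recommended practice of reporting an effect size with a confidence interval rather than a bare p-value, here is a small standard-library Python sketch. The data are made up, and a percentile bootstrap stands in for the analytic CI formulas usually assumed in this literature:

```python
import math
import random

def cohens_d(a, b):
    """Cohen's d for two independent samples, using the pooled SD."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    pooled_sd = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / pooled_sd

def bootstrap_ci(a, b, n_boot=2000, alpha=0.05, seed=1):
    """Percentile bootstrap confidence interval for Cohen's d."""
    rng = random.Random(seed)
    ds = sorted(
        cohens_d([rng.choice(a) for _ in a], [rng.choice(b) for _ in b])
        for _ in range(n_boot)
    )
    lo = ds[int(n_boot * alpha / 2)]
    hi = ds[int(n_boot * (1 - alpha / 2))]
    return lo, hi

# Invented example scores for two hypothetical groups.
treatment = [23.1, 25.4, 26.0, 22.8, 27.3, 24.9, 26.5, 25.1]
control = [21.0, 23.2, 24.1, 20.9, 25.0, 22.7, 24.3, 22.9]

d = cohens_d(treatment, control)
lo, hi = bootstrap_ci(treatment, control)
print(f"d = {d:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```

Reporting the interval alongside d conveys the estimate's precision, which is exactly the information a bare significance verdict omits.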
Tibély, Gergely; Pollner, Péter; Vicsek, Tamás; Palla, Gergely
2012-05-01
Due to the increasing popularity of collaborative tagging systems, the research on tagged networks, hypergraphs, ontologies, folksonomies and other related concepts is becoming an important interdisciplinary area with great potential and relevance for practical applications. In most collaborative tagging systems the tagging by the users is completely ‘flat’, while in some cases they are allowed to define a shallow hierarchy for their own tags. However, usually no overall hierarchical organization of the tags is given, and one of the interesting challenges of this area is to provide an algorithm generating the ontology of the tags from the available data. In contrast, there are also other types of tagged networks available for research, where the tags are already organized into a directed acyclic graph (DAG), encapsulating the ‘is a sub-category of’ type of hierarchy between each other. In this paper, we study how this DAG affects the statistical distribution of tags on the nodes marked by the tags in various real networks. The motivation for this research was the fact that understanding the tagging based on a known hierarchy can help in revealing the hidden hierarchy of tags in collaborative tagging systems. We analyse the relation between the tag-frequency and the position of the tag in the DAG in two large sub-networks of the English Wikipedia and a protein-protein interaction network. We also study the tag co-occurrence statistics by introducing a two-dimensional (2D) tag-distance distribution preserving both the difference in the levels and the absolute distance in the DAG for the co-occurring pairs of tags. Our most interesting finding is that the local relevance of tags in the DAG (i.e. their rank or significance as characterized by, e.g., the length of the branches starting from them) is much more important than their global distance from the root. Furthermore, we also introduce a simple tagging model based on random walks on the DAG, capable of
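The distinction drawn above between a tag's global distance from the root and its local relevance (e.g. the length of the branches starting from it) can be made concrete with a toy DAG. The category names below are invented, not taken from the Wikipedia or protein-interaction data sets studied in the paper:

```python
# A toy 'is-a-subcategory-of' hierarchy: tag -> list of parent tags.
# Here the hierarchy is a tree for simplicity; a general DAG allows
# multiple parents per tag.
parents = {
    "science": [],
    "physics": ["science"],
    "biology": ["science"],
    "optics": ["physics"],
    "lasers": ["optics"],
    "genetics": ["biology"],
}

# Invert to children lists for descending traversal.
children = {t: [] for t in parents}
for child, ps in parents.items():
    for p in ps:
        children[p].append(child)

def level(tag):
    """Distance from the root: the tag's *global* position in the DAG."""
    d = 0
    while parents[tag]:
        tag = parents[tag][0]  # any parent chain works in this toy tree
        d += 1
    return d

def branch_length(tag):
    """Longest path descending from the tag: a *local* relevance proxy."""
    if not children[tag]:
        return 0
    return 1 + max(branch_length(c) for c in children[tag])

for tag in parents:
    print(tag, level(tag), branch_length(tag))
```

The paper's finding is that statistics like `branch_length` predict tag usage better than `level` does; this sketch only shows how the two quantities differ, not the empirical analysis itself.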
Generalized quantum statistics
International Nuclear Information System (INIS)
Chou, C.
1992-01-01
In the paper, a non-anyonic generalization of quantum statistics is presented, in which Fermi-Dirac statistics (FDS) and Bose-Einstein statistics (BES) appear as two special cases. The new quantum statistics, which is characterized by the dimension of its single particle Fock space, contains three consistent parts, namely the generalized bilinear quantization, the generalized quantum mechanical description and the corresponding statistical mechanics
Statistical analysis and data management
International Nuclear Information System (INIS)
Anon.
1981-01-01
This report provides an overview of the history of the WIPP Biology Program. The recommendations of the American Institute of Biological Sciences (AIBS) for the WIPP biology program are summarized. The data sets available for statistical analyses and the problems associated with these data sets are also summarized. Biological studies base maps are presented. A statistical model is presented to evaluate any correlation between climatological data and small mammal captures. No statistically significant relationship was found between the variance in small mammal captures on Dr. Gennaro's 90m x 90m grid and precipitation records from the Duval Potash Mine
Detecting Novelty and Significance
Ferrari, Vera; Bradley, Margaret M.; Codispoti, Maurizio; Lang, Peter J.
2013-01-01
Studies of cognition often use an “oddball” paradigm to study effects of stimulus novelty and significance on information processing. However, an oddball tends to be perceptually more novel than the standard, repeated stimulus as well as more relevant to the ongoing task, making it difficult to disentangle effects due to perceptual novelty and stimulus significance. In the current study, effects of perceptual novelty and significance on ERPs were assessed in a passive viewing context by presenting repeated and novel pictures (natural scenes) that either signaled significant information regarding the current context or not. A fronto-central N2 component was primarily affected by perceptual novelty, whereas a centro-parietal P3 component was modulated by both stimulus significance and novelty. The data support an interpretation that the N2 reflects perceptual fluency and is attenuated when a current stimulus matches an active memory representation and that the amplitude of the P3 reflects stimulus meaning and significance. PMID:19400680
DEFF Research Database (Denmark)
Mørck, Line Lerche; Khawaja, Iram
2009-01-01
This article focuses on the complex and multi-layered process of researcher positioning, specifically in relation to the politically sensitive study of marginalised and 'othered' groups such as Muslims living in Denmark. We discuss the impact of different ethnic, religious and racial… political and personal involvement by the researcher, which challenges traditional perspectives on research and researcher positioning. A key point in this regard is the importance of constant awareness of and reflection on the multiple ways in which one's positioning as a researcher influences the research…
Statistical measures of galaxy clustering
International Nuclear Information System (INIS)
Porter, D.H.
1988-01-01
Consideration is given to the large-scale distribution of galaxies and ways in which this distribution may be statistically measured. Galaxy clustering is hierarchical in nature, so that the positions of clusters of galaxies are themselves spatially clustered. A simple identification of groups of galaxies would be an inadequate description of the true richness of galaxy clustering. Current observations of the large-scale structure of the universe and modern theories of cosmology may be studied with a statistical description of the spatial and velocity distributions of galaxies. 8 refs
Significant NRC Enforcement Actions
Nuclear Regulatory Commission — This dataset provides a list of Nuclear Regulatory Commission (NRC) issued significant enforcement actions. These actions, referred to as "escalated", are issued by...
National Statistical Commission and Indian Official Statistics
Indian Academy of Sciences (India)
Author Affiliations. T J Rao1. C. R. Rao Advanced Institute of Mathematics, Statistics and Computer Science (AIMSCS) University of Hyderabad Campus Central University Post Office, Prof. C. R. Rao Road Hyderabad 500 046, AP, India.
International Nuclear Information System (INIS)
Eisenberg, R.L.; Dennis, C.A.; May, C.
1989-01-01
This book concentrates on the routine radiographic examinations commonly performed. It details the wide variety of examinations possible and their place in initial learning and in the radiology department as references for those occasions when an unusual examination is requested. This book provides information ranging from basic terminology to skeletal positioning to special procedures. Positions are discussed and supplemented with a picture of a patient, the resulting radiograph, and a labeled diagram. Immobilization and proper shielding of the patient are also shown
International Nuclear Information System (INIS)
Goursky, Vsevolod
1975-01-01
A circuitry for deriving the quotient of signals delivered by position-sensitive detectors is described. Digital output is obtained in the form of 10- to 12-bit words. Impact position may be determined with 0.25% accuracy when the dynamic range of the energy signal is less than 1:10, and 0.5% accuracy when the dynamic range is 1:20. The division requires an average time of 5μs for 10-bit words
International Nuclear Information System (INIS)
Goursky, V.
1975-05-01
This paper describes circuitry for deriving the quotient of signals delivered by position-sensitive detectors. Digital output is obtained in the form of 10 to 12 bit words. Impact position may be determined with 0.25% accuracy when the dynamic range of the energy signal is less than 1:10, and 0.5% accuracy when the dynamic range is 1:20. The division requires an average time of 5μs for 10-bit words [fr
Rumsey, Deborah
2011-01-01
The fun and easy way to get down to business with statistics. Stymied by statistics? No fear: this friendly guide offers clear, practical explanations of statistical ideas, techniques, formulas, and calculations, with lots of examples that show you how these concepts apply to your everyday life. Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tracks to a typical first semester statistics cou
Industrial statistics with Minitab
Cintas, Pere Grima; Llabres, Xavier Tort-Martorell
2012-01-01
Industrial Statistics with MINITAB demonstrates the use of MINITAB as a tool for performing statistical analysis in an industrial context. This book covers introductory industrial statistics, exploring the most commonly used techniques alongside those that serve to give an overview of more complex issues. A plethora of examples in MINITAB are featured along with case studies for each of the statistical techniques presented. Industrial Statistics with MINITAB provides comprehensive coverage of user-friendly practical guidance to the essential statistical methods applied in industry. Explores
International Nuclear Information System (INIS)
Dienes, J.K.
1993-01-01
Although it is possible to simulate the ground blast from a single explosive shot with a simple computer algorithm and appropriate constants, the most commonly used modelling methods do not account for major changes in geology or shot energy because mechanical features such as tectonic stresses, fault structure, microcracking, brittle-ductile transition, and water content are not represented in significant detail. An alternative approach for modelling called Statistical Crack Mechanics is presented in this paper. This method, developed in the seventies as a part of the oil shale program, accounts for crack opening, shear, growth, and coalescence. Numerous photographs and micrographs show that shocked materials tend to involve arrays of planar cracks. The approach described here provides a way to account for microstructure and give a representation of the physical behavior of a material at the microscopic level that can account for phenomena such as permeability, fragmentation, shear banding, and hot-spot formation in explosives
Graphene Statistical Mechanics
Bowick, Mark; Kosmrlj, Andrej; Nelson, David; Sknepnek, Rastko
2015-03-01
Graphene provides an ideal system to test the statistical mechanics of thermally fluctuating elastic membranes. The high Young's modulus of graphene means that thermal fluctuations over even small length scales significantly stiffen the renormalized bending rigidity. We study the effect of thermal fluctuations on graphene ribbons of width W and length L, pinned at one end, via coarse-grained Molecular Dynamics simulations and compare with analytic predictions of the scaling of width-averaged root-mean-squared height fluctuations as a function of distance along the ribbon. Scaling collapse as a function of W and L also allows us to extract the scaling exponent eta governing the long-wavelength stiffening of the bending rigidity. A full understanding of the geometry-dependent mechanical properties of graphene, including arrays of cuts, may allow the design of a variety of modular elements with desired mechanical properties starting from pure graphene alone. Supported by NSF grant DMR-1435794
[Big data in official statistics].
Zwick, Markus
2015-08-01
The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Until big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: the quality of data, data protection, privacy, and the sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have concluded a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany.
Recreational Boating Statistics 2012
Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...
Recreational Boating Statistics 2013
Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...
Statistical data analysis handbook
National Research Council Canada - National Science Library
Wall, Francis J
1986-01-01
It must be emphasized that this is not a textbook on statistics. Instead it is a working tool that presents data analysis in clear, concise terms which can be readily understood even by those without formal training in statistics...
U.S. Department of Health & Human Services — The CMS Office of Enterprise Data and Analytics has developed CMS Program Statistics, which includes detailed summary statistics on national health care, Medicare...
Recreational Boating Statistics 2011
Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...
... the most commonly diagnosed gynecologic cancer. U.S. Cancer Statistics Data Visualizations Tool. The Data Visualizations tool makes ...
Tuberculosis Data and Statistics
... Set) Mortality and Morbidity Weekly Reports. Data and Statistics: Decrease in Reported Tuberculosis Cases. MMWR 2010; 59 ( ...
National transportation statistics 2011
2011-04-01
Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics : (BTS), National Transportation Statistics presents information on the U.S. transportation system, including : its physical components, safety reco...
National Transportation Statistics 2008
2009-01-08
Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record...
Research shows that mental illnesses are common in ... of mental illnesses, such as suicide and disability. Statistics Topics: Mental Illness, Any Anxiety Disorder ...
School Violence: Data & Statistics
The first ... Data & Statistics (WISQARS) ...
Caregiver Statistics: Demographics
Selected Long-Term Care Statistics. What is ... needs and services are wide-ranging and complex, statistics may vary from study to study. Sources for ...
... Coverdell Program 2012-2015 State Summaries. Data & Statistics; Fact Sheets; Heart Disease and Stroke Fact Sheets ... Roadmap for State Planning; Other Data Resources; Other Statistics Resources; Grantee Information; Cross-Program Information; Online Tools ...
... Standard Drink? Drinking Levels Defined Alcohol Facts and Statistics Print version Alcohol Use in the United States: ... 1238–1245, 2004. PMID: 15010446 National Center for Statistics and Analysis. 2014 Crash Data Key Findings (Traffic ...
National Transportation Statistics 2009
2010-01-21
Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record, ...
National transportation statistics 2010
2010-01-01
National Transportation Statistics presents statistics on the U.S. transportation system, including its physical components, safety record, economic performance, the human and natural environment, and national security. This is a large online documen...
DEFF Research Database (Denmark)
Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard
Statistics for Finance develops students’ professional skills in statistics with applications in finance. Developed from the authors’ courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics...
Principles of applied statistics
National Research Council Canada - National Science Library
Cox, D. R; Donnelly, Christl A
2011-01-01
.... David Cox and Christl Donnelly distil decades of scientific experience into usable principles for the successful application of statistics, showing how good statistical strategy shapes every stage of an investigation...
Applying contemporary statistical techniques
Wilcox, Rand R
2003-01-01
Applying Contemporary Statistical Techniques explains why traditional statistical methods are often inadequate or outdated when applied to modern problems. Wilcox demonstrates how new and more powerful techniques address these problems far more effectively, making these modern robust methods understandable, practical, and easily accessible. * Assumes no previous training in statistics * Explains how and why modern statistical methods provide more accurate results than conventional methods * Covers the latest developments on multiple comparisons * Includes recent advanc
Hefetz, Dan; Stojaković, Miloš; Szabó, Tibor
2014-01-01
This text serves as a thorough introduction to the rapidly developing field of positional games. This area constitutes an important branch of combinatorics, whose aim it is to systematically develop an extensive mathematical basis for a variety of two-player perfect information games. These range from such popular games as Tic-Tac-Toe and Hex to purely abstract games played on graphs and hypergraphs. The subject of positional games is strongly related to several other branches of combinatorics such as Ramsey theory, extremal graph and set theory, and the probabilistic method. These notes cover a variety of topics in positional games, including both classical results and recent important developments. They are presented in an accessible way and are accompanied by exercises of varying difficulty, helping the reader to better understand the theory. The text will benefit both researchers and graduate students in combinatorics and adjacent fields.
Interactive statistics with ILLMO
Martens, J.B.O.S.
2014-01-01
Progress in empirical research relies on adequate statistical analysis and reporting. This article proposes an alternative approach to statistical modeling that is based on an old but mostly forgotten idea, namely Thurstone modeling. Traditional statistical methods assume that either the measured
Lenard, Christopher; McCarthy, Sally; Mills, Terence
2014-01-01
There are many different aspects of statistics. Statistics involves mathematics, computing, and applications to almost every field of endeavour. Each aspect provides an opportunity to spark someone's interest in the subject. In this paper we discuss some ethical aspects of statistics, and describe how an introduction to ethics has been…
Youth Sports Safety Statistics
On two methods of statistical image analysis
Missimer, J; Knorr, U; Maguire, RP; Herzog, H; Seitz, RJ; Tellman, L; Leenders, KL
1999-01-01
The computerized brain atlas (CBA) and statistical parametric mapping (SPM) are two procedures for voxel-based statistical evaluation of PET activation studies. Each includes spatial standardization of image volumes, computation of a statistic, and evaluation of its significance. In addition,
Significant Association of Streptococcus bovis with Malignant Gastrointestinal Diseases
Directory of Open Access Journals (Sweden)
Salah Shanan
2011-01-01
Full Text Available Streptococcus bovis is a Gram-positive bacterium causing serious human infections, including endocarditis and bacteremia, and is usually associated with underlying disease. The aims of the current study were to compare the prevalence of the bacterium associated with malignant and nonmalignant gastrointestinal diseases and to determine the susceptibility of the isolated strains to different antimicrobial agents. The results showed that the difference in the prevalence of S. bovis in stool specimens between patients with malignant and those with nonmalignant gastrointestinal diseases was statistically significant. This result may support the idea that there is a correlation between S. bovis and malignant gastrointestinal diseases.
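The kind of prevalence comparison described in this abstract is commonly tested with Pearson's chi-square test for a 2x2 table. A minimal sketch follows; the patient counts are invented for illustration and are not the study's data.

```python
import math

def chi_square_2x2(a, b, c, d):
    """Pearson chi-square test for a 2x2 contingency table
    [[a, b], [c, d]]; returns (statistic, two-sided p-value), df = 1."""
    n = a + b + c + d
    denom = (a + b) * (c + d) * (a + c) * (b + d)
    if denom == 0:
        # Empty row or column margins make the statistic undefined.
        raise ValueError("every row and column total must be positive")
    x2 = n * (a * d - b * c) ** 2 / denom
    # Survival function of chi-square with 1 df: P(X > x) = erfc(sqrt(x/2))
    p = math.erfc(math.sqrt(x2 / 2))
    return x2, p

# Hypothetical counts: S. bovis detected in 32 of 60 patients with
# malignant disease versus 8 of 60 with nonmalignant disease.
x2, p = chi_square_2x2(32, 28, 8, 52)
print(f"chi2 = {x2:.2f}, p = {p:.2e}")
```

With these invented counts the statistic far exceeds the 3.84 critical value for one degree of freedom, so the difference in prevalence would be declared significant at the 5% level.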
Statistical phenomena - experiments results. II
International Nuclear Information System (INIS)
Schnell, W.
1977-01-01
The stochastic cooling of proton and antiproton beams is discussed. Stochastic cooling is the gradual reduction of emittance of a coasting beam by a feedback system, sensing and correcting the statistical fluctuations of the beam's position or momentum. The correction at every turn can be partial or complete. Transverse and longitudinal emittance of the beam are considered and the systems designed to cool the beams are described. (B.D.)
The Significance of Normativity
DEFF Research Database (Denmark)
Presskorn-Thygesen, Thomas
and Weber. It does so by situating Durkheim and Weber in the context of Neo-Kantian philosophy, which prevailed among their contemporaries, and the chapter thereby reveals a series of under-thematised similarities not only with regard to their methodological positions, but also in their conception of social...... of social theory. In pursuing this overall research agenda, the dissertation contributes to a number of specific research literatures. Following two introductory and methodological chapters, Chapter 3 thus critically examines the analysis of normativity suggested in the recent attempts at transforming...... the methods of neo-classical economics into a broader form of social theory. The chapter thereby contributes to the critical discourses, particularly in philosophy of science, that challenge the validity of neo-classical economics and its underlying conception of practical rationality. In examining...
DEFF Research Database (Denmark)
Khawaja, Iram; Mørck, Line Lerche
2009-01-01
involvement by the researcher, which challenges traditional perspectives on research and researcher positioning. A key point in this regard is the importance of constant awareness of and reflection on the multiple ways in which one's positioning as a researcher influences the research process. Studying the other...
International Nuclear Information System (INIS)
Hayakawa, Toshifumi.
1985-01-01
Purpose: To enable stable, highly accurate digital detection of the position of a moving object in a control rod position detector, free from the undesired effects of circumstantial conditions such as the reactor temperature. Constitution: Coils connected in parallel with each other are disposed along the passage of a moving object, and a variable resistor and a relay are connected in series with each of the coils. A light-emitting diode is connected in series with the contacts of each relay. The resistance values of the variable resistors are adjusted depending on the changes in the circumstantial conditions and temperature distribution upon carrying out the positional detection. When the object is inserted into a coil, the relevant relay is deenergized, by which the relay contacts are closed to light up the diode. In the same manner, as the object is successively inserted into the coils, the diodes are lit up successively, thereby enabling highly accurate and stable positional detection in a digital manner, free from the undesired effects of the circumstantial conditions. (Horiuchi, T.)
Shot Group Statistics for Small Arms Applications
2017-06-01
If its probability distribution is known with sufficient accuracy, then it can be used to make a sound statistical inference on the unknown population standard deviations of the x and y impact-point positions. The dispersion measures treated in this report...
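Inference on an unknown population standard deviation from a normal sample is typically built on the fact that (n-1)s²/σ² follows a chi-square distribution with n-1 degrees of freedom. A minimal sketch, assuming normally distributed impact coordinates; the ten shot coordinates are invented and the chi-square quantiles (df = 9) are taken from standard tables.

```python
import statistics

def sd_confidence_interval(sample, chi2_lo, chi2_hi):
    """95% CI for a population standard deviation of a normal sample,
    using (n-1) s^2 / sigma^2 ~ chi-square with n-1 degrees of freedom.
    chi2_lo / chi2_hi are the 0.025 and 0.975 quantiles for df = n-1."""
    n = len(sample)
    s2 = statistics.variance(sample)  # sample variance, (n-1) denominator
    lower = ((n - 1) * s2 / chi2_hi) ** 0.5  # large quantile -> lower bound
    upper = ((n - 1) * s2 / chi2_lo) ** 0.5  # small quantile -> upper bound
    return lower, upper

# Hypothetical x-coordinates (cm) of 10 impact points about the aim point.
x = [1.2, -0.4, 0.8, -1.5, 0.3, 2.1, -0.9, 0.6, -0.2, 1.0]
# Chi-square quantiles for df = 9: q(0.025) = 2.700, q(0.975) = 19.023
lo, hi = sd_confidence_interval(x, 2.700, 19.023)
print(f"s = {statistics.stdev(x):.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")
```

The same computation applied independently to the y-coordinates gives the companion interval for the vertical dispersion.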
Dowdy, Shirley; Chilko, Daniel
2011-01-01
Praise for the Second Edition "Statistics for Research has other fine qualities besides superior organization. The examples and the statistical methods are laid out with unusual clarity by the simple device of using special formats for each. The book was written with great care and is extremely user-friendly."-The UMAP Journal Although the goals and procedures of statistical research have changed little since the Second Edition of Statistics for Research was published, the almost universal availability of personal computers and statistical computing application packages have made it possible f
Boslaugh, Sarah
2013-01-01
Need to learn statistics for your job? Want help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference for anyone new to the subject. Thoroughly revised and expanded, this edition helps you gain a solid understanding of statistics without the numbing complexity of many college texts. Each chapter presents easy-to-follow descriptions, along with graphics, formulas, solved examples, and hands-on exercises. If you want to perform common statistical analyses and learn a wide range of techniques without getting in over your head, this is your book.
Statistics & probability for dummies
Rumsey, Deborah J
2013-01-01
Two complete eBooks for one low price! Created and compiled by the publisher, this Statistics I and Statistics II bundle brings together two math titles in one, e-only bundle. With this special bundle, you'll get the complete text of the following two titles: Statistics For Dummies, 2nd Edition Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tra
Nonparametric statistical inference
Gibbons, Jean Dickinson
2010-01-01
Overall, this remains a very fine book suitable for a graduate-level course in nonparametric statistics. I recommend it for all people interested in learning the basic ideas of nonparametric statistical inference.-Eugenia Stoimenova, Journal of Applied Statistics, June 2012… one of the best books available for a graduate (or advanced undergraduate) text for a theory course on nonparametric statistics. … a very well-written and organized book on nonparametric statistics, especially useful and recommended for teachers and graduate students.-Biometrics, 67, September 2011This excellently presente
Business statistics for dummies
Anderson, Alan
2013-01-01
Score higher in your business statistics course? Easy. Business statistics is a common course for business majors and MBA candidates. It examines common data sets and the proper way to use such information when conducting research and producing informational reports such as profit and loss statements, customer satisfaction surveys, and peer comparisons. Business Statistics For Dummies tracks to a typical business statistics course offered at the undergraduate and graduate levels and provides clear, practical explanations of business statistical ideas, techniques, formulas, and calculations, w
Griffiths, Dawn
2009-01-01
Wouldn't it be great if there were a statistics book that made histograms, probability distributions, and chi square analysis more enjoyable than going to the dentist? Head First Statistics brings this typically dry subject to life, teaching you everything you want and need to know about statistics through engaging, interactive, and thought-provoking material, full of puzzles, stories, quizzes, visual aids, and real-world examples. Whether you're a student, a professional, or just curious about statistical analysis, Head First's brain-friendly formula helps you get a firm grasp of statistics
Kim, E.; Newton, A. P.
2012-04-01
One major problem in dynamo theory is the multi-scale nature of the MHD turbulence, which requires statistical theory in terms of probability distribution functions. In this contribution, we present the statistical theory of magnetic fields in a simplified mean field α-Ω dynamo model by varying the statistical property of alpha, including marginal stability and intermittency, and then utilize observational data of solar activity to fine-tune the mean field dynamo model. Specifically, we first present a comprehensive investigation into the effect of the stochastic parameters in a simplified α-Ω dynamo model. Through considering the manifold of marginal stability (the region of parameter space where the mean growth rate is zero), we show that stochastic fluctuations are conducive to dynamo action. Furthermore, by considering the cases of fluctuating alpha that are periodic and Gaussian coloured random noise with identical characteristic time-scales and fluctuating amplitudes, we show that the transition to dynamo is significantly facilitated for stochastic alpha with random noise. Furthermore, we show that probability density functions (PDFs) of the growth rate, magnetic field and magnetic energy can provide a wealth of useful information regarding the dynamo behaviour/intermittency. Finally, the precise statistical properties of the dynamo, such as temporal correlation and fluctuating amplitude, are found to be dependent on the distribution of the fluctuations of the stochastic parameters. We then use observations of solar activity to constrain parameters relating to the α effect in stochastic α-Ω nonlinear dynamo models. This is achieved through performing a comprehensive statistical comparison by computing PDFs of solar activity from observations and from our simulation of the mean field dynamo model. The observational data that are used are the time history of solar activity inferred from C14 data over the past 11000 years on a long time scale and direct observations of the sun spot
Lectures on algebraic statistics
Drton, Mathias; Sullivant, Seth
2009-01-01
How does an algebraic geometer studying secant varieties further the understanding of hypothesis tests in statistics? Why would a statistician working on factor analysis raise open problems about determinantal varieties? Connections of this type are at the heart of the new field of "algebraic statistics". In this field, mathematicians and statisticians come together to solve statistical inference problems using concepts from algebraic geometry as well as related computational and combinatorial techniques. The goal of these lectures is to introduce newcomers from the different camps to algebraic statistics. The introduction will be centered around the following three observations: many important statistical models correspond to algebraic or semi-algebraic sets of parameters; the geometry of these parameter spaces determines the behaviour of widely used statistical inference procedures; computational algebraic geometry can be used to study parameter spaces and other features of statistical models.
Naghshpour, Shahdad
2012-01-01
Statistics is the branch of mathematics that deals with real-life problems. As such, it is an essential tool for economists. Unfortunately, the way you and many other economists learn the concept of statistics is not compatible with the way economists think and learn. The problem is worsened by the use of mathematical jargon and complex derivations. Here's a book that proves none of this is necessary. All the examples and exercises in this book are constructed within the field of economics, thus eliminating the difficulty of learning statistics with examples from fields that have no relation to business, politics, or policy. Statistics is, in fact, not more difficult than economics. Anyone who can comprehend economics can understand and use statistics successfully within this field, including you! This book utilizes Microsoft Excel to obtain statistical results, as well as to perform additional necessary computations. Microsoft Excel is not the software of choice for performing sophisticated statistical analy...
Baseline Statistics of Linked Statistical Data
Scharnhorst, Andrea; Meroño-Peñuela, Albert; Guéret, Christophe
2014-01-01
We are surrounded by an ever increasing ocean of information, everybody will agree to that. We build sophisticated strategies to govern this information: designing data models, developing infrastructures for data sharing, building tools for data analysis. Statistical datasets curated by National
Chiou, Chei-Chang; Wang, Yu-Min; Lee, Li-Tze
2014-08-01
Statistical knowledge is widely used in academia; however, statistics teachers struggle with the issue of how to reduce students' statistics anxiety and enhance students' statistics learning. This study assesses the effectiveness of a "one-minute paper strategy" in reducing students' statistics-related anxiety and in improving students' statistics-related achievement. Participants were 77 undergraduates from two classes enrolled in applied statistics courses. An experiment was implemented according to a pretest/posttest comparison group design. The quasi-experimental design showed that the one-minute paper strategy significantly reduced students' statistics anxiety and improved students' statistics learning achievement. The strategy was a better instructional tool than the textbook exercise for reducing students' statistics anxiety and improving students' statistics achievement.
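A pretest/posttest comparison-group design like the one in this abstract is often analysed with a two-sample t test. The sketch below uses Welch's t statistic, which does not assume equal variances; the anxiety scores and group sizes are invented for illustration.

```python
import statistics

def welch_t(a, b):
    """Welch's t statistic and Welch-Satterthwaite degrees of freedom
    for two independent samples with possibly unequal variances."""
    na, nb = len(a), len(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    se2 = va / na + vb / nb
    t = (statistics.mean(a) - statistics.mean(b)) / se2 ** 0.5
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Invented post-course anxiety scores (lower = less anxious).
one_minute_paper = [31, 28, 35, 27, 30, 26, 33, 29]
textbook_exercise = [38, 41, 34, 40, 36, 39, 37, 42]
t, df = welch_t(one_minute_paper, textbook_exercise)
print(f"t = {t:.2f}, df = {df:.1f}")
```

A |t| well above the roughly 2.1-2.2 critical value for these degrees of freedom would, as in the study's conclusion, indicate significantly lower anxiety in the one-minute-paper group.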
Dunbar, P. K.; Furtney, M.; McLean, S. J.; Sweeney, A. D.
2014-12-01
Tsunamis have inflicted death and destruction on the coastlines of the world throughout history. The occurrence of tsunamis and the resulting effects have been collected and studied as far back as the second millennium B.C. The knowledge gained from cataloging and examining these events has led to significant changes in our understanding of tsunamis, tsunami sources, and methods to mitigate the effects of tsunamis. The most significant, not surprisingly, are often the most devastating, such as the 2011 Tohoku, Japan earthquake and tsunami. The goal of this poster is to give a brief overview of the occurrence of tsunamis and then focus specifically on several significant tsunamis. There are various criteria to determine the most significant tsunamis: the number of deaths, the amount of damage, the maximum runup height, a major impact on tsunami science or policy, etc. As a result, descriptions will include some of the most costly (2011 Tohoku, Japan), the most deadly (2004 Sumatra, 1883 Krakatau), and the highest runup ever observed (1958 Lituya Bay, Alaska). The discovery of the Cascadia subduction zone as the source of the 1700 Japanese "Orphan" tsunami, and a future tsunami threat to the U.S. northwest coast, contributed to the decision to form the U.S. National Tsunami Hazard Mitigation Program. The great Lisbon earthquake of 1755 marked the beginning of the modern era of seismology. Knowledge gained from the 1964 Alaska earthquake and tsunami helped confirm the theory of plate tectonics. The 1946 Alaska, 1952 Kuril Islands, 1960 Chile, 1964 Alaska, and 2004 Banda Aceh tsunamis all resulted in warning centers or systems being established. The data descriptions on this poster were extracted from NOAA's National Geophysical Data Center (NGDC) global historical tsunami database. Additional information about these tsunamis, as well as water level data, can be found by accessing the NGDC website www.ngdc.noaa.gov/hazard/
Gheorghe Gh. IONESCU; Adina Letitia NEGRUSA
2004-01-01
Maintaining positive work-force relationships involves effective labor-management relations and making appropriate responses to current employee issues. Among the major current employee issues are protection from arbitrary dismissal, drug and alcohol abuse, privacy rights, and family matters and their impact on work. In our paper we discuss two problems: first, the meanings of industrial democracy; second, the three principal operational concepts of industrial democracy (1) industrial democracy t...
Conversion factors and oil statistics
International Nuclear Information System (INIS)
Karbuz, Sohbet
2004-01-01
World oil statistics, in scope and accuracy, are often far from perfect. They can easily lead to misguided conclusions regarding the state of market fundamentals. Without proper attention directed at statistical caveats, the ensuing interpretation of oil market data opens the door to unnecessary volatility, and can distort perception of market fundamentals. Among the numerous caveats associated with the compilation of oil statistics, conversion factors, used to produce aggregated data, play a significant role. Interestingly enough, little attention is paid to conversion factors, i.e. to the relation between different units of measurement for oil. Additionally, the underlying information regarding the choice of a specific factor when trying to produce measurements of aggregated data remains scant. The aim of this paper is to shed some light on the impact of conversion factors for two commonly encountered issues, mass to volume equivalencies (barrels to tonnes) and for broad energy measures encountered in world oil statistics. This paper will seek to demonstrate how inappropriate and misused conversion factors can yield wildly varying results and ultimately distort oil statistics. Examples will show that while discrepancies in commonly used conversion factors may seem trivial, their impact on the assessment of a world oil balance is far from negligible. A unified and harmonised convention for conversion factors is necessary to achieve accurate comparisons and aggregate oil statistics for the benefit of both end-users and policy makers
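The barrels-to-tonnes issue the abstract raises is easy to make concrete: the factor depends on crude density, so applying one generic factor to an aggregate of different crudes shifts the total. The sketch below is illustrative only; the stream names, volumes, and per-stream barrels-per-tonne factors are invented round numbers, not official figures.

```python
# Barrels produced and an (invented) density-specific barrels-per-tonne
# factor for three hypothetical crude streams.
streams = {
    "light crude":  (1_000_000, 7.60),  # (barrels, bbl per tonne)
    "medium crude": (1_000_000, 7.33),
    "heavy crude":  (1_000_000, 6.80),
}

# Aggregate tonnage using each stream's own factor ...
tonnes_specific = sum(bbl / factor for bbl, factor in streams.values())
# ... versus applying one generic factor (7.33 bbl/tonne) to the total.
total_bbl = sum(bbl for bbl, _ in streams.values())
tonnes_generic = total_bbl / 7.33

print(f"stream-specific factors: {tonnes_specific:,.0f} t")
print(f"single generic factor:   {tonnes_generic:,.0f} t")
print(f"discrepancy:             {tonnes_specific - tonnes_generic:,.0f} t")
```

Even with identical volumes per stream, the two conventions disagree by several thousand tonnes, which is the paper's point about seemingly trivial factor choices distorting a world oil balance.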
Statistical Physics An Introduction
Yoshioka, Daijiro
2007-01-01
This book provides a comprehensive presentation of the basics of statistical physics. The first part explains the essence of statistical physics and how it provides a bridge between microscopic and macroscopic phenomena, allowing one to derive quantities such as entropy. Here the author avoids going into details such as Liouville's theorem or the ergodic theorem, which are difficult for beginners and unnecessary for the actual application of statistical mechanics. In the second part, statistical mechanics is applied to various systems which, although they look different, share the same mathematical structure. In this way readers can deepen their understanding of statistical physics. The book also features applications to quantum dynamics, thermodynamics, the Ising model and the statistical dynamics of free spins.
Statistical symmetries in physics
International Nuclear Information System (INIS)
Green, H.S.; Adelaide Univ., SA
1994-01-01
Every law of physics is invariant under some group of transformations and is therefore the expression of some type of symmetry. Symmetries are classified as geometrical, dynamical or statistical. At the most fundamental level, statistical symmetries are expressed in the field theories of the elementary particles. This paper traces some of the developments from the discovery of Bose statistics, one of the two fundamental symmetries of physics. A series of generalizations of Bose statistics is described. A supersymmetric generalization accommodates fermions as well as bosons, and further generalizations, including parastatistics, modular statistics and graded statistics, accommodate particles with properties such as 'colour'. A factorization of elements of gl(n_b, n_f) can be used to define truncated boson operators. A general construction is given for q-deformed boson operators, and explicit constructions of the same type are given for various 'deformed' algebras. A summary is given of some of the applications and potential applications. 39 refs., 2 figs
The statistical stability phenomenon
Gorban, Igor I
2017-01-01
This monograph investigates violations of statistical stability of physical events, variables, and processes and develops a new physical-mathematical theory taking into consideration such violations – the theory of hyper-random phenomena. There are five parts. The first describes the phenomenon of statistical stability and its features, and develops methods for detecting violations of statistical stability, in particular when data is limited. The second part presents several examples of real processes of different physical nature and demonstrates the violation of statistical stability over broad observation intervals. The third part outlines the mathematical foundations of the theory of hyper-random phenomena, while the fourth develops the foundations of the mathematical analysis of divergent and many-valued functions. The fifth part contains theoretical and experimental studies of statistical laws where there is violation of statistical stability. The monograph should be of particular interest to engineers...
Equilibrium statistical mechanics
Jackson, E Atlee
2000-01-01
Ideal as an elementary introduction to equilibrium statistical mechanics, this volume covers both classical and quantum methodology for open and closed systems. Introductory chapters familiarize readers with probability and microscopic models of systems, while additional chapters describe the general derivation of the fundamental statistical mechanics relationships. The final chapter contains 16 sections, each dealing with a different application, ordered according to complexity, from classical through degenerate quantum statistical mechanics. Key features include an elementary introduction t
Applied statistics for economists
Lewis, Margaret
2012-01-01
This book is an undergraduate text that introduces students to commonly-used statistical methods in economics. Using examples based on contemporary economic issues and readily-available data, it not only explains the mechanics of the various methods, it also guides students to connect statistical results to detailed economic interpretations. Because the goal is for students to be able to apply the statistical methods presented, online sources for economic data and directions for performing each task in Excel are also included.
Mineral industry statistics 1975
Energy Technology Data Exchange (ETDEWEB)
1978-01-01
Production, consumption and marketing statistics are given for solid fuels (coal, peat), liquid fuels and gases (oil, natural gas), iron ore, bauxite and other minerals quarried in France, in 1975. Also accident statistics are included. Production statistics are presented of the Overseas Departments and territories (French Guiana, New Caledonia, New Hebrides). An account of modifications in the mining field in 1975 is given. Concessions, exploitation permits, and permits solely for prospecting for mineral products are discussed. (In French)
Lectures on statistical mechanics
Bowler, M G
1982-01-01
Anyone dissatisfied with the almost ritual dullness of many 'standard' texts in statistical mechanics will be grateful for the lucid explanation and generally reassuring tone. Aimed at securing firm foundations for equilibrium statistical mechanics, topics of great subtlety are presented transparently and enthusiastically. Very little mathematical preparation is required beyond elementary calculus and prerequisites in physics are limited to some elementary classical thermodynamics. Suitable as a basis for a first course in statistical mechanics, the book is an ideal supplement to more convent
Directory of Open Access Journals (Sweden)
Mirjam Nielen
2017-01-01
Full Text Available Always wondered why research papers often present rather complicated statistical analyses? Or wondered how to properly analyse the results of a pragmatic trial from your own practice? This talk will give an overview of basic statistical principles and focus on the why of statistics, rather than on the how. This is a podcast of Mirjam's talk at the Veterinary Evidence Today conference, Edinburgh, November 2, 2016.
Equilibrium statistical mechanics
Mayer, J E
1968-01-01
The International Encyclopedia of Physical Chemistry and Chemical Physics, Volume 1: Equilibrium Statistical Mechanics covers the fundamental principles and the development of theoretical aspects of equilibrium statistical mechanics. Statistical mechanics is the study of the connection between the macroscopic behavior of bulk matter and the microscopic properties of its constituent atoms and molecules. This book contains eight chapters, and begins with a presentation of the master equation used for the calculation of the fundamental thermodynamic functions. The succeeding chapters highlight t
Mahalanobis, P C
1965-01-01
Contributions to Statistics focuses on the processes, methodologies, and approaches involved in statistics. The book is presented to Professor P. C. Mahalanobis on the occasion of his 70th birthday. The selection first offers information on the recovery of ancillary information and combinatorial properties of partially balanced designs and association schemes. Discussions focus on combinatorial applications of the algebra of association matrices, sample size analogy, association matrices and the algebra of association schemes, and conceptual statistical experiments. The book then examines latt
Safety significance evaluation system
International Nuclear Information System (INIS)
Lew, B.S.; Yee, D.; Brewer, W.K.; Quattro, P.J.; Kirby, K.D.
1991-01-01
This paper reports that the Pacific Gas and Electric Company (PG and E), in cooperation with ABZ, Incorporated and Science Applications International Corporation (SAIC), investigated the use of artificial intelligence-based programming techniques to assist utility personnel in regulatory compliance problems. The result of this investigation is that artificial intelligence-based programming techniques can successfully be applied to this problem. To demonstrate this, a general methodology was developed and several prototype systems based on this methodology were developed. The prototypes address U.S. Nuclear Regulatory Commission (NRC) event reportability requirements, technical specification compliance based on plant equipment status, and quality assurance assistance. This collection of prototype modules is named the safety significance evaluation system
Predicting significant torso trauma.
Nirula, Ram; Talmor, Daniel; Brasel, Karen
2005-07-01
Identification of motor vehicle crash (MVC) characteristics associated with thoracoabdominal injury would advance the development of automatic crash notification systems (ACNS) by improving triage and response times. Our objective was to determine the relationships between MVC characteristics and thoracoabdominal trauma to develop a torso injury probability model. Drivers involved in crashes from 1993 to 2001 within the National Automotive Sampling System were reviewed. Relationships between torso injury and MVC characteristics were assessed using multivariate logistic regression. Receiver operating characteristic curves were used to compare the model to current ACNS models. There were a total of 56,466 drivers. Age, ejection, braking, avoidance, velocity, restraints, passenger-side impact, rollover, and vehicle weight and type were associated with injury (p < 0.05). The area under the receiver operating characteristic curve (83.9) was significantly greater than current ACNS models. We have developed a thoracoabdominal injury probability model that may improve patient triage when used with ACNS.
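The area under the ROC curve used to compare the injury model with current ACNS models can be computed directly from predicted scores via the Mann-Whitney identity. A minimal sketch; the scores below are invented and are not the study's model output.

```python
def roc_auc(pos_scores, neg_scores):
    """Area under the ROC curve via the Mann-Whitney identity: the
    probability that a randomly chosen positive case (injured driver)
    scores higher than a randomly chosen negative one; ties count half."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Invented predicted torso-injury probabilities for drivers with (pos)
# and without (neg) significant torso trauma.
pos = [0.91, 0.77, 0.68, 0.55]
neg = [0.60, 0.42, 0.35, 0.21, 0.15]
print(f"AUC = {roc_auc(pos, neg):.3f}")
```

An AUC of 0.5 means the scores are uninformative, while 1.0 means perfect separation; the study's comparison amounts to computing this quantity for each model on the same crash data.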
Gas revenue increasingly significant
International Nuclear Information System (INIS)
Megill, R.E.
1991-01-01
This paper briefly describes the wellhead prices of natural gas compared to crude oil over the past 70 years. Although natural gas prices have never reached parity with crude oil, the relative value of a gas BTU has been increasing; this is one reason the total revenue from natural gas wells is becoming more significant. From 1920 to 1955, wellhead revenue from natural gas was only about 10% of the money received by producers; most of the money needed for exploration, development, and production came from crude oil. At present, however, over 40% of the revenue from the upstream portion of the petroleum industry comes from natural gas. As a result, within a few years natural gas may account for 50% of the revenue generated from wellhead production facilities.
Boslaugh, Sarah
2008-01-01
Need to learn statistics as part of your job, or want some help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference that's perfect for anyone with no previous background in the subject. This book gives you a solid understanding of statistics without being too simple, yet without the numbing complexity of most college texts. You get a firm grasp of the fundamentals and a hands-on understanding of how to apply them before moving on to the more advanced material that follows. Each chapter presents you with easy-to-follow descriptions illustrat
Understanding Computational Bayesian Statistics
Bolstad, William M
2011-01-01
A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic
Annual Statistical Supplement, 2002
Social Security Administration — The Annual Statistical Supplement, 2002 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2010
Social Security Administration — The Annual Statistical Supplement, 2010 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2007
Social Security Administration — The Annual Statistical Supplement, 2007 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2001
Social Security Administration — The Annual Statistical Supplement, 2001 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2016
Social Security Administration — The Annual Statistical Supplement, 2016 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2011
Social Security Administration — The Annual Statistical Supplement, 2011 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2005
Social Security Administration — The Annual Statistical Supplement, 2005 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2015
Social Security Administration — The Annual Statistical Supplement, 2015 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2003
Social Security Administration — The Annual Statistical Supplement, 2003 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2017
Social Security Administration — The Annual Statistical Supplement, 2017 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2008
Social Security Administration — The Annual Statistical Supplement, 2008 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2014
Social Security Administration — The Annual Statistical Supplement, 2014 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2004
Social Security Administration — The Annual Statistical Supplement, 2004 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2000
Social Security Administration — The Annual Statistical Supplement, 2000 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2009
Social Security Administration — The Annual Statistical Supplement, 2009 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2006
Social Security Administration — The Annual Statistical Supplement, 2006 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Bulmer, M G
1979-01-01
There are many textbooks which describe current methods of statistical analysis, while neglecting related theory. There are equally many advanced textbooks which delve into the far reaches of statistical theory, while bypassing practical applications. But between these two approaches is an unfilled gap, in which theory and practice merge at an intermediate level. Professor M. G. Bulmer's Principles of Statistics, originally published in 1965, was created to fill that need. The new, corrected Dover edition of Principles of Statistics makes this invaluable mid-level text available once again fo
Kanji, Gopal K
2006-01-01
This expanded and updated Third Edition of Gopal K. Kanji's best-selling resource on statistical tests covers all the most commonly used tests with information on how to calculate and interpret results with simple datasets. Each entry begins with a short summary statement about the test's purpose, and contains details of the test objective, the limitations (or assumptions) involved, a brief outline of the method, a worked example, and the numerical calculation. 100 Statistical Tests, Third Edition is the one indispensable guide for users of statistical materials and consumers of statistical information at all levels and across all disciplines.
Statistical distribution sampling
Johnson, E. S.
1975-01-01
Determining the distribution of statistics by sampling was investigated. Characteristic functions, the quadratic regression problem, and the differential equations for the characteristic functions are analyzed.
Testing statistical hypotheses of equivalence
Wellek, Stefan
2010-01-01
Equivalence testing has grown significantly in importance over the last two decades, especially as its relevance to a variety of applications has become understood. Yet published work on the general methodology remains scattered in specialists' journals, and for the most part, it focuses on the relatively narrow topic of bioequivalence assessment.With a far broader perspective, Testing Statistical Hypotheses of Equivalence provides the first comprehensive treatment of statistical equivalence testing. The author addresses a spectrum of specific, two-sided equivalence testing problems, from the
International Nuclear Information System (INIS)
Supe, S.J.; Nagalaxmi, K.V.; Meenakshi, L.
1983-01-01
In the practice of radiotherapy, various concepts like NSD, CRE, TDF, and BIR are used to evaluate the biological effectiveness of treatment schedules on normal tissues. This has been accepted because the tolerance of normal tissue is the limiting factor in the treatment of cancers. At present, when various schedules are tried, attention is therefore paid to the biological damage of the normal tissues only, and it is expected that the damage to the cancerous tissues will be extensive enough to control the cancer. The present work attempts to evaluate the concept of a tumor significant dose (TSD), which represents the damage to the cancerous tissue. Strandquist, in the analysis of a large number of cases of squamous cell carcinoma, found that for the 5 fraction/week treatment, the total dose required to bring about the same damage to the cancerous tissue is proportional to T^(-0.22), where T is the overall time over which the dose is delivered. Using this finding, the TSD was defined as D × N^(-p) × T^(-q), where D is the total dose, N the number of fractions, T the overall time, and p and q are exponents to be suitably chosen. The values of p and q are adjusted such that p + q ≤ 0.24, with p varying from 0.0 to 0.24 and q from 0.0 to 0.22. Cases of cancer of the cervix uteri treated between 1978 and 1980 in the V. N. Cancer Centre, Kuppuswamy Naidu Memorial Hospital, Coimbatore, India, were analyzed on the basis of these formulations. These data, coupled with clinical experience, were used to choose a formula for the TSD. Further, the dose schedules used in the British Institute of Radiology fractionation studies were also used to propose that the tumor significant dose is represented by D × N^(-0.18) × T^(-0.06)
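The proposed formula, TSD = D × N^(-0.18) × T^(-0.06), is straightforward to compute; the sketch below encodes it directly. The example dose schedule is illustrative only, not taken from the study.

```python
# Minimal sketch of the tumor significant dose (TSD) formula proposed above:
# TSD = D * N^(-p) * T^(-q), with D the total dose, N the number of
# fractions, T the overall treatment time, and p = 0.18, q = 0.06 the
# exponents chosen in the text.
def tsd(total_dose, n_fractions, overall_time, p=0.18, q=0.06):
    """Tumor significant dose for a given fractionation schedule."""
    return total_dose * n_fractions ** -p * overall_time ** -q

# Illustrative schedule (not from the study): 60 Gy in 30 fractions over 42 days.
print(round(tsd(60.0, 30, 42), 2))
```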
THE SIGNIFICANCE OF INTERCULTURAL COMPETENCE IN CROSS-CULTURAL COMMUNICATION
Directory of Open Access Journals (Sweden)
Jadranka Zlomislić
2016-12-01
Full Text Available The aim of this study is to explore the influence of education and additional factors on students' awareness of intercultural differences. For the purposes of this research, courses were assessed with regard to their role in promoting cultural awareness and in preparing students for the challenges posed by unfamiliar cross-cultural contexts. Cultural education is presumed to be a key factor in significantly increasing cultural sensitivity and cultural awareness, in order to ensure successful cross-cultural communication and increase the mobility of students and working professionals. For this study, it was assumed that the cultural awareness of students increases with the courses they take and their overall study experience. A questionnaire was developed for the purposes of this research, and the obtained results were statistically analyzed using descriptive statistics, the non-parametric chi-square test, and the Mann-Whitney test. The research has shown that intercultural competence has a statistically significant positive effect on the readiness of students to participate in study and work programs abroad. Thus, foreign language competence as well as intercultural competence must be priorities of the curriculum if we are to increase the number of highly educated experts capable of competing successfully, as students or professionals, in all fields and all cultural areas. If we recognize that globalization has made the world a global village, we all need intercultural competence to live in it successfully.
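The two named tests can be run with scipy as sketched below. All counts and scores here are invented for illustration; the study's questionnaire data are not reproduced in the record.

```python
# Hedged sketch of the two tests named in the abstract, on made-up data.
from scipy.stats import chi2_contingency, mannwhitneyu

# Chi-square: readiness to study abroad (yes/no) vs. intercultural course taken.
table = [[45, 15],   # took course: ready / not ready
         [25, 35]]   # no course:   ready / not ready
chi2, p_chi2, dof, _ = chi2_contingency(table)

# Mann-Whitney: ordinal awareness scores of two student groups (non-parametric).
group_a = [4, 5, 4, 3, 5, 4, 5]
group_b = [2, 3, 2, 4, 3, 2, 3]
u_stat, p_mw = mannwhitneyu(group_a, group_b)

print(f"chi-square p = {p_chi2:.4f}, Mann-Whitney p = {p_mw:.4f}")
```

The chi-square test asks whether the two categorical variables are associated; the Mann-Whitney test compares the two groups' score distributions without assuming normality, which suits Likert-style questionnaire items.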
Uranium chemistry: significant advances
International Nuclear Information System (INIS)
Mazzanti, M.
2011-01-01
The author reviews recent progress in uranium chemistry achieved in CEA laboratories. Like its neighbors in the Mendeleev table, uranium undergoes hydrolysis, oxidation and disproportionation reactions, which make the chemistry of these species in water highly complex. The study of the chemistry of uranium in an anhydrous medium has made it possible to correlate the structural and electronic differences observed in the interaction of uranium(III) and the lanthanides(III) with nitrogen or sulfur molecules and the effectiveness of these molecules in An(III)/Ln(III) separation via liquid-liquid extraction. Recent work on the redox reactivity of trivalent uranium U(III) in an organic medium with molecules such as water or an azide ion (N3-) in stoichiometric quantities led to extremely interesting uranium aggregates, in particular those involved in actinide migration in the environment or in aggregation problems in the fuel processing cycle. Another significant advance was the discovery of a compound containing the uranyl ion in oxidation state (V), UO2+, obtained by oxidation of uranium(III). Recently chemists have succeeded in blocking the disproportionation reaction of uranyl(V) and in stabilizing polymetallic complexes of uranyl(V), opening the way to a systematic study of the reactivity and the electronic and magnetic properties of uranyl(V) compounds. (A.C.)
Directory of Open Access Journals (Sweden)
Ph D Student Roman Mihaela
2011-05-01
Full Text Available The concept of "public accountability" is a challenge for political science: a new concept in this area, in full debate and development, both in theory and practice. This paper is a theoretical approach presenting some definitions, relevant meanings and significance of the concept in political science. The importance of this concept is that although it was originally used as a tool to improve the effectiveness and efficiency of public governance, it has gradually become a purpose in itself. "Accountability" has become an image of good governance, first in the United States of America and then in the European Union. Nevertheless, the concept is vaguely defined and provides ambiguous images of good governance. This paper begins with the presentation of some general meanings of the concept as they emerge from specialized dictionaries and encyclopaedias, and continues with the meanings developed in political science. The concept of "public accountability" is rooted in the economics and management literature, becoming increasingly relevant in today's political science, both in theory and discourse and in the practice of formulating and evaluating public policies. A first conclusion that emerges from the analysis of the evolution of this term is that it requires conceptual clarification in political science. A clear definition will then enable an appropriate model for improving the system of public accountability in formulating and assessing public policies, in order to implement a system of assessment and monitoring thereof.
Robot Trajectories Comparison: A Statistical Approach
Directory of Open Access Journals (Sweden)
A. Ansuategui
2014-01-01
Full Text Available The task of planning a collision-free trajectory from a start to a goal position is fundamental for an autonomous mobile robot. Although path planning has been extensively investigated since the beginning of robotics, there is no agreement on how to measure the performance of a motion algorithm. This paper presents a new approach to perform robot trajectories comparison that could be applied to any kind of trajectories and in both simulated and real environments. Given an initial set of features, it automatically selects the most significant ones and performs a statistical comparison using them. Additionally, a graphical data visualization named polygraph which helps to better understand the obtained results is provided. The proposed method has been applied, as an example, to compare two different motion planners, FM2 and WaveFront, using different environments, robots, and local planners.
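A minimal sketch of this kind of trajectory comparison is given below, using assumed features and synthetic paths; the paper's actual feature set, feature-selection step, and polygraph visualization are not reproduced here.

```python
# Illustrative sketch (not the authors' code): extract a simple feature from
# 2-D robot trajectories and compare two planners' feature distributions with
# a two-sample Kolmogorov-Smirnov test. Trajectories are synthetic random walks.
import numpy as np
from scipy.stats import ks_2samp

def path_length(traj):
    """Total Euclidean length of a polyline given as an (n, 2) array."""
    return float(np.sum(np.linalg.norm(np.diff(traj, axis=0), axis=1)))

rng = np.random.default_rng(1)
# Assumption for the demo: planner A produces shorter paths than planner B.
lengths_a = [path_length(np.cumsum(rng.normal(0, 0.1, (50, 2)), axis=0))
             for _ in range(30)]
lengths_b = [path_length(np.cumsum(rng.normal(0, 0.2, (50, 2)), axis=0))
             for _ in range(30)]

stat, p = ks_2samp(lengths_a, lengths_b)
print(f"KS statistic = {stat:.3f}, p = {p:.4f}")
```

A non-parametric two-sample test is a natural choice here because trajectory features (length, clearance, smoothness) rarely follow a known distribution across environments and robots.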
Robot Trajectories Comparison: A Statistical Approach
Ansuategui, A.; Arruti, A.; Susperregi, L.; Yurramendi, Y.; Jauregi, E.; Lazkano, E.; Sierra, B.
2014-01-01
PMID:25525618
Significant Radionuclides Determination
Energy Technology Data Exchange (ETDEWEB)
Jo A. Ziegler
2001-07-31
The purpose of this calculation is to identify radionuclides that are significant to offsite doses from potential preclosure events for spent nuclear fuel (SNF) and high-level radioactive waste expected to be received at the potential Monitored Geologic Repository (MGR). In this calculation, high-level radioactive waste is included in references to DOE SNF. A previous document, ''DOE SNF DBE Offsite Dose Calculations'' (CRWMS M&O 1999b), calculated the source terms and offsite doses for Department of Energy (DOE) and Naval SNF for use in design basis event analyses. This calculation reproduces only DOE SNF work (i.e., no naval SNF work is included in this calculation) created in ''DOE SNF DBE Offsite Dose Calculations'' and expands the calculation to include DOE SNF expected to produce a high dose consequence (even though the quantity of the SNF is expected to be small) and SNF owned by commercial nuclear power producers. The calculation does not address any specific off-normal/DBE event scenarios for receiving, handling, or packaging of SNF. The results of this calculation are developed for comparative analysis to establish the important radionuclides and do not represent the final source terms to be used for license application. This calculation will be used as input to preclosure safety analyses and is performed in accordance with procedure AP-3.12Q, ''Calculations'', and is subject to the requirements of DOE/RW-0333P, ''Quality Assurance Requirements and Description'' (DOE 2000) as determined by the activity evaluation contained in ''Technical Work Plan for: Preclosure Safety Analysis, TWP-MGR-SE-000010'' (CRWMS M&O 2000b) in accordance with procedure AP-2.21Q, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities''.
Indian Academy of Sciences (India)
IAS Admin
Pauli exclusion principle, Fermi–Dirac statistics, identical and indistinguishable particles, Fermi gas. Fermi–Dirac Statistics: Derivation and Consequences. S Chaturvedi and Shyamal Biswas. Subhash Chaturvedi is at University of Hyderabad. His current research interests include phase space descriptions.
Generalized interpolative quantum statistics
International Nuclear Information System (INIS)
Ramanathan, R.
1992-01-01
A generalized interpolative quantum statistics is presented by conjecturing a certain reordering of phase space due to the presence of possible exotic objects other than bosons and fermions. Such an interpolation achieved through a Bose-counting strategy predicts the existence of an infinite quantum Boltzmann-Gibbs statistics akin to the one discovered by Greenberg recently
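One standard way to write an interpolation between the two quantum statistics is shown below, purely as an illustrative parameterization; the paper's Bose-counting construction and its Boltzmann-Gibbs limit may be formulated differently.

```latex
% Mean occupation number with an interpolation parameter \kappa:
% \kappa = +1 recovers Fermi-Dirac, \kappa = -1 Bose-Einstein, and the
% \kappa \to 0 limit gives classical (Boltzmann-Gibbs) statistics.
\bar{n}(\varepsilon) = \frac{1}{e^{(\varepsilon - \mu)/k_B T} + \kappa},
\qquad \kappa \in [-1, 1].
```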
Handbook of Spatial Statistics
Gelfand, Alan E
2010-01-01
Offers an introduction detailing the evolution of the field of spatial statistics. This title focuses on the three main branches of spatial statistics: continuous spatial variation (point referenced data); discrete spatial variation, including lattice and areal unit data; and, spatial point patterns.
International Nuclear Information System (INIS)
2003-01-01
The energy statistical table is a selection of statistical data for energies and countries from 1997 to 2002. It covers petroleum, natural gas, coal, and electric power: production, foreign trade, consumption per sector, the 2002 energy balance, and graphs of long-term forecasts. (A.L.B.)
Bayesian statistical inference
Directory of Open Access Journals (Sweden)
Bruno De Finetti
2017-04-01
Full Text Available This work was translated into English and published in the volume: Bruno De Finetti, Induction and Probability, Biblioteca di Statistica, eds. P. Monari, D. Cocchi, Clueb, Bologna, 1993. Bayesian Statistical Inference is one of the last fundamental philosophical papers in which we can find De Finetti's essential approach to statistical inference.
Practical statistics for educators
Ravid, Ruth
2014-01-01
Practical Statistics for Educators, Fifth Edition, is a clear and easy-to-follow text written specifically for education students in introductory statistics courses and in action research courses. It is also a valuable resource and guidebook for educational practitioners who wish to study their own settings.
DEFF Research Database (Denmark)
Lauritzen, Steffen Lilholt
This book studies the brilliant Danish 19th-century astronomer T.N. Thiele, who made important contributions to statistics, actuarial science, astronomy and mathematics. The most important of these contributions in statistics are translated into English for the first time, and the text includes...
Huizingh, Eelko K. R. E.
2007-01-01
Accessibly written and easy to use, "Applied Statistics Using SPSS" is an all-in-one self-study guide to SPSS and do-it-yourself guide to statistics. What is unique about Eelko Huizingh's approach is that this book is based around the needs of undergraduate students embarking on their own research project, and its self-help style is designed to…
This tool allows users to animate cancer trends over time by cancer site and cause of death, race, and sex. Provides access to incidence, mortality, and survival. Select the type of statistic, variables, format, and then extract the statistics in a delimited format for further analyses.
Energy statistics yearbook 2002
International Nuclear Information System (INIS)
2005-01-01
The Energy Statistics Yearbook 2002 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-sixth in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities
Howard Stauffer; Nadav Nur
2005-01-01
The papers included in the Advances in Statistics section of the Partners in Flight (PIF) 2002 Proceedings represent a small sample of statistical topics of current importance to Partners In Flight research scientists: hierarchical modeling, estimation of detection probabilities, and Bayesian applications. Sauer et al. (this volume) examines a hierarchical model...
Energy statistics yearbook 2001
International Nuclear Information System (INIS)
2004-01-01
The Energy Statistics Yearbook 2001 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-fifth in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities
Energy statistics yearbook 2000
International Nuclear Information System (INIS)
2002-01-01
The Energy Statistics Yearbook 2000 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-third in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities
Temperature dependent anomalous statistics
International Nuclear Information System (INIS)
Das, A.; Panda, S.
1991-07-01
We show that the anomalous statistics which arise in 2 + 1 dimensional Chern-Simons gauge theories can become temperature dependent in the most natural way. We analyze and show that a statistics-changing phase transition can happen in these theories only as T → ∞. (author). 14 refs
Introduction to Bayesian statistics
Bolstad, William M
2017-01-01
There is a strong upsurge in the use of Bayesian methods in applied statistical analysis, yet most introductory statistics texts only present frequentist methods. Bayesian statistics has many important advantages that students should learn about if they are going into fields where statistics will be used. In this Third Edition, four newly added chapters address topics that reflect the rapid advances in the field of Bayesian statistics. The author continues to provide a Bayesian treatment of introductory statistical topics, such as scientific data gathering, discrete random variables, robust Bayesian methods, and Bayesian approaches to inference for discrete random variables, binomial proportion, Poisson, normal mean, and simple linear regression. In addition, newly developing topics in the field are presented in four new chapters: Bayesian inference with unknown mean and variance; Bayesian inference for Multivariate Normal mean vector; Bayesian inference for Multiple Linear Regression Model; and Computati...
Understanding advanced statistical methods
Westfall, Peter
2013-01-01
Introduction: Probability, Statistics, and Science: Reality, Nature, Science, and Models; Statistical Processes: Nature, Design and Measurement, and Data; Models; Deterministic Models; Variability; Parameters; Purely Probabilistic Statistical Models; Statistical Models with Both Deterministic and Probabilistic Components; Statistical Inference; Good and Bad Models; Uses of Probability Models. Random Variables and Their Probability Distributions: Introduction; Types of Random Variables: Nominal, Ordinal, and Continuous; Discrete Probability Distribution Functions; Continuous Probability Distribution Functions; Some Calculus: Derivatives and Least Squares; More Calculus: Integrals and Cumulative Distribution Functions. Probability Calculation and Simulation: Introduction; Analytic Calculations, Discrete and Continuous Cases; Simulation-Based Approximation; Generating Random Numbers. Identifying Distributions: Introduction; Identifying Distributions from Theory Alone; Using Data: Estimating Distributions via the Histogram; Quantiles: Theoretical and Data-Based Estimate...
Ector, Hugo
2010-12-01
I still remember my first book on statistics: "Elementary statistics with applications in medicine and the biological sciences" by Frederick E. Croxton. For me, it was the start of a pursuit of understanding statistics in daily life and in medical practice. It was the first volume in a long row of books. In his introduction, Croxton asserts that "nearly everyone involved in any aspect of medicine needs to have some knowledge of statistics". The reality is that for many clinicians, statistics are limited to a "P value" and a handful of statistical methods. They have never had the opportunity to learn concise and clear descriptions of the key features. I have experienced how some authors can describe difficult methods in a well understandable language. Others fail completely. As a teacher, I tell my students that life is impossible without a basic knowledge of statistics. This feeling has resulted in an annual seminar of 90 minutes. This tutorial is the summary of that seminar. It is a summary and a transcription of the best pages I have found.
The significance of early post-exercise ST segment normalization.
Chow, Rudy; Fordyce, Christopher B; Gao, Min; Chan, Sammy; Gin, Kenneth; Bennett, Matthew
2015-01-01
The persistence of ST segment depression in recovery signifies a strongly positive exercise treadmill test (ETT). However, it is unclear if early recovery of ST segments portends a similar prognosis. We sought to determine if persistence of ST depression into recovery correlates with ischemic burden based on myocardial perfusion imaging (MPI). This was a retrospective analysis of 853 consecutive patients referred for exercise MPI at a tertiary academic center over a 24-month period. Patients were stratified into three groups based on the results of the ETT: normal (negative ETT), persistence (positive ETT with >1 mm ST segment depression at 1 minute into recovery), and early normalization (positive ETT with ST segment normalization within 1 minute of recovery); 105 patients met criteria for persistence. The persistence group had a significantly greater SSS (8.48±7.77) than both the early normalization (4.34±4.98) and normal (4.47±5.31) groups. The SSS of the early normalization and normal groups were not statistically different and met the prespecified non-inferiority margin (mean difference 0.12; lower bound of the 95% CI, -0.66). Among patients with an electrically positive ETT, recovery of ST segment depression within 1 minute was associated with a lower SSS than persistence of ST depression beyond 1 minute. Furthermore, early ST segment recovery conferred a similar SSS to a negative ETT. These results suggest that among patients evaluated for chest pain with a positive ETT, early recovery of the ST segment is associated with a significantly lower ischemic burden on subsequent MPI and thus may represent a false positive finding in exercise treadmill testing. Copyright © 2015 Elsevier Inc. All rights reserved.
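The non-inferiority logic used for the SSS comparison can be sketched as follows. The standard error below is back-derived from the reported mean difference (0.12) and lower 95% confidence bound (-0.66); the margin of 1.0 is an assumed placeholder, since the abstract does not state the prespecified value.

```python
# Sketch of a non-inferiority check: the comparison group is non-inferior if
# the lower bound of the 95% CI for the mean difference stays above -margin.
# se_diff is back-derived from the abstract's numbers; margin is assumed.
def noninferior(mean_diff, se_diff, margin, z=1.96):
    """Return (decision, lower 95% confidence bound) for the mean difference."""
    lower = mean_diff - z * se_diff
    return lower > -margin, lower

ok, lower = noninferior(mean_diff=0.12, se_diff=0.40, margin=1.0)
print(ok, round(lower, 2))  # lower bound reproduces the reported -0.66
```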
International Nuclear Information System (INIS)
Tonchev, N.; Shumovskij, A.S.
1986-01-01
The history of research in statistical mechanics at the JINR is presented, beginning with the fundamental works of N.N. Bogolyubov on the microscopic theory of superconductivity. The ideas and methods introduced in these works have largely determined the development of statistical mechanics at the JINR, and the Hartree-Fock-Bogolyubov variational principle has become an important method of modern nuclear theory. A brief review is given of the main achievements connected with the development of statistical mechanics methods and their application in different fields of physical science.
Wallis, W Allen
2014-01-01
Focusing on everyday applications as well as those of scientific research, this classic of modern statistical methods requires little to no mathematical background. Readers develop basic skills for evaluating and using statistical data. Lively, relevant examples include applications to business, government, social and physical sciences, genetics, medicine, and public health. ""W. Allen Wallis and Harry V. Roberts have made statistics fascinating."" - The New York Times ""The authors have set out with considerable success, to write a text which would be of interest and value to the student who,
D'Alessio, Michael
2012-01-01
AP Statistics Crash Course - Gets You a Higher Advanced Placement Score in Less Time Crash Course is perfect for the time-crunched student, the last-minute studier, or anyone who wants a refresher on the subject. AP Statistics Crash Course gives you: Targeted, Focused Review - Study Only What You Need to Know Crash Course is based on an in-depth analysis of the AP Statistics course description outline and actual Advanced Placement test questions. It covers only the information tested on the exam, so you can make the most of your valuable study time. Our easy-to-read format covers: exploring da
Mauro, John
2013-01-01
Written to reveal statistical deceptions often thrust upon unsuspecting journalists, this book views the use of numbers from a public perspective. Illustrating how the statistical naivete of journalists often nourishes quantitative misinformation, the author's intent is to make journalists more critical appraisers of numerical data so that in reporting them they do not deceive the public. The book frequently uses actual reported examples of misused statistical data reported by mass media and describes how journalists can avoid being taken in by them. Because reports of survey findings seldom g
Liao, Tim Futing
2011-01-01
An incomparably useful examination of statistical methods for comparisonThe nature of doing science, be it natural or social, inevitably calls for comparison. Statistical methods are at the heart of such comparison, for they not only help us gain understanding of the world around us but often define how our research is to be carried out. The need to compare between groups is best exemplified by experiments, which have clearly defined statistical methods. However, true experiments are not always possible. What complicates the matter more is a great deal of diversity in factors that are not inde
Statistical Pattern Recognition
Webb, Andrew R
2011-01-01
Statistical pattern recognition relates to the use of statistical techniques for analysing data measurements in order to extract information and make justified decisions. It is a very active area of study and research, which has seen many advances in recent years. Applications such as data mining, web searching, multimedia data retrieval, face recognition, and cursive handwriting recognition, all require robust and efficient pattern recognition techniques. This third edition provides an introduction to statistical pattern theory and techniques, with material drawn from a wide range of fields,
Mineral statistics yearbook 1994
International Nuclear Information System (INIS)
1994-01-01
A summary of mineral production in Saskatchewan was compiled and presented as a reference manual. Statistical information on fuel minerals such as crude oil, natural gas, liquefied petroleum gas and coal, and on industrial and metallic minerals such as potash, sodium sulphate, salt and uranium, is provided in a wide variety of tables. Production statistics and the disposition and value of sales of industrial and metallic minerals are also given. Statistical data on the drilling of oil and gas reservoirs and on crown land disposition are also included. figs., tabs
Evolutionary Statistical Procedures
Baragona, Roberto; Poli, Irene
2011-01-01
This proposed text appears to be a good introduction to evolutionary computation for use in applied statistics research. The authors draw from a vast base of knowledge about the current literature in both the design of evolutionary algorithms and statistical techniques. Modern statistical research is on the threshold of solving increasingly complex problems in high dimensions, and the generalization of its methodology to parameters whose estimators do not follow mathematically simple distributions is underway. Many of these challenges involve optimizing functions for which analytic solutions a
Methods of statistical physics
Akhiezer, Aleksandr I
1981-01-01
Methods of Statistical Physics is an exposition of the tools of statistical mechanics, which evaluates the kinetic equations of classical and quantized systems. The book also analyzes the equations of macroscopic physics, such as the equations of hydrodynamics for normal and superfluid liquids and macroscopic electrodynamics. The text gives particular attention to the study of quantum systems. This study begins with a discussion of problems of quantum statistics with a detailed description of the basics of quantum mechanics along with the theory of measurement. An analysis of the asymptotic be
National Research Council Canada - National Science Library
Matsakis, Demetrios
2007-01-01
The Global Positioning System (GPS) is an extremely effective satellite-based system that broadcasts sufficient information for a user to determine time and position from any location on or near the Earth...
Cancer Data and Statistics Tools
... Cancer Data and Statistics Tools: United States Cancer Statistics: Data Visualizations ...
Breast cancer statistics, 2011.
DeSantis, Carol; Siegel, Rebecca; Bandi, Priti; Jemal, Ahmedin
2011-01-01
In this article, the American Cancer Society provides an overview of female breast cancer statistics in the United States, including trends in incidence, mortality, survival, and screening. Approximately 230,480 new cases of invasive breast cancer and 39,520 breast cancer deaths are expected to occur among US women in 2011. Breast cancer incidence rates were stable among all racial/ethnic groups from 2004 to 2008. Breast cancer death rates have been declining since the early 1990s for all women except American Indians/Alaska Natives, among whom rates have remained stable. Disparities in breast cancer death rates are evident by state, socioeconomic status, and race/ethnicity. While significant declines in mortality rates were observed for 36 states and the District of Columbia over the past 10 years, rates for 14 states remained level. Analyses by county-level poverty rates showed that the decrease in mortality rates began later and was slower among women residing in poor areas. As a result, the highest breast cancer death rates shifted from the affluent areas to the poor areas in the early 1990s. Screening rates continue to be lower in poor women compared with non-poor women, despite much progress in increasing mammography utilization. In 2008, 51.4% of poor women had undergone a screening mammogram in the past 2 years compared with 72.8% of non-poor women. Encouraging patients aged 40 years and older to have annual mammography and a clinical breast examination is the single most important step that clinicians can take to reduce suffering and death from breast cancer. Clinicians should also ensure that patients at high risk of breast cancer are identified and offered appropriate screening and follow-up. Continued progress in the control of breast cancer will require sustained and increased efforts to provide high-quality screening, diagnosis, and treatment to all segments of the population. Copyright © 2011 American Cancer Society, Inc.
Methods and statistics for combining motif match scores.
Bailey, T L; Gribskov, M
1998-01-01
Position-specific scoring matrices are useful for representing and searching for protein sequence motifs. A sequence family can often be described by a group of one or more motifs, and an effective search must combine the scores for matching a sequence to each of the motifs in the group. We describe three methods for combining match scores and estimating the statistical significance of the combined scores, and evaluate the search quality (classification accuracy) and the accuracy of the estimate of statistical significance of each. The three methods are: 1) sum of scores, 2) sum of reduced variates, 3) product of score p-values. We show that method 3) is superior to the other two methods in both regards, and that combining motif scores indeed gives better search accuracy. The MAST sequence homology search algorithm utilizing the product of p-values scoring method is available for interactive use and downloading at URL http://www.sdsc.edu/MEME.
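The product-of-p-values method described in this record can be sketched as follows. Under the null hypothesis each p-value is Uniform(0, 1), and the significance of the product Q of k independent p-values has the closed form P(∏ pᵢ ≤ q) = q · Σ_{i=0}^{k-1} (−ln q)ⁱ / i!. This is a generic sketch of that standard result, not MAST's actual implementation:

```python
import math

def product_pvalue(pvalues):
    """Significance of the product of k independent uniform p-values.

    P(prod <= q) = q * sum_{i=0}^{k-1} (-ln q)^i / i!
    """
    k = len(pvalues)
    q = math.prod(pvalues)
    if q <= 0.0:
        return 0.0
    log_q = math.log(q)
    term = 1.0     # (-ln q)^0 / 0!
    total = 1.0
    for i in range(1, k):
        term *= -log_q / i   # build (-ln q)^i / i! incrementally
        total += term
    return q * total
```

Note that the combined significance is larger than the raw product itself (for k = 2, two p-values of 0.1 combine to about 0.056, not 0.01), which is why the product must be calibrated rather than used directly.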
Elements of statistical thermodynamics
Nash, Leonard K
2006-01-01
Encompassing essentially all aspects of statistical mechanics that appear in undergraduate texts, this concise, elementary treatment shows how an atomic-molecular perspective yields new insights into macroscopic thermodynamics. 1974 edition.
Department of Veterans Affairs — National-level, VISN-level, and/or VAMC-level statistics on the numbers and percentages of users of VHA care from the Northeast Program Evaluation Center (NEPEC)....
International Nuclear Information System (INIS)
Hilaire, S.
2001-01-01
A review of the statistical model of nuclear reactions is presented. The main relations are described, together with the ingredients necessary to perform practical calculations. In addition, a substantial overview of the width fluctuation correction factor is given. (author)
Plague in the United States: Plague was first introduced ... them at higher risk. Reported Cases of Human Plague, United States, 1970-2016: Since the mid-20th ...
Statistical Measures of Marksmanship
National Research Council Canada - National Science Library
Johnson, Richard
2001-01-01
.... This report describes objective statistical procedures to measure both rifle marksmanship accuracy, the proximity of an array of shots to the center of mass of a target, and marksmanship precision...
Titanic: A Statistical Exploration.
Takis, Sandra L.
1999-01-01
Uses the available data about the Titanic's passengers to interest students in exploring categorical data and the chi-square distribution. Describes activities incorporated into a statistics class and gives additional resources for collecting information about the Titanic. (ASK)
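The classroom activity described in this record pairs categorical Titanic data with the chi-square test of independence. A minimal sketch of that test for a 2x2 table follows; the passenger counts below are hypothetical placeholders for illustration, not the actual Titanic figures:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    # Expected counts from row/column totals under independence
    expected = [
        (a + b) * (a + c) / n, (a + b) * (b + d) / n,
        (c + d) * (a + c) / n, (c + d) * (b + d) / n,
    ]
    observed = [a, b, c, d]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical counts: survived/died cross-tabulated by passenger class
stat = chi_square_2x2(200, 120, 180, 500)
print(round(stat, 2), "vs. critical value 3.841 at df=1, alpha=0.05")
```

A statistic far above the df = 1 critical value of 3.841 leads students to reject independence between class and survival, which is the punchline of the activity.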
Sickle cell ... 1999 through 2002. This drop coincided with the introduction in 2000 of a vaccine that protects against ...
U.S. Department of Health & Human Services — The United States Cancer Statistics (USCS) online databases in WONDER provide cancer incidence and mortality data for the United States for the years since 1999, by...
Probability and Statistical Inference
Prosper, Harrison B.
2006-01-01
These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.
On quantum statistical inference
Barndorff-Nielsen, O.E.; Gill, R.D.; Jupp, P.E.
2003-01-01
Interest in problems of statistical inference connected to measurements of quantum systems has recently increased substantially, in step with dramatic new developments in experimental techniques for studying small quantum systems. Furthermore, developments in the theory of quantum measurements have
CMS Statistics Reference Booklet
U.S. Department of Health & Human Services — The annual CMS Statistics reference booklet provides a quick reference for summary information about health expenditures and the Medicare and Medicaid health...
Statistical mechanics of superconductivity
Kita, Takafumi
2015-01-01
This book provides a theoretical, step-by-step comprehensive explanation of superconductivity for undergraduate and graduate students who have completed elementary courses on thermodynamics and quantum mechanics. To this end, it adopts the unique approach of starting with the statistical mechanics of quantum ideal gases and successively adding and clarifying elements and techniques indispensable for understanding it. They include the spin-statistics theorem, second quantization, density matrices, the Bloch–De Dominicis theorem, the variational principle in statistical mechanics, attractive interaction, and bound states. Ample examples of their usage are also provided in terms of topics from advanced statistical mechanics such as two-particle correlations of quantum ideal gases, derivation of the Hartree–Fock equations, and Landau’s Fermi-liquid theory, among others. With these preliminaries, the fundamental mean-field equations of superconductivity are derived with maximum mathematical clarity based on ...
Statistical electromagnetics: Complex cavities
Naus, H.W.L.
2008-01-01
A selection of the literature on the statistical description of electromagnetic fields and complex cavities is concisely reviewed. Some essential concepts, for example, the application of the central limit theorem and the maximum entropy principle, are scrutinized. Implicit assumptions, biased
Davison, Anthony C.; Huser, Raphaë l
2015-01-01
Statistics of extremes concerns inference for rare events. Often the events have never yet been observed, and their probabilities must therefore be estimated by extrapolation of tail models fitted to available data. Because data concerning the event
Saffran, Jenny R.; Kirkham, Natasha Z.
2017-01-01
Perception involves making sense of a dynamic, multimodal environment. In the absence of mechanisms capable of exploiting the statistical patterns in the natural world, infants would face an insurmountable computational problem. Infant statistical learning mechanisms facilitate the detection of structure. These abilities allow the infant to compute across elements in their environmental input, extracting patterns for further processing and subsequent learning. In this selective review, we summarize findings that show that statistical learning is both a broad and flexible mechanism (supporting learning from different modalities across many different content areas) and input specific (shifting computations depending on the type of input and goal of learning). We suggest that statistical learning not only provides a framework for studying language development and object knowledge in constrained laboratory settings, but also allows researchers to tackle real-world problems, such as multilingualism, the role of ever-changing learning environments, and differential developmental trajectories. PMID:28793812
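Infant statistical learning studies of the kind reviewed here often quantify sequential structure as transitional probabilities, P(B | A) = freq(AB) / freq(A), between adjacent syllables: probabilities are high within words and drop at word boundaries. The sketch below is my illustration of that computation, with a made-up syllable stream, not the authors' materials or code:

```python
from collections import Counter

def transitional_probabilities(stream):
    """P(next | current) for each adjacent pair in a syllable stream."""
    pair_counts = Counter(zip(stream, stream[1:]))
    first_counts = Counter(stream[:-1])
    return {
        (a, b): count / first_counts[a]
        for (a, b), count in pair_counts.items()
    }

# "bi-da-ku" recurs as a unit, so within-word transitions are perfectly
# predictable while transitions out of the word are not
stream = ["bi", "da", "ku", "pa", "do", "ti",
          "bi", "da", "ku", "go", "la", "tu",
          "bi", "da", "ku"]
tp = transitional_probabilities(stream)
```

Here tp[("bi", "da")] is 1.0 (within-word) while tp[("ku", "pa")] is 0.5 (word boundary), the contrast infants are hypothesized to exploit.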
CSIR Research Space (South Africa)
Shepperson, L
1997-12-01
Full Text Available This publication contains transport and related statistics on roads, vehicles, infrastructure, passengers, freight, rail, air, maritime and road traffic, and international comparisons. The information compiled in this publication has been gathered...
Genton, Marc G.; Castruccio, Stefano; Crippa, Paola; Dutta, Subhajit; Huser, Raphaë l; Sun, Ying; Vettori, Sabrina
2015-01-01
This paper explores the use of visualization through animations, coined visuanimation, in the field of statistics. In particular, it illustrates the embedding of animations in the paper itself and the storage of larger movies in the online
Playing at Statistical Mechanics
Clark, Paul M.; And Others
1974-01-01
Discussed are the applications of counting techniques of a sorting game to distributions and concepts in statistical mechanics. Included are the following distributions: Fermi-Dirac, Bose-Einstein, and most probable. (RH)
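The distributions named in this record differ only in the sign in the denominator of the mean occupation number, n(ε) = 1 / (e^{(ε−μ)/kT} ± 1). A minimal sketch of that textbook formula (my illustration, not the article's sorting game):

```python
import math

def occupation(energy, mu, kT, statistics):
    """Mean occupation number of a single-particle state.

    statistics = +1 -> Fermi-Dirac, -1 -> Bose-Einstein,
    statistics =  0 -> classical Maxwell-Boltzmann limit
    """
    x = (energy - mu) / kT
    if statistics == 0:
        return math.exp(-x)
    return 1.0 / (math.exp(x) + statistics)

# At energy == mu a fermion level is exactly half filled
print(occupation(1.0, 1.0, 0.1, +1))  # 0.5
```

At energies well above μ all three expressions converge, which is why the classical distribution is recovered in the dilute limit.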
Illinois travel statistics, 2008
2009-01-01
The 2008 Illinois Travel Statistics publication is assembled to provide detailed traffic : information to the different users of traffic data. While most users of traffic data at this level : of detail are within the Illinois Department of Transporta...
Illinois travel statistics, 2009
2010-01-01
The 2009 Illinois Travel Statistics publication is assembled to provide detailed traffic : information to the different users of traffic data. While most users of traffic data at this level : of detail are within the Illinois Department of Transporta...
Cholesterol Facts and Statistics
High Cholesterol Statistics and Maps ... Deo R, et al. Heart disease and stroke statistics—2017 update: a report from the American Heart ...
Illinois travel statistics, 2010
2011-01-01
The 2010 Illinois Travel Statistics publication is assembled to provide detailed traffic : information to the different users of traffic data. While most users of traffic data at this level : of detail are within the Illinois Department of Transporta...
U.S. Department of Health & Human Services — This section contains statistical information and reports related to the percentage of electronic transactions being sent to Medicare contractors in the formats...
Information theory and statistics
Kullback, Solomon
1968-01-01
Highly useful text studies logarithmic measures of information and their application to testing statistical hypotheses. Includes numerous worked examples and problems. References. Glossary. Appendix. 1968 2nd, revised edition.
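The "logarithmic measures of information" this record refers to center on the Kullback-Leibler divergence, D(P‖Q) = Σ pᵢ ln(pᵢ/qᵢ). A minimal sketch for discrete distributions (assuming qᵢ > 0 wherever pᵢ > 0):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P || Q) in nats for discrete
    distributions given as equal-length probability lists."""
    # Terms with p_i == 0 contribute nothing (0 * log 0 := 0)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
```

The divergence is zero exactly when the two distributions coincide and positive otherwise, which is what makes it usable as a test statistic for discriminating hypotheses.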
Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...
Medicaid Drug Claims Statistics
U.S. Department of Health & Human Services — The Medicaid Drug Claims Statistics CD is a useful tool that conveniently breaks up Medicaid claim counts and separates them by quarter and includes an annual count.
Scheck, Florian
2016-01-01
Scheck’s textbook starts with a concise introduction to classical thermodynamics, including geometrical aspects. Then a short introduction to probabilities and statistics lays the basis for the statistical interpretation of thermodynamics. Phase transitions, discrete models and the stability of matter are explained in great detail. Thermodynamics has a special role in theoretical physics. Due to the general approach of thermodynamics the field has a bridging function between several areas like the theory of condensed matter, elementary particle physics, astrophysics and cosmology. The classical thermodynamics describes predominantly averaged properties of matter, reaching from few particle systems and state of matter to stellar objects. Statistical Thermodynamics covers the same fields, but explores them in greater depth and unifies classical statistical mechanics with quantum theory of multiple particle systems. The content is presented as two tracks: the fast track for master students, providing the essen...
Record Statistics and Dynamics
DEFF Research Database (Denmark)
Sibani, Paolo; Jensen, Henrik J.
2009-01-01
with independent random increments. The term record dynamics covers the rather new idea that records may, in special situations, have measurable dynamical consequences. The approach applies to the aging dynamics of glasses and other systems with multiple metastable states. The basic idea is that record-sized fluctuations of e.g. the energy are able to push the system past some sort of ‘edge of stability’, inducing irreversible configurational changes, whose statistics then closely follows the statistics of record fluctuations.
Introductory statistical inference
Mukhopadhyay, Nitis
2014-01-01
This gracefully organized text reveals the rigorous theory of probability and statistical inference in the style of a tutorial, using worked examples, exercises, figures, tables, and computer simulations to develop and illustrate concepts. Drills and boxed summaries emphasize and reinforce important ideas and special techniques.Beginning with a review of the basic concepts and methods in probability theory, moments, and moment generating functions, the author moves to more intricate topics. Introductory Statistical Inference studies multivariate random variables, exponential families of dist
Statistical mechanics rigorous results
Ruelle, David
1999-01-01
This classic book marks the beginning of an era of vigorous mathematical progress in equilibrium statistical mechanics. Its treatment of the infinite system limit has not been superseded, and the discussion of thermodynamic functions and states remains basic for more recent work. The conceptual foundation provided by the Rigorous Results remains invaluable for the study of the spectacular developments of statistical mechanics in the second half of the 20th century.
Statistical mechanics of anyons
International Nuclear Information System (INIS)
Arovas, D.P.
1985-01-01
We study the statistical mechanics of a two-dimensional gas of free anyons - particles which interpolate between Bose-Einstein and Fermi-Dirac character. Thermodynamic quantities are discussed in the low-density regime. In particular, the second virial coefficient is evaluated by two different methods and is found to exhibit a simple, periodic, but nonanalytic behavior as a function of the statistics determining parameter. (orig.)
Mulholland, Henry
1968-01-01
Fundamentals of Statistics covers topics on the introduction, fundamentals, and science of statistics. The book discusses the collection, organization and representation of numerical data; elementary probability; the binomial and Poisson distributions; and the measures of central tendency. The text describes measures of dispersion for measuring the spread of a distribution; continuous distributions for measuring on a continuous scale; the properties and use of the normal distribution; and tests involving the normal or Student's t distributions. The use of control charts for sample means; the ranges
Business statistics I essentials
Clark, Louise
2014-01-01
REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Business Statistics I includes descriptive statistics, introduction to probability, probability distributions, sampling and sampling distributions, interval estimation, and hypothesis t
Johnson, Norman
This is the third volume of a collection of seminal papers in the statistical sciences written during the past 110 years. These papers have each had an outstanding influence on the development of statistical theory and practice over the last century. Each paper is preceded by an introduction written by an authority in the field, providing background information and assessing its influence. Volume III concentrates on articles from the 1980s while including some earlier articles not included in Volumes I and II. Samuel Kotz is Professor of Statistics in the College of Business and Management at the University of Maryland. Norman L. Johnson is Professor Emeritus of Statistics at the University of North Carolina. Also available: Breakthroughs in Statistics Volume I: Foundations and Basic Theory Samuel Kotz and Norman L. Johnson, Editors 1993. 631 pp. Softcover. ISBN 0-387-94037-5 Breakthroughs in Statistics Volume II: Methodology and Distribution Samuel Kotz and Norman L. Johnson, Edi...
Practical Statistics for Particle Physicists
Lista, Luca
2017-01-01
These three lectures provide an introduction to the main concepts of statistical data analysis useful for precision measurements and searches for new signals in High Energy Physics. The frequentist and Bayesian approaches to probability theory will be introduced and, for both approaches, inference methods will be presented. Hypothesis tests will be discussed, then significance and upper limit evaluation will be presented with an overview of the modern and most advanced techniques adopted for data analysis at the Large Hadron Collider.
Borsboom, D.; Haig, B.D.
2013-01-01
Unlike most other statistical frameworks, Bayesian statistical inference is wedded to a particular approach in the philosophy of science (see Howson & Urbach, 2006); this approach is called Bayesianism. Rather than being concerned with model fitting, this position in the philosophy of science
UN Data- Environmental Statistics: Waste
World Wide Human Geography Data Working Group — The Environment Statistics Database contains selected water and waste statistics by country. Statistics on water and waste are based on official statistics supplied...
Encounter Probability of Significant Wave Height
DEFF Research Database (Denmark)
Liu, Z.; Burcharth, H. F.
The determination of the design wave height (often given as the significant wave height) is usually based on statistical analysis of long-term extreme wave height measurement or hindcast. The result of such extreme wave height analysis is often given as the design wave height corresponding to a c...
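The encounter probability of the title is conventionally the chance that the T-year design wave height is exceeded at least once during a structure lifetime of L years. Assuming independent annual maxima, it is p = 1 − (1 − 1/T)^L; the sketch below illustrates that standard formula, not this report's specific analysis:

```python
def encounter_probability(return_period_years, lifetime_years):
    """P(at least one exceedance of the T-year event in L years),
    assuming independent annual maxima."""
    return 1.0 - (1.0 - 1.0 / return_period_years) ** lifetime_years

# A 100-year design wave over a 50-year structure lifetime is
# encountered with probability of roughly 0.4, not 0.5
p = encounter_probability(100, 50)
```

The counterintuitively high result (about 39% for T = 100, L = 50) is precisely why design codes reason about encounter probability rather than return period alone.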
Statistical Attitude Determination
Markley, F. Landis
2010-01-01
All spacecraft require attitude determination at some level of accuracy. This can be a very coarse requirement of tens of degrees, in order to point solar arrays at the sun, or a very fine requirement in the milliarcsecond range, as required by the Hubble Space Telescope. A toolbox of attitude determination methods, applicable across this wide range, has been developed over the years. There have been many advances in the thirty years since the publication of the classic reference, but the fundamentals remain the same. One significant change is that onboard attitude determination has largely superseded ground-based attitude determination, owing to the greatly increased power of onboard computers. The availability of relatively inexpensive radiation-hardened microprocessors has led to the development of "smart" sensors, with autonomous star trackers being the first spacecraft application. Another new development is attitude determination using interferometry of radio signals from the Global Positioning System (GPS) constellation. This article reviews both the classic material and these newer developments, with emphasis on methods suitable for use onboard a spacecraft. We discuss both "single frame" methods that are based on measurements taken at a single point in time, and sequential methods that use information about spacecraft dynamics to combine the information from a time series of measurements.
Conformity and statistical tolerancing
Leblond, Laurent; Pillet, Maurice
2018-02-01
Statistical tolerancing was first proposed by Shewhart (Economic Control of Quality of Manufactured Product, 1931; reprinted 1980 by ASQC). In spite of this long history, its use remains moderate. One probable reason for this low utilization is the difficulty designers have in anticipating the risks of the approach. Arithmetic (worst-case) tolerancing allows a simple interpretation: conformity is defined by the presence of the characteristic in an interval. Statistical tolerancing is more complex in its definition: an interval is not sufficient to define conformity. To justify the statistical tolerancing formula used by designers, a tolerance interval should be interpreted as the interval where most of the parts produced should probably be located. This tolerance is justified by considering a conformity criterion for the parts that guarantees low offsets on the latter characteristics. Unlike traditional arithmetic tolerancing, statistical tolerancing requires a sustained exchange of information between design and manufacturing to be used safely. This paper proposes a formal definition of conformity, which we apply successively to quadratic and arithmetic tolerancing. We introduce a concept of concavity, which helps us demonstrate the link between the tolerancing approach and conformity. We use this concept to demonstrate the various acceptable propositions of statistical tolerancing (in the decentring-dispersion space).
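The contrast between arithmetic and quadratic tolerancing discussed in this record can be illustrated by the usual stack-up formulas: worst case T = Σ tᵢ versus root-sum-square T = √(Σ tᵢ²). This is a generic sketch of that textbook comparison, not the paper's own formalism:

```python
import math

def worst_case_stack(tolerances):
    """Arithmetic tolerancing: component tolerances add linearly."""
    return sum(tolerances)

def statistical_stack(tolerances):
    """Quadratic (RSS) tolerancing: independent variations add in
    quadrature, giving a much tighter assembly tolerance."""
    return math.sqrt(sum(t * t for t in tolerances))

# Four parts, each toleranced at +/- 0.1 mm (illustrative values)
tols = [0.1, 0.1, 0.1, 0.1]
```

For these four parts the worst-case stack is 0.4 mm while the statistical stack is only 0.2 mm; the gap between the two is exactly the gain, and the risk, that the paper's conformity analysis tries to formalize.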
Intuitive introductory statistics
Wolfe, Douglas A
2017-01-01
This textbook is designed to give an engaging introduction to statistics and the art of data analysis. The unique scope includes, but also goes beyond, classical methodology associated with the normal distribution. What if the normal model is not valid for a particular data set? This cutting-edge approach provides the alternatives. It is an introduction to the world and possibilities of statistics that uses exercises, computer analyses, and simulations throughout the core lessons. These elementary statistical methods are intuitive. Counting and ranking features prominently in the text. Nonparametric methods, for instance, are often based on counts and ranks and are very easy to integrate into an introductory course. The ease of computation with advanced calculators and statistical software, both of which factor into this text, allows important techniques to be introduced earlier in the study of statistics. This book's novel scope also includes measuring symmetry with Walsh averages, finding a nonp...
International Nuclear Information System (INIS)
Holttinen, H.; Tammelin, B.; Hyvoenen, R.
1997-01-01
The recording, analysis and publication of wind energy production statistics has been reorganized in cooperation between VTT Energy, the Finnish Meteorological Institute (FMI Energy) and the Finnish Wind Energy Association (STY), supported by the Ministry of Trade and Industry (KTM). VTT Energy has developed a database that contains both monthly data and information on the wind turbines, sites and operators involved. The monthly production figures, together with component failure statistics, are collected from the operators by VTT Energy, which produces the final wind energy statistics to be published in Tuulensilmae and reported to energy statistics in Finland and abroad (Statistics Finland, Eurostat, IEA). To be able to compare the annual and monthly wind energy production with the average wind energy climate, a production index is adopted. The index gives the expected wind energy production at various areas in Finland, calculated using real wind speed observations, air density and a power curve for a typical 500 kW wind turbine. FMI Energy has produced the average figures for four weather stations using data from 1985-1996, and produces the monthly figures. (orig.)
Probability and Bayesian statistics
1987-01-01
This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics", which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to his memory and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability, across psychological aspects of formulating subjective probability statements and abstract measure-theoretical considerations, to contributions to theoretical statistics an...
Multivariate Statistical Process Control
DEFF Research Database (Denmark)
Kulahci, Murat
2013-01-01
As sensor and computer technology continues to improve, it has become commonplace to be confronted with high-dimensional data sets. As in many areas of industrial statistics, this brings various challenges in statistical process control (SPC) and monitoring, where the aim is to identify the "out-of-control" state of a process using control charts, in order to reduce the excessive variation caused by so-called assignable causes. In practice, the most common method of monitoring multivariate data is through a statistic akin to Hotelling's T2. For high-dimensional data with an excessive amount of cross-correlation, practitioners are often recommended to use latent structure methods such as Principal Component Analysis to summarize the data in only a few linear combinations of the original variables that capture most of the variation in the data. Applications of these control charts...
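The Hotelling T2 statistic mentioned above can be sketched as follows. This is a minimal phase-I-style computation, under the assumption that the in-control mean and covariance are estimated from the same sample being charted; control limits and the PCA reduction step are omitted.

```python
import numpy as np

def hotelling_t2(X):
    """Hotelling T^2 value for each row of X (rows = observations,
    columns = variables), measured against the sample mean and
    sample covariance of X itself."""
    mu = X.mean(axis=0)
    S_inv = np.linalg.inv(np.cov(X, rowvar=False))  # sample covariance, ddof=1
    d = X - mu
    # Quadratic form d_i' S^{-1} d_i for every observation i at once.
    return np.einsum('ij,jk,ik->i', d, S_inv, d)
```

Observations whose T2 value exceeds the chart's upper control limit would be flagged as potential out-of-control points. A useful sanity check: with the sample-based estimates above, the T2 values of an n-by-p sample always sum to (n-1)*p.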
DEFF Research Database (Denmark)
Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard
Statistics for Finance develops students' professional skills in statistics with applications in finance. Developed from the authors' courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics that rarely connect concepts to data and books on econometrics and time series analysis that do not cover specific problems related to option valuation. The book discusses applications of financial derivatives pertaining to risk assessment and elimination. The authors cover various statistical and mathematical techniques, including linear and nonlinear time series analysis, stochastic calculus models, stochastic differential equations, Itō's formula, the Black–Scholes model, the generalized method of moments, and the Kalman filter. They explain how these tools are used to price financial derivatives...
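Among the tools listed above, the Black–Scholes model admits a short self-contained sketch: the standard closed-form price of a European call, using only the Python standard library. The parameter names are conventional notation, not taken from the book.

```python
from math import exp, log, sqrt
from statistics import NormalDist

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call.

    S: spot price, K: strike, T: time to expiry (years),
    r: risk-free rate, sigma: volatility (both annualized).
    """
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    N = NormalDist().cdf  # standard normal CDF
    return S * N(d1) - K * exp(-r * T) * N(d2)
```

For example, with S = K = 100, T = 1, r = 0.05 and sigma = 0.2, the formula gives a call price of roughly 10.45.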
1992 Energy Statistics Yearbook
International Nuclear Information System (INIS)
1994-01-01
The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel, and aggregate data for the total mix of commercial fuels, are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from annual questionnaires distributed by the United Nations Statistical Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistical Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial-year information, use of annual trends, trade data based on partner-country reports, and breakdowns of aggregated data, as well as analysis of current energy events and activities.