The insignificance of statistical significance testing
Johnson, Douglas H.
1999-01-01
Despite their use in scientific journals such as The Journal of Wildlife Management, statistical hypothesis tests add very little value to the products of research. Indeed, they frequently confuse the interpretation of data. This paper describes how statistical hypothesis tests are often viewed, and then contrasts that interpretation with the correct one. I discuss the arbitrariness of P-values, conclusions that the null hypothesis is true, power analysis, and distinctions between statistical and biological significance. Statistical hypothesis testing, in which the null hypothesis about the properties of a population is almost always known a priori to be false, is contrasted with scientific hypothesis testing, which examines a credible null hypothesis about phenomena in nature. More meaningful alternatives are briefly outlined, including estimation and confidence intervals for determining the importance of factors, decision theory for guiding actions in the face of uncertainty, and Bayesian approaches to hypothesis testing and other statistical practices.
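Johnson's contrast between statistical and biological significance can be illustrated with a minimal sketch (not from the paper; all numbers are invented): the same trivially small difference in means becomes "statistically significant" once the sample is large enough, while a confidence interval makes its unimportance plain.

```python
import math

def two_sample_t(mean1, mean2, sd, n):
    # t statistic for a difference in means, assuming equal SDs and group sizes
    se = sd * math.sqrt(2.0 / n)
    return (mean1 - mean2) / se, se

# The same trivially small effect (0.02 SD units) at two sample sizes
t_small, se_small = two_sample_t(10.02, 10.00, 1.0, 50)
t_large, se_large = two_sample_t(10.02, 10.00, 1.0, 500_000)

# 95% confidence interval for the difference (normal approximation)
ci_large = (0.02 - 1.96 * se_large, 0.02 + 1.96 * se_large)

print(abs(t_small) > 1.96)  # False: not "significant" with n = 50 per group
print(abs(t_large) > 1.96)  # True: "significant" with n = 500,000 per group
print(ci_large)             # the interval pins the effect near 0.02 SD: negligible
```

The confidence interval carries the information a practitioner actually needs (how big is the effect?), which is Johnson's argument for estimation over testing.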
Kahane, Guy
2014-01-01
The universe that surrounds us is vast, and we are so very small. When we reflect on the vastness of the universe, our humdrum cosmic location, and the inevitable future demise of humanity, our lives can seem utterly insignificant. Many philosophers assume that such worries about our significance reflect a banal metaethical confusion. They dismiss the very idea of cosmic significance. This, I argue, is a mistake. Worries about cosmic insignificance do not express metaethical worries about objectivity or nihilism, and we can make good sense of the idea of cosmic significance and its absence. It is also possible to explain why the vastness of the universe can make us feel insignificant. This impression does turn out to be mistaken, but not for the reasons typically assumed. In fact, we might be of immense cosmic significance—though we cannot, at this point, tell whether this is the case. PMID:25729095
On the insignificance of Herschel's sunspot correlation
Love, Jeffrey J.
2013-08-01
We examine William Herschel's hypothesis that solar-cycle variation of the Sun's irradiance has a modulating effect on the Earth's climate and that this is, specifically, manifested as an anticorrelation between sunspot number and the market price of wheat. Since Herschel first proposed his hypothesis in 1801, it has been regarded with both interest and skepticism. Recently, reports have been published that either support Herschel's hypothesis or rely on its validity. As a test of Herschel's hypothesis, we seek to reject a null hypothesis of a statistically random correlation between historical sunspot numbers, wheat prices in London and the United States, and wheat farm yields in the United States. We employ binary-correlation, Pearson-correlation, and frequency-domain methods. We test our methods using a historical geomagnetic activity index, well known to be causally correlated with sunspot number. As expected, the measured correlation between sunspot number and geomagnetic activity would be an unlikely realization of random data; the correlation is "statistically significant." On the other hand, measured correlations between sunspot number and wheat price and wheat yield data would be very likely realizations of random data; these correlations are "insignificant." Therefore, Herschel's hypothesis must be regarded with skepticism. We compare and contrast our results with those of other researchers. We discuss procedures for evaluating hypotheses that are formulated from historical data.
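The null-hypothesis logic described above can be sketched with a permutation test on synthetic series (the data below are invented stand-ins, not the historical sunspot, geomagnetic, or wheat records): a genuinely coupled pair yields a correlation that almost no random re-pairing can match, while an unrelated pair yields a correlation that random re-pairings match easily.

```python
import math
import random

def pearson_r(x, y):
    # Sample Pearson correlation coefficient
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(vx * vy)

def permutation_pvalue(x, y, n_perm=2000, seed=1):
    # Fraction of random re-pairings whose |r| reaches the observed |r|
    rng = random.Random(seed)
    observed = abs(pearson_r(x, y))
    y_perm = list(y)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(y_perm)
        if abs(pearson_r(x, y_perm)) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)

rng = random.Random(0)
series = [rng.random() for _ in range(100)]            # stand-in "sunspot" series
coupled = [s + 0.1 * rng.random() for s in series]     # causally tied, like a geomagnetic index
unrelated = [rng.random() for _ in range(100)]         # independent, like wheat prices under the null

p_coupled = permutation_pvalue(series, coupled)
p_unrelated = permutation_pvalue(series, unrelated)
print(p_coupled)    # small: an unlikely realization of random data
print(p_unrelated)  # typically much larger: consistent with randomness
```

This mirrors the paper's control logic: the geomagnetic index plays the role of a positive control whose correlation is known to be real.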
Insignificant disease among men with intermediate-risk prostate cancer.
Hong, Sung Kyu; Vertosick, Emily; Sjoberg, Daniel D; Scardino, Peter T; Eastham, James A
2014-12-01
A paucity of data exists on insignificant disease potentially suitable for active surveillance (AS) among men with intermediate-risk prostate cancer (PCa). We sought to identify pathologically insignificant disease and its preoperative predictors in men who underwent radical prostatectomy (RP) for intermediate-risk PCa. We analyzed data from 1,630 men who underwent RP for intermediate-risk disease. Total tumor volume (TTV) data were available for 332 men. We examined factors associated with classically defined pathologically insignificant cancer (organ-confined disease with TTV ≤0.5 ml and no Gleason pattern 4 or 5) and pathologically favorable cancer (organ-confined disease with no Gleason pattern 4 or 5) potentially suitable for AS. Decision curve analysis was used to assess the clinical utility of a multivariable model including preoperative variables for predicting pathologically unfavorable cancer. In the entire cohort, 221 of 1,630 (13.6%) patients had pathologically favorable cancer. Among the 332 patients with TTV data available, 26 (7.8%) had classically defined pathologically insignificant cancer. Between threshold probabilities of 20 and 40%, decision curve analysis demonstrated that using the multivariable model to identify AS candidates would not provide any benefit over simply treating all men who have intermediate-risk disease with RP. Although a minority of patients with intermediate-risk disease may harbor pathologically favorable or insignificant cancer, currently available conventional tools cannot adequately identify those patients.
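The decision curve analysis mentioned above rests on a simple net-benefit computation: benefit from true positives minus harm from false positives, weighted by the odds of the chosen threshold probability. A minimal sketch with invented counts (not the study's data):

```python
def net_benefit(tp, fp, n, threshold):
    # Net benefit at a threshold probability: true positives credited,
    # false positives penalized at the odds of the threshold
    w = threshold / (1.0 - threshold)
    return tp / n - w * (fp / n)

# "Treat all" in a hypothetical cohort of 1000 where 864 harbor significant disease
n, tp_all, fp_all = 1000, 864, 136

for pt in (0.2, 0.3, 0.4):
    print(pt, round(net_benefit(tp_all, fp_all, n, pt), 3))
```

A model is only useful if its net benefit exceeds that of the "treat all" line across the clinically relevant thresholds, which is exactly the comparison the abstract reports as negative.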
Qualification of Acts in the Context of Insignificance: Theoretical and Practical Issues
Directory of Open Access Journals (Sweden)
Irina G. Ragozina
2016-01-01
The article deals with problematic issues in qualifying an act as insignificant and reflects the impact of judicial discretion on the formation of judicial practice in assessing insignificance. The basic signs of insignificance are described, and the conditions for recognizing acts as insignificant are defined. The authors conclude that part 2 of Art. 14 of the Criminal Code of the Russian Federation should be excluded and replaced with an article on "Exemption from criminal liability in connection with the insignificance of the act", limiting the range of such acts to minor offences committed for the first time.
International Nuclear Information System (INIS)
Agus, M; Penna, M P; Peró-Cebollero, M; Guàrdia-Olmos, J
2015-01-01
Numerous studies have examined students' difficulties in understanding notions related to statistical problems. Some authors have observed that presenting distinct visual representations can increase statistical reasoning, supporting the principle of graphical facilitation. Other researchers disagree with this viewpoint, emphasising the impediments related to the use of illustrations that could overload the cognitive system with insignificant data. In this work we aim to compare probabilistic statistical reasoning across two different problem presentation formats: graphical and verbal-numerical. We conceived and presented five pairs of homologous simple problems in verbal-numerical and graphical formats to 311 undergraduate Psychology students (n=156 in Italy and n=155 in Spain) without statistical expertise. The purpose of our work was to evaluate the effect of graphical facilitation on probabilistic statistical reasoning. Every undergraduate solved each pair of problems in the two formats, in different problem presentation orders and sequences. Data analyses highlighted that the effect of graphical facilitation is infrequent in psychology undergraduates. This effect is related to many factors (such as knowledge, abilities, attitudes, and anxiety); moreover, it might be considered the result of an interaction between individual and task characteristics.
Shukla-Dave, Amita; Hricak, Hedvig; Akin, Oguz; Yu, Changhong; Zakian, Kristen L.; Udo, Kazuma; Scardino, Peter T.; Eastham, James; Kattan, Michael W.
2011-01-01
Objectives: To validate previously published nomograms for predicting insignificant prostate cancer (PCa) that incorporate clinical data, percentage of biopsy cores positive (%BC+) and magnetic resonance imaging (MRI) or MRI/MR spectroscopic imaging (MRSI) results. We also designed new nomogram models incorporating magnetic resonance results and clinical data without detailed biopsy data. Nomograms for predicting insignificant PCa can help physicians counsel patients with clinically low-risk disease who are choosing between active surveillance and definitive therapy. Patients and methods: In total, 181 low-risk PCa patients (clinical stage T1c–T2a, prostate-specific antigen level < 10 ng/mL, biopsy Gleason score of 6) had MRI/MRSI before surgery. For MRI and MRI/MRSI, the probability of insignificant PCa was recorded prospectively and independently by two radiologists on a scale from 0 (definitely insignificant) to 3 (definitely significant PCa). Insignificant PCa was defined on surgical pathology. Four models incorporating MRI or MRI/MRSI and clinical data, with and without %BC+, were compared with a base clinical model without %BC+ and a more comprehensive clinical model with %BC+. Prediction accuracy was assessed using areas under receiver operating characteristic curves. Results: At pathology, 27% of patients had insignificant PCa, and the Gleason score was upgraded in 56.4% of patients. For both readers, all magnetic resonance models performed significantly better than the base clinical model (P ≤ 0.05 for all) and similarly to the more comprehensive clinical model. Conclusions: Existing models incorporating magnetic resonance data, clinical data and %BC+ for predicting the probability of insignificant PCa were validated. All MR-inclusive models performed significantly better than the base clinical model. PMID:21933336
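The accuracy measure used above, the area under the receiver operating characteristic curve, can be computed directly from risk scores via the Mann-Whitney formulation: the probability that a randomly chosen positive case outranks a randomly chosen negative one. A minimal sketch with invented scores (not the study's data):

```python
def auc(pos_scores, neg_scores):
    # AUC via the Mann-Whitney statistic: probability that a random
    # positive case scores higher than a random negative case
    wins = ties = 0.0
    for p in pos_scores:
        for q in neg_scores:
            if p > q:
                wins += 1
            elif p == q:
                ties += 1
    return (wins + 0.5 * ties) / (len(pos_scores) * len(neg_scores))

# Invented model risk scores for patients with and without significant disease
model_pos = [0.8, 0.6, 0.55, 0.4]
model_neg = [0.5, 0.45, 0.3, 0.2]
print(auc(model_pos, model_neg))  # 0.875
```

An AUC of 0.5 is chance-level ranking; comparing AUCs between nested models is how the abstract quantifies the added value of the MR data.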
International Nuclear Information System (INIS)
Webb, G.A.M.; McLean, A.S.
1977-01-01
The procedures recommended by the International Commission on Radiological Protection (ICRP) for making decisions concerning controllable sources of radiation exposure of the public include 'justification' and 'optimisation'. The tool recommended by the ICRP for reaching these decisions is collective dose or dose commitment supplemented by consideration of doses to individuals. In both these considerations the practical problem arises of whether very small doses to large numbers of people should contribute to the final decision-making process. It may be that at levels of dose which are small increments on natural background, the relationship between dose and effect is linear even though the slope may be close to zero. If so, collective dose is a meaningful concept and the calculation of total detriment for the purpose of justification could legitimately include all doses. In the calculation of collective doses for the purpose of optimisation, which involves decisions on how much money or resource should be allocated to dose reduction, it is necessary to appraise radiation detriment realistically. At low levels of dose to the individual such as those small by comparison with variations in natural background within the UK, the risk to the individual is such that his well-being will not be significantly changed by the presence or absence of the radiation dose. These small doses, which are well below the point at which an individual attaches significance, should not carry a societal significance. Societal acceptance of risk is analysed with a view to assessing a level of possible risk, and hence dose, below which resources should not in general be diverted to secure further reduction. A formulation for collective dose commitment is proposed incorporating a cut-off to exclude insignificant doses. The implications of this formulation in practical situations are discussed
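The truncated collective-dose formulation proposed above can be sketched in a few lines. The cut-off value and the dose distribution below are invented for illustration; they are not the values discussed by Webb and McLean:

```python
def collective_dose(doses_sv, cutoff_sv=0.0):
    # Collective dose (person-Sv), optionally excluding individual doses
    # below a cut-off, as in the truncated formulation discussed here
    return sum(d for d in doses_sv if d >= cutoff_sv)

# Invented population: a few higher individual doses plus a very large low-dose tail
doses = [2e-4] * 100 + [5e-6] * 1_000_000

full = collective_dose(doses)                       # the tail dominates the total
truncated = collective_dose(doses, cutoff_sv=1e-5)  # the tail is excluded by the cut-off
print(full, truncated)
```

The point of the example is the orders-of-magnitude gap between the two totals: whether the low-dose tail is counted can dominate any justification or optimisation calculation.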
Comparative Gender Performance in Business Statistics.
Mogull, Robert G.
1989-01-01
Comparative performance of male and female students in introductory and intermediate statistics classes was examined over more than 16 years at a state university. Gender means from 97 classes, comprising 1,609 males and 1,085 females, revealed a probabilistic, although statistically insignificant, superior performance by female students that appeared to…
Directory of Open Access Journals (Sweden)
Rafik Ragheb
2009-01-01
In the current study, we used immunoprecipitation and immunoblotting to examine the levels and phosphorylation status of the insulin receptor beta subunit (IR-β), as well as the downstream target in the PI3K pathway, total PKB/Akt, together with their phosphorylated forms. FFA treatment showed no direct, significant effect on PI3K stimulation, specifically on IR-β, in primary hepatic control cells treated with insulin. Cells treated with either oleate or palmitate (360 µM) showed no statistically significant differences following insulin stimulation (P > 0.05). To further investigate the effect of both FFAs and high insulin (1 µg), we examined the effects of oleate and palmitate at 360 µM on IR-β as well as PKB. There was no significant difference in the total protein levels or their phosphorylated forms in cells treated with or without oleate or palmitate. Interestingly, IR-β tyrosine phosphorylation showed a similarly insignificant effect in in vivo and ex vivo hepatic cells treated with oleate or palmitate in comparison to their controls in the fructose-fed hamsters.
Chondros, Κ; Karpathakis, Ν; Heretis, Ι; Mavromanolakis, Ε; Chondros, N; Sofras, F; Mamoulakis, C
2015-01-01
Different treatment options for patients with prostate cancer (PCa) are applicable after stratifying patients according to various classification criteria. The purpose of our study was to evaluate the revised Epstein criteria for predicting insignificant PCa in a Greek subpopulation. During a 4-year period, 172 Cretan patients underwent radical retropubic prostatectomy in our institution. Twenty-three of them met the revised Epstein criteria for the presence of clinically insignificant PCa (clinical stage T1c, prostate-specific antigen density < 0.15 ng/ml/g, absence of Gleason pattern 4-5, <3 positive biopsy cores, <50% tumor per core) during pre-treatment evaluation and were retrospectively included in the study. Post-surgery outcomes were evaluated, including pathological stage, surgical margins and Gleason score upgrade. Organ-confined disease and insignificant PCa were predicted with 74% and 31% accuracy, respectively. These figures are remarkably lower than those derived from similar studies worldwide. Given the high variation in the prediction accuracy of the revised Epstein criteria observed worldwide, the development and implementation of novel tools/nomograms with greater predictive accuracy is still warranted. Hippokratia 2015, 19 (1): 30-33.
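The revised Epstein criteria enumerated above amount to a simple conjunction of thresholds, which can be written as a predicate (the function name and argument order are illustrative; the thresholds are those listed in the abstract):

```python
def meets_revised_epstein(stage, psa_density, max_gleason_pattern,
                          positive_cores, max_core_involvement):
    # Revised Epstein criteria for clinically insignificant PCa,
    # as enumerated in the abstract above
    return (stage == "T1c"
            and psa_density < 0.15           # ng/ml/g
            and max_gleason_pattern <= 3     # no Gleason pattern 4-5
            and positive_cores < 3           # fewer than 3 positive cores
            and max_core_involvement < 0.5)  # <50% tumor in any core

print(meets_revised_epstein("T1c", 0.10, 3, 2, 0.30))  # True: meets all criteria
print(meets_revised_epstein("T1c", 0.20, 3, 2, 0.30))  # False: PSA density too high
```

The study's finding is precisely that this conjunction, although easy to apply, predicted pathologically insignificant disease with only 31% accuracy in their cohort.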
International Nuclear Information System (INIS)
Kim, Tae Heon; Jeong, Jae Yong; Lee, Sin Woo; Sung, Hyun Hwan; Jeon, Hwang Gyun; Jeong, Byong Chang; Seo, Seong Il; Lee, Hyun Moo; Choi, Han Yong; Jeon, Seong Soo; Kim, Chan Kyo; Park, Byung Kwan
2015-01-01
To investigate whether the apparent diffusion coefficient (ADC) from diffusion-weighted magnetic resonance imaging (DW-MRI) could help improve the prediction of insignificant prostate cancer in candidates for active surveillance (AS). Enrolled in this retrospective study were 287 AS candidates who underwent DW-MRI before radical prostatectomy. Patients were stratified into two groups: Group A consisted of patients with no visible tumour or a suspected tumour ADC value > 0.830 × 10⁻³ mm²/s, and Group B consisted of patients with a suspected tumour ADC value < 0.830 × 10⁻³ mm²/s. We compared pathological outcomes in each group. Group A had 243 (84.7%) patients and Group B had 44 (15.3%) patients. The proportion of organ-confined Gleason ≤ 6 disease and insignificant prostate cancer was significantly higher in Group A than in Group B (61.3% vs. 38.6%, p = 0.005 and 47.7% vs. 25.0%, p = 0.005, respectively). On multivariate analysis, a high ADC value was an independent predictor of organ-confined Gleason ≤ 6 disease and of insignificant prostate cancer (odds ratio = 2.43, p = 0.011 and odds ratio = 2.74, p = 0.009, respectively). Tumour ADC values may be a useful marker for predicting insignificant prostate cancer in candidates for AS.
Behavioral investment strategy matters: a statistical arbitrage approach
Sun, David; Tsai, Shih-Chuan; Wang, Wei
2011-01-01
In this study, we employ a statistical arbitrage approach to demonstrate that momentum investment strategies tend to work better over periods longer than six months, a result that differs from findings in past literature. Compared with standard parametric tests, the statistical arbitrage method shows more clearly that momentum strategies work only over longer formation and holding periods. They also yield significantly positive returns in an up market, but negative yet insignificant returns in a down...
Stereochemical insignificance discovered in Acinetobacter baumannii quorum sensing.
Directory of Open Access Journals (Sweden)
Amanda L Garner
Stereochemistry is a key aspect of molecular recognition for biological systems. As such, receptors and enzymes are often highly stereospecific, recognizing only one stereoisomer of a ligand. Recently, the quorum sensing signaling molecules used by the nosocomial opportunistic pathogen Acinetobacter baumannii were identified, and the primary signaling molecule isolated from this species was N-(3-hydroxydodecanoyl)-L-homoserine lactone. A plethora of bacterial species have been demonstrated to utilize 3-hydroxy-acylhomoserine lactone autoinducers, and in virtually all cases, the (R)-stereoisomer was identified as the natural ligand and exhibited greater autoinducer activity than the corresponding (S)-stereoisomer. Using chemical synthesis and biochemical assays, we have uncovered a case of stereochemical insignificance in A. baumannii and provide a unique example where stereochemistry appears nonessential for acylhomoserine lactone-mediated quorum sensing signaling. Based on previously reported phylogenetic studies, we suggest that A. baumannii has evolutionarily adopted this unique, yet promiscuous, quorum sensing system to ensure its survival, particularly in the presence of other proteobacteria.
Statistical modelling of citation exchange between statistics journals.
Varin, Cristiano; Cattelan, Manuela; Firth, David
2016-01-01
Rankings of scholarly journals based on citation data are often met with scepticism by the scientific community. Part of the scepticism is due to disparity between the common perception of journals' prestige and their ranking based on citation counts. A more serious concern is the inappropriate use of journal rankings to evaluate the scientific influence of researchers. The paper focuses on analysis of the table of cross-citations among a selection of statistics journals. Data are collected from the Web of Science database published by Thomson Reuters. Our results suggest that modelling the exchange of citations between journals is useful to highlight the most prestigious journals, but also that journal citation data are characterized by considerable heterogeneity, which needs to be properly summarized. Inferential conclusions require care to avoid potential overinterpretation of insignificant differences between journal ratings. Comparison with published ratings of institutions from the UK's research assessment exercise shows strong correlation at aggregate level between assessed research quality and journal citation 'export scores' within the discipline of statistics.
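Cross-citation modelling of the kind described above is commonly built on paired-comparison models; as a simplified stand-in for the paper's approach, a Bradley-Terry model can be fitted to a cross-citation table with the standard minorization-maximization iteration. The 3-journal table below is invented for illustration:

```python
def bradley_terry(wins, n_iter=500):
    # Standard MM iteration for Bradley-Terry strengths, where
    # wins[i][j] counts citations from journal j to journal i
    k = len(wins)
    p = [1.0] * k
    for _ in range(n_iter):
        for i in range(k):
            total_wins = sum(wins[i][j] for j in range(k) if j != i)
            denom = sum((wins[i][j] + wins[j][i]) / (p[i] + p[j])
                        for j in range(k) if j != i)
            p[i] = total_wins / denom
        s = sum(p)
        p = [x * k / s for x in p]  # normalize to keep the scale fixed
    return p

# Invented 3-journal cross-citation table: entry [i][j] = citations from j to i
cites = [[0, 30, 40],
         [20, 0, 25],
         [10, 15, 0]]
strengths = bradley_terry(cites)
print(strengths)  # journal 0 attracts the most citations and ranks highest
```

The fitted strengths give a ranking with attached uncertainty in the full model, which is what lets the authors warn against overinterpreting insignificant differences between journal ratings.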
Proteins contribute insignificantly to the intrinsic buffering capacity of yeast cytoplasm
International Nuclear Information System (INIS)
Poznanski, Jaroslaw; Szczesny, Pawel; Ruszczyńska, Katarzyna; Zielenkiewicz, Piotr; Paczek, Leszek
2013-01-01
Highlights: We predicted the buffering capacity of the yeast proteome from protein abundance data. We measured the total buffering capacity of yeast cytoplasm. We showed that proteins contribute insignificantly to buffering capacity. Abstract: Intracellular pH is maintained by a combination of the passive buffering of cytoplasmic dissociable compounds and several active systems. Over the years, a large portion of, and possibly most of, the cell's intrinsic (i.e., passive non-bicarbonate) buffering effect was attributed to proteins, both in higher organisms and in yeast. This attribution was not surprising, given that the concentration of proteins with multiple protonatable/deprotonatable groups in the cell exceeds the concentration of free protons by a few orders of magnitude. Using data from both high-throughput experiments and in vitro laboratory experiments, we tested this concept. We assessed the buffering capacity of the yeast proteome using protein abundance data and compared it to our own titration of yeast cytoplasm. We showed that the protein contribution is less than 1% of the total intracellular buffering capacity. As confirmed with NMR measurements, inorganic phosphates play a crucial role in the process. These findings also shed new light on the role of proteomes in maintaining intracellular pH. The contribution of proteins to the intrinsic buffering capacity is negligible, and proteins might act only as a recipient of signals for changes in pH.
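The comparison above can be sketched with the standard Van Slyke expression for the buffering capacity of a single protonatable group. The concentrations and pKa values below are ballpark figures assumed for illustration, not the paper's measured data:

```python
import math

def buffer_capacity(conc_molar, pka, ph):
    # Van Slyke buffering capacity (mol/L per pH unit) contributed by
    # one protonatable group present at concentration conc_molar
    ka, h = 10.0 ** -pka, 10.0 ** -ph
    return math.log(10) * conc_molar * ka * h / (ka + h) ** 2

ph = 7.0
# Assumed ballpark cytoplasmic figures (illustrative only):
phosphate = buffer_capacity(0.050, 6.8, ph)   # ~50 mM inorganic phosphate, pKa2 ~ 6.8
histidines = buffer_capacity(0.002, 6.0, ph)  # ~2 mM titratable protein histidines

print(phosphate, histidines)
print(phosphate > 10 * histidines)  # phosphate dominates under these assumptions
```

Even with generous assumptions for titratable protein side chains, a small-molecule buffer near its pKa at tens of millimolar swamps the protein contribution, consistent with the paper's conclusion.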
Carlsson, Sigrid; Maschino, Alexandra; Schröder, Fritz; Bangma, Chris; Steyerberg, Ewout W; van der Kwast, Theo; van Leenders, Geert; Vickers, Andrew; Lilja, Hans; Roobol, Monique J
2013-11-01
Treatment decisions can be difficult in men with low-risk prostate cancer (PCa). To evaluate the ability of a panel of four kallikrein markers in blood (total prostate-specific antigen [PSA], free PSA, intact PSA, and kallikrein-related peptidase 2) to distinguish between pathologically insignificant and aggressive disease on pathologic examination of radical prostatectomy (RP) specimens, as well as to calculate the number of avoidable surgeries. The cohort comprised 392 screened men participating in rounds 1 and 2 of the Rotterdam arm of the European Randomized Study of Screening for Prostate Cancer. Patients were diagnosed with PCa because of an elevated PSA ≥3.0 ng/ml and were treated with RP between 1994 and 2004. We calculated the accuracy (area under the curve [AUC]) of statistical models to predict pathologically aggressive PCa (pT3-T4, extracapsular extension, tumor volume >0.5 cm³, or any Gleason grade ≥4) based on clinical predictors (age, stage, PSA, biopsy findings) with and without levels of the four kallikrein markers in blood. A total of 261 patients (67%) had significant disease on pathologic evaluation of the RP specimen. While the clinical model had good accuracy in predicting aggressive disease, reflected in a corrected AUC of 0.81, the four kallikrein markers enhanced the base model, with an AUC of 0.84. A limitation of the present study is that clinicians may be hesitant to make recommendations against active treatment on the basis of a statistical model. Our study provided proof of principle that predictions based on levels of four kallikrein markers in blood distinguish between pathologically insignificant and aggressive disease after RP with good accuracy. In the future, clinical use of the model could potentially reduce rates of immediate unnecessary active treatment.
Chiou, Chei-Chang; Wang, Yu-Min; Lee, Li-Tze
2014-08-01
Statistical knowledge is widely used in academia; however, statistics teachers struggle with the issue of how to reduce students' statistics anxiety and enhance students' statistics learning. This study assesses the effectiveness of a "one-minute paper strategy" in reducing students' statistics-related anxiety and in improving students' statistics-related achievement. Participants were 77 undergraduates from two classes enrolled in applied statistics courses. An experiment was implemented according to a pretest/posttest comparison group design. The quasi-experimental design showed that the one-minute paper strategy significantly reduced students' statistics anxiety and improved students' statistics learning achievement. The strategy was a better instructional tool than the textbook exercise for reducing students' statistics anxiety and improving students' statistics achievement.
The (mis)reporting of statistical results in psychology journals.
Bakker, Marjan; Wicherts, Jelte M
2011-09-01
In order to study the prevalence, nature (direction), and causes of reporting errors in psychology, we checked the consistency of reported test statistics, degrees of freedom, and p values in a random sample of high- and low-impact psychology journals. In a second study, we established the generality of reporting errors in a random sample of recent psychological articles. Our results, on the basis of 281 articles, indicate that around 18% of statistical results in the psychological literature are incorrectly reported. Inconsistencies were more common in low-impact journals than in high-impact journals. Moreover, around 15% of the articles contained at least one statistical conclusion that proved, upon recalculation, to be incorrect; that is, recalculation rendered the previously significant result insignificant, or vice versa. These errors were often in line with researchers' expectations. We classified the most common errors and contacted authors to shed light on the origins of the errors.
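The consistency check described above recomputes each reported p value from its test statistic and flags mismatches. Published checks of this kind handle t, F and chi-square statistics; for brevity this sketch uses a z statistic, and the reported values are invented examples:

```python
import math

def z_to_p(z):
    # Two-sided p value for a standard normal test statistic
    return math.erfc(abs(z) / math.sqrt(2.0))

def consistent(reported_stat, reported_p, tol=0.01):
    # Does the reported p value agree with the one recomputed from the statistic?
    return abs(z_to_p(reported_stat) - reported_p) < tol

print(consistent(1.96, 0.05))  # True: recomputed p is ~0.05
print(consistent(1.50, 0.04))  # False: recomputed p is ~0.13, not 0.04
```

The second case is the kind of "gross inconsistency" the study counts: the reported p sits on the other side of the 0.05 boundary from the recomputed one.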
Fractional statistics and the butterfly effect
International Nuclear Information System (INIS)
Gu, Yingfei; Qi, Xiao-Liang
2016-01-01
Fractional statistics and quantum chaos are both phenomena associated with the non-local storage of quantum information. In this article, we point out a connection between the butterfly effect in (1+1)-dimensional rational conformal field theories and fractional statistics in (2+1)-dimensional topologically ordered states. This connection comes from the characterization of the butterfly effect by the out-of-time-order-correlator proposed recently. We show that the late-time behavior of such correlators is determined by universal properties of the rational conformal field theory such as the modular S-matrix and conformal spins. Using the bulk-boundary correspondence between rational conformal field theories and (2+1)-dimensional topologically ordered states, we show that the late time behavior of out-of-time-order-correlators is intrinsically connected with fractional statistics in the topological order. We also propose a quantitative measure of chaos in a rational conformal field theory, which turns out to be determined by the topological entanglement entropy of the corresponding topological order.
Fractional statistics and fractional quantized Hall effect
International Nuclear Information System (INIS)
Tao, R.; Wu, Y.S.
1985-01-01
The authors suggest that the origin of the odd-denominator rule observed in the fractional quantized Hall effect (FQHE) may lie in fractional statistics which govern quasiparticles in FQHE. A theorem concerning statistics of clusters of quasiparticles implies that fractional statistics do not allow coexistence of a large number of quasiparticles at fillings with an even denominator. Thus, no Hall plateau can be formed at these fillings, regardless of the presence of an energy gap. 15 references
Is it possible to predict low-volume and insignificant prostate cancer by core needle biopsies?
DEFF Research Database (Denmark)
Berg, Kasper Drimer; Toft, Birgitte Grønkaer; Røder, Martin Andreas
2013-01-01
In an attempt to minimize overtreatment of localized prostate cancer (PCa), active surveillance (AS) and minor invasive procedures have received increased attention. We investigated the accuracy of pre-operative findings in defining insignificant disease and distinguishing between unilateral… InsigM: tumour ≤5% of total prostate volume and prostate-specific antigen (PSA) ≤10 ng/mL. In all definitions, Gleason score (GS) was ≤6 and the tumour was organ confined. Biopsies alone performed poorly as a predictor of unifocal and unilateral cancer in the prostatectomy specimens, with positive predictive… …9% and 12.0%, respectively, for identifying InsigM, InsigW and InsigE in the prostate specimen. Conclusively, routine prostate biopsies cannot predict unifocal and unilateral PCa and must be regarded as insufficient to select patients for focal therapy. Although candidates for AS may be identified using…
Ciftci, S. Koza; Karadag, Engin; Akdal, Pinar
2014-01-01
The purpose of this study was to determine the effect of statistics instruction using computer-based tools, on statistics anxiety, attitude, and achievement. This study was designed as quasi-experimental research and the pattern used was a matched pre-test/post-test with control group design. Data was collected using three scales: a Statistics…
Effects of quantum coherence on work statistics
Xu, Bao-Ming; Zou, Jian; Guo, Li-Sha; Kong, Xiang-Mu
2018-05-01
In the conventional two-point measurement scheme of quantum thermodynamics, quantum coherence is destroyed by the first measurement. Coherence nevertheless plays an important role in quantum thermodynamic processes, and how to describe the work statistics of a quantum coherent process is still an open question. In this paper, we use the full counting statistics method to investigate the effects of quantum coherence on work statistics. First, we give a general discussion and show that for a quantum coherent process, the work statistics are very different from those of the two-point measurement scheme: specifically, the average work can be increased or decreased and the work fluctuation can be decreased by quantum coherence, depending strongly on the relative phase, the energy level structure, and the external protocol. Then, we concretely consider a quenched one-dimensional transverse Ising model and show that quantum coherence has a more significant influence on work statistics in the ferromagnetic regime than in the paramagnetic regime, so that in the presence of quantum coherence the work statistics can exhibit critical phenomena even at high temperature.
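For context, the two-point measurement baseline that the paper generalizes can be sketched for a single qubit quench (all parameters below are invented). The first projective energy measurement removes any initial coherence, which is precisely the limitation the full counting statistics approach is meant to address:

```python
import math

def qubit_work_distribution(e_i, e_f, theta, beta):
    # Two-point measurement work statistics for a qubit quench:
    # thermal initial state of H_i (eigenvalues e_i), final Hamiltonian H_f
    # whose eigenbasis is rotated by angle theta relative to that of H_i
    z = sum(math.exp(-beta * e) for e in e_i)
    p_init = [math.exp(-beta * e) / z for e in e_i]
    # Transition probabilities |<m_f|n_i>|^2 for a real basis rotation by theta
    overlap = [[math.cos(theta) ** 2, math.sin(theta) ** 2],
               [math.sin(theta) ** 2, math.cos(theta) ** 2]]
    dist = {}
    for n in range(2):          # outcome of the first energy measurement
        for m in range(2):      # outcome of the second energy measurement
            w = round(e_f[m] - e_i[n], 12)
            dist[w] = dist.get(w, 0.0) + p_init[n] * overlap[n][m]
    return dist

# Invented parameters: level splitting doubled by the quench, basis tilted by 0.3 rad
dist = qubit_work_distribution(e_i=[0.0, 1.0], e_f=[0.0, 2.0], theta=0.3, beta=1.0)
avg_work = sum(w * p for w, p in dist.items())
print(dist)
print(avg_work)
```

Any phase information in the initial state drops out of this distribution entirely, whereas the full counting statistics treatment retains a phase dependence in the average work and fluctuations.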
Fractional statistics and fractional quantized Hall effect. Revision
International Nuclear Information System (INIS)
Tao, R.; Wu, Y.S.
1984-01-01
We suggest that the origin of the odd denominator rule observed in the fractional quantized Hall effect (FQHE) may lie in fractional statistics which governs quasiparticles in FQHE. A theorem concerning statistics of clusters of quasiparticles implies that fractional statistics does not allow coexistence of a large number of quasiparticles at fillings with an even denominator. Thus no Hall plateau can be formed at these fillings, regardless of the presence of an energy gap. 15 references
Energy Technology Data Exchange (ETDEWEB)
Ucak-Koc, A.
2014-06-01
The effects of altitude and beehive bottom board type (BBBT) on the wintering performance of honeybee colonies were investigated in the South Aegean Region of Turkey: Experiment I (E-I), with 32 colonies, in 2010-2011, and Experiment II (E-II), with 20 colonies, in 2011-2012. Each lowland (25 m) and highland (797 m) colony was divided randomly into two BBBT subgroups, open screen floor (OSF) and normal bottom floor (NBF), and wintered for about three months. In E-I, the local Aegean ecotype of the Anatolian bee (AE) and the Italian race (ItR) were used, while in E-II only the AE genotype was present. In E-I, the effect of wintering altitude on the number of combs covered with bees (NCCB), and the effects of BBBT on brood area (BA) and NCCB, were found to be statistically significant (p < 0.05), but the effects of genotype on BA and NCCB were statistically insignificant (p > 0.05). In E-II, the effect of wintering altitude on beehive weight was statistically significant (p < 0.05), while its effect on NCCB was statistically insignificant (p > 0.05). The wintering losses in the highland and lowland groups in E-I were 25% and 62.5%, respectively; in contrast, no losses were observed in E-II at either altitude. In E-I, the wintering losses for the OSF and NBF groups were the same (43.75%). In conclusion, under subtropical climatic conditions, and allowing for variation from year to year, honeybee colonies can be wintered more successfully in highland areas with the OSF bottom board type.
The Declining Effects of OSHA Inspections on Manufacturing Injuries: 1979 to 1998
Wayne B. Gray; John Mendeloff
2002-01-01
This study examines the impact of OSHA inspections on injuries in manufacturing plants. The authors use the same model and some of the same plant-level data employed by several earlier studies that found large effects of OSHA inspections on injuries for 1979-85. These new estimates indicate that an OSHA inspection imposing a penalty reduced lost-workday injuries by about 19% in 1979-85, but that this effect fell to 11% in 1987-91, and to a statistically insignificant 1% in 1992-98.
AN EXAMINATION OF THE LEVERAGE EFFECT IN THE ISE WITH STOCHASTIC VOLATILITY MODEL
Directory of Open Access Journals (Sweden)
YELİZ YALÇIN
2013-06-01
The purpose of this paper is to assess the leverage effect in the Istanbul Stock Exchange within the Stochastic Volatility framework over the period 01.01.1990 - 11.08.2006. The relationship between risk and return is a well-established phenomenon in financial econometrics, and both positive and negative relationships have been reported in the empirical literature. In contrast to specifications based on the conditional variance, the empirical evidence provided in this paper from the Stochastic Volatility model points to a negative feedback effect and a statistically insignificant leverage effect.
Non-extensive statistical effects in nuclear many-body problems
International Nuclear Information System (INIS)
Lavagno, A.; Quarati, P.
2007-01-01
Density and temperature conditions in many stellar cores and in the first stage of relativistic heavy-ion collisions imply the presence of non-ideal plasma effects with memory and long-range interactions between particles. Recent progress in statistical mechanics indicates that Tsallis non-extensive thermostatistics could be the natural generalization of standard classical and quantum statistics when memory effects and long-range forces are not negligible. In this framework, we show that in a weakly non-ideal plasma, non-extensive effects should be taken into account to derive the equilibrium distribution functions, the quantum fluctuations, and the correlations between particles. The strong influence of these effects is discussed in the context of solar plasma physics and high-energy nucleus-nucleus collision experiments. Although the deviation from Boltzmann-Gibbs statistics is very small in both cases, the stellar plasma and the hadronic gas are strongly influenced by the non-extensive features, and the discrepancies between experimental data and theoretical predictions are considerably reduced.
Peripheral vascular effects on auscultatory blood pressure measurement.
Rabbany, S Y; Drzewiecki, G M; Noordergraaf, A
1993-01-01
Experiments were conducted to examine the accuracy of the conventional auscultatory method of blood pressure measurement. The influence of the physiologic state of the vascular system in the forearm distal to the site of Korotkoff sound recording, and its impact on the precision of the measured blood pressure, is discussed. The peripheral resistance in the arm distal to the cuff was changed noninvasively by heating and cooling and by induction of reactive hyperemia. All interventions were preceded by an investigation of their effect on central blood pressure to distinguish local effects from changes in central blood pressure. These interventions were sufficiently moderate to make their effect on central blood pressure, recorded in the other arm, statistically insignificant, whereas the pressure change observed during the cooling experiments was statistically significant (p < 0.001). Moreover, both the measured systolic (p < 0.004) and diastolic (p < 0.001) pressure decreases during the reactive hyperemia experiments were statistically significant. The findings demonstrate that alteration in vascular state generates perplexing changes in blood pressure, hence confirming experimental observations by earlier investigators as well as predictions by our model studies.
Roy, Anindita Singha; Bandyopadhyay, Amit
2015-01-01
The present study was aimed at investigating the effects of sleep deprivation and dietary irregularities during Ramadan intermittent fasting (RIF) on selective fitness profile parameters in young untrained male Muslim individuals. 77 untrained Muslim men were recruited in the study. They were divided into the experimental group (EG; n=37, age: 22.62±1.77 years) and the control group (CG; n=40, age: 23.00±1.48 years). EG was undergoing RIF while CG abstained. Aerobic fitness, anaerobic capacity or high-intensity efforts (HIEs), agility, flexibility, vertical jump height and handgrip strength were measured on 8 separate occasions: 15 days before RIF, 7 days before RIF, the 1st day of RIF, the 7th day of RIF, the 15th day of RIF, the 21st day of RIF, the last day of RIF and 15 days after RIF. Aerobic fitness and HIEs showed a significant difference (p<0.05) during RIF in EG. Agility and flexibility scores showed a significant decrease in EG during RIF, whereas changes in the vertical jump score (VJT) and handgrip strength were statistically insignificant. The studied parameters showed insignificant variation in CG during RIF. Aerobic fitness, HIEs, agility and flexibility showed significant intergroup variation during the different experimental trials. The present investigation revealed that RIF had adverse effects on the aerobic fitness, HIEs, agility and flexibility of young untrained Muslims of Kolkata, India. VJT, waist-hip ratio and handgrip strength were not affected by RIF in the studied population. A mild but statistically insignificant reduction in body mass was also observed after the mid-Ramadan week.
Kwak, Dong Shin; Lee, Oh Young; Lee, Kang Nyeong; Jun, Dae Won; Lee, Hang Lak; Yoon, Byung Chul; Choi, Ho Soon
2016-05-23
DA-6034 has anti-inflammatory activities and exhibits cytoprotective effects in acute gastric injury models. However, explanations for the protective effects of DA-6034 on intestinal permeability are limited. This study sought to investigate the effect of DA-6034 on intestinal permeability in an indomethacin-induced small intestinal injury model and its protective effect against small intestinal injury. Rats in the treatment group received DA-6034 from days 0 to 2 and indomethacin from days 1 to 2. Rats in the control group received indomethacin from days 1 to 2. On the fourth day, the small intestines were examined to compare the severity of inflammation. Intestinal permeability was evaluated by using fluorescein isothiocyanate-labeled dextran. Western blotting was performed to confirm the association between DA-6034 and the extracellular signal-regulated kinase (ERK) pathway. The inflammation scores in the treatment group were lower than those in the control group, but the difference was statistically insignificant. Hemorrhagic lesions in the treatment group were broader than those in the control group, but the difference was statistically insignificant. Intestinal permeability was lower in the treatment group than in the control group. DA-6034 enhanced extracellular signal-regulated kinase expression, and intestinal permeability was negatively correlated with ERK expression. DA-6034 may decrease intestinal permeability in an indomethacin-induced intestinal injury model via the ERK pathway.
Statistical analysis of AFM topographic images of self-assembled quantum dots
Energy Technology Data Exchange (ETDEWEB)
Sevriuk, V. A.; Brunkov, P. N., E-mail: brunkov@mail.ioffe.ru; Shalnev, I. V.; Gutkin, A. A.; Klimko, G. V.; Gronin, S. V.; Sorokin, S. V.; Konnikov, S. G. [Russian Academy of Sciences, Ioffe Physical-Technical Institute (Russian Federation)
2013-07-15
To obtain statistical data on quantum-dot sizes, AFM topographic images of the substrate on which the dots under study are grown are analyzed. Because the substrate is non-ideal, with height differences on the order of the nanoparticle size at distances of 1-10 μm, and because closely spaced dots are insufficiently resolved owing to the finite curvature radius of the AFM probe, automating the statistical analysis of a large dot array requires special techniques for processing the topographic images so that a fraction of the particles is not lost, as happens in conventional processing. As such a technique, convolution of the initial matrix of the AFM image with a specially selected matrix is used. This makes it possible to determine the position of each nanoparticle and, using the initial matrix, to measure its geometrical parameters. The results of a statistical analysis by this method of self-assembled InAs quantum dots formed on the surface of an AlGaAs epitaxial layer are presented. It is shown that their concentration, average size, and the half-width of the height distribution depend strongly on the In flow and the total amount of deposited InAs, which are varied within insignificant limits.
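The convolution-based particle detection described in the abstract above can be sketched generically. This is a matched-filter illustration in Python (NumPy/SciPy); the Gaussian kernel, its size, and the threshold are assumptions for the sketch, not the authors' specially selected matrix.

```python
import numpy as np
from scipy.ndimage import convolve, maximum_filter

def detect_dots(height, kernel_size=7, sigma=1.5, threshold=0.5):
    """Locate dot-like features in an AFM height map by convolving it with a
    zero-mean Gaussian-blob kernel and keeping local maxima above a threshold.
    The zero-mean kernel suppresses slow substrate height variations."""
    ax = np.arange(kernel_size) - kernel_size // 2
    xx, yy = np.meshgrid(ax, ax)
    kernel = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    kernel -= kernel.mean()                      # zero mean: kills constant/linear background
    response = convolve(height, kernel, mode="nearest")
    peaks = (response == maximum_filter(response, size=kernel_size)) & \
            (response > threshold * response.max())
    return np.argwhere(peaks)                    # (row, col) of each detected dot
```

Once positions are known, the geometrical parameters of each particle can be read off the original (unconvolved) height matrix around each detected coordinate, as the abstract describes.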
Statistical projection effects in a hydrodynamic pilot-wave system
Sáenz, Pedro J.; Cristea-Platon, Tudor; Bush, John W. M.
2018-03-01
Millimetric liquid droplets can walk across the surface of a vibrating fluid bath, self-propelled through a resonant interaction with their own guiding or `pilot' wave fields. These walking droplets, or `walkers', exhibit several features previously thought to be peculiar to the microscopic, quantum realm. In particular, walkers confined to circular corrals manifest a wave-like statistical behaviour reminiscent of that of electrons in quantum corrals. Here we demonstrate that localized topological inhomogeneities in an elliptical corral may lead to resonant projection effects in the walker's statistics similar to those reported in quantum corrals. Specifically, we show that a submerged circular well may drive the walker to excite specific eigenmodes in the bath that result in drastic changes in the particle's statistical behaviour. The well tends to attract the walker, leading to a local peak in the walker's position histogram. By placing the well at one of the foci, a mode with maxima near the foci is preferentially excited, leading to a projection effect in the walker's position histogram towards the empty focus, an effect strongly reminiscent of the quantum mirage. Finally, we demonstrate that the mean pilot-wave field has the same form as the histogram describing the walker's statistics.
Effect size, confidence intervals and statistical power in psychological research.
Directory of Open Access Journals (Sweden)
Téllez A.
2015-07-01
Quantitative psychological research is focused on detecting the occurrence of certain population phenomena by analyzing data from a sample, and statistics is a particularly helpful mathematical tool that researchers use to evaluate hypotheses and decide whether to accept or reject them. In this paper, the various statistical tools used in psychological research are reviewed. The limitations of null hypothesis significance testing (NHST) and the advantages of using effect sizes and their respective confidence intervals are explained, as the latter two measurements can provide important information about the results of a study. These measurements can also facilitate data interpretation and the detection of trivial effects, enabling researchers to make decisions in a more clinically relevant fashion. Moreover, it is recommended to establish an appropriate sample size by calculating the optimum statistical power at the moment the research is designed. Psychological journal editors are encouraged to follow APA recommendations strictly and to ask authors of original research studies to report the effect size, its confidence intervals, statistical power and, when required, a measure of clinical significance. Additionally, we must attend to the teaching of statistics at the graduate level: students do not receive sufficient information concerning the importance of using different types of effect sizes and their confidence intervals according to the different types of research designs; instead, most of the instruction is focused on the various tools of NHST.
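The effect-size-with-confidence-interval reporting that the review advocates can be illustrated with Cohen's d. The formulas below are standard textbook expressions (pooled standard deviation and a normal-approximation standard error), not taken from the paper itself.

```python
import numpy as np

def cohens_d_ci(x, y, z=1.96):
    """Cohen's d for two independent samples with an approximate 95% CI.
    Uses the common normal-approximation standard error for d; a sketch of
    the kind of reporting the review recommends, not an exact method."""
    nx, ny = len(x), len(y)
    sp = np.sqrt(((nx - 1) * np.var(x, ddof=1) + (ny - 1) * np.var(y, ddof=1))
                 / (nx + ny - 2))                # pooled standard deviation
    d = (np.mean(x) - np.mean(y)) / sp
    se = np.sqrt((nx + ny) / (nx * ny) + d**2 / (2 * (nx + ny)))
    return d, (d - z * se, d + z * se)
```

With small samples the interval is wide and typically spans zero, which makes the triviality (or not) of an effect far more visible than a lone p-value.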
Mavukkandy, Musthafa Odayooth; Karmakar, Subhankar; Harikumar, P S
2014-09-01
The establishment of an efficient surface water quality monitoring (WQM) network is a critical component in the assessment, restoration and protection of river water quality. A periodic evaluation of monitoring network is mandatory to ensure effective data collection and possible redesigning of existing network in a river catchment. In this study, the efficacy and appropriateness of existing water quality monitoring network in the Kabbini River basin of Kerala, India is presented. Significant multivariate statistical techniques like principal component analysis (PCA) and principal factor analysis (PFA) have been employed to evaluate the efficiency of the surface water quality monitoring network with monitoring stations as the evaluated variables for the interpretation of complex data matrix of the river basin. The main objective is to identify significant monitoring stations that must essentially be included in assessing annual and seasonal variations of river water quality. Moreover, the significance of seasonal redesign of the monitoring network was also investigated to capture valuable information on water quality from the network. Results identified few monitoring stations as insignificant in explaining the annual variance of the dataset. Moreover, the seasonal redesign of the monitoring network through a multivariate statistical framework was found to capture valuable information from the system, thus making the network more efficient. Cluster analysis (CA) classified the sampling sites into different groups based on similarity in water quality characteristics. The PCA/PFA identified significant latent factors standing for different pollution sources such as organic pollution, industrial pollution, diffuse pollution and faecal contamination. Thus, the present study illustrates that various multivariate statistical techniques can be effectively employed in sustainable management of water resources. The effectiveness of existing river water quality monitoring
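The PCA step used to judge how much each monitoring station contributes to the explained variance can be sketched generically. The function below is a plain SVD-based PCA on standardized data; the study's actual PCA/PFA workflow (factor rotation, retention rules, CA grouping) is not reproduced here.

```python
import numpy as np

def pca_loadings(data):
    """PCA via SVD on column-standardized data. Returns explained-variance
    ratios and component loadings. Columns (e.g., monitoring stations) with
    uniformly small loadings on the leading components explain little of the
    dataset's variance -- the kind of evidence used to flag redundant sites."""
    z = (data - data.mean(0)) / data.std(0, ddof=1)
    u, s, vt = np.linalg.svd(z, full_matrices=False)
    evr = s**2 / np.sum(s**2)    # explained-variance ratio per component
    return evr, vt               # rows of vt: loadings of each component
```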
An R2 statistic for fixed effects in the linear mixed model.
Edwards, Lloyd J; Muller, Keith E; Wolfinger, Russell D; Qaqish, Bahjat F; Schabenberger, Oliver
2008-12-20
Statisticians most often use the linear mixed model to analyze Gaussian longitudinal data. The value and familiarity of the R2 statistic in the linear univariate model naturally create great interest in extending it to the linear mixed model. We define and describe how to compute a model R2 statistic for the linear mixed model by using only a single model. The proposed R2 statistic measures multivariate association between the repeated outcomes and the fixed effects in the linear mixed model. The R2 statistic arises as a one-to-one function of an appropriate F statistic for testing all fixed effects (except typically the intercept) in a full model. The statistic compares the full model with a null model with all fixed effects deleted (except typically the intercept) while retaining exactly the same covariance structure. Furthermore, the R2 statistic leads immediately to a natural definition of a partial R2 statistic. A mixed model in which ethnicity gives a very small p-value as a longitudinal predictor of blood pressure (BP) compellingly illustrates the value of the statistic. In sharp contrast to the extreme p-value, a very small R2, a measure of statistical and scientific importance, indicates that ethnicity has an almost negligible association with the repeated BP outcomes for the study.
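In the univariate linear model the overall F statistic and R2 are linked one-to-one by R2 = qF/(qF + v2), where q and v2 are the numerator and denominator degrees of freedom; the paper's mixed-model statistic is built in the same spirit. A minimal sketch of the classical identity only (not the authors' full mixed-model estimator):

```python
def r2_from_f(F, num_df, den_df):
    """Univariate identity R^2 = q*F / (q*F + v2) relating the overall F test
    to R^2. Shown for intuition; the mixed-model extension in the paper uses
    an appropriate F statistic with the covariance structure held fixed."""
    return (num_df * F) / (num_df * F + den_df)
```

This mapping makes the paper's point concrete: a huge F (tiny p-value) with a large denominator df can still correspond to a very small R2.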
El-Basyouny, Karim; Barua, Sudip; Islam, Md Tazul
2014-12-01
Previous research shows that various weather elements have significant effects on crash occurrence and risk; however, little is known about how these elements affect different crash types. Consequently, this study investigates the impact of weather elements and sudden extreme snow or rain weather changes on crash type. Multivariate models were used for seven crash types using five years of daily weather and crash data collected for the entire City of Edmonton. In addition, the yearly trend and random variation of parameters across the years were analyzed by using four different modeling formulations. The proposed models were estimated in a full Bayesian context via Markov Chain Monte Carlo simulation. The multivariate Poisson lognormal model with yearly varying coefficients provided the best fit for the data according to Deviance Information Criteria. Overall, results showed that temperature and snowfall were statistically significant with intuitive signs (crashes decrease with increasing temperature; crashes increase as snowfall intensity increases) for all crash types, while rainfall was mostly insignificant. Previous snow showed mixed results, being statistically significant and positively related to certain crash types, while negatively related or insignificant in other cases. Maximum wind gust speed was found mostly insignificant with a few exceptions that were positively related to crash type. Major snow or rain events following a dry weather condition were highly significant and positively related to three crash types: Follow-Too-Close, Stop-Sign-Violation, and Ran-Off-Road crashes. The day-of-the-week dummy variables were statistically significant, indicating a possible weekly variation in exposure. Transportation authorities might use the above results to improve road safety by providing drivers with information regarding the risk of certain crash types for a particular weather condition. Copyright © 2014 Elsevier Ltd. All rights reserved.
International Nuclear Information System (INIS)
Yamamoto, Yoshinobu; Kunugi, Tomoaki; Serizawa, Akimi
2002-01-01
In this study, the effects of high-frequency fluctuations on DNS of turbulent open-channel flows with high-Pr passive scalar transport were investigated. The results show that, although significant differences in energy-spectrum behavior arise in the temperature field in the high-wave-number region, a region that is insignificant for the velocity components, no large differences arise in the mean and statistical behavior of the temperature component. If buoyancy were considered, however, these high-frequency temperature fluctuations could greatly alter the mean and statistical behavior, owing to differences in accuracy and resolution in the high-wave-number region. (author)
Statistical identification of effective input variables
International Nuclear Information System (INIS)
Vaurio, J.K.
1982-09-01
A statistical sensitivity analysis procedure has been developed for ranking the input data of large computer codes in order of sensitivity-importance. The method is economical for large codes with many input variables, since it uses a relatively small number of computer runs. No prior judgemental elimination of input variables is needed. The screening method is based on stagewise correlation and extensive regression analysis of output values calculated with selected input value combinations. The regression process deals with multivariate nonlinear functions, and statistical tests are also available for identifying input variables that contribute to threshold effects, i.e., discontinuities in the output variables. A computer code SCREEN has been developed for implementing the screening techniques. The efficiency has been demonstrated by several examples and applied to a fast reactor safety analysis code (Venus-II). However, the methods and the coding are general and not limited to such applications.
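The first, stagewise-correlation step of such a screening procedure can be sketched as ranking inputs by their absolute correlation with the output. This is a generic illustration, not the SCREEN code's full method, which adds regression stages and threshold tests:

```python
import numpy as np

def rank_inputs(X, y):
    """Rank input variables (columns of X) by absolute correlation with the
    output y -- the first pass of a correlation-based screening procedure.
    Sketch only; nonlinear and threshold effects need the later stages."""
    r = np.array([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
    order = np.argsort(-np.abs(r))    # most influential input first
    return order, r
```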
Effects of Hydrological Parameters on Palm Oil Fresh Fruit Bunch Yield)
Nda, M.; Adnan, M. S.; Suhadak, M. A.; Zakaria, M. S.; Lopa, R. T.
2018-04-01
Climate change effects and variability have been studied by many researchers in diverse geophysical fields. Malaysia produces a large volume of palm oil, and the effects of climate change on hydrological parameters (rainfall and temperature) could have adverse effects on palm oil fresh fruit bunch (FFB) production, with implications for both local and international markets. It is important to understand the effects of climate change on crop yield in order to adopt new cultivation techniques and guarantee food security globally. Against this background, the paper's objective is to investigate the effects of rainfall and temperature patterns on crop yield (FFB) over a five-year period (2013-2017) in Batu Pahat District. The Mann-Kendall rank technique (trend test) and statistical analyses (correlation and regression) were applied to the dataset used for the study. The results reveal that rainfall and temperature vary from one month to the next, and the statistical analysis shows that the hydrological parameters have an insignificant effect on crop yield.
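The Mann-Kendall trend test applied in the study can be sketched in a few lines. The version below uses the standard S statistic and the normal-approximation Z without the tie correction, so it is an illustration of the textbook test rather than the authors' exact implementation:

```python
import numpy as np

def mann_kendall(series):
    """Mann-Kendall trend test: S statistic and normal-approximation Z
    (no tie correction). |Z| > 1.96 indicates a trend at the 5% level."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0     # variance of S under no trend
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)             # continuity correction
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    return s, z
```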
Effective control of complex turbulent dynamical systems through statistical functionals.
Majda, Andrew J; Qi, Di
2017-05-30
Turbulent dynamical systems characterized by both a high-dimensional phase space and a large number of instabilities are ubiquitous among complex systems in science and engineering, including climate, material, and neural science. Control of these complex systems is a grand challenge, for example, in mitigating the effects of climate change or safe design of technology with fully developed shear turbulence. Control of flows in the transition to turbulence, where there is a small dimension of instabilities about a basic mean state, is an important and successful discipline. In complex turbulent dynamical systems, it is impossible to track and control the large dimension of instabilities, which strongly interact and exchange energy, and new control strategies are needed. The goal of this paper is to propose an effective statistical control strategy for complex turbulent dynamical systems based on a recent statistical energy principle and statistical linear response theory. We illustrate the potential practical efficiency and verify this effective statistical control strategy on the 40D Lorenz 1996 model in forcing regimes with various types of fully turbulent dynamics with nearly one-half of the phase space unstable.
Statistical Study of Transformation Changes in the Ukrainian Economy
Directory of Open Access Journals (Sweden)
O. V.
2017-12-01
The article deals with the economic diagnostics of some important macroeconomic indicators of Ukraine, intended to reveal the nature and speed of the economic transformation. During 2003–2007 the Ukrainian economy grew at an impressive pace. At present, however, the country is undergoing a period of serious trials and needs to address structural problems that endanger long-term economic growth. The way out of the current situation should be realization of the growth potential of advanced sectors and an increase of productivity across the national economy. Special attention should be paid to the transition from extractive institutions to inclusive ones. Key factors in accelerating the Ukrainian economy are a more vigorous fight against corruption and the attraction of investment. A set of institutional variables is proposed that allows a more thorough assessment of the nature of economic transformation in Ukraine and detection of its deviations: the transformation of the national economy occurs at different speeds. Along with the traditional shifts in the structure of GDP (the dominating share of services), there is still an insignificant statistical effect of such important institutional categories as the level of political globalization, the control of corruption, the level of property rights protection, the rule of law, and the level of social globalization.
Aspects of statistical spectroscopy relevant to effective-interaction theory
International Nuclear Information System (INIS)
French, J.B.
1975-01-01
The three aspects of statistical spectroscopy discussed in this paper are: the information content of complex spectra; procedures for spectroscopy in huge model spaces, useful in effective-interaction theory; and practical ways of identifying and calculating measurable parameters of the effective Hamiltonian and other operators, and of comparing different effective Hamiltonians.
Directory of Open Access Journals (Sweden)
Mustafa Karakaya
1996-01-01
This research was conducted under laboratory conditions. Different levels of K2HPO4 (0.00%, 0.25%, 0.30%) and NaCl (2.5%, 3.0%) were applied to goat meat, and the pH, water holding capacity and cooking loss were observed. According to the results, the effect of 0.25% K2HPO4 addition was found statistically insignificant (p
New selection effect of statistical investigations of supernova remnants
International Nuclear Information System (INIS)
Allakhverdiev, A.O.; Gusejnov, O.Kh.; Kasumov, F.K.
1986-01-01
The influence of H II regions on the parameters of supernova remnants (SNR) is investigated. It has been shown that the projection of such regions on the SNRs leads to local changes of the morphological structure of young shell-type SNRs and considerable distortions of the integral parameters of evolved shell-type SNRs (with D > 10 pc) and plerions, up to their complete undetectability against the background of classical and gigantic H II regions. A new selection effect, in fact, arises from these factors, connected with the additional limitations imposed by the real structure of the interstellar medium on statistical investigations of SNRs. The influence of this effect on the statistical completeness of objects has been estimated.
Effect of crack orientation statistics on effective stiffness of microcracked solid
DEFF Research Database (Denmark)
Kushch, V.I.; Sevostianov, I.; Mishnaevsky, Leon
2009-01-01
provides a reduction of the boundary-value problem to an ordinary, well-posed set of linear algebraic equations. The exact finite-form expression of the effective stiffness tensor has been obtained by analytically averaging the strain and stress fields. The convergence study has been performed: the statistically...
Tellinghuisen, Joel
2008-01-01
The method of least squares is probably the most powerful data analysis tool available to scientists. Toward a fuller appreciation of that power, this work begins with an elementary review of statistics fundamentals, and then progressively increases in sophistication as the coverage is extended to the theory and practice of linear and nonlinear least squares. The results are illustrated in application to data analysis problems important in the life sciences. The review of fundamentals includes the role of sampling and its connection to probability distributions, the Central Limit Theorem, and the importance of finite variance. Linear least squares are presented using matrix notation, and the significance of the key probability distributions-Gaussian, chi-square, and t-is illustrated with Monte Carlo calculations. The meaning of correlation is discussed, including its role in the propagation of error. When the data themselves are correlated, special methods are needed for the fitting, as they are also when fitting with constraints. Nonlinear fitting gives rise to nonnormal parameter distributions, but the 10% Rule of Thumb suggests that such problems will be insignificant when the parameter is sufficiently well determined. Illustrations include calibration with linear and nonlinear response functions, the dangers inherent in fitting inverted data (e.g., Lineweaver-Burk equation), an analysis of the reliability of the van't Hoff analysis, the problem of correlated data in the Guggenheim method, and the optimization of isothermal titration calorimetry procedures using the variance-covariance matrix for experiment design. The work concludes with illustrations on assessing and presenting results.
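The matrix-notation linear least squares that the review presents can be condensed to the familiar normal-equations estimator with its variance-covariance matrix. A minimal sketch of the ordinary, unweighted case (the review also covers weighted, correlated, and nonlinear fits, which this does not attempt):

```python
import numpy as np

def linear_lsq(X, y):
    """Ordinary least squares in matrix notation: beta = (X'X)^-1 X'y, with
    parameter standard errors from the variance-covariance matrix
    s^2 (X'X)^-1. Standard textbook estimator, shown for illustration."""
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y
    resid = y - X @ beta
    dof = len(y) - X.shape[1]
    s2 = resid @ resid / dof                   # residual variance estimate
    se = np.sqrt(np.diag(s2 * XtX_inv))        # standard errors of parameters
    return beta, se
```

The off-diagonal elements of s2 * XtX_inv are the parameter covariances that the review's error-propagation and experiment-design discussions rely on.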
Effect of model choice and sample size on statistical tolerance limits
International Nuclear Information System (INIS)
Duran, B.S.; Campbell, K.
1980-03-01
Statistical tolerance limits are estimates of large (or small) quantiles of a distribution, quantities which are very sensitive to the shape of the tail of the distribution. The exact nature of this tail behavior cannot be ascertained from small samples, so statistical tolerance limits are frequently computed using a statistical model chosen on the basis of theoretical considerations or prior experience with similar populations. This report illustrates the effects of such choices on the computations.
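The sensitivity to model choice can be seen even with simple plug-in percentile estimates. (A true one-sided tolerance limit would replace the 1.645 quantile with a confidence-dependent k factor; that refinement is omitted in this sketch.) On skewed data, a normal and a lognormal model give very different upper quantiles:

```python
import numpy as np

def q95_normal(x):
    """Plug-in estimate of the 95th percentile under a normal model."""
    return np.mean(x) + 1.645 * np.std(x, ddof=1)

def q95_lognormal(x):
    """Same quantile under a lognormal model: fit on the log scale,
    then transform back. On right-skewed data this lands far above
    the normal-model estimate -- the model-choice effect in miniature."""
    lx = np.log(x)
    return np.exp(np.mean(lx) + 1.645 * np.std(lx, ddof=1))
```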
Visualizing Statistical Mix Effects and Simpson's Paradox.
Armstrong, Zan; Wattenberg, Martin
2014-12-01
We discuss how "mix effects" can surprise users of visualizations and potentially lead them to incorrect conclusions. This statistical issue (also known as "omitted variable bias" or, in extreme cases, as "Simpson's paradox") is widespread and can affect any visualization in which the quantity of interest is an aggregated value such as a weighted sum or average. Our first contribution is to document how mix effects can be a serious issue for visualizations, and we analyze how mix effects can cause problems in a variety of popular visualization techniques, from bar charts to treemaps. Our second contribution is a new technique, the "comet chart," that is meant to ameliorate some of these issues.
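A compact numeric illustration of the mix effect, using the classic kidney-stone treatment data (Charig et al.) rather than anything from the paper itself: treatment A has the higher success rate in every subgroup yet the lower rate overall, because the treatments were applied to differently weighted mixes of easy and hard cases.

```python
# Classic Simpson's-paradox numbers, format: (successes, trials).
groups = {
    "small_stones": {"A": (81, 87), "B": (234, 270)},
    "large_stones": {"A": (192, 263), "B": (55, 80)},
}

def rate(successes, trials):
    return successes / trials

# Treatment A wins within every subgroup...
for arms in groups.values():
    assert rate(*arms["A"]) > rate(*arms["B"])

# ...yet loses once the subgroups are aggregated (the "mix effect"):
tot = {arm: tuple(map(sum, zip(*(groups[g][arm] for g in groups))))
       for arm in ("A", "B")}
assert rate(*tot["A"]) < rate(*tot["B"])
```

Any visualization that plots only the aggregated rates would show B ahead and hide the subgroup reversal, which is exactly the failure mode the paper's comet chart is designed to expose.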
Empirical and Statistical Evaluation of the Effectiveness of Four ...
African Journals Online (AJOL)
Akorede
ABSTRACT: Data compression is the process of reducing the size of a file to effectively ... Through the statistical analysis performed using Boxplot and ANOVA and comparison made ...
Mask effects on cosmological studies with weak-lensing peak statistics
International Nuclear Information System (INIS)
Liu, Xiangkun; Pan, Chuzhong; Fan, Zuhui; Wang, Qiao
2014-01-01
With numerical simulations, we analyze in detail how the bad data removal, i.e., the mask effect, can influence the peak statistics of the weak-lensing convergence field reconstructed from the shear measurement of background galaxies. It is found that high peak fractions are systematically enhanced because of the presence of masks; the larger the masked area is, the higher the enhancement is. In the case where the total masked area is about 13% of the survey area, the fraction of peaks with signal-to-noise ratio ν ≥ 3 is ∼11% of the total number of peaks, compared with ∼7% of the mask-free case in our considered cosmological model. This can have significant effects on cosmological studies with weak-lensing convergence peak statistics, inducing a large bias in the parameter constraints if the effects are not taken into account properly. Even for a survey area of 9 deg², the bias in (Ω_m, σ_8) is already intolerably large and close to 3σ. It is noted that most of the affected peaks are close to the masked regions. Therefore, excluding peaks in those regions in the peak statistics can reduce the bias effect but at the expense of losing usable survey areas. Further investigations find that the enhancement of the number of high peaks around the masked regions can be largely attributed to the smaller number of galaxies usable in the weak-lensing convergence reconstruction, leading to higher noise than that of the areas away from the masks. We thus develop a model in which we exclude only those very large masks with radius larger than 3' but keep all the other masked regions in peak counting statistics. For the remaining part, we treat the areas close to and away from the masked regions separately with different noise levels. It is shown that this two-noise-level model can account for the mask effect on peak statistics very well, and the bias in cosmological parameters is significantly reduced if this model is applied in the parameter fitting.
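The two-noise-level idea can be sketched with a toy Monte Carlo: if peak significance is computed against a single global noise level, regions with fewer usable galaxies (higher true noise) contribute a disproportionate share of apparently high-significance peaks. All numbers are illustrative, not taken from the simulations in the paper:

```python
import random

random.seed(42)

# Noise sigma away from masks vs near masks (fewer galaxies -> higher noise);
# peaks are thresholded in units of the GLOBAL (mask-free) sigma.
sigma_away, sigma_near = 1.0, 1.3
nu_threshold = 3.0
n_samples = 200_000

frac_away = sum(abs(random.gauss(0, sigma_away)) > nu_threshold
                for _ in range(n_samples)) / n_samples
frac_near = sum(abs(random.gauss(0, sigma_near)) > nu_threshold
                for _ in range(n_samples)) / n_samples
```

Even this crude sketch reproduces the qualitative effect: the near-mask fraction of "3-sigma" excursions is several times the away-from-mask fraction, which is why modeling the two noise levels separately removes most of the bias.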
[Effect sizes, statistical power and sample sizes in "the Japanese Journal of Psychology"].
Suzukawa, Yumi; Toyoda, Hideki
2012-04-01
This study analyzed the statistical power of research studies published in the "Japanese Journal of Psychology" in 2008 and 2009. Sample effect sizes and sample statistical powers were calculated for each statistical test and analyzed with respect to the analytical methods and the fields of the studies. The results show that in the fields like perception, cognition or learning, the effect sizes were relatively large, although the sample sizes were small. At the same time, because of the small sample sizes, some meaningful effects could not be detected. In the other fields, because of the large sample sizes, meaningless effects could be detected. This implies that researchers who could not get large enough effect sizes would use larger samples to obtain significant results.
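The interplay the authors describe, small samples missing real effects and large samples flagging trivial ones, follows directly from the power function. A normal-approximation sketch (not the exact t-test power such surveys compute):

```python
import math

def normal_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def power_two_sample(d, n, alpha=0.05):
    """Approximate power of a two-sided two-sample comparison for
    standardized effect size d with n subjects per group
    (normal approximation to the noncentral t)."""
    z_crit = 1.959964  # critical z for alpha = 0.05, two-sided
    ncp = d * math.sqrt(n / 2.0)  # noncentrality of the test statistic
    return 1.0 - normal_cdf(z_crit - ncp)

# Small samples detect large effects reasonably well...
power_small_n_large_d = power_two_sample(d=0.8, n=20)
# ...but miss small effects almost entirely...
power_small_n_small_d = power_two_sample(d=0.2, n=20)
# ...while large samples flag even trivial effects as "significant".
power_large_n_small_d = power_two_sample(d=0.2, n=800)
```

This is the pattern reported for the journal: perception/cognition studies (large d, small n) sit in the first regime, and large-n fields sit in the third.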
Maulik, Ujjwal; Mallik, Saurav; Mukhopadhyay, Anirban; Bandyopadhyay, Sanghamitra
2015-01-01
Microarray and beadchip are two most efficient techniques for measuring gene expression and methylation data in bioinformatics. Biclustering deals with the simultaneous clustering of genes and samples. In this article, we propose a computational rule mining framework, StatBicRM (i.e., statistical biclustering-based rule mining) to identify special type of rules and potential biomarkers using integrated approaches of statistical and binary inclusion-maximal biclustering techniques from the biological datasets. At first, a novel statistical strategy has been utilized to eliminate the insignificant/low-significant/redundant genes in such way that significance level must satisfy the data distribution property (viz., either normal distribution or non-normal distribution). The data is then discretized and post-discretized, consecutively. Thereafter, the biclustering technique is applied to identify maximal frequent closed homogeneous itemsets. Corresponding special type of rules are then extracted from the selected itemsets. Our proposed rule mining method performs better than the other rule mining algorithms as it generates maximal frequent closed homogeneous itemsets instead of frequent itemsets. Thus, it saves elapsed time, and can work on big dataset. Pathway and Gene Ontology analyses are conducted on the genes of the evolved rules using David database. Frequency analysis of the genes appearing in the evolved rules is performed to determine potential biomarkers. Furthermore, we also classify the data to know how much the evolved rules are able to describe accurately the remaining test (unknown) data. Subsequently, we also compare the average classification accuracy, and other related factors with other rule-based classifiers. Statistical significance tests are also performed for verifying the statistical relevance of the comparative results. Here, each of the other rule mining methods or rule-based classifiers is also starting with the same post-discretized data
Directory of Open Access Journals (Sweden)
R. Eric Heidel
2016-01-01
Full Text Available Statistical power is the ability to detect a significant effect, given that the effect actually exists in a population. Like most statistical concepts, statistical power tends to induce cognitive dissonance in hepatology researchers. However, planning for statistical power by an a priori sample size calculation is of paramount importance when designing a research study. There are five specific empirical components that make up an a priori sample size calculation: the scale of measurement of the outcome, the research design, the magnitude of the effect size, the variance of the effect size, and the sample size. A framework grounded in the phenomenon of isomorphism, or interdependencies amongst different constructs with similar forms, will be presented to understand the isomorphic effects of decisions made on each of the five aforementioned components of statistical power.
Effective viscosity of dispersions approached by a statistical continuum method
Mellema, J.; Willemse, M.W.M.
1983-01-01
The problem of the determination of the effective viscosity of disperse systems (emulsions, suspensions) is considered. On the basis of the formal solution of the equations governing creeping flow in a statistically homogeneous dispersion, the effective viscosity is expressed in a series expansion
The Effect of Immunocastration on Some Meat Quality Characteristics
Directory of Open Access Journals (Sweden)
Ioana Andronie
2015-05-01
Full Text Available Surgical castration of male pigs has become increasingly less accepted because it is a direct animal welfare concern. EU welfare rules call for the method to be abandoned from 2018 in order to eliminate the stress and associated pain it induces. Immunocastration is one of the alternatives to surgical castration that ensures male pig welfare and eliminates boar taint in the meat. The objective of this research was to identify the effects of immunocastration on meat quality compared with meat from surgically castrated pigs. The animals assessed were fattening PIC pigs, grouped into two lots: surgically castrated (SC) and immunocastrated (IC). Immunization was achieved with ImprovacTM. The results showed that immunocastrated pigs recorded 60.2% carcass meat compared to 59.69% in surgically castrated pigs, a statistically insignificant difference (p≥0.05). Fat layer thickness was significantly lower in immunocastrated pigs than in the surgically castrated lot (p≤0.05). Mycoplasma hyopneumoniae lung lesions were fewer in immunocastrated pigs than in surgically castrated animals (p≥0.05). Although these differences were statistically insignificant, immunocastrated pigs displayed a lower incidence of respiratory disease than surgically castrated pigs. In this study there were no significant differences in meat quality between surgically castrated and immunocastrated pigs.
Effects of Climate Change on the Yield and Cropping Area of Major Food Crops: A Case of Bangladesh
Directory of Open Access Journals (Sweden)
Md. Ruhul Amin
2015-01-01
Full Text Available The crops that we grow for food need specific climatic conditions to perform well in terms of economic yield. A changing climate could have both beneficial and harmful effects on crops. With this in mind, this study investigates the impacts of climate change (viz. changes in maximum temperature, minimum temperature, rainfall, humidity and sunshine) on the yield and cropping area of four major food crops (viz. Aus rice, Aman rice, Boro rice and wheat) in Bangladesh. Heteroskedasticity- and autocorrelation-consistent standard error (HAC) and feasible generalized least squares (FGLS) methods were used to determine the climate-crop interrelations using national-level time series data for the period 1972-2010. Findings revealed that all the climate variables made significant contributions to the yield and cropping area of the major food crops, with distinct variation among them. Maximum temperature significantly affected the yield of all the food crops except Aus rice, and insignificantly affected the cropping area of all the crops. Minimum temperature insignificantly affected Aman rice but benefited the yield and cropping area of the other three crops. Rainfall significantly benefited the cropping area of Aus rice, but significantly harmed both the yield and cropping area of Aman rice. Humidity positively contributed to the yield of Aus and Aman rice but negatively influenced the cropping area of Aus rice. Sunshine significantly benefited only Boro rice yield. Overall, maximum temperature adversely affected the yield and cropping area of all the major food crops, and rainfall severely affected Aman rice only. Concerning climate change and food security, the respective authorities thus should give considerable attention to the generation, development and extension of drought- (all major food crops) and flood- (particularly Aman
Wolf, Elke; Zwick, Thomas
2002-01-01
"Employee participation in the capital or profit of the firm is regarded as a suitable way to increase labour productivity if the employees' performance cannot be monitored directly. Nonetheless, employee participation in asset formation is currently used by few German firms. In this paper we show that the direct effect on productivity of employee participation in asset formation is only small and is furthermore statistically insignificant. The study was conducted using the representative dat...
Statistical effect of interactions on particle creation in expanding universe
International Nuclear Information System (INIS)
Kodama, Hideo
1982-01-01
The statistical effect of interactions which drives many-particle systems toward equilibrium is expected to change the qualitative and quantitative features of particle creation in an expanding universe. To investigate this problem a simplified model called the finite-time reduction model is formulated and applied to scalar particle creation in the radiation-dominated Friedmann universe. The number density of created particles and the entropy production due to particle creation are estimated. The result for the number density is compared with that of the conventional free field theory. It is shown that the statistical effect increases the particle creation and lengthens the active creation period. As for the entropy production, it is shown to be negligible for scalar particles in the Friedmann universe. (author)
Study of energy fluctuation effect on the statistical mechanics of equilibrium systems
International Nuclear Information System (INIS)
Lysogorskiy, Yu V; Wang, Q A; Tayurskii, D A
2012-01-01
This work is devoted to the modeling of the effect of energy fluctuations on the behavior of small classical thermodynamic systems. It is known that when an equilibrium system gets smaller and smaller, one of the major quantities that becomes more and more uncertain is its internal energy. These increasing fluctuations can considerably modify the original statistics. The present model considers the effect of such energy fluctuations and is based on an overlapping between the Boltzmann-Gibbs statistics and the statistics of the fluctuation. Within this "overlap statistics", we studied the effects of several types of energy fluctuations on the probability distribution, internal energy and heat capacity. It was shown that the fluctuations can considerably change the temperature dependence of internal energy and heat capacity in the low energy range and at low temperatures. In particular, it was found that, due to the lower energy limit of the systems, the fluctuations reduce the probability of the low energy states close to the lowest energy and increase the total average energy. This energy increase is larger at lower temperatures, making negative heat capacity possible in this case.
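A toy numeric version of the overlap-statistics construction: each level's Boltzmann weight is smeared by a Gaussian energy fluctuation truncated at the system's lower energy limit. The spectrum, temperature, and fluctuation width below are illustrative choices, not the paper's model:

```python
import math

beta = 2.0                         # inverse temperature (illustrative)
delta = 0.5                        # fluctuation width (illustrative)
levels = [0.25 * n for n in range(40)]   # toy equally spaced spectrum

def smeared_weight(e):
    """Boltzmann weight smeared by a Gaussian centered at e,
    truncated at the lower energy limit E' >= 0 (trapezoid rule).
    The constant Gaussian normalization cancels in the averages."""
    lo, hi, steps = 0.0, e + 6 * delta, 2000
    h = (hi - lo) / steps
    total = 0.0
    for i in range(steps + 1):
        ep = lo + i * h
        g = math.exp(-((ep - e) ** 2) / (2 * delta ** 2))
        f = g * math.exp(-beta * ep)
        total += f * (0.5 if i in (0, steps) else 1.0)
    return total * h

w_plain = [math.exp(-beta * e) for e in levels]
w_fluct = [smeared_weight(e) for e in levels]

def mean_energy(weights):
    z = sum(weights)
    return sum(e * w for e, w in zip(levels, weights)) / z

u_plain = mean_energy(w_plain)
u_fluct = mean_energy(w_fluct)
```

Consistent with the abstract, the truncation at the lower energy limit suppresses the relative weight of the lowest states (they lose the below-limit part of their fluctuation, which carries the largest Boltzmann factors) and so raises the average energy.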
Study of the effects of photoelectron statistics on Thomson scattering data
International Nuclear Information System (INIS)
Hart, G.W.; Levinton, F.M.; McNeill, D.H.
1986-01-01
A computer code has been developed which simulates a Thomson scattering measurement, from the counting statistics of the input channels through the mathematical analysis of the data. The scattered and background signals in each of the wavelength channels are assumed to obey Poisson statistics, and the spectral data are fitted to a Gaussian curve using a nonlinear least-squares fitting algorithm. This method goes beyond the usual calculation of the signal-to-noise ratio for the hardware and gives a quantitative measure of the effect of the noise on the final measurement. This method is applicable to Thomson scattering measurements in which the signal-to-noise ratio is low due to either low signal or high background. Thomson scattering data from the S-1 spheromak have been compared to this simulation, and they have been found to be in good agreement. This code has proven to be useful in assessing the effects of counting statistics relative to shot-to-shot variability in producing the observed spread in the data. It was also useful for designing improvements for the S-1 Thomson scattering system, and this method would be applicable to any measurement affected by counting statistics
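The counting-statistics setup can be sketched directly: a background-subtracted channel measured under Poisson statistics has expected noise sqrt(S + 2B), which is why high background degrades the measurement even at fixed signal. A small Monte Carlo check (illustrative count rates; Poisson sampling via Knuth's method, since the Python standard library has no Poisson generator):

```python
import math
import random

random.seed(7)

def poisson(lam, rng=random):
    """Knuth's multiplication algorithm; adequate for the modest rates here."""
    l, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= l:
            return k
        k += 1

# One channel: measure signal+background, subtract a separately
# measured background of equal exposure. Counts are illustrative.
S, B = 50.0, 100.0
trials = 20_000
net = [poisson(S + B) - poisson(B) for _ in range(trials)]

mean_net = sum(net) / trials
var_net = sum((x - mean_net) ** 2 for x in net) / (trials - 1)
predicted_sigma = math.sqrt(S + 2 * B)   # background counts twice
empirical_sigma = math.sqrt(var_net)
```

The empirical spread of the net counts matches sqrt(S + 2B), not sqrt(S), quantifying how background-dominated channels inflate the scatter fed into the nonlinear least-squares fit.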
Directory of Open Access Journals (Sweden)
Ramya Krishnamurthy
2015-06-01
Full Text Available To evaluate the safety and efficacy of isosorbide mononitrate (IMN) as a cervical ripening agent prior to induction of labour in term pregnant women, a randomized placebo-controlled study was conducted on 100 term singleton pregnancies planned for induction of labour. The participants were randomly assigned to two groups: one group received 40 mg IMN and the other 40 mg of placebo, placed vaginally. The main outcome was the efficacy of IMN in cervical ripening, based on the change in modified Bishop score and the effect on the time from drug insertion to delivery. Safety of isosorbide mononitrate was assessed by measuring variables related to maternal and neonatal outcomes. Baseline demographic characteristics were similar in both groups. The mean change in modified Bishop score after 2 doses of 40 mg IMN was insignificant compared to placebo. Though IMN shortened the time from drug insertion to delivery compared to placebo, the difference was statistically insignificant. The need for oxytocin and a second ripening agent was lower in the IMN group than in the placebo group, but this too proved statistically insignificant. There was an increase in caesarean deliveries in the IMN group compared with the placebo group. IMN did not cause any significant change in maternal hemodynamics or adverse side effects. Though NICU admission and stay were shorter in the IMN group than in the placebo group, the difference was statistically insignificant. Thus, although IMN did not cause any maternal or neonatal adverse effects, it was found to be no more effective than placebo as a cervical ripening agent.
Statistical equilibrium calculations for silicon in early-type model stellar atmospheres
International Nuclear Information System (INIS)
Kamp, L.W.
1976-02-01
Line profiles of 36 multiplets of silicon (Si) II, III, and IV were computed for a grid of model atmospheres covering the range from 15,000 to 35,000 K in effective temperature and 2.5 to 4.5 in log (gravity). The computations involved simultaneous solution of the steady-state statistical equilibrium equations for the populations and of the equation of radiative transfer in the lines. The variables were linearized, and successive corrections were computed until a minimal accuracy of 1/1000 in the line intensities was reached. The common assumption of local thermodynamic equilibrium (LTE) was dropped. The model atmospheres used also were computed by non-LTE methods. Some effects that were incorporated into the calculations were the depression of the continuum by free electrons, hydrogen and ionized helium line blocking, and auto-ionization and dielectronic recombination, which later were found to be insignificant. Use of radiation damping and detailed electron (quadratic Stark) damping constants had small but significant effects on the strong resonance lines of Si III and IV. For weak and intermediate-strength lines, large differences with respect to LTE computations, the results of which are also presented, were found in line shapes and strengths. For the strong lines the differences are generally small, except for the models at the hot, low-gravity extreme of the range. These computations should be useful in the interpretation of the spectra of stars in the spectral range B0--B5, luminosity classes III, IV, and V
Electron Dropout Echoes Induced by Interplanetary Shock: A Statistical Study
Liu, Z.; Zong, Q.; Hao, Y.; Zhou, X.; Ma, X.; Liu, Y.
2017-12-01
"Electron dropout echoes," repeated moderate dropout and recovery signatures in the flux of energetic electrons in the outer radiation belt, have been investigated systematically. The dropout and its echoes are usually found in higher-energy (>300 keV) channels, whereas flux enhancements of lower-energy electrons are observed simultaneously after an interplanetary shock arrives at the Earth's geosynchronous orbit. 104 dropout echo events were found among 215 interplanetary shock events from 1998 to 2007, based on LANL satellite data. In analogy to substorm injections, these 104 events can be naturally divided into two categories, dispersionless (49 events) or dispersive (55 events), according to the energy dispersion of the initial dropout. It is found that locations of dispersionless events are distributed mainly in the duskside magnetosphere. Further, the initial dropout regions derived from dispersive events with the time-of-flight technique are mainly located on the duskside as well. Statistical studies show that the effects of shock normal, interplanetary magnetic field Bz, and solar wind dynamic pressure may be insignificant for these electron dropout events. We suggest that the electric field impulse induced by the IP shock produces a more pronounced inward migration of electrons on the dusk side, resulting in the observed duskside moderate dropout of electron flux and its consequent echoes.
Reporting effect sizes as a supplement to statistical significance ...
African Journals Online (AJOL)
The purpose of the article is to review the statistical significance reporting practices in reading instruction studies and to provide guidelines for when to calculate and report effect sizes in educational research. A review of six readily accessible (online) and accredited journals publishing research on reading instruction ...
The Insignificance of Major Mergers in Driving Star Formation at z approximately equal to 2
Kaviraj, S.; Cohen, S.; Windhorst, R. A.; Silk, J.; O'Connell, R. W.; Dopita, M. A.; Dekel, A.; Hathi, N. P.; Straughn, A.; Rutkowski, M.
2012-01-01
We study the significance of major mergers in driving star formation in the early Universe, by quantifying the contribution of this process to the total star formation budget in 80 massive (M* > 10^10 solar masses) galaxies at z ≈ 2. Employing visually-classified morphologies from rest-frame V-band HST imaging, we find that 55 ± 14% of the star formation budget is hosted by non-interacting late-types, with 27 ± 18% in major mergers and 18 ± 6% in spheroids. Given that a system undergoing a major merger continues to experience star formation driven by other processes at this epoch (e.g. cold accretion, minor mergers), approximately 27% is a likely upper limit for the major-merger contribution to star formation activity at this epoch. The ratio of the average specific star formation rate in major mergers to that in the non-interacting late-types is approximately 2.2:1, suggesting that the typical enhancement of star formation due to major merging is modest and that just under half the star formation in systems experiencing major mergers is unrelated to the merger itself. Taking this into account, we estimate that the actual major-merger contribution to the star formation budget may be as low as approximately 15%. While our study does not preclude a major-merger-dominated era in the very early Universe, if the major-merger contribution to star formation does not evolve significantly into larger look-back times, then this process has a relatively insignificant role in driving stellar mass assembly over cosmic time.
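The closing arithmetic can be verified in a few lines, using only values stated in the abstract:

```python
# A 2.2x specific-star-formation-rate enhancement implies that a fraction
# 1/2.2 of the star formation in merging systems would have happened
# anyway, so only (1 - 1/2.2) of the ~27% merger-hosted budget is
# actually merger-driven.
merger_hosted = 0.27
enhancement = 2.2
merger_driven = merger_hosted * (1.0 - 1.0 / enhancement)   # ~0.15
```

This recovers the abstract's ~15% estimate from its 27% upper limit and 2.2:1 enhancement ratio.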
Directory of Open Access Journals (Sweden)
Christianto V.
2007-04-01
Full Text Available In the light of some recent hypotheses suggesting plausible unification of thermostatistics where Fermi-Dirac, Bose-Einstein and Tsallis statistics become its special subsets, we consider further plausible extension to include non-integer Hausdorff dimension, which becomes realization of fractal entropy concept. In the subsequent section, we also discuss plausible extension of this unified statistics to include anisotropic effect by using quaternion oscillator, which may be observed in the context of Cosmic Microwave Background Radiation. Further observation is of course recommended in order to refute or verify this proposition.
Lee, Sang Bok; Shin, Hye Sook
2007-10-01
The purpose of this study was to examine the effects of Kangaroo Care (KC) on anxiety, maternal role confidence, and maternal-infant attachment of mothers who delivered preterm infants. The research design was a nonequivalent control group pretest-posttest. Data were collected from September 1, 2006 to June 20, 2007. The participants were 22 mothers in the experimental group and 21 in the control group. KC was applied three times per day, for a total of ten times over 4 days, to the experimental group. The degree of anxiety was statistically significantly different between the two groups, but the differences in maternal role confidence and maternal-infant attachment were statistically insignificant. These data suggest that KC was effective for relieving mothers' anxiety but not for maternal role confidence or maternal-infant attachment. The implications for nursing practice and directions for future research need to be discussed.
An economic perspective on experience curves and dynamic economies in renewable energy technologies
International Nuclear Information System (INIS)
Papineau, Maya
2006-01-01
This paper analyzes dynamic economies in renewable energy technologies. The paper has two contributions. The first is to test the robustness of experience in solar photovoltaic, solar thermal and wind energy to the addition of an explicit time trend, which has been done in experience studies for other industries, but not for renewable energy technologies. Estimation is carried out on the assumption that cumulative capacity, industry production, average firm production, and electricity generation affect experience and thus the fall in price. The second contribution is to test the impact of R&D on price reduction. In general, cumulative experience is found to be highly statistically significant when estimated alone, and highly statistically insignificant when time is added to the model. The effect of R&D is small and statistically significant in solar photovoltaic technology and statistically insignificant in solar thermal and wind technologies.
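Experience-curve estimation of the kind tested here reduces, in its one-factor form, to a log-log regression of price on cumulative capacity; the learning rate is the price drop per doubling of capacity. A sketch with a synthetic series constructed to have an exact 20% learning rate (the data are hypothetical, not from the paper):

```python
import math

# One-factor experience curve: price = a * Q**(-b), Q = cumulative capacity.
# Learning rate per doubling: LR = 1 - 2**(-b).
q = [10.0, 20.0, 40.0, 80.0, 160.0]
price = [5.00, 4.00, 3.20, 2.56, 2.048]   # exact 20% drop per doubling

# Ordinary least-squares slope in log-log space
logq = [math.log(v) for v in q]
logp = [math.log(v) for v in price]
n = len(q)
mx, my = sum(logq) / n, sum(logp) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(logq, logp))
         / sum((x - mx) ** 2 for x in logq))
b = -slope
learning_rate = 1.0 - 2.0 ** (-b)   # 0.20 for this synthetic series
```

The paper's robustness test amounts to adding a time trend as a second regressor to this log-log model and observing that the experience coefficient `b` loses its significance.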
Age-Related Decline in Spelling Ability: A Link with Fluid Intelligence?
Stuart-Hamilton, Ian; Rabbitt, Patrick
1997-01-01
On spelling tests taken by 159 adults over 50, younger subjects had significantly higher scores. Statistically removing effects of crystallized intelligence and education had no effect, but removing effects of fluid intelligence made the difference insignificant. Although spelling is considered a crystallized skill, in older people it may rely…
The Effect of Using Case Studies in Business Statistics
Pariseau, Susan E.; Kezim, Boualem
2007-01-01
The authors evaluated the effect on learning of using case studies in business statistics courses. The authors divided students into 3 groups: a control group, a group that completed 1 case study, and a group that completed 3 case studies. Results evidenced that, on average, students whom the authors required to complete a case analysis received…
THE EFFECT OF MACROECONOMIC VARIABLES ON STOCK RETURNS ON DHAKA STOCK EXCHANGE
Directory of Open Access Journals (Sweden)
Muhammed Monjurul Quadir
2012-01-01
Full Text Available This article investigates the effects of the macroeconomic variables of the treasury bill interest rate and industrial production on stock returns on the Dhaka Stock Exchange for the period between January 2000 and February 2007, on the basis of monthly time series data, using an Autoregressive Integrated Moving Average (ARIMA) model. The paper takes the overall market stock returns as the dependent variable and does not consider the stock returns of different companies separately. The ARIMA model finds a positive relationship between the Treasury bill interest rate and industrial production on the one hand and market stock returns on the other, but the coefficients turn out to be statistically insignificant.
Instrumental Effects of Fiscal Policy for Pakistan Economy
Directory of Open Access Journals (Sweden)
Ghulam Rasool Madni
2013-12-01
Full Text Available The effectiveness of fiscal policy for economic growth is the subject of much debate. Taxation and government expenditure are its two main instruments. This paper analyzes the effect of different categories of government expenditure on the economic growth of Pakistan. Based on their impact on economic growth, government expenditures are classified into productive (having a positive or neutral effect on economic growth) and unproductive expenditures (having a negative or insignificant impact on economic growth). The data span the period 1979-2012. After classification of expenditures, the impact of the fiscal instruments is analyzed using the ARDL approach to cointegration, an estimation technique well suited to small samples. The results reveal that unproductive government expenditure has a negative impact while productive government expenditure has an insignificant impact on economic growth. Private investment is found to affect economic growth positively and significantly. On the other side, direct and indirect taxes also have an insignificant impact on the economic growth of Pakistan.
Death Certification Errors and the Effect on Mortality Statistics.
McGivern, Lauri; Shulman, Leanne; Carney, Jan K; Shapiro, Steven; Bundock, Elizabeth
Errors in cause and manner of death on death certificates are common and affect families, mortality statistics, and public health research. The primary objective of this study was to characterize errors in the cause and manner of death on death certificates completed by non-Medical Examiners. A secondary objective was to determine the effects of errors on national mortality statistics. We retrospectively compared 601 death certificates completed between July 1, 2015, and January 31, 2016, from the Vermont Electronic Death Registration System with clinical summaries from medical records. Medical Examiners, blinded to original certificates, reviewed summaries, generated mock certificates, and compared mock certificates with original certificates. They then graded errors using a scale from 1 to 4 (higher numbers indicated increased impact on interpretation of the cause) to determine the prevalence of minor and major errors. They also compared International Classification of Diseases, 10th Revision (ICD-10) codes on original certificates with those on mock certificates. Of 601 original death certificates, 319 (53%) had errors; 305 (51%) had major errors; and 59 (10%) had minor errors. We found no significant differences by certifier type (physician vs nonphysician), but we did find significant differences in major errors by place of death. Surveillance and certifier education must expand beyond local and state efforts. Simplifying and standardizing underlying literal text for cause of death may improve accuracy, decrease coding errors, and improve national mortality statistics.
Gaskin, Cadeyrn J; Happell, Brenda
2014-05-01
To (a) assess the statistical power of nursing research to detect small, medium, and large effect sizes; (b) estimate the experiment-wise Type I error rate in these studies; and (c) assess the extent to which (i) a priori power analyses, (ii) effect sizes (and interpretations thereof), and (iii) confidence intervals were reported. Statistical review. Papers published in the 2011 volumes of the 10 highest ranked nursing journals, based on their 5-year impact factors. Papers were assessed for statistical power, control of experiment-wise Type I error, reporting of a priori power analyses, reporting and interpretation of effect sizes, and reporting of confidence intervals. The analyses were based on 333 papers, from which 10,337 inferential statistics were identified. The median power to detect small, medium, and large effect sizes was .40 (interquartile range [IQR]=.24-.71), .98 (IQR=.85-1.00), and 1.00 (IQR=1.00-1.00), respectively. The median experiment-wise Type I error rate was .54 (IQR=.26-.80). A priori power analyses were reported in 28% of papers. Effect sizes were routinely reported for Spearman's rank correlations (100% of papers in which this test was used), Poisson regressions (100%), odds ratios (100%), Kendall's tau correlations (100%), Pearson's correlations (99%), logistic regressions (98%), structural equation modelling/confirmatory factor analyses/path analyses (97%), and linear regressions (83%), but were reported less often for two-proportion z tests (50%), analyses of variance/analyses of covariance/multivariate analyses of variance (18%), t tests (8%), Wilcoxon's tests (8%), Chi-squared tests (8%), and Fisher's exact tests (7%), and not reported for sign tests, Friedman's tests, McNemar's tests, multi-level models, and Kruskal-Wallis tests. Effect sizes were infrequently interpreted. Confidence intervals were reported in 28% of papers. The use, reporting, and interpretation of inferential statistics in nursing research need substantial
Statistical Assessment of the Effectiveness of Transformation Change (by Case of Singapore)
Directory of Open Access Journals (Sweden)
Zhuravlyov
2017-02-01
Studies of economic transformations and their statistical assessment often overlook the causality of processes specific to economic relations and the development of institutions. The article is devoted to the important topic of statistical assessment of the effectiveness of transformations. The case of Singapore is taken because it is an Asian country demonstrating the essential role of the institutional environment in national economic transformations. The regression analysis of the impact of institutional factors on economic growth in Singapore is made using 17 indicators: civil freedoms, corruption, economic freedom, economic globalization, spending on education, use of energy, share of women in the labor market, fiscal freedom, price of fuel, PPP, effectiveness of public administration, level of consumption, Human Development Index, Internet users, life expectancy, unemployment, and openness of trade. The economic interpretation of the statistical assessment of economic transformations in Singapore is as follows: the quality of the institutional environment (control of corruption, economic freedom, supremacy of law, etc.) is of critical importance for economic development in Singapore; increasing spending on education has positive effects on economic growth in Singapore; and economic growth in Singapore is highly positively correlated with energy consumption.
Statistical Analysis of Large Simulated Yield Datasets for Studying Climate Effects
Makowski, David; Asseng, Senthold; Ewert, Frank; Bassu, Simona; Durand, Jean-Louis; Martre, Pierre; Adam, Myriam; Aggarwal, Pramod K.; Angulo, Carlos; Baron, Christian;
2015-01-01
Many studies have been carried out during the last decade to study the effect of climate change on crop yields and other key crop characteristics. In these studies, one or several crop models were used to simulate crop growth and development for different climate scenarios that correspond to different projections of atmospheric CO2 concentration, temperature, and rainfall changes (Semenov et al., 1996; Tubiello and Ewert, 2002; White et al., 2011). The Agricultural Model Intercomparison and Improvement Project (AgMIP; Rosenzweig et al., 2013) builds on these studies with the goal of using an ensemble of multiple crop models in order to assess effects of climate change scenarios for several crops in contrasting environments. These studies generate large datasets, including thousands of simulated crop yield data. They include series of yield values obtained by combining several crop models with different climate scenarios that are defined by several climatic variables (temperature, CO2, rainfall, etc.). Such datasets potentially provide useful information on the possible effects of different climate change scenarios on crop yields. However, it is sometimes difficult to analyze these datasets and to summarize them in a useful way due to their structural complexity; simulated yield data can differ among contrasting climate scenarios, sites, and crop models. Another issue is that it is not straightforward to extrapolate the results obtained for the scenarios to alternative climate change scenarios not initially included in the simulation protocols. Additional dynamic crop model simulations for new climate change scenarios are an option but this approach is costly, especially when a large number of crop models are used to generate the simulated data, as in AgMIP. Statistical models have been used to analyze responses of measured yield data to climate variables in past studies (Lobell et al., 2011), but the use of a statistical model to analyze yields simulated by complex
Effects of a Flexibility/Support Intervention on Work Performance: Evidence From the Work, Family, and Health Network.
Bray, Jeremy W; Hinde, Jesse M; Kaiser, David J; Mills, Michael J; Karuntzos, Georgia T; Genadek, Katie R; Kelly, Erin L; Kossek, Ellen E; Hurtado, David A
2018-05-01
To estimate the effects of a workplace initiative to reduce work-family conflict on employee performance. A group-randomized multisite controlled experimental study with longitudinal follow-up. An information technology firm. Employees randomized to the intervention (n = 348) and control condition (n = 345). An intervention, "Start. Transform. Achieve. Results." to enhance employees' control over their work time, to increase supervisors' support for this change, and to increase employees' and supervisors' focus on results. We estimated the effect of the intervention on 9 self-reported employee performance measures using a difference-in-differences approach with generalized linear mixed models. Performance measures included actual and expected hours worked, absenteeism, and presenteeism. This study found little evidence that an intervention targeting work-family conflict affected employee performance. The only significant effect of the intervention was an approximately 1-hour reduction in expected work hours. After Bonferroni correction, the intervention effect is marginally insignificant at 6 months and marginally significant at 12 and 18 months. The intervention reduced expected working time by 1 hour per week; effects on most other employee self-reported performance measures were statistically insignificant. When coupled with the other positive wellness and firm outcomes, this intervention may be useful for improving employee perceptions of increased access to personal time or personal wellness without sacrificing performance. The null effects on performance provide countervailing evidence to recent negative press on work-family and flex work initiatives.
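The study above uses a difference-in-differences design: the change over time in the intervention group is compared with the change in the control group. A minimal sketch of the core contrast (the paper itself estimated this with generalized linear mixed models; the group means for expected weekly work hours below are invented for illustration):

```python
def did_estimate(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Difference-in-differences: change in the treated group
    minus change in the control group."""
    return (treat_post - treat_pre) - (ctrl_post - ctrl_pre)

# Hypothetical group means for expected weekly work hours
print(did_estimate(47.0, 45.5, 47.0, 46.5))  # -1.0
```

This matches the paper's headline finding in spirit: an approximately 1-hour reduction in expected work hours attributable to the intervention.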
Industrial commodity statistics yearbook 2001. Production statistics (1992-2001)
International Nuclear Information System (INIS)
2003-01-01
This is the thirty-fifth in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to Industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics, by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics, while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title The Growth of World Industry and the next eight editions under the title Yearbook of Industrial Statistics. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping, and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. The statistics refer to the ten-year period 1992-2001 for about 200 countries and areas.
Industrial commodity statistics yearbook 2002. Production statistics (1993-2002)
International Nuclear Information System (INIS)
2004-01-01
This is the thirty-sixth in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to Industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics, by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics, while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title 'The Growth of World Industry' and the next eight editions under the title 'Yearbook of Industrial Statistics'. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping, and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. The statistics refer to the ten-year period 1993-2002 for about 200 countries and areas.
Industrial commodity statistics yearbook 2000. Production statistics (1991-2000)
International Nuclear Information System (INIS)
2002-01-01
This is the thirty-third in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to Industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics, by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics, while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title The Growth of World Industry and the next eight editions under the title Yearbook of Industrial Statistics. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping, and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. Most of the statistics refer to the ten-year period 1991-2000 for about 200 countries and areas.
Effect size and statistical power in the rodent fear conditioning literature - A systematic review.
Carneiro, Clarissa F D; Moulin, Thiago C; Macleod, Malcolm R; Amaral, Olavo B
2018-01-01
Proposals to increase research reproducibility frequently call for focusing on effect sizes instead of p values, as well as for increasing the statistical power of experiments. However, it is unclear to what extent these two concepts are indeed taken into account in basic biomedical science. To study this in a real-case scenario, we performed a systematic review of effect sizes and statistical power in studies on learning of rodent fear conditioning, a widely used behavioral task to evaluate memory. Our search criteria yielded 410 experiments comparing control and treated groups in 122 articles. Interventions had a mean effect size of 29.5%, and amnesia caused by memory-impairing interventions was nearly always partial. Mean statistical power to detect the average effect size observed in well-powered experiments with significant differences (37.2%) was 65%, and was lower among studies with non-significant results. Only one article reported a sample size calculation, and our estimated sample size to achieve 80% power considering typical effect sizes and variances (15 animals per group) was reached in only 12.2% of experiments. Actual effect sizes correlated with effect size inferences made by readers on the basis of textual descriptions of results only when findings were non-significant, and neither effect size nor power correlated with study quality indicators, number of citations or impact factor of the publishing journal. In summary, effect sizes and statistical power have a wide distribution in the rodent fear conditioning literature, but do not seem to have a large influence on how results are described or cited. Failure to take these concepts into consideration might limit attempts to improve reproducibility in this field of science.
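The review's figure of roughly 15 animals per group for 80% power at a typical effect size can be reproduced approximately with the standard two-sample sample-size formula. A hedged sketch using the normal approximation (an exact t-test calculation gives a slightly larger n):

```python
from math import ceil
from statistics import NormalDist

def n_per_group(d, alpha=0.05, power=0.80):
    """Normal-approximation sample size per group for a two-sided
    two-sample comparison at standardized effect size d."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # significance threshold
    z_b = NormalDist().inv_cdf(power)          # power requirement
    return ceil(2 * ((z_a + z_b) / d) ** 2)

print(n_per_group(1.0))  # 16: close to the ~15/group cited in the review
print(n_per_group(0.5))  # 63: halving the effect size roughly quadruples n
```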
Quantum Statistics: Is there an effective fermion repulsion or boson attraction?
Mullin, W. J.; Blaylock, G.
2003-01-01
Physicists often claim that there is an effective repulsion between fermions, implied by the Pauli principle, and a corresponding effective attraction between bosons. We examine the origins of such exchange-force ideas, their validity, and the areas where they are highly misleading. We propose that future explanations of quantum statistics should avoid the idea of an effective force completely and replace it with more appropriate physical insights, some of which are suggested here.
Statistical Significance and Effect Size: Two Sides of a Coin.
Fan, Xitao
This paper suggests that statistical significance testing and effect size are two sides of the same coin; they complement each other, but do not substitute for one another. Good research practice requires that both should be taken into consideration to make sound quantitative decisions. A Monte Carlo simulation experiment was conducted, and a…
The effect of surface corrosion damage on the fatigue life of 6061-T6 aluminum alloy extrusions
Energy Technology Data Exchange (ETDEWEB)
Weber, Matthew; Eason, Paul D.; Özdeş, Hüseyin; Tiryakioğlu, Murat, E-mail: m.tiryakioglu@unf.edu
2017-04-06
An investigation was performed in which 6061-T6 extrusions were exposed to a 3.5% NaCl solution at pH 2 for 2 days and 24 days to create distinct surface flaws. The effect of these flaws on rotating beam fatigue life was then investigated and analyzed by using Wöhler curves, Weibull statistics, and scanning electron microscopy (SEM). It was determined that corrosion damage reduced the fatigue life significantly, and that specimens corroded for 2 days and for 24 days exhibited similar fatigue lives. Statistical analyses showed that the fatigue life of all three datasets followed the 3-parameter Weibull distribution and that the difference between the fatigue lives of the two corroded datasets was statistically insignificant. Analysis of fracture surfaces showed that the sizes of the pits that led to fatigue crack initiation were very different in the two corroded datasets. Implications of the similarity in fatigue lives despite the disparity in surface condition are discussed in detail in the paper.
Blakemore, J S
1962-01-01
Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co…
Common pitfalls in statistical analysis: "No evidence of effect" versus "evidence of no effect"
Directory of Open Access Journals (Sweden)
Priya Ranganathan
2015-01-01
This article is the first in a series exploring common pitfalls in statistical analysis in biomedical research. The power of a clinical trial is its ability to detect a difference between treatments where such a difference exists. At the end of a study, the lack of a difference between treatments does not mean that the treatments can be considered equivalent. The distinction between "no evidence of effect" and "evidence of no effect" needs to be understood.
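The distinction above is easiest to see with confidence intervals: an interval that spans both zero and clinically important effects demonstrates absence of evidence, not evidence of absence. A small sketch with invented numbers, using a normal-theory interval:

```python
from statistics import NormalDist

def diff_ci(mean_diff, se, level=0.95):
    """Normal-theory confidence interval for a difference in means."""
    z = NormalDist().inv_cdf(0.5 + level / 2)
    return mean_diff - z * se, mean_diff + z * se

# Underpowered trial: the CI spans 0 AND clinically relevant effects,
# so "not significant" is not evidence of equivalence.
lo, hi = diff_ci(mean_diff=2.0, se=3.0)
print(round(lo, 2), round(hi, 2))  # -3.88 7.88
```

Here the trial cannot rule out a harm of about 3.9 units or a benefit of about 7.9 units, so concluding "the treatments are equivalent" from the non-significant result would be exactly the pitfall the article describes.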
Statistical Models to Assess the Health Effects and to Forecast Ground Level Ozone
Czech Academy of Sciences Publication Activity Database
Schlink, U.; Herbath, O.; Richter, M.; Dorling, S.; Nunnari, G.; Cawley, G.; Pelikán, Emil
2006-01-01
Roč. 21, č. 4 (2006), s. 547-558 ISSN 1364-8152 R&D Projects: GA AV ČR 1ET400300414 Institutional research plan: CEZ:AV0Z10300504 Keywords : statistical models * ground level ozone * health effects * logistic model * forecasting * prediction performance * neural network * generalised additive model * integrated assessment Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.992, year: 2006
Gravitational lensing of gravitational waves: a statistical perspective
Li, Shun-Sheng; Mao, Shude; Zhao, Yuetong; Lu, Youjun
2018-05-01
In this paper, we study the strong gravitational lensing of gravitational waves (GWs) from a statistical perspective, with particular focus on the high frequency GWs from stellar binary black hole coalescences. These are most promising targets for ground-based detectors such as Advanced Laser Interferometer Gravitational Wave Observatory (aLIGO) and the proposed Einstein Telescope (ET) and can be safely treated under the geometrical optics limit for GW propagation. We perform a thorough calculation of the lensing rate, by taking account of effects caused by the ellipticity of lensing galaxies, lens environments, and magnification bias. We find that in certain GW source rate scenarios, we should be able to observe strongly lensed GW events once per year (˜1 yr-1) in the aLIGO survey at its design sensitivity; for the proposed ET survey, the rate could be as high as ˜80 yr-1. These results depend on the estimate of GW source abundance, and hence can be correspondingly modified with an improvement in our understanding of the merger rate of stellar binary black holes. We also compute the fraction of four-image lens systems in each survey, predicting it to be ˜30 per cent for the aLIGO survey and ˜6 per cent for the ET survey. Finally, we evaluate the possibility of missing some images due to the finite survey duration, by presenting the probability distribution of lensing time delays. We predict that this selection bias will be insignificant in future GW surveys, as most of the lens systems ({˜ } 90{per cent}) will have time delays less than ˜1 month, which will be far shorter than survey durations.
Statistical shear lag model - unraveling the size effect in hierarchical composites.
Wei, Xiaoding; Filleter, Tobin; Espinosa, Horacio D
2015-05-01
Numerous experimental and computational studies have established that the hierarchical structures encountered in natural materials, such as the brick-and-mortar structure observed in sea shells, are essential for achieving defect tolerance. Due to this hierarchy, the mechanical properties of natural materials have a different size dependence compared to that of typical engineered materials. This study aimed to explore size effects on the strength of bio-inspired staggered hierarchical composites and to define the influence of the geometry of constituents in their outstanding defect tolerance capability. A statistical shear lag model is derived by extending the classical shear lag model to account for the statistics of the constituents' strength. A general solution emerges from rigorous mathematical derivations, unifying the various empirical formulations for the fundamental link length used in previous statistical models. The model shows that the staggered arrangement of constituents grants composites a unique size effect on mechanical strength in contrast to homogenous continuous materials. The model is applied to hierarchical yarns consisting of double-walled carbon nanotube bundles to assess its predictive capabilities for novel synthetic materials. Interestingly, the model predicts that yarn gauge length does not significantly influence the yarn strength, in close agreement with experimental observations. Copyright © 2015 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.
Effect of the absolute statistic on gene-sampling gene-set analysis methods.
Nam, Dougu
2017-06-01
Gene-set enrichment analysis and its modified versions have commonly been used for identifying altered functions or pathways in disease from microarray data. In particular, the simple gene-sampling gene-set analysis methods have been heavily used for datasets with only a few sample replicates. The biggest problem with this approach is the highly inflated false-positive rate. In this paper, the effect of absolute gene statistic on gene-sampling gene-set analysis methods is systematically investigated. Thus far, the absolute gene statistic has merely been regarded as a supplementary method for capturing the bidirectional changes in each gene set. Here, it is shown that incorporating the absolute gene statistic in gene-sampling gene-set analysis substantially reduces the false-positive rate and improves the overall discriminatory ability. Its effect was investigated by power, false-positive rate, and receiver operating curve for a number of simulated and real datasets. The performances of gene-set analysis methods in one-tailed (genome-wide association study) and two-tailed (gene expression data) tests were also compared and discussed.
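The role of the absolute statistic can be illustrated with a toy gene-sampling test: a gene set whose members change strongly in both directions averages to roughly zero on the signed statistic but stands out on the absolute statistic. This is a simplified sketch of the idea, not the paper's implementation, and the gene statistics below are invented:

```python
import random

def gene_sampling_pvalue(gene_stats, set_idx, use_abs=True,
                         n_samp=10000, seed=1):
    """Gene-sampling p-value for a gene set: compare the set's mean
    (absolute) statistic to means of randomly sampled gene sets."""
    rng = random.Random(seed)
    stats = [abs(s) for s in gene_stats] if use_abs else list(gene_stats)
    k = len(set_idx)
    observed = sum(stats[i] for i in set_idx) / k
    hits = 0
    for _ in range(n_samp):
        samp = rng.sample(range(len(stats)), k)
        if sum(stats[i] for i in samp) / k >= observed:
            hits += 1
    return (hits + 1) / (n_samp + 1)
```

With, say, a set of genes scoring ±3 against a near-zero background, the absolute version yields a tiny p-value while the signed version does not, because the up- and down-regulated members cancel.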
Income and Democracy: A Comment on Acemoglu, Johnson, Robinson, and Yared (2008)
DEFF Research Database (Denmark)
Paldam, Martin; Gundlach, Erich
Acemoglu, Johnson, Robinson, and Yared (2008) demonstrate that estimation of the standard adjustment model with country-fixed and time-fixed effects removes the statistical significance of income as a causal factor of democracy. We argue that their empirical approach must produce insignificant...
Nick, Todd G
2007-01-01
Statistics is defined by the Medical Subject Headings (MeSH) thesaurus as the science and art of collecting, summarizing, and analyzing data that are subject to random variation. The two broad categories of summarizing and analyzing data are referred to as descriptive and inferential statistics. This chapter considers the science and art of summarizing data where descriptive statistics and graphics are used to display data. In this chapter, we discuss the fundamentals of descriptive statistics, including describing qualitative and quantitative variables. For describing quantitative variables, measures of location and spread, for example the standard deviation, are presented along with graphical presentations. We also discuss distributions of statistics, for example the variance, as well as the use of transformations. The concepts in this chapter are useful for uncovering patterns within the data and for effectively presenting the results of a project.
Accuracy in activation analysis: count rate effects
International Nuclear Information System (INIS)
Lindstrom, R.M.; Fleming, R.F.
1980-01-01
The accuracy inherent in activation analysis is ultimately limited by the uncertainty of counting statistics. When careful attention is paid to detail, several workers have shown that all systematic errors can be reduced to an insignificant fraction of the total uncertainty, even when the statistical limit is well below one percent. A matter of particular importance is the reduction of errors due to high counting rates. The loss of counts due to random coincidence (pulse pileup) in the amplifier and to digitization time in the ADC may be treated as a series combination of extending and non-extending dead times, respectively. The two effects are experimentally distinct. Live timer circuits in commercial multi-channel analyzers compensate properly for ADC dead time for long-lived sources, but not for pileup. Several satisfactory solutions are available, including pileup rejection and dead time correction circuits, loss-free ADCs, and computed corrections in a calibrated system. These methods are sufficiently reliable and well understood that a decaying source can be measured routinely with acceptably small errors at a dead time as high as 20 percent.
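The two dead-time mechanisms described above have standard rate equations, sketched below with invented values; the dead time τ and the count rates are illustrative, not taken from the paper:

```python
import math

def true_rate_nonextending(observed_rate, dead_time):
    """Correct an observed count rate for a non-extending
    (non-paralyzable) dead time, e.g. ADC digitization time:
    m = n / (1 - n*tau)."""
    return observed_rate / (1 - observed_rate * dead_time)

def observed_rate_extending(true_rate, dead_time):
    """Observed rate under an extending (paralyzable) dead time,
    e.g. amplifier pulse pileup: n = m * exp(-m*tau)."""
    return true_rate * math.exp(-true_rate * dead_time)

# 20% dead time: n*tau = 0.2
print(round(true_rate_nonextending(20000, 10e-6)))  # 25000
```

At 20% dead time the non-extending correction recovers a true rate 25% above the observed one, consistent with the claim that routine measurements remain feasible at such rates once the corrections are applied.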
Statistics 101 for Radiologists.
Anvari, Arash; Halpern, Elkan F; Samir, Anthony E
2015-10-01
Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
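Several of the diagnostic-test quantities listed above (sensitivity, specificity, accuracy, likelihood ratios) follow directly from a 2x2 table. A compact sketch with invented counts:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Basic diagnostic-test statistics from a 2x2 table of
    true/false positives and negatives."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
        "lr_positive": sens / (1 - spec),  # how much a positive result raises the odds
        "lr_negative": (1 - sens) / spec,  # how much a negative result lowers them
    }

m = diagnostic_metrics(tp=90, fp=30, fn=10, tn=170)
print(m["sensitivity"], m["specificity"])  # 0.9 0.85
print(round(m["lr_positive"], 1))          # 6.0
```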
Linear mixed-effects models for central statistical monitoring of multicenter clinical trials
Desmet, L.; Venet, D.; Doffagne, E.; Timmermans, C.; BURZYKOWSKI, Tomasz; LEGRAND, Catherine; BUYSE, Marc
2014-01-01
Multicenter studies are widely used to meet accrual targets in clinical trials. Clinical data monitoring is required to ensure the quality and validity of the data gathered across centers. One approach to this end is central statistical monitoring, which aims at detecting atypical patterns in the data by means of statistical methods. In this context, we consider the simple case of a continuous variable, and we propose a detection procedure based on a linear mixed-effects model to detect locat...
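As a much-simplified illustration of the monitoring idea, flagging centers whose values of a continuous variable look atypical, one can standardize the observed center means with a robust z-score. This sketches only the concept; the cited procedure is based on a linear mixed-effects model, which this does not implement, and the cutoff and data are invented:

```python
from statistics import median

def flag_atypical_centers(center_means, z_cut=3.5):
    """Flag centers whose mean of a continuous variable deviates from
    the across-center median by more than z_cut robust z-score units
    (median absolute deviation scaled to the normal distribution)."""
    vals = list(center_means.values())
    med = median(vals)
    mad = median(abs(v - med) for v in vals)
    return [c for c, m in center_means.items()
            if mad > 0 and abs(m - med) / (1.4826 * mad) > z_cut]
```

A center reporting a shifted mean (e.g. ~200 against ~120 everywhere else) is flagged for source data verification, which is the practical goal of central statistical monitoring.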
Measuring the health effects of gender.
Phillips, S P
2008-04-01
The health effects of gender are mediated via group-level constraints of sex roles and norms, discrimination and marginalisation of individuals, and internalisation of the stresses of role discordance. Although gender is frequently a lens through which data are interpreted there are few composite measures that insert gender as an independent variable into research design. Instead, sex disaggregation of data is often conflated with gender, identifying statistically significant but sometimes clinically insignificant sex differences. To directly assess the impact of gender on wellbeing requires development of group and individual-level derived variables. At the ecological level such a summative variable could be composed of a selection of group-level measures of equality between sexes. This gender index could be used in ecological and individual-level studies of health outcomes. A quantitative indicator of gender role acceptance and of the personal effects of gender inequities could insert the often hidden variable of gender into individual-level clinical research.
Directory of Open Access Journals (Sweden)
Laura Badenes-Ribera
2018-06-01
Introduction: Publications arguing against the null hypothesis significance testing (NHST) procedure and in favor of good statistical practices have increased. The most frequently mentioned alternatives to NHST are effect size statistics (ES), confidence intervals (CIs), and meta-analyses. A recent survey conducted in Spain found that academic psychologists have poor knowledge about effect size statistics, confidence intervals, and graphic displays for meta-analyses, which might lead to a misinterpretation of the results. In addition, it also found that, although the use of ES is becoming generalized, the same thing is not true for CIs. Finally, academics with greater knowledge about ES statistics presented a profile closer to good statistical practice and research design. Our main purpose was to analyze the extension of these results to a different geographical area through a replication study. Methods: For this purpose, we elaborated an on-line survey that included the same items as the original research, and we asked academic psychologists to indicate their level of knowledge about ES, CIs, and meta-analyses, and how they use them. The sample consisted of 159 Italian academic psychologists (54.09% women, mean age of 47.65 years). The mean number of years in the position of professor was 12.90 (SD = 10.21). Results: As in the original research, the results showed that, although the use of effect size estimates is becoming generalized, an under-reporting of CIs for ES persists. The most frequent ES statistics mentioned were Cohen's d and R2/η2, which can have outliers or show non-normality or violate statistical assumptions. In addition, academics showed poor knowledge about meta-analytic displays (e.g., forest plot and funnel plot) and quality checklists for studies. Finally, academics with higher-level knowledge about ES statistics seem to have a profile closer to good statistical practices. Conclusions: Changing statistical practice is not…
Safety by statistics? A critical view on statistical methods applied in health physics
International Nuclear Information System (INIS)
Kraut, W.
2016-01-01
The only proper way to describe uncertainties in health physics is by statistical means. But statistics can never replace your personal evaluation of effects, nor can it transmute randomness into certainty like an 'uncertainty laundry'. The paper discusses these problems in routine practical work.
Directory of Open Access Journals (Sweden)
Manish Banker
2017-01-01
Background: Vitamin D and its active metabolite, 1,25-dihydroxyvitamin D (1,25-(OH)2D3), play a significant role in reproduction. Aim: To assess the effect of serum 25-hydroxyvitamin D level on oocyte quality and endometrial receptivity by studying oocyte donors and their recipients. Materials and Methods: This prospective study consisted of two groups: Group A (recipient group) and Group B (donor group). The participants of Groups A1 and B1, and of Groups A2 and B2, were subcategorized as vitamin D-deficient (<20 ng/mL) and vitamin D replete-insufficient (20 to ≥30 ng/mL), respectively. Results: In the recipient group, 123 of the 192 participants were in Group A1 and 69 in Group A2. In the donor group, 54 of the 99 participants were in Group B1 and 45 in Group B2. In the recipient group, Group A2 had a higher clinical pregnancy rate, implantation rate, and ongoing pregnancy rate, and a lower abortion rate, than Group A1, but these differences were statistically insignificant. The differences in endometrial thickness and number of embryos transferred between the groups were also insignificant. In the donor group, the total number of days of controlled ovarian hyperstimulation, the dose of gonadotropins, the number of oocytes retrieved, the percentage of mature oocytes, and the percentage of usable embryos were higher in Group B2 than in Group B1, but these differences were statistically insignificant. The difference in fertilization rate between Groups B1 and B2 was also statistically insignificant. Conclusion: Vitamin D deficiency was associated with somewhat lower reproductive outcomes, but the differences were not statistically significant; it therefore does not appear to have a negative influence on in-vitro fertilization–intracytoplasmic sperm injection outcomes.
Directory of Open Access Journals (Sweden)
Chaeyoung Lee
2012-11-01
Epistasis, which may explain a large portion of the phenotypic variation in complex economic traits of animals, has been ignored in many genetic association studies. A Bayesian method was introduced to draw inferences about multilocus genotypic effects based on their marginal posterior distributions obtained with a Gibbs sampler. A simulation study was conducted to provide statistical power estimates under various unbalanced designs using this method. Data were simulated under combined designs of number of loci, within-genotype variance, and sample size, in unbalanced designs with or without null combined-genotype cells. Mean empirical statistical power was estimated for testing the posterior mean estimate of the combined genotype effect. A practical example of obtaining empirical statistical power estimates for a given sample size is provided for unbalanced designs. These empirical statistical powers are useful for determining an optimal design when interactive associations of multiple loci with complex phenotypes are examined.
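The "empirical statistical power" estimated in the abstract above comes from simulation: generate data under a known effect, run the test, and count rejections. A much-simplified two-group Monte Carlo sketch of that idea (the paper's actual setting is a multilocus Gibbs sampler; the z-test shortcut and all names here are illustrative assumptions):

```python
import random
from statistics import NormalDist, mean, stdev

def empirical_power(effect, n, sims=2000, alpha=0.05, seed=1):
    """Monte Carlo power estimate: simulate two unit-variance groups
    whose means differ by `effect`, test the difference of means with
    a two-sided z-test, and report the rejection rate."""
    rng = random.Random(seed)
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)
    hits = 0
    for _ in range(sims):
        a = [rng.gauss(0.0, 1.0) for _ in range(n)]
        b = [rng.gauss(effect, 1.0) for _ in range(n)]
        se = ((stdev(a) ** 2 + stdev(b) ** 2) / n) ** 0.5
        if abs((mean(b) - mean(a)) / se) > z_crit:
            hits += 1
    return hits / sims
```

Under the null (`effect = 0`) the rejection rate should hover near alpha, which is a useful sanity check on any such simulation before trusting its power estimates.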
Grietens, H; Hellinckx, W
Statistical meta-analyses on the effects of residential treatment for juvenile offenders were reviewed to examine the mean effect sizes and reductions in recidivism reported for this group. Five meta-analyses (three on North American and two on European studies) were selected and synthesized in a
Effect of Control Mode and Test Rate on the Measured Fracture Toughness of Advanced Ceramics
Hausmann, Bronson D.; Salem, Jonathan A.
2018-01-01
The effects of control mode and test rate on the measured fracture toughness of ceramics were evaluated by using chevron-notched flexure specimens in accordance with ASTM C1421. The use of stroke control gave consistent results, with about 2% (statistically insignificant) variation in measured fracture toughness over a very wide range of rates (0.005 to 0.5 mm/min). Use of strain or crack mouth opening displacement (CMOD) control gave approximately 5% (statistically significant) variation over a very wide range of rates (1 to 80 µm/m/s), with the measurements being a function of rate. However, the rate effect was eliminated by use of dry nitrogen, implying a stress corrosion effect rather than a stability effect. With the use of a nitrogen environment during strain-controlled tests, fracture toughness values were within about 1% over a wide range of rates (1 to 80 µm/m/s). CMOD or strain control did allow stable crack extension well past maximum force, and thus is preferred for energy calculations. The effort is being used to confirm recommendations in ASTM Test Method C1421 on fracture toughness measurement.
Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James
2014-01-01
Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.
Alternative interpretations of statistics on health effects of low-level radiation
International Nuclear Information System (INIS)
Hamilton, L.D.
1983-01-01
Four examples of the interpretation of statistics of data on low-level radiation are reviewed: (a) genetic effects of the atomic bombs at Hiroshima and Nagasaki, (b) cancer at Rocky Flats, (c) childhood leukemia and fallout in Utah, and (d) cancer among workers at the Portsmouth Naval Shipyard. Aggregation of data, adjustment for age, and other problems related to the determination of health effects of low-level radiation are discussed. Troublesome issues related to post hoc analysis are considered.
Effect of aging and lumbar spondylosis on lumbar lordosis
Directory of Open Access Journals (Sweden)
Francis Osita Okpala
2018-01-01
Background: Lumbar lordosis (LL), the anterior convexity of the lumbar spine in the mid-sagittal plane, gives the spine some resilience and helps protect it from compressive forces, because some of the force is taken by the anterior longitudinal ligaments. In aging and lumbar spondylosis, the intervertebral discs undergo the same degenerative changes, though at different rates; in both, some authors reported a straightening of LL while others reported no significant change. This morphologic information would hopefully influence therapeutic decision-making, particularly in lumbar spondylosis, which, though usually asymptomatic, is a common cause of low back pain. Aim: The aim of the study was to investigate the effect of aging and lumbar spondylosis on LL. Subjects and Methods: The lumbosacral joint angle (LSJA), an angular measure of LL, was retrospectively measured in 252 normal and 329 spondylotic adolescent and adult supine lateral lumbosacral spine archival radiographs, and data were analyzed with IBM SPSS Statistics 23.0 (New York, USA). Results: The normal LSJA range was 5°–39°; the mean was 18.7° and showed insignificant variation with gender and aging. The spondylotic range was 5°–40°, and the mean (20.8°) differed from the normal mean by about 2°, which probably has an inconsequential effect on the lumbar curvature, suggesting that the normal and spondylotic mean values are essentially equal. The spondylotic mean also showed insignificant variation with aging and an inconsequential 1° gender difference in favor of females. Conclusion: LL is substantially maintained in aging and lumbar spondylosis.
Statistical data analysis using SAS intermediate statistical methods
Marasinghe, Mervyn G
2018-01-01
The aim of this textbook (previously titled SAS for Data Analytics) is to teach the use of SAS for statistical analysis of data for advanced undergraduate and graduate students in statistics, data science, and disciplines involving analyzing data. The book begins with an introduction beyond the basics of SAS, illustrated with non-trivial, real-world, worked examples. It proceeds to SAS programming and applications, SAS graphics, statistical analysis of regression models, analysis of variance models, analysis of variance with random and mixed effects models, and then takes the discussion beyond regression and analysis of variance to conclude. Pedagogically, the authors introduce theory and methodological basis topic by topic, present a problem as an application, followed by a SAS analysis of the data provided and a discussion of results. The text focuses on applied statistical problems and methods. Key features include: end of chapter exercises, downloadable SAS code and data sets, and advanced material suitab...
Applying contemporary statistical techniques
Wilcox, Rand R
2003-01-01
Applying Contemporary Statistical Techniques explains why traditional statistical methods are often inadequate or outdated when applied to modern problems. Wilcox demonstrates how new and more powerful techniques address these problems far more effectively, making these modern robust methods understandable, practical, and easily accessible.* Assumes no previous training in statistics * Explains how and why modern statistical methods provide more accurate results than conventional methods* Covers the latest developments on multiple comparisons * Includes recent advanc
Effects of Peers and Social Environment on Adolescent Psychological Well-Being
Directory of Open Access Journals (Sweden)
Andrew J. Hussey
2013-07-01
We use data from Add Health to estimate models of peer effects and effects of the social environment on adolescent psychological well-being. The past literature has focused mostly on the role of peers, notably on schooling (GPA, high school graduation, etc.) and risk behavior (smoking, drinking, drug use, etc.) outcomes. Our study's core innovation lies in testing the hypothesis that an enlarged adolescent social environment encompasses support from peers, school, parents, and the neighborhood. In this paper, we isolate the effects of each of these groups on adolescent psychological well-being and find significant effects of support from schools and parents. Peer effects, however, are insignificant except in the baseline Ordinary Least Squares (OLS) model. Separate models for males and females and for different age groups are also estimated, with similar results, although the effects are greatest during late adolescence. Given the likely endogeneity of peer-group formation, we also use an instrumental variables (IV) approach. The IV results indicate that peer effects are not statistically significant, but otherwise mimic the OLS estimates, supporting the presence of a multi-faceted social network influencing adolescent health. These results, reinforced by further statistical testing, suggest that past work limiting influences on adolescent behavior or outcomes to peers alone tends to be incomplete.
International Nuclear Information System (INIS)
Svistun, R.P.; Babej, Yu.I.; Tkachenko, N.N.
1976-01-01
Statistical theories of the scale effect in the fatigue failure of 40KH18N9T, 10, and 20 steels have been verified. The theories are shown to be not invariably suitable for a satisfactory explanation of the dependence of the samples' fatigue strength on their dimensions. One of the main reasons for the scale effect in steel fatigue is sample self-heating, i.e., a temperature factor that in many cases overrides the statistical one.
The effects of dynamics on statistical emission
International Nuclear Information System (INIS)
Friedman, W.A.
1989-01-01
The dynamical processes that occur during the disassembly of an excited nuclear system influence predictions arising from a statistical treatment of the decay of that system. Changes during the decay period in such collective properties as angular momentum, density, and kinetic energy of the emitting source affect both the mass and energy spectra of the emitted fragments. This influence is examined. The author explores the influence of nuclear compressibility on the decay process, in order to determine what information can be learned about this property from the products of decay. He compares two disparate scenarios of decay: a succession of binary decays, each governed by statistics, and a full microcanonical distribution at a single freeze-out density. The author hopes to learn from the general nature of these two statistical predictions when one or the other might be more realistic, and what signatures resulting from the two models might be used to determine which best accounts for specific experimental results.
Directory of Open Access Journals (Sweden)
Guozhu Zhang
Zebrafish have become an important alternative model for characterizing chemical bioactivity, partly due to the efficiency with which systematic, high-dimensional data can be generated. However, these new data present analytical challenges associated with scale and diversity. We developed a novel, robust statistical approach to characterize chemical-elicited effects in behavioral data from high-throughput screening (HTS) of all 1,060 Toxicity Forecaster (ToxCast™) chemicals across 5 concentrations at 120 hours post-fertilization (hpf). Taking advantage of the immense scale of data for a global view, we show that this new approach reduces bias introduced by extreme values yet allows for diverse response patterns that confound the application of traditional statistics. We have also shown that, as a summary measure of response for local tests of chemical-associated behavioral effects, it achieves a significant reduction in the coefficient of variation compared to many traditional statistical modeling methods. This effective increase in signal-to-noise ratio augments statistical power and is observed across experimental periods (light/dark conditions) that display varied distributional response patterns. Finally, we integrated results with data from concomitant developmental endpoint measurements to show that appropriate statistical handling of HTS behavioral data can add important biological context that informs mechanistic hypotheses.
On the suboptimality of single-factor exercise strategies for Bermudan swaptions
DEFF Research Database (Denmark)
Svenstrup, Mikkel
2005-01-01
insignificant losses. Furthermore, I find that the conditional model risk as defined in Longstaff et al. [2001. Journal of Financial Economics 62, 39-66] is statistically insignificant given the number of observations. Additional tests using the Primal-Dual algorithm of Andersen and Broadie [2004. Management...
Genetic effects of low level radiation
International Nuclear Information System (INIS)
Sumner, D.
1988-01-01
The author outlines the evidence for genetic effects. The incidence of congenital abnormalities, stillbirths, and child deaths has been examined in 70,000 pregnancies in Hiroshima and Nagasaki and compared with pregnancies in an unirradiated control group. No difference was detected in the incidence of congenital abnormalities or stillbirths, but there was a small, insignificant increase in child deaths when both parents were exposed. The number of children born with chromosome aberrations was slightly, but insignificantly, higher in the exposed group than in controls. However, surveys of congenital malformations in children of radiologists and in children of Hanford workers suggest a genetic effect of radiation. Absolute and relative methods of calculating risks and the ICRP risk factor are also briefly discussed. (U.K.)
The Effect on the 8th Grade Students' Attitude towards Statistics of Project Based Learning
Koparan, Timur; Güven, Bülent
2014-01-01
This study investigates the effect of the project based learning approach on 8th grade students' attitude towards statistics. With this aim, an attitude scale towards statistics was developed. Quasi-experimental research model was used in this study. Following this model in the control group the traditional method was applied to teach statistics…
An investigation of the effects of droplet impact angle in thermal spray deposition
International Nuclear Information System (INIS)
Smith, M.F.; Neiser, R.A.; Dykhuizen, R.C.
1994-01-01
It is widely held that spraying at off-normal angles can influence deposition efficiency and the properties of the deposited material. However, little quantitative information on such effects has been published. This paper reports on a series of experiments to investigate the angular dependence of deposition efficiency, surface roughness, and porosity for several thermal spray materials and processes at incidence angles ranging from 90° to 30° relative to the substrate surface. At incidence angles from 90° out to 60°, the observed changes were small and often statistically insignificant. Some significant changes began to appear at 45°, and at 30° significant changes were observed for nearly all materials and processes: deposition efficiency decreased while surface roughness and porosity increased. It is proposed that droplet splashing may cause some of the observed effects.
Effective field theory of statistical anisotropies for primordial bispectrum and gravitational waves
Energy Technology Data Exchange (ETDEWEB)
Rostami, Tahereh; Karami, Asieh; Firouzjahi, Hassan, E-mail: t.rostami@ipm.ir, E-mail: karami@ipm.ir, E-mail: firouz@ipm.ir [School of Astronomy, Institute for Research in Fundamental Sciences (IPM), P.O. Box 19395-5531, Tehran (Iran, Islamic Republic of)
2017-06-01
We present effective field theory (EFT) studies of primordial statistical anisotropies in models of anisotropic inflation. The general action in unitary gauge is presented to calculate the leading interactions between the gauge field fluctuations, the curvature perturbations, and the tensor perturbations. The anisotropies in the scalar power spectrum and bispectrum are calculated, and the dependence of these anisotropies on the EFT couplings is presented. In addition, we calculate the statistical anisotropy in the tensor power spectrum and the scalar-tensor cross-correlation. Our EFT approach incorporates anisotropies generated in models with a non-trivial speed for the gauge field fluctuations and sound speed for scalar perturbations, such as in DBI inflation.
Analysis of statistical misconception in terms of statistical reasoning
Maryati, I.; Priatna, N.
2018-05-01
Reasoning skill is needed by everyone in the era of globalization, because every person must be able to manage and use information that can now be obtained easily from all over the world. Statistical reasoning skill is the ability to collect, group, process, and interpret information and to draw conclusions from it. This skill can be developed at various levels of education. However, it remains low because many people, students included, assume that statistics is just the ability to count and apply formulas. Students also still have negative attitudes toward courses related to research. The purpose of this research is to analyze students' misconceptions in a descriptive statistics course in relation to statistical reasoning skill. The observation was done by analyzing the results of a misconception test and a statistical reasoning skill test, and by observing the effect of students' misconceptions on statistical reasoning skill. The sample was 32 students of the mathematics education department who had taken the descriptive statistics course. The mean value of the misconception test was 49.7 (standard deviation 10.6), whereas the mean value of the statistical reasoning skill test was 51.8 (standard deviation 8.5). If 65 is the minimal value for standard achievement of course competence, the students' mean values are below that standard. The misconception results indicate which subtopics deserve attention. Based on the assessment, students' misconceptions occur in: 1) writing mathematical sentences and symbols well, 2) understanding basic definitions, and 3) determining the concept to be used in solving a problem. Statistical reasoning skill was assessed for reasoning about: 1) data, 2) representation, 3) statistical format, 4) probability, 5) samples, and 6) association.
Saadati, Farzaneh; Ahmad Tarmizi, Rohani; Mohd Ayub, Ahmad Fauzi; Abu Bakar, Kamariah
2015-01-01
Because students' ability to use statistics, which is mathematical in nature, is one of the concerns of educators, embedding the pedagogical characteristics of learning within an e-learning system adds value because it facilitates the conventional method of learning mathematics. Many researchers emphasize the effectiveness of cognitive apprenticeship in learning and problem solving in the workplace. In a cognitive apprenticeship learning model, skills are learned within a community of practitioners through observation of modelling and then practice plus coaching. This study utilized an internet-based Cognitive Apprenticeship Model (i-CAM) in three phases and evaluated its effectiveness for improving statistics problem-solving performance among postgraduate students. The results showed that, compared to the conventional mathematics learning model, i-CAM significantly promoted students' problem-solving performance at the end of each phase. In addition, the differences in students' test scores were statistically significant after controlling for pre-test scores. The findings conveyed in this paper confirm the considerable value of i-CAM in improving statistics learning for non-specialized postgraduate students.
Estimating Effect Sizes and Expected Replication Probabilities from GWAS Summary Statistics
DEFF Research Database (Denmark)
Holland, Dominic; Wang, Yunpeng; Thompson, Wesley K
2016-01-01
Genome-wide Association Studies (GWAS) result in millions of summary statistics ("z-scores") for single nucleotide polymorphism (SNP) associations with phenotypes. These rich datasets afford deep insights into the nature and extent of genetic contributions to complex phenotypes such as psychiatric … -scores, as such knowledge would enhance causal SNP and gene discovery, help elucidate mechanistic pathways, and inform future study design. Here we present a parsimonious methodology for modeling effect sizes and replication probabilities, relying only on summary statistics from GWAS substudies, and a scheme allowing … for estimating the degree of polygenicity of the phenotype and predicting the proportion of chip heritability explainable by genome-wide significant SNPs in future studies with larger sample sizes. We apply the model to recent GWAS of schizophrenia (N = 82,315) and putamen volume (N = 12,596), with approximately …
James, Ryan G.; Mahoney, John R.; Crutchfield, James P.
2017-06-01
One of the most basic characterizations of the relationship between two random variables, X and Y , is the value of their mutual information. Unfortunately, calculating it analytically and estimating it empirically are often stymied by the extremely large dimension of the variables. One might hope to replace such a high-dimensional variable by a smaller one that preserves its relationship with the other. It is well known that either X (or Y ) can be replaced by its minimal sufficient statistic about Y (or X ) while preserving the mutual information. While intuitively reasonable, it is not obvious or straightforward that both variables can be replaced simultaneously. We demonstrate that this is in fact possible: the information X 's minimal sufficient statistic preserves about Y is exactly the information that Y 's minimal sufficient statistic preserves about X . We call this procedure information trimming. As an important corollary, we consider the case where one variable is a stochastic process' past and the other its future. In this case, the mutual information is the channel transmission rate between the channel's effective states. That is, the past-future mutual information (the excess entropy) is the amount of information about the future that can be predicted using the past. Translating our result about minimal sufficient statistics, this is equivalent to the mutual information between the forward- and reverse-time causal states of computational mechanics. We close by discussing multivariate extensions to this use of minimal sufficient statistics.
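The "information trimming" claim above, that replacing X by its minimal sufficient statistic about Y preserves I(X;Y), can be checked numerically for a finite joint distribution. A minimal sketch (function names are illustrative; in the discrete case the minimal sufficient statistic is the coarse-graining that lumps together x-values with identical conditionals p(y|x)):

```python
from math import log2
from collections import defaultdict

def mutual_information(joint):
    """I(X;Y) in bits from a dict {(x, y): p} of joint probabilities."""
    px, py = defaultdict(float), defaultdict(float)
    for (x, y), p in joint.items():
        px[x] += p
        py[y] += p
    return sum(p * log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

def merge_equivalent_x(joint):
    """Lump x-values with identical conditionals p(y|x): the coarsest
    such lumping is the minimal sufficient statistic of X about Y."""
    px = defaultdict(float)
    for (x, y), p in joint.items():
        px[x] += p
    # signature of x = its (rounded) conditional distribution over y
    sig = {}
    for (x, y), p in joint.items():
        sig.setdefault(x, {})[y] = round(p / px[x], 12)
    label = {x: tuple(sorted(s.items())) for x, s in sig.items()}
    merged = defaultdict(float)
    for (x, y), p in joint.items():
        merged[(label[x], y)] += p
    return dict(merged)
```

For any joint table in which two x-values share the same conditional distribution, merging them leaves the mutual information exactly unchanged, which is the single-variable half of the trimming result described in the abstract.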
Cisatracurium dose–response relationship in patients with chronic liver disease
Directory of Open Access Journals (Sweden)
Mohamed Z. Ali
2014-04-01
Results: The preoperative laboratory parameters showed statistically significant differences between the two groups regarding serum albumin, total bilirubin, ALT, AST, PT, PC, and INR. The operative data showed statistically insignificant differences between the two groups regarding the 1st dose response (p = 0.152), the estimated ED80 (p = 0.886), and the calculated 2nd dose (p = 0.886), and statistically significant differences between the two groups regarding the 2nd dose response (p = 0.006), the measured ED50 (p = 0.010), and the measured ED95 (p = 0.001). In conclusion, the ED50 and ED95 measured with the two-dose dose–response curve technique did not differ clinically from those obtained with the single-dose technique. The dose–response curve of cisatracurium in patients with chronic liver disease differed only clinically insignificantly from that of healthy subjects.
Long-range memory and non-Markov statistical effects in human sensorimotor coordination
M. Yulmetyev, Renat; Emelyanova, Natalya; Hänggi, Peter; Gafarov, Fail; Prokhorov, Alexander
2002-12-01
In this paper, non-Markov statistical processes and long-range memory effects in human sensorimotor coordination are investigated. The theoretical basis of this study is the statistical theory of non-stationary discrete non-Markov processes in complex systems (Phys. Rev. E 62, 6178 (2000)). Human sensorimotor coordination was experimentally studied by means of a standard dynamical tapping test on a group of 32 young people, with tap numbers up to 400. The test was carried out separately for the right and the left hand, according to the degree of domination of each brain hemisphere. The numerical analysis of the experimental results was made with the help of power spectra of the initial time correlation function, the memory functions of low orders, and the first three points of the statistical spectrum of the non-Markovity parameter. Our observations demonstrate that, with regard to the results of the standard dynamic tapping test, it is possible to divide all examinees into five different dynamic types. We introduce a conflict coefficient to estimate quantitatively the order-disorder effects underlying living systems; it reflects the existence of an imbalance between nervous and motor human coordination. The suggested classification of neurophysiological activity represents a dynamic generalization of the well-known neuropsychological types and provides a new approach in modern neuropsychology.
The Effects of Pre-Lecture Quizzes on Test Anxiety and Performance in a Statistics Course
Brown, Michael J.; Tallon, Jennifer
2015-01-01
The purpose of our study was to examine the effects of pre-lecture quizzes in a statistics course. Students (N = 70) from 2 sections of an introductory statistics course served as participants in this study. One section completed pre-lecture quizzes whereas the other section did not. Completing pre-lecture quizzes was associated with improved exam…
An Introduction to Confidence Intervals for Both Statistical Estimates and Effect Sizes.
Capraro, Mary Margaret
This paper summarizes methods of estimating confidence intervals, including classical intervals and intervals for effect sizes. The recent American Psychological Association (APA) Task Force on Statistical Inference report suggested that confidence intervals should always be reported, and the fifth edition of the APA "Publication Manual"…
An Improved Rank Correlation Effect Size Statistic for Single-Case Designs: Baseline Corrected Tau.
Tarlow, Kevin R
2017-07-01
Measuring treatment effects when an individual's pretreatment performance is improving poses a challenge for single-case experimental designs. It may be difficult to determine whether improvement is due to the treatment or due to the preexisting baseline trend. Tau- U is a popular single-case effect size statistic that purports to control for baseline trend. However, despite its strengths, Tau- U has substantial limitations: Its values are inflated and not bound between -1 and +1, it cannot be visually graphed, and its relatively weak method of trend control leads to unacceptable levels of Type I error wherein ineffective treatments appear effective. An improved effect size statistic based on rank correlation and robust regression, Baseline Corrected Tau, is proposed and field-tested with both published and simulated single-case time series. A web-based calculator for Baseline Corrected Tau is also introduced for use by single-case investigators.
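The core idea behind Baseline Corrected Tau can be sketched as follows. This is a simplified illustration, not Tarlow's exact procedure (which first tests whether the baseline trend is significant before correcting, and pairs Theil-Sen regression with formal inference); the always-correct shortcut and function names below are assumptions:

```python
from statistics import median

def theil_sen_slope(y):
    """Robust trend estimate: median of all pairwise slopes, x = 0,1,2,..."""
    slopes = [(y[j] - y[i]) / (j - i)
              for i in range(len(y)) for j in range(i + 1, len(y))]
    return median(slopes)

def kendall_tau(u, v):
    """Kendall's tau-a: (concordant - discordant) / total pairs
    (no tie correction -- a simplification)."""
    n = len(u)
    s = 0
    for i in range(n):
        for j in range(i + 1, n):
            du = (u[j] > u[i]) - (u[j] < u[i])
            dv = (v[j] > v[i]) - (v[j] < v[i])
            s += du * dv
    return s / (n * (n - 1) / 2)

def baseline_corrected_tau(baseline, treatment):
    """Detrend the whole series with the baseline's robust slope,
    then correlate a phase dummy with the corrected scores."""
    slope = theil_sen_slope(baseline)
    y = list(baseline) + list(treatment)
    corrected = [v - slope * t for t, v in enumerate(y)]
    phase = [0] * len(baseline) + [1] * len(treatment)
    return kendall_tau(phase, corrected)
```

The key property this reproduces: a treatment phase that merely continues the pre-existing baseline trend yields an effect size near zero, while a genuine level shift on top of the trend yields a clearly positive tau.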
Effect of bleaching on color change and surface topography of composite restorations.
Pruthi, Gunjan; Jain, Veena; Kandpal, H C; Mathur, Vijay Prakash; Shah, Naseem
2010-01-01
This study was conducted to determine the effect of a 15% carbamide peroxide bleaching agent on color change and surface topography of different composite veneering materials: Filtek Z350 (3M ESPE), Esthet X (Dentsply India), and Admira (Voco, Germany). Methods. 30 samples were fabricated for evaluation of color change using the CIELAB color system and a Gonioreflectometer (GK 311/M, ZEISS). 45 disc-shaped specimens were made for evaluation of surface topography after bleaching (Nupro White Gold; Dentsply) using SEM. Statistical analysis. One-way ANOVA and multiple comparison tests were used to analyze the data. Statistical significance was declared if the P value was .05 or less. Results and conclusion. All the specimens showed significant discoloration (ΔE > 3.3) after their immersion in solutions representing food and beverages. The total color change after bleaching as compared to baseline color was significant in Filtek Z350 (P = .000) and Esthet X (P = .002), while it was insignificant for Admira (P = .18). Esthet X showed maximum surface roughness, followed by Admira and Filtek Z350. Bleaching was effective in reducing the discoloration to a clinically acceptable value in all three groups (ΔE < 3.3).
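The ΔE > 3.3 threshold used above refers to a Euclidean distance in CIELAB space. A minimal sketch of the classic CIE76 formula (the abstract does not state which ΔE variant was used, so treating it as CIE76 is an assumption, and the function names are illustrative):

```python
def delta_e_cie76(lab1, lab2):
    """CIE76 color difference: Euclidean distance between two
    (L*, a*, b*) triples."""
    return sum((p - q) ** 2 for p, q in zip(lab1, lab2)) ** 0.5

def clinically_acceptable(lab_before, lab_after, threshold=3.3):
    """Apply the acceptability threshold cited in the study above."""
    return delta_e_cie76(lab_before, lab_after) <= threshold
```

For example, a shift of 3 units in a* and 4 units in b* at constant lightness gives ΔE = 5, which would count as a perceptible, clinically unacceptable change under this threshold.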
Statistical methods for determining the effect of mammography screening
DEFF Research Database (Denmark)
Lophaven, Søren
2016-01-01
In an overview of five randomised controlled trials from Sweden, a reduction of 29% was found in breast cancer mortality in women aged 50-69 at randomisation after a follow-up of 5-13 years. Organised, population-based mammography service screening was introduced on the basis of these results ... in 2007-2008. Women aged 50-69 were invited to screening every second year. Taking advantage of the registers of population and health, we present statistical methods for evaluating the effect of mammography screening on breast cancer mortality (Olsen et al. 2005, Njor et al. 2015 and Weedon-Fekjær et al. ...
Effects of non-thermal mobile phone radiation on breast adenocarcinoma cells
Directory of Open Access Journals (Sweden)
Zen Fourie
2011-09-01
Full Text Available Mobile phone usage currently exceeds landline communication in Africa. The extent of this usage has raised concerns about the long-term health effects of the ongoing use of mobile phones. To assess the physiological effects of radiation from mobile phones in vitro, MCF-7 breast adenocarcinoma cells were exposed to 2 W/kg non-thermal 900-MHz mobile phone radiation. The effects investigated were those on metabolic activity, cell morphology, cell cycle progression, phosphatidylserine (PS) externalisation and the generation of reactive oxygen and nitrogen species. Statistically insignificant increases in mitochondrial dehydrogenase activity were observed in irradiated cells when compared to controls. Fluorescent detection of F-actin demonstrated an increase in F-actin stress fibre formation in irradiated MCF-7 cells. Cell cycle progression revealed no statistically significant variation. A small increase in early and late apoptotic events in irradiated MCF-7 cells was observed. No statistically significant changes were observed in reactive oxygen and reactive nitrogen species generation. In addition, quantitative and qualitative analyses of cell cycle activity and nuclear and cytosolic changes, respectively, revealed no significant changes. In conclusion, exposure to 1 h of 900-MHz irradiation induced an increase in PS externalisation and an increase in the formation of F-actin stress fibres in MCF-7 cells. Data obtained from this study, and their correlation with other studies, provide intriguing links between radio frequency radiation and cellular events and warrant further investigation.
Mills, Jada Jamerson
There is a need for STEM (science, technology, engineering, and mathematics) education to be taught effectively in elementary schools. In order to achieve this, teacher preparation programs should graduate confident, content-strong teachers who can convey knowledge to elementary students. This study used interdisciplinary collaboration between the School of Education and the College of Liberal Arts through a Learning-by-Teaching method (LdL: Lernen durch Lehren in German), examining pre-service teacher (PST) achievement levels in understanding science concepts based on pretest and posttest data, the quality of lesson plans developed, and enjoyment of the class based on the collaboration with science students. The PSTs enrolled in two treatment sections of EDEL 404: Science in the Elementary Classroom collaborated with science students enrolled in BISC 327: Introductory Neuroscience to enhance their science skills and create case-based lesson plans on neuroethology topics: echolocation, electrosensory reception, steroid hormones, and vocal learning. The PSTs enrolled in the single control section of EDEL 404 collaborated with fellow elementary education majors to develop lesson plans based on the same selected topics. Qualitative interviews of education faculty, science faculty, and PSTs provided depth to the quantitative findings. Upon lesson plan completion, in-service teachers also graded the two best and two worst plans for the treatment and control sections, and a science reviewer graded the plans for scientific accuracy. Statistical analyses were conducted for the hypotheses, and one significant hypothesis found that PSTs who collaborated with science students had more positive science lesson plan writing attitudes than those who did not. Despite overall insignificant statistical analyses, all PSTs responded as more confident after collaboration. Additionally, interviews provided meaning and understanding to the insignificant statistical results as well as the scientific accuracy of
A d-statistic for single-case designs that is equivalent to the usual between-groups d-statistic.
Shadish, William R; Hedges, Larry V; Pustejovsky, James E; Boyajian, Jonathan G; Sullivan, Kristynn J; Andrade, Alma; Barrientos, Jeannette L
2014-01-01
We describe a standardised mean difference statistic (d) for single-case designs that is equivalent to the usual d in between-groups experiments. We show how it can be used to summarise treatment effects over cases within a study, to do power analyses in planning new studies and grant proposals, and to meta-analyse effects across studies of the same question. We discuss limitations of this d-statistic, and possible remedies to them. Even so, this d-statistic is better founded statistically than other effect size measures for single-case design, and unlike many general linear model approaches such as multilevel modelling or generalised additive models, it produces a standardised effect size that can be integrated over studies with different outcome measures. SPSS macros for both effect size computation and power analysis are available.
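For reference, the usual between-groups d that the single-case statistic above is designed to match can be sketched as follows. This is the standard pooled-SD formulation with illustrative data, not Shadish et al.'s single-case estimator itself:

```python
import math
from statistics import mean, variance

def cohens_d(group1, group2):
    """Usual between-groups standardised mean difference:
    d = (mean1 - mean2) / pooled SD, using n-1 sample variances."""
    n1, n2 = len(group1), len(group2)
    pooled_var = ((n1 - 1) * variance(group1) +
                  (n2 - 1) * variance(group2)) / (n1 + n2 - 2)
    return (mean(group1) - mean(group2)) / math.sqrt(pooled_var)

# Hypothetical treatment and control outcomes.
treated = [12.1, 13.4, 11.8, 14.0, 12.9]
control = [10.2, 11.0, 10.8, 9.9, 10.6]
print(round(cohens_d(treated, control), 2))  # → 3.27
```

A standardised d like this is what lets effects be integrated over studies with different outcome measures, as the abstract notes.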
Ramkilowan, A.; Griffith, D. J.
2017-10-01
Surveillance modelling in terms of the standard Detect, Recognise and Identify (DRI) thresholds remains a key requirement for determining the effectiveness of surveillance sensors. With readily available computational resources it has become feasible to perform statistically representative evaluations of the effectiveness of these sensors. A new capability for performing this Monte-Carlo type analysis is demonstrated in the MORTICIA (Monte-Carlo Optical Rendering for Theatre Investigations of Capability under the Influence of the Atmosphere) software package developed at the Council for Scientific and Industrial Research (CSIR). This first-generation, Python-based, open-source integrated software package, currently in the alpha stage of development, aims to provide all the functionality required to perform statistical investigations of the effectiveness of optical surveillance systems in specific or generic deployment theatres. This includes modelling of the mathematical and physical processes that govern, amongst other components of a surveillance system, a sensor's detector and optical components, a target and its background, as well as the intervening atmospheric influences. In this paper we discuss integral aspects of the bespoke framework that are critical to the longevity of all subsequent modelling efforts. Additionally, some preliminary results are presented.
Statistical study of chemical additives effects in the waste cementation
International Nuclear Information System (INIS)
Tello, Cledola C.O. de; Diniz, Paula S.; Haucz, Maria J.A.
1997-01-01
This paper presents the statistical study that was carried out to analyse the effect of chemical additives in the waste cementation process. Three different additives from two manufacturers were tested: a set accelerator, a set retarder and superplasticizers, in cemented pastes with and without bentonite. The experiments were planned in accordance with a 2^3 factorial design, so that the effect of each type of additive, its quantity and its manufacturer on the cemented paste and specimens could be evaluated. The results showed that the use of these additives can improve the cementation process and the product. The admixture quantity and the association with bentonite were the most important factors affecting the process and product characteristics. (author). 4 refs., 9 figs., 4 tabs
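A main-effect estimate from a full 2^3 factorial design of the kind described can be sketched as follows. The responses are hypothetical; the three -1/+1 factors stand in for additive type, quantity and manufacturer:

```python
from itertools import product
from statistics import mean

# One hypothetical response per combination of three two-level factors,
# in the lexicographic order produced by product((-1, 1), repeat=3).
responses = [5.2, 5.8, 6.1, 6.9, 5.0, 5.6, 6.3, 7.1]
runs = dict(zip(product((-1, 1), repeat=3), responses))

def main_effect(runs, factor):
    """Average response at the high level minus at the low level."""
    hi = mean(y for lv, y in runs.items() if lv[factor] == 1)
    lo = mean(y for lv, y in runs.items() if lv[factor] == -1)
    return hi - lo

# Factor 0 has no effect here; factors 1 and 2 shift the response.
print([round(main_effect(runs, f) + 0.0, 3) for f in range(3)])
```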
Powerful Statistical Inference for Nested Data Using Sufficient Summary Statistics
Dowding, Irene; Haufe, Stefan
2018-01-01
Hierarchically-organized data arise naturally in many psychology and neuroscience studies. As the standard assumption of independent and identically distributed samples does not hold for such data, two important problems are to accurately estimate group-level effect sizes, and to obtain powerful statistical tests against group-level null hypotheses. A common approach is to summarize subject-level data by a single quantity per subject, which is often the mean or the difference between class means, and treat these as samples in a group-level t-test. This “naive” approach is, however, suboptimal in terms of statistical power, as it ignores information about the intra-subject variance. To address this issue, we review several approaches to deal with nested data, with a focus on methods that are easy to implement. With what we call the sufficient-summary-statistic approach, we highlight a computationally efficient technique that can improve statistical power by taking into account within-subject variances, and we provide step-by-step instructions on how to apply this approach to a number of frequently-used measures of effect size. The properties of the reviewed approaches and the potential benefits over a group-level t-test are quantitatively assessed on simulated data and demonstrated on EEG data from a simulated-driving experiment. PMID:29615885
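The contrast between the "naive" per-subject summary and a precision-weighted alternative can be sketched as follows. The data are hypothetical and the inverse-squared-standard-error weighting is a simplified stand-in for the approaches reviewed, not the authors' exact procedure:

```python
from statistics import mean, variance

# Hypothetical subject-level data: each subject contributes several trials.
subjects = [
    [2.1, 2.5, 1.9, 2.3, 2.2],   # low within-subject variance
    [1.0, 3.8, 0.2, 4.1, 2.4],   # high within-subject variance
    [2.0, 2.2, 2.4, 1.8, 2.1],
]

# "Naive" summary: one mean per subject, every subject weighted equally.
naive = mean(mean(s) for s in subjects)

# Precision-weighted summary: weight each subject mean by the inverse of
# its squared standard error (within-subject variance / number of trials).
weights = [len(s) / variance(s) for s in subjects]
weighted = sum(w * mean(s) for w, s in zip(weights, subjects)) / sum(weights)

# The noisy subject is down-weighted, pulling the estimate toward the
# two precisely measured subjects.
print(round(naive, 3), round(weighted, 3))
```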
Porter, Kristin E.
2018-01-01
Researchers are often interested in testing the effectiveness of an intervention on multiple outcomes, for multiple subgroups, at multiple points in time, or across multiple treatment groups. The resulting multiplicity of statistical hypothesis tests can lead to spurious findings of effects. Multiple testing procedures (MTPs) are statistical…
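The abstract's truncated list does not name a specific multiple testing procedure; as one standard MTP for controlling the familywise error rate across many hypothesis tests, Holm's step-down correction can be sketched as:

```python
def holm_bonferroni(p_values, alpha=0.05):
    """Holm's step-down procedure: returns booleans marking which
    hypotheses are rejected while controlling the familywise error
    rate at alpha."""
    order = sorted(range(len(p_values)), key=lambda i: p_values[i])
    m = len(p_values)
    rejected = [False] * m
    for rank, i in enumerate(order):
        if p_values[i] <= alpha / (m - rank):
            rejected[i] = True
        else:
            break  # once one test fails, all larger p-values fail too
    return rejected

# Four outcome tests; only the two smallest p-values survive correction.
print(holm_bonferroni([0.001, 0.011, 0.03, 0.04]))  # [True, True, False, False]
```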
Directory of Open Access Journals (Sweden)
Farzaneh Saadati
Full Text Available Because students' ability to use statistics, which is mathematical in nature, is one of the concerns of educators, embedding the pedagogical characteristics of learning within an e-learning system is 'value added', because it facilitates the conventional method of learning mathematics. Many researchers emphasize the effectiveness of cognitive apprenticeship in learning and problem solving in the workplace. In a cognitive apprenticeship learning model, skills are learned within a community of practitioners through observation of modelling and then practice plus coaching. This study utilized an internet-based Cognitive Apprenticeship Model (i-CAM) in three phases and evaluated its effectiveness for improving statistics problem-solving performance among postgraduate students. The results showed that, when compared to the conventional mathematics learning model, the i-CAM could significantly promote students' problem-solving performance at the end of each phase. In addition, the differences in students' test scores were considered statistically significant after controlling for the pre-test scores. The findings conveyed in this paper confirm the considerable value of i-CAM in the improvement of statistics learning for non-specialized postgraduate students.
Isotopic study of liver function after narcosis in small animals
International Nuclear Information System (INIS)
Le Qui Cuong; Kiss, Bela; Jakab, Tivadar; Szilvasi, Istvan; Spett, Borbala
1984-01-01
Dynamic functional study of the liver was performed with (99m)Tc-TECHIDA in narcotized (Halothane) mice and rabbits. Hepatic uptake of the radiopharmaceutical decreased significantly in the narcotized group. Excretion also decreased, but statistically insignificantly. These alterations in liver function could be attributed to the hypotensive effect of Halothane. (author)
Vortices in superconducting films: Statistics and fractional quantum Hall effect
International Nuclear Information System (INIS)
Dziarmaga, J.
1996-01-01
We present a derivation of the Berry phase picked up during exchange of parallel vortices. This derivation is based on the Bogolubov-de Gennes formalism. The origin of the Magnus force is also critically reanalyzed. The Magnus force can be interpreted as an interaction with the effective magnetic field. The effective magnetic field may be even of the order of 10^6 T/A. We discuss a possibility of the fractional quantum Hall effect (FQHE) in vortex systems. As the real magnetic field is varied to drive changes in vortex density, the vortex density will prefer to stay at some quantized values. The mere existence of the FQHE does not depend on vortex quantum statistics, although the pattern of the plateaux does. We also discuss how the density of anyonic vortices can lower the effective strength of the Magnus force, which might be observable in measurements of Hall resistivity. copyright 1996 The American Physical Society
Interacting Effects of Instructions and Presentation Rate on Visual Statistical Learning
Directory of Open Access Journals (Sweden)
Julie eBertels
2015-11-01
Full Text Available The statistical regularities of a sequence of visual shapes can be learned incidentally. Arciuli et al. (2014) recently argued that intentional instructions only improve learning at slow presentation rates, as they favor the use of explicit strategies. The aim of the present study was (1) to test this assumption directly by investigating how instructions (incidental vs. intentional) and presentation rate (fast vs. slow) affect the acquisition of knowledge and (2) to examine how these factors influence the conscious vs. unconscious nature of the knowledge acquired. To this aim, we exposed participants to four triplets of shapes, presented sequentially in a pseudo-random order, and assessed their degree of learning in a subsequent completion task that integrated confidence judgments. Supporting Arciuli et al.'s claim, participant performance only benefited from intentional instructions at slow presentation rates. Moreover, informing participants beforehand about the existence of statistical regularities increased their explicit knowledge of the sequences, an effect that was not modulated by presentation speed. These results support that, although visual statistical learning can take place incidentally and, to some extent, outside conscious awareness, factors such as presentation rate and prior knowledge can boost learning of these regularities, presumably by favoring the acquisition of explicit knowledge.
Directory of Open Access Journals (Sweden)
A. E. Pismak
2016-03-01
Full Text Available Subject of Research. The paper is focused on the structural organization of Wiktionary articles in the aspect of their usage as the base for a semantic network. Wiktionary community references, article templates and article markup features are analyzed. The problem of numerical estimation of the semantic similarity of structural elements in Wiktionary articles is considered. Analysis of existing software for semantic similarity estimation of such elements is carried out; the algorithms of their functioning are studied; their advantages and disadvantages are shown. Methods. Mathematical statistics methods were used to analyze the markup features of Wiktionary articles. A method of semantic similarity computing based on statistical data for the compared structural elements was proposed. Main Results. We have concluded that there is no possibility of direct use of Wiktionary articles as the source for a semantic network. We have proposed to find hidden similarity between article elements, and for that purpose we have developed an algorithm for the calculation of confidence coefficients proving that each pair of sentences is semantically near. The research of quantitative and qualitative characteristics of the developed algorithm has shown its major performance advantage over the other existing solutions, at the cost of an insignificantly higher error rate. Practical Relevance. The resulting algorithm may be useful in developing tools for automatic parsing of Wiktionary articles. The developed method could be used in computing the semantic similarity of short text fragments in natural language in cases where algorithm performance requirements are higher than accuracy requirements.
Kim, Yoonsang; Emery, Sherry
2013-01-01
Several statistical packages are capable of estimating generalized linear mixed models and these packages provide one or more of three estimation methods: penalized quasi-likelihood, Laplace, and Gauss-Hermite. Many studies have investigated these methods’ performance for the mixed-effects logistic regression model. However, the authors focused on models with one or two random effects and assumed a simple covariance structure between them, which may not be realistic. When there are multiple correlated random effects in a model, the computation becomes intensive, and often an algorithm fails to converge. Moreover, in our analysis of smoking status and exposure to anti-tobacco advertisements, we have observed that when a model included multiple random effects, parameter estimates varied considerably from one statistical package to another even when using the same estimation method. This article presents a comprehensive review of the advantages and disadvantages of each estimation method. In addition, we compare the performances of the three methods across statistical packages via simulation, which involves two- and three-level logistic regression models with at least three correlated random effects. We apply our findings to a real dataset. Our results suggest that two packages—SAS GLIMMIX Laplace and SuperMix Gaussian quadrature—perform well in terms of accuracy, precision, convergence rates, and computing speed. We also discuss the strengths and weaknesses of the two packages in regard to sample sizes. PMID:24288415
Kim, Yoonsang; Choi, Young-Ku; Emery, Sherry
2013-08-01
Several statistical packages are capable of estimating generalized linear mixed models and these packages provide one or more of three estimation methods: penalized quasi-likelihood, Laplace, and Gauss-Hermite. Many studies have investigated these methods' performance for the mixed-effects logistic regression model. However, the authors focused on models with one or two random effects and assumed a simple covariance structure between them, which may not be realistic. When there are multiple correlated random effects in a model, the computation becomes intensive, and often an algorithm fails to converge. Moreover, in our analysis of smoking status and exposure to anti-tobacco advertisements, we have observed that when a model included multiple random effects, parameter estimates varied considerably from one statistical package to another even when using the same estimation method. This article presents a comprehensive review of the advantages and disadvantages of each estimation method. In addition, we compare the performances of the three methods across statistical packages via simulation, which involves two- and three-level logistic regression models with at least three correlated random effects. We apply our findings to a real dataset. Our results suggest that two packages, SAS GLIMMIX Laplace and SuperMix Gaussian quadrature, perform well in terms of accuracy, precision, convergence rates, and computing speed. We also discuss the strengths and weaknesses of the two packages in regard to sample sizes.
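The Gauss-Hermite method named in these two abstracts approximates the normal-density integrals that arise when integrating out random effects. A minimal sketch of the quadrature rule itself, assuming NumPy is available (this is not the GLIMMIX or SuperMix implementation):

```python
import math
import numpy as np

# Gauss-Hermite quadrature approximates integrals of the form
# ∫ f(x) exp(-x^2) dx; random-effect integrals in mixed-effects
# logistic models reduce to this form after a change of variable.
nodes, weights = np.polynomial.hermite.hermgauss(20)

def gh_expectation(g):
    """E[g(X)] for X ~ N(0, 1) via the substitution x = z * sqrt(2)."""
    return float(np.sum(weights * g(nodes * math.sqrt(2))) / math.sqrt(math.pi))

# Sanity checks against known moments of the standard normal.
print(round(gh_expectation(lambda x: x**2), 6))  # E[X^2] = 1
print(round(gh_expectation(lambda x: x**4), 6))  # E[X^4] = 3
```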
More 'mapping' in brain mapping: statistical comparison of effects
DEFF Research Database (Denmark)
Jernigan, Terry Lynne; Gamst, Anthony C.; Fennema-Notestine, Christine
2003-01-01
The term 'mapping' in the context of brain imaging conveys to most the concept of localization; that is, a brain map is meant to reveal a relationship between some condition or parameter and specific sites within the brain. However, in reality, conventional voxel-based maps of brain function ..., or for that matter of brain structure, are generally constructed using analyses that yield no basis for inferences regarding the spatial nonuniformity of the effects. In the normal analysis path for functional images, for example, there is nowhere a statistical comparison of the observed effect in any voxel relative ... to that in any other voxel. Under these circumstances, strictly speaking, the presence of significant activation serves as a legitimate basis only for inferences about the brain as a unit. In their discussion of results, investigators rarely are content to confirm the brain's role, and instead generally prefer ...
Porter, Kristin E.
2016-01-01
In education research and in many other fields, researchers are often interested in testing the effectiveness of an intervention on multiple outcomes, for multiple subgroups, at multiple points in time, or across multiple treatment groups. The resulting multiplicity of statistical hypothesis tests can lead to spurious findings of effects. Multiple…
Noble, Bram; Liu, Jialang; Hackett, Paul
2017-04-01
This paper explores the opportunities and constraints of project-based environmental assessment as a means to support the assessment and management of cumulative environmental effects. A case study of the hydroelectric sector is used to determine whether sufficient information is available over time through project-by-project assessments to support an adequate understanding of cumulative change. Results show inconsistency from one project to the next in terms of the components and indicators assessed, limited transfer of baseline information between project assessments over time, and the same issues and concerns being raised by review panels, even though the projects reviewed are operating in the same watershed and are operated by the same proponent. Project environmental assessments must be managed and coordinated as part of a larger system of impact assessment if project-by-project assessments are to provide a meaningful forum for learning about and understanding cumulative change. The paper concludes with recommendations for improved project-based assessment practice in support of cumulative effects assessment and management.
The Effect of Project Based Learning on the Statistical Literacy Levels of Student 8th Grade
Koparan, Timur; Güven, Bülent
2014-01-01
This study examines the effect of project-based learning on 8th grade students' statistical literacy levels. A performance test was developed for this aim. A quasi-experimental research model was used. In this context, statistics was taught with the traditional method in the control group and using project-based…
Effect of Probiotic Preparation Enriched with Selenium on Qualitative Parameters of Table Eggs
Directory of Open Access Journals (Sweden)
Martin Mellen
2014-05-01
Full Text Available In this experiment, the effects of a diet for laying hens supplemented with a probiotic product containing an organic form of selenium on egg weight, albumen quality, yolk quality and egg shell quality were studied. Isa Brown hens (n=90) were randomly divided at the age of 17 weeks into three groups (30 birds per group). Hens in all groups consumed the complete feed mixture ad libitum. In the control group, the drinking water contained no additions. In the first experimental group, the probiotic product was added to the water; in the second experimental group, the same probiotic preparation enriched with 0.8 to 1 mg of organic selenium per 1 g of the product was added to the water. The probiotic preparations were administered at a dose of 15 mg per 6 l of water daily in both experimental groups. The monitored physical parameters of eggs were: egg weight (g), specific egg weight (g/cm3), albumen weight (g), albumen height (mm), albumen index, Haugh units (HJ), yolk weight (g), yolk index, yolk color (°HLR), egg shell weight, egg shell specific weight (g/cm3), egg shell strength (N/cm2), and average eggshell thickness (µm). The experiment lasted 48 weeks. The results showed that egg weight was slightly higher in both experimental groups compared with the control group; the differences between the groups were not statistically significant (P>0.05). The values, in the order of groups, were 60.97 ± 4.97, 61.18 ± 5.00 and 61.75 ± 5.89 (g ± SD). An insignificant impact of the added probiotic preparation and the probiotic preparation enriched with selenium on the quality parameters of table eggs was found. Yolk index, albumen index, Haugh units and the average egg shell thickness were only slightly, statistically insignificantly, higher in the experimental groups (P > 0.05).
Statistical correlations in an ideal gas of particles obeying fractional exclusion statistics.
Pellegrino, F M D; Angilella, G G N; March, N H; Pucci, R
2007-12-01
After a brief discussion of the concepts of fractional exchange and fractional exclusion statistics, we report partly analytical and partly numerical results on thermodynamic properties of assemblies of particles obeying fractional exclusion statistics. The effect of dimensionality is one focal point, the ratio mu/(k_B T) of chemical potential to thermal energy being obtained numerically as a function of a scaled particle density. Pair correlation functions are also presented as a function of the statistical parameter, with Friedel oscillations developing close to the fermion limit, for sufficiently large density.
Reese, Sarah E; Archer, Kellie J; Therneau, Terry M; Atkinson, Elizabeth J; Vachon, Celine M; de Andrade, Mariza; Kocher, Jean-Pierre A; Eckel-Passow, Jeanette E
2013-11-15
Batch effects are due to probe-specific systematic variation between groups of samples (batches) resulting from experimental features that are not of biological interest. Principal component analysis (PCA) is commonly used as a visual tool to determine whether batch effects exist after applying a global normalization method. However, PCA yields linear combinations of the variables that contribute maximum variance and thus will not necessarily detect batch effects if they are not the largest source of variability in the data. We present an extension of PCA to quantify the existence of batch effects, called guided PCA (gPCA). We describe a test statistic that uses gPCA to test whether a batch effect exists. We apply our proposed test statistic derived using gPCA to simulated data and to two copy number variation case studies: the first study consisted of 614 samples from a breast cancer family study using Illumina Human 660 bead-chip arrays, whereas the second case study consisted of 703 samples from a family blood pressure study that used the Affymetrix SNP Array 6.0. We demonstrate that our statistic has good statistical properties and is able to identify significant batch effects in two copy number variation case studies. We developed a new statistic that uses gPCA to identify whether batch effects exist in high-throughput genomic data. Although our examples pertain to copy number data, gPCA is general and can be used on other data types as well. The gPCA R package (available via CRAN) provides functionality and data to perform the methods in this article.
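A heavily simplified sketch of the gPCA idea, not the authors' R implementation: compare the variance of the data projected on a batch-guided direction with the variance along the unguided first principal component. Ratios near 1 suggest batch is the dominant source of variability. Simulated data; assumes NumPy is available:

```python
import numpy as np

rng = np.random.default_rng(0)

def batch_effect_delta(X, batches):
    """Simplified gPCA-style statistic: variance along the direction from
    an SVD of the batch-indicator-weighted data (Y'X), divided by the
    variance along the unguided first principal component of X."""
    X = X - X.mean(axis=0)
    levels = sorted(set(batches))
    Y = np.array([[b == lvl for lvl in levels] for b in batches], float)
    v_pca = np.linalg.svd(X, full_matrices=False)[2][0]       # unguided PC1
    v_g = np.linalg.svd(Y.T @ X, full_matrices=False)[2][0]   # batch-guided
    return float(np.var(X @ v_g) / np.var(X @ v_pca))

# Simulated 40 samples x 10 probes with a strong additive batch shift.
batches = [0] * 20 + [1] * 20
X = rng.normal(size=(40, 10))
X[20:] += 2.0                      # batch effect on every probe
print(round(batch_effect_delta(X, batches), 2))
```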
Koparan, Timur
2016-02-01
In this study, the effect of dynamic statistics software on the achievement and attitudes of prospective teachers is examined. To this end, an achievement test, an attitude scale for statistics and interviews were used as data collection tools. The achievement test comprises 8 problems based on statistical data, and the attitude scale comprises 13 Likert-type items. The study was carried out in the fall semester of the 2014-2015 academic year at a university in Turkey. The study, which employed the pre-test-post-test control group design of the quasi-experimental research method, was carried out on a group of 80 prospective teachers, 40 in the control group and 40 in the experimental group. Both groups had four-hour classes about descriptive statistics. The classes with the control group were carried out through traditional methods, while dynamic statistics software was used in the experimental group. Five prospective teachers from the experimental group were interviewed clinically after the application for a deeper examination of their views about it. The qualitative data gained are presented under various themes. At the end of the study, it was found that there is a significant difference in favour of the experimental group in terms of achievement and attitudes, and that the prospective teachers have an affirmative approach to the use of dynamic software and see it as an effective tool to enrich maths classes. In accordance with the findings of the study, it is suggested that dynamic software, which offers unique opportunities, be used in classes by teachers and students.
Renyi statistics in equilibrium statistical mechanics
International Nuclear Information System (INIS)
Parvan, A.S.; Biro, T.S.
2010-01-01
The Renyi statistics in the canonical and microcanonical ensembles is examined, both in general and in particular for the ideal gas. In the microcanonical ensemble, the Renyi statistics is equivalent to the Boltzmann-Gibbs statistics. By the exact analytical results for the ideal gas, it is shown that in the canonical ensemble, taking the thermodynamic limit, the Renyi statistics is also equivalent to the Boltzmann-Gibbs statistics. Furthermore, it satisfies the requirements of equilibrium thermodynamics, i.e. the thermodynamical potential of the statistical ensemble is a homogeneous function of first degree in its extensive variables of state. We conclude that the Renyi statistics arrives at the same thermodynamical relations as those stemming from the Boltzmann-Gibbs statistics in this limit.
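The equivalence claimed above rests on the q → 1 limit, in which the Renyi entropy reduces to the Boltzmann-Gibbs (Shannon) form. This can be checked numerically for a toy probability distribution, using the standard Renyi entropy formula (the distribution below is illustrative):

```python
import math

def renyi_entropy(p, q):
    """Renyi entropy S_q = ln(sum_i p_i^q) / (1 - q), for q != 1."""
    return math.log(sum(pi ** q for pi in p)) / (1.0 - q)

def shannon_entropy(p):
    """Boltzmann-Gibbs/Shannon entropy, the q -> 1 limit of S_q."""
    return -sum(pi * math.log(pi) for pi in p)

p = [0.5, 0.25, 0.125, 0.125]
# The gap to the Shannon value shrinks as q approaches 1.
for q in (0.9, 0.99, 0.999):
    print(round(renyi_entropy(p, q) - shannon_entropy(p), 4))
```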
Effect of ultrasound frequency on the Nakagami statistics of human liver tissues.
Directory of Open Access Journals (Sweden)
Po-Hsiang Tsui
Full Text Available The analysis of the backscattered statistics using the Nakagami parameter is an emerging ultrasound technique for assessing hepatic steatosis and fibrosis. Previous studies indicated that the echo amplitude distribution of a normal liver follows the Rayleigh distribution (the Nakagami parameter m is close to 1). However, using different frequencies may change the backscattered statistics of normal livers. This study explored the frequency dependence of the backscattered statistics in human livers and then discussed the sources of ultrasound scattering in the liver. A total of 30 healthy participants were enrolled to undergo a standard care ultrasound examination of the liver, which is a natural model containing diffuse and coherent scatterers. The liver of each volunteer was scanned from the right intercostal view to obtain image raw data at different central frequencies ranging from 2 to 3.5 MHz. Phantoms with diffuse scatterers only were also made and scanned using the same protocol for comparison with the clinical data. The Nakagami parameter-frequency correlation was evaluated using Pearson correlation analysis. The median and interquartile range of the Nakagami parameter obtained from livers was 1.00 (0.98-1.05) for 2 MHz, 0.93 (0.89-0.98) for 2.3 MHz, 0.87 (0.84-0.92) for 2.5 MHz, 0.82 (0.77-0.88) for 3.3 MHz, and 0.81 (0.76-0.88) for 3.5 MHz. The Nakagami parameter decreased with increasing central frequency (r = -0.67, p < 0.0001). However, the effect of ultrasound frequency on the statistical distribution of the backscattered envelopes was not found in the phantom results (r = -0.147, p = 0.0727). The current results demonstrated that the backscattered statistics of normal livers is frequency-dependent. Moreover, coherent scatterers may be the primary factor dominating the frequency dependence of the backscattered statistics in a liver.
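The Rayleigh reference point above (m close to 1) can be sketched with the moment-based Nakagami parameter estimator. The envelope below is simulated from Gaussian quadrature components; real studies estimate m from backscattered RF image data:

```python
import math
import random

def nakagami_m(envelope):
    """Moment-based Nakagami parameter: m = (E[R^2])^2 / Var(R^2).
    m close to 1 corresponds to a Rayleigh-distributed envelope."""
    r2 = [r * r for r in envelope]
    mu = sum(r2) / len(r2)
    var = sum((x - mu) ** 2 for x in r2) / len(r2)
    return mu * mu / var

# A Rayleigh envelope (fully developed speckle, diffuse scatterers only,
# as in the study's phantoms) should give m close to 1.
rng = random.Random(42)
rayleigh = [math.sqrt(rng.gauss(0, 1) ** 2 + rng.gauss(0, 1) ** 2)
            for _ in range(20000)]
print(round(nakagami_m(rayleigh), 2))
```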
Jana, Madhusudan
2015-01-01
Statistical mechanics is self-sufficient, written in a lucid manner, keeping in mind the exam system of the universities. The need to study this subject and its relation to thermodynamics is discussed in detail. Starting from the Liouville theorem, statistical mechanics is gradually developed thoroughly. All three types of statistical distribution functions are derived separately, with their range of applications and limitations. The non-interacting ideal Bose gas and Fermi gas are discussed thoroughly. Properties of liquid He-II and the corresponding models are depicted. White dwarfs and condensed matter physics, and transport phenomena (thermal and electrical conductivity, Hall effect, magnetoresistance, viscosity, diffusion, etc.) are discussed. A basic understanding of the Ising model is given to explain phase transitions. The book ends with detailed coverage of the method of ensembles (namely microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems are...
Goodman, J. W.
This book is based on the thesis that some training in the area of statistical optics should be included as a standard part of any advanced optics curriculum. Random variables are discussed, taking into account definitions of probability and random variables, distribution functions and density functions, an extension to two or more random variables, statistical averages, transformations of random variables, sums of real random variables, Gaussian random variables, complex-valued random variables, and random phasor sums. Other subjects examined are related to random processes, some first-order properties of light waves, the coherence of optical waves, some problems involving high-order coherence, effects of partial coherence on imaging systems, imaging in the presence of randomly inhomogeneous media, and fundamental limits in photoelectric detection of light. Attention is given to deterministic versus statistical phenomena and models, the Fourier transform, and the fourth-order moment of the spectrum of a detected speckle image.
Chandu, G S; Asnani, Pooja; Gupta, Siddarth; Faisal Khan, Mohd.
2015-01-01
Background: Use of an alkaline peroxide denture cleanser with different water temperatures could change the surface hardness of an acrylic denture and also have a bleaching effect. The purpose of the study was to determine the effect of increased water content during thermal cycling on the surface hardness of hot water-treated acrylic denture base compared to warm water-treated acrylic, and to compare the bleaching effect of the alkaline peroxide solution on hot water- and warm water-treated acrylic denture base. Materials and Methods: Forty samples (10 mm × 10 mm × 2.5 mm) were prepared. After measurement of the initial hardness, the 40 samples were randomly assigned to two groups. Group A: 20 samples were immersed in 250 ml of warm distilled water at 40°C with an alkaline peroxide tablet. Group B: 20 samples were immersed in 250 ml of hot distilled water at 100°C with an alkaline peroxide tablet. The surface hardness of each test sample was obtained using a digital hardness testing machine recording the Rockwell hardness number before the beginning of the soaking cycles and after completion of 30 soak cycles, and the values were compared using a paired t-test. Five samples from Group A and five samples from Group B were placed side by side and photographed using a Nikon D40 digital SLR camera, and the photographs were examined visually to assess the change in color. Results: Acrylic samples immersed in hot water showed a statistically significant decrease of 5.8% in surface hardness, while those immersed in warm water showed a statistically insignificant increase of 0.67% in surface hardness. Samples from the two groups showed a clinically insignificant difference in color when compared to each other on examination of the photographs. Conclusion: Thermocycling of the acrylic resin in water baths at 40°C and 100°C showed significant changes in surface hardness. PMID:25954074
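The before/after comparison in this design is a standard paired t-test; a minimal sketch with hypothetical Rockwell hardness numbers (invented for illustration, not the study's measurements):

```python
from scipy import stats

# Hypothetical Rockwell hardness numbers: each sample measured before
# and after 30 soak cycles (paired observations on the same specimen).
before = [78.1, 77.6, 78.9, 77.2, 78.4, 77.8, 78.0, 77.5]
after = [73.9, 73.2, 74.6, 72.8, 74.1, 73.5, 73.7, 73.0]

# Paired t-test: tests whether the mean within-sample difference is zero.
t_stat, p_value = stats.ttest_rel(before, after)
print(f"t = {t_stat:.2f}, p = {p_value:.2e}")
```

With consistent within-pair decreases like these, the paired test is far more sensitive than an unpaired comparison because the between-sample variation cancels out.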
P.P. Wakker (Peter); D.R.M. Timmermans (Danielle); I. Machielse (Irma)
2007-01-01
This paper presents a field study into the effects of statistical information concerning risks on willingness to take insurance, with special attention being paid to the usefulness of these effects for the clients (the insured). Unlike many academic studies, we were able to use in-depth
Student and Professor Gender Effects in Introductory Business Statistics
Haley, M. Ryan; Johnson, Marianne F.; Kuennen, Eric W.
2007-01-01
Studies have yielded highly mixed results as to differences in male and female student performance in statistics courses; the role that professors play in these differences is even less clear. In this paper, we consider the impact of professor and student gender on student performance in an introductory business statistics course taught by…
Quantum-statistical kinetic equations
International Nuclear Information System (INIS)
Loss, D.; Schoeller, H.
1989-01-01
Considering a homogeneous normal quantum fluid consisting of identical interacting fermions or bosons, the authors derive an exact quantum-statistical generalized kinetic equation with a collision operator given as an explicit cluster series where exchange effects are included through renormalized Liouville operators. This new result is obtained by applying a recently developed superoperator formalism (Liouville operators, cluster expansions, symmetrized projectors, P_q-rule, etc.) to nonequilibrium systems described by a density operator ρ(t) which obeys the von Neumann equation. By means of this formalism a factorization theorem is proven (being essential for obtaining closed equations), and partial resummations (leading to renormalized quantities) are performed. As an illustrative application, the quantum-statistical versions (including exchange effects due to Fermi-Dirac or Bose-Einstein statistics) of the homogeneous Boltzmann (binary collisions) and Choh-Uhlenbeck (triple collisions) equations are derived
Tax Shift by Economic Functions and Its Effect on Economic Growth in the European Union
Directory of Open Access Journals (Sweden)
Irena Szarowská
2015-01-01
The aim of the paper is to examine the effects of a tax shift on economic growth and provide direct empirical evidence from the European Union (EU). The Eurostat definition is used to categorize the tax burden by economic functions, and the implicit tax rates of consumption, labour and capital are investigated. First, the paper summarizes the main developments of the tax shift in the EU as a whole up to 2014; the subsequent empirical analysis is based on annual panel data of 22 EU Member States over the years 1995-2012 (the time span is divided into a pre-crisis and a post-crisis period). Explanatory variables are not examined in individual regressions; instead, the study uses the Generalized Method of Moments applied to dynamic panel data, with estimations based on the Arellano-Bond (1991) estimator. Results confirm a positive and statistically significant impact of consumption taxes and a weaker but negative effect of labour taxation on economic growth. In the post-crisis period, the findings report rising labour taxes as the strongest and the only significant variable, suggesting that the harmful effect of labour taxation is amplified in times of unfavorable economic conditions. A tax shift toward capital taxation has a negative but often statistically insignificant impact on economic growth.
A statistical manual for chemists
Bauer, Edward
1971-01-01
A Statistical Manual for Chemists, Second Edition presents simple and fast statistical tools for data analysis of working chemists. This edition is organized into nine chapters and begins with an overview of the fundamental principles of the statistical techniques used in experimental data analysis. The subsequent chapters deal with the concept of statistical average, experimental design, and analysis of variance. The discussion then shifts to control charts, with particular emphasis on variable charts that are more useful to chemists and chemical engineers. A chapter focuses on the effect
The effectiveness of policy interventions in CEE countries
Directory of Open Access Journals (Sweden)
Alin-Marius ANDRIEȘ
2016-06-01
This paper assesses the effectiveness of intervention measures adopted by central authorities during 2005-2012 in CEE. We investigate their impact on bank stability in 15 CEE countries using bank-level data and the OLS estimation method. Bank stability is proxied by the natural logarithm of the Z-score and the ratio of non-performing loans to gross loans. Empirical findings suggest that interest rate cuts, as well as domestic and foreign liquidity injections, have a significant impact on bank stability in Emerging Europe. Moreover, their effectiveness differs according to several bank characteristics. Policy measures adopted by CEE countries significantly reduced the stability of domestic banks but increased the stability of banks with a lower level of capitalization. The impact on the Z-score of banking system liquidity measures and policy interest rate cuts is significantly lower in the case of domestic banks and amplified for less-capitalized banks (except for the category regarding banks' solvency), while their impact on large banks remains statistically insignificant.
Elementary statistics for effective library and information service management
Egghe, Leo
2001-01-01
This title describes how best to use statistical data to produce professional reports on library activities. The authors cover data gathering, sampling, graphical representation of data and summary statistics from data, and also include a section on trend analysis. A full bibliography and a subject index make this a key title for any information professional.
Kiekkas, Panagiotis; Panagiotarou, Aliki; Malja, Alvaro; Tahirai, Daniela; Zykai, Rountina; Bakalis, Nick; Stefanopoulos, Nikolaos
2015-12-01
Although statistical knowledge and skills are necessary for promoting evidence-based practice, health sciences students have expressed anxiety about statistics courses, which may hinder their learning of statistical concepts. To evaluate the effects of a biostatistics course on nursing students' attitudes toward statistics and to explore the association between these attitudes and their performance in the course examination. One-group quasi-experimental pre-test/post-test design. Undergraduate nursing students of the fifth or higher semester of studies, who attended a biostatistics course. Participants were asked to complete the pre-test and post-test forms of The Survey of Attitudes Toward Statistics (SATS)-36 scale at the beginning and end of the course respectively. Pre-test and post-test scale scores were compared, while correlations between post-test scores and participants' examination performance were estimated. Among 156 participants, post-test scores of the overall SATS-36 scale and of the Affect, Cognitive Competence, Interest and Effort components were significantly higher than pre-test ones, indicating that the course was followed by more positive attitudes toward statistics. Among 104 students who participated in the examination, higher post-test scores of the overall SATS-36 scale and of the Affect, Difficulty, Interest and Effort components were significantly but weakly correlated with higher examination performance. Students' attitudes toward statistics can be improved through appropriate biostatistics courses, while positive attitudes contribute to higher course achievements and possibly to improved statistical skills in later professional life. Copyright © 2015 Elsevier Ltd. All rights reserved.
2009-01-01
In high-dimensional studies such as genome-wide association studies, the correction for multiple testing needed to control the total type I error rate results in decreased power to detect modest effects. We present a new analytical approach based on the higher criticism statistic that allows identification of the presence of modest effects. We apply our method to the genome-wide study of rheumatoid arthritis provided in the Genetic Analysis Workshop 16 Problem 1 data set. There is evidence for unknown bias in this study that could be explained by the presence of undetected modest effects. We compared the asymptotic and empirical thresholds for the higher criticism statistic. Using the asymptotic threshold we detected the presence of modest effects genome-wide. We also detected modest effects using the 90th percentile of the empirical null distribution as a threshold; however, there was no such evidence when the 95th and 99th percentiles were used. While the higher criticism method suggests that there is some evidence for modest effects, interpreting individual single-nucleotide polymorphisms with significant higher criticism statistics is of uncertain value. The goal of higher criticism is to alert the researcher that genetic effects remain to be discovered and to promote the use of more targeted and powerful studies to detect the remaining effects. PMID:20018032
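A common form of the higher criticism statistic (the Donoho-Jin variant; the paper's exact thresholding details may differ) can be computed directly from a set of p-values. The sketch below uses simulated p-values standing in for the GAW16 results:

```python
import numpy as np

def higher_criticism(pvals, alpha0=0.5):
    # Donoho-Jin higher criticism: standardized gap between the empirical
    # and uniform CDFs of the p-values, maximized over the smallest
    # alpha0 fraction of the ordered p-values.
    p = np.sort(np.asarray(pvals, dtype=float))
    n = len(p)
    i = np.arange(1, n + 1)
    hc = np.sqrt(n) * (i / n - p) / np.sqrt(p * (1 - p))
    return hc[: max(1, int(alpha0 * n))].max()

rng = np.random.default_rng(1)
null_p = rng.uniform(size=10_000)                  # global null: uniform p-values
mixed_p = np.concatenate([rng.beta(0.1, 1, 200),   # 2% sparse, modest effects
                          rng.uniform(size=9_800)])

print(f"HC under null: {higher_criticism(null_p):.2f}")
print(f"HC with modest effects: {higher_criticism(mixed_p):.2f}")
```

A large HC value signals that some non-null effects exist somewhere in the collection, without identifying which individual tests carry them - exactly the "alert the researcher" role described above.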
Renal thorium and uranium excretion in non-exposed subjects: influence of age and gender
International Nuclear Information System (INIS)
Werner, E.; Roth, P.; Wendler, I.; Schramel, P.
1998-01-01
The excretion of ²³⁸U and ²³²Th was investigated by ICP-MS in a group of 30 males (mean age 41 ± 18 years, range 7 to 73 years) and 33 females (43 ± 21 years, 11 to 84 years). For the thorium excretion, the geometric mean is 34 μBq/day (SD 1.90) for the whole group, 40 μBq/day (SD 2.01) for the males and 30 μBq/day (SD 1.78) for the females. The difference between the males and females is statistically insignificant. A certain increase in the excretion was observed with increasing age, but the correlation coefficient of the linear relationship is statistically insignificant. For uranium, the geometric means (SD) for the whole group, the male subgroup, and the female subgroup, in μBq/day, are 237 (2.50), 287 (2.09), and 200 (2.81). The difference between the two subgroups is statistically insignificant. The excretion increases slightly with age; the correlation coefficient of the linear relationship is statistically significant. The day-to-day fluctuations in the excretion of both Th and U are considerable. (P.A.)
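The geometric mean and geometric SD reported above are computed on the log scale; a small sketch with made-up daily excretion values (illustrative only, not the study's measurements):

```python
import numpy as np

def geo_stats(x):
    # Geometric mean and geometric standard deviation: the usual summary
    # statistics for right-skewed, log-normal-like excretion data.
    logs = np.log(np.asarray(x, dtype=float))
    return float(np.exp(logs.mean())), float(np.exp(logs.std(ddof=1)))

# Hypothetical daily Th excretion values in microBq/day (invented data).
values = [18, 25, 31, 34, 40, 52, 61]
gm, gsd = geo_stats(values)
print(f"geometric mean = {gm:.1f} uBq/day, GSD = {gsd:.2f}")
```

Note that the GSD is dimensionless (a multiplicative factor), which is why the SD values quoted above, such as 1.90, carry no μBq/day unit.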
Ibáñez, Sergio J.; García, Javier; Feu, Sebastian; Lorenzo, Alberto; Sampaio, Jaime
2009-01-01
The aim of the present study was to identify the game-related statistics that discriminated between basketball winning and losing teams in each of three consecutive games played in a condensed tournament format. The data were obtained from the Spanish Basketball Federation and included game-related statistics from the Under-20 league (2005-2006 and 2006-2007 seasons). A total of 223 games were analyzed with the following game-related statistics: two- and three-point field goals (made and missed), free throws (made and missed), offensive and defensive rebounds, assists, steals, turnovers, blocks (made and received), fouls committed, ball possessions and offensive rating. Results showed that winning teams in this competition had better values in all game-related statistics, with the exception of three-point field goals made, free throws missed and turnovers (p ≥ 0.05). A main effect of game number was identified only in turnovers, with a statistically significant decrease between the second and third game. No interaction was found in the analysed variables. A discriminant analysis identified the two-point field goals made, the defensive rebounds and the assists as discriminators between winning and losing teams in all three games. In addition to these, the three-point field goals made contributed to discriminating teams only in game three, suggesting a moderate effect of fatigue. Coaches may benefit from being aware of this variation in game-determinant statistics and from using offensive and defensive strategies in the third game that exploit or mask three-point field-goal performance. Key points: Overall team performances along the three consecutive games were very similar, not confirming an accumulated fatigue effect. The results from the three-point field goals in the third game suggested that winning teams were able to shoot better from longer distances, and this could be the result of exhibiting higher conditioning status and
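Discriminating winners from losers by game-related statistics is typically done with linear discriminant analysis; the toy sketch below uses simulated team statistics (the means, spreads and sample sizes are invented, not the Spanish U-20 data):

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Simulated per-game team statistics (invented numbers):
# columns = two-point field goals made, defensive rebounds, assists.
rng = np.random.default_rng(2)
winners = rng.normal(loc=[28, 24, 14], scale=3, size=(60, 3))
losers = rng.normal(loc=[22, 19, 10], scale=3, size=(60, 3))
X = np.vstack([winners, losers])
y = np.array([1] * 60 + [0] * 60)  # 1 = win, 0 = loss

# LDA finds the linear combination of statistics that best separates
# the two outcome groups; the coefficients play the role of the
# "discriminators" named in the abstract.
lda = LinearDiscriminantAnalysis().fit(X, y)
print("training accuracy:", lda.score(X, y))
print("discriminant coefficients:", lda.coef_.round(2))
```

In the published analyses, structure coefficients (correlations between each statistic and the discriminant function) are usually reported rather than raw coefficients, but the underlying model is the same.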
Understanding Statistics - Cancer Statistics
Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.
Kosevich, Yuriy A.; Savin, Alexander V.; Cantarero, Andrés
2013-01-01
We present molecular dynamics simulations of the phonon thermal conductivity of semiconductor nanoribbons that account for phonon quantum statistics. In our semiquantum molecular dynamics simulation, the dynamics of the system is described with classical Newtonian equations of motion, where the effect of phonon quantum statistics is introduced through random Langevin-like forces with a specific power spectral density (color noise). The color noise describes the interaction of the molecular system with the thermostat. The thermal transport of silicon and germanium nanoribbons with atomically smooth (perfect) and rough (porous) edges is studied. We show that rough (porous) edges and the quantum statistics of phonons drastically change the low-temperature thermal conductivity of the nanoribbon in comparison with that of a perfect nanoribbon with atomically smooth edges and classical phonon dynamics and statistics. The rough-edge phonon scattering and weak anharmonicity of the considered lattice produce a weakly pronounced maximum of the thermal conductivity of the nanoribbon at low temperature.
The effective thermal conductivity of porous media based on statistical self-similarity
International Nuclear Information System (INIS)
Kou Jianlong; Wu Fengmin; Lu Hangjun; Xu Yousheng; Song Fuquan
2009-01-01
A fractal model is presented based on the thermal-electrical analogy technique and the statistical self-similarity of fractal saturated porous media. A dimensionless effective thermal conductivity of saturated fractal porous media is studied via the relationship between the dimensionless effective thermal conductivity and the geometrical parameters of the porous media, with no empirical constant. Through this study, it is shown that the dimensionless effective thermal conductivity decreases with increasing porosity (φ) and pore area fractal dimension (D_f) when k_s/k_g > 1; the opposite trend is observed when k_s/k_g < 1. The model predictions are compared with existing experimental data and the results show that they are in good agreement with existing experimental data.
International Nuclear Information System (INIS)
Dowdy, E.J.; Hansen, G.E.; Robba, A.A.; Pratt, J.C.
1980-01-01
The complete formalism for the use of statistical neutron fluctuation measurements for the nondestructive assay of fissionable materials has been developed. This formalism includes the effect of detector deadtime, neutron multiplicity, random neutron pulse contributions from (α,n) contaminants in the sample, and the sample multiplication of both fission-related and background neutrons
Directory of Open Access Journals (Sweden)
Dr. Ayla Zehra Öncer
2013-07-01
The purpose of this study was to examine the possible effects of transactional and transformational leadership styles on entrepreneurial orientation. Transactional leadership is discussed with a two-dimensional model consisting of contingent reward and active management by exception, whereas transformational leadership is discussed with a four-dimensional model consisting of idealized influence, inspirational motivation, intellectual stimulation and individualized consideration. Entrepreneurial orientation, in turn, was examined under three dimensions: innovativeness, risk taking and proactiveness. The survey for this study was conducted on 171 employees of three multinational companies in Istanbul. The data obtained from the questionnaires were analyzed with the SPSS statistical package. The results showed that transactional leadership affects only the proactiveness dimension, while transformational leadership affects all three dimensions of entrepreneurial orientation. The only insignificant relationship between transformational leadership and entrepreneurial orientation is that between individualized consideration and risk taking.
McKinley, Christopher J; Limbu, Yam; Jayachandran, C N
2017-04-01
In two separate investigations, we examined the persuasive effectiveness of statistical versus exemplar appeals on Indian adults' smoking cessation and mammography screening intentions. To more comprehensively address persuasion processes, we explored whether message response and perceived message effectiveness functioned as antecedents to persuasive effects. Results showed that statistical appeals led to higher levels of health intentions than exemplar appeals. In addition, findings from both studies indicated that statistical appeals stimulated more attention and were perceived as more effective than anecdotal accounts. Among male smokers, statistical appeals also generated greater cognitive processing than exemplar appeals. Subsequent mediation analyses revealed that message response and perceived message effectiveness fully carried the influence of appeal format on health intentions. Given these findings, future public health initiatives conducted among similar populations should design messages that include substantive factual information while ensuring that this content is perceived as credible and valuable.
Side effects of being blue: influence of sad mood on visual statistical learning.
Directory of Open Access Journals (Sweden)
Julie Bertels
It is well established that mood influences many cognitive processes, such as learning and executive functions. Although statistical learning, like mood, is assumed to be part of our daily life, the influence of mood on statistical learning has never been investigated before. In the present study, a sad vs. neutral mood was induced in the participants through the listening of stories while they were exposed to a stream of visual shapes made up of the repeated presentation of four triplets, namely sequences of three shapes presented in a fixed order. Given that the inter-stimulus interval was held constant within and between triplets, the only cues available for triplet segmentation were the transitional probabilities between shapes. Direct and indirect measures of learning taken either immediately or 20 minutes after the exposure/mood induction phase revealed that participants learned the statistical regularities between shapes. Interestingly, although participants from the sad and neutral groups performed similarly in these tasks, subjective measures (confidence judgments) taken after each trial revealed that participants who experienced the sad mood induction showed increased conscious access to their statistical knowledge. These effects were not modulated by the time delay between the exposure/mood induction and the test phases. These results are discussed within the scope of the robustness principle and the influence of negative affect on processing style.
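Transitional probabilities of the kind that support triplet segmentation can be estimated directly from the shape stream; a sketch with hypothetical shape labels (the actual stimuli are not specified here):

```python
import random
from collections import Counter

# Four fixed triplets of shapes (hypothetical labels A..L); the stream
# is a random concatenation of triplets, so within-triplet transitions
# are certain (P = 1) while between-triplet transitions are not.
random.seed(0)
triplets = [("A", "B", "C"), ("D", "E", "F"), ("G", "H", "I"), ("J", "K", "L")]
stream = [shape for _ in range(200) for shape in random.choice(triplets)]

# P(next | current) estimated from adjacent pairs in the stream.
pair_counts = Counter(zip(stream, stream[1:]))
first_counts = Counter(stream[:-1])
tp = {pair: n / first_counts[pair[0]] for pair, n in pair_counts.items()}

print(tp[("A", "B")])  # within-triplet transition: prints 1.0
```

The drop in transitional probability at triplet boundaries (here from 1.0 within triplets to about 0.25 between them) is the only segmentation cue available when timing is held constant, as in the study above.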
Effects of acoustic levitation on the development of zebrafish, Danio rerio, embryos.
Sundvik, Maria; Nieminen, Heikki J; Salmi, Ari; Panula, Pertti; Hæggström, Edward
2015-09-04
Acoustic levitation provides potential to characterize and manipulate material such as solid particles and fluid in a wall-less environment. While attempts to levitate small animals have been made, the biological effects of such levitation have been scarcely documented. Here, our goal was to explore whether zebrafish embryos can be levitated (peak pressures at the pressure node and anti-node: 135 dB and 144 dB, respectively) with no effects on early development. We levitated the embryos (n = 94) at 2-14 hours post fertilization (hpf) for 1000 (n = 47) or 2000 seconds (n = 47). We compared the size and number of trunk neuromasts and otoliths in sonicated samples to controls (n = 94) and found no statistically significant differences (p > 0.05). While the mortality rate was lower in the control group (22.3%) than in the 1000 s (34.0%) and 2000 s (42.6%) levitation groups, the differences were statistically insignificant (p > 0.05). The results suggest that acoustic levitation for less than 2000 seconds does not interfere with the development of zebrafish embryos but may affect the mortality rate. Acoustic levitation could potentially be used as a non-contacting, wall-less platform for characterizing and manipulating vertebrate embryos without causing major adverse effects to their development.
Statistical data filtration in neutron coincidence counting
International Nuclear Information System (INIS)
Beddingfield, D.H.; Menlove, H.O.
1992-11-01
We assessed the effectiveness of statistical data filtration to minimize the contribution of matrix materials in 200-L drums to the nondestructive assay of plutonium. These matrices were examined: polyethylene, concrete, aluminum, iron, cadmium, and lead. Statistical filtration of neutron coincidence data improved the low-end sensitivity of coincidence counters. Spurious data arising from electrical noise, matrix spallation, and geometric effects were smoothed in a predictable fashion by the statistical filter. The filter effectively lowers the minimum detectable mass limit that can be achieved for plutonium assay using passive neutron coincidence counting
Effect of Postural Change on Plasma Insulin Concentration in Normal Volunteer
Energy Technology Data Exchange (ETDEWEB)
Sung, Ho Kyung; Koh, Joo Whan; Joo, Jong Koo; Kim, Jin Yong; Lee, Jang Kyu [Korea Atomic Research Institute, Seoul (Korea, Republic of)
1974-03-15
The concentrations of some blood constituents are known to be influenced by postural change. Blood glucose and insulin concentrations were measured, first in the supine position and then (30 minutes later) in the erect position, under fasting conditions. The effects of a diuretic, furosemide, were also studied under the same conditions for 5 consecutive days. The subjects were 5 healthy volunteers aged 20-29 years without any diabetic past or family history. Blood glucose was measured by Nelson's method, and plasma insulin by radioimmunoassay. The following are the results: 1) The plasma insulin concentration in the erect position is slightly higher than in the supine position; however, the increase is statistically insignificant because of the notable individual variation in the values for the supine position. 2) Four out of 5 cases show an increase of about 80% in plasma insulin in the erect position, which is statistically significant if analyzed on the basis of the frequency distribution. 3) The blood glucose concentration showed no postural changes. 4) The increase of the plasma insulin concentration in the erect position seems to be the result of limited extravasation of insulin in the lower extremities.
National Statistical Commission and Indian Official Statistics*
Indian Academy of Sciences (India)
IAS Admin
a good collection of official statistics of that time. With more .... statistical agencies and institutions to provide details of statistical activities .... ing several training programmes. .... ful completion of Indian Statistical Service examinations, the.
Directory of Open Access Journals (Sweden)
Prashanth Shetty
2016-03-01
Fasting is one of the fundamental treatments of naturopathy. The use of lemon and honey for various medicinal purposes has been documented since ancient times, but there is a lack of evidence on the short-term effects of lemon honey juice fasting (LHJF). Hence, we aim at evaluating the short-term effect of LHJF on lipid profile and body composition in healthy individuals. A total of 50 healthy subjects were recruited, and they received 300 ml of LHJ, 4 times a day, for four successive days of fasting. Assessments were performed before and after the intervention. Statistical analysis was performed with Student's paired t-test using the Statistical Package for the Social Sciences (SPSS), version 16. Our study showed a significant reduction in weight, body mass index (BMI), fat mass (FM), fat-free mass (FFM), and total serum triglycerides (TSTG), with an insignificant reduction in fat percentage and total serum cholesterol compared to baseline. Within-group analysis of females showed similar results, unlike males. Our results suggest that LHJF may be useful for reduction of body weight, BMI, FM, FFM, and TSTG in healthy individuals, which might be useful for the prevention of obesity and hypertriglyceridemia.
Official Statistics and Statistics Education: Bridging the Gap
Directory of Open Access Journals (Sweden)
Gal Iddo
2017-03-01
This article aims to challenge official statistics providers and statistics educators to ponder how to help non-specialist adult users of statistics develop those aspects of statistical literacy that pertain to official statistics. We first document the gap in the literature in terms of the conceptual basis and educational materials needed for such an undertaking. We then review skills and competencies that may help adults to make sense of statistical information in areas of importance to society. Based on this review, we identify six elements related to official statistics about which non-specialist adult users should possess knowledge in order to be considered literate in official statistics: (1) the system of official statistics and its work principles; (2) the nature of statistics about society; (3) indicators; (4) statistical techniques and big ideas; (5) research methods and data sources; and (6) awareness and skills for citizens' access to statistical reports. Based on this ad hoc typology, we discuss directions that official statistics providers, in cooperation with statistics educators, could take in order to (1) advance the conceptualization of skills needed to understand official statistics, and (2) expand educational activities and services, specifically by developing a collaborative digital textbook and a modular online course, to improve public capacity for understanding official statistics.
Novak, Elena
2012-01-01
The study explored instructional benefits of a storyline gaming characteristic (GC) on learning effectiveness, efficiency, and engagement with the use of an online instructional simulation for graduate students in an introductory statistics course. In addition, the study focused on examining the effects of a storyline GC on specific learning…
Quantifying scenarios to check statistical procedures
International Nuclear Information System (INIS)
Beetle, T.M.
1976-01-01
Ways of diverting nuclear material are presented in a form that reflects the effects of the diversions on a select set of statistical accounting procedures. Twelve statistics are examined for changes in mean values under sixty diversion scenarios. Several questions about the statistics are answered using a table of quantification results. Findings include a smallest, proper subset of the set of statistics which has one or more changed mean values under each of the diversion scenarios
Koparan, Timur; Güven, Bülent
2015-07-01
The aim of this study is to determine the effect of a project-based learning approach on 8th-grade secondary-school students' statistical literacy levels for data representation. To achieve this goal, a test consisting of 12 open-ended questions was developed in accordance with the views of experts. Seventy 8th-grade secondary-school students, 35 in the experimental group and 35 in the control group, took this test twice, once before the application and once after. All raw scores were converted into linear scores using the Winsteps 3.72 modelling program, which performs Rasch analysis, and t-tests and an ANCOVA analysis were carried out with the linear scores. Based on the findings, it was concluded that the project-based learning approach increases students' level of statistical literacy for data representation. Students' levels of statistical literacy before and after the application were shown through the obtained person-item maps.
The effect of experimental streptococcus infection in myocarditis on ...
African Journals Online (AJOL)
EB
faecalis induction of myocarditis and its effect on some blood parameters, inflammatory markers and .... detectable in serum, brain and intestine of rat pups.
Perception in statistical graphics
VanderPlas, Susan Ruth
There has been quite a bit of research on statistical graphics and visualization, generally focused on new types of graphics, new software to create graphics, interactivity, and usability studies. Our ability to interpret and use statistical graphics hinges on the interface between the graph itself and the brain that perceives and interprets it, and there is substantially less research on the interplay between graph, eye, brain, and mind than is sufficient to understand the nature of these relationships. The goal of the work presented here is to further explore the interplay between a static graph, the translation of that graph from paper to mental representation (the journey from eye to brain), and the mental processes that operate on that graph once it is transferred into memory (mind). Understanding the perception of statistical graphics should allow researchers to create more effective graphs which produce fewer distortions and viewer errors while reducing the cognitive load necessary to understand the information presented in the graph. Taken together, these experiments should lay a foundation for exploring the perception of statistical graphics. There has been considerable research into the accuracy of numerical judgments viewers make from graphs, and these studies are useful, but it is more effective to understand how errors in these judgments occur so that the root cause of the error can be addressed directly. Understanding how visual reasoning relates to the ability to make judgments from graphs allows us to tailor graphics to particular target audiences. In addition, understanding the hierarchy of salient features in statistical graphics allows us to clearly communicate the important message from data or statistical models by constructing graphics which are designed specifically for the perceptual system.
In vitro potential cytogenetic and oxidative stress effects of roxithromycin.
Arslan, Mehmet; Timocin, Taygun; Ila, Hasan B
2017-10-01
Macrolide antibiotic roxithromycin was evaluated in terms of its genotoxic, cytotoxic and oxidative stress effects. For this purpose, roxithromycin was dissolved in dimethyl sulfoxide and applied to human peripheral blood lymphocytes at concentrations of 25, 50, 100 and 200 μg/mL for two different treatment periods (24 and 48 h). In the chromosome aberration (CA) and micronucleus (MN) tests, roxithromycin did not show a genotoxic effect. However, it induced sister chromatid exchange (SCE) at the highest concentration (200 μg/mL) for the 24-h treatment period and at all concentrations (except 25 μg/mL) for the 48-h treatment period. Regarding the cytotoxic effect of roxithromycin, statistically insignificant decreases in the mitotic index and proliferation index were observed. Roxithromycin decreased the nuclear division index (NDI) at the two highest concentrations (100 and 200 μg/mL) for the 24-h treatment period and at all concentrations (except 25 μg/mL) for the 48-h treatment period. Total oxidant values, total antioxidant values and the oxidative stress index did not change with roxithromycin treatment. Overall, roxithromycin showed neither genotoxic nor oxidative stress effects in cultured human lymphocytes.
USING STATISTICAL SURVEY IN ECONOMICS
Directory of Open Access Journals (Sweden)
Delia TESELIOS
2012-01-01
Statistical survey is an effective method of statistical investigation that involves gathering quantitative data. It is often preferred in statistical reports because observing only a part of a population yields information about the entire population studied. Because of the information they provide, surveys are used in many research areas. In economics, statistics are used in decision making, in choosing competitive strategies, in the analysis of certain economic phenomena, and in the formulation of forecasts. The economic study presented in this paper illustrates how simple random sampling is used to analyze the existing parking-space situation in a given locality.
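As an illustration of the kind of survey described above, a simple random sample can estimate an occupancy proportion together with a margin of error. The population size, sample size, and occupancy rate below are hypothetical, not taken from the paper:

```python
import random

random.seed(42)

# Hypothetical population of 1,000 parking spaces: 1 = occupied, 0 = free
population = [1] * 640 + [0] * 360  # true occupancy rate = 0.64

# Simple random sample without replacement, n = 100
sample = random.sample(population, 100)

p_hat = sum(sample) / len(sample)                # sample proportion
se = (p_hat * (1 - p_hat) / len(sample)) ** 0.5  # standard error
ci = (p_hat - 1.96 * se, p_hat + 1.96 * se)      # ~95% confidence interval
```

The point of the sketch is that observing 10% of the spaces already bounds the city-wide occupancy rate within a few percentage points.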
A Statistical Programme Assignment Model
DEFF Research Database (Denmark)
Rosholm, Michael; Staghøj, Jonas; Svarer, Michael
When treatment effects of active labour market programmes are heterogeneous in an observable way across the population, the allocation of the unemployed into different programmes becomes a particularly important issue. In this paper, we present a statistical model designed to improve the present...... duration of unemployment spells may result if a statistical programme assignment model is introduced. We discuss several issues regarding the implementation of such a system, especially the interplay between the statistical model and case workers....
Effects of atomoxetine on heart rhythm in children and adolescents.
Tanidir, Ibrahim Cansaran; Tanidir, Canan; Ozturk, Erkut; Bahali, Kayhan; Gunes, Hatice; Ergul, Yakup; Uneri, Ozden Sukran; Akdeniz, Celal; Tuzcu, Volkan
2015-12-01
The aim of this study was to examine the effects of atomoxetine on heart rhythm using 12-lead electrocardiography (ECG) and 24 h Holter monitoring. Children and adolescents who were diagnosed with attention deficit-hyperactivity disorder according to DSM-IV-TR were referred to a pediatric cardiology clinic for cardiologic examination before and after 4 or 5 weeks of atomoxetine treatment. Cardiac examination, complete blood count, biochemistry, thyroid function tests, 12-lead ECG and 24 h Holter monitoring were performed routinely in all patients. Each subject underwent 24 h Holter ECG monitoring before atomoxetine was started and after 4 or 5 weeks of effective-dose atomoxetine treatment. Forty-one patients were included in this prospective study. No statistically significant change was found in QT, QTc or QT interval dispersion or blood pressure before and after 4 or 5 weeks of atomoxetine treatment. There was a statistically significant increase in heart rate (both during the day and at night) and QRS duration, and a statistically significant decrease in P wave dispersion. Three patients had rhythm disturbances; all three were asymptomatic, and none of these arrhythmias reached clinical significance. Atomoxetine did not cause significant changes in ECG or Holter variables. In two patients who had undiagnosed subclinical extrasystoles, extra beats increased after the 4th week of treatment but still remained clinically insignificant. Before and after atomoxetine treatment, listening to the heart sounds for a longer time may help clinicians notice an extra beat; if one is identified, 24 h Holter monitoring is recommended. © 2015 Japan Pediatric Society.
Combustion analysis of preheated crude sunflower oil in an IDI diesel engine
Energy Technology Data Exchange (ETDEWEB)
Canakci, Mustafa; Ozsezen, Ahmet Necati; Turkcan, Ali [Department of Mechanical Education, Kocaeli University, 41380 Izmit (Turkey); Alternative Fuels R and D Center, Kocaeli University, 41040 Izmit (Turkey)
2009-05-15
In this study, preheated crude sunflower oil (PCSO) was tested for combustion and emission properties against petroleum-based diesel fuel (PBDF) in a naturally aspirated, indirect injection (IDI) engine. The cylinder gas pressure and heat release curves for PCSO at 75 C were similar to those of PBDF. The ignition delays for the PCSO were longer and the start of injection timing was earlier than for PBDF. The difference in the average brake torque was a decrease of 1.36% for PCSO, though this was statistically insignificant. The brake specific fuel consumption increased by almost 5%, roughly in proportion to the difference in calorific value, so that the 1.06% increase in thermal efficiency was again statistically insignificant. The emission test results showed that the decreases in CO2 emissions and smoke opacity were 2.05% and 4.66%, respectively; however, these were not statistically significant, though in line with the apparent increase in thermal efficiency. There was a significant 34% improvement in the emissions of unburnt hydrocarbons. Carbon monoxide increased by 1.77%; again the result was not statistically significant given the small number of repeat tests. The use of PCSO does not have any negative effects on engine performance or emissions in short-duration engine testing. (author)
Multidimensionality of Longitudinal Data: Unlocking the Age-Happiness Puzzle
Ning Li
2014-01-01
In social and economic analysis of longitudinal data, the socio-economic variables that are statistically significant in pooled data regressions sometimes become insignificant after individual fixed effects are controlled for. This phenomenon has been observed in the analysis of the relationship between age and happiness. The discrepancy in results between regressions with and without controlling for individual fixed effects is sometimes known as a mystery in the research of age and happiness...
The Euclid Statistical Matrix Tool
Directory of Open Access Journals (Sweden)
Curtis Tilves
2017-06-01
Stataphobia, a term used to describe the fear of statistics and research methods, can result from a lack of proper training in statistical methods. Poor statistical methods training can affect health policy decision making and may play a role in the low research productivity seen in developing countries. One way to reduce Stataphobia is to intervene in the teaching of statistics in the classroom; however, such an intervention must tackle several obstacles, including student interest in the material, multiple ways of learning materials, and language barriers. We present here the Euclid Statistical Matrix, a tool for combatting Stataphobia on a global scale. This free tool comprises popular statistical YouTube channels and web sources that teach and demonstrate statistical concepts in a variety of presentation methods. Working with international teams in Iran, Japan, Egypt, Russia, and the United States, we have also developed the Statistical Matrix in multiple languages to address language barriers to learning statistics. By utilizing already-established large networks, we are able to disseminate our tool to thousands of Farsi-speaking university faculty and students in Iran and the United States. Future dissemination of the Euclid Statistical Matrix throughout Central Asia, with support from local universities, may help to combat low research productivity in this region.
Statistics for X-chromosome associations.
Özbek, Umut; Lin, Hui-Min; Lin, Yan; Weeks, Daniel E; Chen, Wei; Shaffer, John R; Purcell, Shaun M; Feingold, Eleanor
2018-06-13
In a genome-wide association study (GWAS), association between genotype and phenotype at autosomal loci is generally tested by regression models. However, X-chromosome data are often excluded from published analyses of autosomes because of the difference between males and females in number of X chromosomes. Failure to analyze X-chromosome data at all is obviously less than ideal, and can lead to missed discoveries. Even when X-chromosome data are included, they are often analyzed with suboptimal statistics. Several mathematically sensible statistics for X-chromosome association have been proposed. The optimality of these statistics, however, is based on very specific simple genetic models. In addition, while previous simulation studies of these statistics have been informative, they have focused on single-marker tests and have not considered the types of error that occur even under the null hypothesis when the entire X chromosome is scanned. In this study, we comprehensively tested several X-chromosome association statistics using simulation studies that include the entire chromosome. We also considered a wide range of trait models for sex differences and phenotypic effects of X inactivation. We found that models that do not incorporate a sex effect can have large type I error in some cases. We also found that many of the best statistics perform well even when there are modest deviations, such as trait variance differences between the sexes or small sex differences in allele frequencies, from assumptions. © 2018 WILEY PERIODICALS, INC.
Statistical Investigation of the User Effects on Mobile Terminal Antennas for 5G Applications
DEFF Research Database (Denmark)
Syrytsin, Igor A.; Zhang, Shuai; Pedersen, Gert F.
2017-01-01
In this paper the user effects on mobile terminal antennas at 28 GHz are statistically investigated with the parameters of body loss, coverage efficiency and power in the shadow. The data are obtained from measurements of 12 users in data and talk modes, with the antenna placed on the top...
Zuhurudeen, Fathima Manaar; Huang, Yi Ting
2016-03-01
Empirical evidence for statistical learning comes from artificial language tasks, but it is unclear how these effects scale up outside of the lab. The current study turns to a real-world test case of statistical learning where native English speakers encounter the syntactic regularities of Arabic through memorization of the Qur'an. This unique input provides extended exposure to the complexity of a natural language, with minimal semantic cues. Memorizers were asked to distinguish unfamiliar nouns and verbs based on their co-occurrence with familiar pronouns in an Arabic language sample. Their performance was compared to that of classroom learners who had explicit knowledge of pronoun meanings and grammatical functions. Grammatical judgments were more accurate in memorizers compared to non-memorizers. No effects of classroom experience were found. These results demonstrate that real-world exposure to the statistical properties of a natural language facilitates the acquisition of grammatical categories. Copyright © 2015 Elsevier B.V. All rights reserved.
Effects of Isospin mixing on statistical properties of 26Al and 30P
International Nuclear Information System (INIS)
Shriner, J.F. Jr.; Blackston, M.A.; Mahar, K.T.; Grossmann, C.A.; Mitchell, G.E.
2000-01-01
Odd-odd nuclides in the sd-shell have states of different isospin coexisting even near the ground state. Because isospin is not a perfect symmetry, this coexistence provides an opportunity to examine directly the effects of a broken symmetry on the statistical properties of a quantum system. We present results for the nuclides 26 Al and 30 P, for which the level schemes are relatively complete
An Update on Statistical Boosting in Biomedicine.
Mayr, Andreas; Hofner, Benjamin; Waldmann, Elisabeth; Hepp, Tobias; Meyer, Sebastian; Gefeller, Olaf
2017-01-01
Statistical boosting algorithms have triggered a lot of research during the last decade. They combine a powerful machine learning approach with classical statistical modelling, offering various practical advantages like automated variable selection and implicit regularization of effect estimates. They are extremely flexible, as the underlying base-learners (regression functions defining the type of effect for the explanatory variables) can be combined with any kind of loss function (target function to be optimized, defining the type of regression setting). In this review article, we highlight the most recent methodological developments on statistical boosting regarding variable selection, functional regression, and advanced time-to-event modelling. Additionally, we provide a short overview on relevant applications of statistical boosting in biomedicine.
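A minimal sketch of the componentwise L2 boosting idea summarized above: each iteration fits every single-variable least-squares base-learner to the current residuals and takes a small step only on the best one, which is what yields the implicit variable selection the review describes. The function, data, and step size are illustrative assumptions, not the reference implementation:

```python
def componentwise_l2_boost(X, y, steps=100, nu=0.1):
    """Componentwise L2 boosting sketch: at each step, fit every
    single-column least-squares base-learner to the residuals and
    update only the one with the smallest residual sum of squares."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(steps):
        resid = [y[i] - sum(beta[j] * X[i][j] for j in range(p)) for i in range(n)]
        best_j, best_b, best_rss = 0, 0.0, float("inf")
        for j in range(p):
            sxx = sum(X[i][j] ** 2 for i in range(n))
            if sxx == 0:
                continue  # skip degenerate all-zero columns
            b = sum(X[i][j] * resid[i] for i in range(n)) / sxx
            rss = sum((resid[i] - b * X[i][j]) ** 2 for i in range(n))
            if rss < best_rss:
                best_j, best_b, best_rss = j, b, rss
        beta[best_j] += nu * best_b  # shrunken update = implicit regularization
    return beta
```

On data where only the first column is informative, the boosted coefficient vector concentrates on that column while the uninformative one stays at zero, illustrating the automated variable selection mentioned in the abstract.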
International Nuclear Information System (INIS)
Lim, Gyeong Hui
2008-03-01
This book consists of 15 chapters, which are basic conception and meaning of statistical thermodynamics, Maxwell-Boltzmann's statistics, ensemble, thermodynamics function and fluctuation, statistical dynamics with independent particle system, ideal molecular system, chemical equilibrium and chemical reaction rate in ideal gas mixture, classical statistical thermodynamics, ideal lattice model, lattice statistics and nonideal lattice model, imperfect gas theory on liquid, theory on solution, statistical thermodynamics of interface, statistical thermodynamics of a high molecule system and quantum statistics
A Study of the Effectiveness of Web-Based Homework in Teaching Undergraduate Business Statistics
Palocsay, Susan W.; Stevens, Scott P.
2008-01-01
Web-based homework (WBH) technology can simplify the creation and grading of assignments and provide a feasible platform for assessment testing, but its effect on student learning in business statistics is unknown. This is particularly true of the latest software development of Web-based tutoring agents that dynamically evaluate individual…
Pliske, Rebecca M.; Caldwell, Tracy L.; Calin-Jageman, Robert J.; Taylor-Ritzler, Tina
2015-01-01
We developed a two-semester series of intensive (six-contact hours per week) behavioral research methods courses with an integrated statistics curriculum. Our approach includes the use of team-based learning, authentic projects, and Excel and SPSS. We assessed the effectiveness of our approach by examining our students' content area scores on the…
Permutation statistical methods an integrated approach
Berry, Kenneth J; Johnston, Janis E
2016-01-01
This research monograph provides a synthesis of a number of statistical tests and measures, which, at first consideration, appear disjoint and unrelated. Numerous comparisons of permutation and classical statistical methods are presented, and the two methods are compared via probability values and, where appropriate, measures of effect size. Permutation statistical methods, compared to classical statistical methods, do not rely on theoretical distributions, avoid the usual assumptions of normality and homogeneity of variance, and depend only on the data at hand. This text takes a unique approach to explaining statistics by integrating a large variety of statistical methods, and establishing the rigor of a topic that to many may seem to be a nascent field in statistics. This topic is new in that it took modern computing power to make permutation methods available to people working in the mainstream of research. This research monograph addresses a statistically-informed audience, and can also easily serve as a ...
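To make the comparison with classical methods concrete, here is a hedged sketch of an exact two-sample permutation test on the difference of means, the permutation counterpart of the two-sample t-test; it assumes only exchangeability of observations, not normality or equal variances:

```python
import itertools
import statistics

def perm_test_mean_diff(x, y):
    """Exact two-sample permutation test on the difference of means.

    Enumerates every split of the pooled data into groups of the original
    sizes and returns the fraction of splits whose absolute mean difference
    is at least as extreme as the observed one (two-sided p-value).
    """
    observed = statistics.mean(x) - statistics.mean(y)
    pooled = list(x) + list(y)
    n, count, total = len(x), 0, 0
    for idx in itertools.combinations(range(len(pooled)), n):
        chosen = set(idx)
        gx = [pooled[i] for i in chosen]
        gy = [pooled[i] for i in range(len(pooled)) if i not in chosen]
        diff = statistics.mean(gx) - statistics.mean(gy)
        if abs(diff) >= abs(observed) - 1e-12:  # tolerance for float ties
            count += 1
        total += 1
    return count / total
```

Note that the p-value depends only on the data at hand: for two groups of three with completely separated values, only the observed split and its mirror are as extreme, giving p = 2/20 = 0.1, the smallest two-sided p-value attainable at these sample sizes.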
Effect of the image resolution on the statistical descriptors of heterogeneous media
Ledesma-Alonso, René; Barbosa, Romeli; Ortegón, Jaime
2018-02-01
The characterization and reconstruction of heterogeneous materials, such as porous media and electrode materials, involve the application of image processing methods to data acquired by scanning electron microscopy or other microscopy techniques. Among them, binarization and decimation are critical in order to compute the correlation functions that characterize the microstructure of the above-mentioned materials. In this study, we present a theoretical analysis of the effects of the image-size reduction, due to the progressive and sequential decimation of the original image. Three different decimation procedures (random, bilinear, and bicubic) were implemented and their consequences on the discrete correlation functions (two-point, line-path, and pore-size distribution) and the coarseness (derived from the local volume fraction) are reported and analyzed. The chosen statistical descriptors (correlation functions and coarseness) are typically employed to characterize and reconstruct heterogeneous materials. A normalization for each of the correlation functions has been performed. When the loss of statistical information has not been significant for a decimated image, its normalized correlation function is forecast by the trend of the original image (reference function). In contrast, when the decimated image does not hold statistical evidence of the original one, the normalized correlation function diverts from the reference function. Moreover, the equally weighted sum of the average of the squared difference, between the discrete correlation functions of the decimated images and the reference functions, leads to a definition of an overall error. During the first stages of the gradual decimation, the error remains relatively small and independent of the decimation procedure. Above a threshold defined by the correlation length of the reference function, the error becomes a function of the number of decimation steps. At this stage, some statistical information is lost
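The two-point correlation function used as a statistical descriptor above can be sketched directly for a small binary image. This brute-force version (periodic boundaries; an illustrative helper, not the authors' code) returns the probability that two pixels separated by a fixed displacement both belong to the phase of interest:

```python
def two_point_probability(img, dx, dy):
    """S2(dx, dy): probability that a pixel and the pixel displaced by
    (dx, dy) both have value 1, with periodic (wrap-around) boundaries."""
    rows, cols = len(img), len(img[0])
    hits = 0
    for i in range(rows):
        for j in range(cols):
            if img[i][j] == 1 and img[(i + dx) % rows][(j + dy) % cols] == 1:
                hits += 1
    return hits / (rows * cols)
```

At zero displacement S2 reduces to the volume fraction of the phase, which is the anchor point against which the normalized correlation functions in the study are compared.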
International Nuclear Information System (INIS)
Peebles, D.E.; Ohlhausen, J.A.; Kotula, P.G.; Hutton, S.; Blomfield, C.
2004-01-01
The acquisition of spectral images for x-ray photoelectron spectroscopy (XPS) is a relatively new approach, although it has been used with other analytical spectroscopy tools for some time. This technique provides full spectral information at every pixel of an image, in order to provide a complete chemical mapping of the imaged surface area. Multivariate statistical analysis techniques applied to the spectral image data allow the determination of chemical component species, and their distribution and concentrations, with minimal data acquisition and processing times. Some of these statistical techniques have proven to be very robust and efficient methods for deriving physically realistic chemical components without input by the user other than the spectral matrix itself. The benefits of multivariate analysis of the spectral image data include significantly improved signal to noise, improved image contrast and intensity uniformity, and improved spatial resolution - which are achieved due to the effective statistical aggregation of the large number of often noisy data points in the image. This work demonstrates the improvements in chemical component determination and contrast, signal-to-noise level, and spatial resolution that can be obtained by the application of multivariate statistical analysis to XPS spectral images
[Statistics for statistics?--Thoughts about psychological tools].
Berger, Uwe; Stöbel-Richter, Yve
2007-12-01
Statistical methods take a prominent place in psychologists' educational programs. Known as difficult to understand and hard to learn, these contents are feared by students. Those who do not aspire to a research career at a university quickly forget the drilled material. Furthermore, because at first glance it does not apply to work with patients and other target groups, the methodological education as a whole has often been questioned. For many psychological practitioners, statistical education seems to make sense only by commanding respect from other professions, namely physicians. In their own practice, statistics is rarely taken seriously as a professional tool. The reason seems clear: statistics treats numbers, while psychotherapy treats subjects. So, is statistics an end in itself? With this article, we try to answer the question of whether and how statistical methods are represented within psychotherapeutic and psychological research. To this end, we analyzed 46 original articles from a complete volume of the journal Psychotherapy, Psychosomatics, Psychological Medicine (PPmP). Within the volume, 28 different analysis methods were applied, of which 89 percent were directly based on statistics. Being able to write and critically read original articles, the backbone of research, presumes a high degree of statistical education. To ignore statistics is to ignore research, and ultimately to surrender one's own professional work to arbitrariness.
A panel data analysis of the determinants of oil consumption: The case of Australia
International Nuclear Information System (INIS)
Narayan, Paresh Kumar; Wong, Philip
2009-01-01
The goal of this paper is to examine the determinants of oil consumption for a panel consisting of six Australian States and one territory, namely Queensland, New South Wales, Victoria, Tasmania, South Australia, Western Australia, and the Northern territory, for the period 1985-2006. We find that oil consumption, oil prices and income are panel cointegrated. We estimate long-run elasticities and find that oil prices have had a statistically insignificant impact on oil consumption, while income has had a statistically significant positive effect on oil consumption. (author)
The Effects of Flare Definitions on the Statistics of Derived Flare Distributions
Ryan, Daniel; Dominique, Marie; Seaton, Daniel B.; Stegen, Koen; White, Arthur
2016-05-01
The statistical examination of solar flares is crucial to revealing their global characteristics and behaviour. However, statistical flare studies are often performed using standard but basic flare detection algorithms relying on arbitrary thresholds, which may affect the derived flare distributions. We explore the effect of the arbitrary thresholds used in the GOES event list and LYRA Flare Finder algorithms. We find that there is a small but significant relationship between the power law exponent of the GOES flare peak flux frequency distribution and the algorithms’ flare start thresholds. We also find that the power law exponents of these distributions are not stable but appear to steepen with increasing peak flux. This implies that the observed flare size distribution may not be a power law at all. We show that depending on the true value of the exponent of the flare size distribution, this deviation from a power law may be due to flares missed by the flare detection algorithms. However, it is not possible to determine the true exponent from GOES/XRS observations. Additionally we find that the PROBA2/LYRA flare size distributions are clearly non-power law. We show that this is consistent with an insufficient degradation correction, which causes LYRA absolute irradiance values to be unreliable. This means that they should not be used for flare statistics or energetics unless degradation is adequately accounted for. However, they can be used to study time variations over shorter timescales and for space weather monitoring.
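One standard way to estimate the exponent of a flare size distribution is the continuous power-law maximum-likelihood estimator of Clauset, Shalizi and Newman; to be clear, this generic estimator is an assumption for illustration, not the method used in the paper above:

```python
import math
import random

def powerlaw_mle_alpha(data, xmin):
    """Continuous power-law exponent MLE:
    alpha = 1 + n / sum(ln(x / xmin)), over the tail x >= xmin."""
    tail = [x for x in data if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

# Self-check on synthetic power-law data via inverse-transform sampling:
# if u ~ Uniform(0,1), then xmin * (1 - u)^(-1/(alpha - 1)) is power-law.
random.seed(0)
true_alpha, xmin = 2.5, 1.0
samples = [xmin * (1.0 - random.random()) ** (-1.0 / (true_alpha - 1.0))
           for _ in range(20000)]
alpha_hat = powerlaw_mle_alpha(samples, xmin)
```

Because the estimate depends on the chosen lower cut-off `xmin`, this estimator makes the paper's central point tangible: the threshold a detection algorithm imposes effectively sets `xmin` and can shift the derived exponent.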
Anwer, S; Equebal, A; Nezamuddin, M; Kumar, R; Lenka, P K
2013-09-01
The objective of this trial was to evaluate the effect of gender on strength gains after a five-week training programme that consisted of isometric exercise coupled with electromyographic biofeedback to the quadriceps muscle. Forty-three patients (20 men and 23 women) with knee osteoarthritis (OA) were placed into two groups based on their gender. Both groups performed isometric exercise coupled with electromyographic biofeedback five days a week for five weeks. Both groups showed gains in muscle strength after the five-week training. However, the difference between the two groups was statistically insignificant (P=0.224). The results suggest that gender did not affect gains in muscle strength from isometric exercise coupled with electromyographic biofeedback in patients with knee OA. Copyright © 2013 Elsevier Masson SAS. All rights reserved.
Inferring Demographic History Using Two-Locus Statistics.
Ragsdale, Aaron P; Gutenkunst, Ryan N
2017-06-01
Population demographic history may be learned from contemporary genetic variation data. Methods based on aggregating the statistics of many single loci into an allele frequency spectrum (AFS) have proven powerful, but such methods ignore potentially informative patterns of linkage disequilibrium (LD) between neighboring loci. To leverage such patterns, we developed a composite-likelihood framework for inferring demographic history from aggregated statistics of pairs of loci. Using this framework, we show that two-locus statistics are more sensitive to demographic history than single-locus statistics such as the AFS. In particular, two-locus statistics escape the notorious confounding of depth and duration of a bottleneck, and they provide a means to estimate effective population size based on the recombination rather than mutation rate. We applied our approach to a Zambian population of Drosophila melanogaster. Notably, using both single- and two-locus statistics, we inferred a substantially lower ancestral effective population size than previous works and did not infer a bottleneck history. Together, our results demonstrate the broad potential for two-locus statistics to enable powerful population genetic inference. Copyright © 2017 by the Genetics Society of America.
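The allele frequency spectrum the abstract contrasts against can be sketched in a few lines: given the derived-allele count at each polymorphic site in a sample of n chromosomes, the unfolded AFS is simply a histogram over those counts. This is an illustrative helper, not the authors' code:

```python
def unfolded_afs(derived_counts, n):
    """Unfolded allele frequency spectrum: afs[k] = number of sites at
    which the derived allele is carried by exactly k of the n sampled
    chromosomes (k = 0..n)."""
    afs = [0] * (n + 1)
    for k in derived_counts:
        afs[k] += 1  # one site contributes one count
    return afs
```

The paper's point is that this single-locus summary discards between-site linkage information, which is exactly what pairwise (two-locus) statistics recover.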
Tanaka, Haruki; Takahashi, Teruyuki; Ohashi, Norihiko; Tanaka, Koichi; Okada, Takenori; Kihara, Yasuki
2017-01-01
Abstract The aim of this study was to clarify the predictive value of fractional flow reserve (FFR) determined by myocardial perfusion imaging (MPI) using thallium (Tl)-201 IQ-SPECT without and with computed tomography-based attenuation correction (CT-AC) for patients with stable coronary artery disease (CAD). We assessed 212 angiographically identified diseased vessels using adenosine-stress Tl-201 MPI-IQ-SPECT/CT in 84 consecutive, prospectively identified patients with stable CAD. We compared the FFR in 136 of the 212 diseased vessels using visual semiquantitative interpretations of corresponding territories on MPI-IQ-SPECT images without and with CT-AC. FFR inversely correlated most accurately with regional summed difference scores (rSDS) in images without and with CT-AC (r = −0.584 and r = −0.568, respectively; both P values significant). The system can predict FFR at an optimal cut-off of <0.80, and we propose a novel application of CT-AC to MPI-IQ-SPECT for predicting clinically significant and insignificant FFR even in nonobese patients. PMID:29390486
Koparan, Timur; Güven, Bülent
2015-01-01
The aim of this study is to determine the effect of the project-based learning approach on 8th-grade secondary-school students' statistical literacy levels for data representation. To achieve this goal, a test consisting of 12 open-ended questions was developed in accordance with the views of experts. Seventy 8th-grade secondary-school students, 35…
International Nuclear Information System (INIS)
Dai, Wu-Sheng; Xie, Mi
2013-01-01
In this paper, we give a general discussion on the calculation of the statistical distribution from a given operator relation of creation, annihilation, and number operators. Our result shows that as long as the relation between the number operator and the creation and annihilation operators can be expressed as a†b = Λ(N) or N = Λ⁻¹(a†b), where N, a†, and b denote the number, creation, and annihilation operators, i.e., N is a function of a quadratic product of the creation and annihilation operators, the corresponding statistical distribution is the Gentile distribution, a statistical distribution in which the maximum occupation number is an arbitrary integer. As examples, we discuss the statistical distributions corresponding to various operator relations. In particular, besides the Bose–Einstein and Fermi–Dirac cases, we discuss the statistical distributions for various schemes of intermediate statistics, especially various q-deformation schemes. Our result shows that the statistical distributions corresponding to various q-deformation schemes are various Gentile distributions with different maximum occupation numbers, which are determined by the deformation parameter q. This result shows that the results given in much of the literature on the q-deformation distribution are inaccurate or incomplete. Highlights: • A general discussion on calculating the statistical distribution from relations of creation, annihilation, and number operators. • A systematic study of the statistical distributions corresponding to various q-deformation schemes. • An argument that many results on q-deformation distributions in the literature are inaccurate or incomplete.
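For concreteness, the standard closed form of the Gentile occupation number (a textbook result consistent with the abstract, not quoted from it) is, with x = β(ε − μ) and maximum occupancy G, ⟨n⟩ = 1/(e^x − 1) − (G + 1)/(e^{(G+1)x} − 1). It can be checked numerically that this interpolates between the familiar limits:

```python
import math

def gentile_occupation(x, G):
    """Mean occupation number for Gentile statistics with maximum
    occupancy G, where x = beta * (epsilon - mu) > 0.
    expm1(x) computes e**x - 1 accurately for small x."""
    return 1.0 / math.expm1(x) - (G + 1) / math.expm1((G + 1) * x)
```

Setting G = 1 reproduces the Fermi–Dirac distribution 1/(e^x + 1), while large G approaches the Bose–Einstein distribution 1/(e^x − 1), matching the abstract's claim that both are special cases of the Gentile distribution.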
Developing Statistical Evaluation Model of Introduction Effect of MSW Thermal Recycling
Aoyama, Makoto; Kato, Takeyoshi; Suzuoki, Yasuo
For the effective utilization of municipal solid waste (MSW) through thermal recycling, new technologies, such as an incineration plant using a Molten Carbonate Fuel Cell (MCFC), are being developed. The impact of new technologies should be evaluated statistically for various municipalities, so that the target of technological development, or the potential cost reduction due to the increased cumulative number of installed systems, can be discussed. For this purpose, we developed a model for discussing the impact of new technologies, in which a statistical mesh data set was utilized to estimate the heat demand around the incineration plant. This paper examines a case study using the developed model, in which a conventional-type and an MCFC-type MSW incineration plant are compared in terms of the reduction in primary energy and the revenue from both electricity and heat supply. Based on the difference in annual revenue, we calculate the allowable investment in an MCFC-type MSW incineration plant relative to a conventional plant. The results suggest that the allowable investment can be about 30 million yen/(t/day) in small municipalities, while it is only 10 million yen/(t/day) in large municipalities. The sensitivity analysis shows the model can be useful for discussing the differing impact of material recycling of plastics on thermal recycling technologies.
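The allowable-investment calculation described above amounts to capitalizing the annual revenue difference over the plant's life. A hedged sketch, where the ordinary-annuity form, discount rate, and lifetime are illustrative assumptions rather than the paper's model:

```python
def allowable_extra_investment(delta_annual_revenue, rate, years):
    """Upper bound on extra capital cost: present value of the annual
    revenue gain over the plant lifetime (ordinary annuity factor)."""
    annuity_factor = (1.0 - (1.0 + rate) ** -years) / rate
    return delta_annual_revenue * annuity_factor
```

For example, at a 5% discount rate over a 20-year life, each unit of extra annual revenue supports roughly 12.5 units of additional up-front investment, which is the kind of per-capacity figure (yen per t/day) the study reports.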
Statistical Thermodynamics and Microscale Thermophysics
Carey, Van P.
1999-08-01
Many exciting new developments in microscale engineering are based on the application of traditional principles of statistical thermodynamics. In this text Van Carey offers a modern view of thermodynamics, interweaving classical and statistical thermodynamic principles and applying them to current engineering systems. He begins with coverage of microscale energy storage mechanisms from a quantum mechanics perspective and then develops the fundamental elements of classical and statistical thermodynamics. Subsequent chapters discuss applications of equilibrium statistical thermodynamics to solid, liquid, and gas phase systems. The remainder of the book is devoted to nonequilibrium thermodynamics of transport phenomena and to nonequilibrium effects and noncontinuum behavior at the microscale. Although the text emphasizes mathematical development, Carey includes many examples and exercises to illustrate how the theoretical concepts are applied to systems of scientific and engineering interest. In the process he offers a fresh view of statistical thermodynamics for advanced undergraduate and graduate students, as well as practitioners, in mechanical, chemical, and materials engineering.
Hayslett, H T
1991-01-01
Statistics covers the basic principles of statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the
Effect of moulding sand on statistically controlled hybrid rapid casting solution for zinc alloys
Energy Technology Data Exchange (ETDEWEB)
Singh, Rupinder [Guru Nanak Dev Engineering College, Ludhiana (India)
2010-08-15
The purpose of the present investigation is to study the effect of moulding sand on decreasing the shell wall thickness of mould cavities for an economical and statistically controlled hybrid rapid casting solution (a combination of three-dimensional printing and conventional sand casting) for zinc alloys. Starting from the identification of a component/benchmark, technological prototypes were produced at different shell wall thicknesses supported by three different types of sand (namely: dry, green and molasses). Prototypes prepared by the proposed process are for assembly-check purposes and not for functional validation of the parts. The study suggested that a shell wall with a less than recommended thickness (12 mm) is more suitable for dimensional accuracy. The best dimensional accuracy was obtained at 3 mm shell wall thickness with green sand. The process was found to be under statistical control.
International Nuclear Information System (INIS)
Pinotti, E.; Brenna, M.; Puppin, E.
2008-01-01
In magneto-optical Kerr measurements of the Barkhausen noise, a magnetization jump ΔM due to a domain reversal produces a variation ΔI of the intensity of a laser beam reflected by the sample, which is the physical quantity actually measured. Due to the non-uniform beam intensity profile, the magnitude of ΔI depends both on ΔM and on its position on the laser spot. This could distort the statistical distribution p(ΔI) of the measured ΔI with respect to the true distribution p(ΔM) of the magnetization jumps ΔM. In this work the exact relationship between the two distributions is derived in a general form, which will be applied to some possible beam profiles. It will be shown that in most cases the usual Gaussian beam produces a negligible statistical distortion. Moreover, for small ΔI the noise of the experimental setup can also distort the statistical distribution p(ΔI), by erroneously rejecting small ΔI as noise. This effect has been calculated for white noise, and it will be shown that it is relatively small but not totally negligible as the measured ΔI approaches the detection limit
Novak, Elena; Johnson, Tristan E.; Tenenbaum, Gershon; Shute, Valerie J.
2016-01-01
The study explored instructional benefits of a storyline gaming characteristic (GC) on learning effectiveness, efficiency, and engagement with the use of an online instructional simulation for graduate students in an introductory statistics course. A storyline is a game-design element that connects scenes with the educational content. In order to…
Directory of Open Access Journals (Sweden)
Takahiro eKawabe
2013-09-01
Humans can acquire the statistical features of the external world and employ them to control behaviour. Some external events occur in harmony with an agent's action, and thus humans should also be able to acquire the statistical features relating an action to its external outcome. We report that acquired action-outcome statistical features alter the visual appearance of the action outcome. Pressing either of two assigned keys triggered visual motion whose direction was statistically biased either upward or downward, and observers judged the stimulus motion direction. Points of subjective equality (PSE) for judging motion direction were shifted repulsively from the mean of the distribution associated with each key. Our Bayesian model accounted for the PSE shifts, indicating optimal acquisition of the action-outcome statistical relation. The PSE shifts were moderately attenuated when the action-outcome contingency was reduced; the Bayesian model again accounted for the attenuated shifts. On the other hand, when the action-outcome contiguity was greatly reduced, the PSE shifts were greatly attenuated, and the Bayesian model could not account for them. The results indicate that visual appearance can be modified by prediction based on the optimal acquisition of the action-outcome causal relation.
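A minimal Gaussian sketch of this kind of Bayesian account (an assumed form, not the authors' exact model): the posterior percept is pulled toward the prior associated with the key, so the physical stimulus that appears neutral (the PSE) shifts to the opposite side of zero, i.e. repulsively.

```python
def posterior_mean(measurement, prior_mean, prior_var, noise_var):
    """Posterior mean for a Gaussian prior combined with a Gaussian likelihood."""
    w = prior_var / (prior_var + noise_var)   # weight on the measurement
    return w * measurement + (1.0 - w) * prior_mean

def pse(prior_mean, prior_var, noise_var):
    """Stimulus whose *perceived* direction is neutral (posterior mean = 0).
    It lies on the opposite side of zero from the prior mean: a repulsive shift."""
    return -(noise_var / prior_var) * prior_mean

# key associated with an upward-biased prior (mean +1) -> PSE shifts downward
shift = pse(1.0, 1.0, 0.5)
```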
Genton, Marc G.
2015-04-14
This paper explores the use of visualization through animations, coined visuanimation, in the field of statistics. In particular, it illustrates the embedding of animations in the paper itself and the storage of larger movies in the online supplemental material. We present results from statistics research projects using a variety of visuanimations, ranging from exploratory data analysis of image data sets to spatio-temporal extreme event modelling; these include a multiscale analysis of classification methods, the study of the effects of a simulated explosive volcanic eruption and an emulation of climate model output. This paper serves as an illustration of visuanimation for future publications in Stat. Copyright © 2015 John Wiley & Sons, Ltd.
Extending statistical boosting. An overview of recent methodological developments.
Mayr, A; Binder, H; Gefeller, O; Schmid, M
2014-01-01
Boosting algorithms to simultaneously estimate and select predictor effects in statistical models have gained substantial interest during the last decade. This review highlights recent methodological developments regarding boosting algorithms for statistical modelling, especially focusing on topics relevant for biomedical research. We suggest a unified framework for gradient boosting and likelihood-based boosting (statistical boosting), which have been addressed separately in the literature up to now. The methodological developments on statistical boosting during the last ten years can be grouped into three different lines of research: i) efforts to ensure variable selection leading to sparser models, ii) developments regarding different types of predictor effects and how to choose them, iii) approaches to extend the statistical boosting framework to new regression settings. Statistical boosting algorithms have been adapted to carry out unbiased variable selection and automated model choice during the fitting process and can nowadays be applied in almost any regression setting in combination with a large number of different types of predictor effects.
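To make the gradient-boosting idea concrete, here is a minimal squared-error sketch with regression stumps; it is illustrative only, not the likelihood-based machinery (e.g. mboost-style component-wise boosting) that the review covers, and the toy data are invented.

```python
def fit_stump(x, residuals):
    """Best single-split stump for squared error on 1-D inputs."""
    best = None
    order = sorted(range(len(x)), key=lambda i: x[i])
    for k in range(1, len(x)):
        split = (x[order[k - 1]] + x[order[k]]) / 2.0
        left = [residuals[i] for i in range(len(x)) if x[i] <= split]
        right = [residuals[i] for i in range(len(x)) if x[i] > split]
        if not left or not right:
            continue
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        sse = sum((r - ml) ** 2 for r in left) + sum((r - mr) ** 2 for r in right)
        if best is None or sse < best[0]:
            best = (sse, split, ml, mr)
    _, split, ml, mr = best
    return lambda v: ml if v <= split else mr

def boost(x, y, rounds=50, nu=0.1):
    """Gradient boosting for L2 loss: repeatedly fit stumps to the residuals."""
    f0 = sum(y) / len(y)
    stumps, pred = [], [f0] * len(y)
    for _ in range(rounds):
        resid = [yi - pi for yi, pi in zip(y, pred)]
        s = fit_stump(x, resid)
        stumps.append(s)
        pred = [pi + nu * s(xi) for pi, xi in zip(pred, x)]
    return lambda v: f0 + nu * sum(s(v) for s in stumps)

# toy data: a noiseless step function is recovered almost exactly
x = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]
y = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]
model = boost(x, y, rounds=200)
```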
A generalization of Friedman's rank statistic
Kroon, de J.; Laan, van der P.
1983-01-01
In this paper a very natural generalization of the two-way analysis of variance rank statistic of FRIEDMAN is given. The general distribution-free test procedure based on this statistic for the effect of J treatments in a random block design can be applied in general two-way layouts without
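For reference, the classical Friedman statistic being generalized can be computed as follows (a sketch assuming no ties within blocks; the toy data are illustrative):

```python
def friedman_statistic(blocks):
    """Friedman chi-square statistic for b blocks x k treatments (no ties).

    chi2_F = 12 / (b*k*(k+1)) * sum_j R_j^2 - 3*b*(k+1),
    where R_j is the sum over blocks of the within-block rank of treatment j.
    """
    b, k = len(blocks), len(blocks[0])
    rank_sums = [0] * k
    for row in blocks:
        order = sorted(range(k), key=lambda j: row[j])
        for rank, j in enumerate(order, start=1):
            rank_sums[j] += rank
    return 12.0 / (b * k * (k + 1)) * sum(r * r for r in rank_sums) - 3.0 * b * (k + 1)

# three blocks that all rank the treatments identically -> maximal statistic (6.0 here)
chi2 = friedman_statistic([[1.2, 2.3, 3.1], [0.9, 1.8, 2.7], [1.1, 2.0, 3.5]])
```

With b = 3 blocks and k = 3 treatments in perfect agreement, the rank sums are 3, 6, 9 and the statistic attains its maximum of 6.0.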
Energy Technology Data Exchange (ETDEWEB)
1990-05-01
This report is a collection of papers documenting presentations made at the VIII ASA (American Statistical Association) Conference on Radiation and Health entitled Health Effects of Electric and Magnetic Fields: Statistical Support for Research Strategies. Individual papers are abstracted and indexed for the database.
Zeno dynamics in quantum statistical mechanics
International Nuclear Information System (INIS)
Schmidt, Andreas U
2003-01-01
We study the quantum Zeno effect in quantum statistical mechanics within the operator algebraic framework. We formulate a condition for the appearance of the effect in W*-dynamical systems, in terms of the short-time behaviour of the dynamics. Examples of quantum spin systems show that this condition can be effectively applied to quantum statistical mechanical models. Furthermore, we derive an explicit form of the Zeno generator, and use it to construct Gibbs equilibrium states for the Zeno dynamics. As a concrete example, we consider the X-Y model, for which we show that a frequent measurement at a microscopic level, e.g. a single lattice site, can produce a macroscopic effect in changing the global equilibrium
Energy Technology Data Exchange (ETDEWEB)
Franconetti, P.; Candel, J. J.; Vicente, A.; Amigo, V.
2013-07-01
Niobium and tantalum are added to titanium alloys to form new beta alloys with higher biocompatibility for biomedical applications. Both elements have a high melting point, which is the reason for their limited solid-state diffusion. In this work, samples of titanium with 3 at.% niobium and tantalum have been manufactured by powder metallurgy. The effect of the compacting pressure, the temperature and the sintering time on the strength, elasticity and ductility in bending has been studied. The results show that both elements behave similarly: flexural strength increases by 20-25%, elasticity by 0-10% and ductility by over 150%. Therefore, the addition of these elements is beneficial to the mechanical properties. Statistical analysis shows that the effects of temperature and pressure are important, while the effect of time is insignificant and even harmful in these alloys. (Author)
Directory of Open Access Journals (Sweden)
Zheng Xie
2013-01-01
The intrinsic variability of nanoscale VLSI technology must be taken into account when analyzing circuit designs to predict likely yield. Monte Carlo (MC) and quasi-MC (QMC) based statistical techniques do this by analysing many randomised or quasi-randomised copies of circuits. The randomisation must model forms of variability that occur in nano-CMOS technology, including “atomistic” effects without intra-die correlation and effects with intra-die correlation between neighbouring devices. A major problem is the computational cost of carrying out sufficient analyses to produce statistically reliable results. The use of principal components analysis, behavioural modelling, and an implementation of “Statistical Blockade” (SB) is shown to be capable of achieving a significant reduction in computational cost. A computation time reduction of 98.7% was achieved for a commonly used asynchronous circuit element. Replacing MC by QMC analysis can achieve a further computation reduction, and this is illustrated for more complex circuits, with the results being compared with those of transistor-level simulations. The “yield prediction” analysis of SRAM arrays is taken as a case study, where the arrays contain up to 1536 transistors modelled using parameters appropriate to 35 nm technology. Savings of up to 99.85% in computation time were obtained.
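The yield-prediction task can be sketched with plain Monte Carlo, the baseline that techniques such as Statistical Blockade accelerate by simulating only likely tail samples. The normal delay model, spec limit, and sample count below are illustrative assumptions, not the paper's 35 nm parameters.

```python
import random

def mc_yield(n_samples, spec, sigma=1.0, seed=1):
    """Plain Monte Carlo yield estimate: the fraction of randomised circuit
    'copies' whose (hypothetical, normally distributed) delay meets spec.
    Statistical Blockade would instead train a classifier to focus
    simulation effort on the rare failing tail; plain MC shown for contrast."""
    rng = random.Random(seed)
    passed = sum(1 for _ in range(n_samples) if rng.gauss(0.0, sigma) < spec)
    return passed / n_samples

est = mc_yield(200_000, spec=1.645)  # model truth: Phi(1.645), about 0.95
```

The slow convergence visible here (a six-figure sample count for two-digit accuracy on the yield) is exactly the cost problem the abstract describes.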
Ganju, Jitendra; Yu, Xinxin; Ma, Guoguang Julie
2013-01-01
Formal inference in randomized clinical trials is based on controlling the type I error rate associated with a single pre-specified statistic. The deficiency of using just one method of analysis is that it depends on assumptions that may not be met. For robust inference, we propose pre-specifying multiple test statistics and relying on the minimum p-value for testing the null hypothesis of no treatment effect. The null hypothesis associated with the various test statistics is that the treatment groups are indistinguishable. The critical value for hypothesis testing comes from permutation distributions. Rejection of the null hypothesis when the smallest p-value is less than the critical value controls the type I error rate at its designated value. Even if one of the candidate test statistics has low power, the adverse effect on the power of the minimum p-value statistic is small. Its use is illustrated with examples. We conclude that it is better to rely on the minimum p-value rather than a single statistic, particularly when that single statistic is the logrank test, because of the cost and complexity of many survival trials. Copyright © 2013 John Wiley & Sons, Ltd.
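The minimum p-value approach can be sketched with a small permutation routine in the Westfall–Young style: each candidate statistic gets a permutation p-value, and the observed minimum p-value is then referred to the permutation distribution of the minimum itself. The two candidate statistics (difference in means and in medians), the toy data, and the permutation count are illustrative assumptions, not from the paper.

```python
import random

def mean_diff(a, b):
    return sum(a) / len(a) - sum(b) / len(b)

def median(v):
    s = sorted(v)
    m = len(s) // 2
    return s[m] if len(s) % 2 else (s[m - 1] + s[m]) / 2.0

def median_diff(a, b):
    return median(a) - median(b)

def min_p_test(x, y, stats, n_perm=999, seed=7):
    """Two-sided permutation test based on the minimum p-value over
    several candidate statistics (group labels are permuted)."""
    rng = random.Random(seed)
    pooled, n = list(x) + list(y), len(x)
    obs = [abs(s(x, y)) for s in stats]
    perms = []
    for _ in range(n_perm):
        rng.shuffle(pooled)
        perms.append([abs(s(pooled[:n], pooled[n:])) for s in stats])
    dists = list(zip(*perms))  # per-statistic permutation values

    def p_of(vals, v):  # permutation p-value of value v for one statistic
        return (1 + sum(1 for w in vals if w >= v)) / (n_perm + 1)

    p_obs = min(p_of(dists[j], obs[j]) for j in range(len(stats)))
    # refer the observed minimum p-value to its own permutation distribution
    p_null = [min(p_of(dists[j], perms[b][j]) for j in range(len(stats)))
              for b in range(n_perm)]
    return (1 + sum(1 for q in p_null if q <= p_obs)) / (n_perm + 1)

# clearly separated toy groups -> small adjusted p-value
x = [2.1, 2.4, 2.2, 2.6, 2.3]
y = [0.1, 0.4, 0.3, 0.2, 0.5]
p = min_p_test(x, y, [mean_diff, median_diff])
```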
An experimental study of radioprotective effect of ginseng alkaloid fraction on cellular damage
Energy Technology Data Exchange (ETDEWEB)
Yoo, Seong Yul; Cho, Chul Koo; Kim, Mi Sook; Yoo, Hyung Jun; Kim, Seong Ho; Kim, Tae Hwan [Korea Cancer Center Hospital, Seoul (Korea, Republic of)
1997-09-01
This paper assesses the effect of Adaptagen, whose main component is the alkaloid fraction of ginseng, as a radioprotector. Evaluation was made in in vitro and in vivo studies with NIGP(S) mice by measurement of the regeneration of jejunal crypt cells and by micronucleus assay, to analyze the radioprotective effect of the ginseng alkaloid fraction in comparison with that of the water fraction after whole-body irradiation. The results were as follows. 1. The degree of radiation damage to mouse jejunal crypt cells was diminished in both the alkaloid and water fraction groups compared to the control group, but more in the alkaloid fraction group than in the water fraction group. 2. Regeneration of mouse jejunal crypt cells was higher in both the alkaloid and water fraction groups than in the control group. 3. In the in vitro study, the frequency of micronuclei tended to be diminished in the treated groups compared to the control group, but the difference was statistically insignificant. 4. In the in vitro study, the frequency of micronuclei was diminished in both the alkaloid and water fraction groups compared to the control group, but more in the alkaloid fraction group than in the water fraction group.
Peculiar velocity effects in high-resolution microwave background experiments
International Nuclear Information System (INIS)
Challinor, Anthony; Leeuwen, Floor van
2002-01-01
We investigate the impact of peculiar velocity effects due to the motion of the solar system relative to the cosmic microwave background (CMB) on high resolution CMB experiments. It is well known that on the largest angular scales the combined effects of Doppler shifts and aberration are important; the lowest Legendre multipoles of total intensity receive power from the large CMB monopole in transforming from the CMB frame. On small angular scales aberration dominates and is shown here to lead to significant distortions of the total intensity and polarization multipoles in transforming from the rest frame of the CMB to the frame of the solar system. We provide convenient analytic results for the distortions as series expansions in the relative velocity of the two frames, but at the highest resolutions a numerical quadrature is required. Although many of the high resolution multipoles themselves are severely distorted by the frame transformations, we show that their statistical properties distort by only an insignificant amount. Therefore, the cosmological parameter estimation is insensitive to the transformation from the CMB frame (where theoretical predictions are calculated) to the rest frame of the experiment
Directory of Open Access Journals (Sweden)
Timo Alaviuhkola
1993-12-01
Two different grinding methods - rolling and hammer milling - as well as polyacrylate supplementation of the diet were studied to evaluate their effect on the performance of pigs and the incidence of gastric lesions. The experiment was carried out in a 2 x 2 factorial arrangement with a total of 160 pigs. The grist size of rolled barley was bigger than that of hammer-milled barley, but the difference in water-binding capacity was insignificant. No significant differences were observed in the performance traits of pigs fed either rolled or hammer-milled barley. The sodium polyacrylate supplement had no effect on the daily gain, feed:gain ratio or carcass quality of the pigs. Gastric ulcers and constrictions of the oesophageal opening of the stomach were more frequent in the groups fed hammer-milled barley than in the groups fed rolled barley, the difference being statistically significant (P
An Analysis of the Effectiveness of the Constructivist Approach in Teaching Business Statistics
Directory of Open Access Journals (Sweden)
Greeni Maheshwari
2017-05-01
Aim/Purpose: The main aim of the research is to examine the performance of second-language English-speaking students enrolled in a Business Statistics course and to investigate the academic performance of students when taught under the constructivist and non-constructivist approaches in a classroom environment. Background: There are different learning theories that are established based on how students learn. Each of these theories has its own benefits for different types of learners and contexts. The students in this research are new to the university environment and to a challenging technical course like Business Statistics. This research was carried out to examine the effectiveness of the constructivist approach in motivating students and increasing their engagement and academic performance. Methodology: A total of 1373 students were involved in the quasi-experiment, selected using the stratified sampling method, from 2015 until 2016. Contribution: To consider curriculum adjustments for first-year programs and implications for teacher education. Findings: The t-test for unequal variances was used to compare mean scores. Results indicate that students have a higher motivation level and achieve higher mean scores when taught using the constructivist teaching approach than under the non-constructivist approach. Recommendations for Practitioners: To consider the challenges faced by first-year students and create a teaching approach that fits their needs. Recommendation for Researchers: To explore in depth other teaching approaches to the Business Statistics course for improving students' academic performance. Impact on Society: The constructivist approach will enable learning to be enjoyable and students to be more confident. Future Research: The research will assist other lecturers teaching Business Statistics in creating a more conducive environment to encourage second language English
Statistical learning in social action contexts.
Monroy, Claire; Meyer, Marlene; Gerson, Sarah; Hunnius, Sabine
2017-01-01
Sensitivity to the regularities and structure contained within sequential, goal-directed actions is an important building block for generating expectations about the actions we observe. Until now, research on statistical learning for actions has solely focused on individual action sequences, but many actions in daily life involve multiple actors in various interaction contexts. The current study is the first to investigate the role of statistical learning in tracking regularities between actions performed by different actors, and whether the social context characterizing their interaction influences learning. That is, are observers more likely to track regularities across actors if they are perceived as acting jointly as opposed to in parallel? We tested adults and toddlers to explore whether social context guides statistical learning and-if so-whether it does so from early in development. In a between-subjects eye-tracking experiment, participants were primed with a social context cue between two actors who either shared a goal of playing together ('Joint' condition) or stated the intention to act alone ('Parallel' condition). In subsequent videos, the actors performed sequential actions in which, for certain action pairs, the first actor's action reliably predicted the second actor's action. We analyzed predictive eye movements to upcoming actions as a measure of learning, and found that both adults and toddlers learned the statistical regularities across actors when their actions caused an effect. Further, adults with high statistical learning performance were sensitive to social context: those who observed actors with a shared goal were more likely to correctly predict upcoming actions. In contrast, there was no effect of social context in the toddler group, regardless of learning performance. These findings shed light on how adults and toddlers perceive statistical regularities across actors depending on the nature of the observed social situation and the
Research, statistics and mathematics educators in Nigeria: effect ...
African Journals Online (AJOL)
Over reliance on the perspective of a dichotomous reject or fail-to-reject outcome from a null hypothesis testing framework to answer research questions has become a worrisome issue to research methodologists and statistics experts. Thus, the Journals of Mathematical Association of Nigeria, Abacus (2013 & 2014) were ...
Analysis of Variance: What Is Your Statistical Software Actually Doing?
Li, Jian; Lomax, Richard G.
2011-01-01
Users assume statistical software packages produce accurate results. In this article, the authors systematically examined Statistical Package for the Social Sciences (SPSS) and Statistical Analysis System (SAS) for 3 analysis of variance (ANOVA) designs, mixed-effects ANOVA, fixed-effects analysis of covariance (ANCOVA), and nested ANOVA. For each…
Effect of time, dose and fractionation on local control of nasopharyngeal carcinoma
International Nuclear Information System (INIS)
Lee, Anne W.M.; Chan, David K.K.; Poon, Y.F.; Foo, William; Law, Stephen C.K.; O, S.K.; Tung, Stewart Y.; Fowler, Jack F.; Chappell, Rick
1995-01-01
To study the effect of radiation factors on local control of nasopharyngeal carcinoma, 1008 patients with similarly staged T1N0-3M0 disease (Ho's classification) were retrospectively analyzed. All patients were treated by megavoltage irradiation alone using the same technique. Four different fractionation schedules had been used sequentially during 1976-1985: with total dose ranging from 45.6 to 60 Gy and fractional dose from 2.5 to 4.2 Gy. The median overall time was 39 days (range = 38-75 days). Both for the whole series and 763 patients with nodal control, total dose was the most important radiation factor. The hazard of local failure decreased by 9% per additional Gy (p < 0.01). Biological equivalents expressed in terms of Biologically Effective Dose or Nominal Standard Dose also showed strong correlation. Fractional dose had no significant impact. The effect of overall treatment time was insignificant for the whole series, but almost reached statistical significance for those with nodal control (p = 0.06). Further study is required for elucidation, as 85% of patients completed treatment within a very narrow range (38-42 days), and the possible hazard is clinically too significant to be ignored
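The Biologically Effective Dose used above follows the standard linear-quadratic formula BED = n · d · (1 + d/(α/β)); a minimal sketch (the α/β value is an illustrative conventional choice, not taken from the paper):

```python
def bed(n_fractions, dose_per_fraction, alpha_beta):
    """Biologically Effective Dose under the linear-quadratic model:
    BED = n * d * (1 + d / (alpha/beta)), all doses in Gy."""
    return n_fractions * dose_per_fraction * (1.0 + dose_per_fraction / alpha_beta)

# e.g. 30 fractions of 2 Gy with alpha/beta = 10 Gy (a common tumour value)
total = bed(30, 2.0, 10.0)  # 60 Gy physical dose -> 72 Gy BED
```

The formula makes the abstract's point concrete: schedules with equal physical dose but different fraction sizes d yield different biological equivalents.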
International Nuclear Information System (INIS)
Eliazar, Iddo
2017-01-01
The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
Feiveson, Alan H.; Ploutz-Snyder, Robert; Fiedler, James
2011-01-01
In their 2009 Annals of Statistics paper, Gavrilov, Benjamini, and Sarkar report the results of a simulation assessing the robustness of their adaptive step-down procedure (GBS) for controlling the false discovery rate (FDR) when normally distributed test statistics are serially correlated. In this study we extend the investigation to the case of multiple comparisons involving correlated non-central t-statistics, in particular when several treatments or time periods are being compared to a control in a repeated-measures design with many dependent outcome measures. In addition, we consider several dependence structures other than serial correlation and illustrate how the FDR depends on the interaction between effect size and the type of correlation structure as indexed by Foerstner's distance metric from an identity matrix. The relationship between the correlation matrix R of the original dependent variables and R̃, the correlation matrix of the associated t-statistics, is also studied. In general R̃ depends not only on R, but also on sample size and the signed effect sizes for the multiple comparisons.
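For orientation only: the paper concerns the adaptive GBS step-down procedure, which is more involved; the sketch below implements the classical Benjamini–Hochberg step-up procedure, the standard FDR-control baseline, valid under independence or positive dependence of the p-values.

```python
def benjamini_hochberg(p_values, q=0.05):
    """Classical BH step-up: reject the k smallest p-values, where k is the
    largest rank with p_(k) <= (k / m) * q. Returns indices of rejections."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if p_values[i] <= rank / m * q:
            k = rank
    return sorted(order[:k])

rejected = benjamini_hochberg([0.01, 0.02, 0.03, 0.5], q=0.05)  # first three
```

Note the step-up scan keeps the *largest* qualifying rank, so a small p-value can rescue slightly larger ones below the line, which is what distinguishes FDR control from a simple Bonferroni cut-off.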
Statistical mechanics for a class of quantum statistics
International Nuclear Information System (INIS)
Isakov, S.B.
1994-01-01
Generalized statistical distributions for identical particles are introduced for the case where filling a single-particle quantum state by particles depends on filling states of different momenta. The system of one-dimensional bosons with a two-body potential that can be solved by means of the thermodynamic Bethe ansatz is shown to be equivalent thermodynamically to a system of free particles obeying statistical distributions of the above class. The quantum statistics arising in this way are completely determined by the two-particle scattering phases of the corresponding interacting systems. An equation determining the statistical distributions for these statistics is derived
Directory of Open Access Journals (Sweden)
Dominic Beaulieu-Prévost
2006-03-01
For the last 50 years of research in the quantitative social sciences, the empirical evaluation of scientific hypotheses has been based on the rejection or non-rejection of the null hypothesis. However, more than 300 articles have demonstrated that this method is problematic. In summary, null hypothesis testing (NHT) is unfalsifiable, its results depend directly on sample size, and the null hypothesis is both improbable and implausible. Consequently, alternatives to NHT such as confidence intervals (CIs) and measures of effect size are starting to be used in scientific publications. The purpose of this article is, first, to provide the conceptual tools necessary to implement an approach based on confidence intervals, and second, to briefly demonstrate why such an approach is an interesting alternative to an approach based on NHT. As demonstrated in the article, the proposed CI approach avoids most problems related to the NHT approach and can often improve the scientific and contextual relevance of statistical interpretations by testing range hypotheses instead of a point hypothesis and by defining the minimal value of a substantial effect. The main advantage of such a CI approach is that it replaces the notion of statistical power with an easily interpretable three-value logic (probable presence of a substantial effect, probable absence of a substantial effect, and probabilistic indeterminacy). The demonstration includes a complete example.
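The three-value logic mentioned above can be sketched directly; the verdict labels and the minimal-effect threshold below are illustrative choices, not the article's wording.

```python
def ci_verdict(ci_low, ci_high, minimal_effect):
    """Three-value logic for a confidence interval on an effect size, given a
    pre-specified minimal substantial effect (all on the same scale)."""
    if ci_low > minimal_effect:
        return "substantial effect probably present"
    if ci_high < minimal_effect:
        return "substantial effect probably absent"
    return "undetermined"

# minimal substantial effect set to 0.30 for illustration
a = ci_verdict(0.45, 0.80, 0.30)  # whole CI above the threshold
b = ci_verdict(0.02, 0.18, 0.30)  # whole CI below the threshold
c = ci_verdict(0.10, 0.60, 0.30)  # CI straddles the threshold
```

Unlike a reject/fail-to-reject dichotomy, the third outcome explicitly flags that the data are too imprecise to decide, which is the role power analysis plays in the NHT framework.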
Finding the Best Feature Detector-Descriptor Combination
DEFF Research Database (Denmark)
Dahl, Anders Lindbjerg; Aanæs, Henrik; Pedersen, Kim Steenstrup
2011-01-01
, not statistically significantly better than some other methods. As a byproduct of this investigation, we have also tested various DAISY type descriptors, and found that the difference among their performance is statistically insignificant using this dataset. Furthermore, we have not been able to produce results...
Statistical Power in Plant Pathology Research.
Gent, David H; Esker, Paul D; Kriss, Alissa B
2018-01-01
In null hypothesis testing, failure to reject a null hypothesis may have two potential interpretations. One interpretation is that the treatments being evaluated do not have a significant effect, and a correct conclusion was reached in the analysis. Alternatively, a treatment effect may have existed but the conclusion of the study was that there was none. This is termed a Type II error, which is most likely to occur when studies lack sufficient statistical power to detect a treatment effect. In basic terms, the power of a study is the ability to identify a true effect through a statistical test. The power of a statistical test is 1 - (the probability of Type II errors), and depends on the size of treatment effect (termed the effect size), variance, sample size, and significance criterion (the probability of a Type I error, α). Low statistical power is prevalent in scientific literature in general, including plant pathology. However, power is rarely reported, creating uncertainty in the interpretation of nonsignificant results and potentially underestimating small, yet biologically significant relationships. The appropriate level of power for a study depends on the impact of Type I versus Type II errors and no single level of power is acceptable for all purposes. Nonetheless, by convention 0.8 is often considered an acceptable threshold and studies with power less than 0.5 generally should not be conducted if the results are to be conclusive. The emphasis on power analysis should be in the planning stages of an experiment. Commonly employed strategies to increase power include increasing sample sizes, selecting a less stringent threshold probability for Type I errors, increasing the hypothesized or detectable effect size, including as few treatment groups as possible, reducing measurement variability, and including relevant covariates in analyses. Power analysis will lead to more efficient use of resources and more precisely structured hypotheses, and may even
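For a rough sense of the power calculation discussed above, here is a normal-approximation sketch for a two-sided two-sample test; the effect size and group size are illustrative, and a real design should use the non-central t distribution for small samples.

```python
import math

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def power_two_sample(effect_size, n_per_group, alpha=0.05):
    """Approximate power of a two-sided two-sample test (normal approximation):
    power ~ Phi(d * sqrt(n/2) - z_{1 - alpha/2}), with d the standardized
    effect size and n the per-group sample size."""
    z_crit = 1.959963984540054  # z for alpha = 0.05, two-sided
    return phi(effect_size * math.sqrt(n_per_group / 2.0) - z_crit)

# a medium effect (d = 0.5) with 64 subjects per group lands near the
# conventional 0.8 power threshold
p = power_two_sample(0.5, 64)
```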
Semenov, Alexander V; Elsas, Jan Dirk; Glandorf, Debora C M; Schilthuizen, Menno; Boer, Willem F
2013-08-01
To fulfill existing guidelines, applicants that aim to place their genetically modified (GM) insect-resistant crop plants on the market are required to provide data from field experiments that address the potential impacts of the GM plants on nontarget organisms (NTOs). Such data may be based on varied experimental designs. The recent EFSA guidance document for environmental risk assessment (2010) does not provide clear and structured suggestions that address the statistics of field trials on effects on NTOs. This review examines existing practices in GM plant field testing, such as the choice of randomization, replication, and pseudoreplication. Emphasis is placed on the importance of design features used for the field trials in which effects on NTOs are assessed. The importance of statistical power and the positive and negative aspects of various statistical models are discussed. Equivalence and difference testing are compared, and the importance of checking the distribution of experimental data is stressed as a basis for selecting the proper statistical model. While for continuous data (e.g., pH and temperature) classical statistical approaches, for example analysis of variance (ANOVA), are appropriate, for discontinuous data (counts) only generalized linear models (GLM) are shown to be efficient. There is no golden rule as to which statistical test is the most appropriate for any experimental situation. In particular, in experiments in which block designs are used and covariates play a role, GLMs should be used. Generic advice is offered that will help in both the setting up of field testing and the interpretation and analysis of the data obtained in this testing. The combination of decision trees and a checklist for field trials, which are provided, will help in the interpretation of the statistical analyses of field trials and in assessing whether such analyses were correctly applied. We offer generic advice to risk assessors and applicants that will…
The Effect of Gender on Stress Factors: An Exploratory Study among University Students
Directory of Open Access Journals (Sweden)
Michelle Calvarese
2015-11-01
Full Text Available This study examined the relationship between gender and reactions to stress among university students. University students were surveyed on how they typically responded when under perceived stress. There were significant differences between males and females concerning their reactions to stress. Overall, more females experienced higher levels of depression, frustration, and anxiety than their male counterparts when reacting to stress. Males also tended to have other psychological reactions different from those listed on the survey. In addition, although anger as a reaction to stress fell just short of statistical significance, more females than males expressed anger.
Festing, Michael F W
2014-01-01
The safety of chemicals, drugs, novel foods and genetically modified crops is often tested using repeat-dose sub-acute toxicity tests in rats or mice. It is important to avoid misinterpretations of the results as these tests are used to help determine safe exposure levels in humans. Treated and control groups are compared for a range of haematological, biochemical and other biomarkers which may indicate tissue damage or other adverse effects. However, the statistical analysis and presentation of such data pose problems due to the large number of statistical tests involved. Often, it is not clear whether a "statistically significant" effect is real or a false positive (type I error) due to sampling variation. The authors' conclusions appear to be reached somewhat subjectively from the pattern of statistical significances, discounting those which they judge to be type I errors and ignoring any biomarker where the p-value is greater than 0.05. However, by using standardised effect sizes (SESs), a range of graphical methods and an overall assessment of the mean absolute response can be made. The approach is an extension, not a replacement, of existing methods. It is intended to assist toxicologists and regulators in the interpretation of the results. Here, the SES analysis has been applied to data from nine published sub-acute toxicity tests in order to compare the findings with those of the original authors. Line plots, box plots and bar plots show the pattern of response. Dose-response relationships are easily seen. A "bootstrap" test compares the mean absolute differences across dose groups. In four of the seven papers in which the authors estimated a no observed adverse effect level (NOAEL), it was set too high according to the bootstrap test, suggesting that possible toxicity is under-estimated.
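The standardized-effect-size idea can be sketched in a few lines. The helpers below compute an SES (Cohen's d with pooled SD) per biomarker and a percentile-bootstrap interval for the mean absolute SES across a biomarker panel; the function names and data are hypothetical illustrations, not Festing's actual procedure or code.

```python
import random

def standardized_effect_size(treated, control):
    """SES = (treated mean - control mean) / pooled SD (Cohen's d,
    no small-sample correction)."""
    n_t, n_c = len(treated), len(control)
    m_t, m_c = sum(treated) / n_t, sum(control) / n_c
    var_t = sum((x - m_t) ** 2 for x in treated) / (n_t - 1)
    var_c = sum((x - m_c) ** 2 for x in control) / (n_c - 1)
    pooled = (((n_t - 1) * var_t + (n_c - 1) * var_c) / (n_t + n_c - 2)) ** 0.5
    return (m_t - m_c) / pooled

def bootstrap_mean_abs_ses(ses_values, n_boot=5000, seed=0):
    """Percentile-bootstrap 95% interval for the mean absolute SES
    across a panel of biomarkers."""
    rng = random.Random(seed)
    stats = []
    for _ in range(n_boot):
        sample = [rng.choice(ses_values) for _ in ses_values]
        stats.append(sum(abs(v) for v in sample) / len(sample))
    stats.sort()
    return stats[int(0.025 * n_boot)], stats[int(0.975 * n_boot)]
```

Comparing such intervals across dose groups gives an overall assessment of the mean absolute response instead of a scatter of individual p-values.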
Effect of Task Presentation on Students' Performances in Introductory Statistics Courses
Tomasetto, Carlo; Matteucci, Maria Cristina; Carugati, Felice; Selleri, Patrizia
2009-01-01
Research on academic learning indicates that many students experience major difficulties with introductory statistics and methodology courses. We hypothesized that students' difficulties may depend in part on the fact that statistics tasks are commonly viewed as related to the threatening domain of math. In two field experiments which we carried…
Statistics with JMP graphs, descriptive statistics and probability
Goos, Peter
2015-01-01
Peter Goos, Department of Statistics, University of Leuven, Faculty of Bio-Science Engineering and University of Antwerp, Faculty of Applied Economics, Belgium; David Meintrup, Department of Mathematics and Statistics, University of Applied Sciences Ingolstadt, Faculty of Mechanical Engineering, Germany. Thorough presentation of introductory statistics and probability theory, with numerous examples and applications using JMP. Descriptive Statistics and Probability provides an accessible and thorough overview of the most important descriptive statistics for nominal, ordinal and quantitative data with partic…
Effects of Different Practice Conditions on Advanced Instrumentalists' Performance Accuracy.
Rosenthal, Roseanne K.; And Others
1988-01-01
Examines the relative effects of the five practice conditions of modeling, singing, silent analysis, free practice, and control on instrumentalists' performance. Finds significant differences in subjects' performance of correct rhythms, phrasing or dynamics, and tempo, and insignificant differences among performances of correct notes and…
Intestinal parasites and tuberculosis
Directory of Open Access Journals (Sweden)
Anuar Alonso Cedeño-Burbano
2017-10-01
Conclusions: The available evidence was insufficient to affirm that intestinal parasites predispose to developing tuberculosis. The studies carried out so far have found statistically insignificant results.
Directory of Open Access Journals (Sweden)
Nasir Ali
2014-04-01
Full Text Available The aim of this empirical study is to analyse the impact of corporate governance on capital structure decisions in the Saudi Arabian commercial banking sector. The components of corporate governance whose impact on capital structure has been analysed are board size, independence of directors, ownership structure, ownership of management, and board meetings. Multiple regression analysis, a correlation matrix and descriptive statistics are used to assess the relationship among corporate governance components and the capital structure of Saudi commercial banks for the years 2010 and 2011. The results show that ownership structure and board size are positively correlated, which is coherent with most of the previous studies. Managerial ownership and board independence are negatively correlated, and the number of board meetings held in a year is also negatively correlated but statistically insignificant. Moreover, the study found that on average the Saudi banks use 68% debt capital. The research study is supposed to facilitate regulatory authorities like the CMA in improving the implementation of rules and regulations in order to make corporate governance tools work more efficiently in the Kingdom of Saudi Arabia. The research study evaluates the effects of corporate governance components on capital structure decisions of Saudi commercial banks.
International Nuclear Information System (INIS)
Mäkelä, Mikko; Yoshikawa, Kunio
2016-01-01
Highlights: • Effect of treatment conditions on composition and solubility of ash. • Ash dissolution and yield governed by liquid pH and calcium carbonate solubility. • Dissolution of calcium carbonate decreases ash fusion temperature during combustion. • Decreasing the ash content of sludge can weaken ash properties for combustion. - Abstract: This second half of our work on ash behavior concentrates on the effects of hydrothermal treatment conditions on paper sludge. Ash composition and solubility were determined as functions of treatment temperature, reactor solid load and liquid pH using experimental design and univariate regression methods. In addition, ash properties for combustion were evaluated based on recent developments in ash classification. Based on the results, all experimental variables had a statistically significant effect on ash yields. Only reactor solid load was statistically insignificant for char ash content, which increased with increasing treatment temperature due to the decomposition of organic components. Ash dissolution and ash yield were governed by liquid pH and the generation of acids, mainly due to the solubility of calcium carbonate, identified as the main mineral species of paper sludge. Dissolution of calcium carbonate, however, decreased ash fusion temperatures, likely causing problems during char incineration. This indicated that decreasing the ash content of sludge during hydrothermal treatment can actually weaken ash properties for solid fuel applications.
Directory of Open Access Journals (Sweden)
Laszlo IRSAY
2010-12-01
Full Text Available Purpose: Studying the effectiveness of chondroprotective agents for patients with primary knee arthritis or primary generalized osteoarthritis, according to the American College of Rheumatology 2000 criteria. Material and Methods: comparative study; the groups consisted of 25 patients in the study group and 15 patients in the control group. The patients were evaluated with the WOMAC test, the Lequesne index, and cross-linked C-terminal (CTX) telopeptide of type I collagen on inclusion and at 6 and 12 months, and through bilateral knee radiography, using the Kellgren-Lawrence classification on inclusion and 12 months later. Patients from the study group received a chondroprotective agent orally for 12 months. Results: The WOMAC score was improved in the study group at 6 and 12 months, -4.1 (CI -6.1 to -2.1) and -5.9 (CI -8 to -3.8), compared to the control group, 1.5 (CI -0.7 to 3.7) and 2 (CI -0.2 to 4.2), with statistical significance p=0.02. There was also an amelioration of the Lequesne score in the study group at 6 and 12 months, -3.8 (CI -6.3 to -1.3) and -6.2 (CI -9.1 to -3.3), versus the control group, 1.3 (CI -1.5 to 4.1) at 6 months and 1.9 (CI -0.8 to 4.6) at 12 months, with statistical significance p=0.03. No adverse reactions were registered. Conclusions: The chondroprotective agent was effective in improving the function of patients with osteoarthritis, the studied marker cannot be used to monitor the treatment effectiveness, and the radiological modifications in the knee are statistically insignificant after 12 months of monitoring.
Is education the best contraception: the case of teenage pregnancy in England?
Girma, Sourafel; Paton, David
2015-04-01
This paper examines potential explanations for recent declines in teenage pregnancy in England. We estimate panel data models of teenage conception, birth and abortion rates from regions in England. Although point estimates are consistent with the promotion of long acting reversible contraception (LARC) having a negative impact on teenage pregnancy rates, the effects are generally small and statistically insignificant. In contrast, improvements in educational achievement and, to a lesser extent, increases in the non-white proportion of the population are associated with large and statistically significant reductions in teenage pregnancy. Copyright © 2015 Elsevier Ltd. All rights reserved.
Effects of botropase on clotting factors in healthy human volunteers
Directory of Open Access Journals (Sweden)
Ashok K Shenoy
2014-01-01
Full Text Available Objective: To evaluate the effects of botropase on various clotting factors in human volunteers. Materials and Methods: It was a prospective open-label study conducted on healthy human volunteers. After the baseline screening, subjects fulfilling the inclusion criteria were enrolled. On the study day, 1 ml of botropase was administered intravenously and, after an hour, the same dose of botropase (1 ml) was given by the intramuscular (IM) route. The efficacy and safety parameters were monitored up to 72 h from the time of intravenous (IV) administration. Results: A total of 15 volunteers, 24-35 years of age, were included in the study. Botropase significantly reduced the plasma levels of fibrinogen and fibrin degradation products 5 min after IV administration (P < 0.05). In addition, factor X was observed to decrease steadily after botropase administration, suggesting enhanced turnover between 5 and 20 min after IV administration. Although botropase reduced clotting and bleeding times in all the volunteers, the data remained statistically insignificant. Conclusion: The present study demonstrated the safety and efficacy of botropase in healthy human volunteers. The study has shown that it is a factor X activator and effectively reduces clotting and bleeding times.
Effects of boundary conditions on thermomechanical calculations: Spent fuel test - climax
International Nuclear Information System (INIS)
Butkovich, T.R.
1982-10-01
The effects of varying certain boundary conditions on the results of finite-element calculations were studied in relation to the Spent Fuel Test - Climax. The study employed a thermomechanical model with the ADINA structural analysis code. Nodal temperature histories were generated with the compatible ADINAT heat flow code. The boundary conditions studied included: (1) the effect of boundary loading on three progressively larger meshes; (2) plane strain vs plane stress conditions; (3) the effect of isothermal boundaries on a small mesh and on a significantly larger mesh. The results showed that different mesh sizes had an insignificant effect with isothermal boundaries up to 5 y, while between the smallest and largest mesh, the maximum temperature difference in the mesh was 0 °C. In the corresponding ADINA calculation, these different mesh sizes produced insignificant changes in the stress field and displacements in the region of interest near the heat sources and excavations. On the other hand, plane stress produces horizontal and vertical stress differences approx. 9% higher than does plane strain.
Statistics Anxiety and Business Statistics: The International Student
Bell, James A.
2008-01-01
Does the international student suffer from statistics anxiety? To investigate this, the Statistics Anxiety Rating Scale (STARS) was administered to sixty-six beginning statistics students, including twelve international students and fifty-four domestic students. Due to the small number of international students, nonparametric methods were used to…
Hyun, Kyung Sun; Kang, Hyun Sook; Kim, Won Ock; Park, Sunhee; Lee, Jia; Sok, Sohyune
2009-04-01
The purpose of this study was to develop a multimedia learning program on diet education for patients with diabetes mellitus (DM) using standardized patients, and to examine the effects of the program on educational skills, communication skills, DM diet knowledge and learning satisfaction. The study employed a randomized control-group posttest non-synchronized design. The participants were 108 third-year nursing students (52 experimental group, 56 control group) at K university in Seoul, Korea. The experimental group had regular lectures and the multimedia learning program for DM diet education using standardized patients, while the control group had regular lectures only. The DM educational skills were measured by trained research assistants. The students who received the multimedia learning program scored higher for DM diet educational skills, communication skills and DM diet knowledge compared to the control group. Learning satisfaction of the experimental group was higher than that of the control group, but statistically insignificant. Clinical competency was improved for students receiving the multimedia learning program for DM diet education using standardized patients, but there was no statistically significant effect on learning satisfaction. In the nursing education system there is a need to develop and apply more multimedia materials for education and to use standardized patients effectively.
Antonelli, Alessandro; Vismara Fugini, Andrea; Tardanico, Regina; Giovanessi, Luca; Zambolin, Tiziano; Simeone, Claudio
2014-01-01
To find out which factors could predict the diagnosis of insignificant prostate cancer (ins-PCa) according to a recently updated definition (overall tumor volume up to 2.5 cm(3); final Gleason score ≤6; organ-confined disease) on a prostatic biopsy specimen. This was a retrospective analysis of 210 patients undergoing radical prostatectomy for a cT1c prostate neoplasm with a biopsy specimen Gleason score of ≤6. A logistic regression model was used to assess the differences in the distribution of some possibly predictive factors between the ins-PCa patients, according to the updated definition, and the remaining patients. By applying an updated definition of ins-PCa, the prevalence of this condition increased from 13.3% to 49.5% (104 of 210 patients). The univariate analysis showed a statistically different distribution of the following factors: prostate-specific antigen density, prostate volume, number of cancer-involved cores, and maximum percentage of core involvement by cancer. At the multivariable analysis, the maximum percentage of involvement of the core retained its relevance (27.0% in ins-PCa patients and 43.8% in the remaining patients; hazard ratio, 0.972; P = .046), and a 20% cutoff was detected. In a cohort of patients with PCa cT1c and a biopsy specimen Gleason score of ≤6, the ins-PCa rate, according to the updated definition, is close to 50%, and the percentage of cancer involvement of the core is the single factor that best predicts this diagnosis. Copyright © 2014 Elsevier Inc. All rights reserved.
Directory of Open Access Journals (Sweden)
Sneha Anil Rajguru
2017-01-01
Full Text Available Background: Dentin hypersensitivity results when patent tubules are exposed to pain-inducing external stimuli. Aim: This study aims to compare the effects of two desensitizing dentifrices containing NovaMin and arginine on dentinal tubule occlusion with and without citric acid challenge in vitro using confocal laser scanning microscopy (CLSM). Materials and Methods: Forty dentin discs were randomly divided into Groups I and II containing twenty specimens each, treated with NovaMin- and arginine-containing dentifrices, respectively. Groups I and II were divided into subgroups A and B, where IA and IIA underwent CLSM analysis to determine the percentage of tubule occlusion while IB and IIB underwent 0.3% citric acid challenge and then CLSM analysis. A novel grading system was devised to categorize tubule occlusion. Results: In Group II, the percentage of occluded tubules was highest for IIA (72.25% ± 10.57%) and lowest for IIB (42.55% ± 8.65%), a statistically significant difference (P < 0.0005). In Group I, the difference between IA (49.9% ± 12.96%) and IB (43.15% ± 12.43%) was statistically insignificant (P = 0.249). The comparison between IB and IIB yielded a statistically insignificant difference (P = 0.901), whereas the difference between IA and IIA was statistically significant (P < 0.001). Under the grading system, 50% of IA samples belonged to Grade 2, 60% of IIA to Grade 3, and 70% of IB and 90% of IIB to Grade 2. Conclusion: Dentinal tubule occlusion with the arginine-containing dentifrice was significantly higher than with NovaMin. However, it could not resist citric acid challenge as effectively as NovaMin. The effects of NovaMin were more sustainable as compared to the arginine-containing dentifrice, proving it to be a better desensitizing agent.
Spreadsheets as tools for statistical computing and statistics education
Neuwirth, Erich
2000-01-01
Spreadsheets are a ubiquitous program category, and we will discuss their use in statistics and statistics education on various levels, ranging from very basic examples to extremely powerful methods. Since the spreadsheet paradigm is very familiar to many potential users, using it as the interface to statistical methods can make statistics more easily accessible.
Salmen, Saleh H; Alharbi, Sulaiman A; Faden, Asmaa A; Wainwright, M
2018-01-01
This study aimed to evaluate the impact of high-frequency electromagnetic fields (HF-EMF at 900 and 1800 MHz) on the DNA, growth rate and antibiotic susceptibility of S. aureus, S. epidermidis, and P. aeruginosa. In this study, bacteria were exposed to 900 and 1800 MHz fields for 2 h and then inoculated into new medium, where their growth rate and antibiotic susceptibility were evaluated. Analysis of bacterial DNA failed to show any difference between exposed and non-exposed S. aureus and S. epidermidis. Exposure of S. epidermidis and S. aureus to electromagnetic fields mostly produced no statistically significant decrease in bacterial growth, except for S. aureus exposed to 900 MHz at 12 h. Exposure of P. aeruginosa to electromagnetic fields at 900 MHz, however, led to a significant reduction in growth rate, while 1800 MHz had an insignificant effect. With the exception of S. aureus treated with amoxicillin (30 µg) and exposed to electromagnetic fields, radiation treatment had no significant effect on bacterial sensitivity to antibiotics.
Register-based statistics statistical methods for administrative data
Wallgren, Anders
2014-01-01
This book provides a comprehensive and up to date treatment of theory and practical implementation in Register-based statistics. It begins by defining the area, before explaining how to structure such systems, as well as detailing alternative approaches. It explains how to create statistical registers, how to implement quality assurance, and the use of IT systems for register-based statistics. Further to this, clear details are given about the practicalities of implementing such statistical methods, such as protection of privacy and the coordination and coherence of such an undertaking. Thi
Significant Statistics: Viewed with a Contextual Lens
Tait-McCutcheon, Sandi
2010-01-01
This paper examines the pedagogical and organisational changes three lead teachers made to their statistics teaching and learning programs. The lead teachers posed the research question: What would the effect of contextually integrating statistical investigations and literacies into other curriculum areas be on student achievement? By finding the…
Dai, Mingwei; Ming, Jingsi; Cai, Mingxuan; Liu, Jin; Yang, Can; Wan, Xiang; Xu, Zongben
2017-09-15
Results from genome-wide association studies (GWAS) suggest that a complex phenotype is often affected by many variants with small effects, known as 'polygenicity'. Tens of thousands of samples are often required to ensure statistical power for identifying these variants with small effects. However, it is often the case that a research group can only get approval for access to individual-level genotype data with a limited sample size (e.g. a few hundreds or thousands). Meanwhile, summary statistics generated using single-variant-based analysis are becoming publicly available. The sample sizes associated with the summary statistics datasets are usually quite large. How to make the most efficient use of existing abundant data resources largely remains an open question. In this study, we propose a statistical approach, IGESS, to increasing statistical power of identifying risk variants and improving accuracy of risk prediction by integrating individual-level genotype data and summary statistics. An efficient algorithm based on variational inference is developed to handle the genome-wide analysis. Through comprehensive simulation studies, we demonstrated the advantages of IGESS over the methods which take either individual-level data or summary statistics data as input. We applied IGESS to perform integrative analysis of Crohn's Disease from WTCCC and summary statistics from other studies. IGESS was able to significantly increase the statistical power of identifying risk variants and improve the risk prediction accuracy from 63.2% (±0.4%) to 69.4% (±0.1%) using about 240,000 variants. The IGESS software is available at https://github.com/daviddaigithub/IGESS . zbxu@xjtu.edu.cn or xwan@comp.hkbu.edu.hk or eeyang@hkbu.edu.hk. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
Software Used to Generate Cancer Statistics - SEER Cancer Statistics
Videos that highlight topics and trends in cancer statistics and definitions of statistical terms. Also software tools for analyzing and reporting cancer statistics, which are used to compile SEER's annual reports.
Statistical inference and Aristotle's Rhetoric.
Macdonald, Ranald R
2004-11-01
Formal logic operates in a closed system where all the information relevant to any conclusion is present, whereas this is not the case when one reasons about events and states of the world. Pollard and Richardson drew attention to the fact that the reasoning behind statistical tests does not lead to logically justifiable conclusions. In this paper statistical inferences are defended not by logic but by the standards of everyday reasoning. Aristotle invented formal logic, but argued that people mostly get at the truth with the aid of enthymemes--incomplete syllogisms which include arguing from examples, analogies and signs. It is proposed that statistical tests work in the same way--in that they are based on examples, invoke the analogy of a model and use the size of the effect under test as a sign that the chance hypothesis is unlikely. Of existing theories of statistical inference only a weak version of Fisher's takes this into account. Aristotle anticipated Fisher by producing an argument of the form that there were too many cases in which an outcome went in a particular direction for that direction to be plausibly attributed to chance. We can therefore conclude that Aristotle would have approved of statistical inference and there is a good reason for calling this form of statistical inference classical.
Replication unreliability in psychology: elusive phenomena or elusive statistical power?
Directory of Open Access Journals (Sweden)
Patrizio E Tressoldi
2012-07-01
Full Text Available The focus of this paper is to analyse whether the unreliability of results related to certain controversial psychological phenomena may be a consequence of their low statistical power. Under Null Hypothesis Statistical Testing (NHST), still the most widely used statistical approach, unreliability derives from the failure to refute the null hypothesis, in particular when exact or quasi-exact replications of experiments are carried out. Taking as examples the results of meta-analyses related to four different controversial phenomena, subliminal semantic priming, incubation effect for problem solving, unconscious thought theory, and non-local perception, it was found that, except for semantic priming on categorization, the statistical power to detect the expected effect size of the typical study is low or very low. The low power in most studies undermines the use of NHST to study phenomena with moderate or low effect sizes. We conclude by providing some suggestions on how to increase the statistical power or use different statistical approaches to help discriminate whether the results obtained may or may not be used to support or to refute the reality of a phenomenon with a small effect size.
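The link between low power and failed replications can be made concrete with a toy simulation. Assuming, hypothetically, a true standardized effect of d = 0.3 studied with n = 20 per group and a two-sided z-test at α = 0.05 (parameters invented for illustration, not taken from the meta-analyses), most exact replications fail to reject the null even though the effect is real:

```python
import random
from math import sqrt

def study_is_significant(rng, n, d, z_crit=1.96):
    """Simulate one two-sample study with unit-variance normal data
    and report whether a two-sided z-test rejects the null."""
    a = [rng.gauss(0.0, 1.0) for _ in range(n)]
    b = [rng.gauss(d, 1.0) for _ in range(n)]
    z = (sum(b) / n - sum(a) / n) / sqrt(2.0 / n)
    return abs(z) > z_crit

rng = random.Random(7)
n_sims = 2000
hits = sum(study_is_significant(rng, n=20, d=0.3) for _ in range(n_sims))
replication_rate = hits / n_sims  # only a small minority of "replications" succeed
```

With these assumptions the empirical power is well under 0.2, so two honest labs running the identical experiment will usually disagree on significance, which is the unreliability the paper attributes to low power rather than to elusive phenomena.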
Delay in Surgical Uptake for Cataract Services in a Pediatric ...
African Journals Online (AJOL)
tulyasys
(19%). These differences were statistically significant (P = 0.027). Regarding laterality, 27 of 34 (79%) children with bilateral cataracts were presented for surgery and 29 of 42 (69%) children with unilateral cataracts were presented for examination; however, the differences were statistically insignificant (P = 0.3) ...
Effect of tocotrienol on aortic atherosclerosis in diabetic mice
International Nuclear Information System (INIS)
Kiani, M.R.B.; Butt, S.A.; Ahmed, T.
2015-01-01
To study the histomorphological effect of tocotrienol on aortic atherosclerosis in diabetic mice fed a high-fat diet. Study Design: Lab-based randomized controlled trial. Place and Duration of Study: Army Medical College, Rawalpindi and National Institute of Health, Islamabad, from November 2009 to June 2010. Material and Methods: Forty-five female BALB/c mice were randomly divided into three groups. The diabetic mouse model was established by intraperitoneal injection of streptozotocin (STZ), 40 mg/kg body weight. Group A was given a normal laboratory diet, group B a high-fat diet, and group C tocotrienol along with a high-fat diet for 32 weeks. At the end of the experiment the mice were sacrificed. The hearts of the animals were dissected out and the ascending aortae were taken out. The specimens were fixed in 10% formol calcium and processed for paraffin embedding. Five-micrometre-thick sections were made for haematoxylin and eosin, and Verhoeff's staining. After staining, histomorphologic changes in the slides were noted. Results: In contrast to group A, atherosclerosis developed in groups B and C. Statistically significant atherosclerotic changes were found in the aortae of diabetic mice in group B when compared to group A. On comparison of group A to C, the atherosclerotic changes were statistically insignificant. However, when group B was compared with group C, the aortic atherosclerotic changes decreased significantly in group C. Conclusion: In diabetics with high fat diet intake, there is an increase in the development of atherosclerosis in the aorta, which can be reduced by tocotrienol. (author)
Understanding Statistics and Statistics Education: A Chinese Perspective
Shi, Ning-Zhong; He, Xuming; Tao, Jian
2009-01-01
In recent years, statistics education in China has made great strides. However, there still exists a fairly large gap between it and the advanced levels of statistics education in more developed countries. In this paper, we identify some existing problems in statistics education in Chinese schools and make some proposals as to how they may be overcome. We…
Statistics of ductile fracture surfaces: the effect of material parameters
DEFF Research Database (Denmark)
Ponson, Laurent; Cao, Yuanyuan; Bouchaud, Elisabeth
2013-01-01
distributed. The three dimensional analysis permits modeling of a three dimensional material microstructure and of the resulting three dimensional stress and deformation states that develop in the fracture process region. Material parameters characterizing void nucleation are varied and the statistics...... of the resulting fracture surfaces is investigated. All the fracture surfaces are found to be self-affine over a size range of about two orders of magnitude with a very similar roughness exponent of 0.56 ± 0.03. In contrast, the full statistics of the fracture surfaces is found to be more sensitive to the material...
Pestman, Wiebe R
2009-01-01
This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects of hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.
Directory of Open Access Journals (Sweden)
Cheryl L L Carling
2009-08-01
Full Text Available BACKGROUND: While different ways of presenting treatment effects can affect health care decisions, little is known about which presentations best help people make decisions consistent with their own values. We compared six summary statistics for communicating coronary heart disease (CHD) risk reduction with statins: relative risk reduction and five absolute summary measures: absolute risk reduction, number needed to treat, event rates, tablets needed to take, and natural frequencies. METHODS AND FINDINGS: We conducted a randomized trial to determine which presentation resulted in choices most consistent with participants' values. We recruited adult volunteers who participated through an interactive Web site. Participants rated the relative importance of outcomes using visual analogue scales (VAS). We then randomized participants to one of the six summary statistics and asked them to choose whether to take statins based on this information. We calculated a relative importance score (RIS) by subtracting the VAS scores for the downsides of taking statins from the VAS score for CHD. We used logistic regression to determine the association between participants' RIS and their choice. 2,978 participants completed the study. Relative risk reduction resulted in a 21% higher probability of choosing to take statins over all values of RIS compared to the absolute summary statistics. This corresponds to a number needed to treat (NNT) of 5; i.e., for every five participants shown the relative risk reduction, one additional participant chose to take statins, compared to the other summary statistics. There were no significant differences among the absolute summary statistics in the association between RIS and participants' decisions whether to take statins. Natural frequencies were best understood (86% reported they understood them well or very well), and participants were most satisfied with this information. CONCLUSIONS: Presenting the benefits of taking statins as
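The summary measures compared in this trial are all simple transformations of the two arms' event rates. A minimal sketch; the 10% and 8% event rates are hypothetical, chosen only for illustration, not taken from the trial:

```python
def risk_summaries(risk_control, risk_treated):
    """Derive common effect-summary statistics from two event rates."""
    arr = risk_control - risk_treated   # absolute risk reduction
    rrr = arr / risk_control            # relative risk reduction
    nnt = round(1 / arr)                # number needed to treat
    # Natural frequencies: events per 1,000 people in each arm.
    nat = (round(1000 * risk_control), round(1000 * risk_treated))
    return arr, rrr, nnt, nat

arr, rrr, nnt, nat = risk_summaries(0.10, 0.08)
print(round(rrr, 3))  # 0.2  -> "20% relative risk reduction"
print(nnt)            # 50   -> "treat 50 people to prevent one event"
print(nat)            # (100, 80) -> "100 vs 80 events per 1,000 people"
```

The same pair of rates yields a 20% relative figure but only a 2-percentage-point absolute one, which is exactly the framing gap the trial measures.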
Sampling, Probability Models and Statistical Reasoning: Statistical Inference
Indian Academy of Sciences (India)
Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp. 49-58. Mohan Delampady, V R Padmawar. General Article ...
International Nuclear Information System (INIS)
Satyanarayana Reddy, K.; Reddy, P.P.; Reddi, O.S.
1978-01-01
An investigation was taken up to screen the effects of 35S on the prenatal development of the mouse. Pregnant mice of the CBA strain were injected intraperitoneally with a dose of 20 μCi of 35S on day 10.5 of gestation and allowed to go to term. No mortality was observed in treated animals. However, a slight reduction in the number of fertile matings was noted in the 35S group, but the reduction was statistically insignificant. A significant decrease in litter size was noted in the 35S-treated group: while the litter size was 7.5/female in the control, it was 5.9/female in the 35S group. The reduced litter size might be due to 35S-induced prenatal mortality. A further reduction in litter size was noted at weaning. This reduction was due to a significant increase in the neo- and postnatal mortality of F1 progeny in the treated group. There was no effect of 35S on the sex ratio and body weights of F1 progeny. (auth.)
International Nuclear Information System (INIS)
2005-01-01
For the years 2004 and 2005 the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics published in the Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004). The applied energy units and conversion coefficients are shown on the back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: changes in GDP, energy consumption and electricity consumption; carbon dioxide emissions from fossil fuel use; coal consumption; consumption of natural gas; peat consumption; domestic oil deliveries; import prices of oil; consumer prices of principal oil products; fuel prices in heat production; fuel prices in electricity production; price of electricity by type of consumer; average monthly spot prices at the Nord Pool power exchange; total energy consumption by source and CO2 emissions; supplies and total consumption of electricity (GWh); energy imports by country of origin in January-June 2003; energy exports by recipient country in January-June 2003; consumer prices of liquid fuels; consumer prices of hard coal, natural gas and indigenous fuels; price of natural gas by type of consumer; price of electricity by type of consumer; price of district heating by type of consumer; excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and energy taxes, precautionary stock fees and oil pollution fees
Whole Frog Project and Virtual Frog Dissection Statistics
The thresholds for statistical and clinical significance
DEFF Research Database (Denmark)
Jakobsen, Janus Christian; Gluud, Christian; Winkel, Per
2014-01-01
BACKGROUND: Thresholds for statistical significance are insufficiently demonstrated by 95% confidence intervals or P-values when assessing results from randomised clinical trials. First, a P-value only shows the probability of getting a result assuming that the null hypothesis is true and does...... not reflect the probability of getting a result assuming an alternative hypothesis to the null hypothesis is true. Second, a confidence interval or a P-value showing significance may be caused by multiplicity. Third, statistical significance does not necessarily result in clinical significance. Therefore...... of the probability that a given trial result is compatible with a 'null' effect (corresponding to the P-value) divided by the probability that the trial result is compatible with the intervention effect hypothesised in the sample size calculation; (3) adjust the confidence intervals and the statistical significance...
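Point (2) above, the probability that the trial result is compatible with a 'null' effect divided by the probability that it is compatible with the hypothesised intervention effect, is a likelihood-ratio diagnostic. A minimal sketch under a normal approximation; the effect estimate, its standard error, and the hypothesised effect below are placeholder values, not figures from the paper:

```python
import math

def normal_pdf(x, mean, sd):
    """Density of a normal distribution at x."""
    return math.exp(-((x - mean) ** 2) / (2 * sd ** 2)) / (sd * math.sqrt(2 * math.pi))

def null_to_alternative_ratio(estimate, se, hypothesised):
    """Likelihood of the observed effect estimate under a zero ('null') effect,
    divided by its likelihood under the hypothesised intervention effect."""
    return normal_pdf(estimate, 0.0, se) / normal_pdf(estimate, hypothesised, se)

# Observed effect equal to the effect the trial was powered for,
# two standard errors away from zero (placeholder numbers).
ratio = null_to_alternative_ratio(estimate=2.0, se=1.0, hypothesised=2.0)
print(round(ratio, 3))  # 0.135: the data favour the hypothesised effect over the null
```

A ratio well below 1 supports the intervention effect; a ratio near or above 1 warns that a "significant" P-value may not mean what it appears to.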
The protective effect of DNA on the rat cell membrane damage induced by ultraviolet radiation
International Nuclear Information System (INIS)
Ma Shouxiang; Zhong Jinyan
1988-01-01
The protective effect of DNA on the cell membrane damage induced by ultraviolet radiation was studied. Rat erythrocytes were used as the experimental material. Blood samples were taken from the rat and centrifuged to separate the plasma. The cells were washed twice with isotonic saline, resuspended in normal saline solution and then irradiated with ultraviolet radiation. The DNA was added before or after irradiation. The cell suspensions were kept at 5 deg C for 20 hours after irradiation, and then centrifuged. The supernatants were used for hemoglobin determination. The main results obtained may be summarized as follows: the erythrocyte suspensions were irradiated for 5, 10 and 20 min, and the amount of hemolysis was directly proportional to the irradiation dose. If DNA (20-40 μg/ml) was applied before irradiation, the amount of hemolysis induced decreased markedly. The differences between the control and the DNA-treated samples were statistically significant (P<0.01), but insignificant when DNA was added after irradiation
Transportation statistics annual report, 2015
2016-01-01
The Transportation Statistics Annual Report describes the Nation's transportation system, the system's performance, its contributions to the economy, and its effects on people and the environment. This 20th edition of the report is base...
Transportation statistics annual report, 2013
2014-01-01
The Transportation Statistics Annual Report describes the Nation's transportation system, the system's performance, its contributions to the economy, and its effects on people and the environment. This 18th edition of the report is base...
READING STATISTICS AND RESEARCH
Directory of Open Access Journals (Sweden)
Reviewed by Yavuz Akbulut
2008-10-01
Full Text Available The book demonstrates the best and most conservative ways to decipher and critique research reports, particularly for social science researchers. In addition, new editions of the book are always better organized, effectively structured and meticulously updated in line with developments in the field of research statistics. Even the most trivial issues are revisited and updated in new editions. For instance, purchasers of previous editions might check the interpretation of skewness and kurtosis indices in the third edition (p. 34) and in the fifth edition (p. 29) to see how the author revisits every single detail. Theory and practice always go hand in hand in all editions of the book. Re-reading previous editions (e.g. the third edition) before reading the fifth edition gives the impression that the author never stops ameliorating his instructional text writing methods. In brief, “Reading Statistics and Research” is among the best sources showing research consumers how to understand and critically assess the statistical information and research results contained in technical research reports. In this respect, the review written by Mirko Savić in Panoeconomicus (2008, 2, pp. 249-252) will help readers get a more detailed overview of each chapter. I cordially urge beginning researchers to pick up a highlighter and conduct a detailed reading of the book. A thorough reading of the source will make researchers quite selective in appreciating the harmony between the data analysis, results and discussion sections of typical journal articles. If interested, beginning researchers might begin with this book to grasp the basics of research statistics, and prop up their critical research reading skills with some statistics package applications through the help of Dr. Andy Field’s book, Discovering Statistics Using SPSS (second edition, published by Sage in 2005).
Statistical benchmark for BosonSampling
International Nuclear Information System (INIS)
Walschaers, Mattia; Mayer, Klaus; Buchleitner, Andreas; Kuipers, Jack; Urbina, Juan-Diego; Richter, Klaus; Tichy, Malte Christopher
2016-01-01
Boson samplers—set-ups that generate complex many-particle output states through the transmission of elementary many-particle input states across a multitude of mutually coupled modes—promise the efficient quantum simulation of a classically intractable computational task, and challenge the extended Church–Turing thesis, one of the fundamental dogmas of computer science. However, as in all experimental quantum simulations of truly complex systems, one crucial problem remains: how to certify that a given experimental measurement record unambiguously results from enforcing the claimed dynamics, on bosons, fermions or distinguishable particles? Here we offer a statistical solution to the certification problem, identifying an unambiguous statistical signature of many-body quantum interference upon transmission across a multimode, random scattering device. We show that statistical analysis of only partial information on the output state allows one to characterise the imparted dynamics through particle type-specific features of the emerging interference patterns. The relevant statistical quantifiers are classically computable, define a falsifiable benchmark for BosonSampling, and reveal distinctive features of many-particle quantum dynamics, which go much beyond mere bunching or anti-bunching effects. (fast track communication)
The effect of sauna bathing on lipid profile in young, physically active, male subjects.
Gryka, Dorota; Pilch, Wanda; Szarek, Marta; Szygula, Zbigniew; Tota, Łukasz
2014-08-01
The aim of the study was to evaluate the effects of Finnish sauna bathing on the lipid profile of healthy young men. Sixteen male subjects (20-23 years) were subjected to 10 sauna bathing sessions in a Finnish sauna every 1 or 2 days. The mean sauna temperature was 90±2°C, while humidity was 5-16%. Each session consisted of three 15-minute parts and a 2-minute cool-down between them. The following measurements were taken before and after the sauna sessions: body mass, heart rate and body skinfold thickness. The percentage fat content and then the lean body mass were calculated. Total cholesterol, triacylglycerols, and LDL and HDL lipoprotein cholesterol were measured in blood samples. A statistically significant decrease of total cholesterol and LDL cholesterol was observed during 3 weeks of sauna treatment and in the week afterwards. A significant decline in triacylglycerols was found directly after the 1st session and 24 h after the 10th sauna session. After the 10th sauna session the level of HDL cholesterol remained slightly increased, but this change was not statistically significant. A decrease in blood plasma volume was found directly after the 1st and the last sauna bathing session, due to perspiration. An adaptive increase in blood plasma volume was also found after the series of 10 sauna sessions. Ten complete sauna bathing sessions in a Finnish sauna caused a reduction in total cholesterol and LDL cholesterol fraction levels during the sessions and a gradual return of these levels to the initial level during the 1st and the 2nd week after the experiment. A small, statistically insignificant increase in HDL-C level and a transient decline in triacylglycerols were observed after those sauna sessions. The positive effect of sauna on the lipid profile is similar to the effect that can be obtained through moderate-intensity physical exercise.
Improving statistical reasoning theoretical models and practical implications
Sedlmeier, Peter
1999-01-01
This book focuses on how statistical reasoning works and on training programs that can exploit people's natural cognitive capabilities to improve their statistical reasoning. Training programs that take into account findings from evolutionary psychology and instructional theory are shown to have substantially larger effects, which are more stable over time, than previous training regimens. The theoretical implications are traced in a neural network model of human performance on statistical reasoning problems. This book appeals to judgment and decision making researchers and other cognitive scientists, as well as to teachers of statistics and probabilistic reasoning.
Simple statistical methods for software engineering data and patterns
Pandian, C Ravindranath
2015-01-01
Although there are countless books on statistics, few are dedicated to the application of statistical methods to software engineering. Simple Statistical Methods for Software Engineering: Data and Patterns fills that void. Instead of delving into overly complex statistics, the book details simpler solutions that are just as effective and connect with the intuition of problem solvers. Sharing valuable insights into software engineering problems and solutions, the book not only explains the required statistical methods, but also provides many examples, review questions, and case studies that prov
Effect of the Target Motion Sampling Temperature Treatment Method on the Statistics and Performance
Viitanen, Tuomas; Leppänen, Jaakko
2014-06-01
Target Motion Sampling (TMS) is a stochastic on-the-fly temperature treatment technique that is being developed as a part of the Monte Carlo reactor physics code Serpent. The method provides for modeling of arbitrary temperatures in continuous-energy Monte Carlo tracking routines with only one set of cross sections stored in the computer memory. Previously, only the performance of the TMS method in terms of CPU time per transported neutron has been discussed. Since the effective cross sections are not calculated at any point of a transport simulation with TMS, reaction rate estimators must be scored using sampled cross sections, which is expected to increase the variances and, consequently, to decrease the figures-of-merit. This paper examines the effects of TMS on the statistics and performance in practical calculations involving reaction rate estimation with collision estimators. Against all expectations, it turned out that the usage of sampled response values has no practical effect on the performance of reaction rate estimators when using TMS with elevated basis cross section temperatures (EBT), i.e. the usual way. With 0 Kelvin cross sections a significant increase in the variances of capture rate estimators was observed right below the energy region of unresolved resonances, but at these energies the figures-of-merit could be increased using a simple resampling technique to decrease the variances of the responses. It was, however, noticed that the usage of the TMS method increases the statistical deviations of all estimators, including the flux estimator, by tens of percent in the vicinity of very strong resonances. This effect is actually not related to the usage of sampled responses, but is instead an inherent property of the TMS tracking method and concerns both EBT and 0 K calculations.
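The resampling fix described above, scoring the average of several sampled responses instead of a single draw, can be illustrated with a toy Monte Carlo tally. This is a sketch of the variance-reduction idea only, not Serpent code; the response distribution is invented:

```python
import random
import statistics

def estimate(k, trials, rng):
    """Score a reaction-rate-like tally where each collision contributes
    a noisy sampled response; averaging k resamples per collision
    shrinks the response-noise component of the variance by a factor of k."""
    scores = []
    for _ in range(trials):
        # True response 1.0 plus sampling noise from the stochastic draw.
        resamples = [1.0 + rng.gauss(0.0, 0.5) for _ in range(k)]
        scores.append(sum(resamples) / k)
    return scores

rng = random.Random(42)
var_single = statistics.variance(estimate(1, 2000, rng))
var_avg10 = statistics.variance(estimate(10, 2000, rng))
print(var_avg10 < var_single)  # True: resampling cuts the estimator variance
```

In the real code the track-length or collision noise remains, so the gain saturates once the response noise no longer dominates.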
Statistical Inference at Work: Statistical Process Control as an Example
Bakker, Arthur; Kent, Phillip; Derry, Jan; Noss, Richard; Hoyles, Celia
2008-01-01
To characterise statistical inference in the workplace this paper compares a prototypical type of statistical inference at work, statistical process control (SPC), with a type of statistical inference that is better known in educational settings, hypothesis testing. Although there are some similarities between the reasoning structure involved in…
International Nuclear Information System (INIS)
Kim, Yochan; Park, Jinkyun; Jung, Wondea; Jang, Inseok; Hyun Seong, Poong
2015-01-01
Despite recent efforts toward data collection for supporting human reliability analysis, there remains a lack of empirical basis for determining the effects of performance shaping factors (PSFs) on human error probabilities (HEPs). To enhance the empirical basis regarding the effects of the PSFs, a statistical methodology using logistic regression and stepwise variable selection was proposed, and the effects of the PSFs on HEPs related to soft controls were estimated through the methodology. For this estimation, more than 600 human error opportunities related to soft controls in a computerized control room were obtained through laboratory experiments. From the eight PSF surrogates and combinations of these variables, procedure quality, practice level, and operation type were identified as significant factors for screen switch and mode conversion errors. The contributions of these significant factors to HEPs were also estimated in terms of a multiplicative form. The usefulness and limitations of the experimental data and the techniques employed are discussed herein, and we believe that the logistic regression and stepwise variable selection methods will provide a way to estimate the effects of PSFs on HEPs in an objective manner. - Highlights: • It is necessary to develop an empirical basis for the effects of the PSFs on the HEPs. • A statistical method using a logistic regression and variable selection was proposed. • The effects of PSFs on the HEPs of soft controls were empirically investigated. • The significant factors were identified and their effects were estimated
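The "multiplicative form" mentioned above follows directly from the logistic model: on the odds scale, each PSF coefficient contributes a factor of exp(beta). A sketch with made-up coefficients, not the paper's fitted values:

```python
import math

def error_odds(intercept, coeffs, psf_values):
    """Odds of a human error under a logistic model; each PSF
    contributes a multiplicative factor exp(beta * x) to the odds."""
    logit = intercept + sum(b * x for b, x in zip(coeffs, psf_values))
    return math.exp(logit)

# Hypothetical coefficients for procedure quality, practice level, operation type.
intercept, coeffs = -4.0, [0.8, 0.5, 1.1]
odds = error_odds(intercept, coeffs, [1, 1, 0])
# The same odds, written as a product of per-PSF multipliers:
product = math.exp(intercept) * math.exp(0.8) * math.exp(0.5) * math.exp(1.1 * 0)
assert abs(odds - product) < 1e-12
hep = odds / (1 + odds)  # back to a probability (the HEP)
print(round(hep, 4))
```

When the HEP is small, odds and probability nearly coincide, so exp(beta) can be read directly as the factor by which a degraded PSF inflates the error probability.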
Role of Statistical Random-Effects Linear Models in Personalized Medicine.
Diaz, Francisco J; Yeh, Hung-Wen; de Leon, Jose
2012-03-01
Some empirical studies and recent developments in pharmacokinetic theory suggest that statistical random-effects linear models are valuable tools that allow describing simultaneously patient populations as a whole and patients as individuals. This remarkable characteristic indicates that these models may be useful in the development of personalized medicine, which aims at finding treatment regimes that are appropriate for particular patients, not just appropriate for the average patient. In fact, published developments show that random-effects linear models may provide a solid theoretical framework for drug dosage individualization in chronic diseases. In particular, individualized dosages computed with these models by means of an empirical Bayesian approach may produce better results than dosages computed with some methods routinely used in therapeutic drug monitoring. This is further supported by published empirical and theoretical findings that show that random effects linear models may provide accurate representations of phase III and IV steady-state pharmacokinetic data, and may be useful for dosage computations. These models have applications in the design of clinical algorithms for drug dosage individualization in chronic diseases; in the computation of dose correction factors; computation of the minimum number of blood samples from a patient that are necessary for calculating an optimal individualized drug dosage in therapeutic drug monitoring; measure of the clinical importance of clinical, demographic, environmental or genetic covariates; study of drug-drug interactions in clinical settings; the implementation of computational tools for web-site-based evidence farming; design of pharmacogenomic studies; and in the development of a pharmacological theory of dosage individualization.
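The empirical Bayesian individualization the authors describe can be caricatured by the classic shrinkage estimate from a random-effects model: a patient's predicted parameter is a precision-weighted blend of the population mean and that patient's own measurements. A minimal sketch with invented variance components and observations; the clearance figures are hypothetical:

```python
def empirical_bayes_estimate(pop_mean, var_between, var_within, patient_obs):
    """Posterior mean of a patient's parameter in a random-effects model:
    shrink the patient's sample mean toward the population mean by the
    reliability weight var_between / (var_between + var_within / n)."""
    n = len(patient_obs)
    patient_mean = sum(patient_obs) / n
    weight = var_between / (var_between + var_within / n)  # in [0, 1)
    return pop_mean + weight * (patient_mean - pop_mean)

# Population clearance 10 L/h; between- and within-patient variances are invented.
est = empirical_bayes_estimate(10.0, 4.0, 8.0, patient_obs=[14.0, 12.0])
print(est)  # 11.5: pulled from the patient mean (13.0) toward the population mean (10.0)
```

As more blood samples accumulate, the weight approaches 1 and the estimate trusts the patient's own data, which is also how such models can determine the minimum number of samples needed for dosage individualization.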
A Statistical Primer: Understanding Descriptive and Inferential Statistics
Gillian Byrne
2007-01-01
As libraries and librarians move more towards evidence‐based decision making, the data being generated in libraries is growing. Understanding the basics of statistical analysis is crucial for evidence‐based practice (EBP), in order to correctly design and analyze research as well as to evaluate the research of others. This article covers the fundamentals of descriptive and inferential statistics, from hypothesis construction to sampling to common statistical techniques including chi‐square, co...
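Of the techniques the article lists, chi-square is the easiest to show end to end. A minimal sketch for a 2x2 contingency table of hypothetical counts (say, users versus non-users of a library service by outcome):

```python
def chi_square(table):
    """Pearson chi-square statistic for a contingency table,
    given as a list of rows of observed counts."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / total
            stat += (observed - expected) ** 2 / expected
    return stat

print(round(chi_square([[10, 20], [20, 10]]), 3))  # 6.667
```

The statistic is then compared with a chi-square distribution on (rows-1)(cols-1) degrees of freedom to obtain the P-value.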
Solution of the statistical bootstrap with Bose statistics
International Nuclear Information System (INIS)
Engels, J.; Fabricius, K.; Schilling, K.
1977-01-01
A brief and transparent way to introduce Bose statistics into the statistical bootstrap of Hagedorn and Frautschi is presented. The resulting bootstrap equation is solved by a cluster expansion for the grand canonical partition function. The shift of the ultimate temperature due to Bose statistics is determined through an iteration process. We discuss two-particle spectra of the decaying fireball (with given mass) as obtained from its grand microcanonical level density
Crowther, Lachlan; Shen, Gang; Almuzian, Mohammed; Jones, Allan; Walsh, William; Oliver, Rema; Petocz, Peter; Tarraf, Nour E; Darendeliler, M Ali
2017-10-01
To assess the potential effects of casein phosphopeptides (CPPs) on orthodontically induced iatrogenic root resorption (OIIRR) and orthodontic tooth movement. Forty Wistar rats (aged 11 weeks) were randomly divided into an experimental group (EG; n = 20) that received a diet supplemented with CPP and a control group (CG; n = 20) devoid of the diet supplement. A 150 g force was applied using a nickel-titanium (NiTi) coil that was bonded on the maxillary incisors and extended unilaterally to a maxillary first molar. At Day 28, animals in both groups were euthanized. Volumetric assessment of root resorption craters and linear measurement of maxillary first molar movement were blindly examined using a micro-computed tomography scan. Nine rats were excluded from the experiment due to loss during general anesthesia or appliance failure. Intra-operator reproducibility was high in both volumetric and linear measurements, 92.8 per cent and 98.5-97.6 per cent, respectively. The results reveal that dietary CPP has a statistically insignificant effect on the overall OIIRR and orthodontic movement. CPP seems to have a statistically insignificant effect on the volume of OIIRR and orthodontic movement in rats. A long-term study with a larger sample size using a different concentration of CPP is required to clarify the dentoalveolar effect of CPP. © The Author 2017. Published by Oxford University Press on behalf of the European Orthodontic Society. All rights reserved. For permissions, please email: journals.permissions@oup.com
Insignificant acute toxicity of TiO2 nanoparticles to willow trees
DEFF Research Database (Denmark)
Seeger, Eva Mareike; Baun, Anders; Kästner, M.
2009-01-01
Manufactured nanoparticles (MNP) are expected to increase in production in near future. In response, their environmental fate and effects are intensively studied. Phytotoxicity of some types of nanoparticles has been observed for annual species in the seed germination and root elongation test. Ye...
Szulc, Stefan
1965-01-01
Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field. Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then
Goodman, Joseph W
2015-01-01
This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems. This book covers a variety of statistical problems in optics, including both theory and applications. The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced i
Transportation Statistics Annual Report, 2017
2018-01-01
The Transportation Statistics Annual Report describes the Nation's transportation system, the system's performance, its contributions to the economy, and its effects on people and the environment. This 22nd edition of the report is based on infor...
Dinov, Ivo D; Sanchez, Juana; Christou, Nicolas
2008-01-01
Technology-based instruction represents a recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools of communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present the results on the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We have used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment per individual
All of statistics a concise course in statistical inference
Wasserman, Larry
2004-01-01
This book is for people who want to learn probability and statistics quickly. It brings together many of the main ideas in modern statistics in one place. The book is suitable for students and researchers in statistics, computer science, data mining and machine learning. This book covers a much wider range of topics than a typical introductory text on mathematical statistics. It includes modern topics like nonparametric curve estimation, bootstrapping and classification, topics that are usually relegated to follow-up courses. The reader is assumed to know calculus and a little linear algebra. No previous knowledge of probability and statistics is required. The text can be used at the advanced undergraduate and graduate level. Larry Wasserman is Professor of Statistics at Carnegie Mellon University. He is also a member of the Center for Automated Learning and Discovery in the School of Computer Science. His research areas include nonparametric inference, asymptotic theory, causality, and applications to astrophysics, bi...
... Standards Act and Program MQSA Insights MQSA National Statistics ... but should level off with time. Archived Scorecard Statistics: 2018, 2017, 2016 ...
A perceptual space of local image statistics.
Victor, Jonathan D; Thengone, Daniel J; Rizvi, Syed M; Conte, Mary M
2015-12-01
Local image statistics are important for visual analysis of textures, surfaces, and form. There are many kinds of local statistics, including those that capture luminance distributions, spatial contrast, oriented segments, and corners. While sensitivity to each of these kinds of statistics has been well-studied, much less is known about visual processing when multiple kinds of statistics are relevant, in large part because the dimensionality of the problem is high and different kinds of statistics interact. To approach this problem, we focused on binary images on a square lattice, a reduced set of stimuli which nevertheless taps many kinds of local statistics. In this 10-parameter space, we determined psychophysical thresholds to each kind of statistic (16 observers) and all of their pairwise combinations (4 observers). Sensitivities and isodiscrimination contours were consistent across observers. Isodiscrimination contours were elliptical, implying a quadratic interaction rule, which in turn determined ellipsoidal isodiscrimination surfaces in the full 10-dimensional space, and made predictions for sensitivities to complex combinations of statistics. These predictions, including the prediction of a combination of statistics that was metameric to random, were verified experimentally. Finally, check size had only a mild effect on sensitivities over the range from 2.8 to 14 min, but sensitivities to second- and higher-order statistics were substantially lower at 1.4 min. In sum, local image statistics form a perceptual space that is highly stereotyped across observers, in which different kinds of statistics interact according to simple rules. Copyright © 2015 Elsevier Ltd. All rights reserved.
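The quadratic interaction rule described above has a simple computational reading: if discriminability of a stimulus c is a quadratic form cᵀQc in the statistic coordinates, then the threshold along any unit direction v is 1/sqrt(vᵀQv), and the isodiscrimination contour is an ellipse whose principal axes follow the eigenvectors of Q. A minimal numpy sketch with an invented 2-D matrix (not the paper's fitted model):

```python
import numpy as np

# Invented positive-definite quadratic model for two statistic coordinates.
Q = np.array([[4.0, 1.0],
              [1.0, 2.0]])

def threshold(v):
    """Discrimination threshold along direction v under d(c)^2 = c^T Q c."""
    v = np.asarray(v, dtype=float)
    v = v / np.linalg.norm(v)
    return 1.0 / np.sqrt(v @ Q @ v)

# Along the eigenvectors of Q, thresholds equal 1/sqrt(eigenvalue):
evals, evecs = np.linalg.eigh(Q)
for lam, vec in zip(evals, evecs.T):
    print(threshold(vec), 1.0 / np.sqrt(lam))
```

Predicting sensitivity to an arbitrary mixture of statistics is then a single evaluation of `threshold`, which is the sense in which the quadratic rule extrapolates from pairwise measurements to the full space.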
Soil-structure interaction effects on containment fragilities and floor response spectra statistics
International Nuclear Information System (INIS)
Pires, J.; Reich, M.; Chokshi, N.C.
1987-01-01
The probability-based method for the reliability evaluation of nuclear structures developed at Brookhaven National Laboratory (BNL) is extended to include soil-structure interaction effects. A reinforced concrete containment is analyzed in order to investigate the soil-structure interaction effects on structural fragilities, floor response spectra statistics, and acceleration response correlations. To include the effect of soil flexibility in the reliability assessment, the following two-step approach is used. In the first step, the lumped parameter method for soil-structure interaction analysis is used together with a stick model representation of the structure in order to obtain the motions of the foundation plate. These motions, which include both translations and rotations of the foundation plate, are expressed in terms of the power-spectral density of the free-field ground excitation and the transfer function of the total acceleration response of the foundation. The second step involves a detailed finite element model of the structure subjected to the interaction motions computed from step one. Making use of the structural model and interaction motions, the reliability analysis method yields the limit state probabilities and fragility data for the structure
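The first step above rests on the standard random-vibration identity S_out(ω) = |H(ω)|² S_in(ω), relating the response power-spectral density of a linear system to the excitation PSD through the transfer function. As a self-contained illustration (a generic single-degree-of-freedom oscillator under white noise, not BNL's containment model), the numerically integrated response variance can be checked against the classical closed form πS0/(2ζωn³):

```python
import numpy as np
from scipy.integrate import quad

# SDOF oscillator: wn = natural frequency (rad/s), zeta = damping ratio,
# S0 = two-sided PSD of the white-noise excitation (all values assumed).
wn, zeta, S0 = 2 * np.pi, 0.05, 1.0

def h2(w):
    """|H(w)|^2 for the SDOF oscillator x'' + 2*zeta*wn*x' + wn^2 x = f."""
    return 1.0 / ((wn**2 - w**2) ** 2 + (2 * zeta * wn * w) ** 2)

# Response variance = integral of S_out(w) = |H(w)|^2 * S0 over all w.
# Split the integral and flag the resonance so quad resolves the peak.
near, _ = quad(lambda w: S0 * h2(w), 0, 10 * wn, points=[wn])
tail, _ = quad(lambda w: S0 * h2(w), 10 * wn, np.inf)
variance = 2 * (near + tail)  # both signs of w

exact = np.pi * S0 / (2 * zeta * wn**3)
print(variance, exact)
```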
Generalized quantum statistics
International Nuclear Information System (INIS)
Chou, C.
1992-01-01
In the paper, a non-anyonic generalization of quantum statistics is presented, in which Fermi-Dirac statistics (FDS) and Bose-Einstein statistics (BES) appear as two special cases. The new quantum statistics, which is characterized by the dimension of its single particle Fock space, contains three consistent parts, namely the generalized bilinear quantization, the generalized quantum mechanical description and the corresponding statistical mechanics
Petersson, K M; Nichols, T E; Poline, J B; Holmes, A P
1999-01-01
Functional neuroimaging (FNI) provides experimental access to the intact living brain, making it possible to study higher cognitive functions in humans. In this review and in a companion paper in this issue, we discuss some common methods used to analyse FNI data. The emphasis in both papers is on the assumptions and limitations of the methods reviewed. There are several methods available to analyse FNI data, indicating that none is optimal for all purposes. In order to make optimal use of the methods available, it is important to know the limits of applicability. For the interpretation of FNI results it is also important to take into account the assumptions, approximations and inherent limitations of the methods used. This paper gives a brief overview of some non-inferential descriptive methods and common statistical models used in FNI. Issues relating to the complex problem of model selection are discussed. In general, proper model selection is a necessary prerequisite for the validity of the subsequent statistical inference. The non-inferential section describes methods that, combined with inspection of parameter estimates and other simple measures, can aid in the process of model selection and verification of assumptions. The section on statistical models covers approaches to global normalization and some aspects of univariate, multivariate, and Bayesian models. Finally, approaches to functional connectivity and effective connectivity are discussed. In the companion paper we review issues related to signal detection and statistical inference. PMID:10466149
Vindras, Philippe; Desmurget, Michel; Baraduc, Pierre
2012-01-01
In science, it is a common experience to discover that although the investigated effect is very clear in some individuals, statistical tests are not significant because the effect is null or even opposite in other individuals. Indeed, t-tests, ANOVAs and linear regressions compare the average effect with respect to its inter-individual variability, so that they can fail to evidence a factor that has a high effect in many individuals (with respect to the intra-individual variability). In such paradoxical situations, statistical tools are at odds with the researcher's aim to uncover any factor that affects individual behavior, and not only those with stereotypical effects. In order to go beyond the reductive and sometimes illusory description of the average behavior, we propose a simple statistical method: applying a Kolmogorov-Smirnov test to assess whether the distribution of p-values provided by individual tests is significantly biased towards zero. Using Monte Carlo studies, we assess the power of this two-step procedure with respect to RM ANOVA and multilevel mixed-effect analyses, and probe its robustness when individual data violate the assumptions of normality and homoscedasticity. We find that the method is powerful and robust even with small sample sizes, for which multilevel methods reach their limits. In contrast to existing methods for combining p-values, the Kolmogorov-Smirnov test has unique resistance to outlier individuals: it cannot yield significance based on a high effect in one or two exceptional individuals, which allows drawing valid population inferences. The simplicity and ease of use of our method facilitates the identification of factors that would otherwise be overlooked because they affect individual behavior in significant but variable ways, and its power and reliability with small sample sizes (<30-50 individuals) suggest it as a tool of choice in exploratory studies.
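The proposed two-step procedure is easy to sketch. Under the global null hypothesis, individual p-values are Uniform(0, 1), so a one-sided Kolmogorov-Smirnov test against the uniform distribution detects a distribution shifted toward zero. A hypothetical simulation (invented effect size and sample sizes, using scipy):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical data: 20 individuals, 15 trials each, with a shared true
# effect of +0.8 (in SD units). These numbers are illustrative only.
n_subjects, n_trials = 20, 15
pvals = []
for _ in range(n_subjects):
    trials = rng.normal(loc=0.8, scale=1.0, size=n_trials)
    # Step 1: one test per individual (here a one-sample t-test vs 0).
    pvals.append(stats.ttest_1samp(trials, 0.0).pvalue)

# Step 2: under the global null, the p-values are Uniform(0, 1); the
# one-sided KS test asks whether their distribution is biased toward zero
# (empirical CDF above the uniform CDF).
ks = stats.kstest(pvals, "uniform", alternative="greater")
print(f"KS statistic = {ks.statistic:.3f}, p = {ks.pvalue:.2g}")
```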
Rumsey, Deborah
2011-01-01
The fun and easy way to get down to business with statistics Stymied by statistics? No fear: this friendly guide offers clear, practical explanations of statistical ideas, techniques, formulas, and calculations, with lots of examples that show you how these concepts apply to your everyday life. Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tracks to a typical first semester statistics cou
On the statistical properties of photons
International Nuclear Information System (INIS)
Cini, M.
1990-01-01
The interpretation in terms of a transition from Maxwell-Boltzmann to Bose-Einstein statistics of the effect in quantum optics of degenerate light discovered by De Martini and Di Fonzo is discussed. It is shown that the results of the experiment can be explained by using only the quantum-mechanical rule that the states of an assembly of bosons should be completely symmetrical, without mentioning in any way their statistical properties. This means that photons are indeed identical particles
Multinational Firms and Business Cycle Transmission
DEFF Research Database (Denmark)
Menno, Dominik Francesco
This paper studies the effect of foreign direct investment (FDI) on the transmission of international business cycles. I document for the G7 countries between 1991 and 2006 that increases in bilateral FDI linkages are associated with more synchronized investment cycles. I also find ... that the relation between FDI integration and synchronization of gross domestic product (GDP) is, albeit positive, statistically insignificant after controlling for time fixed effects. I then study a model of international business cycles with an essential role for FDI and shocks to multinational activity ...
Statistical properties of superimposed stationary spike trains.
Deger, Moritz; Helias, Moritz; Boucsein, Clemens; Rotter, Stefan
2012-06-01
The Poisson process is an often employed model for the activity of neuronal populations. It is known, though, that superpositions of realistic, non-Poisson spike trains are not in general Poisson processes, not even for large numbers of superimposed processes. Here we construct superimposed spike trains from intracellular in vivo recordings from rat neocortex neurons and compare their statistics to specific point process models. The constructed superimposed spike trains reveal strong deviations from the Poisson model. We find that superpositions of model spike trains that take the effective refractoriness of the neurons into account yield a much better description. A minimal model of this kind is the Poisson process with dead-time (PPD). For this process, and for superpositions thereof, we obtain analytical expressions for some second-order statistical quantities, such as the count variability, inter-spike interval (ISI) variability and ISI correlations, and demonstrate the match with the in vivo data. We conclude that effective refractoriness is the key property that shapes the statistical properties of the superposition spike trains. We present new, efficient algorithms to generate superpositions of PPDs and of gamma processes that can be used to provide more realistic background input in simulations of networks of spiking neurons. Using these generators, we show in simulations that neurons which receive superimposed spike trains as input are highly sensitive to the statistical effects induced by neuronal refractoriness.
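A PPD inter-spike interval is simply a fixed dead time plus an exponential waiting time, which makes its ISI coefficient of variation 1 - rate*dead_time, i.e. strictly below the Poisson value of 1. A minimal generator along these lines (parameter values invented for illustration; this is not the paper's efficient superposition algorithm):

```python
import numpy as np

def ppd_isis(rate, dead_time, n, rng):
    """Draw n inter-spike intervals of a Poisson process with dead-time.

    Each ISI is the dead time plus an exponential interval whose rate is
    chosen so the process attains the requested mean firing rate.
    """
    lam = rate / (1.0 - rate * dead_time)  # rate of the exponential part
    return dead_time + rng.exponential(1.0 / lam, size=n)

rng = np.random.default_rng(1)
# Assumed example parameters: 30 Hz mean rate, 3 ms dead time.
isis = ppd_isis(rate=30.0, dead_time=0.003, n=200_000, rng=rng)

cv = isis.std() / isis.mean()
print(f"mean rate ~ {1.0 / isis.mean():.1f} Hz, ISI CV ~ {cv:.3f}")
```

With these parameters the CV comes out near 1 - 30*0.003 = 0.91, illustrating the sub-Poisson regularity that refractoriness induces.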
Statistical health-effects study
International Nuclear Information System (INIS)
Gilbert, E.S.
1982-01-01
The main purpose of this program is to analyze the mortality of Hanford workers and to determine the effects of radiation exposure in this population. A secondary purpose is to improve methodology for assessing health effects of chronic low-level exposure to harmful agents or substances, particularly in an occupational setting. In the past year we have updated our analyses, submitted papers for publication in the two areas of methodological research, and have interacted with Hanford Environmental Health Foundation staff to improve data collection procedures
Securing wide appreciation of health statistics.
do Amaral Pyrrait, A. M.; Aubenque, M. J.; Benjamin, B.; de Groot, M. J.; Kohn, R
1954-01-01
All the authors are agreed on the need for a certain publicizing of health statistics, but do Amaral Pyrrait points out that the medical profession prefers to convince itself rather than to be convinced. While there is great utility in articles and reviews in the professional press (especially for paramedical personnel) Aubenque, de Groot, and Kohn show how appreciation can effectively be secured by making statistics more easily understandable to the non-expert by, for instance, including readable commentaries in official publications, simplifying charts and tables, and preparing simple manuals on statistical methods. Aubenque and Kohn also stress the importance of linking health statistics to other economic and social information. Benjamin suggests that the principles of market research could to advantage be applied to health statistics to determine the precise needs of the "consumers". At the same time, Aubenque points out that the value of the ultimate results must be clear to those who provide the data; for this, Kohn suggests that the enumerators must know exactly what is wanted and why. There is general agreement that some explanation of statistical methods and their uses should be given in the curricula of medical schools and that lectures and postgraduate courses should be arranged for practising physicians.
Environmental radiations and childhood dynamic statistics
International Nuclear Information System (INIS)
Sakka, Masatoshi
1981-01-01
In Fukushima prefecture the first nuclear power plant attained criticality in 1971; since then, 6 reactors have been in operation. Of increasing concern are the possible adverse effects due to ionizing radiations released from nuclear reactors. As the radiation level around the nuclear power plants is usually low, any induced effects are necessarily delayed ones which require tens of years to appear. Among other tissues, embryos and foetuses are the most radiosensitive, and induced effects would appear as changes in childhood dynamic statistics. In this report, dynamic statistics including stillbirth, perinatal death, neonatal death, infant death, and third-year examinations were surveyed in 18 health centers in the prefecture from 1961 to 1979. Environmental radiation levels in each district (health center) were compared and ranked, as were the dynamic statistics for each district. Rank correlation coefficients were calculated and the association between radiation level and health status was tested. No significant values were obtained, with correlation coefficients ranging from 0.66 to -0.43. Stillbirths decreased 4.4%/y since 1963, neonatal deaths decreased 6.7%/y, and infant deaths decreased 8.7%/y since 1957 on average. These decreases were negatively correlated with the proliferation of water supply service, sewage service and the increase of physicians in the 18 districts, including 2 which are under continuous observation of environmental radiations released from nuclear power plants. Childhood dynamic statistics have been improving over the last 10 years in prefectures differing by 47 mR/y (lowest values of 56 mR/y on average in 3 prefectures and highest of 103 mR/y in 4). Environmental radiation may initiate adverse effects on prenatal lives, but the hygienic improvement in recent years may have extinguished the promotion of such adverse effects. This is a plausible explanation. (author)
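The rank-correlation analysis described here can be reproduced in outline with Spearman's coefficient. The district ranks below are invented purely for illustration (the report's data are not reproduced); the sketch shows how a small, statistically insignificant rank correlation is obtained:

```python
from scipy import stats

# Hypothetical illustration: 18 districts ranked by ambient radiation
# level and by an infant-mortality indicator (both rankings invented).
radiation_rank = list(range(1, 19))
mortality_rank = [7, 14, 2, 11, 18, 5, 9, 1, 16, 4, 12, 8, 3, 17, 6, 13, 10, 15]

rho, p = stats.spearmanr(radiation_rank, mortality_rank)
print(f"Spearman rho = {rho:.3f}, p = {p:.2f}")
```

A rho near zero with p well above 0.05, as here, is the pattern the report describes: no demonstrable ordering of district health status by radiation level.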
Effect-independent measures of tissue responses to fractionated irradiation
International Nuclear Information System (INIS)
Thames, H.D. Jr.
1984-01-01
Tissue repair factors measure the sparing that can be achieved from dose fractionation in the absence of proliferation. Four repair factors are analysed in these terms: Fsub(R), Fsub(rec), the ratio of linear-quadratic survival model parameters β/α, and the half-time Tsub(1/2) for intracellular repair processes. Theoretically, Fsub(R) and Fsub(rec) are increasing functions of Dsub(1), and thus depend on the level of effect. This is confirmed by analysis of skin reactions after multifractionated radiation. By contrast, β/α is effect-independent as a measure of repair capacity in skin, gut, and bone marrow, tissues for which it is reasonable to assume that survival of identifiable target cells is the primary determinant of the endpoint. For a functional endpoint not clearly connected with the depletion of a specific target-cell population (late fibrotic reactions in the kidney), there was an increase in β/α with increased levels of injury, but this was statistically insignificant. Tsub(1/2) is independent of fraction size in skin, gut, and spinal cord, and is longer (1.5 hours) in the late-reacting tissues (lung and spinal cord) than in those that react acutely (Tsub(1/2) less than 1 hour), with skin as the exception (Tsub(1/2) approx. 1.3 hours). (author)
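Fractionation sparing in the linear-quadratic model is commonly summarized (in the reciprocal α/β form of the ratio discussed above) through the biologically effective dose, BED = n·d·(1 + d/(α/β)). A small sketch of this standard formula (the α/β value is a typical assumed figure for late-reacting tissue, not taken from this paper):

```python
def bed(n_fractions, dose_per_fraction, alpha_beta):
    """Biologically effective dose of the linear-quadratic model:
    BED = n * d * (1 + d / (alpha/beta))."""
    return n_fractions * dose_per_fraction * (1 + dose_per_fraction / alpha_beta)

# Same total dose (60 Gy) delivered in different fraction sizes, evaluated
# with an assumed late-tissue alpha/beta of 3 Gy.
late = 3.0
print(f"{bed(30, 2.0, late):.1f}")  # 30 x 2 Gy
print(f"{bed(15, 4.0, late):.1f}")  # 15 x 4 Gy
```

The larger BED for the 4 Gy fractions at equal total dose is the fractionation effect the repair factors quantify: tissues with a small α/β are spared by small fractions.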
Probability and statistics with integrated software routines
Deep, Ronald
2005-01-01
Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability concurrent with and integrated with statistics through interactive, tailored software applications designed to illustrate the phenomena of probability and statistics. The software programs make the book unique. The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc., all wrapped up in the scientific method. See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698)* Incorporates more than 1,000 engaging problems with answers* Includes more than 300 solved examples* Uses varied problem solving methods
The statistical bandwidth of Butterworth filters
Davy, J. L.; Dunn, I. P.
1987-06-01
The precision of standard architectural acoustic measurements is a function of the statistical bandwidth of the band pass filters used in the measurements. The International and United States Standards on octave and fractional octave-band filters which specify the band pass filters used in architectural acoustics measurements give the effective bandwidth, but unfortunately not the statistical bandwidth of the filters. Both these Standards are currently being revised and both revisions require the use of Butterworth filter characteristics. In this paper it is shown theoretically that the ratio of statistical bandwidth to effective bandwidth for an nth order Butterworth band pass filter is {2n}/{(2n-1)}. This is verified experimentally for third-octave third-order Butterworth band pass filters. It is also shown experimentally that this formula is approximately correct for some non-Butterworth third-octave third-order band pass filters. Because of the importance of Butterworth filters in the revised Standards, the theory of Butterworth filters is reviewed and the formulae for Butterworth filters given in both revised Standards are derived.
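The quoted ratio can be checked numerically from the definitions of effective and statistical bandwidth applied to the Butterworth low-pass prototype |H(f)|² = 1/(1 + f^(2n)) (a quadrature sketch; the cut-off frequency is normalized to 1, and the peak gain is 1 so the effective bandwidth reduces to the integral of |H|²):

```python
import numpy as np
from scipy.integrate import quad

def bandwidth_ratio(n):
    """Numerical statistical/effective bandwidth ratio for an nth-order
    Butterworth low-pass prototype.

    effective bandwidth   Be = int |H|^2 df / max|H|^2
    statistical bandwidth Bs = (int |H|^2 df)^2 / int |H|^4 df
    """
    h2 = lambda f: 1.0 / (1.0 + f ** (2 * n))
    i2, _ = quad(h2, 0, np.inf)
    i4, _ = quad(lambda f: h2(f) ** 2, 0, np.inf)
    return (i2 ** 2 / i4) / i2  # Bs / Be, simplifies to i2 / i4

for n in (1, 2, 3):
    print(n, bandwidth_ratio(n), 2 * n / (2 * n - 1))
```

The numerical ratio reproduces 2n/(2n-1) for each order, e.g. 2 for n = 1 and 1.2 for n = 3, confirming the paper's formula for the prototype response.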
Uncertainty the soul of modeling, probability & statistics
Briggs, William
2016-01-01
This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...
International Nuclear Information System (INIS)
Rosa, B.; Parishani, H.; Ayala, O.; Wang, L.-P.
2015-01-01
In this paper, we study systematically the effects of forcing time scale in the large-scale stochastic forcing scheme of Eswaran and Pope [“An examination of forcing in direct numerical simulations of turbulence,” Comput. Fluids 16, 257 (1988)] on the simulated flow structures and statistics of forced turbulence. Using direct numerical simulations, we find that the forcing time scale affects the flow dissipation rate and flow Reynolds number. Other flow statistics can be predicted using the altered flow dissipation rate and flow Reynolds number, except when the forcing time scale is made unrealistically large to yield a Taylor microscale flow Reynolds number of 30 and less. We then study the effects of forcing time scale on the kinematic collision statistics of inertial particles. We show that the radial distribution function and the radial relative velocity may depend on the forcing time scale when it becomes comparable to the eddy turnover time. This dependence, however, can be largely explained in terms of altered flow Reynolds number and the changing range of flow length scales present in the turbulent flow. We argue that removing this dependence is important when studying the Reynolds number dependence of the turbulent collision statistics. The results are also compared to those based on a deterministic forcing scheme to better understand the role of large-scale forcing, relative to that of the small-scale turbulence, on turbulent collision of inertial particles. To further elucidate the correlation between the altered flow structures and dynamics of inertial particles, a conditional analysis has been performed, showing that the regions of higher collision rate of inertial particles are well correlated with the regions of lower vorticity. Regions of higher concentration of pairs at contact are found to be highly correlated with the region of high energy dissipation rate
Generalized statistics and the formation of a quark-gluon plasma
International Nuclear Information System (INIS)
Teweldeberhan, A.M.; Miller, H.G.; Tegen, R.
2003-01-01
The aim of this paper is to investigate the effect of a non-extensive form of statistical mechanics proposed by Tsallis on the formation of a quark-gluon plasma (QGP). We suggest accounting for the effects of the dominant part of the long-range interactions among the constituents in the QGP by a change in the statistics of the system in this phase, and we study the relevance of this statistics for the phase transition. The results show that small deviations (≈ 10%) from Boltzmann–Gibbs statistics in the QGP produce a noticeable change in the phase diagram, which can, in principle, be tested experimentally. (author)
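The Tsallis framework replaces the Boltzmann factor exp(-E/T) with a q-exponential. A small sketch of that deformation (the functional form below is the standard q-exponential; q = 1.1 is chosen only to illustrate a ~10% deviation from Boltzmann-Gibbs, and is not this paper's fitted value):

```python
import numpy as np

def exp_q(x, q):
    """Tsallis q-exponential, exp_q(x) = [1 + (1-q)x]_+^(1/(1-q));
    reduces to exp(x) as q -> 1."""
    if q == 1.0:
        return np.exp(x)
    base = 1.0 + (1.0 - q) * np.asarray(x, dtype=float)
    return np.where(base > 0, base ** (1.0 / (1.0 - q)), 0.0)

# A ~10% departure from Boltzmann-Gibbs visibly changes the occupation
# factor at E/T = 1:
boltzmann = float(exp_q(-1.0, 1.0))   # exp(-1)
tsallis = float(exp_q(-1.0, 1.1))     # (1.1)^(-10)
print(boltzmann, tsallis)
```

For q > 1 the q-exponential decays more slowly than the Boltzmann factor, which is the direction of change that feeds into the altered phase diagram.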
Application of blended learning in teaching statistical methods
Directory of Open Access Journals (Sweden)
Barbara Dębska
2012-12-01
The paper presents the application of a hybrid method (blended learning, linking traditional education with on-line education) to teach selected problems of mathematical statistics. This includes teaching the application of mathematical statistics to evaluate laboratory experimental results. An on-line statistics course was developed to form an integral part of the module ‘methods of statistical evaluation of experimental results’. The course complies with the principles outlined in the Polish National Framework of Qualifications with respect to the scope of knowledge, skills and competencies that students should have acquired at course completion. The paper presents the structure of the course and the educational content provided through multimedia lessons made accessible on the Moodle platform. Following courses which used the traditional method of teaching and courses which used the hybrid method of teaching, students' test results were compared and discussed to evaluate the effectiveness of the hybrid method of teaching when compared to the effectiveness of the traditional method of teaching.
Evaluation of radiographic imaging techniques in lung nodule detection
International Nuclear Information System (INIS)
Ho, J.T.; Kruger, R.A.
1989-01-01
Dual-energy radiography appears to be the most effective technique to address bone superposition that compromises conventional chest radiography. A dual-energy, single-exposure, film-based technique was compared with a dual-energy, dual-exposure technique and conventional chest radiography in a simulated lung nodule detection study. Observers detected more nodules on images produced by dual-energy techniques than on images produced by conventional chest radiography. The difference between dual-energy and conventional chest radiography is statistically significant and the difference between dual-energy, dual-exposure and single-exposure techniques is statistically insignificant. The single-exposure technique has the potential to replace the dual-exposure technique in future clinical application
Whose statistical reasoning is facilitated by a causal structure intervention?
McNair, Simon; Feeney, Aidan
2015-02-01
People often struggle when making Bayesian probabilistic estimates on the basis of competing sources of statistical evidence. Recently, Krynski and Tenenbaum (Journal of Experimental Psychology: General, 136, 430-450, 2007) proposed that a causal Bayesian framework accounts for people's errors in Bayesian reasoning and showed that, by clarifying the causal relations among the pieces of evidence, judgments on a classic statistical reasoning problem could be significantly improved. We aimed to understand whose statistical reasoning is facilitated by the causal structure intervention. In Experiment 1, although we observed causal facilitation effects overall, the effect was confined to participants high in numeracy. We did not find an overall facilitation effect in Experiment 2 but did replicate the earlier interaction between numerical ability and the presence or absence of causal content. This effect held when we controlled for general cognitive ability and thinking disposition. Our results suggest that clarifying causal structure facilitates Bayesian judgments, but only for participants with sufficient understanding of basic concepts in probability and statistics.
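Classic problems of this type ask for a Bayesian posterior from a base rate, a hit rate, and a false-alarm rate. A minimal worked example with the mammography numbers often used in this literature (the specific figures are assumed here, not taken from the paper):

```python
# Assumed cover-story numbers: 1% base rate of disease, 80% hit rate of
# the test, 9.6% false-positive rate among the healthy.
prior = 0.01
p_pos_given_disease = 0.80
p_pos_given_healthy = 0.096

# Bayes' rule: P(disease | positive) =
#   P(positive | disease) P(disease) / P(positive)
p_pos = prior * p_pos_given_disease + (1 - prior) * p_pos_given_healthy
posterior = prior * p_pos_given_disease / p_pos
print(f"P(disease | positive) = {posterior:.3f}")
```

The posterior is below 8%, far from the intuitive answer near 80%; the causal-structure manipulations studied here aim to make the role of the false-positive branch of this calculation transparent.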
The MLC tongue-and-groove effect on IMRT dose distributions
Energy Technology Data Exchange (ETDEWEB)
Deng Jun [Department of Radiation Oncology, Stanford University School of Medicine, Stanford, CA 94305 (United States). E-mail: jun@reyes.stanford.edu; Pawlicki, Todd; Chen Yan; Li Jinsheng; Jiang, Steve B.; Ma, C.-M. [Department of Radiation Oncology, Stanford University School of Medicine, Stanford, CA 94305 (United States)
2001-04-01
We have investigated the tongue-and-groove effect on the IMRT dose distributions for a Varian MLC. We have compared the dose distributions calculated using the intensity maps with and without the tongue-and-groove effect. Our results showed that, for one intensity-modulated treatment field, the maximum tongue-and-groove effect could be up to 10% of the maximum dose in the dose distributions. For an IMRT treatment with multiple gantry angles (≥5), the difference between the dose distributions with and without the tongue-and-groove effect was hardly visible, less than 1.6% for the two typical clinical cases studied. After considering the patient setup errors, the dose distributions were smoothed with reduced and insignificant differences between plans with and without the tongue-and-groove effect. Therefore, for a multiple-field IMRT plan (≥5 fields), the tongue-and-groove effect on the IMRT dose distributions will generally be clinically insignificant due to the smearing effect of individual fields. The tongue-and-groove effect on an IMRT plan with a small number of fields (<5) will vary depending on the number of fields in a plan (coplanar or non-coplanar), the MLC leaf sequences and the patient setup uncertainty, and may be significant (>5% of maximum dose) in some cases, especially when the patient setup uncertainty is small (≤2 mm). (author)
Madhavan, Ranjith; George, Navia; Thummala, Niharika R; Ravi, S V; Nagpal, Ajay
2017-11-01
For the construction of any dental prosthesis, accurate impressions are necessary. Hence, we undertook the present study to evaluate and compare the surface hardness of gypsum casts poured from impressions made using conventional alginate and self-disinfecting alginate. A total of 30 impressions of a stainless steel die were made, of which 15 were made with conventional alginate and 15 with self-disinfecting alginate, and all were poured using Type III dental stone. The thirty stone specimens were subjected to hardness testing. Data were analyzed using an independent samples t-test to compare the mean surface hardness. The difference in surface hardness was statistically insignificant (p > 0.05). Surface hardness of gypsum casts poured using impressions made from self-disinfecting alginate and conventional alginate was comparable. Self-disinfecting alginates may be employed in clinical practice as safe and effective materials to overcome infection control issues without compromising the properties of the material.
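The analysis step is a standard independent-samples t-test. A sketch with invented hardness readings (the study's own data are not reproduced) shows how a statistically insignificant difference (p > 0.05) is reported:

```python
from scipy import stats

# Illustrative (invented) hardness readings for the two groups of 10 casts
# each; only the procedure, not the data, matches the study.
conventional = [52, 54, 51, 53, 55, 52, 53, 54, 51, 52]
self_disinfecting = [53, 52, 54, 51, 53, 54, 52, 53, 52, 54]

t, p = stats.ttest_ind(conventional, self_disinfecting)
print(f"t = {t:.2f}, p = {p:.2f}")  # p > 0.05: difference not significant
```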
Effects of different building blocks designs on the statistical ...
African Journals Online (AJOL)
Tholang T. Mokhele
Enumeration Areas (EAs), Small Area Layers (SALs) and SubPlaces from the 2001 census data were used as building blocks for the generation of census output areas with the AZTool program in both rural and urban areas of South Africa. One-way Analysis of Variance (ANOVA) was also performed to determine statistical ...
Statistical health-effects study
International Nuclear Information System (INIS)
Gilbert, E.S.; Sever, L.E.
1983-01-01
A principal objective of this program is to determine whether there are demonstrable effects of radiation exposure on the Hanford worker by analyzing mortality records of this population. A secondary purpose is to improve methodology for assessing health effects of chronic low-level exposure to harmful agents or substances, particularly in an occupational setting. In the past year we updated our analyses and initiated new areas of analysis. Complete documentation was provided for our computer program for the mortality study, and a user's manual is under development. A case-control study of birth defects was started in FY 1982.
Statistics for Learning Genetics
Charles, Abigail Sheena
This study investigated the knowledge and skills that biology students may need to help them understand statistics/mathematics as it applies to genetics. The data are based on analyses of current representative genetics texts, practicing genetics professors' perspectives, and, more directly, students' perceptions of, and performance in, doing statistically-based genetics problems. This issue is at the emerging edge of modern college-level genetics instruction, and this study attempts to identify key theoretical components for creating a specialized biological statistics curriculum. The goal of this curriculum will be to prepare biology students with the skills for assimilating quantitatively-based genetic processes, increasingly at the forefront of modern genetics. To this end, two college-level classes at two universities were surveyed, one at a university in the northeastern US and the other in the West Indies. The sample comprised 42 students, and supplementary interviews were administered to 9 selected students. Interviews were also administered to professors in the field in order to gain insight into the teaching of statistics in genetics. Key findings indicated that many students (55%) had very little to no background in statistics. Although students did perform well on exams, with 60% of the population receiving an A or B grade, 77% of them did not offer good explanations on a probability question associated with the normal distribution provided in the survey. The scope and presentation of the applicable statistics/mathematics in some of the most-used genetics textbooks, as well as the genetics syllabi used by instructors, do not help the issue. It was found that the textbooks often either did not give effective explanations for students or completely left out certain topics. The omission of certain statistically/mathematically oriented topics was also true of the genetics syllabi reviewed for this study. Nonetheless
Should we expect financial globalization to have significant effects on business cycles?
Iversen, Jens
2009-01-01
Empirical research suggests that financial globalization has insignificant effects on business cycles. Based on standard theoretical models it might be conjectured that the effects should be significant. I show that this conjecture is wrong. Theoretical effects of financial globalization can be determined to any level of precision by expanding the underlying artificial samples. In contrast, in the data the effects are imprecisely estimated because of short samples. I show that if the conclusi...
Powdthavee, Nattavudh; Vernoit, James
2014-01-01
Using unique longitudinal data on British youths, we estimate how adolescents' overall happiness is related to parents' exposure to unemployment. Our within-child estimates suggest that parental job loss when the child was relatively young has a positive influence on children's overall happiness. However, this positive association became either strongly negative or statistically insignificant as the child grew older. The estimated effects of parental job loss on children's happiness also appear to be unrelated to its effect on family income, parent–child interaction, and children's school experience. Together these findings offer new psychological evidence of unemployment effects on children's livelihood. PMID:24932068
Marsh, Herbert W.; Kuyper, Hans; Morin, Alexandre J. S.; Parker, Philip D.; Seaton, Marjorie
2014-01-01
We offer new theoretical, substantive, statistical, design, and methodological insights into the seemingly paradoxical negative effects of school- and class-average achievement (ACH) on academic self-concept (ASC), the big-fish-little-pond effect (BFLPE; 15,356 Dutch 9th-grade students from 651
Perceptual statistical learning over one week in child speech production.
Richtsmeier, Peter T; Goffman, Lisa
2017-07-01
What cognitive mechanisms account for the trajectory of speech sound development, in particular, gradually increasing accuracy during childhood? An intriguing potential contributor is statistical learning, a type of learning that has been studied frequently in infant perception but less often in child speech production. To assess the relevance of statistical learning to developing speech accuracy, we carried out a statistical learning experiment with four- and five-year-olds in which statistical learning was examined over one week. Children were familiarized with and tested on word-medial consonant sequences in novel words. There was only modest evidence for statistical learning, primarily in the first few productions of the first session. This initial learning effect nevertheless aligns with previous statistical learning research. Furthermore, the overall learning effect was similar to an estimate of weekly accuracy growth based on normative studies. The results implicate other important factors in speech sound development, particularly learning via production. Copyright © 2017 Elsevier Inc. All rights reserved.
Photon statistics, antibunching and squeezed states
International Nuclear Information System (INIS)
Leuchs, G.
1986-01-01
This paper describes the status and future prospects of experiments on photon antibunching and squeezed states. Light correlation is presented in the framework of classical electrodynamics. The extension to quantized radiation fields is discussed, and an introduction to the basic principles of photon statistics, antibunching and squeezed states is presented. The effect of linear attenuation (beam splitters, neutral density filters, and detector quantum efficiency) on the detected signal is discussed. Experiments on the change of photon statistics by the nonlinear interaction of radiation fields with matter are described, and some experimental observations of antibunching and sub-Poissonian photon statistics in resonance fluorescence, together with possible schemes for the generation and detection of squeezed states, are examined
Statistical process control in nursing research.
Polit, Denise F; Chaboyer, Wendy
2012-02-01
In intervention studies in which randomization to groups is not possible, researchers typically use quasi-experimental designs. Time series designs are strong quasi-experimental designs but are seldom used, perhaps because of technical and analytic hurdles. Statistical process control (SPC) is an alternative analytic approach to testing hypotheses about intervention effects using data collected over time. SPC, like traditional statistical methods, is a tool for understanding variation and involves the construction of control charts that distinguish between normal, random fluctuations (common cause variation) and statistically significant special cause variation that can result from an innovation. The purpose of this article is to provide an overview of SPC and to illustrate its use in a study of a nursing practice improvement intervention. Copyright © 2011 Wiley Periodicals, Inc.
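The control-chart logic this abstract describes can be sketched directly. Below is a minimal individuals (X) chart using the standard moving-range estimate of process sigma; the measurements are invented, with one deliberate special-cause point:

```python
# Minimal individuals control chart (SPC): estimate sigma from the average
# moving range (divided by the d2 constant 1.128 for subgroups of size 2),
# set 3-sigma control limits, and flag points outside them as
# special-cause variation. Invented illustration values, not study data.
measurements = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 9.7, 10.1, 14.5, 10.0]

center = sum(measurements) / len(measurements)
moving_ranges = [abs(b - a) for a, b in zip(measurements, measurements[1:])]
sigma_hat = (sum(moving_ranges) / len(moving_ranges)) / 1.128  # d2 for n=2

ucl = center + 3 * sigma_hat  # upper control limit
lcl = center - 3 * sigma_hat  # lower control limit

special_causes = [x for x in measurements if not (lcl <= x <= ucl)]
```

Common-cause variation stays inside the limits; here only the value 14.5 is flagged, the kind of signal SPC attributes to an intervention or other special cause.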
[Cause-of-death statistics and ICD, quo vadis?]
Eckert, Olaf; Vogel, Ulrich
2018-07-01
The International Statistical Classification of Diseases and Related Health Problems (ICD) is the worldwide binding standard for generating underlying cause-of-death statistics. What are the effects of former revisions of the ICD on underlying cause-of-death statistics, and which opportunities and challenges are becoming apparent in a possible transition from ICD-10 to ICD-11? This article presents the calculation of the exploitation grade of ICD-9 and ICD-10 in the German cause-of-death statistics and the quality of documentation. Approximately 67,000 anonymized German death certificates are processed by Iris/MUSE, and official German cause-of-death statistics are analyzed. In addition to substantial changes in the exploitation grade in the transition from ICD-9 to ICD-10, regional effects become visible. The rate of so-called "ill-defined" conditions exceeds 10%. Despite substantial improvements across ICD revisions, there are long-known deficits in the coroner's inquest, the filling out of death certificates, and the quality of coding. To make better use of the ICD as a methodological framework for mortality statistics and health reporting in Germany, the following measures are necessary: 1. general use of Iris/MUSE, 2. establishment of multiple underlying cause-of-death statistics, 3. introduction of an electronic death certificate, 4. improvement of the medical assessment of cause of death. Within a short time the WHO will release the 11th revision of the ICD, which will provide additional opportunities for the development of underlying cause-of-death statistics and their use in science, public health and politics. A coordinated effort including participants in the process and users is necessary to meet the related challenges.
PRIS-STATISTICS: Power Reactor Information System Statistical Reports. User's Manual
International Nuclear Information System (INIS)
2013-01-01
The IAEA developed the Power Reactor Information System (PRIS)-Statistics application to assist PRIS end users with generating statistical reports from PRIS data. Statistical reports provide an overview of the status, specification and performance results of every nuclear power reactor in the world. This user's manual was prepared to facilitate the use of the PRIS-Statistics application and to provide guidelines and detailed information for each report in the application. Statistical reports support analyses of nuclear power development and strategies, and the evaluation of nuclear power plant performance. The PRIS database can be used for comprehensive trend analyses and benchmarking against best performers and industrial standards.
Sadovskii, Michael V
2012-01-01
This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics, all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity, and the modern theory of critical phenomena. Beyond that, attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.
Statistical density of nuclear excited states
Directory of Open Access Journals (Sweden)
V. M. Kolomietz
2015-10-01
Full Text Available A semi-classical approximation is applied to the calculation of single-particle and statistical level densities in excited nuclei. Landau's conception of quasi-particles with nucleon effective mass m* < m is used. The approach provides a correct description of the continuum contribution to the level density for realistic finite-depth potentials. It is shown that the continuum states do not significantly affect the thermodynamic calculations for sufficiently small temperatures T ≤ 1 MeV but strongly reduce the results for the excitation energy at high temperatures. Using a standard Woods-Saxon potential and a nucleon effective mass m* = 0.7m, the A-dependence of the statistical level density parameter K was evaluated, in good qualitative agreement with experimental data.
The use and misuse of statistical methodologies in pharmacology research.
Marino, Michael J
2014-01-01
Descriptive, exploratory, and inferential statistics are necessary components of hypothesis-driven biomedical research. Despite the ubiquitous need for these tools, the emphasis on statistical methods in pharmacology has become dominated by inferential methods, often chosen more by the availability of user-friendly software than by any understanding of the data set or the critical assumptions of the statistical tests. Such frank misuse of statistical methodology, and the quest to reach the mystical α < 0.05 criterion, reflect in part inadequate statistical training. Perhaps more critically, a poor understanding of statistical tools limits the conclusions that may be drawn from a study by divorcing investigators from their own data. The net result is a decrease in the quality of, and confidence in, research findings, fueling recent controversies over the reproducibility of high-profile findings and over effects that appear to diminish over time. The recent development of "omics" approaches, leading to the production of massive, higher-dimensional data sets, has amplified these issues, making it clear that new approaches are needed to appropriately and effectively mine this type of data. Unfortunately, statistical education in the field has not kept pace. This commentary provides a foundation for an intuitive understanding of statistics that fosters an exploratory approach and an appreciation for the assumptions of various statistical tests, in the hope of increasing the correct use of statistics, the application of exploratory data analysis, and the use of statistical study design, with the goal of increasing reproducibility and confidence in the literature. Copyright © 2013. Published by Elsevier Inc.
Fernández, Leandro; Monbaliu, Jaak; Onorato, Miguel; Toffoli, Alessandro
2014-05-01
This research focuses on the nonlinear evolution of irregular wave fields in water of arbitrary depth, comparing field measurements and numerical simulations. It is now well accepted that modulational instability, known as one of the main mechanisms for the formation of rogue waves, induces strong departures from Gaussian statistics. However, whereas non-Gaussian properties are remarkable when wave fields follow one direction of propagation over infinite water depth, wave statistics only weakly deviate from Gaussianity when waves spread over a range of different directions. Over finite water depth, furthermore, wave instability attenuates overall and eventually vanishes for relative water depths as low as kh = 1.36 (where k is the wavenumber of the dominant waves and h the water depth). Recent experimental results, nonetheless, seem to indicate that oblique perturbations are capable of triggering and sustaining modulational instability even if kh < 1.36. The aim of this research is therefore to understand whether the combined effect of directionality and finite water depth has a significant effect on wave statistics, and particularly on the occurrence of extremes. For this purpose, numerical experiments have been performed solving the Euler equations of motion with the Higher Order Spectral Method (HOSM) and compared with data from short-crested wave fields for different sea states observed at Lake George (Australia). A comparative analysis of the statistical properties (i.e., the density function of the surface elevation and its statistical moments, skewness and kurtosis) between simulations and in-situ data provides a direct comparison between the numerical developments and real observations in field conditions.
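The statistical moments mentioned in this abstract, skewness and kurtosis of the surface elevation, are straightforward to compute. A sketch with a synthetic Gaussian series (a stand-in, not the Lake George records), for which the reference values are skewness ≈ 0 and kurtosis ≈ 3:

```python
# Skewness and kurtosis of a surface-elevation record quantify departures
# from Gaussian statistics (Gaussian reference: skewness 0, kurtosis 3).
# The series here is synthetic Gaussian noise, not field data.
import numpy as np

rng = np.random.default_rng(0)
eta = rng.normal(0.0, 1.0, 100_000)  # stand-in surface-elevation record

def skewness(x):
    d = x - x.mean()
    return (d**3).mean() / d.std()**3

def kurtosis(x):
    d = x - x.mean()
    return (d**4).mean() / d.std()**4
```

For a modulationally unstable sea state, the same two moments computed from measured or HOSM-simulated elevations would drift away from (0, 3), with excess kurtosis signalling a heavier-than-Gaussian tail and hence more frequent extremes.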
Designing a Course in Statistics for a Learning Health Systems Training Program
Samsa, Gregory P.; LeBlanc, Thomas W.; Zaas, Aimee; Howie, Lynn; Abernethy, Amy P.
2014-01-01
The core pedagogic problem considered here is how to effectively teach statistics to physicians who are engaged in a "learning health system" (LHS). This is a special case of a broader issue--namely, how to effectively teach statistics to academic physicians for whom research--and thus statistics--is a requirement for professional…
Introductory statistical mechanics for electron storage rings
International Nuclear Information System (INIS)
Jowett, J.M.
1986-07-01
These lectures introduce the beam dynamics of electron-positron storage rings with particular emphasis on the effects due to synchrotron radiation. They differ from most other introductions in their systematic use of the physical principles and mathematical techniques of the non-equilibrium statistical mechanics of fluctuating dynamical systems. A self-contained exposition of the necessary topics from this field is included. Throughout the development, a Hamiltonian description of the effects of the externally applied fields is maintained in order to preserve the links with other lectures on beam dynamics and to show clearly the extent to which electron dynamics is non-Hamiltonian. The statistical mechanical framework is extended to a discussion of the conceptual foundations of the treatment of collective effects through the Vlasov equation
Agur, Zvia; Elishmereni, Moran; Kheifetz, Yuri
2014-01-01
Despite its great promise, personalized oncology still faces many hurdles, and it is increasingly clear that targeted drugs and molecular biomarkers alone yield only modest clinical benefit. One reason is the complex relationship between biomarkers and the patient's response to drugs, which obscures the true weight of the biomarkers in the overall response. This complexity can be disentangled by computational models that integrate the effects of personal biomarkers into a simulator of drug-patient dynamic interactions for predicting clinical outcomes. Several computational tools have been developed for personalized oncology, notably evidence-based tools for simulating pharmacokinetics, Bayesian-estimated tools for predicting survival, etc. We describe representative statistical and mathematical tools, and discuss their merits, shortcomings and preliminary clinical validation attesting to their potential. Yet the individualization power of mathematical models alone, or statistical models alone, is limited. More accurate and versatile personalization tools can be constructed by a new application of the statistical/mathematical nonlinear mixed effects modeling (NLMEM) approach, which until recently has been used only in drug development. Using these advanced tools, clinical data from patient populations can be integrated with mechanistic models of disease and physiology to generate personal mathematical models. Upon more substantial validation in the clinic, this approach will hopefully be applied in personalized clinical trials, P-trials, hence aiding the establishment of personalized medicine within the mainstream of clinical oncology. © 2014 Wiley Periodicals, Inc.
Statistical distribution for generalized ideal gas of fractional-statistics particles
International Nuclear Information System (INIS)
Wu, Y.
1994-01-01
We derive the occupation-number distribution in a generalized ideal gas of particles obeying fractional statistics, including mutual statistics, by adopting a state-counting definition. When there is no mutual statistics, the statistical distribution interpolates between bosons and fermions, and respects a fractional exclusion principle (except for bosons). Anyons in a strong magnetic field at low temperatures constitute such a physical system. Applications to the thermodynamic properties of quasiparticle excitations in the Laughlin quantum Hall fluid are discussed
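The interpolation this abstract describes can be made concrete numerically. In Wu's formulation (as commonly stated; treat the details here as a sketch, not the paper's code), the mean occupation number is n = 1/(w + g), where w solves w^g (1 + w)^(1-g) = exp((ε - μ)/kT) and g is the statistics parameter: g = 0 gives Bose-Einstein, g = 1 gives Fermi-Dirac:

```python
# Numeric sketch of the fractional-exclusion-statistics distribution:
# solve w^g * (1+w)^(1-g) = exp(x) for w, with x = (eps - mu)/kT, then
# n = 1/(w + g). g = 0 and g = 1 recover the Bose and Fermi distributions.
# Illustrative reconstruction, not the paper's own code.
import math
from scipy.optimize import brentq

def occupation(x, g):
    """Mean occupation number for reduced energy x and statistics parameter g."""
    if g == 0.0:                      # Bose-Einstein limit (requires x > 0)
        return 1.0 / math.expm1(x)
    # Solve g*ln(w) + (1-g)*ln(1+w) = x for w > 0 (monotone in w).
    f = lambda w: g * math.log(w) + (1.0 - g) * math.log1p(w) - x
    w = brentq(f, 1e-300, 1e12)
    return 1.0 / (w + g)
```

For intermediate g, the occupation at fixed x lies between the Fermi and Bose values, which is the interpolation property the abstract refers to; note that for g > 0 the occupation can never exceed 1/g, a form of the fractional exclusion principle.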
Childhood Cancer Statistics – Graphs and Infographics: Number of Diagnoses, Incidence Rates ...
Non-statistical effects in bond fission reactions of 1,2-difluoroethane
Schranz, Harold W.; Raff, Lionel M.; Thompson, Donald L.
1991-08-01
A microcanonical, classical variational transition-state theory based on the efficient microcanonical sampling (EMS) procedure is applied to simple bond fission in 1,2-difluoroethane. Comparison is made with results of trajectory calculations performed on the same global potential-energy surface. Agreement between the statistical theory and trajectory results for C-C, C-F, and C-H bond fissions is poor, with differences as large as a factor of 125. Most importantly, at the lower energy studied, 6.0 eV, the statistical calculations predict considerably slower rates than those computed from trajectories. We conclude from these results that the statistical assumptions inherent in the transition-state theory method are not valid for 1,2-difluoroethane, in spite of the fact that the total intramolecular energy transfer rate out of the C-H and C-C normal and local modes is large relative to the bond fission rates. The IVR rate is not globally rapid, and the trajectories do not access all of the energetically available phase space uniformly on the timescale of the reactions.
International Nuclear Information System (INIS)
Antila, H.M.J.; Salo, M.S.; Kirvelae, O.; Nikkanen, V.
1992-01-01
Mononuclear (MNC) and polymorphonuclear cell (PMNC) zinc content was determined together with serum zinc, copper, selenium and iron concentrations in 24 operable breast cancer patients during and after postoperative radiotherapy. Anthropometric and biochemical indices of nutritional status were measured as background data. The measurements were carried out in the years 1987-1988. Nine patients used unconventional multivitamin or trace element preparations. A steady but statistically insignificant decrease in PMNC zinc was seen during treatment. No changes occurred in MNC zinc. Serum copper levels increased in five patients possibly due to tamoxifen treatment, but no other alterations occurred in serum trace element levels. Appetite was well maintained and nutritional status remained unaltered. Postoperative radiotherapy for breast carcinoma had thus no effect on either trace element or nutritional status. Patient-initiated alternative treatments did not significantly affect their trace element levels. This was probably due to small supplementation doses or irregular use of the preparations. (orig.)
Nonparametric statistical inference
Gibbons, Jean Dickinson
2010-01-01
Overall, this remains a very fine book suitable for a graduate-level course in nonparametric statistics. I recommend it for all people interested in learning the basic ideas of nonparametric statistical inference.-Eugenia Stoimenova, Journal of Applied Statistics, June 2012… one of the best books available for a graduate (or advanced undergraduate) text for a theory course on nonparametric statistics. … a very well-written and organized book on nonparametric statistics, especially useful and recommended for teachers and graduate students.-Biometrics, 67, September 2011This excellently presente
Dowdy, Shirley; Chilko, Daniel
2011-01-01
Praise for the Second Edition "Statistics for Research has other fine qualities besides superior organization. The examples and the statistical methods are laid out with unusual clarity by the simple device of using special formats for each. The book was written with great care and is extremely user-friendly."-The UMAP Journal Although the goals and procedures of statistical research have changed little since the Second Edition of Statistics for Research was published, the almost universal availability of personal computers and statistical computing application packages have made it possible f
Griffiths, Dawn
2009-01-01
Wouldn't it be great if there were a statistics book that made histograms, probability distributions, and chi square analysis more enjoyable than going to the dentist? Head First Statistics brings this typically dry subject to life, teaching you everything you want and need to know about statistics through engaging, interactive, and thought-provoking material, full of puzzles, stories, quizzes, visual aids, and real-world examples. Whether you're a student, a professional, or just curious about statistical analysis, Head First's brain-friendly formula helps you get a firm grasp of statistics
Assessing management effects on Oak forests in Austria
Gautam, Sishir; Pietsch, Stephan A.; Hasenauer, Hubert
2010-05-01
Historic land use and silvicultural management practices have changed the structure and species composition of central European forests. Such changes affect the growth of forests and contribute to global warming. Because insufficient information on historic forest management is available, it is hard to explain the effect of management on forest growth and its possible consequences for the environment. In this situation, the BIOME-BGC model, which integrates the main physical, biological and physiological processes based on the current understanding of ecophysiology, is an option for assessing management effects by tracking the cycling of energy, water, carbon and nutrients within a given ecosystem. Such models are increasingly employed to simulate current and future forest dynamics. This study first compares observed standing tree volume and soil carbon and nitrogen content in high forests and coppice-with-standards stands of oak forests in Austria. BIOME-BGC is then used to assess the effects of management on forest growth and to explain the differences from measured parameters. Close positive correlations, unbiased results, and statistically insignificant differences between predicted and observed volumes indicate that the model can serve as a diagnostic tool for assessing management effects in oak forests. The observed data from 2006 and 2009 were further compared with the results of the respective model runs. Further analysis of the simulated data shows that thinning leads to an increase in growth efficiency (GE), nitrogen use efficiency (NUE) and water use efficiency (WUE), and to a decrease in radiation use efficiency (RUE) in both forest types. Among all studied growth parameters, only the difference in NUE was statistically significant. This indicates that the difference in forest yield is mainly governed by the NUE difference between stands due to thinning. The coppice-with-standards system produces an equal amount of net primary
Cantiello, Francesco; Russo, Giorgio Ivan; Cicione, Antonio; Ferro, Matteo; Cimino, Sebastiano; Favilla, Vincenzo; Perdonà, Sisto; De Cobelli, Ottavio; Magno, Carlo; Morgia, Giuseppe; Damiano, Rocco
2016-04-01
To assess the performance of prostate health index (PHI) and prostate cancer antigen 3 (PCA3) when added to the PRIAS or Epstein criteria in predicting the presence of pathologically insignificant prostate cancer (IPCa) in patients who underwent radical prostatectomy (RP) but were eligible for active surveillance (AS). An observational retrospective study was performed in 188 PCa patients treated with laparoscopic or robot-assisted RP but eligible for AS according to the Epstein or PRIAS criteria. Blood and urinary specimens were collected before initial prostate biopsy for PHI and PCA3 measurements. Multivariate logistic regression analyses and decision curve analysis (DCA) were carried out to identify predictors of IPCa using the updated ERSPC definition. In the multivariate analyses, the inclusion of PCA3 and PHI significantly increased the accuracy of the Epstein multivariate model in predicting IPCa, with increases of 17% (AUC = 0.77) and 32% (AUC = 0.92), respectively. The inclusion of PCA3 and PHI also increased the predictive accuracy of the PRIAS multivariate model, with increases of 29% (AUC = 0.87) and 39% (AUC = 0.97), respectively. DCA revealed that the multivariable models with the addition of PHI or PCA3 showed a greater net benefit and performed better than the reference models. In a direct comparison, PHI outperformed PCA3, resulting in a higher net benefit. In the same cohort of patients eligible for AS, the addition of PHI and PCA3 to the Epstein or PRIAS models improved their prognostic performance. PHI resulted in a greater net benefit in predicting IPCa compared to PCA3.
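The model-comparison pattern used in this study, adding a marker to a base logistic model and comparing discrimination by AUC, can be sketched as follows. The data and variable names (psa, phi) are synthetic illustrations, not the study's cohort or its actual predictors:

```python
# Sketch of the evaluation pattern: fit a base logistic-regression model,
# then a model with an added biomarker, and compare in-sample AUCs.
# All data are synthetic; names are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 500
psa = rng.normal(6, 2, n)                      # base clinical predictor
phi = rng.normal(35, 10, n)                    # candidate biomarker
logit = -8 + 0.3 * psa + 0.15 * phi            # true (synthetic) model
y = rng.random(n) < 1 / (1 + np.exp(-logit))   # binary outcome label

base = LogisticRegression().fit(psa.reshape(-1, 1), y)
full = LogisticRegression().fit(np.column_stack([psa, phi]), y)

auc_base = roc_auc_score(y, base.predict_proba(psa.reshape(-1, 1))[:, 1])
auc_full = roc_auc_score(y, full.predict_proba(np.column_stack([psa, phi]))[:, 1])
```

The AUC gain of the full model over the base model is the kind of accuracy increase the abstract reports; decision curve analysis would additionally weigh the models' net benefit across threshold probabilities.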
Study of serum lipids in leprosy
Directory of Open Access Journals (Sweden)
Gupta Anju
2002-01-01
Full Text Available Fifty fresh and untreated patients with leprosy constituted the study group. Fifty age- and sex-matched healthy individuals formed the controls. The Ridley and Jopling system of classification was used in the study. The majority, 21 cases, were of the BT group, 12 of BB, 7 of BL, 9 of LL, and one case was of TT leprosy. The serum triglyceride level was lower than normal in TT, showed no alteration in BT or BB, and was insignificantly increased in BL and LL patients. The total cholesterol was lower than normal in TT, whereas in the BT, BB, BL and LL groups the levels were statistically decreased. The HDL cholesterol was within the normal range in TT, significantly decreased in BT and LL patients, showed no significant alteration in BB, and was insignificantly decreased in the BL group. The LDL cholesterol in TT was low, but not statistically significantly so when compared with the controls, whereas in the BT, BB, BL and LL groups the levels were statistically decreased. The VLDL cholesterol was within the normal range in TT and BT, was raised insignificantly in 3 of 12 cases of BB, was within the normal range in BL, and in LL leprosy was raised in one of 9 cases. In the absence of any derangement of liver function tests, it can be concluded that leprosy per se leads to alterations in lipid metabolism. However, no correlation could be established between the group/type of leprosy, bacterial indices and levels of different lipid fractions in the present study.
A Nineteenth Century Statistical Society that Abandoned Statistics
Stamhuis, I.H.
2007-01-01
In 1857, a Statistical Society was founded in the Netherlands. Within this society, statistics was considered a systematic, quantitative, and qualitative description of society. In the course of time, the society attracted a wide and diverse membership, although the number of physicians on its rolls
Naghshpour, Shahdad
2012-01-01
Statistics is the branch of mathematics that deals with real-life problems. As such, it is an essential tool for economists. Unfortunately, the way you and many other economists learn the concept of statistics is not compatible with the way economists think and learn. The problem is worsened by the use of mathematical jargon and complex derivations. Here's a book that proves none of this is necessary. All the examples and exercises in this book are constructed within the field of economics, thus eliminating the difficulty of learning statistics with examples from fields that have no relation to business, politics, or policy. Statistics is, in fact, not more difficult than economics. Anyone who can comprehend economics can understand and use statistics successfully within this field, including you! This book utilizes Microsoft Excel to obtain statistical results, as well as to perform additional necessary computations. Microsoft Excel is not the software of choice for performing sophisticated statistical analy...
Directory of Open Access Journals (Sweden)
Si-Jun Park
2015-01-01
Full Text Available The effects of alloying elements (Co, Cr, Mo, W, Al, Ti, and Ta on the oxidation resistance of Ni-based superalloys are studied using the Response Surface Methodology (RSM. The statistical analysis showed that Al and Ta generally improve the oxidation resistance of the alloy, whereas Ti and Mo degrade the oxidation resistance. Co, Cr, and W did not alter oxidation rate significantly when examined by the mass gain averaged for all model alloys. However, it is remarkable that the degree of the effects of alloying elements varied with the concentration of other elements. Further, the effect of each element was sometimes found to be reversed for alloy groups specified by the concentration of another element.
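The RSM analysis described in this abstract amounts to least-squares estimation of a second-order polynomial in the coded factors. A two-factor sketch with invented data (the paper itself examined seven alloying elements):

```python
# Response Surface Methodology sketch: fit a second-order model
#   y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
# by least squares over a 3x3 factorial design in coded units (-1, 0, +1).
# Factors and responses are invented, not the paper's alloy data.
import numpy as np

x1 = np.array([-1, -1, 1, 1, 0, 0, -1, 1, 0], dtype=float)  # coded factor A
x2 = np.array([-1, 1, -1, 1, -1, 1, 0, 0, 0], dtype=float)  # coded factor B
y = 2.0 + 1.5 * x1 - 0.5 * x2 + 0.25 * x1 * x2              # synthetic response

X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
# coef holds [b0, b1, b2, b12, b11, b22]; the signs and magnitudes of the
# linear terms play the role of the "improves/degrades" conclusions above,
# and the interaction term captures concentration-dependent effects.
```

In the paper's terms, a positive linear coefficient corresponds to an element that improves oxidation resistance, and a sizeable interaction coefficient corresponds to the observation that an element's effect varies with the concentration of another.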
Colon-Berlingeri, Migdalisel; Burrowes, Patricia A
2011-01-01
Incorporation of mathematics into biology curricula is critical to underscore for undergraduate students the relevance of mathematics to most fields of biology and the usefulness of developing quantitative process skills demanded in modern biology. At our institution, we have made significant changes to better integrate mathematics into the undergraduate biology curriculum. The curricular revision included changes in the suggested course sequence, addition of statistics and precalculus as prerequisites to core science courses, and incorporating interdisciplinary (math-biology) learning activities in genetics and zoology courses. In this article, we describe the activities developed for these two courses and the assessment tools used to measure the learning that took place with respect to biology and statistics. We distinguished the effectiveness of these learning opportunities in helping students improve their understanding of the math and statistical concepts addressed and, more importantly, their ability to apply them to solve a biological problem. We also identified areas that need emphasis in both biology and mathematics courses. In light of our observations, we recommend best practices that biology and mathematics academic departments can implement to train undergraduates for the demands of modern biology.
A Sorting Statistic with Application in Neurological Magnetic Resonance Imaging of Autism.
Levman, Jacob; Takahashi, Emi; Forgeron, Cynthia; MacDonald, Patrick; Stewart, Natalie; Lim, Ashley; Martel, Anne
2018-01-01
Effect size refers to the assessment of the extent of differences between two groups of samples on a single measurement. Assessing effect size in medical research is typically accomplished with Cohen's d statistic. Cohen's d statistic assumes that average values are good estimators of the position of a distribution of numbers and also assumes Gaussian (or bell-shaped) underlying data distributions. In this paper, we present an alternative evaluative statistic that can quantify differences between two data distributions in a manner that is similar to traditional effect size calculations; however, the proposed approach avoids making assumptions regarding the shape of the underlying data distribution. The proposed sorting statistic is compared with Cohen's d statistic and is demonstrated to be capable of identifying feature measurements of potential interest for which Cohen's d statistic implies the measurement would be of little use. This proposed sorting statistic has been evaluated on a large clinical autism dataset from Boston Children's Hospital, Harvard Medical School, demonstrating that it can potentially play a constructive role in future healthcare technologies.
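The paper's exact sorting statistic is not reproduced in this abstract. As a hedged sketch of the contrast it draws, the code below computes Cohen's d (mean and Gaussian assumptions) alongside Cliff's delta, a standard distribution-free effect size that makes no shape assumptions; the function names are illustrative, and Cliff's delta is a stand-in, not the paper's statistic.

```python
import numpy as np

def cohens_d(x, y):
    """Cohen's d: standardized mean difference; assumes means are good
    location summaries and roughly Gaussian data."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    nx, ny = len(x), len(y)
    pooled = ((nx - 1) * x.var(ddof=1) + (ny - 1) * y.var(ddof=1)) / (nx + ny - 2)
    return (x.mean() - y.mean()) / np.sqrt(pooled)

def cliffs_delta(x, y):
    """Cliff's delta: P(X > Y) - P(X < Y), estimated over all pairs;
    makes no assumption about the shape of either distribution."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    diffs = x[:, None] - y[None, :]
    return float((diffs > 0).sum() - (diffs < 0).sum()) / (len(x) * len(y))
```

For heavy-tailed or skewed feature measurements the two can disagree sharply, which is the kind of case where the abstract says a sorting-based statistic flags features that Cohen's d dismisses.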
Neuroendocrine Tumor: Statistics
Directory of Open Access Journals (Sweden)
D.P. van der Nest
2015-03-01
Full Text Available This article explores the use by internal audit functions of audit sampling techniques in order to test the effectiveness of controls in the banking sector. The article focuses specifically on the use of statistical and/or non-statistical sampling techniques by internal auditors. The focus of the research for this article was internal audit functions in the banking sector of South Africa. The results discussed in the article indicate that audit sampling is still used frequently as an audit evidence-gathering technique. Non-statistical sampling techniques are used more frequently than statistical sampling techniques for the evaluation of the sample. In addition, both techniques are regarded as important for the determination of the sample size and the selection of the sample items.
Did the US macroeconomic conditions affect Asian stock markets?
Seema Narayan; Paresh Kumar Narayan
2011-01-01
The aim of this paper is to examine the impact of US macroeconomic conditions—namely, the exchange rate and the short-term interest rate—on the stocks of seven Asian countries (China, India, the Philippines, Malaysia, Singapore, Thailand, and South Korea). Using daily data for the period 2000 to 2010, we divide the sample into a pre-crisis period (pre-August 2007) and a crisis period (post-August 2007). We find that in the short run the interest rate has a statistically insignificant effect on returns for all ...
Evaluation of Recurring Esthetic Dental Proportion in Natural Mandibular Anterior Dentition
Directory of Open Access Journals (Sweden)
Dipti S Shah
2015-01-01
Results: After calculating proportions in mandibular anterior teeth, the P value was found to be statistically insignificant (P > 0.05). Conclusion: Within the limitations of the study, RED proportion was not seen in mandibular natural dentition.
Integer Set Compression and Statistical Modeling
DEFF Research Database (Denmark)
Larsson, N. Jesper
2014-01-01
Compression of integer sets and sequences has been extensively studied for settings where elements follow a uniform probability distribution. In addition, methods exist that exploit clustering of elements in order to achieve higher compression performance. In this work, we address the case where enumeration of elements may be arbitrary or random, but where statistics is kept in order to estimate probabilities of elements. We present a recursive subset-size encoding method that is able to benefit from statistics, explore the effects of permuting the enumeration order based on element probabilities...
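The recursive subset-size encoding itself is not detailed in this abstract. A standard baseline such methods compete with is the combinatorial number system, which ranks a k-element subset of {0, …, n−1} so it can be stored in ⌈log₂ C(n,k)⌉ bits; a minimal sketch, standard library only:

```python
from math import comb

def subset_rank(elements):
    """Colexicographic rank of a k-subset of non-negative integers
    (combinatorial number system): rank = sum of C(e_i, i+1)."""
    return sum(comb(e, i + 1) for i, e in enumerate(sorted(elements)))

def subset_unrank(r, k):
    """Recover the k-subset with colexicographic rank r by greedily
    choosing the largest element whose binomial fits in the remainder."""
    out = []
    for i in range(k, 0, -1):
        e = i - 1
        while comb(e + 1, i) <= r:
            e += 1
        r -= comb(e, i)
        out.append(e)
    return sorted(out)
```

This baseline is optimal for uniformly distributed subsets; the abstract's contribution is precisely to beat it when element statistics make some subsets more probable than others.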
Optimal state discrimination using particle statistics
International Nuclear Information System (INIS)
Bose, S.; Ekert, A.; Omar, Y.; Paunkovic, N.; Vedral, V.
2003-01-01
We present an application of particle statistics to the problem of optimal ambiguous discrimination of quantum states. The states to be discriminated are encoded in the internal degrees of freedom of identical particles, and we use the bunching and antibunching of the external degrees of freedom to discriminate between various internal states. We show that we can achieve the optimal single-shot discrimination probability using only the effects of particle statistics. We discuss interesting applications of our method to detecting entanglement and purifying mixed states. Our scheme can easily be implemented with current technology.
International Nuclear Information System (INIS)
Zhang Zhongzheng; Xue Zheng; Qin Fuzhong
1997-01-01
PURPOSE: To evaluate the effects of early intravenous thrombolytic therapy in acute myocardial infarction (AMI) with 99mTc-MIBI tomography imaging. METHODS: 22 patients with AMI were observed using 99mTc-MIBI rest myocardial tomography imaging. The semiquantitative score of myocardial 99mTc-MIBI uptake was expressed with a four-point scoring system. RESULTS: In patients in whom reperfusion was achieved, mean scores decreased from 9.1 ± 3.3 before thrombolytic therapy to 3.7 ± 2.2 afterward (t = 4.085). The distribution of 99mTc-MIBI perfusion defect segments correlated with that of the ECG-determined infarct site. The difference between the first and the second myocardial imaging in the non-thrombolytic-treatment group was statistically insignificant. CONCLUSION: Rest myocardial imaging in AMI before and after thrombolytic therapy not only provides information for assessing the extent of improvement of myocardial ischemia but also provides an imaging basis for determining coronary artery reperfusion.
International Nuclear Information System (INIS)
Wada, Y.
1981-01-01
The surface-ionization type mass spectrometer is widely used as an apparatus for quality assurance, accountability, and safeguarding of nuclear materials, and for this analysis it has become important to statistically evaluate the analytical error, which consists of a random error and a systematic error. The major factor in this systematic error is the mass-discrimination effect. In this paper, various assays for evaluating the factors of variation in the mass-discrimination effect were studied and the data obtained were statistically evaluated. These analyses proved that the variation in the mass-discrimination effect was attributable not to the acid concentration of the sample, the sample size on the filament, or the voltage supplied to the multiplier, but mainly to the filament temperature during mass-spectrometric analysis. The mass-discrimination effect values β, usually calculated from measured data for uranium, plutonium, or boron isotopic standard samples, did not differ significantly with U-235, Pu-239, or B-10 isotopic abundance. Furthermore, for U and Pu the measurement conditions and the mass ranges of the isotopes were almost the same, and the β values were not statistically different between U and Pu. On the other hand, the β value for boron was about a third of that for U or Pu, but the coefficients of the correction for the mass-discrimination effect per unit mass difference, ΔM, were almost the same among U, Pu, and B. As for the isotopic analysis error of U, Pu, Nd, and B, it was shown that the isotopic abundance of these elements and the isotopic analysis error were related by quadratic curves on a log-log scale.
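The abstract refers to a correction coefficient for the mass-discrimination effect per unit mass difference ΔM. One common form of this correction is the linear mass-bias law, sketched below; the linear form is an assumption for illustration, and the paper may use a power or exponential law instead.

```python
def mass_bias_correct(r_measured, beta, delta_m):
    """Linear mass-bias law: corrected isotope ratio
    R_true ~= R_measured * (1 + beta * delta_m),
    with beta the per-unit-mass discrimination coefficient and
    delta_m the mass difference between the two isotopes."""
    return r_measured * (1 + beta * delta_m)
```

With β determined from an isotopic standard (as in the abstract), the same coefficient can then correct sample ratios measured under the same filament-temperature conditions.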
Nuclear energy, insignificant use in comparison with mineral fuels
International Nuclear Information System (INIS)
Mostafa, Md.G.
1999-01-01
Human civilization is based on energy. From its primitive stage, humanity has needed a power or energy source to exist in the world. Energy is the basis of industrial civilization; without it, modern life would cease to exist. Much of the growth in world energy consumption is recent: it rose from 20% in 1970 to 31% in 1990. There is general agreement that this trend will continue in the coming decades. But during the 1970s the world began a painful adjustment to the vulnerability of energy supplies, and over the last three decades the world has fallen into energy crisis three times. So the world faces a great challenge to (1) ensure the availability of energy, (2) reduce the cost of energy, and (3) reduce the environmental impact caused by using fossil fuels. The increase in global demand for energy is expected to be met from several energy sources, and several factors determine which sources of energy will take part in meeting that need. Many European countries now think that future energy demand should be met by sustainable energy, and in this connection many countries have begun to develop renewable energy technologies that would enable them to diminish fossil fuel consumption and its attendant problems. Nowadays, nuclear engineers and scientists claim that there is no danger associated with atomic power stations. But a number of questions about the safety and economy of nuclear power remain unanswered: What are the effects of low-level radiation over long periods? How can nuclear power's waste products, which will be dangerous for centuries, be permanently isolated from the environment? Because atomic power is still associated in the public mind with the destructive force of atomic bombs, how could nuclear power keep an important role in meeting future energy demand in the world? If it is possible to overcome the dangers of atomic power by ensuring modern technology and highest
Nuclear energy, insignificant use in comparison with mineral fuels
Energy Technology Data Exchange (ETDEWEB)
Mostafa, Md.G. [DDC Ltd., Dhaka (Bangladesh). Computer Center
1999-07-01
Benchmark validation of statistical models: Application to mediation analysis of imagery and memory.
MacKinnon, David P; Valente, Matthew J; Wurpts, Ingrid C
2018-03-29
This article describes benchmark validation, an approach to validating a statistical model. According to benchmark validation, a valid model generates estimates and research conclusions consistent with a known substantive effect. Three types of benchmark validation are described and illustrated with examples: (a) benchmark value, (b) benchmark estimate, and (c) benchmark effect. Benchmark validation methods are especially useful for statistical models with assumptions that are untestable or very difficult to test. Benchmark effect validation methods were applied to evaluate statistical mediation analysis in eight studies using the established effect that increasing mental imagery improves recall of words. Statistical mediation analysis led to conclusions about mediation that were consistent with established theory that increased imagery leads to increased word recall. Benchmark validation based on established substantive theory is discussed as a general way to investigate characteristics of statistical models and a complement to mathematical proof and statistical simulation.
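As a hedged sketch of the statistical mediation analysis that the benchmark-effect validation above was applied to, the code below computes the product-of-coefficients indirect effect from two ordinary least-squares regressions; the data and function name are illustrative, not the studies' imagery data.

```python
import numpy as np

def indirect_effect(x, m, y):
    """Product-of-coefficients mediation: the a-path (x -> m) times
    the b-path (m -> y, adjusting for x), each estimated by OLS."""
    ones = np.ones_like(x)
    a = np.linalg.lstsq(np.column_stack([ones, x]), m, rcond=None)[0][1]
    b = np.linalg.lstsq(np.column_stack([ones, x, m]), y, rcond=None)[0][2]
    return a * b
```

Benchmark-effect validation then asks whether this indirect effect comes out positive and of plausible size when x is an imagery manipulation, m is reported imagery use, and y is word recall.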
Quantum statistical effects in the mass transport of interstitial solutes in a crystalline solid
Woo, C. H.; Wen, Haohua
2017-09-01
The impact of quantum statistics on the many-body dynamics of a crystalline solid at finite temperatures containing an interstitial solute atom (ISA) is investigated. The Mori-Zwanzig theory allows the many-body dynamics of the crystal to be formulated and solved analytically within a pseudo-one-particle approach using the Langevin equation with a quantum fluctuation-dissipation relation (FDR) based on the Debye model. At the same time, the many-body dynamics is also directly solved numerically via the molecular dynamics approach with a Langevin heat bath based on the quantum FDR. Both the analytical and numerical results consistently show that below the Debye temperature of the host lattice, quantum statistics significantly impacts the ISA transport properties, resulting in major departures from both the Arrhenius law of diffusion and the Einstein-Smoluchowski relation between the mobility and diffusivity. Indeed, we found that below one-third of the Debye temperature, effects of vibrations on the quantum mobility and diffusivity are both orders-of-magnitude larger and practically temperature independent. We have shown that both effects have their physical origin in the athermal lattice vibrations derived from the phonon ground state. The foregoing theory is tested in quantum molecular dynamics calculation of mobility and diffusivity of interstitial helium in bcc W. In this case, the Arrhenius law is only valid in a narrow range between ˜300 and ˜700 K. The diffusivity becomes temperature independent on the low-temperature side while increasing linearly with temperature on the high-temperature side.
Statistics without Tears: Complex Statistics with Simple Arithmetic
Smith, Brian
2011-01-01
One of the often overlooked aspects of modern statistics is the analysis of time series data. Modern introductory statistics courses tend to rush to probabilistic applications involving risk and confidence. Rarely does the first level course linger on such useful and fascinating topics as time series decomposition, with its practical applications…
International Nuclear Information System (INIS)
Nemnes, G A; Anghel, D V
2010-01-01
We present a stochastic method for the simulation of the time evolution in systems which obey generalized statistics, namely fractional exclusion statistics and Gentile's statistics. The transition rates are derived in the framework of canonical ensembles. This approach introduces a tool for describing interacting fermionic and bosonic systems in non-equilibrium as ideal FES systems, in a computationally efficient manner. The two types of statistics are analyzed comparatively, indicating their intrinsic thermodynamic differences and revealing key aspects related to the species size
International Nuclear Information System (INIS)
Avrigeanu, M.; Avrigeanu, V.
1992-02-01
A systematic study of the effects of statistical model parameters and semi-classical pre-equilibrium emission models has been carried out for the (n,p) reactions on the 56Fe and 60Co target nuclei. The results obtained by using various assumptions within a given pre-equilibrium emission model differ among themselves more than those of different models used under similar conditions. The necessity of using realistic level density formulas is emphasized, especially in connection with pre-equilibrium emission models (i.e., with the exciton state density expression), while a basic support could be found only by replacing the Williams exciton state density formula with a realistic one. (author). 46 refs, 12 figs, 3 tabs
Statistical assessment of the learning curves of health technologies.
Ramsay, C R; Grant, A M; Wallace, S A; Garthwaite, P H; Monk, A F; Russell, I T
2001-01-01
OBJECTIVES: (1) To describe systematically studies that directly assessed the learning curve effect of health technologies. (2) To identify systematically 'novel' statistical techniques applied to learning curve data in other fields, such as psychology and manufacturing. (3) To test these statistical techniques on data sets from studies of varying designs to assess health technologies in which learning curve effects are known to exist. METHODS - STUDY SELECTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): For a study to be included, it had to include a formal analysis of the learning curve of a health technology using a graphical, tabular, or statistical technique. METHODS - STUDY SELECTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): For a study to be included, it had to include a formal assessment of a learning curve using a statistical technique that had not been identified in the previous search. METHODS - DATA SOURCES: Six clinical and 16 non-clinical biomedical databases were searched. A limited amount of handsearching and scanning of reference lists was also undertaken. METHODS - DATA EXTRACTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): A number of study characteristics were abstracted from the papers, such as study design, study size, number of operators, and the statistical method used. METHODS - DATA EXTRACTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): The new statistical techniques identified were categorised into four subgroups of increasing complexity: exploratory data analysis; simple series data analysis; complex data structure analysis; generic techniques. METHODS - TESTING OF STATISTICAL METHODS: Some of the statistical methods identified in the systematic searches for single (simple) operator series data and for multiple (complex) operator series data were illustrated and explored using three data sets. The first was a case series of 190 consecutive laparoscopic fundoplication procedures performed by a single surgeon; the second
Health significance and statistical uncertainty. The value of P-value.
Consonni, Dario; Bertazzi, Pier Alberto
2017-10-27
The P-value is widely used as a summary statistic of scientific results. Unfortunately, there is a widespread tendency to dichotomize its value into "P < 0.05" ("statistically significant") and "P > 0.05" ("statistically not significant"), with the former implying a "positive" result and the latter a "negative" one. Our aim is to show the unsuitability of such an approach when evaluating the effects of environmental and occupational risk factors. We provide examples of distorted use of the P-value and of the negative consequences of such a black-and-white vision for science and public health. The rigid interpretation of the P-value as a dichotomy favors confusion between health relevance and statistical significance, discourages thoughtful thinking, and diverts attention from what really matters, the health significance. A much better way to express and communicate scientific results involves reporting effect estimates (e.g., risks, risk ratios, or risk differences) and their confidence intervals (CI), which summarize and convey both health significance and statistical uncertainty. Unfortunately, many researchers do not consider the whole CI but only examine whether it includes the null value, thereby degrading this procedure to the same P-value dichotomy (statistically significant or not). When reporting statistical results of scientific research, present effect estimates with their confidence intervals and do not qualify the P-value as "significant" or "not significant".
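A minimal sketch of the reporting style the authors advocate: an effect estimate (here a risk ratio) with a Wald 95% confidence interval, rather than a bare P-value dichotomy. The counts and function name are illustrative assumptions, not data from the paper.

```python
import math

def risk_ratio_ci(a, n1, b, n2, z=1.96):
    """Risk ratio with a Wald 95% CI on the log scale:
    a cases among n1 exposed, b cases among n2 unexposed."""
    rr = (a / n1) / (b / n2)
    se_log = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, lo, hi
```

With 20/100 exposed cases versus 10/100 unexposed, the estimate is RR = 2.0 with CI roughly (0.99, 4.05): the interval barely includes the null, yet it also shows a doubled risk is entirely compatible with the data, which is exactly the information a "not significant" label throws away.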
Nonequilibrium Statistical Operator Method and Generalized Kinetic Equations
Kuzemsky, A. L.
2018-01-01
We consider some principal problems of nonequilibrium statistical thermodynamics in the framework of the Zubarev nonequilibrium statistical operator approach. We present a brief comparative analysis of some approaches to describing irreversible processes based on the concept of nonequilibrium Gibbs ensembles and their applicability to describing nonequilibrium processes. We discuss the derivation of generalized kinetic equations for a system in a heat bath. We obtain and analyze a damped Schrödinger-type equation for a dynamical system in a heat bath. We study the dynamical behavior of a particle in a medium taking the dissipation effects into account. We consider the scattering problem for neutrons in a nonequilibrium medium and derive a generalized Van Hove formula. We show that the nonequilibrium statistical operator method is an effective, convenient tool for describing irreversible processes in condensed matter.
Energy Technology Data Exchange (ETDEWEB)
Kraut, W. [Duale Hochschule Baden-Wuerttemberg (DHBW), Karlsruhe (Germany). Studiengang Sicherheitswesen
2016-07-01
The only proper way to describe uncertainties in health physics is by statistical means. But statistics can never replace your personal evaluation of an effect, nor can statistics transmute randomness into certainty like an "uncertainty laundry". The paper discusses these problems in routine practical work.
DEFF Research Database (Denmark)
Tryggestad, Kjell
2004-01-01
The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work... within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit... in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved...
Directory of Open Access Journals (Sweden)
Reem Gamal
2017-12-01
Conclusions: Airborne-particle abrasion surface treatment of zirconia significantly enhanced the μTBS of both cements bonded to dentin, while aging had an adverse effect. MS showed a higher, though statistically insignificant, μTBS.
Wannier, Gregory Hugh
1966-01-01
Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics.Designed for
The Roles of Experience, Gender, and Individual Differences in Statistical Reasoning
Martin, Nadia; Hughes, Jeffrey; Fugelsang, Jonathan
2017-01-01
We examine the joint effects of gender and experience on statistical reasoning. Participants with various levels of experience in statistics completed the Statistical Reasoning Assessment (Garfield, 2003), along with individual difference measures assessing cognitive ability and thinking dispositions. Although the performance of both genders…
Equivalent statistics and data interpretation.
Francis, Gregory
2017-08-01
Recent reform efforts in psychological science have led to a plethora of choices for scientists to analyze their data. A scientist making an inference about their data must now decide whether to report a p value, summarize the data with a standardized effect size and its confidence interval, report a Bayes Factor, or use other model comparison methods. To make good choices among these options, it is necessary for researchers to understand the characteristics of the various statistics used by the different analysis frameworks. Toward that end, this paper makes two contributions. First, it shows that for the case of a two-sample t test with known sample sizes, many different summary statistics are mathematically equivalent in the sense that they are based on the very same information in the data set. When the sample sizes are known, the p value provides as much information about a data set as the confidence interval of Cohen's d or a JZS Bayes factor. Second, this equivalence means that different analysis methods differ only in their interpretation of the empirical data. At first glance, it might seem that mathematical equivalence of the statistics suggests that it does not matter much which statistic is reported, but the opposite is true because the appropriateness of a reported statistic is relative to the inference it promotes. Accordingly, scientists should choose an analysis method appropriate for their scientific investigation. A direct comparison of the different inferential frameworks provides some guidance for scientists to make good choices and improve scientific practice.
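One identity behind the equivalence described above: for a two-sample t test with known group sizes, Cohen's d and the t statistic are deterministic transforms of each other (and the p-value is in turn a monotone function of |t| given the degrees of freedom). A sketch of the equal-variance case, with illustrative function names:

```python
import math

def d_from_t(t, n1, n2):
    """Cohen's d implied by a two-sample t statistic and the group sizes
    (equal-variance t test assumed): d = t * sqrt(1/n1 + 1/n2)."""
    return t * math.sqrt(1 / n1 + 1 / n2)

def t_from_d(d, n1, n2):
    """Inverse mapping: the t statistic implied by d and the group sizes."""
    return d / math.sqrt(1 / n1 + 1 / n2)
```

Because the mapping is invertible once the sample sizes are fixed, reporting t, d, or the p-value conveys the same information from the data; as the abstract argues, the choice matters only for the inference each statistic is meant to support.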
Heterogeneous Rock Simulation Using DIP-Micromechanics-Statistical Methods
Directory of Open Access Journals (Sweden)
H. Molladavoodi
2018-01-01
Full Text Available Rock as a natural material is heterogeneous. Rock material consists of minerals, crystals, cement, grains, and microcracks. Each component of rock has a different mechanical behavior under applied loading condition. Therefore, rock component distribution has an important effect on rock mechanical behavior, especially in the postpeak region. In this paper, the rock sample was studied by digital image processing (DIP, micromechanics, and statistical methods. Using image processing, volume fractions of the rock minerals composing the rock sample were evaluated precisely. The mechanical properties of the rock matrix were determined based on upscaling micromechanics. In order to consider the rock heterogeneities effect on mechanical behavior, the heterogeneity index was calculated in a framework of statistical method. A Weibull distribution function was fitted to the Young modulus distribution of minerals. Finally, statistical and Mohr–Coulomb strain-softening models were used simultaneously as a constitutive model in DEM code. The acoustic emission, strain energy release, and the effect of rock heterogeneities on the postpeak behavior process were investigated. The numerical results are in good agreement with experimental data.
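A hedged sketch of the Weibull-fitting step mentioned above, using the standard linearized-CDF (probability plot) method with median-rank plotting positions; the paper's actual fitting procedure is not specified in the abstract, and the symbols (shape m, scale E0 for the Young modulus) are illustrative.

```python
import math

def weibull_fit_lsq(data):
    """Fit a 2-parameter Weibull distribution (shape m, scale E0) by
    linearizing the CDF: ln(-ln(1 - F)) = m*ln(E) - m*ln(E0),
    with median-rank plotting positions F_i = (i - 0.3) / (n + 0.4)."""
    xs = sorted(data)
    n = len(xs)
    X = [math.log(e) for e in xs]
    Y = [math.log(-math.log(1 - (i - 0.3) / (n + 0.4))) for i in range(1, n + 1)]
    mean_x = sum(X) / n
    mean_y = sum(Y) / n
    m = (sum((x - mean_x) * (y - mean_y) for x, y in zip(X, Y))
         / sum((x - mean_x) ** 2 for x in X))
    E0 = math.exp(mean_x - mean_y / m)  # intercept = -m*ln(E0)
    return m, E0
```

The fitted shape parameter m then serves as a heterogeneity index: a small m means a broad modulus distribution among mineral grains, which is what drives the scatter in post-peak behavior the abstract describes.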
Direct Learning of Systematics-Aware Summary Statistics
CERN. Geneva
2018-01-01
Complex machine learning tools, such as deep neural networks and gradient boosting algorithms, are increasingly being used to construct powerful discriminative features for High Energy Physics analyses. These methods are typically trained with simulated or auxiliary data samples by optimising some classification or regression surrogate objective. The learned feature representations are then used to build a sample-based statistical model to perform inference (e.g. interval estimation or hypothesis testing) over a set of parameters of interest. However, the effectiveness of the mentioned approach can be reduced by the presence of known uncertainties that cause differences between training and experimental data, included in the statistical model via nuisance parameters. This work presents an end-to-end algorithm, which leverages existing deep learning technologies but directly aims to produce inference-optimal sample-summary statistics. By including the statistical model and a differentiable approximation of ...
Pedzikiewicz, J; Sobiech, K A
1995-01-01
Nine men were examined during three weeks of training requiring much physical effort. They were given a nutrient, "LIVEX", enriched with iron. Hematological parameters as well as concentrations of erythrocyte ATP and 2,3-DPG were determined before and after the experiment. Hematological parameters were determined using standard methods, while Boehringer's test (Germany) was used for determining ATP and 2,3-DPG. The reticulocyte level was statistically significantly higher after the experiment, while the increase in ATP and 2,3-DPG concentrations was insignificant. A positive adaptation of energy metabolism after exogenous iron administration during physical effort is discussed.
Mean-field theory of anyons near Bose statistics
International Nuclear Information System (INIS)
McCabe, J.; MacKenzie, R.
1992-01-01
The validity of a mean-field approximation for a boson-based free anyon gas near Bose statistics is shown. The magnetic properties of the system are discussed in the approximation that the statistical magnetic field is uniform. It is proved that the anyon gas does not exhibit a Meissner effect in the domain of validity of the approximation. (K.A.) 7 refs
Krzykawska-Serda, Martyna; Agha, Mahdi S; Ho, Jason Chak-Shing; Ware, Matthew J; Law, Justin J; Newton, Jared M; Nguyen, Lam; Curley, Steven A; Corr, Stuart J
2018-04-02
Patients with pancreatic ductal adenocarcinomas (PDAC) have one of the poorest survival rates of all cancers. The main reason for this is related to the unique tumor stroma and poor vascularization of PDAC. As a consequence, chemotherapeutic drugs, such as nab-paclitaxel and gemcitabine, cannot efficiently penetrate into the tumor tissue. Non-invasive radiofrequency (RF) mild hyperthermia treatment was proposed as a synergistic therapy to enhance drug uptake into the tumor by increasing tumor vascular inflow and perfusion, thus, increasing the effect of chemotherapy. RF-induced hyperthermia is a safer and non-invasive technique of tumor heating compared to conventional contact heating procedures. In this study, we investigated the short- and long-term effects (~20 days and 65 days, respectively) of combination chemotherapy and RF hyperthermia in an orthotopic PDAC model in mice. The benefit of nab-paclitaxel and gemcitabine treatment was confirmed in mice; however, the effect of treatment was statistically insignificant in comparison to saline treated mice during long-term observation. The benefit of RF was minimal in the short-term and completely insignificant during long-term observation. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
Statistical physics including applications to condensed matter
Hermann, Claudine
2005-01-01
Statistical Physics bridges the properties of a macroscopic system and the microscopic behavior of its constituent particles, a connection otherwise impossible to establish because of the enormous magnitude of Avogadro's number. Numerous systems of today's key technologies -- e.g. semiconductors or lasers -- are macroscopic quantum objects; only statistical physics allows for understanding their fundamentals. Therefore, this graduate text also focuses on particular applications such as the properties of electrons in solids, and radiation thermodynamics and the greenhouse effect.
Excel 2016 for engineering statistics a guide to solving practical problems
Quirk, Thomas J
2016-01-01
This book shows the capabilities of Microsoft Excel in teaching engineering statistics effectively. Similar to the previously published Excel 2013 for Engineering Statistics, this book is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical engineering problems. If understanding statistics isn’t your strongest suit, you are not especially mathematically-inclined, or if you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in engineering courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2016 for Engineering Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply Excel to statistical techniques necessary in their courses and...
Excel 2016 for business statistics a guide to solving practical problems
Quirk, Thomas J
2016-01-01
This book shows the capabilities of Microsoft Excel in teaching business statistics effectively. Similar to the previously published Excel 2010 for Business Statistics, this book is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical business problems. If understanding statistics isn’t your strongest suit, you are not especially mathematically-inclined, or if you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in business courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2016 for Business Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply Excel to statistical techniques necessary in their courses and work. Each ch...
Excel 2016 for marketing statistics a guide to solving practical problems
Quirk, Thomas J
2016-01-01
This is the first book to show the capabilities of Microsoft Excel in teaching marketing statistics effectively. It is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical marketing problems. If understanding statistics isn’t your strongest suit, you are not especially mathematically-inclined, or if you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in marketing courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2016 for Marketing Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply Excel to statistical techniques necessary in their courses and work. Each chapter explains statistical formulas and directs the reader t...
Excel 2013 for engineering statistics a guide to solving practical problems
Quirk, Thomas J
2015-01-01
This is the first book to show the capabilities of Microsoft Excel to teach engineering statistics effectively. It is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical engineering problems. If understanding statistics isn’t your strongest suit, you are not especially mathematically-inclined, or if you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in engineering courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2013 for Engineering Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply Excel to statistical techniques necessary in their courses and work. Each chapter explains statistical formulas and directs...
A comparative study on microwave and routine tissue processing
Directory of Open Access Journals (Sweden)
T Mahesh Babu
2011-01-01
Conclusions: The individual scores by different observers for the various parameters included in the study were statistically insignificant; however, the overall quality of microwave-processed and microwave-stained slides appeared slightly better than that of conventionally processed and stained slides.
Clinicopathological comparison of triple negative breast cancers ...
African Journals Online (AJOL)
2014-11-10
... recorded more in non-TNBC patients as compared to TNBC patients, but the results were statistically insignificant. Conclusion: ... In the TNBC group, 60 (90.7%) cases were married whereas 2 (3.2%) cases were unmarried.
Johnson, Norman
This is the third volume of a collection of seminal papers in the statistical sciences written during the past 110 years. These papers have each had an outstanding influence on the development of statistical theory and practice over the last century. Each paper is preceded by an introduction written by an authority in the field, providing background information and assessing its influence. Volume III concentrates on articles from the 1980s while including some earlier articles not included in Volumes I and II. Samuel Kotz is Professor of Statistics in the College of Business and Management at the University of Maryland. Norman L. Johnson is Professor Emeritus of Statistics at the University of North Carolina. Also available: Breakthroughs in Statistics Volume I: Foundations and Basic Theory Samuel Kotz and Norman L. Johnson, Editors 1993. 631 pp. Softcover. ISBN 0-387-94037-5 Breakthroughs in Statistics Volume II: Methodology and Distribution Samuel Kotz and Norman L. Johnson, Edi...
Shaw, R. L.
1979-01-01
A sample of 228 supernovae that occurred in galaxies with known redshifts is used to show that the mean projected linear supernova distance from the center of the parent galaxy increases with increasing redshift. This effect is interpreted as an observational bias: the discovery rate of supernovae is reduced in the inner parts of distant, poorly resolved galaxies. Even under the optimistic assumption that no selection effects work in galaxies closer than 33 Mpc, about 50% of all supernovae are lost in the inner regions of galaxies beyond 150 Mpc. This observational bias must be taken into account in the derivation of statistical properties of supernovae.
Nonextensive statistical mechanics of ionic solutions
International Nuclear Information System (INIS)
Varela, L.M.; Carrete, J.; Munoz-Sola, R.; Rodriguez, J.R.; Gallego, J.
2007-01-01
Classical mean-field Poisson-Boltzmann theory of ionic solutions is revisited in the theoretical framework of nonextensive Tsallis statistics. The nonextensive equivalent of the Poisson-Boltzmann equation is formulated by revisiting the statistical mechanics of liquids, and the Debye-Hückel framework is shown to be valid for highly diluted solutions even under circumstances where nonextensive thermostatistics must be applied. The lowest-order corrections associated with nonadditive effects are identified for both symmetric and asymmetric electrolytes, and the behavior of the average electrostatic potential in a homogeneous system is analytically and numerically analyzed for various values of the nonextensive complexity parameter q.
Towers, Sherry; Mubayi, Anuj; Castillo-Chavez, Carlos
2018-01-01
When attempting to statistically distinguish between a null and an alternative hypothesis, many researchers in the life and social sciences turn to binned statistical analysis methods, or methods that are simply based on the moments of a distribution (such as the mean and variance). These methods have the advantage of simplicity of implementation and simplicity of explanation. However, when null and alternative hypotheses manifest themselves in subtle differences in patterns in the data, binned analysis methods may be insensitive to these differences, and researchers may erroneously fail to reject the null hypothesis when in fact it is false and more sensitive statistical analysis methods would produce a different result. Here, with a focus on two recent conflicting studies of contagion in mass killings as instructive examples, we discuss how the use of unbinned likelihood methods makes optimal use of the information in the data; a fact that has long been known in statistical theory, but perhaps is not as widely appreciated amongst general researchers in the life and social sciences. In 2015, Towers et al. published a paper that quantified the long-suspected contagion effect in mass killings. However, in 2017, Lankford & Tomek subsequently published a paper, based upon the same data, that claimed to contradict the results of the earlier study. The former used unbinned likelihood methods, and the latter used binned methods and comparison of distribution moments. Using these analyses, we also discuss how visualization of the data can aid in determination of the most appropriate statistical analysis methods to distinguish between a null and an alternative hypothesis. We also discuss the importance of assessing the robustness of analysis results to the methodological assumptions made (for example, arbitrary choices of the number of bins and bin widths when using binned methods); an issue that is widely overlooked in the literature, but is critical
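The binned-versus-unbinned contrast discussed above can be illustrated with a minimal simulation. The sketch below is illustrative only (the exponential model, sample size, and bin width are assumptions, not the data of the studies cited): it estimates the rate of an exponential distribution two ways, via an unbinned maximum-likelihood estimate that uses every observation at full precision, and via a cruder binned estimate built from histogram counts.

```python
import math
import random

random.seed(42)
true_rate = 0.5
data = [random.expovariate(true_rate) for _ in range(5000)]

# Unbinned maximum-likelihood estimate: for an exponential model the
# MLE of the rate is the reciprocal of the sample mean, so every data
# point contributes at full precision.
mle_rate = len(data) / sum(data)

# Binned alternative: histogram the data with a fixed (arbitrary) bin
# width and recover the rate from the decay between adjacent bins,
# since P(bin k+1) / P(bin k) = exp(-rate * width) for an exponential.
width = 1.0
counts = {}
for x in data:
    k = int(x // width)
    counts[k] = counts.get(k, 0) + 1
binned_rate = -math.log(counts[1] / counts[0]) / width

print(round(mle_rate, 3), round(binned_rate, 3))
```

Both estimators recover a value near the true rate here, but the binned one discards the information within each bin and depends on the arbitrary bin width, which is precisely the robustness concern the abstract raises.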
19 CFR 351.413 - Disregarding insignificant adjustments.
2010-04-01
... COUNTERVAILING DUTIES Calculation of Export Price, Constructed Export Price, Fair Value, and Normal Value § 351..., constructed export price, or normal value, as the case may be. Groups of adjustments are adjustments for...
DEFF Research Database (Denmark)
Nielsen, Tine; Kreiner, Svend
Motivated by experience with students' psychological barriers to learning statistics, we modified and extended the Statistical Anxiety Rating Scale (STARS) to develop a contemporary Danish measure of attitudes and relationship to statistics for use with higher education students... with evidence of DIF in all cases: one TCA-item functioned differentially relative to age, one WS-item functioned differentially relative to statistics course (first or second), and two IA-items functioned differentially relative to statistics course and academic discipline (sociology, public health...
Lenard, Christopher; McCarthy, Sally; Mills, Terence
2014-01-01
There are many different aspects of statistics. Statistics involves mathematics, computing, and applications to almost every field of endeavour. Each aspect provides an opportunity to spark someone's interest in the subject. In this paper we discuss some ethical aspects of statistics, and describe how an introduction to ethics has been…
The influence of religious affiliation on heavy drinking, heavy smoking and heavy betel nut chewing.
Chen, Chiang-Ming
2014-01-01
The results of a national survey of determinants of drinking, smoking and betel-nut chewing behaviors are analyzed. The purpose of this paper is to investigate empirically whether drinking, smoking and betel-nut chewing are influenced by a variety of religions, based on Taiwan data. Our results suggest that adherence to Buddhism, Taoism and Chinese folk religion is positively associated with heavy betel nut chewing, while the religion effects on heavy smoking and drinking are statistically insignificant. Our findings on religion effects in Taiwan can be a valuable reference for comparison with Christian and western countries. © 2013 Elsevier Ltd. All rights reserved.
Effect of variations in rainfall intensity on slope stability in Singapore
Directory of Open Access Journals (Sweden)
Christofer Kristo
2017-12-01
Full Text Available A substantial body of scientific evidence has given credence to the existence and deleterious impacts of climate change. One aspect of climate change is variation in rainfall patterns, which affects the flux boundary condition across the ground surface. A possible disastrous consequence of this change is the occurrence of rainfall-induced slope failures. This paper aims to investigate the variations in rainfall patterns in Singapore and their effect on slope stability. Singapore's historical rainfall data from the Seletar and Paya Lebar weather stations for the period 1985–2009 were obtained and analysed by duration using linear regression. A general increasing trend was observed at both weather stations, with a possible shift to longer-duration rainfall events, despite being statistically insignificant according to the Mann-Kendall test. Using the derived trends, projected rainfall intensities in 2050 and 2100 were used in the seepage and slope stability analyses performed on a typical residual soil slope in Singapore. A significant reduction in factor of safety was observed over the next 50 years, with only a marginal decrease in factor of safety in the subsequent 50 years. This indicates a possible detrimental effect of variations in rainfall patterns on slope stability in Singapore, especially in the next 50 years. The statistical analyses of rainfall data from the Seletar and Paya Lebar weather stations for the period 1985–2009 indicated that rainfall intensity tends to increase over the years, with a possible shift to longer-duration rainfall events in the future. The stability analyses showed a significant decrease in factor of safety from 2003 to 2050 due to the increase in rainfall intensity, suggesting that climate change effects may have continued beyond 2009, with possibly detrimental consequences for slope stability. Keywords: Climate change, Rainfall, Seepage, Slope stability
International Nuclear Information System (INIS)
1999-01-01
For the year 1998 and the year 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO₂ emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
2001-01-01
For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO₂ emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
2000-01-01
For the year 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g., Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO₂ emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
Serdobolskii, Vadim Ivanovich
2007-01-01
This monograph presents the mathematical theory of statistical models described by an essentially large number of unknown parameters, comparable with the sample size or even much larger. In this sense, the proposed theory can be called "essentially multiparametric". It is developed on the basis of the Kolmogorov asymptotic approach, in which the sample size increases along with the number of unknown parameters. This theory opens a way to the solution of central problems of multivariate statistics, which until now have not been solved. Traditional statistical methods based on the idea of infinite sampling often break down in the solution of real problems and, depending on the data, can be inefficient, unstable and even not applicable. In this situation, practical statisticians are forced to use various heuristic methods in the hope that they will find a satisfactory solution. The mathematical theory developed in this book presents a regular technique for implementing new, more efficient versions of statistical procedures. ...
Preparation and In Vivo Pharmacokinetics of the Tongshu Suppository
Directory of Open Access Journals (Sweden)
Guoqiang Liu
2016-01-01
Full Text Available Astragalus polysaccharide (APS) (used for intestinal protection) was added to formulate the Tongshu suppository to improve the pharmacokinetics of Aceclofenac, which were assessed in New Zealand rabbits using an orthogonal experimental design. The single-agent Aceclofenac was taken as the control formulation. The concentration-time and drug release curves were drawn, and Tmax (min), Cmax (μg·mL−1), AUC0→∞, and MRT were compared using a pharmacokinetic systems program. The formulated Tongshu suppository had moderate hardness, a smooth surface with uniform color, and a theoretical drug-loading rate of 8%. Its release rate was in accordance with the drug preparation requirements. The concentration-time curves and drug release curves revealed that the maximum concentrations (Cmax) were 4.18±1.03 μg·mL−1 and 3.34±0.41 μg·mL−1 for the Tongshu and Aceclofenac suppositories, respectively, showing a statistically insignificant difference, while the peak times were 34.87±4.69 min and 34.76±6.34 min, respectively, also showing a statistically insignificant difference. Compared with the Aceclofenac suppository, the relative bioavailability of the Tongshu suppository was 104.4%, and the difference between them was statistically insignificant. In this experiment, the Tongshu suppository was prepared using the hot-melt method. In vivo pharmacokinetic studies confirmed it had higher bioavailability than the Aceclofenac suppository.
Preparation and In Vivo Pharmacokinetics of the Tongshu Suppository
Dong, Leilei; Lu, Kuan; Liu, Sisi; Zheng, Yingying
2016-01-01
Astragalus polysaccharide (APS) (used for intestinal protection) was added to formulate the Tongshu suppository to improve the pharmacokinetics of Aceclofenac, which were assessed in New Zealand rabbits using an orthogonal experimental design. The single-agent Aceclofenac was taken as the control formulation. The concentration-time and drug release curves were drawn, and T max (min), C max (μg·mL−1), AUC0→∞, and MRT were compared using a pharmacokinetic systems program. The formulated Tongshu suppository had moderate hardness, a smooth surface with uniform color, and theoretical drug-loading rate of 8%. Its release rate was in accordance with the drug preparation requirements. The concentration-time curves and drug release curves revealed that the maximum concentrations (C max) were 4.18 ± 1.03 μg·mL−1 and 3.34 ± 0.41 μg·mL−1 for the Tongshu and Aceclofenac suppositories, respectively, showing statistically insignificant difference, while the peak times were 34.87 ± 4.69 min and 34.76 ± 6.34 min, respectively, also showing statistically insignificant difference. Compared with the Aceclofenac suppository, the relative bioavailability of the Tongshu suppository was 104.4%, and the difference between them was statistically insignificant. In this experiment, the Tongshu suppository was prepared using the hot-melt method. In vivo pharmacokinetic studies confirmed it had higher bioavailability than the Aceclofenac suppository. PMID:27610366
Bulmer, M G
1979-01-01
There are many textbooks which describe current methods of statistical analysis, while neglecting related theory. There are equally many advanced textbooks which delve into the far reaches of statistical theory, while bypassing practical applications. But between these two approaches is an unfilled gap, in which theory and practice merge at an intermediate level. Professor M. G. Bulmer's Principles of Statistics, originally published in 1965, was created to fill that need. The new, corrected Dover edition of Principles of Statistics makes this invaluable mid-level text available once again for...
Industrial statistics with Minitab
Cintas, Pere Grima; Llabres, Xavier Tort-Martorell
2012-01-01
Industrial Statistics with MINITAB demonstrates the use of MINITAB as a tool for performing statistical analysis in an industrial context. This book covers introductory industrial statistics, exploring the most commonly used techniques alongside those that serve to give an overview of more complex issues. A plethora of examples in MINITAB are featured along with case studies for each of the statistical techniques presented. Industrial Statistics with MINITAB: Provides comprehensive coverage of user-friendly practical guidance to the essential statistical methods applied in industry. Explores...
ASYMPTOTIC COMPARISONS OF U-STATISTICS, V-STATISTICS AND LIMITS OF BAYES ESTIMATES BY DEFICIENCIES
Toshifumi, Nomachi; Hajime, Yamato; Graduate School of Science and Engineering, Kagoshima University:Miyakonojo College of Technology; Faculty of Science, Kagoshima University
2001-01-01
As estimators of estimable parameters, we consider three statistics which are U-statistic, V-statistic and limit of Bayes estimate. This limit of Bayes estimate, called LB-statistic in this paper, is obtained from Bayes estimate of estimable parameter based on Dirichlet process, by letting its parameter tend to zero. For the estimable parameter with non-degenerate kernel, the asymptotic relative efficiencies of LB-statistic with respect to U-statistic and V-statistic and that of V-statistic w...
CONFIDENCE LEVELS AND/VS. STATISTICAL HYPOTHESIS TESTING IN STATISTICAL ANALYSIS. CASE STUDY
Directory of Open Access Journals (Sweden)
ILEANA BRUDIU
2009-05-01
Full Text Available Parameters estimated with confidence intervals and statistical hypothesis tests are used in statistical analysis to draw conclusions about a population from an extracted sample. The case study presented aims to highlight the importance of the sample size used in a study and how it affects the results obtained from confidence intervals and hypothesis tests. While statistical hypothesis testing gives only a "yes" or "no" answer to certain questions, statistical estimation using confidence intervals provides more information than a test statistic: it shows the high degree of uncertainty arising from small samples and from findings that are "marginally significant" or "almost significant" (p very close to 0.05).
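The point above, that a confidence interval carries more information than a bare accept/reject verdict, can be sketched with simulated samples (all numbers below are illustrative assumptions, not the case-study data):

```python
import math
import random

random.seed(1)

def summarize(sample, mu0=0.0):
    """Mean, approximate 95% confidence interval, and z-statistic
    against the null value mu0 (normal approximation)."""
    n = len(sample)
    mean = sum(sample) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in sample) / (n - 1))
    se = sd / math.sqrt(n)
    return mean, (mean - 1.96 * se, mean + 1.96 * se), (mean - mu0) / se

# Same true effect (mean 0.3, sd 1.0), very different sample sizes.
small = [random.gauss(0.3, 1.0) for _ in range(20)]
large = [random.gauss(0.3, 1.0) for _ in range(2000)]

for sample in (small, large):
    mean, ci, z = summarize(sample)
    print(len(sample), round(mean, 3), tuple(round(c, 3) for c in ci), round(z, 2))
```

With the small sample the interval is wide, making visible the uncertainty that a bare "significant or not" verdict hides; the large sample yields a much narrower interval around the same effect.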
State Transportation Statistics 2014
2014-12-15
The Bureau of Transportation Statistics (BTS) presents State Transportation Statistics 2014, a statistical profile of transportation in the 50 states and the District of Columbia. This is the 12th annual edition of State Transportation Statistics, a ...
Statistical inference for template aging
Schuckers, Michael E.
2006-04-01
A change in classification error rates for a biometric device is often referred to as template aging. Here we offer two methods for determining whether the effect of time is statistically significant. The first of these is the use of a generalized linear model to determine if these error rates change linearly over time. This approach generalizes previous work assessing the impact of covariates using generalized linear models. The second approach uses likelihood ratio test methodology. The focus here is on statistical methods for estimation, not the underlying cause of the change in error rates over time. These methodologies are applied to data from the National Institute of Standards and Technology Biometric Score Set Release 1. The results of these applications are discussed.
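The first of the two approaches, a generalized linear model for a time trend in error rates, can be sketched as follows. The data and fitting method here are stand-ins: the monthly error counts are hypothetical, and plain gradient ascent is used instead of the iteratively reweighted least squares a statistics package would apply.

```python
import math

# Hypothetical data: (month, observed errors, trials) for one device.
data = [(0, 12, 1000), (6, 15, 1000), (12, 19, 1000), (18, 24, 1000)]

# Logistic GLM: logit(p) = b0 + b1 * month. A positive fitted b1 means
# the error rate grows over time (the "template aging" effect).
b0, b1 = -4.0, 0.0
for _ in range(20000):
    g0 = g1 = 0.0
    for t, k, n in data:
        p = 1.0 / (1.0 + math.exp(-(b0 + b1 * t)))
        g0 += k - n * p          # gradient of the binomial log-likelihood
        g1 += t * (k - n * p)
    b0 += 1e-4 * g0              # small fixed step sizes for stability
    b1 += 1e-6 * g1

print(round(b0, 2), round(b1, 4))
```

Whether the fitted slope is statistically significant would then be judged, as the abstract describes, by a likelihood ratio test against the constant-rate model.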
Introduction to Statistically Designed Experiments
Energy Technology Data Exchange (ETDEWEB)
Heaney, Mike
2016-09-13
Statistically designed experiments can save researchers time and money by reducing the number of necessary experimental trials, while resulting in more conclusive experimental results. Surprisingly, many researchers are still not aware of this efficient and effective experimental methodology. As reported in a 2013 article from Chemical & Engineering News, there has been a resurgence of this methodology in recent years (http://cen.acs.org/articles/91/i13/Design-Experiments-Makes-Comeback.html?h=2027056365). This presentation will provide a brief introduction to statistically designed experiments. The main advantages will be reviewed along with some basic concepts such as factorial and fractional factorial designs. The recommended sequential approach to experiments will be introduced, and finally a case study will be presented to demonstrate this methodology.
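The factorial and fractional factorial designs mentioned above can be generated in a few lines; the choice of a 2^3 design and the defining relation below are assumptions made for the example, not part of the presentation being described.

```python
from itertools import product

# Full 2^3 factorial: every combination of three two-level factors,
# coded as -1 (low) and +1 (high).
full = list(product([-1, 1], repeat=3))

# Half-fraction 2^(3-1) with defining relation I = ABC: keep only the
# runs whose factor levels multiply to +1, halving the number of trials
# at the cost of confounding main effects with two-factor interactions.
half = [run for run in full if run[0] * run[1] * run[2] == 1]

print(len(full), len(half))  # 8 full runs, 4 fractional runs
```

This is how a designed experiment cuts the number of trials: the half-fraction needs four runs instead of eight, which is the saving the abstract refers to.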
Tattar, Prabhanjan N; Manjunath, B G
2016-01-01
Integrates the theory and applications of statistics using R A Course in Statistics with R has been written to bridge the gap between theory and applications and explain how mathematical expressions are converted into R programs. The book has been primarily designed as a useful companion for a Masters student during each semester of the course, but will also help applied statisticians in revisiting the underpinnings of the subject. With this dual goal in mind, the book begins with R basics and quickly covers visualization and exploratory analysis. Probability and statistical inference, inclusive of classical, nonparametric, and Bayesian schools, is developed with definitions, motivations, mathematical expression and R programs in a way which will help the reader to understand the mathematical development as well as R implementation. Linear regression models, experimental designs, multivariate analysis, and categorical data analysis are treated in a way which makes effective use of visualization techniques and...
Glaz, Joseph
2009-01-01
Suitable for graduate students and researchers in applied probability and statistics, as well as for scientists in biology, computer science, pharmaceutical science and medicine, this title brings together a collection of chapters illustrating the depth and diversity of theory, methods and applications in the area of scan statistics.
Effect of the Target Motion Sampling temperature treatment method on the statistics and performance
International Nuclear Information System (INIS)
Viitanen, Tuomas; Leppänen, Jaakko
2015-01-01
Highlights: • Use of the Target Motion Sampling (TMS) method with collision estimators is studied. • The expected values of the estimators agree with NJOY-based reference. • In most practical cases also the variances of the estimators are unaffected by TMS. • Transport calculation slow-down due to TMS dominates the impact on figures-of-merit. - Abstract: Target Motion Sampling (TMS) is a stochastic on-the-fly temperature treatment technique that is being developed as a part of the Monte Carlo reactor physics code Serpent. The method provides for modeling of arbitrary temperatures in continuous-energy Monte Carlo tracking routines with only one set of cross sections stored in the computer memory. Previously, only the performance of the TMS method in terms of CPU time per transported neutron has been discussed. Since the effective cross sections are not calculated at any point of a transport simulation with TMS, reaction rate estimators must be scored using sampled cross sections, which is expected to increase the variances and, consequently, to decrease the figures-of-merit. This paper examines the effects of TMS on the statistics and performance in practical calculations involving reaction rate estimation with collision estimators. Contrary to expectations, it turned out that the usage of sampled response values has no practical effect on the performance of reaction rate estimators when using TMS with elevated basis cross section temperatures (EBT), i.e. the usual way. With 0 Kelvin cross sections a significant increase in the variances of capture rate estimators was observed right below the energy region of unresolved resonances, but at these energies the figures-of-merit could be increased using a simple resampling technique to decrease the variances of the responses. It was, however, noticed that the usage of the TMS method increases the statistical deviations of all estimators, including the flux estimator, by tens of percent in the vicinity of very
Features of statistical dynamics in a finite system
International Nuclear Information System (INIS)
Yan, Shiwei; Sakata, Fumihiko; Zhuo, Yizhong
2002-01-01
We study features of statistical dynamics in a finite Hamiltonian system composed of a relevant one-degree-of-freedom system coupled to an irrelevant multi-degree-of-freedom system through a weak interaction. Special attention is paid to how the statistical dynamics changes depending on the number of degrees of freedom in the irrelevant system. It is found that the macrolevel statistical aspects are strongly related to the appearance of microlevel chaotic motion, and that dissipation of the relevant motion is realized by passing through three distinct stages: dephasing, statistical relaxation, and equilibrium regimes. It is clarified that the dynamical description and the conventional transport approach provide almost the same macrolevel and microlevel mechanisms only for a system with a very large number of irrelevant degrees of freedom. It is also shown that the statistical relaxation in the finite system is anomalous diffusion and that the fluctuation effects have a finite correlation time.
Park, Eun Sug; Symanski, Elaine; Han, Daikwon; Spiegelman, Clifford
2015-06-01
PM2.5 speciation data from 1995-1997. The Houston data included respiratory mortality data and 24-hour PM2.5 speciation data sampled every six days from a region near the Houston Ship Channel in years 2002-2005. We also developed a Bayesian spatial multivariate receptor modeling approach that, while simultaneously dealing with the unknown number of sources and identifiability conditions, incorporated spatial correlations in the multipollutant data collected from multiple sites into the estimation of source profiles and contributions based on the discrete process convolution model for multivariate spatial processes. This new modeling approach was applied to 24-hour ambient air concentrations of 17 volatile organic compounds (VOCs) measured at nine monitoring sites in Harris County, Texas, during years 2000 to 2005. Simulation results indicated that our methods were accurate in identifying the true model and estimated parameters were close to the true values. The results from our methods agreed in general with previous studies on the source apportionment of the Phoenix data in terms of estimated source profiles and contributions. However, we had a greater number of statistically insignificant findings, which was likely a natural consequence of incorporating uncertainty in the estimated source contributions into the health-effects parameter estimation. For the Houston data, a model with five sources (that seemed to be Sulfate-Rich Secondary Aerosol, Motor Vehicles, Industrial Combustion, Soil/Crustal Matter, and Sea Salt) showed the highest posterior model probability among the candidate models considered when fitted simultaneously to the PM2.5 and mortality data. There was a statistically significant positive association between respiratory mortality and same-day PM2.5 concentrations attributed to one of the sources (probably industrial combustion).
The Bayesian spatial multivariate receptor modeling approach applied to the VOC data led to a highest posterior model probability for a
DEFF Research Database (Denmark)
Toft-Nielsen, M; Hvidberg, A; Hilsted, Jannik
1996-01-01
GLP-1 administration decreases blood glucose levels in normal subjects and non-insulin-dependent diabetes mellitus patients and is therefore proposed as a treatment for diabetic hyperglycaemia. The glucose lowering effect of GLP-1 is glucose dependent and therefore self-limiting, but it is not kn...... on the two GLP-1 infusion days; and (5) an increase in catecholamine levels in the GLP-1/saline experiment and also in the beta-blockade experiments. We conclude that adrenergic counterregulation plays an insignificant role in curtailing GLP-1's glucose lowering effect....
International Nuclear Information System (INIS)
Iqbal, M.N.; Sajid, M.T.; Ahmed, Z.; Iqbal, M.H.
2016-01-01
Objective: To compare the efficacy of vacuum assisted closure (VAC) therapy against regular moist wound dressings in reducing the surface area of open chronic wounds by at least 5 mm² in terms of early wound closure. Study Design: Randomized controlled trial. Place and Duration of Study: This study was conducted at the general surgery department, CMH/MH Rawalpindi, from Jun 2011 to Dec 2011, over a period of 6 months. Material and Methods: A total of 278 patients (139 in each group) were included in this study. Group A received VAC therapy while moist wound dressings were applied in group B. Results: Mean age was 54.9 ± 7.2 and 53.4 ± 8.9 years in groups A and B, respectively, a statistically insignificant difference (p=0.12). In group A, 96 patients (69.0 percent) and in group B 92 patients (66.2 percent) were male, while 43 patients (31.0 percent) in group A and 47 patients (33.8 percent) in group B were female, the difference being statistically insignificant (p=0.608). In group A, 63 (45.3 percent) patients showed significant reduction in the size of the wound while only 41 (29.5 percent) patients in group B had adequate wound healing at the end of 4 weeks, the difference being statistically significant (p=0.0064). Conclusion: VAC therapy decreases wound size more effectively than the moist wound dressing technique. It definitely reduces hospital stay and ensures early return to work. (author)
Oreopoulos, Lazaros
2004-01-01
The MODIS Level-3 optical thickness and effective radius cloud product is a gridded 1 deg. x 1 deg. dataset that is derived from aggregation and subsampling at 5 km of 1-km resolution Level-2 orbital swath data (Level-2 granules). This study examines the impact of the 5 km subsampling on the mean, standard deviation and inhomogeneity parameter statistics of optical thickness and effective radius. The methodology is simple and consists of estimating mean errors for a large collection of Terra and Aqua Level-2 granules by taking the difference of the statistics at the original and subsampled resolutions. It is shown that the Level-3 sampling does not affect the various quantities investigated to the same degree, with second-order moments suffering greater subsampling errors, as expected. Mean errors drop dramatically when averages over a sufficient number of regions (e.g., monthly and/or latitudinal averages) are taken, pointing to a dominance of errors that are random in nature. When histograms built from subsampled data with the same binning rules as in the Level-3 dataset are used to reconstruct the quantities of interest, the mean errors do not deteriorate significantly. The results in this paper provide guidance to users of MODIS Level-3 optical thickness and effective radius cloud products on the range of errors due to subsampling they should expect, and perhaps account for, in scientific work with this dataset. In general, subsampling errors should not be a serious concern when moderate temporal and/or spatial averaging is performed.
Directory of Open Access Journals (Sweden)
H Mozaffari
2008-10-01
Introduction: Hyperlipidemia is a risk factor for atherosclerosis and cardiovascular diseases. Nuts such as almonds are high in unsaturated lipids and antioxidants. Some studies indicate that nuts have beneficial effects on the cardiovascular system. Therefore, the aim of this study was to evaluate the effectiveness of shelled almonds in reducing blood lipid and lipoprotein levels in hyperlipidemic patients. Methods: This study was a clinical trial (before and after) conducted on 30 male volunteers. They consumed 60 grams of shelled almonds per day for four weeks. Their blood lipid, lipoprotein, apolipoprotein and lipoprotein (a) levels were measured before and after almond consumption. Results: Shelled almond consumption caused a significant decrease in serum cholesterol (36.1 mg/dl), triglycerides (45.94 mg/dl) and LDL-cholesterol (28.68 mg/dl), and an increase in HDL-cholesterol (10.64 mg/dl) (p<0.001). Shelled almond consumption decreased lipoprotein (a) (2.11 mg/dl) and apolipoprotein B100 (8.93 mg/dl) and increased apolipoprotein A1 (1.74 mg/dl) levels, but these effects were statistically insignificant. Conclusion: Continuous consumption of shelled almonds has a beneficial effect on blood lipids and may play a preventive role in atherosclerosis and coronary heart diseases. We therefore suggest that a daily intake of 60 grams of almonds can be used in the treatment of hyperlipidemic patients.
Decision Support Systems: Applications in Statistics and Hypothesis Testing.
Olsen, Christopher R.; Bozeman, William C.
1988-01-01
Discussion of the selection of appropriate statistical procedures by educators highlights a study conducted to investigate the effectiveness of decision aids in facilitating the use of appropriate statistics. Experimental groups and a control group using a printed flow chart, a computer-based decision aid, and a standard text are described. (11…
The Generalized Quantum Statistics
Hwang, WonYoung; Ji, Jeong-Young; Hong, Jongbae
1999-01-01
The concept of wavefunction reduction should be introduced to standard quantum mechanics in any physical processes where effective reduction of the wavefunction occurs, as well as in measurement processes. When the overlap is negligible, each particle obeys Maxwell-Boltzmann statistics even if the particles are in principle described by a totally symmetrized wavefunction [P.R. Holland, The Quantum Theory of Motion, Cambridge University Press, 1993, p. 293]. We generalize the conjecture. That is, par...
Reading Statistics And Research
Akbulut, Yavuz
2008-01-01
The book demonstrates the best and most conservative ways to decipher and critique research reports, particularly for social science researchers. In addition, new editions of the book are always better organized, effectively structured and meticulously updated in line with the developments in the field of research statistics. Even the most trivial issues are revisited and updated in new editions. For instance, purchasers of the previous editions might check the interpretation of skewness and ku...
Kanji, Gopal K
2006-01-01
This expanded and updated Third Edition of Gopal K. Kanji's best-selling resource on statistical tests covers all the most commonly used tests with information on how to calculate and interpret results with simple datasets. Each entry begins with a short summary statement about the test's purpose, and contains details of the test objective, the limitations (or assumptions) involved, a brief outline of the method, a worked example, and the numerical calculation. 100 Statistical Tests, Third Edition is the one indispensable guide for users of statistical materials and consumers of statistical information at all levels and across all disciplines.
Morsanyi, Kinga; Primi, Caterina; Chiesi, Francesca; Handley, Simon
2009-01-01
In three studies we looked at two typical misconceptions of probability: the representativeness heuristic, and the equiprobability bias. The literature on statistics education predicts that some typical errors and biases (e.g., the equiprobability bias) increase with education, whereas others decrease. This is in contrast with reasoning theorists'…
Transport Statistics - Transport - UNECE
Oseloka Ezepue, Patrick; Ojo, Adegbola
2012-12-01
A challenging problem in some developing countries such as Nigeria is inadequate training of students in effective problem solving using the core concepts of their disciplines. Related to this is a disconnection between their learning and socio-economic development agenda of a country. These problems are more vivid in statistical education which is dominated by textbook examples and unbalanced assessment 'for' and 'of' learning within traditional curricula. The problems impede the achievement of socio-economic development objectives such as those stated in the Nigerian Vision 2020 blueprint and United Nations Millennium Development Goals. They also impoverish the ability of (statistics) graduates to creatively use their knowledge in relevant business and industry sectors, thereby exacerbating mass graduate unemployment in Nigeria and similar developing countries. This article uses a case study in statistical modelling to discuss the nature of innovations in statistics education vital to producing new kinds of graduates who can link their learning to national economic development goals, create wealth and alleviate poverty through (self) employment. Wider implications of the innovations for repositioning mathematical sciences education globally are explored in this article.
On the Statistical Properties of Cospectra
Huppenkothen, D.; Bachetti, M.
2018-05-01
In recent years, the cross-spectrum has received considerable attention as a means of characterizing the variability of astronomical sources as a function of wavelength. The cospectrum has only recently been understood as a means of mitigating instrumental effects dependent on temporal frequency in astronomical detectors, as well as a method of characterizing the coherent variability in two wavelength ranges on different timescales. In this paper, we lay out the statistical foundations of the cospectrum, starting with the simplest case of detecting a periodic signal in the presence of white noise, under the assumption that the same source is observed simultaneously in independent detectors in the same energy range. This case is especially relevant for detecting faint X-ray pulsars in detectors heavily affected by instrumental effects, including NuSTAR, Astrosat, and IXPE, which allow for even sampling and where the cospectrum can act as an effective way to mitigate dead time. We show that the statistical distributions of both single and averaged cospectra differ considerably from those for standard periodograms. While a single cospectrum follows a Laplace distribution exactly, averaged cospectra are approximated by a Gaussian distribution only for more than ∼30 averaged segments, dependent on the number of trials. We provide an instructive example of a quasi-periodic oscillation in NuSTAR and show that applying standard periodogram statistics leads to underestimated tail probabilities for period detection. We also demonstrate the application of these distributions to a NuSTAR observation of the X-ray pulsar Hercules X-1.
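The contrast drawn above between cospectral and periodogram statistics is easy to see in a small simulation. The sketch below (series length and normalization are illustrative assumptions, not the paper's) builds the cospectrum of two independent white-noise series; each bin is the sum of two products of independent Gaussians, which yields the Laplace distribution noted in the abstract rather than the chi-squared distribution of periodogram powers:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 4096  # samples per simulated light curve (illustrative)

# Two independent "detectors" observing pure white noise.
x = rng.standard_normal(n)
y = rng.standard_normal(n)

# Cross spectrum of the two series; the cospectrum is its real part.
X = np.fft.rfft(x)[1:-1]  # drop the DC and Nyquist bins
Y = np.fft.rfft(y)[1:-1]
cospec = (X * np.conj(Y)).real / n

# With this normalization each bin is Laplace-distributed with mean 0
# and variance 1/2, so the standard deviation is close to 1/sqrt(2),
# while periodogram powers would follow a (strictly positive) chi-squared law.
print(f"mean = {cospec.mean():.3f}, std = {cospec.std():.3f}")
```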
Percolation, statistical topography, and transport in random media
International Nuclear Information System (INIS)
Isichenko, M.B.
1992-01-01
A review of classical percolation theory is presented, with an emphasis on novel applications to statistical topography, turbulent diffusion, and heterogeneous media. Statistical topography involves the geometrical properties of the isosets (contour lines or surfaces) of a random potential ψ(x). For rapidly decaying correlations of ψ, the isopotentials fall into the same universality class as the perimeters of percolation clusters. The topography of long-range correlated potentials involves many length scales and is associated either with the correlated percolation problem or with Mandelbrot's fractional Brownian reliefs. In all cases, the concept of fractal dimension is particularly fruitful in characterizing the geometry of random fields. The physical applications of statistical topography include diffusion in random velocity fields, heat and particle transport in turbulent plasmas, quantum Hall effect, magnetoresistance in inhomogeneous conductors with the classical Hall effect, and many others where random isopotentials are relevant. A geometrical approach to studying transport in random media, which captures essential qualitative features of the described phenomena, is advocated
Statistical analysis of natural disasters and related losses
Pisarenko, VF
2014-01-01
The study of disaster statistics and disaster occurrence is a complicated interdisciplinary field involving the interplay of new theoretical findings from several scientific fields like mathematics, physics, and computer science. Statistical studies on the mode of occurrence of natural disasters largely rely on fundamental findings in the statistics of rare events, which were derived in the 20th century. With regard to natural disasters, it is not so much the fact that the importance of this problem for mankind was recognized during the last third of the 20th century - the myths one encounters in ancient civilizations show that the problem of disasters has always been recognized - rather, it is the fact that mankind now possesses the necessary theoretical and practical tools to effectively study natural disasters, which in turn supports effective, major practical measures to minimize their impact. All the above factors have resulted in considerable progress in natural disaster research. Substantial accrued ma...
Effects of Medicare payment changes on nursing home staffing and deficiencies.
Konetzka, R Tamara; Yi, Deokhee; Norton, Edward C; Kilpatrick, Kerry E
2004-06-01
To investigate the effects of Medicare's Prospective Payment System (PPS) for skilled nursing facilities (SNFs) and associated rate changes on quality of care as represented by staffing ratios and regulatory deficiencies. Online Survey, Certification and Reporting (OSCAR) data from 1996-2000 were linked with Area Resource File (ARF) and Medicare Cost Report data to form a panel dataset. A difference-in-differences model was used to assess effects of the PPS and the BBRA (Balanced Budget Refinement Act) on staffing and deficiencies, a design that allows the separation of the effects of the policies from general trends. Ordinary least squares and negative binomial models were used. The OSCAR and Medicare Cost Report data are self-reported by nursing facilities; ARF data are publicly available. Data were linked by provider ID and county. We find that professional staffing decreased and regulatory deficiencies increased with PPS, and that both effects were mitigated with the BBRA rate increases. The effects appear to increase with the percent of Medicare residents in the facility except, in some cases, at the highest percentage of Medicare. The findings on staffing are statistically significant. The effects on deficiencies, though exhibiting consistent signs and magnitudes with the staffing results, are largely insignificant. Medicare's PPS system and associated rate cuts for SNFs have had a negative effect on staffing and regulatory compliance. Further research is necessary to determine whether these changes are associated with worse outcomes. Findings from this investigation could help guide policy modifications that support the provision of quality nursing home care.
Incorporating Code-Based Software in an Introductory Statistics Course
Doehler, Kirsten; Taylor, Laura
2015-01-01
This article is based on the experiences of two statistics professors who have taught students to write and effectively utilize code-based software in a college-level introductory statistics course. Advantages of using software and code-based software in this context are discussed. Suggestions are made on how to ease students into using code with…
Petocz, Peter; Sowey, Eric
2012-01-01
The term "data snooping" refers to the practice of choosing which statistical analyses to apply to a set of data after having first looked at those data. Data snooping contradicts a fundamental precept of applied statistics, that the scheme of analysis is to be planned in advance. In this column, the authors shall elucidate the…
Directory of Open Access Journals (Sweden)
N. A. Azeez
2017-04-01
Data compression is the process of reducing the size of a file to effectively reduce storage space and communication cost. The evolvement of technology and the digital age has led to an unparalleled usage of digital files in the current decade. The usage of data has resulted in an increase in the amount of data being transmitted via various channels of data communication, which has prompted the need to look into the current lossless data compression algorithms to check their level of effectiveness so as to maximally reduce the bandwidth requirement in communication and transfer of data. Four lossless data compression algorithms were selected for implementation: the Lempel-Ziv-Welch algorithm, the Shannon-Fano algorithm, the Adaptive Huffman algorithm and Run-Length encoding. The choice of these algorithms was based on their similarities, particularly in application areas. Their efficiency and effectiveness were evaluated using a set of predefined performance evaluation metrics, namely compression ratio, compression factor, compression time, saving percentage, entropy and code efficiency. The algorithms were implemented in the NetBeans Integrated Development Environment using Java as the programming language. Through the statistical analysis performed using Boxplot and ANOVA and comparison made on the four algo
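As a rough illustration of the simplest of the four algorithms, a minimal Run-Length encoder together with the compression-ratio metric might look like this (a sketch only; the two-symbols-per-run cost model is a simplifying assumption, not the study's Java implementation):

```python
def rle_encode(text):
    """Run-length encode a string as (character, run_length) pairs."""
    runs = []
    prev, count = None, 0
    for ch in text:
        if ch == prev:
            count += 1
        else:
            if prev is not None:
                runs.append((prev, count))
            prev, count = ch, 1
    if prev is not None:
        runs.append((prev, count))
    return runs

def rle_decode(runs):
    """Invert rle_encode: expand each run back into repeated characters."""
    return "".join(ch * count for ch, count in runs)

def compression_ratio(original, runs):
    """Uncompressed size over compressed size, charging each run two
    symbols (one character, one count) -- a simplifying cost model."""
    return len(original) / (2 * len(runs))

s = "AAAABBBCCDAA"
runs = rle_encode(s)
print(runs)                   # [('A', 4), ('B', 3), ('C', 2), ('D', 1), ('A', 2)]
print(rle_decode(runs) == s)  # True
```

RLE only pays off on runs of repeated symbols, which is why studies like this one compare it against dictionary and entropy coders on the same inputs.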
Directory of Open Access Journals (Sweden)
I. Pavlova
2017-12-01
The function of the immune system of poultry has a significant impact on poultry husbandry sustainability. Therefore, the aim of this study was to investigate the effect of lactic acid bacteria administered with enrofloxacin or doxycycline on expression levels of the antimicrobial peptide cathelicidin-3 (CATH3) at the mRNA level in the duodenum, jejunum and liver of broilers. Day-old Ross (n=24) and Duc (n=24) chickens were included in experiments with enrofloxacin and doxycycline, respectively. They were divided into four groups (n=6) for each experiment: control, supplemented with probiotics (15 days via feed, 5 days after hatching), treated with either enrofloxacin or doxycycline (10 mg.kg-1 for 5 days, via drinking water), and treated with antibiotic and probiotics. Expression levels of CATH3 mRNA in liver, duodenum and jejunum were determined by RT-PCR and were statistically evaluated by the Mann-Whitney test. Administration of probiotics led to insignificant down-regulation of CATH3 mRNA in the investigated tissues. The combination of doxycycline with probiotics led to statistically significant down-regulation of CATH3 mRNA in the duodenum (P<0.01). Statistically significant up-regulation of mRNA of the studied gene was found in the jejunum of enrofloxacin-treated Ross chickens. The data suggest the existence of an interaction between antibiotics and innate immunity. Further evaluation in infected poultry would shed more light on the pharmacodynamics of antibacterials.
de Roon, F.A.; Veld, C.H.
1995-01-01
This study investigates the announcement effects of offerings of convertible bond loans and warrant-bond loans using data for the Dutch market. Using standard event study methodology it is found that on average stock prices show a positive but insignificant abnormal return for the announcement of a
The scientific way of thinking in statistics, statistical physics and quantum mechanics
Săvoiu, Gheorghe
2008-01-01
This paper focuses on the way of thinking in both classical and modern Physics and Statistics, Statistical Mechanics or Statistical Physics and Quantum Mechanics. These different statistical ways of thinking and their specific methods have generated new fields for new activities and new scientific disciplines, like Econophysics (between Economics and Physics), Sociophysics (between Sociology and Physics), Mediaphysics (between all media and communication sciences), etc. After describing some r...
Semiclassical statistical mechanics
International Nuclear Information System (INIS)
Stratt, R.M.
1979-04-01
On the basis of an approach devised by Miller, a formalism is developed which allows the nonperturbative incorporation of quantum effects into equilibrium classical statistical mechanics. The resulting expressions bear a close similarity to classical phase space integrals and, therefore, are easily molded into forms suitable for examining a wide variety of problems. As a demonstration of this, three such problems are briefly considered: the simple harmonic oscillator, the vibrational state distribution of HCl, and the density-independent radial distribution function of ⁴He. A more detailed study is then made of two more general applications involving the statistical mechanics of nonanalytic potentials and of fluids. The former, which is a particularly difficult problem for perturbative schemes, is treated with only limited success by restricting phase space and by adding an effective potential. The problem of fluids, however, is readily found to yield to a semiclassical pairwise interaction approximation, which in turn permits any classical many-body model to be expressed in a convenient form. The remainder of the discussion concentrates on some ramifications of having a phase space version of quantum mechanics. To test the breadth of the formulation, the task of constructing quantal ensemble averages of phase space functions is undertaken, and in the process several limitations of the formalism are revealed. A rather different approach is also pursued. The concept of quantum mechanical ergodicity is examined through the use of numerically evaluated eigenstates of the Barbanis potential, and the existence of this quantal ergodicity - normally associated with classical phase space - is verified. 21 figures, 4 tables
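For the simple harmonic oscillator mentioned above, the quantum and classical canonical partition functions can be set side by side; the semiclassical regime is precisely where the first reduces to the second (these are standard textbook expressions, not taken from the thesis itself):

```latex
Z_{\mathrm{qm}}(\beta) = \sum_{n=0}^{\infty} e^{-\beta\hbar\omega\,(n+1/2)}
  = \frac{1}{2\sinh(\beta\hbar\omega/2)},
\qquad
Z_{\mathrm{cl}}(\beta) = \frac{1}{h}\int \mathrm{d}p\,\mathrm{d}q\;
  e^{-\beta H(p,q)} = \frac{1}{\beta\hbar\omega}.
```

As βħω → 0 (high temperature), sinh(βħω/2) ≈ βħω/2 and the quantum result goes over into the classical one; semiclassical formalisms of the kind described here interpolate between these two limits.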
Statistical analysis of dynamic parameters of the core
International Nuclear Information System (INIS)
Ionov, V.S.
2007-01-01
Transients of various types were investigated for the cores of zero-power critical facilities at RRC KI and at NPPs. Dynamic parameters of the neutron transients were explored with statistical analysis tools. The records have sufficient duration and comprise a few channels for chamber currents and reactivity, as well as some channels for technological parameters. From these values the inverse period, reactivity, neutron lifetime, reactivity coefficients and some reactivity effects were determined, and the measured dynamic parameters were restored as a result of the analysis. The mathematical means of statistical analysis were used: approximation (A), filtration (F), rejection (R), estimation of descriptive statistic parameters (DSP), correlation characteristics (KK), regression analysis (KP), prognosis (P), and statistical criteria (SC). The calculation procedures were implemented in MATLAB. The sources of methodical and statistical errors are presented: inadequacy of the model, precision of neutron-physical parameters, features of the registered processes, the mathematical model used in reactivity meters, the processing technique for registered data, etc. Examples of statistical analysis results are given. Problems of the validity of the methods used for the definition and certification of statistical parameters and dynamic characteristics are considered (Authors)
Critical analysis of adsorption data statistically
Kaushal, Achla; Singh, S. K.
2017-10-01
Experimental data can be presented, computed, and critically analysed in different ways using statistics. A variety of statistical tests are used to make decisions about the significance and validity of experimental data. In the present study, adsorption was carried out to remove zinc ions from contaminated aqueous solution using mango leaf powder. The experimental data were analysed statistically by hypothesis testing, applying the t test, paired t test and Chi-square test to (a) test the optimum value of the process pH, (b) verify the success of the experiment and (c) study the effect of adsorbent dose on zinc ion removal from aqueous solutions. Comparison of calculated and tabulated values of t and χ2 showed the results in favour of the data collected from the experiment, and this has been shown on probability charts. The K value for the Langmuir isotherm was 0.8582 and the m value obtained for the Freundlich adsorption isotherm was 0.725, both for mango leaf powder.
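The hypothesis tests named above (t test, paired t test, Chi-square) are straightforward to reproduce. A hedged sketch using scipy with made-up before/after removal data (illustrative values, not the study's measurements):

```python
from scipy import stats

# Hypothetical zinc-removal percentages for six batches at a low and a
# doubled adsorbent dose -- invented numbers for illustration only.
removal_low_dose = [58.2, 61.5, 59.8, 60.4, 57.9, 62.1]
removal_high_dose = [66.0, 68.3, 65.1, 69.2, 64.8, 67.5]

# Paired t test: does raising the dose change removal for the same batches?
t_stat, p_value = stats.ttest_rel(removal_high_dose, removal_low_dose)
print(p_value < 0.05)  # True: for these invented data the improvement is significant
```

The paired form is the right choice here because each batch serves as its own control, mirroring the before/after comparisons described in the abstract.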
Petocz, Peter; Sowey, Eric
2008-01-01
In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…
Lyons, L.
2016-01-01
Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.
The effect of sauna bathing on lipid profile in young, physically active, male subjects
Directory of Open Access Journals (Sweden)
Dorota Gryka
2014-08-01
Objectives: The aim of the study was to evaluate the effects of Finnish sauna bathing on the lipid profile of healthy young men. Material and Methods: Sixteen male subjects (20–23 years) were subjected to 10 sauna bathing sessions in a Finnish sauna every 1 or 2 days. The mean sauna temperature was 90±2°C, while humidity was 5–16%. Each session consisted of three 15-minute parts and a 2-minute cool-down between them. The following measurements were taken before and after the sauna sessions: body mass, heart rate, body skinfold thickness. The percentage fat content and then the lean body mass were calculated. Total cholesterol, triacylglycerols, and LDL and HDL lipoprotein cholesterol were measured in blood samples. Results: A statistically significant decrease of total cholesterol and LDL cholesterol was observed during 3 weeks of sauna treatment and in the week afterwards. A significant decline in triacylglycerols was found directly after the 1st and 24 h after the 10th sauna session. After the 10th sauna session the level of HDL cholesterol remained slightly increased, but this change was not statistically significant. A decrease in blood plasma volume was found directly after the 1st and the last sauna bathing session due to perspiration. An adaptive increase in blood plasma volume was also found after the series of 10 sauna sessions. Conclusions: Ten complete sauna bathing sessions in a Finnish sauna caused a reduction in total cholesterol and LDL cholesterol levels during the sessions and a gradual return of these levels to the initial level during the 1st and the 2nd week after the experiment. A small, statistically insignificant increase in HDL-C level and a transient decline in triacylglycerols were observed after those sauna sessions. The positive effect of sauna on the lipid profile is similar to the effect that can be obtained through moderate-intensity physical exercise.
International Nuclear Information System (INIS)
Dunne, Lawrence J; Axelsson, Anna-Karin; Alford, Neil McN; Valant, Matjaz; Manos, George
2011-01-01
Despite considerable effort, the microscopic origin of the electrocaloric (EC) effect in ferroelectric relaxors is still intensely discussed. Ferroelectric relaxors typically display a dual-peak EC effect, whose origin is uncertain. Here we present an exact statistical mechanical matrix treatment of a lattice model of polar nanoregions forming in a neutral background and use this approach to study the characteristics of the EC effect in ferroelectric relaxors under varying electric field and pressure. The dual peaks seen in the EC properties of ferroelectric relaxors are due to the formation and ordering of polar nanoregions. The model predicts significant enhancement of the EC temperature rise with pressure, which may contribute to the giant EC effect.
Nonparametric statistics with applications to science and engineering
Kvam, Paul H
2007-01-01
A thorough and definitive book that fully addresses traditional and modern-day topics of nonparametric statistics. This book presents a practical approach to nonparametric statistical analysis and provides comprehensive coverage of both established and newly developed methods. With the use of MATLAB, the authors present information on theorems and rank tests in an applied fashion, with an emphasis on modern methods in regression and curve fitting, bootstrap confidence intervals, splines, wavelets, empirical likelihood, and goodness-of-fit testing. Nonparametric Statistics with Applications to Science and Engineering begins with succinct coverage of basic results for order statistics, methods of categorical data analysis, nonparametric regression, and curve fitting methods. The authors then focus on nonparametric procedures that are becoming more relevant to engineering researchers and practitioners. The important fundamental materials needed to effectively learn and apply the discussed methods are also provided.
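One of the techniques the blurb highlights, the bootstrap confidence interval, can be illustrated with a short sketch. The percentile bootstrap below is a generic illustration; the `bootstrap_ci` helper, the sample data, and the seed are invented for this example and are not code from the book:

```python
import random
import statistics

def bootstrap_ci(data, stat=statistics.median, n_boot=2000, alpha=0.05, seed=42):
    """Percentile bootstrap confidence interval for an arbitrary statistic."""
    rng = random.Random(seed)
    n = len(data)
    # resample with replacement, recompute the statistic, and sort the replicates
    reps = sorted(stat([rng.choice(data) for _ in range(n)]) for _ in range(n_boot))
    lo = reps[int((alpha / 2) * n_boot)]
    hi = reps[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

sample = [4.1, 5.3, 2.2, 6.8, 5.0, 3.9, 4.7, 5.5, 4.4, 6.1]
low, high = bootstrap_ci(sample)
print(f"95% bootstrap CI for the median: ({low:.2f}, {high:.2f})")
```

Because no distributional form is assumed, the same helper works for medians, trimmed means, or any other statistic, which is the nonparametric appeal.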
Alayoubi, Alaadin; Abu-Fayyad, Ahmed; Rawas-Qalaji, Mutasem M; Sylvester, Paul W; Nazzal, Sami
2015-01-01
Recently, there has been a growing interest in vitamin E for its potential use in cancer therapy. The objective of this work was therefore to formulate a physically stable parenteral lipid emulsion to deliver higher doses of vitamin E than commonly used in commercial products. Specifically, the objectives were to study the effects of homogenization pressure, number of homogenizing cycles, viscosity of the oil phase, and oil content on the physical stability of emulsions fortified with high doses of vitamin E (up to 20% by weight). This was done by the use of a 27-run, 4-factor, 3-level Box-Behnken statistical design. Viscosity, homogenization pressure, and number of cycles were found to have a significant effect on particle size, which ranged from 213 to 633 nm, and on the percentage of vitamin E remaining emulsified after storage, which ranged from 17 to 100%. Increasing oil content from 10 to 20% had an insignificant effect on the responses. Based on the results it was concluded that stable vitamin E rich emulsions could be prepared by repeated homogenization at higher pressures and by lowering the viscosity of the oil phase, which could be adjusted by blending the viscous vitamin E with medium-chain triglycerides (MCT).
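The 27-run, 4-factor, 3-level Box-Behnken design mentioned above has a simple combinatorial structure: each pair of factors is run through a 2×2 factorial at the ±1 levels while the remaining factors sit at their mid-level, plus replicated center points. A minimal sketch (the `box_behnken` helper is illustrative; the study presumably used dedicated DOE software):

```python
from itertools import combinations, product

def box_behnken(n_factors, n_center=3):
    """Generate a Box-Behnken design in coded units (-1, 0, +1)."""
    runs = []
    for i, j in combinations(range(n_factors), 2):      # every pair of factors
        for a, b in product((-1, 1), repeat=2):          # 2x2 factorial on the pair
            row = [0] * n_factors                        # other factors at mid-level
            row[i], row[j] = a, b
            runs.append(row)
    runs.extend([[0] * n_factors for _ in range(n_center)])  # center points
    return runs

design = box_behnken(4)   # C(4,2)=6 pairs x 4 corners + 3 centers
print(len(design))        # prints 27, matching the 27-run design in the abstract
```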
Statistical Reasoning Ability, Self-Efficacy, and Value Beliefs in a University Statistics Course
Olani, A.; Hoekstra, R.; Harskamp, E.; van der Werf, G.
2011-01-01
Introduction: The study investigated the degree to which students' statistical reasoning abilities, statistics self-efficacy, and perceived value of statistics improved during a reform-based introductory statistics course. The study also examined whether the changes in these learning outcomes differed with respect to the students' mathematical…
Lectures on algebraic statistics
Drton, Mathias; Sullivant, Seth
2009-01-01
How does an algebraic geometer studying secant varieties further the understanding of hypothesis tests in statistics? Why would a statistician working on factor analysis raise open problems about determinantal varieties? Connections of this type are at the heart of the new field of "algebraic statistics". In this field, mathematicians and statisticians come together to solve statistical inference problems using concepts from algebraic geometry as well as related computational and combinatorial techniques. The goal of these lectures is to introduce newcomers from the different camps to algebraic statistics. The introduction will be centered around the following three observations: many important statistical models correspond to algebraic or semi-algebraic sets of parameters; the geometry of these parameter spaces determines the behaviour of widely used statistical inference procedures; computational algebraic geometry can be used to study parameter spaces and other features of statistical models.
High cumulants of conserved charges and their statistical uncertainties
Li-Zhu, Chen; Ye-Yin, Zhao; Xue, Pan; Zhi-Ming, Li; Yuan-Fang, Wu
2017-10-01
We study the influence of measured high cumulants of conserved charges on their associated statistical uncertainties in relativistic heavy-ion collisions. With a given number of events, the measured cumulants randomly fluctuate with an approximately normal distribution, while the estimated statistical uncertainties are found to be correlated with corresponding values of the obtained cumulants. Generally, with a given number of events, the larger the cumulants we measure, the larger the statistical uncertainties that are estimated. The error-weighted averaged cumulants are dependent on statistics. Despite this effect, it is found that the three-sigma rule of thumb remains applicable when the statistics are above one million events. Supported by NSFC (11405088, 11521064, 11647093), Major State Basic Research Development Program of China (2014CB845402) and Ministry of Science and Technology (MoST) (2016YFE0104800)
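The interplay between a measured cumulant and its estimated statistical uncertainty can be sketched with a bootstrap. The fourth-cumulant formula C4 = m4 − 3·m2² (in terms of central moments) is standard; the Gaussian event sample and the helper names are invented for illustration and are not the paper's data or code:

```python
import random

def central_moment(xs, k):
    m = sum(xs) / len(xs)
    return sum((x - m) ** k for x in xs) / len(xs)

def c4(xs):
    """Fourth cumulant: C4 = m4 - 3*m2^2 (central moments)."""
    return central_moment(xs, 4) - 3 * central_moment(xs, 2) ** 2

def bootstrap_error(xs, stat, n_boot=500, seed=1):
    """Statistical uncertainty of `stat` estimated by resampling the events."""
    rng = random.Random(seed)
    n = len(xs)
    reps = [stat([rng.choice(xs) for _ in range(n)]) for _ in range(n_boot)]
    mean = sum(reps) / n_boot
    return (sum((r - mean) ** 2 for r in reps) / n_boot) ** 0.5

rng = random.Random(0)
events = [rng.gauss(0.0, 1.0) for _ in range(5000)]   # stand-in for net-charge samples
value, err = c4(events), bootstrap_error(events, c4)
print(f"C4 = {value:.3f} +/- {err:.3f}")
```

For a Gaussian sample the true C4 is zero, so the measured value should be comparable to the bootstrap error, illustrating how the estimated uncertainty tracks the fluctuating cumulant.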
Rationale for statistical characteristics of road safety parameters
Directory of Open Access Journals (Sweden)
Dormidontova Tatiana
2017-01-01
When making engineering decisions at the stage of designing roads and man-made structures, it is necessary to take into account the statistical variability of the physical and mechanical characteristics of the materials used, as well as the different effects on the structures. The rationale for the statistical characteristics of the parameters that determine the reliability of roads and man-made engineering facilities is therefore of particular importance. Many factors must be considered when designing roads, such as natural climatic factors, the accidental effects of operating loads, the strength and deformation characteristics of the materials, the geometric parameters of the structure, etc., all of which affect the strength characteristics of roads and man-made structures. The rationale for the statistical characteristics of these parameters can help an engineer assess the reliability of a decision and the economic risk, as well as avoid mistakes in the design of roads and man-made structures. However, some statistical characteristics of the parameters that define the reliability of a road and man-made structures play a key role in the design. These are the daytime visibility distance for the peak curve, the variation coefficient of radial acceleration, the reliability of the visibility distance, and other parameters.
Koo, Bryan Bonsuk
Electricity generation from non-hydro renewable sources has increased rapidly in the last decade. For example, Renewable Energy Sources for Electricity (RES-E) generating capacity in the U.S. almost doubled in the three years from 2009 to 2012. Multiple papers point out that RES-E policies implemented by state governments play a crucial role in increasing RES-E generation or capacity. This study examines the effects of state RES-E policies on state RES-E generating capacity, using a fixed effects model. The research employs panel data from the 50 states and the District of Columbia for the period 1990 to 2011, and uses a two-stage approach to control for the endogeneity embedded in the policies adopted by state governments, and a Prais-Winsten estimator to correct for autocorrelation in the panel data. The analysis finds that Renewable Portfolio Standards (RPS) and Net-metering are significantly and positively associated with RES-E generating capacity, but neither Public Benefit Funds nor the Mandatory Green Power Option has a statistically significant relation to RES-E generating capacity. Results of the two-stage model are quite different from models which do not employ predicted policy variables. Analysis using non-predicted variables finds that RPS and Net-metering policy are statistically insignificant and negatively associated with RES-E generating capacity. On the other hand, Green Energy Purchasing policy is insignificant in the two-stage model, but significant in the model without predicted values.
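The fixed effects estimator used in panel studies like this one can be sketched via the within transformation: demeaning each state's series absorbs the unobserved, time-invariant state effect before regressing capacity on policy. The simulated panel below is purely illustrative (the true policy effect is set to 2.0; the variable names are invented), not a reproduction of the study's model:

```python
import random

rng = random.Random(0)
n_states, n_years, beta = 50, 20, 2.0

x_demeaned, y_demeaned = [], []
for _ in range(n_states):
    eff = rng.gauss(0, 5)                        # unobserved state fixed effect
    policy = [rng.random() for _ in range(n_years)]
    capacity = [eff + beta * p + rng.gauss(0, 0.1) for p in policy]
    pm, cm = sum(policy) / n_years, sum(capacity) / n_years
    x_demeaned += [p - pm for p in policy]       # within transformation:
    y_demeaned += [c - cm for c in capacity]     # demean by state

# pooled OLS on the demeaned data recovers beta despite the state effects
beta_hat = sum(x * y for x, y in zip(x_demeaned, y_demeaned)) / \
           sum(x * x for x in x_demeaned)
print(f"estimated policy effect: {beta_hat:.3f}")   # close to the true 2.0
```

A naive pooled regression on the raw data would be biased whenever the fixed effects correlate with policy adoption; the within transformation removes that channel, which is the motivation for the fixed effects model named in the abstract.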
Kim, E.; Newton, A. P.
2012-04-01
One major problem in dynamo theory is the multi-scale nature of the MHD turbulence, which requires statistical theory in terms of probability distribution functions. In this contribution, we present the statistical theory of magnetic fields in a simplified mean field α-Ω dynamo model by varying the statistical property of alpha, including marginal stability and intermittency, and then utilize observational data of solar activity to fine-tune the mean field dynamo model. Specifically, we first present a comprehensive investigation into the effect of the stochastic parameters in a simplified α-Ω dynamo model. Through considering the manifold of marginal stability (the region of parameter space where the mean growth rate is zero), we show that stochastic fluctuations are conducive to dynamo action. Furthermore, by considering the cases of fluctuating alpha that are periodic and Gaussian coloured random noise with identical characteristic time-scales and fluctuating amplitudes, we show that the transition to dynamo is significantly facilitated for stochastic alpha with random noise. Furthermore, we show that probability density functions (PDFs) of the growth rate, magnetic field and magnetic energy can provide a wealth of useful information regarding the dynamo behaviour/intermittency. Finally, the precise statistical properties of the dynamo, such as temporal correlation and fluctuating amplitude, are found to be dependent on the distribution of the fluctuations of the stochastic parameters. We then use observations of solar activity to constrain parameters relating to the effect in stochastic α-Ω nonlinear dynamo models. This is achieved by performing a comprehensive statistical comparison, computing PDFs of solar activity from observations and from our simulation of the mean field dynamo model. The observational data used are the time history of solar activity inferred from 14C data over the past 11,000 years on a long time scale and direct observations of the sun spot
Jansma, J Martijn; de Zwart, Jacco A; van Gelderen, Peter; Duyn, Jeff H; Drevets, Wayne C; Furey, Maura L
2013-05-15
Technical developments in MRI have improved the signal-to-noise ratio, allowing the use of analysis methods such as finite impulse response (FIR) modeling of rapid event-related functional MRI (er-fMRI). FIR is one of the most informative analysis methods, as it determines the onset and full shape of the hemodynamic response function (HRF) without any a priori assumptions. FIR is however vulnerable to multicollinearity, which is directly related to the distribution of stimuli over time. Efficiency can be optimized by simplifying a design and restricting the stimulus distribution to specific sequences, while more design flexibility necessarily reduces efficiency. However, the actual effect of efficiency on fMRI results has never been tested in vivo. Thus, it is currently difficult to make an informed choice between protocol flexibility and statistical efficiency. The main goal of this study was to assign concrete fMRI signal-to-noise values to the abstract scale of FIR statistical efficiency. Ten subjects repeated a perception task with five random and m-sequence-based protocols, with varying but, according to the literature, acceptable levels of multicollinearity. Results indicated substantial differences in signal standard deviation, whose level was a function of multicollinearity. Experiment protocols varied by up to 55.4% in standard deviation. Results confirm that the quality of fMRI in an FIR analysis can significantly and substantially vary with statistical efficiency. Our in vivo measurements can be used to aid in making an informed decision between freedom in protocol design and statistical efficiency. Published by Elsevier B.V.
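The link between stimulus timing and FIR multicollinearity can be sketched by building an FIR design matrix, whose columns are lagged copies of the stimulus vector, and checking how correlated those regressors are. This is a toy diagnostic (maximum pairwise correlation rather than the efficiency measures used in the fMRI literature), with invented onsets and helper names:

```python
import random

def fir_design(onsets, n_scans, n_lags):
    """FIR design matrix: column k is the stimulus vector shifted by k scans."""
    s = [1 if t in onsets else 0 for t in range(n_scans)]
    return [[s[t - k] if t >= k else 0 for k in range(n_lags)] for t in range(n_scans)]

def max_pairwise_corr(X):
    """Largest |correlation| between regressors -- a simple collinearity index."""
    n, p = len(X), len(X[0])
    cols = [[row[j] for row in X] for j in range(p)]
    def corr(a, b):
        ma, mb = sum(a) / n, sum(b) / n
        cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        va = sum((x - ma) ** 2 for x in a) ** 0.5
        vb = sum((y - mb) ** 2 for y in b) ** 0.5
        return cov / (va * vb)
    return max(abs(corr(cols[i], cols[j])) for i in range(p) for j in range(i + 1, p))

rng = random.Random(7)
onsets = set(rng.sample(range(200), 40))   # densely packed random events
X = fir_design(onsets, 200, 8)             # 8 HRF lags to estimate
print(f"max inter-regressor |r|: {max_pairwise_corr(X):.2f}")
```

Denser or more regular stimulus trains push this index up, which is exactly the efficiency/multicollinearity trade-off the abstract quantifies in vivo.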
Intuitive introductory statistics
Wolfe, Douglas A
2017-01-01
This textbook is designed to give an engaging introduction to statistics and the art of data analysis. The unique scope includes, but also goes beyond, classical methodology associated with the normal distribution. What if the normal model is not valid for a particular data set? This cutting-edge approach provides the alternatives. It is an introduction to the world and possibilities of statistics that uses exercises, computer analyses, and simulations throughout the core lessons. These elementary statistical methods are intuitive. Counting and ranking features prominently in the text. Nonparametric methods, for instance, are often based on counts and ranks and are very easy to integrate into an introductory course. The ease of computation with advanced calculators and statistical software, both of which factor into this text, allows important techniques to be introduced earlier in the study of statistics. This book's novel scope also includes measuring symmetry with Walsh averages, finding a nonp...
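The Walsh averages mentioned in the blurb are easy to compute: they are all pairwise means (x_i + x_j)/2 with i ≤ j, and their median is the Hodges-Lehmann location estimate, a robust rank-based alternative to the sample mean. A minimal sketch with invented data (not an example from the textbook):

```python
from statistics import median

def walsh_averages(xs):
    """All pairwise means (x_i + x_j) / 2 with i <= j: n*(n+1)/2 values."""
    return [(xs[i] + xs[j]) / 2 for i in range(len(xs)) for j in range(i, len(xs))]

def hodges_lehmann(xs):
    """One-sample Hodges-Lehmann location estimate: median of the Walsh averages."""
    return median(walsh_averages(xs))

data = [1.8, 3.2, 2.7, 2.1, 9.6, 2.4]   # one outlier at 9.6
print(hodges_lehmann(data))             # robust center near 2.65; the mean is ~3.6
```

Because the estimate is a median over counts of pairs, a single outlier barely moves it, which is why counting-and-ranking methods fit so naturally into an introductory course.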
Exclusion Statistics in Conformal Field Theory Spectra
International Nuclear Information System (INIS)
Schoutens, K.
1997-01-01
We propose a new method for investigating the exclusion statistics of quasiparticles in conformal field theory (CFT) spectra. The method leads to one-particle distribution functions, which generalize the Fermi-Dirac distribution. For the simplest SU(n)-invariant CFTs we find a generalization of Gentile parafermions, and we obtain new distributions for the simplest Z_N-invariant CFTs. In special examples, our approach reproduces distributions based on 'fractional exclusion statistics' in the sense of Haldane. We comment on applications to fractional quantum Hall effect edge theories. copyright 1997 The American Physical Society
Business statistics for dummies
Anderson, Alan
2013-01-01
Score higher in your business statistics course? Easy. Business statistics is a common course for business majors and MBA candidates. It examines common data sets and the proper way to use such information when conducting research and producing informational reports such as profit and loss statements, customer satisfaction surveys, and peer comparisons. Business Statistics For Dummies tracks to a typical business statistics course offered at the undergraduate and graduate levels and provides clear, practical explanations of business statistical ideas, techniques, formulas, and calculations, w
Energy Technology Data Exchange (ETDEWEB)
Yoshimoto, H. [Ship Research Inst., Tokyo (Japan)]
1996-12-31
Since ocean waves encountered by ocean vessels or offshore structures in actual sea areas present extremely irregular variations, a stochastic method is necessary to estimate their statistical properties. This paper first shows a calculation method for the probability density function of water level variation which strictly incorporates a second-order non-linear effect containing directional dispersibility, by modeling ocean waves as short-crested irregular waves. Then, the paper specifically elucidates the effects of the directional dispersibility of ocean waves on the statistics of amplitudes, by deriving the statistics of the amplitudes from the probability density function of the water level variation and by using a numerical simulation. The paper finally takes up wave data observed in stormy seas during an experiment in an actual sea area, compares the result with that of theoretical calculations, and evaluates the validity of this method. With this estimation method, individual second-order components, or components of difference and sum, may be subject to the influence of the directional dispersibility, but they do not differ much from the case of long-crested irregular waves on the whole. 21 refs., 11 figs., 2 tabs.
Directory of Open Access Journals (Sweden)
Mirjam Nielen
2017-01-01
Always wondered why research papers often present rather complicated statistical analyses? Or wondered how to properly analyse the results of a pragmatic trial from your own practice? This talk will give an overview of basic statistical principles and focus on the why of statistics, rather than on the how. This is a podcast of Mirjam's talk at the Veterinary Evidence Today conference, Edinburgh, November 2, 2016.
Sellaoui, Lotfi; Mechi, Nesrine; Lima, Éder Cláudio; Dotto, Guilherme Luiz; Ben Lamine, Abdelmottaleb
2017-10-01
Based on elements of statistical physics, the equilibrium adsorption of diclofenac (DFC) and nimesulide (NM) on activated carbon was analyzed by a multilayer model with saturation. The paper aimed to describe the adsorption process experimentally and theoretically and to study the effect of adsorbate size using the model parameters. From numerical simulation, the number of molecules per site showed that the adsorbate molecules (DFC and NM) were mostly anchored on both sides of the pore walls. The increase in receptor site density suggested that additional sites appeared during the process to participate in DFC and NM adsorption. The description of the adsorption energy behavior indicated that the process was physisorption. Finally, by correlating the model parameters, the size effect of the adsorbate was deduced, indicating that the molecule dimension has a negligible effect on DFC and NM adsorption.
Directory of Open Access Journals (Sweden)
C A Abdul Shahariyar
2016-01-01
Objective: The aim of the current study was to determine the effects of casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) complex and chlorhexidine-fluoride mouthwash on the shear bond strengths (SBSs) of orthodontic brackets. Materials and Methods: Sixty extracted healthy human premolar teeth with intact buccal enamel were divided into two equal groups, to which brackets were bonded using self-etching primers (SEPs) and conventional means, respectively. These were further equally divided into three subgroups: (1) control, (2) CPP-ACP, and (3) chlorhexidine-fluoride mouthwash. The SBSs were then measured using a universal testing machine. Results: The SBS of the conventional group was significantly higher than that of the self-etching group. The intragroup differences were statistically insignificant. Conclusion: CPP-ACP and chlorhexidine-fluoride mouthwash did not adversely affect the SBS of orthodontic brackets, irrespective of the method of conditioning. Brackets bonded with the conventional technique showed greater bond strengths than those bonded with SEPs.
Excel 2013 for physical sciences statistics a guide to solving practical problems
Quirk, Thomas J; Horton, Howard F
2016-01-01
This book shows the capabilities of Microsoft Excel in teaching physical sciences statistics effectively. Similar to the previously published Excel 2010 for Physical Sciences Statistics, this book is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical science problems. If understanding statistics isn’t your strongest suit, you are not especially mathematically-inclined, or if you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in science courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2013 for Physical Sciences Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply Excel to statistical techniques necessary in their ...
Excel 2013 for social sciences statistics a guide to solving practical problems
Quirk, Thomas J
2015-01-01
This is the first book to show the capabilities of Microsoft Excel to teach social science statistics effectively. It is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical social science problems. If understanding statistics isn’t your strongest suit, you are not especially mathematically-inclined, or if you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in social science courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2013 for Social Science Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply Excel to statistical techniques necessary in their courses and work. Each chapter explains statistical formul...
Excel 2016 for social science statistics a guide to solving practical problems
Quirk, Thomas J
2016-01-01
This book shows the capabilities of Microsoft Excel in teaching social science statistics effectively. Similar to the previously published Excel 2013 for Social Sciences Statistics, this book is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical social science problems. If understanding statistics isn’t your strongest suit, you are not especially mathematically-inclined, or if you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in social science courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2016 for Social Science Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply Excel to statistical techniques necessary in ...
Gorobets, Yu I; Gorobets, O Yu
2015-01-01
A statistical model is proposed in this paper for describing the orientation of trajectories of unicellular diamagnetic organisms in a magnetic field. A statistical parameter, the effective energy, is calculated on the basis of this model. The resulting effective energy is a statistical characteristic of the trajectories of diamagnetic microorganisms in a magnetic field connected with their metabolism. The statistical model is applicable to the case when the energy of the thermal motion of the bacteria is negligible in comparison with their energy in a magnetic field and the bacteria manifest significant "active random movement", i.e. there is randomizing motion of the bacteria of a non-thermal nature, for example, movement of bacteria by means of flagella. The energy of the randomizing active self-motion of the bacteria is characterized by a new statistical parameter for biological objects, which replaces the energy of randomizing thermal motion in the calculation of the statistical distribution. Copyright © 2014 Elsevier Ltd. All rights reserved.
Adarosy, H A; Gad, Y Z; El-Baz, S A; El-Shazly, A M
2013-04-01
Fascioliasis is an important food- and water-borne parasitic zoonosis caused by liver flukes of the genus Fasciola (Digenea: Fasciolidae), of worldwide distribution. In Egypt, fascioliasis has been encountered in nearly all governorates, particularly in the Nile Delta and specifically in Dakahlia. All enrolled cases were subjected to complete history taking, clinical examination, routine investigations and abdominal ultrasonography. Stool analysis, IHA and ELISA were used for the diagnosis of fascioliasis. Rural areas showed a higher prevalence of fascioliasis than urban areas, though without statistical significance (χ2 = 0.042, P = 0.837). Regarding human fascioliasis among the examined centers, no statistically significant difference was detected (χ2 = 2.824, P = 0.243). Regarding gender, the difference was statistically insignificant (χ2 = 0.166, P = 0.683), as was the difference between age groups (χ2 = 3.882, P = 0.274). Clinically, 7 cases (35%) were asymptomatic and the other 13 cases (65%) had various clinical pictures. Abdominal pain, anemia, eosinophilia, and tender hepatomegaly were seen in 70%, 80%, 70%, and 10% of cases, respectively. Of them, 11 cases showed positive abdominal ultrasonographic findings suggestive of fascioliasis.
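The reported chi-square statistics and P-values can be checked against closed-form chi-square survival functions for small degrees of freedom. The df choices below are assumptions inferred from the number of comparison groups (the abstract does not state them), and small rounding differences from the reported P-values are expected:

```python
import math

def chi2_sf(x, df):
    """Chi-square survival function (P-value) for df = 1, 2, 3."""
    if df == 1:
        return math.erfc(math.sqrt(x / 2))
    if df == 2:
        return math.exp(-x / 2)
    if df == 3:
        return math.erfc(math.sqrt(x / 2)) + math.sqrt(2 * x / math.pi) * math.exp(-x / 2)
    raise ValueError("df not supported in this sketch")

# Test statistics reported in the abstract:
print(round(chi2_sf(0.042, 1), 3))   # rural vs urban; abstract reports P = 0.837
print(round(chi2_sf(2.824, 2), 3))   # three centers;  abstract reports P = 0.243
print(round(chi2_sf(3.882, 3), 3))   # four age groups; abstract reports P = 0.274
```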
Boslaugh, Sarah
2013-01-01
Need to learn statistics for your job? Want help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference for anyone new to the subject. Thoroughly revised and expanded, this edition helps you gain a solid understanding of statistics without the numbing complexity of many college texts. Each chapter presents easy-to-follow descriptions, along with graphics, formulas, solved examples, and hands-on exercises. If you want to perform common statistical analyses and learn a wide range of techniques without getting in over your head, this is your book.
Statistics & probability for dummies
Rumsey, Deborah J
2013-01-01
Two complete eBooks for one low price! Created and compiled by the publisher, this Statistics I and Statistics II bundle brings together two math titles in one, e-only bundle. With this special bundle, you'll get the complete text of the following two titles: Statistics For Dummies, 2nd Edition Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tra
International Nuclear Information System (INIS)
Picard, R.R.
1987-01-01
Verification of an inventory or of a reported material unaccounted for (MUF) calls for the remeasurement of a sample of items by an inspector, followed by comparison of the inspector's data to the facility's reported values. Such comparison is intended to protect against falsification of accounting data that could conceal material loss. In the international arena, the observed discrepancies between the inspector's data and the reported data are quantified using the D statistic. If data have been falsified by the facility, the standard deviations of the D and MUF-D statistics are inflated owing to the sampling distribution. Moreover, under certain conditions the distributions of those statistics can depart markedly from normality, complicating evaluation of an inspection plan's performance. Detection probabilities estimated using standard deviations appropriate for the no-falsification case in conjunction with assumed normality can be far too optimistic. Under very general conditions regarding the facility's and/or the inspector's measurement error procedures and the inspector's sampling regime, the variance of the MUF-D statistic can be broken into three components. The inspection's sensitivity against various falsification scenarios can be traced to one or more of these components. Obvious implications exist for the planning of effective inspections, particularly in the area of resource optimization.
The power and statistical behaviour of allele-sharing statistics when ...
Indian Academy of Sciences (India)
, using seven statistics, of which five are implemented in the computer program SimWalk2, and two are implemented in GENEHUNTER. Unlike most previous reports which involve evaluations of the power of allele-sharing statistics for a single ...
Fatigue of graphite/epoxy [0/90/45/-45]s laminates under dual stress levels
Yang, J. N.; Jones, D. L.
1982-01-01
A model for the prediction of loading sequence effects on the statistical distribution of fatigue life and residual strength in composite materials is generalized and applied to (0/90/45/-45)s graphite/epoxy laminates. Load sequence effects are found to be caused by both the difference in residual strength when failure occurs (boundary effect) and the effect of previously applied loads (memory effect). The model allows the isolation of these two effects, and the estimation of memory effect magnitudes under dual fatigue loading levels. It is shown that the material memory effect is insignificant, and that correlations between predictions of the number of early failures agree with the verification tests, as do predictions of fatigue life and residual strength degradation under dual stress levels.
Asymmetric beams and CMB statistical anisotropy
International Nuclear Information System (INIS)
Hanson, Duncan; Lewis, Antony; Challinor, Anthony
2010-01-01
Beam asymmetries result in statistically anisotropic cosmic microwave background (CMB) maps. Typically, they are studied for their effects on the CMB power spectrum; however, they more closely mimic anisotropic effects such as gravitational lensing and primordial power asymmetry. We discuss tools for studying the effects of beam asymmetry on general quadratic estimators of anisotropy, analytically for full-sky observations as well as in the analysis of realistic data. We demonstrate this methodology in application to a recently detected 9σ quadrupolar modulation effect in the WMAP data, showing that beams provide a complete and sufficient explanation for the anomaly.
Statistical Physics An Introduction
Yoshioka, Daijiro
2007-01-01
This book provides a comprehensive presentation of the basics of statistical physics. The first part explains the essence of statistical physics and how it provides a bridge between microscopic and macroscopic phenomena, allowing one to derive quantities such as entropy. Here the author avoids going into details such as Liouville’s theorem or the ergodic theorem, which are difficult for beginners and unnecessary for the actual application of statistical mechanics. In the second part, statistical mechanics is applied to various systems which, although they look different, share the same mathematical structure. In this way readers can deepen their understanding of statistical physics. The book also features applications to quantum dynamics, thermodynamics, the Ising model and the statistical dynamics of free spins.
A Simplified Algorithm for Statistical Investigation of Damage Spreading
International Nuclear Information System (INIS)
Gecow, Andrzej
2009-01-01
On the way to simulating the adaptive evolution of a complex system describing a living object or a human-developed project, a fitness should be defined on node states or network external outputs. Feedbacks lead to circular attractors of these states or outputs, which make it difficult to define a fitness. The main statistical effects of the adaptive condition result from the tendency toward small change, and to appear they only need a statistically correct size of the damage initiated by an evolutionary change of the system. This observation allows us to cut the feedback loops and, in effect, to obtain a particular statistically correct state instead of a long circular attractor, which in the quenched model is expected for a chaotic network with feedback. Defining fitness on such states is simple. We calculate only the damaged nodes, and only once. Such an algorithm is optimal for the investigation of damage spreading, i.e. the statistical connection of the structural parameters of an initial change with the size of the resulting damage. It is a reversed-annealed method: functions and states (signals) may be randomly substituted, but connections are important and are preserved. The small damages important for adaptive evolution are correctly depicted, in contrast to the Derrida annealed approximation, which expects equilibrium levels for large networks. The algorithm indicates these levels correctly. The relevant program in Pascal, which executes the algorithm for a wide range of parameters, can be obtained from the author.
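The core idea, propagating a single initiating change through fixed connections while visiting each node at most once so that feedback loops cannot trap the calculation in a circular attractor, can be sketched as follows. This toy version (random wiring, with a 1/2 flip probability standing in for randomly substituted Boolean functions) illustrates the reversed-annealed idea; it is not the author's Pascal program:

```python
import random

def damage_size(n_nodes, k, seed=0):
    """Propagate damage from one perturbed node through a fixed random wiring,
    counting each damaged node only once (feedback loops are effectively cut)."""
    rng = random.Random(seed)
    # fixed connections: each node sends its output to k randomly chosen nodes
    targets = {v: [rng.randrange(n_nodes) for _ in range(k)] for v in range(n_nodes)}
    damaged = {0}            # the initiating evolutionary change, at node 0
    frontier = [0]
    while frontier:
        v = frontier.pop()
        for w in targets[v]:
            # a damaged input flips the (randomly substituted) target function
            # with probability 1/2; a node, once damaged, is never revisited
            if w not in damaged and rng.random() < 0.5:
                damaged.add(w)
                frontier.append(w)
    return len(damaged)

print(damage_size(1000, 2))   # damage size for one random network realization
```

Repeating this over many seeds gives the statistical connection between structural parameters (here n_nodes and k) and damage size that the abstract describes.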
The Foundations of Statistics
Savage, Leonard J
1972-01-01
Classic analysis of the foundations of statistics and development of personal probability, one of the greatest controversies in modern statistical thought. Revised edition. Calculus, probability, statistics, and Boolean algebra are recommended.
Attitudes toward statistics in medical postgraduates: measuring, evaluating and monitoring.
Zhang, Yuhai; Shang, Lei; Wang, Rui; Zhao, Qinbo; Li, Chanjuan; Xu, Yongyong; Su, Haixia
2012-11-23
In medical training, statistics is considered a very difficult course to learn and teach. Current studies have found that students' attitudes toward statistics can influence their learning process. Measuring, evaluating and monitoring changes in students' attitudes toward statistics are therefore important. Few studies have focused on the attitudes of postgraduates, especially medical postgraduates. Our purpose was to understand the current attitudes toward statistics held by medical postgraduates and explore their effects on students' achievement. We also wanted to explore the factors influencing these attitudes and their sources, and to monitor their changes after a systematic statistics course. A total of 539 medical postgraduates enrolled in a systematic statistics course completed the pre-form of the Survey of Attitudes Toward Statistics-28 scale, and 83 postgraduates were selected randomly from among them to complete the post-form scale after the course. Most medical postgraduates held positive attitudes toward statistics, but they thought statistics was a very difficult subject. The attitudes mainly stemmed from experiences in previous statistics or mathematics classes. Age, level of statistical education, research experience, specialty and mathematical background may influence postgraduate attitudes toward statistics. There were significant positive correlations between course achievement and attitudes toward statistics. In general, student attitudes showed negative changes after completing the statistics course. The importance of student attitudes toward statistics must be recognized in medical postgraduate training. To make sure all students have a positive learning environment, statistics teachers should measure their students' attitudes and monitor changes in them during the course. Necessary assistance should be offered to those students who develop negative attitudes.
Second Language Experience Facilitates Statistical Learning of Novel Linguistic Materials.
Potter, Christine E; Wang, Tianlin; Saffran, Jenny R
2017-04-01
Recent research has begun to explore individual differences in statistical learning, and how those differences may be related to other cognitive abilities, particularly their effects on language learning. In this research, we explored a different type of relationship between language learning and statistical learning: the possibility that learning a new language may also influence statistical learning by changing the regularities to which learners are sensitive. We tested two groups of participants, Mandarin Learners and Naïve Controls, at two time points, 6 months apart. At each time point, participants performed two different statistical learning tasks: an artificial tonal language statistical learning task and a visual statistical learning task. Only the Mandarin-learning group showed significant improvement on the linguistic task, whereas both groups improved equally on the visual task. These results support the view that there are multiple influences on statistical learning. Domain-relevant experiences may affect the regularities that learners can discover when presented with novel stimuli. Copyright © 2016 Cognitive Science Society, Inc.
The Effect of Selected Factors on the Growth Ability of Charolais Cattle
Directory of Open Access Journals (Sweden)
Renáta Toušová
2014-01-01
The aim of this work was to analyze the growth abilities of bull and heifer calves (n = 190) of the Charolais breed calved in one herd from 2006 to 2011. The evaluation was carried out during the period of calf raising, focusing on the effects of sex, parity and mating method on live birth weight (BLW), the live weight at 120 (LW120), 210 (LW210) and 365 (LW365) days of life, and the average daily weight gain reached by the age of 120 (G120), 210 (G210) and 365 (G365) days. Statistical analysis was carried out using the SAS 9.3 program. The effect of sex on live weight and average daily weight gain was statistically significant at the level of P < 0.01, always in favour of the bull calves (BLW +3.05 kg, LW120 +29.35 kg, LW210 +36.98 kg, and LW365 +117.23 kg). The lowest live birth weight was detected in the calves of primiparous cows (BLW = 45.46 kg; P < 0.01). The higher the parity, the higher the live birth weight, with maximum values in cows from the fourth calving. The trends were similar for live weight at 120, 210 and 365 days of age, at a significance level of P < 0.05. The parity effect on the average daily weight gain (G120, G210 and G365) was statistically insignificant (P > 0.05). The effect of embryo transfer (ET), compared with the commonly used reproduction methods (insemination and natural mating, AI/PP), was also evaluated. The calves born after embryo transfer showed a significantly (P < 0.01) higher live birth weight as well as higher weights at the other ages (BLW +5.85 kg, LW120 +18.15 kg, LW210 +22.94 kg and LW365 +35.43 kg), and for G120 only (+100 g·day−1; P < 0.05). These results indicated the suitability of using biotechnological reproduction methods, especially in relation to the total weight of the reared and fattened animals.
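The sex-effect comparisons reported above (bull calves significantly heavier at P < 0.01) rest on a standard two-sample comparison of group means. A minimal sketch of such a test statistic in Python, with made-up birth weights roughly matching the +3 kg difference from the abstract; the real data and the SAS 9.3 models are not reproduced here:

```python
import math

def welch_t(a, b):
    """Welch's two-sample t statistic for comparing two group means."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    return (ma - mb) / math.sqrt(va / na + vb / nb)

# hypothetical birth weights (kg); bulls about 3 kg heavier, as in the abstract
bulls   = [48.1, 49.5, 47.2, 50.3, 48.8, 49.0]
heifers = [45.0, 46.2, 44.8, 46.9, 45.5, 45.9]
t = welch_t(bulls, heifers)  # a large positive t suggests bulls are heavier
```

The t statistic would then be compared against the appropriate t distribution to obtain the P-value quoted in the abstract.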
Helping Raise the Official Statistics Capability of Government Employees
Directory of Open Access Journals (Sweden)
Forbes Sharleen
2016-12-01
Both the production and the use of official statistics are important in the business of government. In New Zealand, concern persists about many government advisors’ low level of statistical capability. One programme designed specifically to enhance capability is New Zealand’s National Certificate of Official Statistics, first introduced in 2007 and originally targeted at government policy analysts and advisors. It now includes participants from many agencies, including the National Statistics Office. The competency-based 40-credit certificate comprises four taught units that aim to give students skills in basic official statistics and in critically evaluating statistical, research, policy, or media publications for their quality (of data, survey design, analysis, and conclusions and appropriateness for some policy issue (e.g., how to reduce problem gambling, together with an ‘umbrella’ workplace-based statistics project. Case studies are used to embed the statistics learning into the real-world context of these students. Several surveys of students and their managers were undertaken to evaluate the effectiveness of the certificate in terms of enhancing skill levels and meeting organisational needs and also to examine barriers to completion of the certificate. The results were used to both modify the programme and extend its international applicability.
State Transportation Statistics 2012
2013-08-15
The Bureau of Transportation Statistics (BTS), a part of the U.S. Department of Transportation's (USDOT) Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2012, a statistical profile of transportation ...
State Transportation Statistics 2013
2014-09-19
The Bureau of Transportation Statistics (BTS), a part of the U.S. Department of Transportation's (USDOT) Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2013, a statistical profile of transportatio...
BTS statistical standards manual
2005-10-01
The Bureau of Transportation Statistics (BTS), like other federal statistical agencies, establishes professional standards to guide the methods and procedures for the collection, processing, storage, and presentation of statistical data. Standards an...
Effects of labour migration on economic development during economic downturn and recovery
Directory of Open Access Journals (Sweden)
Milan Palát
2012-01-01
International labour migration is mainly driven by economic interests. This paper focuses on the period before and after the economic crisis, brings together important facts regarding the motivation for labour migration, and explains its causes and impacts at the macroeconomic level. The economic explanation of why migration is so severely restricted is that migration policies are essentially distributive tools aimed at reducing the negative effects of migration on wages and unemployment among natives; moreover, we may stress the gradualist tendencies of migration, and such migration restrictions can mitigate supply-side shocks that may negatively affect the incomes or jobs of some specific groups. A partial objective of the practical part of the paper is to evaluate the relationships between the rate of migration and selected economic indicators using adequate quantitative methods. While the correlation between the crude rate of net migration and GDP per capita is very low, a correlation between the crude rate of net migration and the unemployment rate is evident in most of the analysed countries. The statistical insignificance of the correlation indices in some countries can then be attributed to structural problems of those economies.
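The reported link between net migration and unemployment is a simple bivariate correlation. A sketch of the Pearson coefficient in Python, using hypothetical yearly series (the study's actual country data are not reproduced here):

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# hypothetical yearly series: crude net migration rate vs. unemployment rate;
# net migration falls as unemployment rises, so r should be strongly negative
migration    = [4.1, 3.8, 2.9, 1.2, 0.8, 1.5, 2.0]
unemployment = [5.0, 5.3, 6.8, 9.1, 9.8, 8.9, 8.1]
r = pearson(migration, unemployment)
```

With only a handful of yearly observations per country, even a sizeable |r| can fail a significance test, which is consistent with the abstract's remark about statistically insignificant correlation indices in some countries.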
Export and Economic Growth in the West Balkan Countries
Directory of Open Access Journals (Sweden)
Florentina Xhelili Krasniqi
2017-09-01
The aim of this paper is to explore the effects of exports and other variables (foreign direct investment, remittances, capital formation, and labour force) on economic growth in the West Balkan countries (Albania, Kosovo, Macedonia, Montenegro, Bosnia and Herzegovina, and Serbia). This study utilizes a strongly balanced panel dataset over the 2005-2015 period for the Western Balkan countries, using the ordinary least squares (OLS) method, i.e., a pooled regression model, to estimate the parameters. The relationship between exports and economic growth turned out to be statistically significant and positive for the countries under study. Results also indicate a statistically significant positive relationship between economic growth and the other variables included in the model, such as remittances, capital formation, and labour. The relationship between economic growth and foreign direct investment turned out to be statistically insignificant and negative.
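The pooled regression model mentioned above stacks all country-year observations into one sample and fits them with ordinary least squares. A one-regressor sketch in Python with made-up pooled data; the study's actual five-regressor panel is not reproduced here:

```python
def ols_fit(x, y):
    """Least-squares fit y = a + b*x over pooled (stacked) observations."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

# hypothetical country-year observations, stacked as in a pooled regression:
# export growth (%) as the regressor, GDP growth (%) as the response
exports = [2.0, 5.5, 3.1, 7.2, 4.0, 6.1]
gdp     = [1.1, 3.0, 1.8, 3.9, 2.2, 3.3]
a, b = ols_fit(exports, gdp)  # positive b mirrors the export-growth finding
```

A positive estimated slope corresponds to the statistically significant positive export coefficient reported in the abstract; significance itself would require standard errors, which this sketch omits.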
State transportation statistics 2009
2009-01-01
The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2009, a statistical profile of transportation in the 50 states and the District ...