PRIS-STATISTICS: Power Reactor Information System Statistical Reports. User's Manual
International Nuclear Information System (INIS)
2013-01-01
The IAEA developed the Power Reactor Information System (PRIS)-Statistics application to assist PRIS end users with generating statistical reports from PRIS data. Statistical reports provide an overview of the status, specification and performance results of every nuclear power reactor in the world. This user's manual was prepared to facilitate the use of the PRIS-Statistics application and to provide guidelines and detailed information for each report in the application. Statistical reports support analyses of nuclear power development and strategies, and the evaluation of nuclear power plant performance. The PRIS database can be used for comprehensive trend analyses and benchmarking against best performers and industrial standards.
Statistical Power in Meta-Analysis
Liu, Jin
2015-01-01
Statistical power is important in meta-analysis, although few studies have examined the performance of simulated power in meta-analysis. The purpose of this study is to inform researchers about statistical power estimation for the two-sample mean difference test under different situations: (1) the discrepancy between the analytical power and…
Principles of Statistics: What the Sports Medicine Professional Needs to Know.
Riemann, Bryan L; Lininger, Monica R
2018-07-01
Understanding the results and statistics reported in original research remains a large challenge for many sports medicine practitioners and, in turn, may be one of the biggest barriers to integrating research into sports medicine practice. The purpose of this article is to provide the minimal essentials a sports medicine practitioner needs to know about interpreting statistics and research results to facilitate the incorporation of the latest evidence into practice. Topics covered include the difference between statistical significance and clinical meaningfulness; effect sizes and confidence intervals; reliability statistics, including the minimal detectable difference and minimal important difference; and statistical power. Copyright © 2018 Elsevier Inc. All rights reserved.
Does environmental data collection need statistics?
Pulles, M.P.J.
1998-01-01
The term 'statistics' with reference to environmental science and policymaking might mean different things: the development of statistical methodology, the methodology developed by statisticians to interpret and analyse environmental data, or the statistical data that are needed to understand environmental…
Evaluating and Reporting Statistical Power in Counseling Research
Balkin, Richard S.; Sheperis, Carl J.
2011-01-01
Despite recommendations from the "Publication Manual of the American Psychological Association" (6th ed.) to include information on statistical power when publishing quantitative results, authors seldom include analysis or discussion of statistical power. The rationale for discussing statistical power is addressed, approaches to using "G*Power" to…
Electric power statistics from independence to establishment
International Nuclear Information System (INIS)
1997-02-01
This paper reports electric power statistics from independence to the establishment of KEPIC. It covers: the electricity industry; power equipment across the whole country at independence and the subsequent development of power facilities; power generation and the merits of different power plants; demand by type and use; power losses; charges for electric power distribution; power generation and generating costs; financial statements on income measurement and financing; meteorological phenomena and rainfall relevant to electric power development; and international statistics on power generation in major countries, with a comparison of power rates with general prices.
Statistical Power in Plant Pathology Research.
Gent, David H; Esker, Paul D; Kriss, Alissa B
2018-01-01
In null hypothesis testing, failure to reject a null hypothesis may have two potential interpretations. One interpretation is that the treatments being evaluated do not have a significant effect, and a correct conclusion was reached in the analysis. Alternatively, a treatment effect may have existed but the conclusion of the study was that there was none. This is termed a Type II error, which is most likely to occur when studies lack sufficient statistical power to detect a treatment effect. In basic terms, the power of a study is the ability to identify a true effect through a statistical test. The power of a statistical test is 1 - (the probability of Type II errors), and depends on the size of treatment effect (termed the effect size), variance, sample size, and significance criterion (the probability of a Type I error, α). Low statistical power is prevalent in scientific literature in general, including plant pathology. However, power is rarely reported, creating uncertainty in the interpretation of nonsignificant results and potentially underestimating small, yet biologically significant relationships. The appropriate level of power for a study depends on the impact of Type I versus Type II errors and no single level of power is acceptable for all purposes. Nonetheless, by convention 0.8 is often considered an acceptable threshold and studies with power less than 0.5 generally should not be conducted if the results are to be conclusive. The emphasis on power analysis should be in the planning stages of an experiment. Commonly employed strategies to increase power include increasing sample sizes, selecting a less stringent threshold probability for Type I errors, increasing the hypothesized or detectable effect size, including as few treatment groups as possible, reducing measurement variability, and including relevant covariates in analyses. Power analysis will lead to more efficient use of resources and more precisely structured hypotheses, and may even
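The power relationship described in this abstract (power = 1 - P(Type II error), driven by effect size, variance, sample size and the significance criterion α) can be sketched numerically. Below is a minimal Python illustration using the normal approximation to the two-sample t-test; the function name and the numbers are ours for illustration, not taken from the paper.

```python
from math import sqrt
from statistics import NormalDist

def power_two_sample(d, n_per_group, alpha=0.05):
    """Approximate power of a two-sided, two-sample test for a
    standardized effect size d with n_per_group subjects per arm
    (normal approximation to the t-test)."""
    z = NormalDist()
    z_crit = z.inv_cdf(1 - alpha / 2)
    ncp = d * sqrt(n_per_group / 2)  # noncentrality of the test statistic
    # P(reject H0) = P(Z > z_crit - ncp) + P(Z < -z_crit - ncp)
    return z.cdf(ncp - z_crit) + z.cdf(-ncp - z_crit)

# Cohen's conventional "medium" effect (d = 0.5), 64 subjects per group
print(round(power_two_sample(0.5, 64), 3))
```

Under this approximation, 64 subjects per group give roughly the 0.8 power that the abstract cites as the conventional threshold; a small effect (d = 0.2) at the same sample size falls well below 0.5.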
The relation between statistical power and inference in fMRI.
Directory of Open Access Journals (Sweden)
Henk R Cremers
Statistically underpowered studies can result in experimental failure even when all other experimental considerations have been addressed impeccably. In fMRI, the combination of a large number of dependent variables, a relatively small number of observations (subjects), and a need to correct for multiple comparisons can decrease statistical power dramatically. This problem has been clearly addressed yet remains controversial, especially with regard to the expected effect sizes in fMRI, and especially for between-subjects effects such as group comparisons and brain-behavior correlations. We aimed to clarify the power problem by considering and contrasting two simulated scenarios of such possible brain-behavior correlations: weak diffuse effects and strong localized effects. Sampling from these scenarios shows that, particularly in the weak diffuse scenario, common sample sizes (n = 20-30) display extremely low statistical power, poorly represent the actual effects in the full sample, and show large variation on subsequent replications. Empirical data from the Human Connectome Project resemble the weak diffuse scenario much more than the strong localized scenario, which underscores the extent of the power problem for many studies. Possible solutions to the power problem include increasing the sample size, using less stringent thresholds, or focusing on a region of interest. However, these approaches are not always feasible and some have major drawbacks. The most prominent solutions that may help address the power problem include model-based (multivariate) prediction methods and meta-analyses with related synthesis-oriented approaches.
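To make the point about n = 20-30 concrete, here is a sketch of the power to detect a weak brain-behavior correlation, using the Fisher z approximation. The method choice and the numbers are ours for illustration, not the authors' simulation.

```python
from math import sqrt, atanh
from statistics import NormalDist

def correlation_power(r, n, alpha=0.05):
    """Approximate power of a two-sided test that a correlation is
    nonzero, for true correlation r and n subjects, via the Fisher z
    transformation: atanh(r_hat) is roughly N(atanh(r), 1/(n-3))."""
    z = NormalDist()
    z_crit = z.inv_cdf(1 - alpha / 2)
    ncp = atanh(r) * sqrt(n - 3)  # mean of the transformed statistic
    return z.cdf(ncp - z_crit) + z.cdf(-ncp - z_crit)

# A weak brain-behavior correlation (r = 0.2) at several sample sizes
for n in (25, 100, 250):
    print(n, round(correlation_power(0.2, n), 2))
```

At n = 25 the power is well under 20%, consistent with the abstract's conclusion that common fMRI sample sizes are severely underpowered for weak diffuse effects.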
The power and statistical behaviour of allele-sharing statistics when ...
Indian Academy of Sciences (India)
…using seven statistics, of which five are implemented in the computer program SimWalk2 and two are implemented in GENEHUNTER. Unlike most previous reports, which involve evaluations of the power of allele-sharing statistics for a single…
Environmental restoration and statistics: Issues and needs
International Nuclear Information System (INIS)
Gilbert, R.O.
1991-10-01
Statisticians have a vital role to play in environmental restoration (ER) activities. One facet of that role is to point out where additional work is needed to develop statistical sampling plans and data analyses that meet the needs of ER. This paper is an attempt to show where statistics fits into the ER process. The statistician, as member of the ER planning team, works collaboratively with the team to develop the site characterization sampling design, so that data of the quality and quantity required by the specified data quality objectives (DQOs) are obtained. At the same time, the statistician works with the rest of the planning team to design and implement, when appropriate, the observational approach to streamline the ER process and reduce costs. The statistician will also provide the expertise needed to select or develop appropriate tools for statistical analysis that are suited for problems that are common to waste-site data. These data problems include highly heterogeneous waste forms, large variability in concentrations over space, correlated data, data that do not have a normal (Gaussian) distribution, and measurements below detection limits. Other problems include environmental transport and risk models that yield highly uncertain predictions, and the need to effectively communicate to the public highly technical information, such as sampling plans, site characterization data, statistical analysis results, and risk estimates. Even though some statistical analysis methods are available ''off the shelf'' for use in ER, these problems require the development of additional statistical tools, as discussed in this paper. 29 refs
Powerful Statistical Inference for Nested Data Using Sufficient Summary Statistics
Dowding, Irene; Haufe, Stefan
2018-01-01
Hierarchically-organized data arise naturally in many psychology and neuroscience studies. As the standard assumption of independent and identically distributed samples does not hold for such data, two important problems are to accurately estimate group-level effect sizes, and to obtain powerful statistical tests against group-level null hypotheses. A common approach is to summarize subject-level data by a single quantity per subject, which is often the mean or the difference between class means, and treat these as samples in a group-level t-test. This “naive” approach is, however, suboptimal in terms of statistical power, as it ignores information about the intra-subject variance. To address this issue, we review several approaches to deal with nested data, with a focus on methods that are easy to implement. With what we call the sufficient-summary-statistic approach, we highlight a computationally efficient technique that can improve statistical power by taking into account within-subject variances, and we provide step-by-step instructions on how to apply this approach to a number of frequently-used measures of effect size. The properties of the reviewed approaches and the potential benefits over a group-level t-test are quantitatively assessed on simulated data and demonstrated on EEG data from a simulated-driving experiment. PMID:29615885
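The contrast the authors draw between the "naive" group-level t-test and a variance-aware summary can be illustrated with inverse-variance (precision) weighting of subject-level estimates. This is a generic sketch with made-up numbers, not the paper's exact sufficient-summary-statistic estimator.

```python
from math import sqrt
from statistics import NormalDist

def weighted_group_effect(estimates, variances):
    """Combine subject-level effect estimates with inverse-variance
    weights, so that noisier subjects count less, instead of the
    unweighted mean used by a plain group-level t-test."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    mean = sum(w * e for w, e in zip(weights, estimates)) / total
    se = 1.0 / sqrt(total)          # standard error of the weighted mean
    z = mean / se
    p = 2 * (1 - NormalDist().cdf(abs(z)))
    return mean, se, z, p

# Hypothetical per-subject mean differences and their sampling variances
est = [0.4, 0.6, 0.5, 0.3]
var = [0.04, 0.09, 0.05, 0.02]
mean, se, z, p = weighted_group_effect(est, var)
print(round(mean, 3), round(z, 2))
```

Subjects measured more precisely (smaller within-subject variance) pull the group estimate harder, which is the source of the power gain the abstract describes.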
Statistical power and the Rorschach: 1975-1991.
Acklin, M W; McDowell, C J; Orndoff, S
1992-10-01
The Rorschach Inkblot Test has been the source of long-standing controversies as to its nature and its psychometric properties. Consistent with behavioral science research in general, the concept of statistical power has been entirely ignored by Rorschach researchers. The concept of power is introduced and discussed, and a power survey of the Rorschach literature published between 1975 and 1991 in the Journal of Personality Assessment, Journal of Consulting and Clinical Psychology, Journal of Abnormal Psychology, Journal of Clinical Psychology, Journal of Personality, Psychological Bulletin, American Journal of Psychiatry, and Journal of Personality and Social Psychology was undertaken. Power was calculated for 2,300 statistical tests in 158 journal articles. Power to detect small, medium, and large effect sizes was .13, .56, and .85, respectively. Similar to the findings in other power surveys conducted on behavioral science research, we concluded that Rorschach research is underpowered to detect the differences under investigation. This undoubtedly contributes to the inconsistency of research findings which has been a source of controversy and criticism over the decades. It appears that research conducted according to the Comprehensive System for the Rorschach is more powerful. Recommendations are offered for improving power and strengthening the design sensitivity of Rorschach research, including increasing sample sizes, use of parametric statistics, reduction of error variance, more accurate reporting of findings, and editorial policies reflecting concern about the magnitude of relationships beyond an exclusive focus on levels of statistical significance.
Feng, Sheng; Wang, Shengchu; Chen, Chia-Cheng; Lan, Lan
2011-01-21
In designing genome-wide association (GWA) studies it is important to calculate statistical power. General statistical power calculation procedures for quantitative measures often require information concerning summary statistics of distributions such as mean and variance. However, with genetic studies, the effect size of quantitative traits is traditionally expressed as heritability, a quantity defined as the amount of phenotypic variation in the population that can be ascribed to the genetic variants among individuals. Heritability is hard to transform into summary statistics. Therefore, general power calculation procedures cannot be used directly in GWA studies. The development of appropriate statistical methods and a user-friendly software package to address this problem would be welcomed. This paper presents GWAPower, a statistical software package of power calculation designed for GWA studies with quantitative traits, where genetic effect is defined as heritability. Based on several popular one-degree-of-freedom genetic models, this method avoids the need to specify the non-centrality parameter of the F-distribution under the alternative hypothesis. Therefore, it can use heritability information directly without approximation. In GWAPower, the power calculation can be easily adjusted for adding covariates and linkage disequilibrium information. An example is provided to illustrate GWAPower, followed by discussions. GWAPower is a user-friendly free software package for calculating statistical power based on heritability in GWA studies with quantitative traits. The software is freely available at: http://dl.dropbox.com/u/10502931/GWAPower.zip.
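The core idea, turning heritability directly into a noncentrality parameter, can be sketched as follows. This uses the textbook relation NCP = n * h2 / (1 - h2) for a 1-df test and a normal approximation to the noncentral chi-square; it is our simplification, not GWAPower's implementation.

```python
from math import sqrt
from statistics import NormalDist

def gwas_power_from_h2(h2, n, alpha=5e-8):
    """Approximate power of a 1-df association test for a variant
    explaining a fraction h2 of phenotypic variance in n individuals,
    at the genome-wide significance threshold alpha."""
    z = NormalDist()
    z_crit = z.inv_cdf(1 - alpha / 2)   # two-sided genome-wide threshold
    delta = sqrt(n * h2 / (1 - h2))     # sqrt of the noncentrality NCP
    return z.cdf(delta - z_crit) + z.cdf(-delta - z_crit)

# A variant explaining 0.5% of phenotypic variance, at two sample sizes
for n in (5000, 10000):
    print(n, round(gwas_power_from_h2(0.005, n), 2))
```

Note how steeply power rises with n at genome-wide thresholds: doubling the sample here moves the same variant from likely missed to almost certainly detected.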
Swiss solar power statistics 2007 - Significant expansion
International Nuclear Information System (INIS)
Hostettler, T.
2008-01-01
This article presents and discusses the 2007 statistics for solar power in Switzerland. A significant number of new installations is noted, as are the high production figures from newer installations. The basics behind the compilation of the Swiss solar power statistics are briefly reviewed, and an overview for the period 1989 to 2007 is presented which includes figures on the number of photovoltaic plants in service and installed peak power. Typical production figures in kilowatt-hours (kWh) per installed kilowatt-peak power (kWp) are presented and discussed for installations of various sizes. Increased production after inverter replacement in older installations is noted. Finally, the general political situation in Switzerland with regard to solar power is briefly discussed, as are international developments.
Gaskin, Cadeyrn J; Happell, Brenda
2014-05-01
To (a) assess the statistical power of nursing research to detect small, medium, and large effect sizes; (b) estimate the experiment-wise Type I error rate in these studies; and (c) assess the extent to which (i) a priori power analyses, (ii) effect sizes (and interpretations thereof), and (iii) confidence intervals were reported. Statistical review. Papers published in the 2011 volumes of the 10 highest ranked nursing journals, based on their 5-year impact factors. Papers were assessed for statistical power, control of experiment-wise Type I error, reporting of a priori power analyses, reporting and interpretation of effect sizes, and reporting of confidence intervals. The analyses were based on 333 papers, from which 10,337 inferential statistics were identified. The median power to detect small, medium, and large effect sizes was .40 (interquartile range [IQR]=.24-.71), .98 (IQR=.85-1.00), and 1.00 (IQR=1.00-1.00), respectively. The median experiment-wise Type I error rate was .54 (IQR=.26-.80). A priori power analyses were reported in 28% of papers. Effect sizes were routinely reported for Spearman's rank correlations (100% of papers in which this test was used), Poisson regressions (100%), odds ratios (100%), Kendall's tau correlations (100%), Pearson's correlations (99%), logistic regressions (98%), structural equation modelling/confirmatory factor analyses/path analyses (97%), and linear regressions (83%), but were reported less often for two-proportion z tests (50%), analyses of variance/analyses of covariance/multivariate analyses of variance (18%), t tests (8%), Wilcoxon's tests (8%), Chi-squared tests (8%), and Fisher's exact tests (7%), and not reported for sign tests, Friedman's tests, McNemar's tests, multi-level models, and Kruskal-Wallis tests. Effect sizes were infrequently interpreted. Confidence intervals were reported in 28% of papers. The use, reporting, and interpretation of inferential statistics in nursing research need substantial
Statistical power analysis a simple and general model for traditional and modern hypothesis tests
Murphy, Kevin R; Wolach, Allen
2014-01-01
Noted for its accessible approach, this text applies the latest approaches of power analysis to both null hypothesis and minimum-effect testing using the same basic unified model. Through the use of a few simple procedures and examples, the authors show readers with little expertise in statistical analysis how to obtain the values needed to carry out the power analysis for their research. Illustrations of how these analyses work and how they can be used to choose the appropriate criterion for defining statistically significant outcomes are sprinkled throughout. The book presents a simple and g
Hill, Timothy; Chocholek, Melanie; Clement, Robert
2017-06-01
Eddy covariance (EC) continues to provide invaluable insights into the dynamics of Earth's surface processes. However, despite its many strengths, spatial replication of EC at the ecosystem scale is rare. High equipment costs are likely to be partially responsible. This contributes to the low sampling, and even lower replication, of ecoregions in Africa, Oceania (excluding Australia) and South America. The level of replication matters as it directly affects statistical power. While the ergodicity of turbulence and temporal replication allow an EC tower to provide statistically robust flux estimates for its footprint, these principles do not extend to larger ecosystem scales. Despite the challenge of spatially replicating EC, it is clearly of interest to be able to use EC to provide statistically robust flux estimates for larger areas. We ask: How much spatial replication of EC is required for statistical confidence in our flux estimates of an ecosystem? We provide the reader with tools to estimate the number of EC towers needed to achieve a given statistical power. We show that for a typical ecosystem, around four EC towers are needed to have 95% statistical confidence that the annual flux of an ecosystem is nonzero. Furthermore, if the true flux is small relative to instrument noise and spatial variability, the number of towers needed can rise dramatically. We discuss approaches for improving statistical power and describe one solution: an inexpensive EC system that could help by making spatial replication more affordable. However, we note that diverting limited resources from other key measurements in order to allow spatial replication may not be optimal, and a balance needs to be struck. While individual EC towers are well suited to providing fluxes from the flux footprint, we emphasize that spatial replication is essential for statistically robust fluxes if a wider ecosystem is being studied. © 2016 The Authors Global Change Biology Published by John Wiley
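A rough version of the "how many towers" calculation can be written down with a standard normal-approximation sample-size formula. The coefficient-of-variation values below are assumed for illustration and are not taken from the paper.

```python
from math import ceil
from statistics import NormalDist

def towers_needed(cv, alpha=0.05, power=0.80):
    """Approximate number of replicate EC towers needed to show that a
    mean flux is nonzero, where cv = (between-tower std dev) / (true
    mean flux).  Normal approximation; the exact t-based answer is
    slightly larger at small n."""
    nd = NormalDist()
    n = ((nd.inv_cdf(1 - alpha / 2) + nd.inv_cdf(power)) * cv) ** 2
    return ceil(n)

print(towers_needed(1.0))               # 8 towers when spread equals the mean
print(towers_needed(0.5, power=0.95))   # 4 towers under these assumed values
print(towers_needed(3.0))               # 71: flux small relative to spread
```

The third case illustrates the abstract's warning: when the true flux is small relative to spatial variability and noise, the required replication rises dramatically.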
DEFF Research Database (Denmark)
Ranaboldo, Matteo; Giebel, Gregor; Codina, Bernat
2013-01-01
A combination of physical and statistical treatments to post-process numerical weather prediction (NWP) outputs is needed for successful short-term wind power forecasts. One of the most promising and effective approaches for statistical treatment is the Model Output Statistics (MOS) technique. … The proposed MOS performed well in both wind farms, and its forecasts compare positively with an actual operative model in use at Risø DTU and with other MOS types, showing minimum BIAS and improving the NWP power forecast by around 15% in terms of root mean square error. Further improvements could be obtained…
Statistical testing and power analysis for brain-wide association study.
Gong, Weikang; Wan, Lin; Lu, Wenlian; Ma, Liang; Cheng, Fan; Cheng, Wei; Grünewald, Stefan; Feng, Jianfeng
2018-04-05
The identification of connexel-wise associations, which involves examining functional connectivities between pairwise voxels across the whole brain, is both statistically and computationally challenging. Although such a connexel-wise methodology has recently been adopted by brain-wide association studies (BWAS) to identify connectivity changes in several mental disorders, such as schizophrenia, autism and depression, multiple-comparison correction and power analysis methods designed specifically for connexel-wise analysis are still lacking. Therefore, we herein report the development of a rigorous statistical framework for connexel-wise significance testing based on Gaussian random field theory. It includes controlling the family-wise error rate (FWER) of multiple hypothesis tests using topological inference methods, and calculating power and sample size for a connexel-wise study. Our theoretical framework can control the false-positive rate accurately, as validated empirically using two resting-state fMRI datasets. Compared with Bonferroni correction and false discovery rate (FDR), it can reduce the false-positive rate and increase statistical power by appropriately utilizing the spatial information of fMRI data. Importantly, our method bypasses the need for non-parametric permutation to correct for multiple comparisons and can thus efficiently tackle large datasets with high-resolution fMRI images. The utility of our method is shown in a case-control study. Our approach can identify altered functional connectivities in a major depressive disorder dataset, whereas existing methods fail. A software package is available at https://github.com/weikanggong/BWAS. Copyright © 2018 Elsevier B.V. All rights reserved.
Global status of nuclear power and the needed human resources
International Nuclear Information System (INIS)
Bernido, Corazon C.
2009-01-01
According to projections of the OECD/IEA, world energy demand will expand by 45% from now until 2030, with coal accounting for more than a third of the overall rise. To reduce greenhouse gases and mitigate climate change, many countries are resorting to renewables and nuclear power. Some statistics about nuclear energy in the global energy mix and about nuclear power plants worldwide, as well as the energy situation in the country, are presented. According to sources from the Department of Energy on the Philippine Energy Plan, nuclear power is a long-term energy option and will likely enter the energy mix by 2025. Preparation of the infrastructure for nuclear power has to start ten to fifteen years before the first plant comes online. The needed human resources and the required education and training are presented. (Author)
A statistical estimator for the boiler power and its related parameters
International Nuclear Information System (INIS)
Tang, H.
2001-01-01
Determining the boiler power accurately is important both for controlling the plant and for maximizing plant productivity. There are two computed boiler powers for each boiler: the steam-based boiler power and the feedwater-based boiler power. The steam-based boiler power is computed from the enthalpy difference between the feedwater enthalpy and the boiler steam enthalpy. The feedwater-based boiler power is computed as the enthalpy absorbed by the feedwater. The steam-based boiler power is computed in the RRS program and used in calibrating the measured reactor power, while the feedwater-based boiler power is computed in the CSTAT program and used for indication. Since the steam-based boiler power is used as feedback in the reactor control, it is the one estimated in this work. Because the boiler power calculation employs steam flow, feedwater flow and feedwater temperature measurements, and because any measurement contains constant or drifting noise and bias, reconciliation and rectification procedures are needed to determine the boiler power more accurately. A statistical estimator is developed to perform data reconciliation, gross error detection and instrument performance monitoring.
Directory of Open Access Journals (Sweden)
Jacobo Pardo-Seco
BACKGROUND: Mitochondrial DNA (mtDNA) variation (i.e. haplogroups) has been analyzed in regard to a number of multifactorial diseases. The statistical power of a case-control study determines the a priori probability to reject the null hypothesis of homogeneity between cases and controls. METHODS/PRINCIPAL FINDINGS: We critically review previous approaches to the estimation of statistical power based on the restricted scenario where the number of cases equals the number of controls, and propose a methodology that broadens procedures to more general situations. We developed statistical procedures that consider different disease scenarios, variable sample sizes in cases and controls, and variable numbers of haplogroups and effect sizes. The results indicate that the statistical power of a particular study can improve substantially by increasing the number of controls with respect to cases. In the opposite direction, the power decreases substantially when testing a growing number of haplogroups. We developed mitPower (http://bioinformatics.cesga.es/mitpower/), a web-based interface that implements the new statistical procedures and allows for the computation of the a priori statistical power in variable scenarios of case-control study designs, or, e.g., the number of controls needed to reach fixed effect sizes. CONCLUSIONS/SIGNIFICANCE: The present study provides statistical procedures for the computation of statistical power in common as well as complex case-control study designs involving 2×k tables, with special (but not exclusive) application to mtDNA studies. In order to reach a wide range of researchers, we also provide a friendly web-based tool, mitPower, that can be used in both retrospective and prospective case-control disease studies.
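The benefit of adding controls shows up even in the simplest special case: a single haplogroup's frequency compared between cases and controls with a two-proportion z-test. This sketch and its numbers are ours, and it is far simpler than mitPower's 2×k procedures.

```python
from math import sqrt
from statistics import NormalDist

def haplogroup_power(p_case, p_ctrl, n_case, n_ctrl, alpha=0.05):
    """Approximate power to detect a case-control difference in one
    haplogroup's frequency (two-proportion z-test, unpooled normal
    approximation), allowing unequal sample sizes."""
    z = NormalDist()
    z_crit = z.inv_cdf(1 - alpha / 2)
    se = sqrt(p_case * (1 - p_case) / n_case
              + p_ctrl * (1 - p_ctrl) / n_ctrl)
    ncp = abs(p_case - p_ctrl) / se
    return z.cdf(ncp - z_crit) + z.cdf(-ncp - z_crit)

# Doubling controls relative to cases raises power at no extra case cost
print(round(haplogroup_power(0.30, 0.20, 200, 200), 2))
print(round(haplogroup_power(0.30, 0.20, 200, 400), 2))
```

With cases fixed at 200, going from 200 to 400 controls lifts power by roughly ten percentage points in this example, echoing the abstract's finding.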
Statistical Power Analysis with Missing Data A Structural Equation Modeling Approach
Davey, Adam
2009-01-01
Statistical power analysis has revolutionized the ways in which we conduct and evaluate research. Similar developments in the statistical analysis of incomplete (missing) data are gaining more widespread application. This volume brings statistical power and incomplete data together under a common framework, in a way that is readily accessible to those with only an introductory familiarity with structural equation modeling. It answers many practical questions, such as: How does missing data affect the statistical power of a study? How much power is likely with different amounts and types…
Availability statistics for thermal power plants
International Nuclear Information System (INIS)
1990-01-01
Denmark, Finland and Sweden have adopted almost the same methods of recording and calculation of availability data. For a number of years comparable availability and outage data for thermal power have been summarized and published in one report. The purpose of the report now presented for 1990 containing general statistical data is to produce basic information on existing kinds of thermal power in the countries concerned. With this information as a basis additional and more detailed information can be exchanged in direct contacts between bodies in the above mentioned countries according to forms established for that purpose. The report includes fossil steam power, nuclear power and gas turbines. The information is presented in separate diagrams for each country, but for plants burning fossil fuel also in a joint NORDEL statistics with data grouped according to type of fuel used. The grouping of units into classes of capacity has been made in accordance with the classification adopted by UNIPEDE/WEC. Values based on energy have been adopted as basic availability data. The same applies to the preference made in the definitions outlined by UNIPEDE and UNIPEDE/WEC. Some data based on time have been included to make possible comparisons with certain international values and for further illustration of the performance. (au)
Availability statistics for thermal power plants
International Nuclear Information System (INIS)
1989-01-01
Denmark, Finland and Sweden have adopted almost the same methods of recording and calculation of availability data. For a number of years comparable availability and outage data for thermal power have been summarized and published in one report. The purpose of the report now presented for 1989 containing general statistical data is to produce basic information on existing kinds of thermal power in the countries concerned. With this information as a basis additional and more detailed information can be exchanged in direct contacts between bodies in the above mentioned countries according to forms established for that purpose. The report includes fossil steam power, nuclear power and gas turbines. The information is presented in separate diagrams for each country, but for plants burning fossil fuel also in a joint NORDEL statistics with data grouped according to type of fuel used. The grouping of units into classes of capacity has been made in accordance with the classification adopted by UNIPEDE/WEC. Values based on energy have been adopted as basic availability data. The same applies to the preference made in the definitions outlined by UNIPEDE and UNIPEDE/WEC. Some data based on time have been included to make possible comparisons with certain international values and for further illustration of the performance. For values given in the report, the definitions in the NORDEL document ''Concepts of Availability for Thermal Power, September 1977'', have been applied. (author)
Prediction of lacking control power in power plants using statistical models
DEFF Research Database (Denmark)
Odgaard, Peter Fogh; Mataji, B.; Stoustrup, Jakob
2007-01-01
Prediction of the performance of plants such as power plants is of interest, since the plant operator can use these predictions to optimize plant production. In this paper the focus is on a special case where a combination of high coal moisture content and a high load limits the achievable plant load, meaning that the requested plant load cannot be met. The available models are in this case uncertain. Instead, statistical methods are used to predict upper and lower uncertainty bounds on the prediction. Two different methods are used: the first relies on statistics of recent prediction errors; the second uses operating-point-dependent statistics of prediction errors. Applying these methods to the aforementioned case, it can be concluded that the second method can be used to predict the power plant performance, while the first method has problems predicting the uncertain performance…
Replication unreliability in psychology: elusive phenomena or elusive statistical power?
Directory of Open Access Journals (Sweden)
Patrizio E Tressoldi
2012-07-01
Full Text Available The focus of this paper is to analyse whether the unreliability of results related to certain controversial psychological phenomena may be a consequence of their low statistical power. Applying Null Hypothesis Statistical Testing (NHST), still the most widely used statistical approach, unreliability derives from the failure to refute the null hypothesis, in particular when exact or quasi-exact replications of experiments are carried out. Taking as examples the results of meta-analyses related to four different controversial phenomena (subliminal semantic priming, incubation effect for problem solving, unconscious thought theory, and non-local perception), it was found that, except for semantic priming on categorization, the statistical power to detect the expected effect size of the typical study is low or very low. The low power in most studies undermines the use of NHST to study phenomena with moderate or low effect sizes. We conclude by providing some suggestions on how to increase statistical power or use different statistical approaches to help discriminate whether the results obtained may or may not be used to support or to refute the reality of a phenomenon with a small effect size.
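As a minimal sketch of the kind of power computation behind these meta-analytic arguments (not the author's code), the normal-approximation power of a two-sided two-sample mean-difference test can be written in a few lines; the critical value is hard-coded for α = 0.05 and all numbers are illustrative:

```python
import math

def normal_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def two_sample_power(d: float, n_per_group: int) -> float:
    """Approximate power of a two-sided two-sample z-test (alpha = 0.05)
    for a standardized mean difference d with n_per_group subjects per arm."""
    z_alpha = 1.959964                       # two-sided 5% critical value
    ncp = d * math.sqrt(n_per_group / 2.0)   # noncentrality of the test statistic
    # Power = P(|Z + ncp| > z_alpha); the second term is usually negligible.
    return (1.0 - normal_cdf(z_alpha - ncp)) + normal_cdf(-z_alpha - ncp)
```

With the conventional medium effect d = 0.5, 64 subjects per group give roughly 80% power, while a small effect tested on 20 subjects per group leaves power near 10%, illustrating the replication problem the paper describes.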
The power and robustness of maximum LOD score statistics.
Yoo, Y J; Mendell, N R
2008-07-01
The maximum LOD score statistic is extremely powerful for gene mapping when calculated using the correct genetic parameter value. When the mode of genetic transmission is unknown, the maximum of the LOD scores obtained using several genetic parameter values is reported. This latter statistic requires a higher critical value than the maximum LOD score statistic calculated from a single genetic parameter value. In this paper, we compare the power of maximum LOD scores based on three fixed sets of genetic parameter values with the power of the LOD score obtained after maximizing over the entire range of genetic parameter values. We simulate family data under nine generating models. For generating models with non-zero phenocopy rates, LOD scores maximized over the entire range of genetic parameters yielded greater power than maximum LOD scores for fixed sets of parameter values with zero phenocopy rates. No maximum LOD score was consistently more powerful than the others for generating models with a zero phenocopy rate. The power loss of the LOD score maximized over the entire range of genetic parameters, relative to the maximum LOD score calculated using the correct genetic parameter value, appeared to be robust to the generating models.
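A minimal illustration of the quantities involved (a simplified phase-known sketch, not the authors' simulation setup) computes the LOD score for a given recombination fraction and then maximizes it over a grid, mimicking the "maximize over the parameter range" strategy:

```python
import math

def lod_score(recombinants: int, meioses: int, theta: float) -> float:
    """LOD score for a phase-known linkage sample: log10 likelihood ratio
    of recombination fraction theta against the null theta = 0.5."""
    r, n = recombinants, meioses
    if not 0.0 < theta < 0.5:
        raise ValueError("theta must lie in (0, 0.5)")
    return (r * math.log10(theta)
            + (n - r) * math.log10(1.0 - theta)
            - n * math.log10(0.5))

def max_lod(recombinants: int, meioses: int):
    """Maximize the LOD score over a grid of theta values in (0, 0.5)."""
    grid = [i / 1000.0 for i in range(1, 500)]
    best = max(grid, key=lambda t: lod_score(recombinants, meioses, t))
    return best, lod_score(recombinants, meioses, best)
```

For 2 recombinants in 20 informative meioses, the grid maximum lands at theta = 0.1 (the sample recombination fraction) with a LOD just above 3, the classical evidence threshold for linkage.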
Directory of Open Access Journals (Sweden)
Dagna Kocur
2017-12-01
Full Text Available Background The purpose of the study was to examine the phenomenon of power within an organisation from the vantage point of gender, the occupied position, earnings, and the number of subordinates. Participants and procedure The sample group comprised 107 female and 98 male participants. The mean age was 42.14 years (SD = 11.73). The study covered 100 superiors and 105 subordinates. The research tools were: the Need for Power and Influence Questionnaire (Bennett, 1988), the Personal Sense of Power Scale (Anderson, John, & Keltner, 2012), and the Directiveness Scale SD (Ray, 1976). Results The superiors scored significantly higher on the need for power, need for influence, and directiveness. They also scored higher in terms of the need for power in relations with other people, with colleagues, and in superior-subordinate relations. The number of male leaders was conspicuously greater than the number of female leaders. Furthermore, women had fewer subordinates than men and earned less than men. Female participants scored lower on the sense of power and the need for power scales. Conclusions Occupying either an executive or subordinate position differentiates between women and men in terms of sense of power in interpersonal relationships. The findings on sense of power in the professional context may be applied in organisational psychology in order to increase employees’ competence and qualifications.
Statistical modeling to support power system planning
Staid, Andrea
This dissertation focuses on data-analytic approaches that improve our understanding of power system applications to promote better decision-making. It tackles issues of risk analysis, uncertainty management, resource estimation, and the impacts of climate change. Tools of data mining and statistical modeling are used to bring new insight to a variety of complex problems facing today's power system. The overarching goal of this research is to improve the understanding of the power system risk environment for improved operation, investment, and planning decisions. The first chapter introduces some challenges faced in planning for a sustainable power system. Chapter 2 analyzes the driving factors behind the disparity in wind energy investments among states with a goal of determining the impact that state-level policies have on incentivizing wind energy. Findings show that policy differences do not explain the disparities; physical and geographical factors are more important. Chapter 3 extends conventional wind forecasting to a risk-based focus of predicting maximum wind speeds, which are dangerous for offshore operations. Statistical models are presented that issue probabilistic predictions for the highest wind speed expected in a three-hour interval. These models achieve a high degree of accuracy and their use can improve safety and reliability in practice. Chapter 4 examines the challenges of wind power estimation for onshore wind farms. Several methods for wind power resource assessment are compared, and the weaknesses of the Jensen model are demonstrated. For two onshore farms, statistical models outperform other methods, even when very little information is known about the wind farm. Lastly, chapter 5 focuses on the power system more broadly in the context of the risks expected from tropical cyclones in a changing climate. Risks to U.S. power system infrastructure are simulated under different scenarios of tropical cyclone behavior that may result from climate
Statistical Power in Longitudinal Network Studies
Stadtfeld, Christoph; Snijders, Tom A. B.; Steglich, Christian; van Duijn, Marijtje
2018-01-01
Longitudinal social network studies may easily suffer from a lack of statistical power. This is the case in particular for studies that simultaneously investigate change of network ties and change of nodal attributes. Such selection and influence studies have become increasingly popular due to the
How many subjects do I need to power my study?
Directory of Open Access Journals (Sweden)
Sergio R. Muñoz Navarro
2014-07-01
Full Text Available The article presents a tool that helps answer the question “How many subjects do I need to power my study?” We show how to determine sample size in observational epidemiological studies and provide examples of application using the statistical package Epidat, a shareware program developed under the auspices of the Pan American Health Organization, the Galician Board of Health and the University CES of Colombia. Examples of sample size calculation for prevalence (cross-sectional) studies, case-control studies and cohort studies are given.
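The prevalence (cross-sectional) case reduces to a one-line formula. The sketch below is an illustrative re-implementation of the standard calculation, not Epidat code:

```python
import math

def prevalence_sample_size(p: float, d: float, confidence: float = 0.95) -> int:
    """Subjects needed to estimate a prevalence p with absolute precision d:
    n = z^2 * p * (1 - p) / d^2 (simple random sampling, no finite-population
    correction)."""
    z = {0.95: 1.959964, 0.99: 2.575829}[confidence]   # normal quantiles
    return math.ceil(z * z * p * (1.0 - p) / (d * d))
```

Estimating an unknown prevalence (worst case p = 0.5) to within ±5 percentage points at 95% confidence requires 385 subjects; relaxing the precision to ±10 points drops the requirement to 97.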
International Nuclear Information System (INIS)
Chern, W.S.; Just, R.E.
1982-01-01
The growing controversy over nuclear power has demanded a critical evaluation of the need for power to justify proposed nuclear power plants. This paper discusses the use of an econometric model developed for the US Nuclear Regulatory Commission to conduct an independent assessment of electricity demand forecasts related to the licensing of nuclear power plants. The model forecasts electricity demand and price by sector and by state. The estimation and forecasting results for the New England region are presented as a case in point where an econometric model has been used to analyse alternative fuel price scenarios and to aid substantive public decision making regarding new nuclear power plant decisions. (author)
Statistical Analysis of the Impact of Wind Power on Market Quantities and Power Flows
DEFF Research Database (Denmark)
Pinson, Pierre; Jónsson, Tryggvi; Zugno, Marco
2012-01-01
In view of the increasing penetration of wind power in a number of power systems and markets worldwide, we discuss some of the impacts that wind energy may have on market quantities and cross-border power flows. These impacts are uncovered through statistical analyses of actual market and flow data...... of load and wind power forecasts on Danish and German electricity markets....
Statistical operation of nuclear power plants
International Nuclear Information System (INIS)
Gauzit, Maurice; Wilmart, Yves
1976-01-01
A comparison of the statistical operating results of nuclear power stations as issued in the literature shows that the values given for availability and the load factor often differ considerably from each other. This may be due to different definitions of these terms, or even to poor translation from one language to another. A critical analysis of these terms is proposed, together with the choice of a parameter giving a quantitative idea of the actual quality of operation achieved. The second section gives, on a homogeneous basis and from the results supplied by 83 nuclear power stations now in operation, a statistical analysis of their operating results: in particular for the two light water reactor lines during 1975, as well as the evolution, as a function of age, of the units and of their starting conditions during their first two operating years. The values thus obtained are also compared with those assumed 'a priori' in some economic studies [fr
Statistical studies of powerful extragalactic radio sources
Energy Technology Data Exchange (ETDEWEB)
Macklin, J T
1981-01-01
This dissertation is mainly about the use of efficient statistical tests to study the properties of powerful extragalactic radio sources. Most of the analysis is based on subsets of a sample of 166 bright (3CR) sources selected at 178 MHz. The first chapter is introductory and it is followed by three on the misalignment and symmetry of double radio sources. The properties of nuclear components in extragalactic sources are discussed in the next chapter, using statistical tests which make efficient use of upper limits, often the only available information on the flux density from the nuclear component. Multifrequency observations of four 3CR sources are presented in the next chapter. The penultimate chapter is about the analysis of correlations involving more than two variables. The Spearman partial rank correlation coefficient is shown to be the most powerful test available which is based on non-parametric statistics. It is therefore used to study the dependences of the properties of sources on their size at constant redshift, and the results are interpreted in terms of source evolution. Correlations of source properties with luminosity and redshift are then examined.
Low statistical power in biomedical science: a review of three human research domains
Dumas-Mallet, Estelle; Button, Katherine S.; Boraud, Thomas; Gonon, Francois
2017-01-01
Studies with low statistical power increase the likelihood that a statistically significant finding represents a false positive result. We conducted a review of meta-analyses of studies investigating the association of biological, environmental or cognitive parameters with neurological, psychiatric and somatic diseases, excluding treatment studies, in order to estimate the average statistical power across these domains. Taking the effect size indicated by a meta-analysis as the best estimate of the likely true effect size, and assuming a threshold for declaring statistical significance of 5%, we found that approximately 50% of studies have statistical power in the 0–10% or 11–20% range, well below the minimum of 80% that is often considered conventional. Studies with low statistical power appear to be common in the biomedical sciences, at least in the specific subject areas captured by our search strategy. However, we also observe evidence that this depends in part on research methodology, with candidate gene studies showing very low average power and studies using cognitive/behavioural measures showing high average power. This warrants further investigation. PMID:28386409
Statistical modelling of space-time processes with application to wind power
DEFF Research Database (Denmark)
Lenzi, Amanda
This thesis aims at contributing to the wind power literature by building and evaluating new statistical techniques for producing forecasts at multiple locations and lead times using spatio-temporal information. By exploring the features of a rich portfolio of wind farms in western Denmark, we investigate...... propose spatial models for predicting wind power generation at two different time scales: for annual average wind power generation and for a high temporal resolution (typically wind power averages over 15-min time steps). In both cases, we use a spatial hierarchical statistical model in which spatial...
Voet, van der H.; Goedhart, P.W.
2015-01-01
Publications on power analyses for field trial count data comparing transgenic and conventional crops have reported widely varying requirements for the replication needed to obtain statistical tests with adequate power. These studies are critically reviewed and complemented with a new simulation
National power grid simulation capability : need and issues
Energy Technology Data Exchange (ETDEWEB)
Petri, Mark C. [Argonne National Lab. (ANL), Argonne, IL (United States)
2009-06-02
On December 9 and 10, 2008, the Department of Homeland Security (DHS) Science and Technology Directorate sponsored a national workshop at Argonne National Laboratory to explore the need for a comprehensive modeling and simulation capability for the national electric power grid system. The workshop brought together leading electric power grid experts from federal agencies, the national laboratories, and academia to discuss the current state of power grid science and engineering and to assess if important challenges are being met. The workshop helped delineate gaps between grid needs and current capabilities and identify issues that must be addressed if a solution is to be implemented. This report is a result of the workshop and highlights power grid modeling and simulation needs, the barriers that must be overcome to address them, and the benefits of a national power grid simulation capability.
Conducting need-for-power review for nuclear power plants: guidelines to states. Draft report
International Nuclear Information System (INIS)
Nash, D.A.
1982-12-01
The report is intended to describe to the state regulatory commissions and other state agencies the standards and criteria used by the NRC in conducting need-for-power evaluations for the licensing of nuclear power plants. These are intended as guidelines for states that may wish to perform a need-for-power review that will suffice for adoption by the NRC in its licensing process. Three methodologies that have been used for need-for-power evaluations and that meet NRC standards are included
The issue of statistical power for overall model fit in evaluating structural equation models
Directory of Open Access Journals (Sweden)
Richard HERMIDA
2015-06-01
Full Text Available Statistical power is an important concept for psychological research. However, examining the power of a structural equation model (SEM) is rare in practice. This article provides an accessible review of the concept of statistical power for the Root Mean Square Error of Approximation (RMSEA) index of overall model fit in structural equation modeling. By way of example, we examine the current state of power in the literature by reviewing studies in top Industrial-Organizational (I/O) Psychology journals that use SEMs. Results indicate that in many studies power is very low, which implies acceptance of invalid models. Additionally, we examined methodological situations which may have an influence on the statistical power of SEMs. Results showed that power varies significantly as a function of model type and whether or not the model is the main model for the study. Finally, results indicated that power is significantly related to the model fit statistics used in evaluating SEMs. The results from this quantitative review imply that researchers should be more vigilant with respect to power in structural equation modeling. We therefore conclude by offering methodological best practices to increase confidence in the interpretation of structural equation modeling results with respect to statistical power issues.
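The RMSEA power computation reviewed here rests on noncentral chi-square distributions. The sketch below estimates that power by Monte Carlo with only the standard library; the noncentrality λ = (N − 1)·df·RMSEA² follows the MacCallum-Browne-Sugawara framework, but the default RMSEA values, replication count and seed are illustrative choices:

```python
import math
import random

def ncx2_sample(df: int, ncp: float, rng: random.Random) -> float:
    """One draw from a noncentral chi-square: sum of df squared shifted normals."""
    delta = math.sqrt(ncp / df)
    return sum((rng.gauss(0.0, 1.0) + delta) ** 2 for _ in range(df))

def rmsea_power(n: int, df: int, rmsea0: float = 0.05, rmsea_a: float = 0.08,
                alpha: float = 0.05, reps: int = 5000, seed: int = 1) -> float:
    """Monte Carlo power for the test of close fit (H0: RMSEA = rmsea0
    against an alternative RMSEA = rmsea_a) for a model with df degrees
    of freedom fitted on n observations."""
    rng = random.Random(seed)
    lam0 = (n - 1) * df * rmsea0 ** 2
    lam_a = (n - 1) * df * rmsea_a ** 2
    null_draws = sorted(ncx2_sample(df, lam0, rng) for _ in range(reps))
    crit = null_draws[int((1.0 - alpha) * reps)]   # upper-alpha critical value
    alt_draws = (ncx2_sample(df, lam_a, rng) for _ in range(reps))
    return sum(t > crit for t in alt_draws) / reps
```

In this sketch, a model with df = 50 reaches roughly 80% power around N = 200, and power rises quickly with sample size, which is why the small-N studies flagged in the review fare so poorly.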
Needs and Possibility of Involving Nuclear Power Plant in the Macedonian Power System
International Nuclear Information System (INIS)
Bosevski, T.; Causevski, A.
1998-01-01
The Macedonian Power System (MPS) used to be part of the former Yugoslav Power System, and it was connected to the European system by 400 kV transmission lines. At present, the MPS operates in isolation from the UCPTE, connected only to the Yugoslav and Greek power systems; the connections with the Bulgarian and Albanian power systems are at a lower voltage level. The reliability and stability of the MPS need to be improved. Macedonia is located in the central area of the Balkans, where the transmission systems of the other Balkan countries cross. In the near future, the Macedonian Power System needs to be linked to the European system. To prepare for the energy demand at the beginning of the 21st century, when the local coal reserves will be exhausted, Macedonia needs to start activities for the substitution of the existing coal-fired thermal power plants with nuclear plants. This paper discusses the activities for global development solutions in the area of power generation. (author)
The statistical power to detect cross-scale interactions at macroscales
Wagner, Tyler; Fergus, C. Emi; Stow, Craig A.; Cheruvelil, Kendra S.; Soranno, Patricia A.
2016-01-01
Macroscale studies of ecological phenomena are increasingly common because stressors such as climate and land-use change operate at large spatial and temporal scales. Cross-scale interactions (CSIs), where ecological processes operating at one spatial or temporal scale interact with processes operating at another scale, have been documented in a variety of ecosystems and contribute to complex system dynamics. However, studies investigating CSIs are often dependent on compiling multiple data sets from different sources to create multithematic, multiscaled data sets, which results in structurally complex, and sometimes incomplete data sets. The statistical power to detect CSIs needs to be evaluated because of their importance and the challenge of quantifying CSIs using data sets with complex structures and missing observations. We studied this problem using a spatially hierarchical model that measures CSIs between regional agriculture and its effects on the relationship between lake nutrients and lake productivity. We used an existing large multithematic, multiscaled database, the LAke multi-scaled GeOSpatial and temporal database (LAGOS), to parameterize the power analysis simulations. We found that the power to detect CSIs was more strongly related to the number of regions in the study rather than the number of lakes nested within each region. CSI power analyses will not only help ecologists design large-scale studies aimed at detecting CSIs, but will also focus attention on CSI effect sizes and the degree to which they are ecologically relevant and detectable with large data sets.
Statistic method of research reactors maximum permissible power calculation
International Nuclear Information System (INIS)
Grosheva, N.A.; Kirsanov, G.A.; Konoplev, K.A.; Chmshkyan, D.V.
1998-01-01
The technique for calculating the maximum permissible power of a research reactor, at which the probability of a thermal-process accident does not exceed a specified value, is presented. The statistical method is used for the calculations. The determining function related to reactor safety is regarded as a known function of the reactor power and of many statistically independent values, a list that includes the reactor process parameters, the geometrical characteristics of the reactor core and fuel elements, and random factors connected with the reactor's specific features. Heat flux density or temperature is taken as the limiting factor. The program implementation of the method discussed is briefly described. The results of calculating the PIK reactor margin coefficients for different probabilities of a thermal-process accident are considered as an example. It is shown that the probability of an accident with fuel element melting in the hot zone is lower than 10^-8 per year at the reactor rated power [ru
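A toy version of the statistical margin idea can be sketched as follows; every distribution, constant and limit below is an invented placeholder, not PIK reactor data, and serves only to show how a Monte Carlo accident probability yields a maximum permissible power:

```python
import random

def accident_probability(power_mw: float, reps: int = 20000, seed: int = 7) -> float:
    """Monte Carlo estimate of the probability that the local heat flux
    exceeds its limit at a given reactor power, with the peaking factor
    and heat-transfer area treated as statistically independent random
    values (all numbers are illustrative assumptions)."""
    rng = random.Random(seed)
    limit = 5.0                                  # assumed flux limit, MW/m^2
    exceed = 0
    for _ in range(reps):
        peaking = rng.gauss(2.6, 0.13)           # hot-channel peaking factor
        area = rng.gauss(10.0, 0.2)              # heat-transfer area, m^2
        if power_mw * peaking / (area * 10.0) > limit:   # crude local-flux model
            exceed += 1
    return exceed / reps

def max_permissible_power(target: float = 1e-3) -> float:
    """Largest power on a coarse grid whose estimated accident probability
    stays below the target, mirroring the margin-coefficient idea."""
    power = 0.0
    for p in range(50, 301, 5):
        if accident_probability(float(p)) < target:
            power = float(p)
        else:
            break
    return power
```

Tightening the target probability shrinks the permissible power, which is exactly the trade-off the margin coefficients of the paper quantify.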
Competent statistical programmer: Need of business process outsourcing industry
Khan, Imran
2014-01-01
Over the last two decades, Business Process Outsourcing (BPO) has evolved into a mature practice. India is seen as a preferred destination for pharmaceutical outsourcing owing to a cost arbitrage. Within biometrics outsourcing, statistical programming and analysis require very niche skills for service delivery. The demand and supply ratios are imbalanced due to a high churn rate and a limited supply of competent programmers. The industry is moving from task delivery to ownership and accountability. The paradigm shift from outsourcing to consulting is triggering the need for competent statistical programmers. Programmers should be trained in technical, analytical, problem-solving, decision-making and soft skills, as customer expectations are changing from task delivery to accountability for the project. This paper highlights the common issues the SAS programming service industry is facing and the skills programmers need to develop to cope with these changes. PMID:24987578
Monte Carlo based statistical power analysis for mediation models: methods and software.
Zhang, Zhiyong
2014-12-01
The existing literature on statistical power analysis for mediation models often assumes data normality and is based on a less powerful Sobel test instead of the more powerful bootstrap test. This study proposes to estimate statistical power to detect mediation effects on the basis of the bootstrap method through Monte Carlo simulation. Nonnormal data with excessive skewness and kurtosis are allowed in the proposed method. A free R package called bmem is developed to conduct the power analysis discussed in this study. Four examples, including a simple mediation model, a multiple-mediator model with a latent mediator, a multiple-group mediation model, and a longitudinal mediation model, are provided to illustrate the proposed method.
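A stripped-down version of this bootstrap power estimation can be written with the standard library alone. This is a toy re-implementation, not the bmem package; the path values, sample size and replication counts are illustrative defaults:

```python
import random

def _ab_estimate(xs, ms, ys):
    """Indirect-effect estimate a*b: a is the slope of M on X, b the
    partial slope of Y on M controlling for X (centered cross-products)."""
    n = len(xs)
    mx, mm, my = sum(xs) / n, sum(ms) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    smm = sum((m - mm) ** 2 for m in ms)
    sxm = sum((x - mx) * (m - mm) for x, m in zip(xs, ms))
    smy = sum((m - mm) * (y - my) for m, y in zip(ms, ys))
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxm / sxx
    b = (smy * sxx - sxy * sxm) / (smm * sxx - sxm ** 2)
    return a * b

def mediation_power(a=0.39, b=0.39, c=0.0, n=100,
                    reps=50, boots=99, seed=3):
    """Monte Carlo power of a percentile-bootstrap test of the indirect
    effect in the simple model X -> M -> Y (with direct path c from X)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(reps):
        xs = [rng.gauss(0, 1) for _ in range(n)]
        ms = [a * x + rng.gauss(0, 1) for x in xs]
        ys = [b * m + c * x + rng.gauss(0, 1) for x, m in zip(xs, ms)]
        rows = list(zip(xs, ms, ys))
        est = sorted(_ab_estimate(*zip(*rng.choices(rows, k=n)))
                     for _ in range(boots))
        lo, hi = est[int(0.025 * boots)], est[int(0.975 * boots)]
        hits += (lo > 0.0) or (hi < 0.0)
    return hits / reps
```

Each Monte Carlo replicate draws a mediation data set, forms a percentile bootstrap interval for the indirect effect a·b, and counts the replicate as a rejection when the interval excludes zero; the rejection rate estimates power.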
Statistical power as a function of Cronbach alpha of instrument questionnaire items.
Heo, Moonseong; Kim, Namhee; Faith, Myles S
2015-10-14
In countless clinical trials, measurements of outcomes rely on instrument questionnaire items, which, however, often suffer from measurement error problems that in turn affect the statistical power of study designs. The Cronbach alpha or coefficient alpha, here denoted by C(α), can be used as a measure of internal consistency of parallel instrument items that are developed to measure a target unidimensional outcome construct. The scale score for the target construct is often represented by the sum of the item scores. However, power functions based on C(α) have been lacking for various study designs. We formulate a statistical model for parallel items to derive power functions as a function of C(α) under several study designs. To this end, we assume a fixed true-score variance, as opposed to the usual fixed total-variance assumption. That assumption is critical and practically relevant to show that smaller measurement errors are associated with higher inter-item correlations, and thus that greater C(α) is associated with greater statistical power. We compare the derived theoretical statistical power with empirical power obtained through Monte Carlo simulations for the following comparisons: a one-sample comparison of pre- and post-treatment mean differences, a two-sample comparison of pre-post mean differences between groups, and a two-sample comparison of mean differences between groups. It is shown that C(α) is the same as a test-retest correlation of the scale scores of parallel items, which enables testing the significance of C(α). Closed-form power functions and sample size determination formulas are derived in terms of C(α) for all of the aforementioned comparisons. Power functions are shown to be an increasing function of C(α), regardless of the comparison of interest. The derived power functions are well validated by simulation studies showing that the magnitudes of the theoretical power are virtually identical to those of the empirical power. Regardless
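Coefficient alpha itself is simple to compute from item-level data. The sketch below uses the usual variance-ratio formula, with items given as columns of subject scores:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item columns (one list of subject
    scores per item): k/(k-1) * (1 - sum(item variances) / var(total))."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1.0 - sum(var(col) for col in items) / var(totals))
```

Two identical (perfectly parallel) items give alpha = 1, while partially consistent items give intermediate values; in the framework above, a higher C(α) translates into higher power for the scale-score comparisons.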
The need for nuclear power in Indonesia
International Nuclear Information System (INIS)
Subki, I.R.; Arbie, B.; Adiwardojo, M.S.; Tobing, M.L.
2000-01-01
Nuclear power generation is a well-proven technology for electricity production. Worldwide, in both developed and developing countries, by mid-May 1997, 443 Nuclear Power Plants (NPPs) were in operation, contributing around 18% of the world electricity supply with a total generating capacity of 351 GWe in 32 countries. There are 35 NPPs now under construction in 14 countries. Most of us have now come to realize that an increasing demand and supply of energy is a reality and a necessity to support socio-economic development. This is especially true in developing countries, where most of the population has a low consumption of energy and a low standard of living, and the need for large amounts of energy to fuel development and improve the quality of life is imminent. With regard to electricity supply, this situation translates into the need for large base-load power generation. The electricity demand in Indonesia is very high due to the National Economic Development Plan, which is based on industrialization and supported by a strong agricultural base. This situation calls for the development and deployment of all energy technologies, including nuclear, fossil and renewables, to supply the energy needed. The need for nuclear power in Indonesia is in line with the national energy policy, which stresses diversification and conservation, economic competitiveness, and environmental cleanliness. The prepared nuclear science and technology base and its potential to support high-tech industry development will lead Indonesia to sustainable national development. (author)
Forecasting winds over nuclear power plants statistics
International Nuclear Information System (INIS)
Marais, Ch.
1997-01-01
In the event of an accident at a nuclear power plant, it is essential to forecast the wind velocity at the level where the efflux occurs (about 100 m). At present, meteorologists refine the wind forecast from the coarse grid of numerical weather prediction (NWP) models. The purpose of this study is to improve the forecasts by developing a statistical adaptation method which corrects the NWP forecasts using statistical comparisons between wind forecasts and observations. The Multiple Linear Regression method is used here to forecast the 100 m wind at 12- and 24-hour ranges for three Electricite de France (EDF) sites. This approach turns out to give better forecasts than the NWP model alone and is worthy of operational use. (author)
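The statistical adaptation step can be illustrated with the simplest possible scheme of this kind: a linear regression of observed winds on the raw NWP forecast. The bias, amplification factor and noise levels below are invented for the example, not EDF data:

```python
import random

def fit_mos(raw, obs):
    """Least-squares fit of obs = a + b * raw: a one-predictor statistical
    adaptation (model output statistics) of an NWP wind forecast."""
    n = len(raw)
    mr, mo = sum(raw) / n, sum(obs) / n
    b = (sum((r - mr) * (o - mo) for r, o in zip(raw, obs))
         / sum((r - mr) ** 2 for r in raw))
    return mo - b * mr, b

# Synthetic example: assume the NWP 100 m wind has a +2 m/s bias and
# over-amplifies variability by 20% (illustrative numbers).
rng = random.Random(0)
truth = [8.0 + rng.gauss(0, 2) for _ in range(500)]        # observed wind, m/s
raw = [2.0 + 1.2 * t + rng.gauss(0, 0.5) for t in truth]   # raw NWP forecast

a, b = fit_mos(raw, truth)
corrected = [a + b * r for r in raw]

def rmse(forecast):
    return (sum((f - t) ** 2 for f, t in zip(forecast, truth)) / len(truth)) ** 0.5
```

Because the regression learns both the bias and the amplification from past forecast-observation pairs, the corrected forecast has a much lower RMSE than the raw NWP output, which is the effect the study exploits operationally.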
Oil exporting countries need nuclear power
International Nuclear Information System (INIS)
Stauffer, T.R.
1982-01-01
The economic rationale for nuclear power in the oil exporting countries is analysed, with the collateral objective of defining the size of the potential market in terms of the exporting countries' economic opportunities and energy needs. The need for appropriate new institutions for licensing reactors, training personnel, and starting up plants follows directly from the size of the market and the economic incentives for the oil exporters to husband gas and oil. Gas and oil resources of the Middle Eastern countries are discussed, and future electricity needs estimated. (author)
Directory of Open Access Journals (Sweden)
R. Eric Heidel
2016-01-01
Full Text Available Statistical power is the ability to detect a significant effect, given that the effect actually exists in a population. Like most statistical concepts, statistical power tends to induce cognitive dissonance in hepatology researchers. However, planning for statistical power by an a priori sample size calculation is of paramount importance when designing a research study. There are five specific empirical components that make up an a priori sample size calculation: the scale of measurement of the outcome, the research design, the magnitude of the effect size, the variance of the effect size, and the sample size. A framework grounded in the phenomenon of isomorphism, or interdependencies amongst different constructs with similar forms, will be presented to understand the isomorphic effects of decisions made on each of the five aforementioned components of statistical power.
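The five components listed above combine into familiar closed-form calculations. As an illustrative sketch for the two-group mean comparison (normal approximation, two-sided test, and a small lookup table in place of a general quantile function):

```python
import math

def two_group_n(d: float, alpha: float = 0.05, power: float = 0.80) -> int:
    """A priori per-group sample size for a two-sided comparison of two
    means with standardized effect size d (normal approximation)."""
    z_lookup = {0.05: 1.959964, 0.01: 2.575829}    # two-sided alpha
    zb_lookup = {0.80: 0.841621, 0.90: 1.281552}   # target power
    z_a, z_b = z_lookup[alpha], zb_lookup[power]
    return math.ceil(2.0 * (z_a + z_b) ** 2 / d ** 2)
```

A medium standardized effect (d = 0.5) at 80% power and two-sided α = 0.05 needs about 63 subjects per group under this approximation (exact t-test calculations give 64), while a small effect (d = 0.2) needs roughly 393 per group, showing how strongly the effect-size component drives the result.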
Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.
Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg
2009-11-01
G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.
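For the correlation case, a normal-approximation sketch based on the Fisher z transformation (an illustrative approximation written for this document, not G*Power's exact routine; the α = 0.05 critical value is hard-coded) is:

```python
import math

def correlation_power(rho: float, n: int) -> float:
    """Approximate power of a two-sided test of H0: rho = 0 at alpha = 0.05,
    using the Fisher z transformation with standard error 1/sqrt(n - 3)."""
    z_alpha = 1.959964                         # two-sided 5% critical value
    ncp = math.atanh(rho) * math.sqrt(n - 3)   # noncentrality on the z scale
    phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return (1.0 - phi(z_alpha - ncp)) + phi(-z_alpha - ncp)
```

Under this approximation, detecting ρ = 0.3 with 80% power needs roughly 84 observations, while ρ = 0.1 tested on 50 observations leaves power near 10%.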
Statistical power of model selection strategies for genome-wide association studies.
Directory of Open Access Journals (Sweden)
Zheyang Wu
2009-07-01
Full Text Available Genome-wide association studies (GWAS) aim to identify genetic variants related to diseases by examining the associations between phenotypes and hundreds of thousands of genotyped markers. Because many genes are potentially involved in common diseases and a large number of markers are analyzed, it is crucial to devise an effective strategy to identify truly associated variants that have individual and/or interactive effects, while controlling false positives at the desired level. Although a number of model selection methods have been proposed in the literature, including marginal search, exhaustive search, and forward search, their relative performance has only been evaluated through limited simulations due to the lack of an analytical approach to calculating the power of these methods. This article develops a novel statistical approach for power calculation, derives accurate formulas for the power of different model selection strategies, and then uses the formulas to evaluate and compare these strategies in genetic model spaces. In contrast to previous studies, our theoretical framework allows for random genotypes, correlations among test statistics, and a false-positive control based on GWAS practice. After the accuracy of our analytical results is validated through simulations, they are utilized to systematically evaluate and compare the performance of these strategies in a wide class of genetic models. For a specific genetic model, our results clearly reveal how different factors, such as effect size, allele frequency, and interaction, jointly affect the statistical power of each strategy. An example is provided for the application of our approach to empirical research. The statistical approach used in our derivations is general and can be employed to address the model selection problems in other random predictor settings. We have developed an R package markerSearchPower to implement our formulas, which can be downloaded from the
Statistical analysis with Excel for dummies
Schmuller, Joseph
2013-01-01
Take the mystery out of statistical terms and put Excel to work! If you need to create and interpret statistics in business or classroom settings, this easy-to-use guide is just what you need. It shows you how to use Excel's powerful tools for statistical analysis, even if you've never taken a course in statistics. Learn the meaning of terms like mean and median, margin of error, standard deviation, and permutations, and discover how to interpret the statistics of everyday life. You'll learn to use Excel formulas, charts, PivotTables, and other tools to make sense of everything from…
International Nuclear Information System (INIS)
Kangas, H.
2001-01-01
The frost in February increased the power demand in Finland significantly. The total power consumption in Finland during January-February 2001 was about 4% higher than a year before. In January 2001 the average temperature in Finland was only about -4 deg C, which is nearly 2 degrees higher than in 2000 and about 6 degrees higher than the long-term average. Power demand in January was slightly less than 7.9 TWh, about 0.5% less than in 2000. The power consumption in Finland during the past 12 months exceeded 79.3 TWh, which is less than 2% higher than during the previous 12 months. In February 2001 the average temperature was -10 deg C, about 5 degrees lower than in February 2000. Because of this, power consumption in February 2001 increased by 5%. Power consumption in February was 7.5 TWh. The maximum hourly output of power plants in Finland was 13310 MW. Power consumption of Finnish households in February 2001 was about 10% higher than in February 2000, while in industry the increase was nearly zero. The utilization rate in the forest industry in February 2001 decreased by 5% from the value of February 2000, to only about 89%. The power consumption of the past 12 months (Feb. 2000 - Feb. 2001) was 79.6 TWh. Generation of hydroelectric power in Finland during January-February 2001 was 10% higher than a year before, at nearly 2.7 TWh, corresponding to 17% of the power demand in Finland. The output of hydroelectric power in Finland during the past 12 months was 14.7 TWh, an increase of 17% from the previous 12 months, corresponding to over 18% of the power demand in Finland. Wind power generation in Jan.-Feb. 2001 slightly exceeded 10 GWh, while in 2000 the corresponding output was 20 GWh. The degree of utilization of Finnish nuclear power plants in Jan.-Feb. 2001 was high. The output of these plants was 3.8 TWh, about 1% less than in Jan.-Feb. 2000. The main cause for the
Dyscalculia, dyslexia, and medical students' needs for learning and using statistics.
MacDougall, Margaret
2009-02-07
Much has been written on the learning needs of dyslexic and dyscalculic students in primary and early secondary education. However, it is not clear that the necessary disability support staff and specialist literature are available to ensure that these needs are being adequately met within the context of learning statistics and general quantitative skills in the self-directed learning environments encountered in higher education. This commentary draws attention to dyslexia and dyscalculia as two potentially unrecognized conditions among undergraduate medical students and in turn, highlights key developments from recent literature in the diagnosis of these conditions. With a view to assisting medical educators meet the needs of dyscalculic learners and the more varied needs of dyslexic learners, a comprehensive list of suggestions is provided as to how learning resources can be designed from the outset to be more inclusive. A hitherto neglected area for future research is also identified through a call for a thorough investigation of the meaning of statistical literacy within the context of the undergraduate medical curriculum.
Statistical Power of Psychological Research: What Have We Gained in 20 Years?
Rossi, Joseph S.
1990-01-01
Calculated power for 6,155 statistical tests in 221 journal articles published in 1982 volumes of "Journal of Abnormal Psychology,""Journal of Consulting and Clinical Psychology," and "Journal of Personality and Social Psychology." Power to detect small, medium, and large effects was .17, .57, and .83, respectively. Concluded that power of…
Development and testing of improved statistical wind power forecasting methods.
Energy Technology Data Exchange (ETDEWEB)
Mendes, J.; Bessa, R.J.; Keko, H.; Sumaili, J.; Miranda, V.; Ferreira, C.; Gama, J.; Botterud, A.; Zhou, Z.; Wang, J. (Decision and Information Sciences); (INESC Porto)
2011-12-06
Wind power forecasting (WPF) provides important inputs to power system operators and electricity market participants. It is therefore not surprising that WPF has attracted increasing interest within the electric power industry. In this report, we document our research on improving statistical WPF algorithms for point, uncertainty, and ramp forecasting. Below, we provide a brief introduction to the research presented in the following chapters. For a detailed overview of the state-of-the-art in wind power forecasting, we refer to [1]. Our related work on the application of WPF in operational decisions is documented in [2]. Point forecasts of wind power are highly dependent on the training criteria used in the statistical algorithms that are used to convert weather forecasts and observational data to a power forecast. In Chapter 2, we explore the application of information theoretic learning (ITL) as opposed to the classical minimum square error (MSE) criterion for point forecasting. In contrast to the MSE criterion, ITL criteria do not assume a Gaussian distribution of the forecasting errors. We investigate to what extent ITL criteria yield better results. In addition, we analyze time-adaptive training algorithms and how they enable WPF algorithms to cope with non-stationary data and, thus, to adapt to new situations without requiring additional offline training of the model. We test the new point forecasting algorithms on two wind farms located in the U.S. Midwest. Although there have been advancements in deterministic WPF, a single-valued forecast cannot provide information on the dispersion of observations around the predicted value. We argue that it is essential to generate, together with (or as an alternative to) point forecasts, a representation of the wind power uncertainty. Wind power uncertainty representation can take the form of probabilistic forecasts (e.g., probability density function, quantiles), risk indices (e.g., prediction risk index) or scenarios
Effect size, confidence intervals and statistical power in psychological research.
Directory of Open Access Journals (Sweden)
Téllez A.
2015-07-01
Full Text Available Quantitative psychological research is focused on detecting the occurrence of certain population phenomena by analyzing data from a sample, and statistics is a particularly helpful mathematical tool that is used by researchers to evaluate hypotheses and make decisions to accept or reject such hypotheses. In this paper, the various statistical tools in psychological research are reviewed. The limitations of null hypothesis significance testing (NHST and the advantages of using effect size and its respective confidence intervals are explained, as the latter two measurements can provide important information about the results of a study. These measurements also can facilitate data interpretation and easily detect trivial effects, enabling researchers to make decisions in a more clinically relevant fashion. Moreover, it is recommended to establish an appropriate sample size by calculating the optimum statistical power at the moment that the research is designed. Psychological journal editors are encouraged to follow APA recommendations strictly and ask authors of original research studies to report the effect size, its confidence intervals, statistical power and, when required, any measure of clinical significance. Additionally, we must account for the teaching of statistics at the graduate level. At that level, students do not receive sufficient information concerning the importance of using different types of effect sizes and their confidence intervals according to the different types of research designs; instead, most of the information is focused on the various tools of NHST.
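The recommendation to report effect sizes with confidence intervals can be made concrete with a small sketch. This is my own illustration, using the large-sample normal-theory interval for Cohen's d rather than the exact noncentral-t interval:

```python
from math import sqrt
from statistics import NormalDist

def cohens_d_ci(mean1, mean2, sd_pooled, n1, n2, alpha=0.05):
    """Cohen's d for two groups with an approximate confidence interval
    based on the Hedges-Olkin large-sample standard error of d."""
    d = (mean1 - mean2) / sd_pooled
    se = sqrt((n1 + n2) / (n1 * n2) + d ** 2 / (2 * (n1 + n2)))
    z = NormalDist().inv_cdf(1 - alpha / 2)
    return d, (d - z * se, d + z * se)

d, (lo, hi) = cohens_d_ci(105.0, 100.0, 10.0, 50, 50)
print(round(d, 2), round(lo, 2), round(hi, 2))  # 0.5 0.1 0.9
```

A wide interval like this one (0.10 to 0.90 for n = 50 per group) is exactly the kind of information NHST alone hides from the reader.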
SIESE - trimestrial bulletin - Synthesis 1995. Electric power summary statistics for Brazil
International Nuclear Information System (INIS)
1995-01-01
This bulletin presents the electric power summary statistics, which cover the performance of the power system for the whole of the utilities in 1995. It offers tables with revised data concerning the last two years based on updated information supplied by both the electric utilities and the SIESE's responsibility centers. 6 figs., 36 tabs
Austin, Peter C
2018-01-01
The use of the Cox proportional hazards regression model is widespread. A key assumption of the model is that of proportional hazards. Analysts frequently test the validity of this assumption using statistical significance testing. However, the statistical power of such assessments is frequently unknown. We used Monte Carlo simulations to estimate the statistical power of two different methods for detecting violations of this assumption. When the covariate was binary, we found that a model-based method had greater power than a method based on cumulative sums of martingale residuals. Furthermore, the parametric nature of the distribution of event times had an impact on power when the covariate was binary. Statistical power to detect a strong violation of the proportional hazards assumption was low to moderate even when the number of observed events was high. In many data sets, power to detect a violation of this assumption is likely to be low to modest.
Directory of Open Access Journals (Sweden)
Chaeyoung Lee
2012-11-01
Full Text Available Epistasis that may explain a large portion of the phenotypic variation for complex economic traits of animals has been ignored in many genetic association studies. A Bayesian method was introduced to draw inferences about multilocus genotypic effects based on their marginal posterior distributions by a Gibbs sampler. A simulation study was conducted to provide statistical powers under various unbalanced designs by using this method. Data were simulated by combined designs of number of loci, within-genotype variance, and sample size in unbalanced designs with or without null combined genotype cells. Mean empirical statistical power was estimated for testing the posterior mean estimate of the combined genotype effect. A practical example for obtaining empirical statistical power estimates with a given sample size was provided under unbalanced designs. The empirical statistical powers would be useful for determining an optimal design when interactive associations of multiple loci with complex phenotypes were examined.
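The notion of empirical statistical power used in this abstract — the rejection rate over repeated simulated data sets — can be sketched generically. The snippet below is a simple two-sample z-test stand-in, not the authors' Gibbs-sampler procedure; the effect size and sample size are illustrative choices:

```python
import random
from math import sqrt
from statistics import NormalDist

def empirical_power(d, n, alpha=0.05, reps=2000, seed=42):
    """Monte Carlo power of a two-sample z-test on means for a
    standardized effect size d with n subjects per group: simulate
    many data sets and count how often the null is rejected."""
    rng = random.Random(seed)
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)
    hits = 0
    for _ in range(reps):
        g1 = [rng.gauss(0.0, 1.0) for _ in range(n)]
        g2 = [rng.gauss(d, 1.0) for _ in range(n)]
        z = (sum(g2) / n - sum(g1) / n) / sqrt(2.0 / n)
        if abs(z) > z_crit:
            hits += 1
    return hits / reps

print(empirical_power(0.5, 64))  # close to the theoretical value of about 0.81
```

The same simulate-and-count scheme extends to arbitrarily unbalanced designs, which is where it earns its keep over closed-form power formulas.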
Wind Power Statistics Sweden 2009; Vindkraftstatistik 2009
Energy Technology Data Exchange (ETDEWEB)
2010-04-15
In 2009, wind power produced 2.5 TWh, an increase of 26 percent over the previous year. Over the period 2003-2009, production of electricity from wind power almost quadrupled. Sweden's total net production of electricity amounted, according to provisional statistics for 2009, to 133.7 TWh. In 2007, wind energy's share passed 1.0 percent of total net electricity production for the first time; in 2008 the proportion was 1.4 percent, and in 2009 almost 1.9 percent. Total installed power in 2009 was 1448 MW and the number of plants was 1359, increases of 363 MW and 198 plants, respectively, from 2008. In 2009, there were three main support systems for wind power in Sweden: the certificate system, the wind pilot projects, and the environmental bonus. The electricity certificate system is a market-based support system for electricity generation from renewables which includes wind power as one of the approved techniques. The system was introduced in 2003 and aims to increase the production of electricity from renewable energy sources by 25 TWh from 2002 levels by 2020. Wind pilot support is market-introduction support for large-scale wind power, aiming to reduce the cost of establishing new wind energy and to promote new technologies. Wind pilot aid, which has existed since 2003, has been extended until 2012 and was increased by 350 million SEK (about 36 million euro) for the period 2008-2012. The environmental bonus, a tax subsidy, was stepped down each year, with 2009 the last year. In 2009, the environmental bonus was 0.12 SEK/kWh for electricity from offshore wind; for onshore wind power the environmental bonus ceased in 2008
Ten-year statistics of the electric power supply. Status and tendencies
International Nuclear Information System (INIS)
2001-12-01
The ten-year statistics of the electric power supply in Denmark for 1991-2000 presents in tables and figures the trend of the electric power supply sector during the last ten years. The tables and figures present information on total energy consumption, combined heat and power generation, fuel consumption and the environment, the technical systems, economy and pricing, organization of the electricity supply, and information on electricity prices and taxes for households and industry in various countries. (LN)
Ten-year statistics of the electric power supply. Status and tendencies
International Nuclear Information System (INIS)
2000-12-01
The ten-year statistics of the electric power supply in Denmark for 1990-1999 presents in tables and figures the trend of the electric power supply sector during the last ten years. The tables and figures present information on total energy consumption, combined heat and power generation, fuel consumption and the environment, the technical systems, economy and pricing, organization of the electricity supply, auto-production of electricity and information on electricity prices and taxes for households and industry in various countries. (LN)
Model output statistics applied to wind power prediction
Energy Technology Data Exchange (ETDEWEB)
Joensen, A; Giebel, G; Landberg, L [Risoe National Lab., Roskilde (Denmark); Madsen, H; Nielsen, H A [The Technical Univ. of Denmark, Dept. of Mathematical Modelling, Lyngby (Denmark)
1999-03-01
Being able to predict the output of a wind farm online for a day or two in advance has significant advantages for utilities, such as better possibility to schedule fossil fuelled power plants and a better position on electricity spot markets. In this paper prediction methods based on Numerical Weather Prediction (NWP) models are considered. The spatial resolution used in NWP models implies that these predictions are not valid locally at a specific wind farm. Furthermore, due to the non-stationary nature and complexity of the processes in the atmosphere, and occasional changes of NWP models, the deviation between the predicted and the measured wind will be time dependent. If observational data is available, and if the deviation between the predictions and the observations exhibits systematic behavior, this should be corrected for; when statistical methods are used for this correction, the approach is usually referred to as MOS (Model Output Statistics). The influence of atmospheric turbulence intensity, topography, prediction horizon length and auto-correlation of wind speed and power is considered, and to take the time-variations into account, adaptive estimation methods are applied. Three estimation techniques are considered and compared: Extended Kalman Filtering, recursive least squares and a new modified recursive least squares algorithm. (au) EU-JOULE-3. 11 refs.
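Of the three estimators compared, recursive least squares is the simplest to sketch. Below is a minimal exponentially forgetting RLS update applied to a MOS-style bias correction; this is my own illustration under assumed toy data (observations following y = 0.5 + 0.8 · forecast), not the paper's actual model:

```python
import numpy as np

def rls_update(theta, P, x, y, lam=0.98):
    """One exponentially forgetting recursive least squares step.
    theta: parameter estimate; P: covariance-like matrix; x: regressor
    vector (e.g. [1, NWP-predicted wind speed]); y: observed power;
    lam < 1 discounts old data so the fit tracks non-stationarity."""
    Px = P @ x
    k = Px / (lam + x @ Px)              # gain vector
    theta = theta + k * (y - x @ theta)  # correct with the prediction error
    P = (P - np.outer(k, Px)) / lam      # discount old information
    return theta, P

# Toy MOS correction: observations follow y = 0.5 + 0.8 * forecast
rng = np.random.default_rng(0)
theta, P = np.zeros(2), 100.0 * np.eye(2)
for _ in range(500):
    f = rng.uniform(0.0, 15.0)                 # NWP forecast value
    y = 0.5 + 0.8 * f + rng.normal(0.0, 0.1)   # measured output
    theta, P = rls_update(theta, P, np.array([1.0, f]), y)
print(theta)  # estimates approach [0.5, 0.8]
```

The forgetting factor is what gives the method the time-adaptivity the abstract emphasizes: with lam = 0.98 the effective memory is roughly 1/(1 − lam) = 50 samples.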
Statistical analysis of human maintenance failures of a nuclear power plant
International Nuclear Information System (INIS)
Pyy, P.
2000-01-01
In this paper, a statistical study of faults caused by maintenance activities is presented. The objective of the study was to draw conclusions on the unplanned effects of maintenance on nuclear power plant safety and system availability. More than 4400 maintenance history reports from the years 1992-1994 of Olkiluoto BWR nuclear power plant (NPP) were analysed together with the maintenance personnel. The human action induced faults were classified, e.g., according to their multiplicity and effects. This paper presents and discusses the results of a statistical analysis of the data. Instrumentation and electrical components are especially prone to human failures. Many human failures were found in safety related systems. Similarly, several failures remained latent from outages to power operation. The safety significance was generally small. Modifications are an important source of multiple human failures. Plant maintenance data is a good source of human reliability data and it should be used more, in future. (orig.)
Space power needs and forecasted technologies for the 1990s and beyond
International Nuclear Information System (INIS)
Buden, D.; Albert, T.
1987-01-01
A new generation of reactors for electric power will be available for space missions to satisfy military and civilian needs in the 1990s and beyond. To ensure a useful product, nuclear power plant development must be cognizant of other space power technologies. Major advances in solar and chemical technologies need to be considered in establishing the goals of future nuclear power plants. In addition, the mission needs are evolving into new regimes. Civilian and military power needs are forecasted to exceed anything used in space to date. Technology trend forecasts have been mapped as a function of time for solar, nuclear, chemical, and storage systems to illustrate areas where each technology provides minimum mass. Other system characteristics may dominate the usefulness of a technology on a given mission. This paper will discuss some of these factors, as well as forecast future military and civilian power needs and the status of technologies for the 1990s and 2000s. 6 references
Dagna Kocur; Eugenia Mandal
2017-01-01
Background The purpose of the study was to examine the phenomenon of power within an organisation from the vantage point of gender, the occupied position, earnings, and the number of subordinates. Participants and procedure The sample group comprised 107 female and 98 male participants. The mean age was 42.14 years (SD = 11.73). The study covered 100 superiors and 105 subordinates. The research tools were: the Need for Power and Influence Questionnaire (Bennett, 1988), th...
Site acceptability and power availability: needed institutional changes
International Nuclear Information System (INIS)
Haggard, J.E.
1975-01-01
Timely assurance of power plant site availability is threatened by institutional inabilities to resolve often competing environmental/energy requirements. Institutional changes are needed. The issue of site approval should be separated from that of plant approval. A ''one-stop'' forum for site approval, modeled after Washington State's Thermal Power Plant Siting Act, is needed. The one-stop process utilizes one forum composed of officials drawn from all agencies involved in site related issues. A joint Federal/State Siting Council, with sole jurisdiction over site approval, is recommended. The State Council would have a determinative vote on all issues not otherwise preempted by federal legislation. 21 references. (U.S.)
Renewable Energy Resources: Solutions to Nigeria power and energy needs
International Nuclear Information System (INIS)
Ladan-Haruna, A.
2011-01-01
Power and energy, particularly electricity, remain the pivot of the economic and social development of any country. In view of this fact, research on how renewable energy resources can meet Nigeria's power and energy needs was carried out. It identified main issues, such as inconsistent government policies, corruption and lack of funding, hindering the development of the renewable and power sectors for sustainable energy supply. The capacity of alternative energy resources and technologies [hydropower, wind power, biomass, photovoltaic (solar) and geothermal power] to solve the Nigerian energy crisis cannot be over-emphasized, as some countries of the world that have no petroleum resources utilize other alternatives to meet their power and energy requirements. This paper reviews the prospects, challenges and solutions to Nigeria's energy needs using renewable sources for development, as this boosts industrialization and creates job opportunities
Effect size and statistical power in the rodent fear conditioning literature - A systematic review.
Carneiro, Clarissa F D; Moulin, Thiago C; Macleod, Malcolm R; Amaral, Olavo B
2018-01-01
Proposals to increase research reproducibility frequently call for focusing on effect sizes instead of p values, as well as for increasing the statistical power of experiments. However, it is unclear to what extent these two concepts are indeed taken into account in basic biomedical science. To study this in a real-case scenario, we performed a systematic review of effect sizes and statistical power in studies on learning of rodent fear conditioning, a widely used behavioral task to evaluate memory. Our search criteria yielded 410 experiments comparing control and treated groups in 122 articles. Interventions had a mean effect size of 29.5%, and amnesia caused by memory-impairing interventions was nearly always partial. Mean statistical power to detect the average effect size observed in well-powered experiments with significant differences (37.2%) was 65%, and was lower among studies with non-significant results. Only one article reported a sample size calculation, and our estimated sample size to achieve 80% power considering typical effect sizes and variances (15 animals per group) was reached in only 12.2% of experiments. Actual effect sizes correlated with effect size inferences made by readers on the basis of textual descriptions of results only when findings were non-significant, and neither effect size nor power correlated with study quality indicators, number of citations or impact factor of the publishing journal. In summary, effect sizes and statistical power have a wide distribution in the rodent fear conditioning literature, but do not seem to have a large influence on how results are described or cited. Failure to take these concepts into consideration might limit attempts to improve reproducibility in this field of science.
[Effect sizes, statistical power and sample sizes in "the Japanese Journal of Psychology"].
Suzukawa, Yumi; Toyoda, Hideki
2012-04-01
This study analyzed the statistical power of research studies published in the "Japanese Journal of Psychology" in 2008 and 2009. Sample effect sizes and sample statistical powers were calculated for each statistical test and analyzed with respect to the analytical methods and the fields of the studies. The results show that in the fields like perception, cognition or learning, the effect sizes were relatively large, although the sample sizes were small. At the same time, because of the small sample sizes, some meaningful effects could not be detected. In the other fields, because of the large sample sizes, meaningless effects could be detected. This implies that researchers who could not get large enough effect sizes would use larger samples to obtain significant results.
Chang, Xiaoyen Y.; Sewell, Thomas D.; Raff, Lionel M.; Thompson, Donald L.
1992-11-01
The possibility of utilizing different types of power spectra obtained from classical trajectories as a diagnostic tool to identify the presence of nonstatistical dynamics is explored by using the unimolecular bond-fission reactions of 1,2-difluoroethane and the 2-chloroethyl radical as test cases. In previous studies, the reaction rates for these systems were calculated by using a variational transition-state theory and classical trajectory methods. A comparison of the results showed that 1,2-difluoroethane is a nonstatistical system, while the 2-chloroethyl radical behaves statistically. Power spectra for these two systems have been generated under various conditions. The characteristics of these spectra are as follows: (1) The spectra for the 2-chloroethyl radical are always broader and more coupled to other modes than is the case for 1,2-difluoroethane. This is true even at very low levels of excitation. (2) When an internal energy near or above the dissociation threshold is initially partitioned into a local C-H stretching mode, the power spectra for 1,2-difluoroethane broaden somewhat, but discrete and somewhat isolated bands are still clearly evident. In contrast, the analogous power spectra for the 2-chloroethyl radical exhibit a near complete absence of isolated bands. The general appearance of the spectrum suggests a very high level of mode-to-mode coupling, large intramolecular vibrational energy redistribution (IVR) rates, and global statistical behavior. (3) The appearance of the power spectrum for the 2-chloroethyl radical is unaltered regardless of whether the initial C-H excitation is in the CH2 or the CH2Cl group. This result also suggests statistical behavior. These results are interpreted to mean that power spectra may be used as a diagnostic tool to assess the statistical character of a system. The presence of a diffuse spectrum exhibiting a nearly complete loss of isolated structures indicates that the dissociation dynamics of the molecule will
International Nuclear Information System (INIS)
Anon.
1985-01-01
SA will have to build more nuclear power stations over the next 30 years if the changeover from coal-fired stations is to be made successfully. There will have to be substantial growth in nuclear power. If new nuclear power stations are to be built, it is likely they will be on the coast. Studies of the existing and projected population density of the area and of the infrastructure have to be done. The next nuclear power station is likely to use a light-water-moderated and -cooled fission reactor. The present situation at the Koeberg nuclear power plant is also discussed
Statistical algorithm for automated signature analysis of power spectral density data
International Nuclear Information System (INIS)
Piety, K.R.
1977-01-01
A statistical algorithm has been developed and implemented on a minicomputer system for on-line, surveillance applications. Power spectral density (PSD) measurements on process signals are the performance signatures that characterize the ''health'' of the monitored equipment. Statistical methods provide a quantitative basis for automating the detection of anomalous conditions. The surveillance algorithm has been tested on signals from neutron sensors, proximeter probes, and accelerometers to determine its potential for monitoring nuclear reactors and rotating machinery
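The algorithm as described — a PSD "signature" baseline plus a statistical test for anomalous deviation — might be sketched as follows. This is a generic reconstruction, not the implementation from the report; the Welch estimator, the simulated signals, and the 5-sigma threshold are all assumptions:

```python
import numpy as np
from scipy.signal import welch

def psd_signature(x, fs):
    """Welch estimate of the power spectral density of a process signal."""
    return welch(x, fs=fs, nperseg=256)

# Baseline signature: PSD mean and spread over repeated "healthy" records
rng = np.random.default_rng(1)
fs, n = 1000.0, 4096
healthy = np.array([psd_signature(rng.normal(0, 1, n), fs)[1] for _ in range(20)])
mu, sd = healthy.mean(axis=0), healthy.std(axis=0) + 1e-12

# Surveillance: flag bands whose PSD deviates strongly from the baseline
t = np.arange(n) / fs
f, pxx = psd_signature(rng.normal(0, 1, n) + 2 * np.sin(2 * np.pi * 50 * t), fs)
score = (pxx - mu) / sd                  # standardized deviation per band
flagged = f[score > 5.0]                 # anomalous frequency bands
print(flagged)                           # clusters near the injected 50 Hz tone
```

The key idea matches the abstract: the statistical baseline turns "the spectrum looks different" into a quantitative, automatable detection criterion.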
A Note on Comparing the Power of Test Statistics at Low Significance Levels.
Morris, Nathan; Elston, Robert
2011-01-01
It is an obvious fact that the power of a test statistic is dependent upon the significance (alpha) level at which the test is performed. It is perhaps a less obvious fact that the relative performance of two statistics in terms of power is also a function of the alpha level. Through numerous personal discussions, we have noted that even some competent statisticians have the mistaken intuition that relative power comparisons at traditional levels such as α = 0.05 will be roughly similar to relative power comparisons at very low levels, such as the level α = 5 × 10⁻⁸, which is commonly used in genome-wide association studies. In this brief note, we demonstrate that this notion is in fact quite wrong, especially with respect to comparing tests with differing degrees of freedom. In fact, at very low alpha levels the cost of additional degrees of freedom is often comparatively low. Thus we recommend that statisticians exercise caution when interpreting the results of power comparison studies which use alpha levels that will not be used in practice.
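The alpha-dependence of such comparisons is easy to check numerically. A sketch using SciPy's noncentral chi-square distribution follows; the noncentrality value 25 is just an illustrative choice, not taken from the note:

```python
from scipy.stats import chi2, ncx2

def chi2_power(nc, df, alpha):
    """Power of a chi-square test with df degrees of freedom and
    noncentrality nc at significance level alpha: the probability
    that the noncentral statistic exceeds the central critical value."""
    return ncx2.sf(chi2.isf(alpha, df), df, nc)

# Compare a 1-df and a 2-df test at a traditional and a
# genome-wide significance level, holding noncentrality fixed
for alpha in (0.05, 5e-8):
    p1, p2 = chi2_power(25.0, 1, alpha), chi2_power(25.0, 2, alpha)
    print(f"alpha={alpha:g}: 1 df {p1:.3f}, 2 df {p2:.3f}")
```

As a sanity check, a 1-df test with noncentrality (1.96 + 0.8416)² ≈ 7.849 has power 0.80 at α = 0.05, the textbook benchmark.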
DEFF Research Database (Denmark)
Jones, Allan; Sommerlund, Bo
2007-01-01
The uses of null hypothesis significance testing (NHST) and statistical power analysis within psychological research are critically discussed. The article looks at the problems of relying solely on NHST when dealing with small and large sample sizes. The use of power analysis in estimating the potential error introduced by small and large samples is advocated. Power analysis is not recommended as a replacement for NHST but as an additional source of information about the phenomena under investigation. Moreover, the importance of conceptual analysis in relation to statistical analysis of hypothesis…
Energy Technology Data Exchange (ETDEWEB)
Pestana, Rui [Rede Electrica Nacional (REN), S.A., Lisboa (Portugal). Dept. Systems and Development System Operator; Trancoso, Ana Rosa; Delgado Domingos, Jose [Univ. Tecnica de Lisboa (Portugal). Seccao de Ambiente e Energia
2012-07-01
Accurate wind power forecasts are needed to reduce integration costs in the electric grid caused by wind's inherent variability. Currently, Portugal has a significant wind power penetration level and consequently needs reliable wind power forecasts at different temporal scales, including for localized events such as ramps. This paper provides an overview of the methodologies used by REN to forecast wind power at the national level, based on statistical and probabilistic combinations of NWP and measured data, with the aim of improving the accuracy of pure NWP. Results show that significant improvement can be achieved with statistical combination with persistence in the short term and with probabilistic combination in the medium term. NWP models are also able to detect ramp events with 3 days' notice for operational planning. (orig.)
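One simple way to realize such a statistical combination is an ordinary least-squares blend of the NWP and persistence forecasts fitted on historical data. This is only a generic sketch of the idea; REN's actual combination scheme is not described here in enough detail to reproduce:

```python
import numpy as np

def combine_forecasts(nwp_hist, persist_hist, actual_hist):
    """Fit blending weights w minimizing ||actual - [nwp, persist] @ w||
    over a historical window of matched forecasts and observations."""
    X = np.column_stack([nwp_hist, persist_hist])
    w, *_ = np.linalg.lstsq(X, actual_hist, rcond=None)
    return w

# A new combined forecast is then:
#   combined = w[0] * nwp_new + w[1] * persist_new
```

In the short term the persistence weight would typically dominate, with the NWP weight growing as the horizon lengthens, which matches the qualitative result reported above.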
Effect size and statistical power in the rodent fear conditioning literature – A systematic review
Macleod, Malcolm R.
2018-01-01
Proposals to increase research reproducibility frequently call for focusing on effect sizes instead of p values, as well as for increasing the statistical power of experiments. However, it is unclear to what extent these two concepts are indeed taken into account in basic biomedical science. To study this in a real-case scenario, we performed a systematic review of effect sizes and statistical power in studies on learning of rodent fear conditioning, a widely used behavioral task to evaluate memory. Our search criteria yielded 410 experiments comparing control and treated groups in 122 articles. Interventions had a mean effect size of 29.5%, and amnesia caused by memory-impairing interventions was nearly always partial. Mean statistical power to detect the average effect size observed in well-powered experiments with significant differences (37.2%) was 65%, and was lower among studies with non-significant results. Only one article reported a sample size calculation, and our estimated sample size to achieve 80% power considering typical effect sizes and variances (15 animals per group) was reached in only 12.2% of experiments. Actual effect sizes correlated with effect size inferences made by readers on the basis of textual descriptions of results only when findings were non-significant, and neither effect size nor power correlated with study quality indicators, number of citations or impact factor of the publishing journal. In summary, effect sizes and statistical power have a wide distribution in the rodent fear conditioning literature, but do not seem to have a large influence on how results are described or cited. Failure to take these concepts into consideration might limit attempts to improve reproducibility in this field of science. PMID:29698451
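The kind of sample size estimate mentioned above can be approximated with a standard normal-approximation formula for a two-sample comparison of means. The function below is a generic sketch, not the authors' actual calculation, and the default alpha and power values are conventional choices:

```python
from math import ceil
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Normal-approximation sample size per group for a two-sample
    comparison of means, given a standardized effect size (Cohen's d):
        n = 2 * (z_{1-alpha/2} + z_{power})^2 / d^2
    """
    z = NormalDist().inv_cdf
    z_a = z(1 - alpha / 2)
    z_b = z(power)
    return ceil(2 * (z_a + z_b) ** 2 / effect_size ** 2)
```

For example, a standardized effect of d = 1.0 needs roughly 16 animals per group at 80% power, while d = 0.5 needs about 63, which illustrates how quickly requirements grow for the partial effects typical of this literature.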
Jeffrey P. Prestemon
2009-01-01
Timber product markets are subject to large shocks deriving from natural disturbances and policy shifts. Statistical modeling of shocks is often done to assess their economic importance. In this article, I simulate the statistical power of univariate and bivariate methods of shock detection using time series intervention models. Simulations show that bivariate methods...
Determinants of Judgments of Explanatory Power: Credibility, Generality, and Statistical Relevance
Colombo, Matteo; Bucher, Leandra; Sprenger, Jan
2017-01-01
Explanation is a central concept in human psychology. Drawing upon philosophical theories of explanation, psychologists have recently begun to examine the relationship between explanation, probability and causality. Our study advances this growing literature at the intersection of psychology and philosophy of science by systematically investigating how judgments of explanatory power are affected by (i) the prior credibility of an explanatory hypothesis, (ii) the causal framing of the hypothesis, (iii) the perceived generalizability of the explanation, and (iv) the relation of statistical relevance between hypothesis and evidence. Collectively, the results of our five experiments support the hypothesis that the prior credibility of a causal explanation plays a central role in explanatory reasoning: first, because of the presence of strong main effects on judgments of explanatory power, and second, because of the gate-keeping role it has for other factors. Highly credible explanations are not susceptible to causal framing effects, but they are sensitive to the effects of normatively relevant factors: the generalizability of an explanation, and its statistical relevance for the evidence. These results advance current literature in the philosophy and psychology of explanation in three ways. First, they yield a more nuanced understanding of the determinants of judgments of explanatory power, and the interaction between these factors. Second, they show the close relationship between prior beliefs and explanatory power. Third, they elucidate the nature of abductive reasoning. PMID:28928679
Statistical tests for power-law cross-correlated processes
Podobnik, Boris; Jiang, Zhi-Qiang; Zhou, Wei-Xing; Stanley, H. Eugene
2011-12-01
For stationary time series, the cross-covariance and the cross-correlation as functions of time lag n serve to quantify the similarity of two time series. The latter measure is also used to assess whether the cross-correlations are statistically significant. For nonstationary time series, the analogous measures are detrended cross-correlation analysis (DCCA) and the recently proposed detrended cross-correlation coefficient, ρDCCA(T,n), where T is the total length of the time series and n the window size. For ρDCCA(T,n), the Cauchy inequality −1 ≤ ρDCCA(T,n) ≤ 1 was previously verified numerically. Here we derive −1 ≤ ρDCCA(T,n) ≤ 1 for a standard variance-covariance approach and for a detrending approach. For overlapping windows, we find the range of ρDCCA within which the cross-correlations become statistically significant. For overlapping windows we numerically determine, and for nonoverlapping windows we derive, that the standard deviation of ρDCCA(T,n) tends to 1/T with increasing T. Using ρDCCA(T,n) we show that the Chinese financial market's tendency to follow the U.S. market is extremely weak. We also propose an additional statistical test that can be used to quantify the existence of cross-correlations between two power-law correlated time series.
A Statistical Method for Aggregated Wind Power Plants to Provide Secondary Frequency Control
DEFF Research Database (Denmark)
Hu, Junjie; Ziras, Charalampos; Bindner, Henrik W.
2017-01-01
The increasing penetration of wind power brings significant challenges to power system operators due to the wind's inherent uncertainty and variability. Traditionally, power plants and more recently demand response have been used to balance the power system. However, the use of wind power as a balancing-power source has also been investigated, especially for wind power dominated power systems such as Denmark. The main drawback is that wind power must be curtailed by setting a lower operating point, in order to offer upward regulation. We propose a statistical approach to reduce wind power curtailment for aggregated wind power plants providing secondary frequency control (SFC) to the power system. By using historical SFC signals and wind speed data, we calculate metrics for the reserve provision error as a function of the scheduled wind power. We show that wind curtailment can be significantly reduced...
The Power Makers' Challenge And the Need for Fission Energy
Nicholson, Martin
2012-01-01
The Power Makers - the producers of our electricity - must meet the demands of their customers while also addressing the threat of climate change. There are widely differing views about solutions to electricity generation in an emission-constrained world. Some see the problem as relatively straightforward, requiring deep cuts in emissions now by improving energy efficiency, energy conservation and using only renewable resources. Many electricity industry engineers and scientists see the problem as being much more involved. The Power Makers' Challenge: And the Need for Fission Energy looks at why using only conventional renewable energy sources is not quite as simple as it seems. Following a general introduction to electricity and its distribution, the author quantifies the reductions needed in greenhouse gas emissions from the power sector in the face of ever-increasing world demands for electricity. It provides some much needed background on the many energy sources available for producing electricity ...
Directory of Open Access Journals (Sweden)
Karien Jooste
2017-10-01
Full Text Available Motivation is a process that influences and directs behaviour in order to satisfy a need. It links with goal 3 of the sustainable development goals, which focuses on ensuring healthy lives and promoting well-being at all ages. Motivation of nurses is important in the primary health care environment of, for instance, mine settings, since low levels of motivation among Primary Health Care (PHC) nurses could have a negative effect on the achievement of high standards in health service delivery. The study was conducted within the theoretical framework of McClelland's Acquired Motivation Theory, which consists of three basic needs: the need for achievement, the need for power, and the need for affiliation. One of the research questions posed was "What are the motivational needs of PHC nurses to acquire power in the workplace at mine clinic settings?" A quantitative, explorative, descriptive design was followed. The accessible population in this study was PHC nurses (N = 30) working at 13 mine clinics, who also served as the total sample. A 7-point Likert scale was used in a self-administered structured questionnaire that was developed from a literature review. Ethical considerations were adhered to and respondents gave written informed consent. Data were analysed by using descriptive and inferential statistics. The Mann-Whitney test compared the mean ranks, and a p-value of p < 0.05 was indicative of a significant difference between male and female groups. Validity and reliability principles were applied during the entire research process. The results indicated that PHC nurses needed acknowledgement, organisational responsibility, strategic planning and promotion, as well as support. Significant differences between genders were not found in relation to the need to acquire power.
Magnetic storm effects in electric power systems and prediction needs
Albertson, V. D.; Kappenman, J. G.
1979-01-01
Geomagnetic field fluctuations produce spurious currents in electric power systems. These currents enter and exit through points remote from each other. The fundamental period of these currents is on the order of several minutes which is quasi-dc compared to the normal 60 Hz or 50 Hz power system frequency. Nearly all of the power systems problems caused by the geomagnetically induced currents result from the half-cycle saturation of power transformers due to simultaneous ac and dc excitation. The effects produced in power systems are presented, current research activity is discussed, and magnetic storm prediction needs of the power industry are listed.
Information needs in nuclear power plants during low power operation modes
Energy Technology Data Exchange (ETDEWEB)
Tommila, Teemu; Fantoni, Paolo F.; Zander, Ralf M.
1998-02-01
During the past few years, increasing attention has been paid to the safety of shutdown and refuelling operations. It has turned out that the risks during shutdown may be comparable to the risks of power operation. The goal of this report is to identify information requirements related to low power operating modes of nuclear power plants. These include, for example, warm and cold shutdowns, refuelling and maintenance, as well as related state transitions such as start-up and shut-down. The focus of the report is on planned refuelling outages and the role of the control room in managing the outage activities. As a starting point, the basic terminology and characteristics of low power operation are discussed. The current situation at nuclear power plants and some recent developments in information technology are reviewed. End-users' requirements and enabling technologies are combined in order to identify the opportunities for new information technology tools in low power operation. The required features of process control systems and maintenance information systems are described. Common plant modelling techniques, open software architectures and functional structuring of the process control system are suggested to be the key issues in the long-term development of operator support systems. On a shorter time scale, new tools solving limited practical problems should be developed and evaluated. This would provide a basis for the features needed for low power operation, including, for example, outage planning, on-line risk monitoring, management of outage tasks, adaptive alarm handling, computerised procedures and task-oriented human interfaces. (author)
Estimation of the electrical power needed for LHC magnets and radiofrequency at 7 TeV
Thiesen, H; Burnet, J P
2012-01-01
The purpose of this paper is to provide the electrical power needed from the grid for the power converters feeding the magnets (superconducting, warm and experiments) and the radiofrequency of the LHC. At 4 TeV, the active power required for the magnets is 17.6 MW, and the estimate is 25.5 MW at 7 TeV. The active power needed for the radiofrequency depends on the beam intensity and on the bunch spacing; it will grow from 7 MW to 10 MW with 25 ns bunch spacing operation. This does not include the power needed for the cryogenic and magnet auxiliary systems. This paper also gives the instantaneous profile of the power needed from the grid during the ramp and the reactive power which needs to be compensated by the static VAR compensators.
Air-chemistry "turbulence": power-law scaling and statistical regularity
Directory of Open Access Journals (Sweden)
H.-m. Hsu
2011-08-01
Full Text Available With the intent to gain further knowledge on the spectral structures and statistical regularities of surface atmospheric chemistry, the chemical gases (NO, NO₂, NOₓ, CO, SO₂ and O₃) and aerosol (PM₁₀) measured at 74 air quality monitoring stations over the island of Taiwan are analyzed for the year 2004 at hourly resolution. They represent a range of surface air quality with a mixed combination of geographic settings, and include urban/rural, coastal/inland, plain/hill, and industrial/agricultural locations. In addition to the well-known semi-diurnal and diurnal oscillations, weekly and intermediate (20~30 days) peaks are also identified with the continuous wavelet transform (CWT). The spectra indicate power-law scaling regions for the frequencies higher than the diurnal and those lower than the diurnal, with average exponents of −5/3 and −1, respectively. These dual exponents are corroborated by those from the detrended fluctuation analysis in the corresponding time-lag regions. These exponents are mostly independent of the averages and standard deviations of time series measured at various geographic settings, i.e., the spatial inhomogeneities. In other words, they possess dominant universal structures. After spectral coefficients from the CWT decomposition are grouped according to the spectral bands and inverted separately, the PDFs of the reconstructed time series for the high-frequency band consistently demonstrate an interesting statistical regularity: −3 power-law scaling in the heavy tails. Such spectral peaks, dual-exponent structures, and power-law scaling in heavy tails are important structural information, but their relations to turbulence and mesoscale variability require further investigation. This could lead to a better understanding of the processes controlling air quality.
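The detrended fluctuation analysis used to corroborate the spectral exponents can be sketched in a few lines. This is a generic first-order DFA implementation for illustration; the window sizes and the scaling-exponent fit are assumptions, not the authors' exact procedure:

```python
import numpy as np

def dfa_fluctuation(x, window_sizes):
    """First-order DFA: RMS fluctuation F(n) of the integrated series
    around per-window linear trends, for each window size n."""
    y = np.cumsum(x - np.mean(x))              # integrated (profile) series
    F = []
    for n in window_sizes:
        n_win = len(y) // n
        sq_resid = []
        for i in range(n_win):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            coef = np.polyfit(t, seg, 1)       # linear detrend per window
            sq_resid.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(sq_resid)))
    return np.array(F)
```

The scaling exponent is then the slope of log F(n) versus log n; for uncorrelated noise it is close to 0.5, while power-law correlated series deviate from that value.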
Statistical measurement of power spectrum density of large aperture optical component
International Nuclear Information System (INIS)
Xu Jiancheng; Xu Qiao; Chai Liqun
2010-01-01
According to the requirement of ICF, a method based on statistical theory has been proposed to measure the power spectrum density (PSD) of large aperture optical components. The method breaks the large-aperture wavefront into small regions, and obtains the PSD of the large-aperture wavefront by weighted averaging of the PSDs of the regions, where the weight factor is each region's area. Simulation and experiment demonstrate the effectiveness of the proposed method. They also show that, the obtained PSDs of the large-aperture wavefront by statistical method and sub-aperture stitching method fit well, when the number of small regions is no less than 8 x 8. The statistical method is not sensitive to translation stage's errors and environment instabilities, thus it is appropriate for PSD measurement during the process of optical fabrication. (authors)
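The area-weighted averaging at the heart of the proposed method can be sketched as follows, assuming all sub-region PSDs are sampled on a common spatial-frequency grid (the function name and interface are illustrative, not taken from the paper):

```python
import numpy as np

def weighted_psd(region_psds, region_areas):
    """Area-weighted average of per-region PSDs, all sampled on the
    same spatial-frequency grid: PSD = sum_i w_i * PSD_i, w_i ∝ area_i."""
    psds = np.asarray(region_psds, dtype=float)   # shape (k_regions, n_freqs)
    w = np.asarray(region_areas, dtype=float)
    w = w / w.sum()                               # normalize the area weights
    return np.tensordot(w, psds, axes=1)
```

Equal-area regions reduce this to a plain mean of the regional PSDs, consistent with the reported result that 8 x 8 or finer subdivision reproduces the sub-aperture stitching measurement well.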
Demographic statistics pertaining to nuclear power reactor sites
International Nuclear Information System (INIS)
1979-10-01
Population statistics are presented for 145 nuclear power plant sites. Summary tables and figures are included that were developed to aid in the evaluation of trends and general patterns associated with the various parameters of interest, such as the proximity of nuclear plant sites to centers of population. The primary reason for publishing this information at this time is to provide a factual basis for use in discussions on the subject of reactor siting policy. The report is a revised and updated version of a draft report published in December 1977. Errors in the population data base have been corrected and new data tabulations added.
Energy and the need for nuclear power
International Nuclear Information System (INIS)
1982-11-01
The subject is discussed under the headings: fuel and mankind (world population estimates); fuel supply and demand (world nuclear and total primary energy demand forecasts); oil dependence; oil, gas and coal (world oil production and consumption; world coal reserves); nuclear option (consumption of nuclear energy in Western Europe; nuclear plant worldwide at December 1981; uranium reserves 1981); renewable resources; price of energy; Britain's need for nuclear power. (U.K.)
Who Needs Statistics? | Poster
You may know the feeling. You have collected a lot of new data on an important experiment. Now you are faced with multiple groups of data, a sea of numbers, and a deadline for submitting your paper to a peer-reviewed journal. And you are not sure which data are relevant, or even the best way to present them. The statisticians at Data Management Services (DMS) know how to help. This small group of experts provides a wide array of statistical and mathematical consulting services to the scientific community at NCI at Frederick and NCI-Bethesda.
Future radioisotope power needs for missions to the solar system
International Nuclear Information System (INIS)
Mondt, J.F.; Underwood, M.L.; Nesmith, B.J.
1997-01-01
NASA and DOE plan a cooperative team effort with industry, government laboratories and universities to develop a near-term, low-cost, low-power (100 watt electric class), low-mass (<10 kg) advanced radioisotope space power source (ARPS) and, in the process, reduce the plutonium-related costs as well. The near-term work is focused on developing an advanced energy converter to use with the General Purpose Heat Source (GPHS). The GPHS was developed and used for the current radioisotope thermoelectric generators (RTGs). Advanced energy converter technologies are needed as a more efficient replacement for the existing thermoelectric converters so that the mass and cost of space radioisotope power sources can be reduced. A more advanced space radioisotope power system program is also planned that addresses a longer-term need. Twenty-first-century robotic scientific information missions to the outer planets and beyond are planned to be accomplished with microspacecraft, which may demand safe, even more compact, lower-power, lower-mass radioisotope power sources than those which can be achieved as a result of the near-term efforts. The longer-term program focuses not only on converter technology but also on lower-power, more compact radioisotope heat source technology and smaller, lower-mass radioisotope heater units for second-generation microspacecraft. This more ambitious, longer-time-horizon work necessarily occurs at this time at the technology R and D level rather than at the system technology level.
International Nuclear Information System (INIS)
Oelgaard, P.L.
1986-06-01
In this report an attempt is made to collect literature data on nuclear power production and to present them in graphical form. Data are given not only for 1985 but for a number of years, so that the trends in the development of nuclear power can be seen. The global capacity of nuclear power plants in operation, and of those in operation, under construction, or on order, is considered. Further, the average capacity factor for nuclear plants of a specific type and for various geographical areas is given. The contribution of nuclear power to the total electricity production is considered for a number of countries and areas. Finally, the accumulated years of commercial operation for the various reactor types up to the end of 1985 are presented. (author)
Simulations and cosmological inference: A statistical model for power spectra means and covariances
International Nuclear Information System (INIS)
Schneider, Michael D.; Knox, Lloyd; Habib, Salman; Heitmann, Katrin; Higdon, David; Nakhleh, Charles
2008-01-01
We describe an approximate statistical model for the sample variance distribution of the nonlinear matter power spectrum that can be calibrated from limited numbers of simulations. Our model retains the common assumption of a multivariate normal distribution for the power spectrum band powers but takes full account of the (parameter-dependent) power spectrum covariance. The model is calibrated using an extension of the framework in Habib et al. (2007) to train Gaussian processes for the power spectrum mean and covariance given a set of simulation runs over a hypercube in parameter space. We demonstrate the performance of this machinery by estimating the parameters of a power-law model for the power spectrum. Within this framework, our calibrated sample variance distribution is robust to errors in the estimated covariance and shows rapid convergence of the posterior parameter constraints with the number of training simulations.
Nordel - Availability statistics for thermal power plants 1995. (Denmark, Finland, Sweden)
International Nuclear Information System (INIS)
1996-01-01
The power companies of Denmark, Finland and Sweden have agreed on almost identical procedures for the recording and analysing of data describing the availability of power producing units over a certain capacity. Since 1975 the data for all three countries have been summarized and published in a joint report. The purpose of this report is to present some basic information about the operation of power producing units in the three countries. Referring to the report, companies or bodies will be able to exchange more detailed information with other companies or bodies in any of the countries. The report includes power producing units using fossil fuels, nuclear power plants and gas turbines. The information is presented separately for each country with a joint NORDEL statistics for units using fossil fuels, arranged in separate groups according to the type of fossil fuel which is used. The grouping of power producing units into classes of capacity has been made in accordance with the classification adopted by UNIPEDE/WEC. The definitions in NORDEL's 'Tillgaenglighetsbegrepp foer vaermekraft' ('The Concept of Availability for Thermal Power'), September 1977, are used in this report. The basic data for the availability are in accordance with the recommendations of UNIPEDE/WEC. (author)
A Statistical Model for Uplink Intercell Interference with Power Adaptation and Greedy Scheduling
Tabassum, Hina
2012-10-03
This paper deals with the statistical modeling of uplink inter-cell interference (ICI) considering greedy scheduling with power adaptation based on channel conditions. The derived model is implicitly generalized for any kind of shadowing and fading environments. More precisely, we develop a generic model for the distribution of ICI based on the locations of the allocated users and their transmit powers. The derived model is utilized to evaluate important network performance metrics such as ergodic capacity, average fairness and average power preservation numerically. Monte-Carlo simulation details are included to support the analysis and show the accuracy of the derived expressions. In parallel to the literature, we show that greedy scheduling with power adaptation reduces the ICI, average power consumption of users, and enhances the average fairness among users, compared to the case without power adaptation. © 2012 IEEE.
A multivariate statistical study on a diversified data gathering system for nuclear power plants
International Nuclear Information System (INIS)
Samanta, P.K.; Teichmann, T.; Levine, M.M.; Kato, W.Y.
1989-02-01
In this report, multivariate statistical methods are presented and applied to demonstrate their use in analyzing nuclear power plant operational data. For analyses of nuclear power plant events, approaches are presented for detecting malfunctions and degradations within the course of the event. At the system level, approaches are investigated as a means of diagnosis of system level performance. This involves the detection of deviations from normal performance of the system. The input data analyzed are the measurable physical parameters, such as steam generator level, pressurizer water level, auxiliary feedwater flow, etc. The study provides the methodology and illustrative examples based on data gathered from simulation of nuclear power plant transients and computer simulation of a plant system performance (due to lack of easily accessible operational data). Such an approach, once fully developed, can be used to explore statistically the detection of failure trends and patterns and prevention of conditions with serious safety implications. 33 refs., 18 figs., 9 tabs
In vivo Comet assay--statistical analysis and power calculations of mice testicular cells.
Hansen, Merete Kjær; Sharma, Anoop Kumar; Dybdahl, Marianne; Boberg, Julie; Kulahci, Murat
2014-11-01
The in vivo Comet assay is a sensitive method for evaluating DNA damage. A recurrent concern is how to analyze the data appropriately and efficiently. A popular approach is to summarize the raw data into a summary statistic prior to the statistical analysis. However, consensus on which summary statistic to use has yet to be reached. Another important consideration concerns the assessment of proper sample sizes in the design of Comet assay studies. This study aims to identify a statistic suitably summarizing the % tail DNA of mice testicular samples in Comet assay studies. A second aim is to provide curves for this statistic outlining the number of animals and gels to use. The current study was based on 11 compounds administered via oral gavage in three doses to male mice: CAS no. 110-26-9, CAS no. 512-56-1, CAS no. 111873-33-7, CAS no. 79-94-7, CAS no. 115-96-8, CAS no. 598-55-0, CAS no. 636-97-5, CAS no. 85-28-9, CAS no. 13674-87-8, CAS no. 43100-38-5 and CAS no. 60965-26-6. Testicular cells were examined using the alkaline version of the Comet assay and the DNA damage was quantified as % tail DNA using a fully automatic scoring system. From the raw data 23 summary statistics were examined. A linear mixed-effects model was fitted to the summarized data and the estimated variance components were used to generate power curves as a function of sample size. The statistic that most appropriately summarized the within-sample distributions was the median of the log-transformed data, as it most consistently conformed to the assumptions of the statistical model. Power curves for 1.5-, 2-, and 2.5-fold changes of the highest dose group compared to the control group when 50 and 100 cells were scored per gel are provided to aid in the design of future Comet assay studies on testicular cells. Copyright © 2014 Elsevier B.V. All rights reserved.
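The recommended summary statistic, the median of the log-transformed % tail DNA, is straightforward to compute per sample. The sketch below uses a small offset before the log transform to handle undamaged cells with 0% tail DNA; the offset value is an illustrative assumption, not taken from the study:

```python
import math
from statistics import median

def summarize_sample(tail_dna_percent, offset=1.0):
    """Median of log-transformed % tail DNA values for one gel/sample.
    The small additive offset guards against log(0) for cells with no
    measurable DNA damage (an assumption of this sketch)."""
    return median(math.log(v + offset) for v in tail_dna_percent)
```

Each gel's summary value would then feed into the mixed-effects model described above, with animals and gels as the grouping levels.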
Fission nuclear power prospects and its role in meeting global energy needs
International Nuclear Information System (INIS)
Golan, S.
1992-01-01
Nuclear power currently makes an important contribution to the world's energy requirements, providing 17% of its electricity. But as global warming becomes of greater concern, many ask whether nuclear power can and should contribute more. The author, who has been involved in the nuclear power enterprise for 35 years, answers this question in the affirmative. He holds the view that: a) nuclear fission power is essential to meeting the world's energy needs without unduly impairing the global environment; b) by possessing the required attributes discussed in this paper, nuclear fission power can be made societally acceptable; c) the industrialized world should accelerate LMFR deployment while fostering more convenient energy alternatives for the developing world; and d) the HTGR is unique in its ability to augment non-electricity energy needs and could become the technology of choice of developing countries for nuclear electricity production. (author). 5 refs., 5 figs., 4 tabs
Production-distribution of electric power in France: 1997-98 statistical data
International Nuclear Information System (INIS)
1999-01-01
This document was prepared using the annual inquiry carried out by the French directorate of gas, electricity and coal (Digec). It brings together the main statistical data about the production, transport and consumption of electric power in France: 1997 and 1998 balance sheets, foreign exchanges, long-term evolutions, production with respect to the different energy sources, and consumption in the different departments and regions. (J.S.)
Statistical modeling of the power grid from a wind farm standpoint
DEFF Research Database (Denmark)
Farajzadehbibalan, Saber; Ramezani, Mohammad H.; Nielsen, Peter
2017-01-01
In this study, we derive a statistical model of a power grid from the wind farm's standpoint based on dynamic principal component analysis. The main advantages of our model compared to the previously developed models are twofold. Firstly, our proposed model benefits from logged data of an offshore wind farm over several years, which results in the development of a useful model for practical purposes. Secondly, the derived model is computationally inexpensive. Considering an arbitrary wind turbine generator, we show that the behavior of the power grid at the connection point can be represented by 4 out of 9 registered variables, i.e. 3-phase voltages, 3-phase currents, frequency, and generated active and reactive powers. We further prove that the dynamic nature of the system can be optimally captured by a time lag shift of two samples. To extend the derived model of a wind turbine generator...
Future wind power forecast errors, need for regulating power, and costs in the Swedish system
Energy Technology Data Exchange (ETDEWEB)
Carlsson, Fredrik [Vattenfall Research and Development AB, Stockholm (Sweden). Power Technology
2011-07-01
Wind power is one of the most rapidly growing renewable energy sources in the Swedish electricity system. There are, however, two market challenges that need to be addressed with a higher proportion of wind power: variability and predictability. Predictability is important since the Nord Pool spot market requires forecasts of production 12-36 hours ahead. The forecast errors must be compensated with regulating power, which is expensive for the actors causing them. This paper investigates a number of scenarios with 10-55 TWh of wind power installed in the Swedish system. The focus is on a base scenario with 10 TWh of wind power, consisting of 3.5 GW of new wind power and 1.5 GW already installed, which gives 5 GW in total. The results show that the costs for the forecast errors will increase as more intermittent production is installed. However, the increase can be limited by, for instance, trading on the intraday market or improving forecast quality. (orig.)
International Nuclear Information System (INIS)
Kashiwa, Takako; Kawamoto, Yoshimi
2013-01-01
In the light of the Fukushima Daiichi nuclear power plant accident, we need to consider a symbiosis method based on the diminution of the nuclear power industry. To find a region that does not excessively depend on the nuclear power industry, it is necessary to examine and discuss the social impact of nuclear-related industries. In this study, we compared people's changing information needs regarding social impact before and after the Fukushima Daiichi nuclear power plant accident. It was found that the need for information increased after the accident. In particular, there were three research areas where the need for information increased: the consideration of building nuclear power plants, the influence of harmful rumors on the region, and the influence on the nuclear power industry. Next, attempts were made to understand whether information needs regarding social impact differ by attributes such as age, sex and knowledge of nuclear power. The information needs of the following categories of people increased after the accident: people aged between 10 and 50 years, women, people who do not have a clear opinion about the use of nuclear power plants, and people who do not have any knowledge of nuclear power. (author)
Shifflett, Benjamin; Huang, Rong; Edland, Steven D
2017-01-01
Genotypic association studies are prone to inflated type I error rates if multiple hypothesis testing is performed, e.g., sequentially testing for recessive, multiplicative, and dominant risk. Alternatives to multiple hypothesis testing include the model-independent genotypic χ² test; the efficiency-robust MAX statistic, which corrects for multiple comparisons but with some loss of power; or a single Armitage test for multiplicative trend, which has optimal power when the multiplicative model holds but some loss of power when dominant or recessive models underlie the genetic association. We used Monte Carlo simulations to describe the relative performance of these three approaches under a range of scenarios. All three approaches maintained their nominal type I error rates. The genotypic χ² and MAX statistics were more powerful when testing a strictly recessive genetic effect or when testing a dominant effect when the allele frequency was high. The Armitage test for multiplicative trend was most powerful for the broad range of scenarios where heterozygote risk is intermediate between recessive and dominant risk. Moreover, all tests had limited power to detect recessive genetic risk unless the sample size was large, and conversely all tests were relatively well powered to detect dominant risk. Taken together, these results suggest the general utility of the multiplicative trend test when the underlying genetic model is unknown.
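A Monte Carlo power study of the kind described can be sketched as follows; this is an illustrative reconstruction, not the authors' code, and the allele frequency, relative risk and sample sizes are made-up parameters.

```python
import numpy as np
from scipy.stats import chi2

def armitage_trend(cases, controls, scores=(0, 1, 2)):
    """Cochran-Armitage trend test for a 2x3 genotype table; returns p-value."""
    r = np.asarray(cases, float)              # case counts per genotype
    n = r + np.asarray(controls, float)       # totals per genotype
    x = np.asarray(scores, float)
    N, R = n.sum(), r.sum()
    num = N * (N * (x * r).sum() - R * (x * n).sum()) ** 2
    den = R * (N - R) * (N * (x ** 2 * n).sum() - (x * n).sum() ** 2)
    return chi2.sf(num / den, df=1)

def mc_power(p_allele=0.3, rr=1.4, n_cases=500, n_controls=500,
             alpha=0.05, n_sim=2000, rng=None):
    """Simulated power of the trend test under a multiplicative risk model."""
    rng = rng or np.random.default_rng(0)
    # Genotype frequencies under Hardy-Weinberg equilibrium.
    g = np.array([(1 - p_allele) ** 2, 2 * p_allele * (1 - p_allele),
                  p_allele ** 2])
    risk = np.array([1.0, rr, rr ** 2])       # multiplicative genetic model
    g_case = g * risk / (g * risk).sum()      # genotype mix among cases
    hits = 0
    for _ in range(n_sim):
        cases = rng.multinomial(n_cases, g_case)
        controls = rng.multinomial(n_controls, g)
        if armitage_trend(cases, controls) < alpha:
            hits += 1
    return hits / n_sim
```

Setting `rr=1.0` recovers the null and the rejection rate should hover near the nominal α, which is how the type I error claims in the abstract are checked.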
Directory of Open Access Journals (Sweden)
Emmanouil Styvaktakis
2007-01-01
Full Text Available This paper presents the two main types of classification methods for power quality disturbances based on underlying causes: deterministic classification, with an expert system as an example, and statistical classification, with support vector machines (a novel method) as an example. An expert system is suitable when one has a limited amount of data and sufficient power system expert knowledge; however, its application requires a set of threshold values. Statistical methods are suitable when a large amount of data is available for training. Two issues important to the effectiveness of a classifier, data segmentation and feature extraction, are discussed. Segmentation of a sequence of data recordings is a preprocessing step that partitions the data into segments, each representing a duration containing either an event or a transition between two events. Extraction of features is applied to each segment individually. Some useful features and their effectiveness are then discussed. Some experimental results are included to demonstrate the effectiveness of both systems. Finally, conclusions are given together with a discussion of some future research directions.
In vivo Comet assay – statistical analysis and power calculations of mice testicular cells
DEFF Research Database (Denmark)
Hansen, Merete Kjær; Sharma, Anoop Kumar; Dybdahl, Marianne
2014-01-01
…is to provide curves for this statistic outlining the number of animals and gels to use. The current study was based on 11 compounds administered via oral gavage in three doses to male mice: CAS no. 110-26-9, CAS no. 512-56-1, CAS no. 111873-33-7, CAS no. 79-94-7, CAS no. 115-96-8, CAS no. 598-55-0, CAS no. 636… A linear mixed-effects model was fitted to the summarized data and the estimated variance components were used to generate power curves as a function of sample size. The statistic that most appropriately summarized the within-sample distributions was the median of the log-transformed data, as it most consistently conformed to the assumptions of the statistical model. Power curves for 1.5-, 2-, and 2.5-fold changes of the highest dose group compared to the control group when 50 and 100 cells were scored per gel are provided to aid in the design of future Comet assay studies on testicular cells.
Statistical modeling of an integrated boiler for coal fired thermal power plant.
Chandrasekharan, Sreepradha; Panda, Rames Chandra; Swaminathan, Bhuvaneswari Natrajan
2017-06-01
Coal-fired thermal power plants play a major role in power production worldwide, as coal is available in abundance. Many of the existing power plants are based on subcritical technology, which can produce power with an efficiency of around 33%. Newer plants, however, are built on either supercritical or ultra-supercritical technology, whose efficiency can be up to 50%. The main objective of this work is to enhance the efficiency of the existing subcritical power plants to compensate for the increasing demand. To achieve this objective, statistical models of the boiler units, namely the economizer, drum and superheater, are first developed. The effectiveness of the developed models is tested using analysis methods such as R² analysis and ANOVA (Analysis of Variance). The dependence of the process variable (temperature) on different manipulated variables is analyzed in the paper. Validations of the model are provided with their error analysis. Response surface methodology (RSM) supported by DOE (design of experiments) is implemented to optimize the operating parameters. Individual models along with the integrated model are used to study and design the predictive control of the coal-fired thermal power plant.
Liem, Franziskus; Mérillat, Susan; Bezzola, Ladina; Hirsiger, Sarah; Philipp, Michel; Madhyastha, Tara; Jäncke, Lutz
2015-03-01
FreeSurfer is a tool to quantify cortical and subcortical brain anatomy automatically and noninvasively. Previous studies have reported reliability and statistical power analyses in relatively small samples, or only selected one aspect of brain anatomy. Here, we investigated reliability and statistical power of cortical thickness, surface area, volume, and the volume of subcortical structures in a large sample (N=189) of healthy elderly subjects (64+ years). Reliability (intraclass correlation coefficient) of cortical and subcortical parameters is generally high (cortical: ICCs>0.87, subcortical: ICCs>0.95). Surface-based smoothing increases reliability of cortical thickness maps, while it decreases reliability of cortical surface area and volume. Nevertheless, statistical power of all measures benefits from smoothing. When aiming to detect a 10% difference between groups, the number of subjects required to test effects with sufficient power over the entire cortex varies between cortical measures (cortical thickness: N=39, surface area: N=21, volume: N=81; 10 mm smoothing, power=0.8, α=0.05). For subcortical regions this number is between 16 and 76 subjects, depending on the region. We also demonstrate the advantage of within-subject designs over between-subject designs. Furthermore, we publicly provide a tool that allows researchers to perform a priori power analysis and sensitivity analysis to help evaluate previously published studies and to design future studies with sufficient statistical power. Copyright © 2014 Elsevier Inc. All rights reserved.
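The kind of a priori power analysis the provided tool performs can be illustrated with a standard two-sample normal approximation; the function below is a generic sketch, not FreeSurfer-specific, and the example effect and standard deviation are placeholder inputs.

```python
from math import ceil

from scipy.stats import norm

def n_per_group(effect, sd, power=0.8, alpha=0.05):
    """Two-sided two-sample z-approximation: subjects needed per group."""
    z_a = norm.isf(alpha / 2)   # critical value for the two-sided test
    z_b = norm.ppf(power)       # quantile matching the target power
    d = effect / sd             # standardized effect size (Cohen's d)
    return ceil(2 * ((z_a + z_b) / d) ** 2)

# e.g. a 10% group difference on a hypothetical measure with mean 2.5 mm
# and between-subject SD 0.15 mm: effect = 0.25 mm, d ≈ 1.67.
```

Required N falls quadratically as the standardized effect grows, which is why the abstract's per-measure sample sizes differ so strongly with the measure's variability.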
Statistics on Lie groups: A need to go beyond the pseudo-Riemannian framework
Miolane, Nina; Pennec, Xavier
2015-01-01
Lie groups appear in many fields, from Medical Imaging to Robotics. In Medical Imaging, and particularly in Computational Anatomy, an organ's shape is often modeled as the deformation of a reference shape, in other words: as an element of a Lie group. In this framework, if one wants to model the variability of the human anatomy, e.g. in order to help diagnose diseases, one needs to perform statistics on Lie groups. A Lie group G is a manifold that carries an additional group structure. Statistics on Riemannian manifolds have been well studied, with the pioneering work of Fréchet, Karcher and Kendall [1, 2, 3, 4] followed by others [5, 6, 7, 8, 9]. In order to use such a Riemannian structure for statistics on Lie groups, one needs to define a Riemannian metric that is compatible with the group structure, i.e. a bi-invariant metric. However, it is well known that general Lie groups which cannot be decomposed into the direct product of compact and abelian groups do not admit a bi-invariant metric. One may wonder if removing the positivity of the metric, thus asking only for a bi-invariant pseudo-Riemannian metric, would be sufficient for most of the groups used in Computational Anatomy. In this paper, we provide an algorithmic procedure that constructs bi-invariant pseudo-metrics on a given Lie group G. The procedure relies on a classification theorem of Medina and Revoy. However, in doing so, we prove that most Lie groups do not admit any bi-invariant (pseudo-)metric. We conclude that the (pseudo-)Riemannian setting is not the richest setting if one wants to perform statistics on Lie groups. One may have to rely on another framework, such as affine connection spaces.
The Need for the Dissemination of Statistical Data and Information
Directory of Open Access Journals (Sweden)
Anna-Alexandra Frunza
2016-01-01
Full Text Available There is an emphasis nowadays on knowledge, so access to information has increased in relevance in the modern economies, which have developed their competitive advantage through their dynamic response to market changes. The effort for transparency has increased tremendously within the last decades, influenced also by the weight that digital support has provided. The need for the dissemination of statistical data and information has met new challenges in terms of aggregating the practices that both private and public organizations use in order to ensure optimum access for the end users. The article stresses some key questions that can be introduced to ease the process of collection and presentation of the results subject to dissemination.
Generation of statistical scenarios of short-term wind power production
DEFF Research Database (Denmark)
Pinson, Pierre; Papaefthymiou, George; Klockl, Bernd
2007-01-01
Short-term (up to 2-3 days ahead) probabilistic forecasts of wind power provide forecast users with paramount information on the uncertainty of expected wind generation. Whatever the type of these probabilistic forecasts, they are produced on a per-horizon basis, and hence do not inform on the development of the forecast uncertainty through forecast series. This issue is addressed here by describing a method that permits the generation of statistical scenarios of wind generation that account for the interdependence structure of prediction errors, in addition to respecting the predictive distributions of wind…
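The core idea, generating temporally interdependent scenarios while respecting each horizon's predictive distribution, can be sketched with a Gaussian copula. The exponential correlation decay and the Gaussian marginals below are illustrative assumptions, not the paper's fitted interdependence structure.

```python
import numpy as np
from scipy.stats import norm

def wind_scenarios(mu, sigma, n_scen=10, rho=0.9, rng=None):
    """Scenario generation sketch via a Gaussian copula.

    mu, sigma : per-horizon mean and spread of the (assumed Gaussian)
                predictive distributions
    rho       : assumed lag-wise decay of prediction-error interdependence
    """
    rng = rng or np.random.default_rng(7)
    h = np.arange(len(mu))
    corr = rho ** np.abs(h[:, None] - h[None, :])   # interdependence structure
    z = rng.multivariate_normal(np.zeros(len(mu)), corr, size=n_scen)
    u = norm.cdf(z)                                 # uniform marginals
    return norm.ppf(u, loc=mu, scale=sigma)         # invert predictive CDFs
```

Each row is one scenario whose marginal at every horizon matches the predictive distribution, while consecutive horizons remain correlated, which is exactly the property a per-horizon forecast alone cannot convey.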
The PowerAtlas: a power and sample size atlas for microarray experimental design and research
Directory of Open Access Journals (Sweden)
Wang Jelai
2006-02-01
Full Text Available Abstract Background Microarrays permit biologists to simultaneously measure the mRNA abundance of thousands of genes. An important issue facing investigators planning microarray experiments is how to estimate the sample size required for good statistical power. What is the projected sample size or number of replicate chips needed to address the multiple hypotheses with acceptable accuracy? Statistical methods exist for calculating power based upon a single hypothesis, using estimates of the variability in data from pilot studies. There is, however, a need for methods to estimate power and/or required sample sizes in situations where multiple hypotheses are being tested, such as in microarray experiments. In addition, investigators frequently do not have pilot data to estimate the sample sizes required for microarray studies. Results To address this challenge, we have developed the Microarray PowerAtlas [1]. The atlas enables estimation of statistical power by allowing investigators to appropriately plan studies by building upon previous studies that have similar experimental characteristics. Currently, there are sample sizes and power estimates based on 632 experiments from Gene Expression Omnibus (GEO). The PowerAtlas also permits investigators to upload their own pilot data and derive power and sample size estimates from these data. This resource will be updated regularly with new datasets from GEO and other databases such as The Nottingham Arabidopsis Stock Center (NASC). Conclusion This resource provides a valuable tool for investigators who are planning efficient microarray studies and estimating required sample sizes.
International Nuclear Information System (INIS)
2003-01-01
This document presents the statistical annual report of Furnas Power Plants and Co., reporting the results obtained during the calendar year 2003 and the evolution over the last five years, allowing a general and comparative view of the company's performance, focusing on power generation and transmission and on economic and financial results
Mission needs and system commonality for space nuclear power and propulsion
International Nuclear Information System (INIS)
Buden, D.; Zuppero, A.; Redd, L.
1993-01-01
Nuclear power enables or significantly enhances a variety of space missions, whether near-Earth, solar system exploration, lunar-Mars exploration, or recovery of near-Earth resources. Performance optimization for individual missions leads to a large number of power and propulsion systems to be developed. However, the realities of budgets and schedules indicate that the number of nuclear systems that will be developed is limited. One needs to seek the ''minimum requirements'' to do a job rather than the last ounce of performance, and areas of commonality. To develop a minimum number of systems to meet the overall DoD, NASA, and commercial needs, the broad spectrum of requirements has been examined along with cost drivers
Statistical study of undulator radiated power by a classical detection system in the mm-wave regime
Directory of Open Access Journals (Sweden)
A. Eliran
2009-05-01
Full Text Available The statistics of FEL spontaneous emission power detected with a detector integration time much larger than the slippage time have been measured in many previous works at high frequencies. In such cases the quantum (shot) noise generated in the detection process is dominant. We have measured spontaneous emission in the Israeli electrostatic accelerator FEL (EA-FEL) operating in the mm-wave regime. In this regime the detector is based on a diode rectifier, for which the detector quantum noise is negligible. The measurements were repeated numerous times in order to create a sample space with sufficient data to enable evaluation of the statistical features of the radiated power. The probability density function of the radiated power was found and its moments were calculated. The results of analytical and numerical models are compared to those obtained in experimental measurements.
Statistical modeling of an integrated boiler for coal fired thermal power plant
Directory of Open Access Journals (Sweden)
Sreepradha Chandrasekharan
2017-06-01
Full Text Available Coal-fired thermal power plants play a major role in power production worldwide, as coal is available in abundance. Many of the existing power plants are based on subcritical technology, which can produce power with an efficiency of around 33%. Newer plants, however, are built on either supercritical or ultra-supercritical technology, whose efficiency can be up to 50%. The main objective of this work is to enhance the efficiency of the existing subcritical power plants to compensate for the increasing demand. To achieve this objective, statistical models of the boiler units, namely the economizer, drum and superheater, are first developed. The effectiveness of the developed models is tested using analysis methods such as R² analysis and ANOVA (Analysis of Variance). The dependence of the process variable (temperature) on different manipulated variables is analyzed in the paper. Validations of the model are provided with their error analysis. Response surface methodology (RSM) supported by DOE (design of experiments) is implemented to optimize the operating parameters. Individual models along with the integrated model are used to study and design the predictive control of the coal-fired thermal power plant. Keywords: Chemical engineering, Applied mathematics
Economic analysis of the need for advanced power sources
International Nuclear Information System (INIS)
Hardie, R.W.; Omberg, R.P.
1975-01-01
The purpose of this paper is to determine the economic need for an advanced power source, be it fusion, solar, or some other concept. However, calculations were also performed assuming abandonment of the LMFBR program, so breeder benefits are a by-product of this study. The model used was the ALPS linear programming system for forecasting optimum power growth patterns. Total power costs were calculated over a planning horizon from 1975 to 2041 and discounted at 7.5 percent. The benefit of a particular advanced power source is simply the reduction in total power cost resulting from its introduction. Since data concerning advanced power sources (APS) are speculative, parametric calculations varying introduction dates and capital costs about a hypothetical APS plant were performed. Calculations were also performed without the LMFBR to determine the effect of the breeder on the benefits of an advanced power source. Other data used in the study, such as the energy demand curve and uranium resource estimates, are given in the Appendix, along with a list of the 11 power plants used in this study. Calculations were performed for APS introduction dates of 2001 and 2011. Estimates of APS capital costs included cases where the costs were assumed to be $50/kW and $25/kW higher than the LMFBR, as well as cases where APS and LMFBR capital costs are identical. It is noted that the APS capital costs used in this study are not estimates of potential advanced power system plant costs, but were chosen to compute potential dollar benefits of advanced power systems under extremely optimistic assumptions. As a further example, all APS fuel cycle costs were assumed to be zero
Dexter, Franklin; Shafer, Steven L
2017-03-01
Considerable attention has been drawn to poor reproducibility in the biomedical literature. One explanation is inadequate reporting of statistical methods by authors and inadequate assessment of statistical reporting and methods during peer review. In this narrative review, we examine scientific studies of several well-publicized efforts to improve statistical reporting, along with several retrospective assessments of the impact of these efforts. These studies show, first, that instructions to authors and statistical checklists are not sufficient; no findings suggested that either improves the quality of statistical methods and reporting. Second, even basic statistics, such as power analyses, are frequently missing or incorrectly performed. Third, statistical review is needed for all papers that involve data analysis. A consistent finding in the studies was that nonstatistical reviewers (eg, "scientific reviewers") and journal editors generally assess statistical quality poorly. We finish by discussing our experience with statistical review at Anesthesia & Analgesia from 2006 to 2016.
Kies, Alexander; Nag, Kabitri; von Bremen, Lueder; Lorenz, Elke; Heinemann, Detlev
2015-04-01
The penetration of renewable energies in the European power system has increased in the last decades (a 23.5% share of renewables in the gross electricity consumption of the EU-28 in 2012) and is expected to increase further, up to very high shares close to 100%. Planning and organizing this European energy transition towards sustainable power sources will be one of the major challenges of the 21st century. It is very likely that in a fully renewable European power system wind and photovoltaics (pv) will contribute the largest shares to the generation mix, followed by hydro power. However, feed-in from wind and pv is, due to the weather-dependent nature of their resources, fluctuating and non-controllable. To match generation and consumption, several solutions and their combinations have been proposed, such as very high backup capacities of conventional power generation (e.g. fossil or nuclear), storage, or extension of the transmission grid. Apart from those options, hydro power can be used to counterbalance fluctuating wind and pv generation to some extent. In this work we investigate the effects of hydro power from Norway and Sweden on residual storage needs in Europe, depending on the underlying grid scenario. Highly temporally and spatially resolved weather data, with a spatial resolution of 7 x 7 km and a temporal resolution of 1 hour, were used to model the feed-in from wind and pv for the 34 investigated European countries for the years 2003-2012. Inflow into hydro storages and generation by run-of-river power plants were computed from ERA-Interim reanalysis runoff data at a spatial resolution of 0.75° x 0.75° and a daily temporal resolution. Power flows in a simplified transmission grid connecting the 34 European countries were modelled by minimizing dissipation using a DC-flow approximation. Previous work has shown that hydro power, namely in Norway and Sweden, can reduce storage needs in a renewable European power system to a large extent. A 15% share of hydro power in Europe…
Development of nuclear power plant online monitoring system using statistical quality control
International Nuclear Information System (INIS)
An, Sang Ha
2006-02-01
Statistical quality control techniques have been applied to many aspects of industrial engineering. An application to nuclear power plant maintenance and control is presented here that can greatly improve plant safety. As a demonstration of such an approach, a specific system is analyzed: the reactor coolant pumps (RCPs) and the fouling resistance of a heat exchanger. This research uses Shewhart X-bar and R charts, cumulative sum (CUSUM) charts, and the Sequential Probability Ratio Test (SPRT) to analyze the process for the state of statistical control, and a Control Chart Analyzer (CCA) was developed to support these analyses and detect out-of-control conditions. The analysis shows that statistical process control methods can be applied as an early warning system capable of identifying significant equipment problems well in advance of traditional control room alarm indicators. Such a system would provide operators with enough time to respond to possible emergency situations and thus improve plant safety and reliability
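The charting techniques named in the abstract are textbook procedures; a minimal sketch of a 3-sigma Shewhart X-bar limit calculation and a two-sided tabular CUSUM follows (the parameter values are illustrative, not taken from the plant study).

```python
import numpy as np

def xbar_limits(subgroup_means, sigma, n):
    """3-sigma Shewhart X-bar control limits for subgroups of size n."""
    center = np.mean(subgroup_means)
    se = sigma / np.sqrt(n)          # standard error of a subgroup mean
    return center - 3 * se, center + 3 * se

def tabular_cusum(x, target, k, h):
    """Two-sided tabular CUSUM; returns index of first alarm, else None.

    k is the reference (slack) value, h the decision interval, both in
    the same units as the observations.
    """
    hi = lo = 0.0
    for i, xi in enumerate(x):
        hi = max(0.0, hi + xi - target - k)   # accumulates upward shifts
        lo = max(0.0, lo + target - xi - k)   # accumulates downward shifts
        if hi > h or lo > h:
            return i
    return None
```

The CUSUM accumulates small, persistent deviations that a Shewhart chart would miss, which is why such schemes can flag slow fouling of a heat exchanger before a control-room alarm trips.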
Joint probability of statistical success of multiple phase III trials.
Zhang, Jianliang; Zhang, Jenny J
2013-01-01
In drug development, after completion of phase II proof-of-concept trials, the sponsor needs to make a go/no-go decision to start expensive phase III trials. The probability of statistical success (PoSS) of the phase III trials based on data from earlier studies is an important factor in that decision-making process. Instead of statistical power, the predictive power of a phase III trial, which takes into account the uncertainty in the estimation of treatment effect from earlier studies, has been proposed to evaluate the PoSS of a single trial. However, regulatory authorities generally require statistical significance in two (or more) trials for marketing licensure. We show that the predictive statistics of two future trials are statistically correlated through use of the common observed data from earlier studies. Thus, the joint predictive power should not be evaluated as a simplistic product of the predictive powers of the individual trials. We develop the relevant formulae for the appropriate evaluation of the joint predictive power and provide numerical examples. Our methodology is further extended to the more complex phase III development scenario comprising more than two (K > 2) trials, that is, the evaluation of the PoSS of at least k₀ (k₀ ≤ K) trials from a program of K total trials. Copyright © 2013 John Wiley & Sons, Ltd.
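The correlation the authors describe can be illustrated by simulation: drawing the true effect from its phase II posterior induces correlation between the two trials' test statistics, so the joint probability of success exceeds the naive product of marginal predictive powers. This sketch uses made-up effect-size and sample-size values and a normal approximation, not the paper's formulae.

```python
import numpy as np
from scipy.stats import norm

def joint_poss(delta_hat, se2, n_per_arm, sd, alpha=0.025,
               n_sim=200_000, rng=None):
    """Joint probability both phase III trials reach one-sided significance."""
    rng = rng or np.random.default_rng(3)
    # Uncertainty in the true effect, carried over from phase II data.
    delta = rng.normal(delta_hat, se2, n_sim)
    se3 = sd * np.sqrt(2.0 / n_per_arm)   # SE of a two-arm phase III estimate
    zcrit = norm.isf(alpha)
    z1 = rng.normal(delta / se3, 1.0)     # trial 1 statistic
    z2 = rng.normal(delta / se3, 1.0)     # trial 2, sharing the same delta
    joint = np.mean((z1 > zcrit) & (z2 > zcrit))
    naive = np.mean(z1 > zcrit) * np.mean(z2 > zcrit)
    return joint, naive
```

Conditional on the drawn effect the two trials are independent, yet marginally they are positively correlated through the shared phase II estimate, so `joint` comes out above `naive`.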
Forbes, Valery E; Aufderheide, John; Warbritton, Ryan; van der Hoeven, Nelly; Caspers, Norbert
2007-03-01
This study presents results of the effects of bisphenol A (BPA) on adult egg production, egg hatchability, egg development rates and juvenile growth rates in the freshwater gastropod, Marisa cornuarietis. We observed no adult mortality, substantial inter-snail variability in reproductive output, and no effects of BPA on reproduction during 12 weeks of exposure to 0, 0.1, 1.0, 16, 160 or 640 microg/L BPA. We observed no effects of BPA on egg hatchability or timing of egg hatching. Juveniles showed good growth in the control and all treatments, and there were no significant effects of BPA on this endpoint. Our results do not support previous claims of enhanced reproduction in Marisa cornuarietis in response to exposure to BPA. Statistical power analysis indicated high levels of inter-snail variability in the measured endpoints and highlighted the need for sufficient replication when testing treatment effects on reproduction in M. cornuarietis with adequate power.
Simulating European wind power generation applying statistical downscaling to reanalysis data
International Nuclear Information System (INIS)
González-Aparicio, I.; Monforti, F.; Volker, P.; Zucker, A.; Careri, F.; Huld, T.; Badger, J.
2017-01-01
Highlights: •Wind speed spatial resolution highly influences calculated wind power peaks and ramps. •Reduction of wind power generation uncertainties using statistical downscaling. •Publicly available dataset of wind power generation hourly time series at NUTS2 level. -- Abstract: The growing share of electricity production from solar and mainly wind resources constantly increases the stochastic nature of the power system. Modelling the high share of renewable energy sources, and in particular wind power, crucially depends on the adequate representation of the intermittency and characteristics of the wind resource, which is related to the accuracy of the approach in converting wind speed data into power values. One of the main factors contributing to the uncertainty in these conversion methods is the selection of the spatial resolution. Although numerical weather prediction models can simulate wind speeds at higher spatial resolution (up to 1 × 1 km) than a reanalysis (generally ranging from about 25 km to 70 km), they require high computational resources and massive storage systems; therefore, the most common alternative is to use reanalysis data. However, local wind features may not be captured by a reanalysis and can translate into misinterpretations of wind power peaks, ramping capacities, the behaviour of power prices, and bidding strategies for the electricity market. This study contributes to understanding what is captured by wind speed datasets of different spatial resolutions, the importance of using high resolution data for the conversion into power, and the implications for power system analyses. A methodology is proposed to increase the spatial resolution from a reanalysis. This study presents an open access renewable generation time series dataset for the EU-28 and neighbouring countries at hourly intervals and at different geographical aggregation levels (country, bidding zone and administrative…
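The resolution issue stems from the nonlinearity of the speed-to-power conversion: converting a spatially averaged wind speed is not the same as averaging the converted powers. The idealized cubic power curve and the sample speeds below are illustrative assumptions, not the paper's conversion model.

```python
import numpy as np

def power_curve(v, v_in=3.0, v_rated=12.0, v_out=25.0):
    """Idealized turbine power curve, normalized to rated power."""
    v = np.asarray(v, dtype=float)
    # Cubic ramp between cut-in and rated speed, flat at rated above it.
    p = np.clip((v ** 3 - v_in ** 3) / (v_rated ** 3 - v_in ** 3), 0.0, 1.0)
    return np.where((v < v_in) | (v > v_out), 0.0, p)

speeds = np.array([4.0, 14.0, 6.0, 18.0])  # hypothetical high-resolution winds
coarse = power_curve(speeds.mean())        # convert the averaged speed
fine = power_curve(speeds).mean()          # average the converted powers
# The two disagree, so coarse wind fields misstate power peaks and ramps.
```

Here `coarse` overstates output because averaging smooths away both the cut-out-region peaks and the low-wind lulls before the nonlinear curve is applied.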
Power flow as a complement to statistical energy analysis and finite element analysis
Cuschieri, J. M.
1987-01-01
Present methods of analysis of the structural response and the structure-borne transmission of vibrational energy use either finite element (FE) techniques or statistical energy analysis (SEA) methods. FE methods are a very useful tool at low frequencies, where the number of resonances involved in the analysis is rather small. On the other hand, SEA methods can predict with acceptable accuracy the response and energy transmission between coupled structures at relatively high frequencies, where the structural modal density is high and a statistical approach is the appropriate solution. In the mid-frequency range, a relatively large number of resonances exist, which makes the finite element method too costly, while SEA methods can only predict an average level. In this mid-frequency range a possible alternative is to use power flow techniques, where the input and flow of vibrational energy to excited and coupled structural components can be expressed in terms of input and transfer mobilities. The power flow technique can be extended from low to high frequencies and can be integrated with established FE models at low frequencies and SEA models at high frequencies as a verification of the method. This method of structural analysis using power flow and mobility methods, and its integration with SEA and FE analysis, is applied to the case of two thin beams joined together at right angles.
Indoor Soiling Method and Outdoor Statistical Risk Analysis of Photovoltaic Power Plants
Rajasekar, Vidyashree
This is a two-part thesis. Part 1 presents an approach toward the development of a standardized artificial soiling method for laminated photovoltaic (PV) cells or mini-modules. Construction of an artificial chamber to maintain controlled environmental conditions and the components/chemicals used in artificial soil formulation are briefly explained. Both poly-Si mini-modules and single-cell mono-Si coupons were soiled, and characterization tests such as I-V, reflectance and quantum efficiency (QE) were carried out on both soiled and cleaned coupons. From the results obtained, poly-Si mini-modules proved to be a good measure of soil uniformity, as any non-uniformity present would not result in a smooth curve during I-V measurements. The challenges faced while executing reflectance and QE characterization tests on poly-Si, due to its smaller cells, were eliminated on the mono-Si coupons with large cells, yielding highly repeatable measurements. This study indicates that reflectance measurements between 600-700 nm wavelengths can be used as a direct measure of soil density on the modules. Part 2 determines the most dominant failure modes of field-aged PV modules using experimental data obtained in the field and statistical analysis, FMECA (Failure Mode, Effect, and Criticality Analysis). The failure and degradation modes of about 744 poly-Si glass/polymer frameless modules fielded for 18 years under the cold-dry climate of New York were evaluated. A defect chart, degradation rates (at both string and module levels) and a safety map were generated using the field-measured data. A statistical reliability tool, FMECA, which uses the Risk Priority Number (RPN), is used to determine the dominant failure or degradation modes in the strings and modules by ranking and prioritizing the modes. This study on PV power plants considers all the failure and degradation modes from both safety and performance perspectives. The indoor and outdoor soiling studies were jointly…
Do we need statistics when we have linguistics?
Directory of Open Access Journals (Sweden)
Cantos Gómez Pascual
2002-01-01
Full Text Available Statistics is known to be a quantitative approach to research. However, most of the research done in the fields of language and linguistics is of a different kind, namely qualitative. Succinctly, qualitative analysis differs from quantitative analysis in that in the former no attempt is made to assign frequencies, percentages and the like to the linguistic features found or identified in the data. In quantitative research, linguistic features are classified and counted, and even more complex statistical models are constructed in order to explain these observed facts. In qualitative research, however, we use the data only for identifying and describing features of language usage and for providing real occurrences/examples of particular phenomena. In this paper, we shall try to show how quantitative methods and statistical techniques can supplement qualitative analyses of language. We shall attempt to present some mathematical and statistical properties of natural languages, and introduce some of the quantitative methods which are of the most value in working empirically with texts and corpora, illustrating the various issues with numerous examples and moving from the most basic descriptive techniques (frequency counts and percentages) to decision-taking techniques (chi-square and z-score) and to more sophisticated statistical language models (Type-Token/Lemma-Token/Lemma-Type formulae, cluster analysis and discriminant function analysis).
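Two of the techniques named above, the type-token ratio and the chi-square test for comparing a word's frequency across corpora, can be sketched concretely. The sketch below is illustrative Python, not from the paper; the word counts and corpora are invented for the example.

```python
def type_token_ratio(tokens):
    """Distinct word forms (types) divided by running words (tokens)."""
    return len(set(tokens)) / len(tokens)

def chi_square_2x2(a, b, c, d):
    """Pearson chi-square for the 2x2 table [[a, b], [c, d]], e.g.
    rows = corpora, columns = (occurrences of a word, all other tokens)."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Invented example: a word occurring 40 times per 1000 tokens in corpus A
# but only 10 times per 1000 tokens in corpus B.
chi2 = chi_square_2x2(40, 960, 10, 990)
print(round(chi2, 2))   # 18.46, well above the 3.84 cutoff for p < 0.05 (1 df)
print(type_token_ratio("the cat sat on the mat".split()))
```

With such a large chi-square value, the frequency difference between the two corpora would be judged statistically significant rather than a sampling accident.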
Energy Technology Data Exchange (ETDEWEB)
In, Wang Ki; Uh, Keun Sun; Chul, Kim Heui [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)
1995-02-01
A technically more direct statistical combination of uncertainties methodology, extended SCU (XSCU), was applied to statistically combine the uncertainties associated with the DNBR alarm setpoint and the DNBR trip setpoint of digital nuclear power plants. The modified SCU (MSCU) methodology is currently used as the USNRC-approved design methodology to perform the same function. In this report, the MSCU and XSCU methodologies were compared in terms of the total uncertainties and the net margins to the DNBR alarm and trip setpoints. The MSCU methodology resulted in small total penalties owing to a significantly negative bias, which is quite large. The XSCU methodology, however, gave virtually unbiased total uncertainties. The net margins to the DNBR alarm and trip setpoints by the MSCU methodology agree with those by the XSCU methodology within statistical variations. (Author) 12 refs., 17 figs., 5 tabs.
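The core idea behind statistically combining setpoint uncertainties, as opposed to summing them deterministically, can be shown with a generic sketch. This is not the licensed SCU/XSCU algorithm, only the textbook principle it rests on: independent uncertainty components combine in quadrature, so the statistical total penalty is smaller than the plain sum. The component values are invented.

```python
import math

def srss(components):
    """Statistical combination of independent uncertainties:
    square root of the sum of squares."""
    return math.sqrt(sum(u * u for u in components))

def deterministic(components):
    """Conservative deterministic combination: plain sum."""
    return sum(components)

parts = [0.02, 0.03, 0.06]     # illustrative fractional uncertainty components
print(srss(parts))             # ~0.07: smaller total penalty
print(deterministic(parts))    # ~0.11
```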
The Need for Intelligent Control of Space Power Systems
May, Ryan David; Soeder, James F.; Beach, Raymond F.; McNelis, Nancy B.
2013-01-01
As manned spacecraft venture farther from Earth, the need for reliable, autonomous control of vehicle subsystems becomes critical. This is particularly true for the electrical power system, which is critical to every other system. Autonomy cannot be achieved by simple scripting techniques due to the communication latency times and the difficulty associated with failures (or combinations of failures) that need to be handled in as graceful a manner as possible to ensure system availability. Therefore, an intelligent control system must be developed that can respond to disturbances and failures in a robust manner and ensure that critical system loads are served and all system constraints are respected.
A testing procedure for wind turbine generators based on the power grid statistical model
DEFF Research Database (Denmark)
Farajzadehbibalan, Saber; Ramezani, Mohammad Hossein; Nielsen, Peter
2017-01-01
In this study, a comprehensive test procedure is developed to test wind turbine generators with a hardware-in-loop setup. The procedure employs the statistical model of the power grid considering the restrictions of the test facility and system dynamics. Given the model in the latent space...
International Nuclear Information System (INIS)
1977-12-01
This leaflet examines our energy future and concludes that nuclear power is an essential part of it. The leaflet also discusses relative costs, but it does not deal with social and environmental implications of nuclear power in any detail, since these are covered by other British Nuclear Forum publications. Headings are: present consumption; how will this change in future; primary energy resources (fossil fuels; renewable resources; nuclear); energy savings; availability of fossil fuels; availability of renewable energy resources; the contribution of thermal nuclear power; electricity; costs for nuclear power. (U.K.)
The need for information in the power market; Informasjonsbehovet i kraftmarkedet
Energy Technology Data Exchange (ETDEWEB)
Gaasland, I.
1995-08-01
This report evaluates the need for information in the Norwegian power market. In particular it emphasizes discussions on possible efficiency effects of collecting and publishing information on trade through bilateral contracts. It also looks into other areas where there may be a need for public intervention to secure more equal access to information. One of the main conclusions is that the collecting and publishing of information mentioned above will not be a very effective means of securing equal access to information. The fundamental source of potentially unequal information access in the market seems to be a concentration of market power among the producers. The report also points out the unfortunate situation that Statoil is responsible for a major part of the foreign trade while itself being one of the actors in the market. 9 refs.
Kling, Daniel; Egeland, Thore; Piñero, Mariana Herrera; Vigeland, Magnus Dehli
2017-11-01
Methods and implementations of DNA-based identification are well established in several forensic contexts. However, assessing the statistical power of these methods has been largely overlooked, except in the simplest cases. In this paper we outline general methods for such power evaluation, and apply them to a large set of family reunification cases, where the objective is to decide whether a person of interest (POI) is identical to the missing person (MP) in a family, based on the DNA profile of the POI and available family members. As such, this application closely resembles database searching and disaster victim identification (DVI). If parents or children of the MP are available, they will typically provide sufficient statistical evidence to settle the case. However, if one must resort to more distant relatives, it is not a priori obvious that a reliable conclusion is likely to be reached. In these cases power evaluation can be highly valuable, for instance in the recruitment of additional family members. To assess the power in an identification case, we advocate the combined use of two statistics: the Probability of Exclusion, and the Probability of Exceedance. The former is the probability that the genotypes of a random, unrelated person are incompatible with the available family data. If this is close to 1, it is likely that a conclusion will be achieved regarding general relatedness, but not necessarily the specific relationship. To evaluate the ability to recognize a true match, we use simulations to estimate exceedance probabilities, i.e. the probability that the likelihood ratio will exceed a given threshold, assuming that the POI is indeed the MP. All simulations are done conditionally on available family data. Such conditional simulations have a long history in medical linkage analysis, but to our knowledge this is the first systematic forensic genetics application. Also, for forensic markers mutations cannot be ignored and therefore current models and
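The Probability of Exceedance can be estimated by straightforward Monte Carlo simulation: draw likelihood-ratio values under the hypothesis that the POI is the MP and count how often the threshold is exceeded. The sketch below is a toy illustration, not the paper's conditional-simulation machinery; the Normal model for log10 LR is an invented assumption standing in for a genotype simulator.

```python
import random

def exceedance_probability(draw_log10_lr, threshold, n_sim=20000, seed=0):
    """Estimate P(log10 LR >= threshold), assuming POI = MP, by
    simulating likelihood-ratio values from the supplied model."""
    rng = random.Random(seed)
    hits = sum(draw_log10_lr(rng) >= threshold for _ in range(n_sim))
    return hits / n_sim

# Toy stand-in model: with only distant relatives typed, suppose the
# log10 likelihood ratio is Normal(4, 2) when the POI truly is the MP.
model = lambda rng: rng.gauss(4.0, 2.0)
print(exceedance_probability(model, 1.0))   # high: a conclusion is likely
print(exceedance_probability(model, 6.0))   # much lower for a strict threshold
```

A low exceedance probability at the reporting threshold is exactly the signal that recruiting additional family members would be worthwhile before testing the POI.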
A Statistical Approach to Planning Reserved Electric Power for Railway Infrastructure Administration
Brabec, M. (Marek); Pelikán, E. (Emil); Konár, O. (Ondřej); Kasanický, I. (Ivan); Juruš, P. (Pavel); Sadil, J.; Blažek, P.
2013-01-01
One of the requirements on railway infrastructure administration is to provide electricity for day-to-day operation of railways. We propose a statistically based approach for the estimation of maximum 15-minute power within a calendar month for a given region. This quantity serves as a basis of contracts between railway infrastructure administration and electricity distribution system operator. We show that optimization of the prediction is possible, based on underlying loss function deriv...
A new Markov-chain-related statistical approach for modelling synthetic wind power time series
International Nuclear Information System (INIS)
Pesch, T; Hake, J F; Schröders, S; Allelein, H J
2015-01-01
The integration of rising shares of volatile wind power in the generation mix is a major challenge for the future energy system. To address the uncertainties involved in wind power generation, models analysing and simulating the stochastic nature of this energy source are becoming increasingly important. One statistical approach that has been frequently used in the literature is the Markov chain approach. Recently, the method was identified as being of limited use for generating wind time series with time steps shorter than 15–40 min as it is not capable of reproducing the autocorrelation characteristics accurately. This paper presents a new Markov-chain-related statistical approach that is capable of solving this problem by introducing a variable second lag. Furthermore, additional features are presented that allow for the further adjustment of the generated synthetic time series. The influences of the model parameter settings are examined by meaningful parameter variations. The suitability of the approach is demonstrated by an application analysis with the example of the wind feed-in in Germany. It shows that—in contrast to conventional Markov chain approaches—the generated synthetic time series do not systematically underestimate the required storage capacity to balance wind power fluctuation. (paper)
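The conventional first-order Markov chain that the paper's variable-second-lag approach improves upon can be sketched in a few lines: estimate transition probabilities between discretized power states from data, then sample a synthetic sequence. The state sequence below is invented for illustration.

```python
import random
from collections import Counter, defaultdict

def fit_markov(series):
    """Estimate first-order transition probabilities between power states."""
    counts = defaultdict(Counter)
    for s, t in zip(series, series[1:]):
        counts[s][t] += 1
    return {s: {t: c / sum(ctr.values()) for t, c in ctr.items()}
            for s, ctr in counts.items()}

def synthesize(trans, start, n, rng):
    """Generate a synthetic state sequence of length n from the chain."""
    out = [start]
    for _ in range(n - 1):
        nxt = trans.get(out[-1])
        if not nxt:                     # state never left in the data
            out.append(out[-1])
            continue
        states, weights = zip(*nxt.items())
        out.append(rng.choices(states, weights=weights)[0])
    return out

observed = [0, 1, 2, 1, 0, 1, 2, 2, 1, 0]   # e.g. binned wind power levels
trans = fit_markov(observed)
synthetic = synthesize(trans, 0, 200, random.Random(1))
```

It is precisely this first-order structure that fails to reproduce short-time autocorrelation; the paper's contribution is the additional, variable second lag conditioning each transition.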
Directory of Open Access Journals (Sweden)
Ozonoff Al
2010-07-01
Full Text Available Abstract Background A common, important problem in spatial epidemiology is measuring and identifying variation in disease risk across a study region. In application of statistical methods, the problem has two parts. First, spatial variation in risk must be detected across the study region and, second, areas of increased or decreased risk must be correctly identified. The location of such areas may give clues to environmental sources of exposure and disease etiology. One statistical method applicable in spatial epidemiologic settings is a generalized additive model (GAM) which can be applied with a bivariate LOESS smoother to account for geographic location as a possible predictor of disease status. A natural hypothesis when applying this method is whether residential location of subjects is associated with the outcome, i.e. is the smoothing term necessary? Permutation tests are a reasonable hypothesis testing method and provide adequate power under a simple alternative hypothesis. These tests have yet to be compared to other spatial statistics. Results This research uses simulated point data generated under three alternative hypotheses to evaluate the properties of the permutation methods and compare them to the popular spatial scan statistic in a case-control setting. Case 1 was a single circular cluster centered in a circular study region. The spatial scan statistic had the highest power though the GAM method estimates did not fall far behind. Case 2 was a single point source located at the center of a circular cluster and Case 3 was a line source at the center of the horizontal axis of a square study region. Each had linearly decreasing log-odds with distance from the point. The GAM methods outperformed the scan statistic in Cases 2 and 3. Comparing sensitivity, measured as the proportion of the exposure source correctly identified as high or low risk, the GAM methods outperformed the scan statistic in all three Cases. Conclusions The GAM
Young, Robin L; Weinberg, Janice; Vieira, Verónica; Ozonoff, Al; Webster, Thomas F
2010-07-19
A common, important problem in spatial epidemiology is measuring and identifying variation in disease risk across a study region. In application of statistical methods, the problem has two parts. First, spatial variation in risk must be detected across the study region and, second, areas of increased or decreased risk must be correctly identified. The location of such areas may give clues to environmental sources of exposure and disease etiology. One statistical method applicable in spatial epidemiologic settings is a generalized additive model (GAM) which can be applied with a bivariate LOESS smoother to account for geographic location as a possible predictor of disease status. A natural hypothesis when applying this method is whether residential location of subjects is associated with the outcome, i.e. is the smoothing term necessary? Permutation tests are a reasonable hypothesis testing method and provide adequate power under a simple alternative hypothesis. These tests have yet to be compared to other spatial statistics. This research uses simulated point data generated under three alternative hypotheses to evaluate the properties of the permutation methods and compare them to the popular spatial scan statistic in a case-control setting. Case 1 was a single circular cluster centered in a circular study region. The spatial scan statistic had the highest power though the GAM method estimates did not fall far behind. Case 2 was a single point source located at the center of a circular cluster and Case 3 was a line source at the center of the horizontal axis of a square study region. Each had linearly decreasing log-odds with distance from the point. The GAM methods outperformed the scan statistic in Cases 2 and 3. Comparing sensitivity, measured as the proportion of the exposure source correctly identified as high or low risk, the GAM methods outperformed the scan statistic in all three Cases. The GAM permutation testing methods provide a regression
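The permutation-testing logic the study evaluates can be sketched generically: shuffle case/control labels to build the null distribution of a chosen statistic and report the fraction of permuted values at least as extreme as the observed one. The statistic here (difference in mean location along one coordinate) and the data are invented stand-ins for the GAM deviance statistic used in the paper.

```python
import random
import statistics

def perm_pvalue(cases, controls, n_perm=999, seed=0):
    """One-sided permutation p-value for the difference in means,
    obtained by shuffling case/control labels."""
    rng = random.Random(seed)
    stat = lambda a, b: statistics.mean(a) - statistics.mean(b)
    observed = stat(cases, controls)
    pooled = list(cases) + list(controls)
    n_case = len(cases)
    extreme = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        if stat(pooled[:n_case], pooled[n_case:]) >= observed:
            extreme += 1
    return (extreme + 1) / (n_perm + 1)   # add-one correction

cases = [5.1, 5.4, 4.9, 5.2, 5.0, 5.3]    # e.g. x-coordinates of cases
controls = [1.2, 0.9, 1.1, 1.0, 1.3, 0.8]
print(perm_pvalue(cases, controls))        # small: location is informative
```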
The influence of the presence of deviant item score patterns on the power of a person-fit statistic
Meijer, R.R.
1994-01-01
In studies investigating the power of person-fit statistics it is often assumed that the item parameters that are used to calculate the statistics can be estimated in a sample without aberrant persons. However, in practical test applications calibration samples most likely will contain aberrant
Statistical interpretation of transient current power-law decay in colloidal quantum dot arrays
Energy Technology Data Exchange (ETDEWEB)
Sibatov, R T, E-mail: ren_sib@bk.ru [Ulyanovsk State University, 432000, 42 Leo Tolstoy Street, Ulyanovsk (Russian Federation)
2011-08-01
A new statistical model of the charge transport in colloidal quantum dot arrays is proposed. It takes into account Coulomb blockade forbidding multiple occupancy of nanocrystals and the influence of energetic disorder of interdot space. The model explains power-law current transients and the presence of the memory effect. The fractional differential analogue of the Ohm law is found phenomenologically for nanocrystal arrays. The model combines ideas that were considered as conflicting by other authors: the Scher-Montroll idea about the power-law distribution of waiting times in localized states for disordered semiconductors is applied taking into account Coulomb blockade; Novikov's condition about the asymptotic power-law distribution of time intervals between successful current pulses in conduction channels is fulfilled; and the carrier injection blocking predicted by Ginger and Greenham (2000 J. Appl. Phys. 87 1361) takes place.
Meeting our need for electric energy: the role of nuclear power
International Nuclear Information System (INIS)
1984-07-01
This report focuses on the projected long-term growth of electric demand and the resultant need for new electric generating capacity through the year 2010. It summarizes the results of several technical and economic analyses done over the past two years to present two alternative scenarios for the future growth of nuclear energy in the United States. The first of these scenarios is based on a reference assumption of continued economic recovery and growth, while the second assumes a more vigorous economic recovery. These alternative scenarios reflect both the role that electricity could play in assuring the future economic wellbeing of the United States and the role that nuclear power could play in meeting future electricity needs. The scenarios do not project an expected future; rather, they describe a future that can be achieved only if US industry is revitalized in several key areas and if current obstacles to construction and operation of nuclear power plants are removed. This report underscores the need for renewed domestic industrialization as well as the need for government and industry to take steps to allow nuclear energy to fulfill its original potential. Further, it suggests some specific actions that must be taken if these goals are to be met
Do doctors need statistics? Doctors' use of and attitudes to probability and statistics.
Swift, Louise; Miles, Susan; Price, Gill M; Shepstone, Lee; Leinster, Sam J
2009-07-10
There is little published evidence on what doctors do in their work that requires probability and statistics, yet the General Medical Council (GMC) requires new doctors to have these skills. This study investigated doctors' use of and attitudes to probability and statistics with a view to informing undergraduate teaching. An email questionnaire was sent to 473 clinicians with an affiliation to the University of East Anglia's Medical School. Of the 130 respondents, approximately 90 per cent of doctors who performed each of the following activities found probability and statistics useful for that activity: accessing clinical guidelines and evidence summaries, explaining levels of risk to patients, assessing medical marketing and advertising material, interpreting the results of a screening test, reading research publications for general professional interest, and using research publications to explore non-standard treatment and management options. Seventy-nine per cent (103/130, 95 per cent CI 71 per cent, 86 per cent) of participants considered probability and statistics important in their work. Sixty-three per cent (78/124, 95 per cent CI 54 per cent, 71 per cent) said that there were activities that they could do better or start doing if they had an improved understanding of these areas, and 74 of these participants elaborated on this. Themes highlighted by participants included: being better able to critically evaluate other people's research; becoming more research-active; having a better understanding of risk; and being better able to explain things to, or teach, other people. Our results can be used to inform how probability and statistics should be taught to medical undergraduates and should convince today's medical students of the subjects' relevance to their future careers. Copyright 2009 John Wiley & Sons, Ltd.
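Confidence intervals of the kind quoted above (e.g. 79 per cent, 103/130, CI 71-86 per cent) are easy to reproduce. The sketch below uses the Wilson score interval, which gives roughly 71% to 85% for that proportion; the paper does not state which interval method it used, so this is an assumed choice for illustration.

```python
import math

def wilson_interval(successes, n, z=1.96):
    """Wilson score 95% confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half

lo, hi = wilson_interval(103, 130)
print(f"{lo:.0%} to {hi:.0%}")   # roughly 71% to 85%
```

Unlike the simple normal-approximation interval, the Wilson interval behaves well for proportions near 0 or 1, which is why it is often preferred for survey percentages.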
Spreadsheets as tools for statistical computing and statistics education
Neuwirth, Erich
2000-01-01
Spreadsheets are an ubiquitous program category, and we will discuss their use in statistics and statistics education on various levels, ranging from very basic examples to extremely powerful methods. Since the spreadsheet paradigm is very familiar to many potential users, using it as the interface to statistical methods can make statistics more easily accessible.
Statistical Analysis of Big Data on Pharmacogenomics
Fan, Jianqing; Liu, Han
2013-01-01
This paper discusses statistical methods for estimating complex correlation structure from large pharmacogenomic datasets. We selectively review several prominent statistical methods for estimating large covariance matrix for understanding correlation structure, inverse covariance matrix for network modeling, large-scale simultaneous tests for selecting significantly differently expressed genes and proteins and genetic markers for complex diseases, and high dimensional variable selection for identifying important molecules for understanding molecule mechanisms in pharmacogenomics. Their applications to gene network estimation and biomarker selection are used to illustrate the methodological power. Several new challenges of Big data analysis, including complex data distribution, missing data, measurement error, spurious correlation, endogeneity, and the need for robust statistical methods, are also discussed. PMID:23602905
Powerful Inference with the D-Statistic on Low-Coverage Whole-Genome Data.
Soraggi, Samuele; Wiuf, Carsten; Albrechtsen, Anders
2018-02-02
The detection of ancient gene flow between human populations is an important issue in population genetics. A common tool for detecting ancient admixture events is the D-statistic. The D-statistic is based on the hypothesis of a genetic relationship that involves four populations, whose correctness is assessed by evaluating specific coincidences of alleles between the groups. When working with high-throughput sequencing data, calling genotypes accurately is not always possible; therefore, the D-statistic currently samples a single base from the reads of one individual per population. This implies ignoring much of the information in the data, an issue especially striking in the case of ancient genomes. We provide a significant improvement to overcome the problems of the D-statistic by considering all reads from multiple individuals in each population. We also apply type-specific error correction to combat the problems of sequencing errors, and show a way to correct for introgression from an external population that is not part of the supposed genetic relationship, and how this leads to an estimate of the admixture rate. We prove that the D-statistic is approximated by a standard normal distribution. Furthermore, we show that our method outperforms the traditional D-statistic in detecting admixtures. The power gain is most pronounced for low and medium sequencing depth (1-10×), and performances are as good as with perfectly called genotypes at a sequencing depth of 2×. We show the reliability of error correction in scenarios with simulated errors and ancient data, and correct for introgression in known scenarios to estimate the admixture rates. Copyright © 2018 Soraggi et al.
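The classic D-statistic on called genotypes, which the paper generalizes to read-level data from multiple individuals, reduces to counting ABBA and BABA allele patterns over biallelic sites. A minimal sketch with invented haploid allele calls (H4 as the outgroup):

```python
def d_statistic(sites):
    """Classic four-population D-statistic from haploid allele calls.
    Each site is a tuple (H1, H2, H3, H4), H4 being the outgroup."""
    abba = baba = 0
    for h1, h2, h3, h4 in sites:
        if len({h1, h2, h3, h4}) != 2:   # keep biallelic, polymorphic sites
            continue
        if h1 == h4 and h2 == h3 and h1 != h2:
            abba += 1
        elif h2 == h4 and h1 == h3 and h1 != h2:
            baba += 1
    return (abba - baba) / (abba + baba)

# Invented data: an ABBA excess suggests gene flow between H2 and H3.
sites = [("A", "B", "B", "A")] * 6 + [("B", "A", "B", "A")] * 4
print(d_statistic(sites))   # 0.2
```

Under the null hypothesis of no admixture, ABBA and BABA counts are expected to be equal, so D fluctuates around zero; the paper's contribution is computing this quantity from all reads rather than a single sampled base per population.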
Fulfilling the needs for statistical expertise at Aalborg Hospital
DEFF Research Database (Denmark)
Dethlefsen, Claus
In 2005, the first statistician was employed at Aalborg Hospital due to expanding research activities as part of Aarhus University Hospital. Since then, there has been an increased demand for statistical expertise at all levels. In the talk, I will give an overview of the current staff of statisticians and the organisation. I will give examples from our statistical consultancy and illustrate some of the challenges that have led to research projects with heavy statistical involvement.
Hysteresis and Power-Law Statistics during temperature induced martensitic transformation
International Nuclear Information System (INIS)
Paul, Arya; Sengupta, Surajit; Rao, Madan
2011-01-01
We study hysteresis in temperature-induced martensitic transformation using a 2D model solid exhibiting a square-to-rhombic structural transition. We find that, upon quenching the high-temperature square phase, martensites are nucleated at sites having large non-affineness and ultimately invade the whole of the parent square phase. On heating the martensite, the high-temperature square phase is restored. The transformation proceeds through avalanches. The amplitude and the time-duration of these avalanches exhibit power-law statistics both during heating and cooling of the system. The exponents corresponding to heating and cooling are different, thereby indicating that the nucleation and dissolution of the product phase follow different transformation mechanisms.
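Avalanche exponents of the kind reported above are commonly estimated with the continuous maximum-likelihood estimator for a power-law tail (the Clauset-style form alpha_hat = 1 + n / sum(ln(x/xmin))). The sketch below checks the estimator on synthetic Pareto-distributed "avalanche sizes" with a known exponent; the data and the exponent value are invented.

```python
import math
import random

def powerlaw_alpha_mle(xs, xmin):
    """Continuous power-law exponent MLE for p(x) ~ x^(-alpha), x >= xmin:
    alpha_hat = 1 + n / sum(ln(x / xmin))."""
    tail = [x for x in xs if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

# Synthetic avalanche sizes drawn from a power law with alpha = 2.5
# via inverse-transform sampling; the estimator should recover ~2.5.
rng = random.Random(7)
sizes = [1.0 * rng.random() ** (-1.0 / 1.5) for _ in range(5000)]
print(powerlaw_alpha_mle(sizes, 1.0))
```

Maximum-likelihood fitting is preferred over fitting a straight line to a log-log histogram, whose slope estimate is biased by binning.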
Van Wynsberge, Simon; Gilbert, Antoine; Guillemot, Nicolas; Heintz, Tom; Tremblay-Boyer, Laura
2017-07-01
Extensive biological field surveys are costly and time consuming. To optimize sampling and ensure regular monitoring on the long term, identifying informative indicators of anthropogenic disturbances is a priority. In this study, we used 1800 candidate indicators by combining metrics measured from coral, fish, and macro-invertebrate assemblages surveyed from 2006 to 2012 in the vicinity of an ongoing mining project in the Voh-Koné-Pouembout lagoon, New Caledonia. We performed a power analysis to identify a subset of indicators which would best discriminate temporal changes due to a simulated chronic anthropogenic impact. Only 4% of tested indicators were likely to detect a 10% annual decrease of values with sufficient power (>0.80). Corals generally exerted higher statistical power than macro-invertebrates and fishes because of lower natural variability and higher occurrence. For the same reasons, higher taxonomic ranks provided higher power than lower taxonomic ranks. Nevertheless, a number of families of common sedentary or sessile macro-invertebrates and fishes also performed well in detecting changes: Echinometridae, Isognomidae, Muricidae, Tridacninae, Arcidae, and Turbinidae for macro-invertebrates and Pomacentridae, Labridae, and Chaetodontidae for fishes. Interestingly, these families did not provide high power in all geomorphological strata, suggesting that the ability of indicators in detecting anthropogenic impacts was closely linked to reef geomorphology. This study provides a first operational step toward identifying statistically relevant indicators of anthropogenic disturbances in New Caledonia's coral reefs, which can be useful in similar tropical reef ecosystems where little information is available regarding the responses of ecological indicators to anthropogenic disturbances.
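The power analysis described above can be sketched as a Monte Carlo calculation: simulate indicator values before and after a prescribed decline, apply a test, and count rejections. The test (a one-sided two-sample z-test) and all parameter values below are invented stand-ins for the paper's procedure; the point is how natural variability (here a coefficient of variation) drives power.

```python
import math
import random
import statistics

def mc_power(mu, decline, cv, n, n_sim=2000, seed=42):
    """Monte Carlo power of a one-sided two-sample z-test to detect a
    fractional `decline` in an indicator with coefficient of variation `cv`."""
    rng = random.Random(seed)
    z_crit = 1.6449          # one-sided 5% critical value
    hits = 0
    for _ in range(n_sim):
        before = [rng.gauss(mu, cv * mu) for _ in range(n)]
        after = [rng.gauss(mu * (1 - decline), cv * mu) for _ in range(n)]
        se = math.sqrt(statistics.variance(before) / n +
                       statistics.variance(after) / n)
        if (statistics.mean(before) - statistics.mean(after)) / se > z_crit:
            hits += 1
    return hits / n_sim

# A noisy indicator (cv = 0.5) has far less power than a stable one (cv = 0.1),
# mirroring why low-variability, high-occurrence metrics performed best.
print(mc_power(mu=100, decline=0.10, cv=0.5, n=10))
print(mc_power(mu=100, decline=0.10, cv=0.1, n=10))
```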
Directory of Open Access Journals (Sweden)
Yolanda Escalante
2012-09-01
Full Text Available The aims of this study were (i) to compare women's water polo game-related statistics by match outcome (winning and losing teams) and phase (preliminary, classificatory, and semi-final/bronze medal/gold medal), and (ii) to identify characteristics that discriminate performances for each phase. The game-related statistics of the 124 women's matches played in five International Championships (World and European Championships) were analyzed. Differences between winning and losing teams in each phase were determined using the chi-squared test. A discriminant analysis was then performed according to context in each of the three phases. It was found that the game-related statistics differentiate the winning from the losing teams in each phase of an international championship. The differentiating variables were both offensive (centre goals, power-play goals, counterattack goals, assists, offensive fouls, steals, blocked shots, and won sprints) and defensive (goalkeeper-blocked shots, goalkeeper-blocked inferiority shots, and goalkeeper-blocked 5-m shots). The discriminant analysis showed the game-related statistics to discriminate performance in all phases: preliminary, classificatory, and final phases (92%, 90%, and 83%, respectively). Two variables were discriminatory by match outcome (winning or losing teams) in all three phases: goals and goalkeeper-blocked shots.
Nateghi, Roshanak; Guikema, Seth D; Quiring, Steven M
2011-12-01
This article compares statistical methods for modeling power outage durations during hurricanes and examines the predictive accuracy of these methods. Being able to make accurate predictions of power outage durations is valuable because the information can be used by utility companies to plan their restoration efforts more efficiently. This information can also help inform customers and public agencies of the expected outage times, enabling better collective response planning, and coordination of restoration efforts for other critical infrastructures that depend on electricity. In the long run, outage duration estimates for future storm scenarios may help utilities and public agencies better allocate risk management resources to balance the disruption from hurricanes with the cost of hardening power systems. We compare the out-of-sample predictive accuracy of five distinct statistical models for estimating power outage duration times caused by Hurricane Ivan in 2004. The methods compared include both regression models (accelerated failure time (AFT) and Cox proportional hazard models (Cox PH)) and data mining techniques (regression trees, Bayesian additive regression trees (BART), and multivariate additive regression splines). We then validate our models against two other hurricanes. Our results indicate that BART yields the best prediction accuracy and that it is possible to predict outage durations with reasonable accuracy. © 2011 Society for Risk Analysis.
DEFF Research Database (Denmark)
Denwood, M.J.; McKendrick, I.J.; Matthews, L.
Introduction. There is an urgent need for a method of analysing FECRT data that is computationally simple and statistically robust. A method for evaluating the statistical power of a proposed FECRT study would also greatly enhance the current guidelines. Methods. A novel statistical framework has been developed that evaluates observed FECRT data against two null hypotheses: (1) the observed efficacy is consistent with the expected efficacy, and (2) the observed efficacy is inferior to the expected efficacy. The method requires only four simple summary statistics of the observed data. Power...... that the notional type 1 error rate of the new statistical test is accurate. Power calculations demonstrate a power of only 65% with a sample size of 20 treatment and control animals, which increases to 69% with 40 control animals or 79% with 40 treatment animals. Discussion. The method proposed is simple
From probabilistic forecasts to statistical scenarios of short-term wind power production
DEFF Research Database (Denmark)
Pinson, Pierre; Papaefthymiou, George; Klockl, Bernd
2009-01-01
Short-term (up to 2-3 days ahead) probabilistic forecasts of wind power provide forecast users with highly valuable information on the uncertainty of expected wind generation. Whatever the type of these probabilistic forecasts, they are produced on a per horizon basis, and hence do not inform on the development of the forecast uncertainty through forecast series. However, this additional information may be paramount for a large class of time-dependent and multistage decision-making problems, e.g. optimal operation of combined wind-storage systems or multiple-market trading with different gate closures. This issue is addressed here by describing a method that permits the generation of statistical scenarios of short-term wind generation that accounts for both the interdependence structure of prediction errors and the predictive distributions of wind power production. The method is based on the conversion
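A conversion of per-horizon predictive distributions into temporally coherent scenarios is commonly implemented as a Gaussian copula: draw correlated standard normals across lead times, then map each through the corresponding horizon's predictive quantile function. The sketch below assumes an AR(1) interdependence structure and uniform predictive marginals purely for illustration; it is a generic copula sketch, not the paper's fitted model.

```python
import math
import random

def norm_cdf(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def wind_scenario(quantile_fns, rho, rng):
    """One scenario: AR(1)-correlated standard normals pushed through each
    lead time's predictive quantile function (Gaussian copula)."""
    z = rng.gauss(0.0, 1.0)
    scenario = []
    for qf in quantile_fns:
        scenario.append(qf(norm_cdf(z)))
        z = rho * z + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
    return scenario

# Toy predictive distributions: power uniform on [lo, hi] at each horizon.
qfs = [lambda u, lo=lo, hi=hi: lo + u * (hi - lo)
       for lo, hi in [(0.1, 0.3), (0.2, 0.6), (0.3, 0.9)]]
s = wind_scenario(qfs, rho=0.9, rng=random.Random(3))
```

A high rho keeps successive horizons close in probability level, so each scenario traces a plausible trajectory through the predictive intervals rather than jumping independently at every lead time.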
The role of the dorsoanterior striatum in implicit motivation: The case of the need for power
Directory of Open Access Journals (Sweden)
Oliver C Schultheiss
2013-04-01
Implicit motives like the need for power (nPower) scale affective responses to need-specific rewards or punishments and thereby influence activity in motivational-brain structures. In this paper, we review evidence specifically supporting a role of the striatum in nPower. Individual differences in nPower predict (a) enhanced implicit learning accuracy, but not speed, on serial-response tasks that are reinforced by power-related incentives (e.g., winning or losing a contest; dominant or submissive emotional expressions) in behavioral studies and (b) activation of the anterior caudate in response to dominant emotional expressions in brain imaging research. We interpret these findings on the basis of Hikosaka, Nakamura, Sakai, and Nakahara's (2002; Current Opinion in Neurobiology, 12(2), 217-222) model of central mechanisms of motor skill learning. The model assigns a critical role to the dorsoanterior striatum in dopamine-driven learning of spatial stimulus sequences. Based on this model, we suggest that the dorsoanterior striatum is the locus of nPower-dependent reinforcement. However, given the centrality of this structure in a wide range of motivational pursuits, we also propose that activity in the dorsoanterior striatum may not only reflect individual differences in nPower, but also in other implicit motives, like the need for achievement or the need for affiliation, provided that the proper incentives for these motives are present during reinforcement learning. We discuss evidence in support of such a general role of the dorsoanterior striatum in implicit motivation.
Chung, Moo K.; Kim, Seung-Goo; Schaefer, Stacey M.; van Reekum, Carien M.; Peschke-Schmitz, Lara; Sutterer, Matthew J.; Davidson, Richard J.
2014-03-01
The sparse regression framework has been widely used in medical image processing and analysis. However, it has been rarely used in anatomical studies. We present a sparse shape modeling framework using the Laplace-Beltrami (LB) eigenfunctions of the underlying shape and show its improvement of statistical power. Traditionally, the LB-eigenfunctions are used as a basis for intrinsically representing surface shapes as a form of Fourier descriptors. To reduce high frequency noise, only the first few terms are used in the expansion and higher frequency terms are simply thrown away. However, some lower frequency terms may not necessarily contribute significantly in reconstructing the surfaces. Motivated by this idea, we present a LB-based method to filter out only the significant eigenfunctions by imposing a sparse penalty. For dense anatomical data such as deformation fields on a surface mesh, the sparse regression behaves like a smoothing process, which will reduce the error of incorrectly detecting false negatives. Hence the statistical power improves. The sparse shape model is then applied in investigating the influence of age on amygdala and hippocampus shapes in the normal population. The advantage of the LB sparse framework is demonstrated by showing the increased statistical power.
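The selection idea described above (keep only significant basis terms rather than truncating at a frequency cutoff) can be sketched without a mesh solver by substituting any orthonormal basis for the LB eigenfunctions: with an orthonormal design, the LASSO solution reduces to soft-thresholding of the raw coefficients. The basis, signal, and penalty below are illustrative stand-ins, not the paper's data.

```python
import numpy as np

def sparse_basis_fit(signal, basis, lam):
    """Sparse regression on an orthonormal basis: project, then apply the
    LASSO soft-threshold so insignificant basis functions drop out,
    whatever their frequency (unlike plain low-pass truncation)."""
    coef = basis.T @ signal                                # raw coefficients
    coef = np.sign(coef) * np.maximum(np.abs(coef) - lam, 0.0)
    return basis @ coef, coef

# toy orthonormal basis from a QR factorization (stand-in for LB modes)
rng = np.random.default_rng(0)
basis, _ = np.linalg.qr(rng.standard_normal((128, 32)))
signal = 3.0 * basis[:, 5] + 0.1 * rng.standard_normal(128)
smooth, coef = sparse_basis_fit(signal, basis, lam=0.5)
```

Only the genuinely informative mode survives the threshold; pure-noise coefficients are zeroed out, which is the smoothing behavior the abstract credits for the gain in statistical power.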
Statistical analysis about corrosion in nuclear power plants
International Nuclear Information System (INIS)
Naquid G, C.; Medina F, A.; Zamora R, L.
1999-01-01
Investigations have been carried out into the degradation mechanisms of structures, systems and components in nuclear power plants, since many of the processes involved are responsible for plant reliability, component integrity, safety and other aspects. This work presents statistics of the studies on materials corrosion, in its wide variety of specific mechanisms, as reported at the world level for PWR, BWR and WWER reactors, analysing the AIRS (Advanced Incident Reporting System) for the period 1993-1998 for the first two plant types and for the period 1982-1995 for the WWER. The identified factors are characterized either as applicable, i.e. events that occurred through the presence of some corrosion mechanism, or as not applicable, i.e. incidents due to natural factors, mechanical failures or human errors. The total number of cases analysed corresponds to the sum of the applicable and non-applicable cases. (Author)
DEFF Research Database (Denmark)
Heide, Dominik; Greiner, Martin; von Bremen, Lüder
2011-01-01
The storage and balancing needs of a simplified European power system, which is based on wind and solar power generation only, are derived from an extensive weather-driven modeling of hourly power mismatches between generation and load. The storage energy capacity, the annual balancing energy and the balancing power are found to depend significantly on the mixing ratio between wind and solar power generation. They decrease strongly with the overall excess generation. At 50% excess generation the required long-term storage energy capacity and annual balancing energy amount to 1% of the annual consumption. The required balancing power turns out to be 25% of the average hourly load. These numbers are in agreement with current hydro storage lakes in Scandinavia and the Alps, as well as with potential hydrogen storage in mostly North-German salt caverns.
A statistical model of uplink inter-cell interference with slow and fast power control mechanisms
Tabassum, Hina; Yilmaz, Ferkan; Dawy, Zaher; Alouini, Mohamed-Slim
2013-01-01
Uplink power control is in essence an interference mitigation technique that aims at minimizing the inter-cell interference (ICI) in cellular networks by reducing the transmit power levels of the mobile users while maintaining their target received signal quality levels at base stations. Power control mechanisms directly impact the interference dynamics and, thus, affect the overall achievable capacity and consumed power in cellular networks. Due to the stochastic nature of wireless channels and mobile users' locations, it is important to derive theoretical models for ICI that can capture the impact of design alternatives related to power control mechanisms. To this end, we derive and verify a novel statistical model for uplink ICI in Generalized-K composite fading environments as a function of various slow and fast power control mechanisms. The derived expressions are then utilized to quantify numerically key network performance metrics that include average resource fairness, average reduction in power consumption, and ergodic capacity. The accuracy of the derived expressions is validated via Monte-Carlo simulations. Results are generated for multiple network scenarios, and insights are extracted to assess various power control mechanisms as a function of system parameters. © 1972-2012 IEEE.
International Nuclear Information System (INIS)
Behringer, K.; Spiekerman, G.
1984-01-01
Piety (1977) proposed an automated signature analysis of power spectral density data. Eight statistical decision discriminants are introduced. For nearly all the discriminants, improved confidence statements can be made. The statistical characteristics of the last three discriminants, which are applications of non-parametric tests, are considered. (author)
2014-01-01
Background Analysis of variance (ANOVA), change-score analysis (CSA) and analysis of covariance (ANCOVA) respond differently to baseline imbalance in randomized controlled trials. However, no empirical studies appear to have quantified the differential bias and precision of estimates derived from these methods of analysis, and their relative statistical power, in relation to combinations of levels of key trial characteristics. This simulation study therefore examined the relative bias, precision and statistical power of these three analyses using simulated trial data. Methods 126 hypothetical trial scenarios were evaluated (126 000 datasets), each with continuous data simulated by using a combination of levels of: treatment effect; pretest-posttest correlation; direction and magnitude of baseline imbalance. The bias, precision and power of each method of analysis were calculated for each scenario. Results Compared to the unbiased estimates produced by ANCOVA, both ANOVA and CSA are subject to bias, in relation to pretest-posttest correlation and the direction of baseline imbalance. Additionally, ANOVA and CSA are less precise than ANCOVA, especially when pretest-posttest correlation ≥ 0.3. When groups are balanced at baseline, ANCOVA is at least as powerful as the other analyses. Apparently greater power of ANOVA and CSA at certain imbalances is achieved in respect of a biased treatment effect. Conclusions Across a range of correlations between pre- and post-treatment scores and at varying levels and direction of baseline imbalance, ANCOVA remains the optimum statistical method for the analysis of continuous outcomes in RCTs, in terms of bias, precision and statistical power. PMID:24712304
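The bias pattern the abstract reports can be illustrated with a small simulation (a sketch with illustrative parameters, not the paper's 126 scenarios): under baseline imbalance, follow-up-only ANOVA and change-score analysis are biased in opposite directions, while ANCOVA recovers the treatment effect.

```python
import numpy as np

def trial_bias(n=50, rho=0.6, effect=0.5, imbalance=0.3,
               n_sim=1000, seed=0):
    """Average bias of three analyses of a two-arm pre/post trial with a
    deliberate baseline imbalance: follow-up-only ANOVA, change-score
    analysis (CSA) and baseline-adjusted ANCOVA."""
    rng = np.random.default_rng(seed)
    est = {"anova": [], "csa": [], "ancova": []}
    for _ in range(n_sim):
        pre_c = rng.standard_normal(n)
        pre_t = rng.standard_normal(n) + imbalance     # imbalance at baseline
        noise = np.sqrt(1.0 - rho ** 2)
        post_c = rho * pre_c + noise * rng.standard_normal(n)
        post_t = rho * pre_t + noise * rng.standard_normal(n) + effect
        est["anova"].append(post_t.mean() - post_c.mean())
        est["csa"].append((post_t - pre_t).mean() - (post_c - pre_c).mean())
        # ANCOVA: regress follow-up on group plus centered baseline
        pre = np.concatenate([pre_c, pre_t])
        post = np.concatenate([post_c, post_t])
        grp = np.repeat([0.0, 1.0], n)
        X = np.column_stack([np.ones(2 * n), grp, pre - pre.mean()])
        est["ancova"].append(np.linalg.lstsq(X, post, rcond=None)[0][1])
    return {k: float(np.mean(v)) - effect for k, v in est.items()}
```

With these parameters the expected biases are approximately rho × imbalance = 0.18 for ANOVA and (rho − 1) × imbalance = −0.12 for CSA, against essentially zero for ANCOVA, matching the paper's qualitative conclusion.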
Statistical learning: a powerful mechanism that operates by mere exposure.
Aslin, Richard N
2017-01-01
How do infants learn so rapidly and with little apparent effort? In 1996, Saffran, Aslin, and Newport reported that 8-month-old human infants could learn the underlying temporal structure of a stream of speech syllables after only 2 min of passive listening. This demonstration of what was called statistical learning, involving no instruction, reinforcement, or feedback, led to dozens of confirmations of this powerful mechanism of implicit learning in a variety of modalities, domains, and species. These findings reveal that infants are not nearly as dependent on explicit forms of instruction as we might have assumed from studies of learning in which children or adults are taught facts such as math or problem solving skills. Instead, at least in some domains, infants soak up the information around them by mere exposure. Learning and development in these domains thus appear to occur automatically and with little active involvement by an instructor (parent or teacher). The details of this statistical learning mechanism are discussed, including how exposure to specific types of information can, under some circumstances, generalize to never-before-observed information, thereby enabling transfer of learning. WIREs Cogn Sci 2017, 8:e1373. doi: 10.1002/wcs.1373 For further resources related to this article, please visit the WIREs website. © 2016 Wiley Periodicals, Inc.
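The statistic at the heart of the Saffran, Aslin, and Newport demonstration is the forward transitional probability between adjacent syllables, which is high inside a "word" and drops at word boundaries. A minimal computation over a hypothetical syllable stream (the syllables below are made up for illustration):

```python
from collections import Counter

def transition_probs(stream):
    """Forward transitional probability P(next | current), the statistic
    statistical-learning accounts assume listeners track implicitly."""
    pairs = Counter(zip(stream, stream[1:]))      # adjacent-pair counts
    firsts = Counter(stream[:-1])                 # first-element counts
    return {(a, b): n / firsts[a] for (a, b), n in pairs.items()}

# hypothetical stream built from the "words" bi-da-ku and pa-do-ti
stream = "bi da ku pa do ti bi da ku go la bu pa do ti bi da ku".split()
tp = transition_probs(stream)
```

Here a within-word pair such as ("bi", "da") has probability 1.0, while a boundary pair such as ("ku", "pa") is lower, which is exactly the contrast a passive learner could exploit to segment words.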
Lu, Qiongshi; Li, Boyang; Ou, Derek; Erlendsdottir, Margret; Powles, Ryan L; Jiang, Tony; Hu, Yiming; Chang, David; Jin, Chentian; Dai, Wei; He, Qidu; Liu, Zefeng; Mukherjee, Shubhabrata; Crane, Paul K; Zhao, Hongyu
2017-12-07
Despite the success of large-scale genome-wide association studies (GWASs) on complex traits, our understanding of their genetic architecture is far from complete. Jointly modeling multiple traits' genetic profiles has provided insights into the shared genetic basis of many complex traits. However, large-scale inference sets a high bar for both statistical power and biological interpretability. Here we introduce a principled framework to estimate annotation-stratified genetic covariance between traits using GWAS summary statistics. Through theoretical and numerical analyses, we demonstrate that our method provides accurate covariance estimates, thereby enabling researchers to dissect both the shared and distinct genetic architecture across traits to better understand their etiologies. Among 50 complex traits with publicly accessible GWAS summary statistics (total N ≈ 4.5 million), we identified more than 170 pairs with statistically significant genetic covariance. In particular, we found strong genetic covariance between late-onset Alzheimer disease (LOAD) and amyotrophic lateral sclerosis (ALS), two major neurodegenerative diseases, in single-nucleotide polymorphisms (SNPs) with high minor allele frequencies and in SNPs located in the predicted functional genome. Joint analysis of LOAD, ALS, and other traits highlights LOAD's correlation with cognitive traits and hints at an autoimmune component for ALS. Copyright © 2017 American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.
International Nuclear Information System (INIS)
Eberhardt, L.L.; Thomas, J.M.
1986-07-01
This project was designed to develop guidance for implementing 10 CFR Part 61 and to determine the overall needs for sampling and statistical work in characterizing, surveying, monitoring, and closing commercial low-level waste sites. When cost-effectiveness and statistical reliability are of prime importance, then double sampling, compositing, and stratification (with optimal allocation) are identified as key issues. If the principal concern is avoiding questionable statistical practice, then the applicability of kriging (for assessing spatial pattern), methods for routine monitoring, and use of standard textbook formulae in reporting monitoring results should be reevaluated. Other important issues identified include sampling for estimating model parameters and the use of data from left-censored (less than detectable limits) distributions
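The "stratification (with optimal allocation)" identified above is usually implemented as Neyman allocation, which assigns each stratum a share of the sampling budget proportional to its size times its standard deviation. A minimal sketch with hypothetical site values:

```python
def neyman_allocation(n_total, stratum_sizes, stratum_sds):
    """Neyman (optimal) allocation: sample stratum h in proportion to
    N_h * S_h, so larger and more variable strata get more of the
    sampling budget."""
    weights = [N * s for N, s in zip(stratum_sizes, stratum_sds)]
    total = sum(weights)
    return [round(n_total * w / total) for w in weights]

# hypothetical site: a quiet background stratum and a variable hot-spot
alloc = neyman_allocation(100, stratum_sizes=[400, 100],
                          stratum_sds=[1.0, 12.0])
```

Even though the hot-spot stratum is a quarter the size of the background, its much larger variability earns it the majority of the samples, which is the cost-effectiveness argument the report makes for optimal allocation.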
Official Statistics and Statistics Education: Bridging the Gap
Directory of Open Access Journals (Sweden)
Gal Iddo
2017-03-01
This article aims to challenge official statistics providers and statistics educators to ponder on how to help non-specialist adult users of statistics develop those aspects of statistical literacy that pertain to official statistics. We first document the gap in the literature in terms of the conceptual basis and educational materials needed for such an undertaking. We then review skills and competencies that may help adults to make sense of statistical information in areas of importance to society. Based on this review, we identify six elements related to official statistics about which non-specialist adult users should possess knowledge in order to be considered literate in official statistics: (1) the system of official statistics and its work principles; (2) the nature of statistics about society; (3) indicators; (4) statistical techniques and big ideas; (5) research methods and data sources; and (6) awareness and skills for citizens' access to statistical reports. Based on this ad hoc typology, we discuss directions that official statistics providers, in cooperation with statistics educators, could take in order to (1) advance the conceptualization of skills needed to understand official statistics, and (2) expand educational activities and services, specifically by developing a collaborative digital textbook and a modular online course, to improve public capacity for understanding of official statistics.
Sustaining the future: the role of nuclear power in meeting future world energy needs
International Nuclear Information System (INIS)
Duffey, R.; Sun, Y.
2003-01-01
A description is given of recently informed analyses showing the potential that nuclear power has in meeting global energy demands. For both the electricity and transportation sectors, we can quantify the beneficial effects on the environment, and we show how nuclear power deserves credit for its role in assisting future world energy, environmental and economic sustainability. The continuing expansion of the world's and Asia's energy needs, coupled with the need to reduce greenhouse gas (GHG) and other emissions, will require new approaches for large scale energy production and use. This is particularly important for China and Asia with respect to meeting both the energy demand and sustainability challenges. We show and explore the role of nuclear power for large-scale energy applications, including electricity production and hydrogen for transportation. Advanced nuclear technologies, such as those like CANDU's next generation ACR, can meet future global energy market needs, avoid emissions, and mitigate the potential for global climate change. We use the latest IPCC Scenarios out to the year 2100 as a base case, but correct them to examine the sensitivity to large scale nuclear and hydrogen fuel penetration. We show a significant impact of nuclear energy on energy market penetration, and in reducing GHGs and other emissions in the coming century, particularly in the industrial developing world and in Asia. This is achieved without needing emissions credits, as are used or needed as economic support for other sources, or for subsidies via emissions trading schemes. Nuclear power offers the relatively emissions-free means, both to provide electricity for traditional applications and, by electrolytic production of hydrogen, to extend its use deep into the transportation sector. For the published IPCC Marker Scenarios for Asia we show the reduction in GHG emissions when electrolysis using electricity from nuclear power assists the introduction of hydrogen as a fuel
High technology and the courts: nuclear power and the need for institutional reform
International Nuclear Information System (INIS)
Yellin, J.
1981-01-01
In this article Professor Yellin analyzes the performance of the courts when confronted with the important and complex issues attending the commercial development of nuclear power. He draws three general conclusions from the analysis: (1) the failure of nuclear regulation indicates that substantive review of agency decision making is necessary; (2) the limitations of the courts' ability to understand the scientific and technological arguments inherent in the nuclear power cases suggest the need for hybrid legal and scientific oversight of technological decisions; and (3) procedural requirements of the adversary system tend to impede full presentation of the issues in nuclear power cases, again pointing to the need for new systems of review. Professor Yellin proposes creation of a permanent review board composed of masters trained in both science and law to which technological and scientific issues falling outside the special competence of the judiciary would be referred by the federal appellate courts
Directory of Open Access Journals (Sweden)
Sandra Doeze Jager
2017-10-01
The present study focused on self-other agreement between employees on their Need for Achievement, Need for Power and Need for Affiliation, needs that are relevant for performance and wellbeing at work. The Social Relations Model was used to examine consensus between other-raters, self-other agreement and assumed similarity (seeing others as one sees oneself) on these needs. Data were collected among 168 employees from a Dutch non-profit organization, with four employees in each of 42 teams. Consensus between other-raters occurred for all needs. Self-other agreement existed for the Needs for Achievement and Power, but not for Affiliation. Assumed similarity occurred for the Need for Achievement, but not for the other needs. Findings for the Need for Achievement demonstrate a traditional rating pattern exhibiting consensus, self-other agreement and assumed similarity. The absence of assumed similarity for the Need for Power implies that employees are able to distinguish between their own and their peers' needs to have influence at work. The lack of self-other agreement for the Need for Affiliation may imply that improving others' awareness of one's need to connect is necessary to enhance one's well-being at work. Our findings may be useful to organizations, as being knowledgeable about one's employees' needs is important to improve the fit between their needs and the job.
The role of nuclear power in meeting future U.S. energy needs
International Nuclear Information System (INIS)
Wiggin, E.A.
1977-01-01
Uranium and coal are domestically available fuels that provide a viable means for augmenting dwindling indigenous supplies of natural gas and imported oil to meet U.S. electric power needs during the remainder of the twentieth century. Their availability, coupled with a proven technology for utilizing them efficiently, offers the U.S. an attractive route to energy independence. Uranium, through the breeder, offers the further potential of helping the U.S. meet its energy needs into the twenty-first century - hopefully, until fusion and other high technology alternatives can be brought to commercialization. Notwithstanding the financial, regulatory and institutional problems that have plagued the U.S. nuclear power program over the past two years, the utility industry has maintained its commitment to nuclear power. This reflects to some extent, the absence of a viable alternative. But more importantly, it reflects a conviction on the part of the U.S. utility industry, based on an unmatched record of reliable and safe performance, that nuclear power offers the most economic and most environmentally acceptable means of generating steam to produce electricity. This paper considers the likelihood of the U.S. industry's further commitment to nuclear power vis-a-vis comparable commitments in other parts of the world. It points out wherein those commitments will be contingent upon closing the nuclear fuel cycle. It also addresses, the extent to which such commitments will be influenced by government policy and regulation, financing and manpower requisites, and the challenge of the critics. Finally, this paper points out wherein nuclear power is already a de facto key to a U.S. national energy policy and wherein it can further contribute to the goal of energy independence
International Nuclear Information System (INIS)
Zeljko, M.; Bajs, D.
1998-01-01
Due to the development of the electric power system and the increase in electrical energy consumption, the need for larger units in new power plants is obvious. Connection of large nuclear power plants to the grid, depending on their power and location, usually requires significant investment in transmission network development and construction. Considering the capacity of the 400 kV transmission network in Croatia, this problem is evident. This paper deals with the possibilities of nuclear power plant construction, as one possible option in electric power system development, and its interconnection to the electricity grid. (author)
Nuclear power plants: 2009 atw compact statistics
International Nuclear Information System (INIS)
Anon.
2010-01-01
At the turn of 2009/2010, nuclear power plants were available for energy supply in 30 countries of the world. A total of 437 nuclear power plants, which is one plant less than at the 2008/2009 turn, were in operation with an aggregate gross power of approx. 391 GWe and an aggregate net power, respectively, of 371 GWe. The available gross power of nuclear power plants did not change noticeably from 2008 to the end of 2009. In total 2 nuclear generating units were commissioned in 2009: one NPP started operation in India and one in Japan. Three nuclear generating units in Japan (2) and Lithuania (1) were decommissioned in 2009. 52 nuclear generating units, i.e. 10 plants more than at the end of 2008, with an aggregate gross power of approx. 51 GWe, were under construction in 14 countries at the end of 2009. New or continued projects are notified from (in brackets: number of new projects): China (+9), Russia (1), and South Korea (1). Some 84 new nuclear power plants are in the concrete project design, planning and licensing phases worldwide; on some of them, contracts have already been awarded. Further units are in their preliminary project phases. (orig.)
Nuclear power plants: 2008 atw compact statistics
International Nuclear Information System (INIS)
Anon.
2009-01-01
At the turn of 2008/2009, nuclear power plants were available for energy supply in 31 countries of the world. A total of 438 nuclear power plants, which is one plant less than at the 2007/2008 turn, were in operation with an aggregate gross power of approx. 393 GWe and an aggregate net power, respectively, of 372 GWe. The available gross power of nuclear power plants did not change noticeably from 2007 to the end of 2008. No nuclear generating unit was commissioned in 2008. One nuclear generating unit in the Slovak Republic was decommissioned in 2008. 42 nuclear generating units, i.e. 10 plants more than at the end of 2007, with an aggregate gross power of approx. 38 GWe, were under construction in 14 countries at the end of 2008. New or continued projects are notified from (in brackets: number of new projects): Bulgaria (2), China (5), South Korea (2), Russia (1), and the Slovak Republic (2). Some 80 new nuclear power plants are in the concrete project design, planning and licensing phases worldwide; on some of them, contracts have already been awarded. Another approximately 120 units are in their preliminary project phases. (orig.)
Nuclear power plants: 2004 atw compact statistics
International Nuclear Information System (INIS)
Anon.
2005-01-01
In late 2004, nuclear power plants were available for power supply or were under construction in 32 countries worldwide. A total of 441 nuclear power plants, i.e. two plants more than in late 2003, were in operation with an aggregate gross power of approx. 386 GWe and an aggregate net power, respectively, of 362 GWe, in 31 countries. The available capacity of nuclear power plants increased by approx. 5 GWe as a result of the additions by the six units newly commissioned: Hamaoka 5 (Japan), Ulchin 6 (Korea), Kalinin 3 (Russia), Khmelnitski 2 (Ukraine), Qinshan II-2 (People's Republic of China), and Rowno 4 (Ukraine). In addition, unit 3 of the Bruce A nuclear power plant in Canada with a power of 825 MWe was restarted after an outage of many years. Contrary to earlier plans, a recommissioning program was initiated for the Bruce A-1 and A-2 units, which are also down at present. Five plants were decommissioned for good in 2004; Chapelcross 1 to 4 with 50 MWe each in the United Kingdom, and Ignalina 1 with 1 300 MWe in Lithuania. 22 nuclear generating units with an aggregate gross power of 19 GWe in nine countries were under construction in late 2004. In India, construction work was started on a new project, the 500 MWe PFBR prototype fast breeder reactor. In France, the EDF utility announced its intention to build an EPR on the Flamanville site beginning in 2007. (orig.)
International Nuclear Information System (INIS)
Noble, J.B.; Hemphill, J.B.
1978-03-01
The Department of Energy's Nuclear Siting and Licensing Act of 1978 (S. 2775; H. R. 11704) proposes Federal/State coordination in need for facility decisionmaking for nuclear power stations. The present study examines the decisionmaking criteria used by forty-four States in making a determination of need for power/facility. Specific criteria are identified along with the number of States which make those criteria a primary or a secondary consideration in determining need for facility. Individual profiles of the studied States' decisionmaking criteria are provided. In addition, the study examines the different organizational and functional patterns found in the States' regulatory process to certificate power stations. The coordination or lack of coordination of the issuance of associated environmental permits required for power stations is outlined for each State. Information concerning States' rate treatment of expenses associated with the construction and operation of a power station is provided. The relationship between the need for power decisionmaking process and the ratemaking process is explored
Nuclear power plants: 2005 atw compact statistics
International Nuclear Information System (INIS)
Anon.
2006-01-01
Nuclear power plants were available for power supply and under construction, respectively, in 32 countries of the world as per end of 2005. A total of 444 nuclear power plants, i.e. three plants more than at the end of 2004, with an aggregate gross power of approx. 389 GWe and an aggregate net power of 370 GWe, respectively, were in operation in 31 countries. The available capacity of nuclear power plants increased by some 4,5 GWe as a result of the capacities added by the four newly commissioned units of Higashidori 1 (Japan), Shika 2 (Japan), Tarapur 4 (India), and Tianwan 1 (China). In addition, unit A-1 of the Pickering nuclear power station in Canada, with 825 MWe, was restarted after a downtime of several years. Two plants were decommissioned for good in 2005: Obrigheim in Germany, and Barsebaeck 2 in Sweden. 23 nuclear generating units, i.e. one unit more than in late 2004, with an aggregate gross power of approx. 19 GWe were still under construction in nine countries by late 2005. In Pakistan, construction of a new project, Chasnupp 2, was started; in China, construction was begun of two units, Lingao Phase 2, units 3 and 4, and in Japan, the Shimane 3 generating unit is being built. (orig.)
Hydropower: a vital asset in a power system with increased need for flexibility and firm capacity
International Nuclear Information System (INIS)
Weisrock, Ghislain
2016-02-01
In a power system with increased need for flexibility, wind and solar power are characterised by considerable volatility across different scales and their output cannot be predicted with certainty. In order to deal with the resulting variations and forecast errors, system operators as well as electricity markets will need to have access to increasing volumes of flexibility as the penetration of wind and solar power grows. Due to their flexibility and size, hydropower plants are perfectly suited for supplying these capabilities to current and future electricity markets and power systems. Storage as well as pump storage plants can be quickly started within a few minutes and adjust their output within seconds. Consequently, hydropower plants are able to follow even major variations in real time. (author)
Statistical utility theory for comparison of nuclear versus fossil power plant alternatives
International Nuclear Information System (INIS)
Garribba, S.; Ovi, A.
1977-01-01
A statistical formulation of utility theory is developed for decision problems concerned with the choice among alternative strategies in electric energy production. Four alternatives are considered: nuclear power, fossil power, solar energy, and conservation policy. Attention is focused on a public electric utility thought of as a rational decision-maker. A framework for decisions is then suggested where the admissible strategies and their possible consequences represent the information available to the decision-maker. Once the objectives of the decision process are assessed, consequences can be quantified in terms of measures of effectiveness. Maximum expected utility is the criterion of choice among alternatives. Steps toward expected values are the evaluation of the multidimensional utility function and the assessment of subjective probabilities for consequences. In this respect, the multiplicative form of the utility function seems less restrictive than the additive form and almost as manageable to implement. Probabilities are expressed through subjective marginal probability density functions given at a discrete number of points. The final stage of the decision model is to establish the value of each strategy. To this end, expected utilities are computed and scaled. The result is that nuclear power offers the best alternative. 8 figures, 9 tables, 32 references
DbAccess: Interactive Statistics and Graphics for Plasma Physics Databases
International Nuclear Information System (INIS)
Davis, W.; Mastrovito, D.
2003-01-01
DbAccess is an X-windows application, written in IDL®, meeting many specialized statistical and graphical needs of NSTX [National Spherical Torus Experiment] plasma physicists, such as regression statistics and the analysis of variance. Flexible "views" and "joins", which include options for complex SQL expressions, facilitate mixing data from different database tables. General Atomics Plot Objects add extensive graphical and interactive capabilities. An example is included for plasma confinement-time scaling analysis using a multiple linear regression least-squares power-law fit.
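The confinement-time scaling analysis mentioned above reduces to a multiple linear regression in log space. A minimal sketch under invented, hypothetical variables (plasma current and density with made-up exponents and units), not NSTX data or the DbAccess/IDL implementation:

```python
import math, random

def fit_power_law(records):
    """Least-squares fit of tau = C * I^a * n^b via linear regression in
    log space, solving the 3x3 normal equations by Gaussian elimination."""
    X = [[1.0, math.log(I), math.log(n)] for I, n, _ in records]
    y = [math.log(t) for _, _, t in records]
    A = [[sum(r[i]*r[j] for r in X) for j in range(3)] for i in range(3)]
    b = [sum(r[i]*yi for r, yi in zip(X, y)) for i in range(3)]
    for col in range(3):                       # elimination with pivoting
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * 3
    for r in (2, 1, 0):                        # back substitution
        beta[r] = (b[r] - sum(A[r][c]*beta[c] for c in range(r+1, 3))) / A[r][r]
    return math.exp(beta[0]), beta[1], beta[2]  # C, a, b

random.seed(1)
# synthetic shots: tau = 0.05 * I^0.9 * n^0.4 with 5% lognormal scatter
data = []
for _ in range(200):
    I = random.uniform(0.5, 2.0)    # plasma current (hypothetical units)
    n = random.uniform(1.0, 8.0)    # density (hypothetical units)
    tau = 0.05 * I**0.9 * n**0.4 * math.exp(random.gauss(0, 0.05))
    data.append((I, n, tau))

C, a, bexp = fit_power_law(data)
```

With a real database, the predictor columns would come from the SQL "views" the abstract describes.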
David, Gergely; Freund, Patrick; Mohammadi, Siawoosh
2017-09-01
Diffusion tensor imaging (DTI) is a promising approach for investigating the white matter microstructure of the spinal cord. However, it suffers from severe susceptibility, physiological, and instrumental artifacts present in the cord. Retrospective correction techniques are popular approaches to reduce these artifacts, because they are widely applicable and do not increase scan time. In this paper, we present a novel outlier rejection approach (reliability masking) which is designed to supplement existing correction approaches by excluding irreversibly corrupted and thus unreliable data points from the DTI index maps. Then, we investigate how chains of retrospective correction techniques including (i) registration, (ii) registration and robust fitting, and (iii) registration, robust fitting, and reliability masking affect the statistical power of a previously reported finding of lower fractional anisotropy values in the posterior column and lateral corticospinal tracts in cervical spondylotic myelopathy (CSM) patients. While established post-processing steps had a small effect on the statistical power of the clinical finding (slice-wise registration: -0.5%, robust fitting: +0.6%), adding reliability masking to the post-processing chain increased it by 4.7%. Interestingly, reliability masking and registration affected the t-score metric differently: while the gain in statistical power due to reliability masking was mainly driven by decreased variability in both groups, registration slightly increased variability. In conclusion, reliability masking is particularly attractive for neuroscience and clinical research studies, as it increases statistical power by reducing group variability and thus provides a cost-efficient alternative to increasing the group size. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Nuclear power plants: 2013 atw compact statistics
Energy Technology Data Exchange (ETDEWEB)
Anon.
2014-03-15
At the end of 2013, nuclear power plants were available for energy supply in 31 countries of the world. A total of 437 nuclear power plants were in operation, with an aggregate gross power of approx. 393 GWe and an aggregate net power of approx. 372 GWe. The number was unchanged compared with the previous year's figure on 31 December 2012. The available gross power of nuclear power plants increased by approx. 2 GWe from 2012 to the end of 2013. In total, 4 nuclear generating units were commissioned in 2013, in China (+2) and in the Republic of Korea (+1). 6 nuclear generating units were decommissioned in 2013: four units in the U.S.A. (-4) were shut down for economic reasons, and in Canada (-2) the operating status of 2 units was changed from long-term shutdown to permanent shutdown. 70 nuclear generating units, with an aggregate gross power of approx. 73 GWe, were under construction in 15 countries at the end of 2013. New or continued projects were notified from (in brackets: number of new projects) China (+3), Belarus (+1), the Rep. of Korea (+1) and the United Arab Emirates (+1). Some 115 new nuclear power plants are in the concrete project design, planning and licensing phases worldwide; on some of them, contracts have already been awarded. Further units are in their preliminary project phases. (orig.)
The power and statistical behaviour of allele-sharing statistics when ...
Indian Academy of Sciences (India)
Unknown
Human Genetics Division, School of Medicine, University of Southampton, Southampton SO16 6YD, UK. Abstract ... that the statistic S-#alleles gives good performance for recessive ... (H50) of the families are linked to the single marker.
Statistical analysis of wind power in the region of Veracruz (Mexico)
Energy Technology Data Exchange (ETDEWEB)
Cancino-Solorzano, Yoreley [Departamento de Ing Electrica-Electronica, Instituto Tecnologico de Veracruz, Calzada Miguel A. de Quevedo 2779, 91860 Veracruz (Mexico); Xiberta-Bernat, Jorge [Departamento de Energia, Escuela Tecnica Superior de Ingenieros de Minas, Universidad de Oviedo, C/Independencia, 13, 2a Planta, 33004 Oviedo (Spain)
2009-06-15
The Mexican electricity sector faces the challenge of satisfying a forecast demand of 80 GW by 2016, which supposes a steady average yearly increase of some 4.9%. Capacity additions in the electricity sector over the next eight years will consist mainly of combined cycle power plants, which could threaten the country's energy supply because Mexico is not self-sufficient in natural gas. Wind energy could be a more suitable alternative to combined cycle power plants. This option is backed by market trends indicating that wind technology costs will continue to decrease in the near future, as has happened in recent years. The wind potential in different areas of the country must be evaluated in order to make the best possible use of this option. This paper gives a statistical analysis of the wind characteristics of the region of Veracruz. The daily, monthly and annual wind speed values have been studied together with their prevailing direction. The data analyzed correspond to five meteorological stations and two anemometric stations located in the aforementioned area. (author)
Directory of Open Access Journals (Sweden)
Frantál Bohumil
2016-03-01
The effect of geographical distance on the extent of the socioeconomic impacts of the Dukovany nuclear power plant in the Czech Republic is assessed by combining two different research approaches. First, we survey how people living in municipalities in the vicinity of the power plant perceive impacts on their personal quality of life. Second, we explore the effects of the power plant on regional development by analysing long-term statistical data on the unemployment rate, the share of workers in the energy sector and overall job opportunities in the respective municipalities. The results indicate that the power plant has had significant positive impacts on surrounding communities, both as perceived by residents and as evidenced by the statistical data. The level of impacts is, however, significantly influenced by the spatial and social distances of communities and individuals from the power plant. The perception of positive impacts correlates with geographical proximity to the power plant, while the hypothetical distance beyond which positive effects on the quality of life are no longer perceived was estimated at about 15 km. Positive effects are also more likely to be reported by highly educated, young and middle-aged, economically active persons whose work is connected to the power plant.
Research needs and improvement of standards for nuclear power plant design
International Nuclear Information System (INIS)
Chen, C.; Moreadith, F.L.
1978-01-01
The need for research and improvement of code requirements, for both economy and safety reasons is discussed for the following topics relevant to nuclear power plant structural analysis: Earthquake definition; dynamic behavior of reinforced concrete structures under impact loads; design for postulated pipe rupture; code requirements for loading combinations for concrete structures, reinforcing steel splicing, reinforced concrete structural design for thermal effects. (Auth.)
Austin, Peter C; Schuster, Tibor; Platt, Robert W
2015-10-15
Estimating statistical power is an important component of the design of both randomized controlled trials (RCTs) and observational studies. Methods for estimating statistical power in RCTs have been well described and can be implemented simply. In observational studies, statistical methods must be used to remove the effects of confounding that can occur due to non-random treatment assignment. Inverse probability of treatment weighting (IPTW) using the propensity score is an attractive method for estimating the effects of treatment using observational data. However, sample size and power calculations have not been adequately described for these methods. We used an extensive series of Monte Carlo simulations to compare the statistical power of an IPTW analysis of an observational study with time-to-event outcomes with that of an analysis of a similarly-structured RCT. We examined the impact of four factors on the statistical power function: number of observed events, prevalence of treatment, the marginal hazard ratio, and the strength of the treatment-selection process. We found that, on average, an IPTW analysis had lower statistical power compared to an analysis of a similarly-structured RCT. The difference in statistical power increased as the magnitude of the treatment-selection model increased. The statistical power of an IPTW analysis tended to be lower than the statistical power of a similarly-structured RCT.
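The comparison above can be sketched with a simplified Monte Carlo: a continuous outcome instead of the authors' time-to-event setting, a single normal confounder, and power taken as the probability that an estimate exceeds 1.96 Monte Carlo standard errors of the estimator. All parameter values are illustrative, not the paper's simulation design:

```python
import math, random

def sim_power(observational, n=400, tau=0.5, gamma=1.0, sims=1000, seed=7):
    """Monte Carlo power of an IPTW difference-in-means estimate versus an
    RCT: fraction of simulated studies whose estimate exceeds 1.96 Monte
    Carlo standard errors of the estimator under the design."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(sims):
        num1 = den1 = num0 = den0 = 0.0
        for _ in range(n):
            x = rng.gauss(0, 1)                         # confounder
            p = 1/(1 + math.exp(-gamma*x)) if observational else 0.5
            t = 1 if rng.random() < p else 0            # treatment assignment
            y = tau*t + x + rng.gauss(0, 1)             # outcome
            w = 1/p if t else 1/(1 - p)                 # IPT weight (=2 in RCT)
            if t:
                num1 += w*y; den1 += w
            else:
                num0 += w*y; den0 += w
        estimates.append(num1/den1 - num0/den0)
    m = sum(estimates)/sims
    sd = math.sqrt(sum((e - m)**2 for e in estimates)/(sims - 1))
    return sum(1 for e in estimates if abs(e) > 1.96*sd)/sims

power_rct = sim_power(False)
power_iptw = sim_power(True)
```

Because the weights inflate the variance of the IPTW estimator, the simulated observational power falls below the RCT power, matching the paper's qualitative finding.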
Directory of Open Access Journals (Sweden)
Abul Kalam Azad
2014-05-01
The best Weibull distribution methods for the assessment of wind energy potential at different altitudes in desired locations are statistically diagnosed in this study. Seven different methods, namely the graphical method (GM), method of moments (MOM), standard deviation method (STDM), maximum likelihood method (MLM), power density method (PDM), modified maximum likelihood method (MMLM) and equivalent energy method (EEM), were used to estimate the Weibull parameters, and six statistical tools, namely relative percentage of error, root mean square error (RMSE), mean percentage of error, mean absolute percentage of error, chi-square error and analysis of variance, were used to precisely rank the methods. The statistical fits of the measured and calculated wind speed data are assessed to judge the performance of the methods. The capacity factor and total energy generated by a small model wind turbine are calculated by numerical integration using trapezoidal sums and Simpson's rule. The results show that MOM and MLM are the most efficient methods for determining the values of k and c to fit Weibull distribution curves.
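The maximum likelihood method (MLM) and the trapezoidal capacity-factor integration lend themselves to a compact sketch. The turbine model (cubic between cut-in and rated speed) and all numeric values below are illustrative assumptions, not the study's data:

```python
import math, random

def weibull_mlm(v, iters=50):
    """Maximum likelihood estimate of Weibull shape k and scale c via the
    standard fixed-point iteration for k."""
    n = len(v)
    ln = [math.log(x) for x in v]
    k = 2.0                                   # initial guess
    for _ in range(iters):
        vk = [x**k for x in v]
        s = sum(vk)
        k = 1.0 / (sum(x*l for x, l in zip(vk, ln))/s - sum(ln)/n)
    c = (s/n) ** (1.0/k)
    return k, c

def capacity_factor(k, c, v_in=3.0, v_r=12.0, v_out=25.0, steps=400):
    """Capacity factor of an idealized turbine (cubic power rise between
    cut-in and rated, rated power up to cut-out) by trapezoidal integration
    over the Weibull pdf."""
    def pdf(v): return (k/c)*(v/c)**(k-1)*math.exp(-(v/c)**k)
    def pnorm(v):                              # power as fraction of rated
        if v < v_in or v >= v_out: return 0.0
        if v >= v_r: return 1.0
        return (v**3 - v_in**3)/(v_r**3 - v_in**3)
    h = v_out/steps
    return sum(0.5*h*(pdf(i*h)*pnorm(i*h) + pdf((i+1)*h)*pnorm((i+1)*h))
               for i in range(steps))

random.seed(3)
# synthetic hourly wind speeds drawn from a known Weibull(k=2.0, c=8.0 m/s)
speeds = [8.0*(-math.log(1 - random.random()))**(1/2.0) for _ in range(5000)]
k_hat, c_hat = weibull_mlm(speeds)
cf = capacity_factor(k_hat, c_hat)
```

A real ranking study would repeat this for each of the seven estimators and score them with the listed error statistics.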
Power-up: A Reanalysis of 'Power Failure' in Neuroscience Using Mixture Modeling.
Nord, Camilla L; Valton, Vincent; Wood, John; Roiser, Jonathan P
2017-08-23
Recently, evidence for endemically low statistical power has cast neuroscience findings into doubt. If low statistical power plagues neuroscience, then this reduces confidence in the reported effects. However, if statistical power is not uniformly low, then such blanket mistrust might not be warranted. Here, we provide a different perspective on this issue, analyzing data from an influential study reporting a median power of 21% across 49 meta-analyses (Button et al., 2013). We demonstrate, using Gaussian mixture modeling, that the sample of 730 studies included in that analysis comprises several subcomponents, so the use of a single summary statistic is insufficient to characterize the nature of the distribution. We find that statistical power is extremely low for studies included in meta-analyses that reported a null result and that it varies substantially across subfields of neuroscience, with particularly low power in candidate gene association studies. Therefore, whereas power in neuroscience remains a critical issue, the notion that studies are systematically underpowered is not the full story: low power is far from a universal problem. SIGNIFICANCE STATEMENT Recently, researchers across the biomedical and psychological sciences have become concerned with the reliability of results. One marker for reliability is statistical power: the probability of finding a statistically significant result given that the effect exists. Previous evidence suggests that statistical power is low across the field of neuroscience. Our results present a more comprehensive picture of statistical power in neuroscience: on average, studies are indeed underpowered, some very seriously so, but many studies show acceptable or even exemplary statistical power. We show that this heterogeneity in statistical power is common across most subfields in neuroscience. This new, more nuanced picture of statistical power in neuroscience could affect not only scientific understanding, but potentially ...
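The mixture-modeling idea can be illustrated with a small expectation-maximization fit on synthetic "study power" values; the real analysis used the 730 studies from Button et al., which are not reproduced here:

```python
import math, random

def em_gmm2(data, iters=200):
    """EM for a two-component 1-D Gaussian mixture: returns sorted component
    means, the component standard deviations, and the first weight."""
    mu = [min(data), max(data)]
    sd = [0.1, 0.1]
    w = 0.5
    for _ in range(iters):
        resp = []
        for x in data:                         # E-step: responsibilities
            p0 = w * math.exp(-0.5*((x - mu[0])/sd[0])**2) / sd[0]
            p1 = (1 - w) * math.exp(-0.5*((x - mu[1])/sd[1])**2) / sd[1]
            resp.append(p0 / (p0 + p1))
        n0 = sum(resp); n1 = len(data) - n0    # M-step: update parameters
        mu[0] = sum(r*x for r, x in zip(resp, data)) / n0
        mu[1] = sum((1 - r)*x for r, x in zip(resp, data)) / n1
        sd[0] = math.sqrt(sum(r*(x - mu[0])**2 for r, x in zip(resp, data)) / n0)
        sd[1] = math.sqrt(sum((1 - r)*(x - mu[1])**2 for r, x in zip(resp, data)) / n1)
        w = n0 / len(data)
    return sorted(mu), sd, w

random.seed(11)
# synthetic power values: a low-powered and a high-powered subcomponent
powers = ([min(max(random.gauss(0.15, 0.05), 0.01), 0.99) for _ in range(400)] +
          [min(max(random.gauss(0.80, 0.08), 0.01), 0.99) for _ in range(200)])
means, sds, weight = em_gmm2(powers)
```

A single median of `powers` would hide exactly the two-component structure the fit recovers, which is the paper's point about summary statistics.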
DEFF Research Database (Denmark)
Zhai, Weiwei; Nielsen, Rasmus; Slatkin, Montgomery
2009-01-01
In this report, we investigate the statistical power of several tests of selective neutrality based on patterns of genetic diversity within and between species. The goal is to compare tests based solely on population genetic data with tests using comparative data or a combination of comparative and population genetic data. We show that in the presence of repeated selective sweeps on a relatively neutral background, tests based on the d(N)/d(S) ratios in comparative data almost always have more power to detect selection than tests based on population genetic data, even if the overall level of divergence ... selection. The Hudson-Kreitman-Aguadé test is the most powerful test for detecting positive selection among the population genetic tests investigated, whereas the McDonald-Kreitman test typically has more power to detect negative selection. We discuss our findings in the light of the discordant results obtained ...
Wind Turbine Power Curves Incorporating Turbulence Intensity
DEFF Research Database (Denmark)
Sørensen, Emil Hedevang Lohse
2014-01-01
The performance of a wind turbine in terms of power production (the power curve) is important to the wind energy industry. The current IEC-61400-12-1 standard for power curve evaluation recognizes only the mean wind speed at hub height and the air density as relevant to the power production ... The model and method are parsimonious in the sense that only a single function (the zero-turbulence power curve) and a single auxiliary parameter (the equivalent turbulence factor) are needed to predict the mean power at any desired turbulence intensity. The method requires only ten minute statistics ...
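The core of the approach, averaging a zero-turbulence power curve over the within-ten-minute wind speed distribution, can be sketched as follows. The cubic curve and the Gaussian speed model are simplifying assumptions, and the full method additionally calibrates the equivalent turbulence factor:

```python
import math

def p0(v, v_in=3.0, v_r=12.0, v_out=25.0):
    """A hypothetical zero-turbulence power curve (fraction of rated power):
    cubic between cut-in and rated, flat up to cut-out."""
    if v < v_in or v >= v_out: return 0.0
    if v >= v_r: return 1.0
    return (v**3 - v_in**3) / (v_r**3 - v_in**3)

def mean_power(v, ti, steps=2000):
    """Expected power at ten-minute mean speed v and turbulence intensity ti,
    averaging the zero-turbulence curve over a Gaussian speed distribution
    with standard deviation ti*v (trapezoidal quadrature over +/- 5 sigma)."""
    if ti == 0: return p0(v)
    sd = ti * v
    lo, hi = max(0.0, v - 5*sd), v + 5*sd
    h = (hi - lo) / steps
    def f(u):
        return p0(u) * math.exp(-0.5*((u - v)/sd)**2) / (sd*math.sqrt(2*math.pi))
    return sum(0.5*h*(f(lo + i*h) + f(lo + (i+1)*h)) for i in range(steps))
```

Because the curve is convex below rated speed, turbulence raises the mean power there and lowers it around the rated-speed knee, which is the turbulence effect the method corrects for.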
Energy Technology Data Exchange (ETDEWEB)
Miller, R.J.; Najaf-Zadeh, K.; Darlington, H.T.; McNair, H.D.; Seidenstein, S.; Williams, A.R.
1982-10-01
Human factors is a systems-oriented interdisciplinary specialty concerned with the design of systems, equipment, facilities and the operational environment. An important aspect leading to the design requirements is the determination of the information requirements for electric power dispatch control centers. There are significant differences between the system operator's actions during normal and degraded states of power system operation, and power system restoration. This project evaluated the information the operator requires for normal power system and control system operations and investigated the changes in information required by the operator as the power system and/or the control system degrades from a normal operating state. The Phase II study, published in two volumes, defines power system states and control system conditions to which operator information content can be related. This volume presents detailed data concerning operator information needs that identify the needs for and the uses of power system information by a system operator in conditions ranging from normal through degraded operation. The study defines power system states and control system conditions to which operator information content can be related, and it identifies the requisite information as consistent with current industry practice so as to aid control system designers. Training requirements are also included for planning entry-level and follow-on training for operators.
Energy Technology Data Exchange (ETDEWEB)
Tolbert, L.M.
2005-12-21
Power electronics can provide utilities the ability to more effectively deliver power to their customers while providing increased reliability to the bulk power system. In general, power electronics is the process of using semiconductor switching devices to control and convert electrical power flow from one form to another to meet a specific need. These conversion techniques have revolutionized modern life by streamlining manufacturing processes, increasing product efficiencies, and increasing the quality of life by enhancing many modern conveniences such as computers, and they can help to improve the delivery of reliable power from utilities. This report summarizes the technical challenges associated with utilizing power electronics devices across the entire spectrum from applications to manufacturing and materials development, and it provides recommendations for research and development (R&D) needs for power electronics systems in which the U.S. Department of Energy (DOE) could make a substantial impact toward improving the reliability of the bulk power system.
Nuclear power plant security systems - The need for upgrades
International Nuclear Information System (INIS)
Murskyj, M.P.; Furlow, C.H.
1989-01-01
Most perimeter security systems for nuclear power plants were designed and installed in the late 1970s or early 1980s. This paper explores the need to regularly evaluate and possibly upgrade a security system in the area of perimeter intrusion detection and surveillance. It discusses US Nuclear Regulatory Commission audits and regulatory effectiveness reviews (RERs), which have raised issues regarding the performance of perimeter security systems. The audits and RERs identified various degrees of vulnerability in certain aspects of existing perimeter security systems. In addition to reviewing the regulatory concerns, the paper discusses other reasons to evaluate and/or upgrade a perimeter security system.
A powerful score-based test statistic for detecting gene-gene co-association.
Xu, Jing; Yuan, Zhongshang; Ji, Jiadong; Zhang, Xiaoshuai; Li, Hongkai; Wu, Xuesen; Xue, Fuzhong; Liu, Yanxun
2016-01-29
The genetic variants identified by genome-wide association studies (GWAS) can only account for a small proportion of the total heritability of complex disease. The existence of gene-gene joint effects, which contain the main effects and their co-association, is one possible explanation for the "missing heritability" problem. Gene-gene co-association refers to the extent to which the joint effects of two genes differ from the main effects, owing not only to the traditional interaction under nearly independent conditions but also to the correlation between genes. Generally, genes tend to work collaboratively within a specific pathway or network contributing to the disease, and the specific disease-associated loci will often be highly correlated (e.g. single nucleotide polymorphisms (SNPs) in linkage disequilibrium). Therefore, we proposed a novel score-based statistic (SBS) as a gene-based method for detecting gene-gene co-association. Various simulations illustrate that, under different sample sizes, marginal effects of causal SNPs and co-association levels, the proposed SBS performs better than other existing methods, including single SNP-based and principal component analysis (PCA)-based logistic regression models, the statistics based on canonical correlations (CCU), kernel canonical correlation analysis (KCCU), partial least squares path modeling (PLSPM) and the delta-square (δ²) statistic. The real data analysis of rheumatoid arthritis (RA) further confirmed its advantages in practice. SBS is a powerful and efficient gene-based method for detecting gene-gene co-association.
S.B. Doeze Jager-van Vliet (Sandra); M.Ph. Born (Marise); H.T. van der Molen (Henk)
2017-01-01
The present study focused on self-other agreement between employees on their Need for Achievement, Need for Power and Need for Affiliation, needs that are relevant for performance and wellbeing at work. The Social Relations Model was used to examine consensus between other-raters, ...
Energy Technology Data Exchange (ETDEWEB)
Miller, R.J.; Najaf-Zadeh, K.; Darlington, H.T.; McNair, H.D.; Seidenstein, S.; Williams, A.R.
1982-10-01
Human factors is a systems-oriented interdisciplinary specialty concerned with the design of systems, equipment, facilities and the operational environment. An important aspect leading to the design requirements is the determination of the information requirements for electric power dispatch control centers. There are significant differences between the system operator's actions during normal and degraded states of power system operation, and power system restoration. This project evaluated the information the operator requires for normal power system and control system operations and investigated the changes in information required by the operator as the power system and/or the control system degrades from a normal operating state. The Phase II study, published in two volumes, defines power system states and control system conditions to which operator information content can be related. This volume presents a summary of operator information needs, identifying the needs for and the uses of power system information by a system operator in conditions ranging from normal through degraded operation. Training requirements are also included for planning entry-level and follow-on training for operators.
Efficient evaluation of angular power spectra and bispectra
Assassi, Valentin; Simonović, Marko; Zaldarriaga, Matias
2017-11-01
Angular statistics of cosmological observables are hard to compute. The main difficulty is due to the presence of highly oscillatory Bessel functions that need to be integrated over. In this paper, we provide a simple and fast method to compute the angular power spectrum and bispectrum of any observable. The method is based on using an FFTlog algorithm to decompose the momentum-space statistics onto a basis of power-law functions. For each power law, the integrals over Bessel functions have a simple analytical solution. This allows us to efficiently evaluate these integrals, independently of the value of the multipole l. In particular, this method significantly speeds up the evaluation of the angular bispectrum compared to existing methods. To illustrate our algorithm, we compute the galaxy, lensing and CMB temperature angular power spectra and bispectra.
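The key FFTlog step, decomposing a momentum-space function onto complex power laws k^(ν+iη) so that each Bessel integral acquires an analytic solution, can be sketched with a plain DFT. The function below is an arbitrary smooth example, not a real spectrum:

```python
import cmath, math

def powerlaw_decompose(f, k0, k1, N, nu=0.0):
    """Decompose f(k), sampled on a log-spaced grid, onto complex power laws:
    f(k_j) = k_j^nu * sum_m c_m (k_j/k0)^(i*eta_m). nu is a real bias that
    tames the growth/decay of the samples. The DFT here is O(N^2) for
    clarity; FFTlog uses an FFT."""
    dlnk = math.log(k1/k0) / (N - 1)
    ks = [k0 * math.exp(a*dlnk) for a in range(N)]
    g = [f(k) / k**nu for k in ks]                  # biased samples
    c = [sum(g[a] * cmath.exp(-2j*math.pi*a*m/N) for a in range(N)) / N
         for m in range(N)]
    etas = [2*math.pi*m / (N*dlnk) for m in range(N)]
    return ks, c, etas

def reconstruct(k, k0, c, etas, nu=0.0):
    """Evaluate the power-law expansion at k. In the full method, each term
    k^(nu + i*eta_m) is what makes the Bessel integrals analytic."""
    return (k**nu * sum(cm * cmath.exp(1j*eta*math.log(k/k0))
                        for cm, eta in zip(c, etas))).real

# example: a smooth, power-law-suppressed toy "spectrum"
spec = lambda k: k**0.96 / (1 + (k/0.1)**2)
ks, c, etas = powerlaw_decompose(spec, 1e-3, 1e1, 64, nu=-1.0)
err = max(abs(reconstruct(k, 1e-3, c, etas, nu=-1.0) - spec(k)) for k in ks)
```

At the grid points the expansion reproduces the sampled function exactly (it is an inverse DFT), which is what licenses replacing the function by power laws inside the Bessel integrals.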
Wind energy in electric power production. Preliminary study
Energy Technology Data Exchange (ETDEWEB)
Lento, R; Peltola, E
1984-01-15
The wind speed conditions in Finland have been studied with the aid of the existing statistics of the Finnish Meteorological Institute, and estimates of the available wind energy were made from the same statistics. 800 wind power plants of 1.5 MW each on the windiest west coast would produce about 2 TWh of energy per year. When the available wind energy is estimated, when wind power plants are dimensioned optimally, and when suitable locations are chosen for them, far more information is needed on the temporal, geographical and vertical distribution of the wind speed than the present statistics include. The investment costs of a wind power plant increase when the height of the tower or the diameter of the rotor is increased, but the energy production increases, too. Thus, overdimensioning the wind power plant with respect to energy needs or the wind conditions causes extra costs. The cost of energy produced by wind power cannot yet compete with conventional energy, but the situation changes to the advantage of wind energy if the real price of the plants decreases (among other things due to large series production and increasing experience), or if the real price of fuels rises. The environmental inconvenience caused by wind power plants is considered insignificant. The noise caused by a plant attenuates rapidly with distance. No harmful effects on birds and other animals caused by wind power plants have been observed in studies made abroad. Parts of a plant coming loose during an accident, or ice forming on the blades, are estimated to fly only a few hundred meters, even from a large plant.
Statistical Multipath Model Based on Experimental GNSS Data in Static Urban Canyon Environment
Directory of Open Access Journals (Sweden)
Yuze Wang
2018-04-01
A deep understanding of multipath characteristics is essential to the design of signal simulators and receivers in global navigation satellite system applications. As a new constellation is deployed and more applications occur in the urban environment, the statistical multipath models of navigation signals need further study. In this paper, we present statistical distribution models of multipath time delay, multipath power attenuation, and multipath fading frequency based on experimental data from the urban canyon environment. The raw data on multipath characteristics were obtained by processing real navigation signals to study the statistical distribution. Fitting the statistical data shows that the probability distribution of time delay follows a gamma distribution, which is related to the waiting time of Poisson-distributed events. The fading frequency follows an exponential distribution, and the mean multipath power attenuation decreases linearly with increasing time delay. In addition, the detailed statistical characteristics for satellites at different elevations and in different orbits are studied, and the parameters of each distribution are quite different. The research results give useful guidance for navigation simulator and receiver designers.
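The reported distribution families (gamma-distributed delays, exponential fading frequency, linearly decaying mean attenuation) suggest a simple sampler of the kind a simulator designer might use; the parameter values below are placeholders, not the paper's fitted values:

```python
import random

def sample_multipath(n, shape=2.0, scale=40.0, fade_mean=0.5,
                     slope=-0.05, sigma=1.0, seed=5):
    """Draw n multipath components following the distribution families
    reported in the paper: gamma-distributed excess delay (ns),
    exponentially distributed fading frequency (Hz), and mean power
    attenuation (dB) decreasing linearly with delay plus Gaussian scatter.
    All numeric parameters here are illustrative assumptions."""
    rng = random.Random(seed)
    comps = []
    for _ in range(n):
        delay = rng.gammavariate(shape, scale)        # ns; mean = shape*scale
        fade = rng.expovariate(1.0/fade_mean)         # Hz; mean = fade_mean
        atten = slope*delay + rng.gauss(0, sigma)     # dB relative to LOS
        comps.append((delay, fade, atten))
    return comps

paths = sample_multipath(20000)
mean_delay = sum(p[0] for p in paths) / len(paths)    # expect about 80 ns
mean_fade = sum(p[1] for p in paths) / len(paths)     # expect about 0.5 Hz
```

In a real simulator, one such parameter set would be fitted per elevation band and orbit type, since the paper finds the parameters differ substantially across them.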
A statistical model for field emission in superconducting cavities
International Nuclear Information System (INIS)
Padamsee, H.; Green, K.; Jost, W.; Wright, B.
1993-01-01
A statistical model is used to account for several features of the performance of an ensemble of superconducting cavities. The input parameters are: the number of emitters per area, a distribution function for emitter β values, a distribution function for emissive areas, and a processing threshold. The power deposited by emitters is calculated from the field emission current and the electron impact energy. The model can successfully account for the fraction of tests that reach the maximum field Epk in an ensemble of cavities, e.g. 1-cell cavities at 3 GHz or 5-cell cavities at 1.5 GHz. The model is used to predict the level of power needed to successfully process cavities of various surface areas with high pulsed power processing (HPP).
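A stripped-down version of such a statistical model can be simulated directly: emitters arrive as a Poisson process in surface area, each carries a random field-enhancement factor β, and the worst emitter limits the achievable field. The distributions and all numbers below are illustrative placeholders, not the paper's calibrated inputs (which also include emissive areas and a processing threshold):

```python
import math, random

def poisson(lam, rng):
    """Knuth's Poisson sampler (adequate for the modest means used here)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def fraction_reaching(e_target, area_cm2, emitters_per_cm2=0.1,
                      beta_mean=150.0, e_breakdown=30e3, trials=4000, seed=2):
    """Fraction of cavities in a simulated ensemble that reach the peak field
    e_target: a cavity is limited by its worst emitter, which is assumed to
    quench the cavity once the locally enhanced field beta*E exceeds a
    breakdown threshold. Exponential beta distribution is an assumption."""
    rng = random.Random(seed)
    reached = 0
    for _ in range(trials):
        n = poisson(emitters_per_cm2 * area_cm2, rng)
        betas = [rng.expovariate(1.0/beta_mean) for _ in range(n)]
        e_max = min((e_breakdown/b for b in betas), default=float("inf"))
        if e_max >= e_target:
            reached += 1
    return reached / trials

frac_small = fraction_reaching(80.0, area_cm2=100.0)   # e.g. a 1-cell cavity
frac_large = fraction_reaching(80.0, area_cm2=500.0)   # e.g. a 5-cell cavity
```

Consistent with the model's premise, the larger-area (multi-cell) ensemble reaches a given target field less often, which is why processing power requirements grow with surface area.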
IAEA releases nuclear power statistics for 2000
International Nuclear Information System (INIS)
2001-01-01
According to data reported to the IAEA Power Reactor Information System, a total of 438 NPPs were operating around the world at the end of 2000. The total installed power from NPPs was 351 GWe. During 2000, six plants were connected to the grid and construction of three new nuclear reactors started, bringing the total number of reactors under construction to 31. Worldwide in 2000, total nuclear generated electricity increased to 2447.53 terawatt-hours. Cumulative worldwide operating experience from civil nuclear power reactors at the end of 2000 exceeded 9800 reactor years.
GC Side Event: Africa's Energy Needs and the Potential Role of Nuclear Power. Presentations
International Nuclear Information System (INIS)
2015-01-01
Energy is central to development, and energy availability, accessibility and affordability are central challenges for most African countries. Due to rapidly growing energy demand, the need for socioeconomic development, persistent concerns over climate change and environmental impacts and dependence on imported supplies of fossil fuels, African Member States are looking into possible options to secure sustainable energy supplies, including nuclear energy. The IAEA assists those countries in assessing the nuclear power option and building the necessary infrastructure for a safe, secure and sustainable nuclear power programme. This year, the IAEA is conducting Integrated Nuclear Infrastructure Review (INIR) missions to three African countries (Nigeria, Kenya and Morocco) considering introducing nuclear power. The side event presents recent updates from Africa on the potential role of nuclear power, including the IAEA Third Regional Conference on Energy and Nuclear Power in Africa, held in Mombasa, Kenya, in April 2015, an initiative to launch a new African network for enhancing nuclear power programme development, and others. The event reports on recent developments in several African Member States considering, embarking on, or expanding national nuclear power programmes.
Statistical models for thermal ageing of steel materials in nuclear power plants
International Nuclear Information System (INIS)
Persoz, M.
1996-01-01
Some categories of steel materials in nuclear power plants may be subject to thermal ageing, whose extent depends on the steel's chemical composition and on the ageing parameters, i.e. temperature and duration. This ageing affects the impact strength of the materials, which is a mechanical property. In order to assess the residual lifetime of these components, a probabilistic study has been launched which takes into account the scatter in the input parameters of the mechanical model. Predictive formulae for estimating the impact strength of aged materials are important input data for the model. A database has been created with impact strength results obtained from a laboratory ageing programme, and statistical treatments have been undertaken. Two kinds of models have been developed with nonlinear regression methods (PROC NLIN, available in SAS/STAT). The first, using a hyperbolic tangent function, is partly based on physical considerations; the second, of exponential type, is purely statistical. The difficulties consist in selecting the significant parameters and assigning initial values to the coefficients, which the NLIN procedure requires. This global statistical analysis has led to general models that are functions of the chemical variables and the ageing parameters. These models are as precise as (if not more precise than) local models developed earlier for specific values of ageing temperature and duration. This paper describes the data and the methodology used to build the models and analyses the results given by the SAS system. (author)
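The kind of nonlinear fit described (a hyperbolic-tangent model fitted with a procedure such as PROC NLIN) can be sketched with a separable least-squares search: grid the nonlinear parameters and solve the linear ones exactly at each grid point, sidestepping the initial-value difficulty the abstract mentions. The model coefficients and data below are synthetic, not the ageing database:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic "impact strength vs. log ageing time" data; the tanh model form
# mirrors the paper's first model family, but these coefficients are invented.
a0, b0, c0, d0 = 60.0, -40.0, 3.0, 0.5
logt = np.linspace(1.0, 5.0, 60)
y = a0 + b0 * np.tanh((logt - c0) / d0) + rng.normal(0.0, 1.0, logt.size)

# Separable least squares: grid-search the nonlinear (c, d) parameters,
# solving for the linear (a, b) parameters exactly at each grid point.
best = (np.inf, None)
for c in np.linspace(2.0, 4.0, 81):
    for d in np.linspace(0.2, 1.0, 33):
        X = np.column_stack([np.ones_like(logt), np.tanh((logt - c) / d)])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        sse = np.sum((X @ coef - y) ** 2)
        if sse < best[0]:
            best = (sse, (coef[0], coef[1], c, d))

a, b, c, d = best[1]
print(f"a~{a:.1f} b~{b:.1f} c~{c:.2f} d~{d:.2f}")
```

A dedicated nonlinear solver converges faster, but this formulation never needs starting guesses for the coefficients.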
Kruschke, John K; Liddell, Torrin M
2018-02-01
In the practice of data analysis, there is a conceptual distinction between hypothesis testing, on the one hand, and estimation with quantified uncertainty on the other. Among frequentists in psychology, a shift of emphasis from hypothesis testing to estimation has been dubbed "the New Statistics" (Cumming 2014). A second conceptual distinction is between frequentist methods and Bayesian methods. Our main goal in this article is to explain how Bayesian methods achieve the goals of the New Statistics better than frequentist methods. The article reviews frequentist and Bayesian approaches to hypothesis testing and to estimation with confidence or credible intervals. The article also describes Bayesian approaches to meta-analysis, randomized controlled trials, and power analysis.
Sparse Power-Law Network Model for Reliable Statistical Predictions Based on Sampled Data
Directory of Open Access Journals (Sweden)
Alexander P. Kartun-Giles
2018-04-01
Full Text Available A projective network model is a model that enables predictions to be made based on a subsample of the network data, with the predictions remaining unchanged if a larger sample is taken into consideration. An exchangeable model is a model that does not depend on the order in which nodes are sampled. Despite a large variety of non-equilibrium (growing) and equilibrium (static) sparse complex network models that are widely used in network science, how to reconcile sparseness (constant average degree) with the desired statistical properties of projectivity and exchangeability is currently an outstanding scientific problem. Here we propose a network process with hidden variables which is projective and can generate sparse power-law networks. Despite the model not being exchangeable, it can be closely related to exchangeable uncorrelated networks as indicated by its information theory characterization and its network entropy. The use of the proposed network process as a null model is here tested on real data, indicating that the model offers a promising avenue for statistical network modelling.
Statistical Analysis of Solar PV Power Frequency Spectrum for Optimal Employment of Building Loads
Energy Technology Data Exchange (ETDEWEB)
Olama, Mohammed M [ORNL; Sharma, Isha [ORNL; Kuruganti, Teja [ORNL; Fugate, David L [ORNL
2017-01-01
In this paper, a statistical analysis of the frequency spectrum of solar photovoltaic (PV) power output is conducted. This analysis quantifies the frequency content that can be used for purposes such as developing optimal employment of building loads and distributed energy resources. One year of solar PV power output data was collected and analyzed at one-second resolution to find ideal bounds and levels for the different frequency components. The annual, seasonal, and monthly statistics of the PV frequency content are computed and illustrated in boxplot format. To examine the compatibility of building loads for PV consumption, a spectral analysis of building loads such as Heating, Ventilation and Air-Conditioning (HVAC) units and water heaters was performed. This defined the bandwidth over which these devices can operate. Results show that nearly all of the PV output (about 98%) is contained within frequencies lower than 1 mHz (equivalent to ~15 min), which is compatible with consumption by local building loads such as HVAC units and water heaters. Medium frequencies in the range of ~15 min to ~1 min are likely to be suitable for consumption by fan equipment of variable air volume HVAC systems that have time constants in the range of a few seconds to a few minutes. This study indicates that most of the PV generation can be consumed by building loads with the help of proper control strategies, thereby reducing the impact on the grid and the size of storage systems.
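The headline figure (most PV power below 1 mHz) corresponds to a simple spectral-fraction computation. The sketch below uses a synthetic one-day, 1 Hz trace rather than the collected PV data:

```python
import numpy as np

rng = np.random.default_rng(3)

# One day of synthetic 1-second PV output: a slow diurnal component plus
# a small amount of fast cloud-like ripple (illustrative, not measured data).
fs = 1.0                                   # 1 Hz sampling
t = np.arange(86400) / fs
slow = np.sin(np.pi * t / 86400)           # diurnal envelope (>= 0 over one day)
fast = 0.02 * rng.standard_normal(t.size)  # high-frequency ripple
p = slow + fast

# Power spectrum via the FFT; fraction of (non-DC) spectral power below 1 mHz.
spec = np.abs(np.fft.rfft(p - p.mean())) ** 2
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
frac = spec[(freqs > 0) & (freqs < 1e-3)].sum() / spec[freqs > 0].sum()
print(f"fraction of power below 1 mHz: {frac:.3f}")
```

Repeating this per day/season and box-plotting the fractions reproduces the kind of summary the paper reports; real PV traces add cloud-driven mid-frequency content that this toy signal lacks.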
Gaskin, Cadeyrn J; Happell, Brenda
2013-02-01
Having sufficient power to detect effect sizes of an expected magnitude is a core consideration when designing studies in which inferential statistics will be used. The main aim of this study was to investigate the statistical power in studies published in the International Journal of Mental Health Nursing. From volumes 19 (2010) and 20 (2011) of the journal, studies were analysed for their power to detect small, medium, and large effect sizes, according to Cohen's guidelines. The power of the 23 studies included in this review to detect small, medium, and large effects was 0.34, 0.79, and 0.94, respectively. In 90% of papers, no adjustments for experiment-wise error were reported. With a median of nine inferential tests per paper, the mean experiment-wise error rate was 0.51. A priori power analyses were only reported in 17% of studies. Although effect sizes for correlations and regressions were routinely reported, effect sizes for other tests (χ²-tests, t-tests, ANOVA/MANOVA) were largely absent from the papers. All types of effect sizes were infrequently interpreted. Researchers are strongly encouraged to conduct power analyses when designing studies, and to avoid scattergun approaches to data analysis (i.e. undertaking large numbers of tests in the hope of finding 'significant' results). Because reviewing effect sizes is essential for determining the clinical significance of study findings, researchers would better serve the field of mental health nursing if they reported and interpreted effect sizes. © 2012 The Authors. International Journal of Mental Health Nursing © 2012 Australian College of Mental Health Nurses Inc.
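Power figures of this kind can be reproduced approximately with the normal approximation to the two-sample t-test (a slight overestimate relative to the exact noncentral-t calculation):

```python
from statistics import NormalDist

def two_sample_power(d, n_per_group, alpha=0.05):
    """Approximate power of a two-sided, two-sample t-test at Cohen's d,
    using the normal approximation to the noncentral t distribution."""
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)
    ncp = d * (n_per_group / 2) ** 0.5  # noncentrality parameter
    return 1 - NormalDist().cdf(z_crit - ncp)

# Cohen's benchmarks: small 0.2, medium 0.5, large 0.8, at n = 30 per group.
for d in (0.2, 0.5, 0.8):
    print(d, round(two_sample_power(d, 30), 2))
```

Inverting the same formula for n gives the a priori sample-size planning the authors call for: e.g. detecting d = 0.5 with 80% power needs roughly 64 participants per group.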
Dark matter statistics for large galaxy catalogs: power spectra and covariance matrices
Klypin, Anatoly; Prada, Francisco
2018-06-01
Large-scale surveys of galaxies require accurate theoretical predictions of the dark matter clustering for thousands of mock galaxy catalogs. We demonstrate that this goal can be achieved with the new Parallel Particle-Mesh (PM) N-body code GLAM at a very low computational cost. We run ~22,000 simulations with ~2 billion particles that provide ~1% accuracy of the dark matter power spectra P(k) for wave-numbers up to k ~ 1 h Mpc^-1. Using this large dataset we study the power spectrum covariance matrix. In contrast to many previous analytical and numerical results, we find that the covariance matrix normalised to the power spectrum, C(k, k′)/P(k)P(k′), has a complex structure of non-diagonal components: an upturn at small k, followed by a minimum at k ≈ 0.1-0.2 h Mpc^-1, and a maximum at k ≈ 0.5-0.6 h Mpc^-1. The normalised covariance matrix evolves strongly with redshift: C(k, k′) ∝ δ^α(t) P(k)P(k′), where δ is the linear growth factor and α ≈ 1-1.25, which indicates that the covariance matrix depends on cosmological parameters. We also show that waves longer than 1 h^-1 Gpc have very little impact on the power spectrum and covariance matrix. This significantly reduces the computational costs and complexity of theoretical predictions: relatively small-volume ~(1 h^-1 Gpc)^3 simulations capture the necessary properties of dark matter clustering statistics. Our results also indicate that achieving ~1% errors in the covariance matrix for k < 0.50 h Mpc^-1 requires a resolution better than ε ~ 0.5 h^-1 Mpc.
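The covariance estimate itself is a standard ensemble computation over realizations; a toy version with made-up mock spectra (not GLAM outputs) looks like:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy ensemble: many "mock" power spectra with correlated fluctuations,
# standing in for the simulation realizations (all numbers illustrative).
n_real, n_k = 2000, 20
p_mean = 1.0 / (1.0 + np.arange(n_k))            # smooth fiducial P(k)
fluct = rng.standard_normal((n_real, n_k))
fluct += 0.3 * rng.standard_normal((n_real, 1))  # shared mode -> off-diagonal terms
spectra = p_mean * (1.0 + 0.05 * fluct)

# Covariance across realizations, then the normalized form C(k,k')/P(k)P(k').
cov = np.cov(spectra, rowvar=False)
norm_cov = cov / np.outer(p_mean, p_mean)
print(norm_cov.shape)
```

The shared mode injected above is a crude stand-in for the mode coupling that produces the non-diagonal structure the paper maps out; with real simulation outputs the same two lines of linear algebra give the survey covariance.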
2017-12-08
STATISTICAL LINEAR TIME-VARYING SYSTEM MODEL OF HIGH GRAZING ANGLE SEA CLUTTER FOR COMPUTING INTERFERENCE POWER. 1. INTRODUCTION Statistical linear time... beam. We can approximate one of the sinc factors using the Dirichlet kernel to facilitate computation of the integral in (6) as follows: |sinc(W_B ...)| ... plotted in Figure 4. The resultant autocorrelation can then be found by substituting (18) into (28). The Python code used to generate Figures 1-4 is found...
Sapundzhiev, M.; Evtimov, I.; Ivanov, R.
2017-10-01
The paper presents an upgraded methodology for determining the required electric motor power, considering the time for acceleration. The influence of the motor's speed factor on the power needed for the same acceleration time is studied. Calculations based on a real vehicle were made, and numerical and graphical results are given. They show a decrease in the needed power as the speed factor of the motor increases, because a high speed factor allows a larger range of the characteristic to be used at the motor's maximum power. An experimental verification of the methodology was performed.
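A minimal energy-balance sketch of the power-from-acceleration-time idea, ignoring the rolling and aerodynamic resistances (and the speed-factor refinement) that the full methodology accounts for; the vehicle figures and efficiency are assumed:

```python
def accel_power_kw(mass_kg, v_final_ms, t_accel_s, efficiency=0.9):
    """Average motor power needed to reach v_final in t_accel, counting
    only kinetic energy (driving resistances ignored in this sketch)."""
    kinetic_j = 0.5 * mass_kg * v_final_ms**2
    return kinetic_j / (t_accel_s * efficiency) / 1000.0

# e.g. a 1500 kg EV reaching 100 km/h (27.8 m/s) in 10 s
print(round(accel_power_kw(1500, 27.8, 10), 1), "kW")
```

The paper's point is that this lower bound can be met by a smaller rated motor when the speed factor is high, because the motor then operates at maximum power over more of the speed range.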
International Nuclear Information System (INIS)
Samanta, P.K.; Teichmann, T.
1990-01-01
In this paper, a multivariate statistical method is presented and demonstrated as a means for analyzing nuclear power plant transients (or events) and safety system performance, for detection of malfunctions and degradations within the course of the event based on operational data. The study provides the methodology and illustrative examples based on data gathered from simulation of nuclear power plant transients (due to the lack of easily accessible operational data). Such an approach, once fully developed, can be used to detect failure trends and patterns and so can lead to the prevention of conditions with serious safety implications.
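One common multivariate screen of this kind scores each event by its Mahalanobis distance from the ensemble behaviour. The sketch below uses simulated data and is not the authors' specific method:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic "transient signatures": rows are events, columns are plant
# parameters (e.g. pressures, flows); one degraded event is injected.
normal = rng.multivariate_normal([0.0, 0.0, 0.0], np.eye(3) * 0.5, size=200)
events = np.vstack([normal, [3.0, -3.0, 3.0]])   # last row: malfunction

# Mahalanobis distance of each event from the ensemble mean.
mean = events.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(events, rowvar=False))
diff = events - mean
d2 = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)

print("most anomalous event index:", int(d2.argmax()))
```

Events with distances far out in the tail of the reference distribution are the candidates for the malfunction/degradation flags the abstract describes.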
International Nuclear Information System (INIS)
2003-01-01
The energy statistics table is a selection of statistical data by energy source and country from 1997 to 2002. It covers petroleum, natural gas, coal and electric power: production, foreign trade, consumption per sector, the 2002 energy balance, and graphs of long-term forecasts. (A.L.B.)
International Nuclear Information System (INIS)
Eliazar, Iddo
2017-01-01
The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
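A harmonic Poisson process is easy to simulate: on an interval (a, b) the intensity 1/x integrates to ln(b/a), so the count is Poisson with that mean and the points are exponentials of uniform draws. A minimal sketch illustrating the scale invariance:

```python
import math
import numpy as np

rng = np.random.default_rng(6)

def harmonic_poisson(a, b):
    """One realization of a Poisson process on (a, b) with intensity 1/x:
    expected count is ln(b/a); points are exp(Uniform(ln a, ln b)),
    which have density proportional to 1/x on (a, b)."""
    n = rng.poisson(math.log(b / a))
    return np.exp(rng.uniform(math.log(a), math.log(b), size=n))

# Scale invariance: (1, 10), (10, 100), ... all have expected count ln(10).
counts = [harmonic_poisson(10.0**k, 10.0**(k + 1)).size for k in range(5)]
print(counts, "expected ~", round(math.log(10), 2), "per decade")
```

Each decade carrying the same expected number of points is the scale-invariance property the paper connects to Benford's law and 1/f noise.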
A Study of Distance Education for the Needs of the Nuclear Power Industry
Reckline, Sigmund Joseph
2010-01-01
This research presents an examination of student satisfaction related to online training for adult learners in the nuclear power industry. Both groups, the nuclear industry and its associated workforce, have demonstrable needs which might be met by such programs. The nuclear industry itself faces an expansion of facilities and services combined…
Applying contemporary statistical techniques
Wilcox, Rand R
2003-01-01
Applying Contemporary Statistical Techniques explains why traditional statistical methods are often inadequate or outdated when applied to modern problems. Wilcox demonstrates how new and more powerful techniques address these problems far more effectively, making these modern robust methods understandable, practical, and easily accessible.
* Assumes no previous training in statistics
* Explains how and why modern statistical methods provide more accurate results than conventional methods
* Covers the latest developments on multiple comparisons
* Includes recent advanc
Excel 2007 for Business Statistics A Guide to Solving Practical Business Problems
Quirk, Thomas J
2012-01-01
This is the first book to show the capabilities of Microsoft Excel to teach business statistics effectively. It is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical business problems. If understanding statistics isn't your strongest suit, you are not especially mathematically-inclined, or if you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in business courses. Its powerful computat
Nuclear power plant insurance - experience and loss statistics
International Nuclear Information System (INIS)
Feldmann, J.; Dangelmaier, P.
1982-01-01
Nuclear power plants are treated separately when concluding insurance contracts. National insurance pools have been established in industrial countries, co-operating on an international basis, for insuring a nuclear power plant. In combined property insurance, the nuclear risk is combined with the fire risk. In addition, there are the engineering insurances. Of these, the one of significance for nuclear power plants is the machinery insurance, which can be covered on the free insurance market. Nuclear power plants have had fewer instances of damage than other, conventional installations. (orig.) [de
The future of nuclear power in Europe
International Nuclear Information System (INIS)
Kurtz, D.
1996-01-01
The current and future prospects of the nuclear power industry in Europe are assessed in this Financial Times Energy Publishing report. Key issues relating to the development of the industry in both Eastern and Western Europe are addressed. Changing governmental and popular attitudes to nuclear power are described and nuclear energy's likely future contribution to Europe's energy needs is discussed. Detailed production and consumption statistics make the document useful reading for those in nuclear generating companies, electric utilities, major power consumers, waste management companies, governments, regulatory bodies, investors and environmental groups amongst others. (UK)
Quality of statistical reporting in developmental disability journals.
Namasivayam, Aravind K; Yan, Tina; Wong, Wing Yiu Stephanie; van Lieshout, Pascal
2015-12-01
Null hypothesis significance testing (NHST) dominates quantitative data analysis, but its use is controversial and has been heavily criticized. The American Psychological Association has advocated the reporting of effect sizes (ES), confidence intervals (CIs), and statistical power analysis to complement NHST results to provide a more comprehensive understanding of research findings. The aim of this paper is to carry out a sample survey of statistical reporting practices in two journals with the highest h5-index scores in the areas of developmental disability and rehabilitation. Using a checklist that includes critical recommendations by the American Psychological Association, we examined 100 randomly selected articles out of 456 articles reporting inferential statistics in the year 2013 in the Journal of Autism and Developmental Disorders (JADD) and Research in Developmental Disabilities (RDD). The results showed that for both journals, ES were reported only about half the time (JADD 59.3%; RDD 55.87%). These findings are similar to psychology journals, but are in stark contrast to ES reporting in educational journals (73%). Furthermore, a priori power and sample size determination (JADD 10%; RDD 6%), along with reporting and interpreting precision measures (CI: JADD 13.33%; RDD 16.67%), were the least reported metrics in these journals, but not dissimilar to journals in other disciplines. To advance the science in developmental disability and rehabilitation and to bridge the research-to-practice divide, reforms in statistical reporting, such as providing supplemental measures to NHST, are clearly needed.
Number of patients needed to discriminate between subgroups in patient reported outcome measures
DEFF Research Database (Denmark)
Paulsen, Aksel
2011-01-01
… analysis of variance. The hypothetical number of subjects needed to find a significant difference in PRO mean value between groups (assuming a significance level of 5% and a power of 85% to detect differences between the actual groups in our current study) was estimated for each PRO subscale … with sample size calculations or by power calculations and simulated ANOVA F tests, depending on the number of groups. Results: To discriminate between genders, the least number needed to find a statistically significant difference in mean sum score in each group was 298 (OHS), while HOOS QoL required the most subjects (760 in each group). PCS had the least number needed in relation to diagnoses (51 patients per group), while HOOS Pain required the most (116 patients per group). Concerning age, the least number needed was 270 (EQ-VAS), and OHS required the most (1566 in each group …
Trends, challenges and opportunities in power quality research
Bollen, M.H.J.; Ribeiro, P.F.; Gu, I.Y.H.; Duque, C.A.
2009-01-01
This paper outlines a number of possible research directions in power quality. The introduction of new sources of generation will introduce the need for new research on voltage–magnitude variations, harmonic emission and harmonic resonance. Statistical performance indicators are expected to play an
Short-Term Wind Speed Forecasting for Power System Operations
Zhu, Xinxin
2012-04-01
The emphasis on renewable energy and concerns about the environment have led to large-scale wind energy penetration worldwide. However, there are also significant challenges associated with the use of wind energy due to the intermittent and unstable nature of wind. High-quality short-term wind speed forecasting is critical to reliable and secure power system operations. This article begins with an overview of the current status of worldwide wind power developments and future trends. It then reviews some statistical short-term wind speed forecasting models, including traditional time series approaches and more advanced space-time statistical models. It also discusses the evaluation of forecast accuracy, in particular, the need for realistic loss functions. New challenges in wind speed forecasting regarding ramp events and offshore wind farms are also presented. © 2012 The Authors. International Statistical Review © 2012 International Statistical Institute.
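As a minimal example of the statistical forecasting models the review covers, an AR(1)-style forecast can be fitted by least squares; the data below are synthetic, not a claim about any particular wind farm:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic hourly wind speeds: an AR(1) deviation around a constant mean,
# a stand-in for the time series models discussed in the review.
phi_true, mean, sigma = 0.8, 8.0, 1.0
w = np.empty(2000)
w[0] = mean
for i in range(1, w.size):
    w[i] = mean + phi_true * (w[i - 1] - mean) + sigma * rng.standard_normal()

# Least-squares AR(1) fit and a 3-step-ahead forecast.
x = w - w.mean()
phi_hat = (x[1:] @ x[:-1]) / (x[:-1] @ x[:-1])
forecast = w.mean() + phi_hat**3 * (w[-1] - w.mean())
print(f"phi~{phi_hat:.2f}, 3-step-ahead forecast~{forecast:.1f} m/s")
```

The space-time models the article reviews extend this idea by letting neighbouring sites' recent speeds enter the regression; the evaluation point about loss functions amounts to scoring such forecasts with the operational cost of errors rather than plain squared error.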
Excel 2016 for engineering statistics a guide to solving practical problems
Quirk, Thomas J
2016-01-01
This book shows the capabilities of Microsoft Excel in teaching engineering statistics effectively. Similar to the previously published Excel 2013 for Engineering Statistics, this book is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical engineering problems. If understanding statistics isn’t your strongest suit, you are not especially mathematically-inclined, or if you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in engineering courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2016 for Engineering Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply Excel to statistical techniques necessary in their courses and...
Excel 2016 for business statistics a guide to solving practical problems
Quirk, Thomas J
2016-01-01
This book shows the capabilities of Microsoft Excel in teaching business statistics effectively. Similar to the previously published Excel 2010 for Business Statistics, this book is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical business problems. If understanding statistics isn’t your strongest suit, you are not especially mathematically-inclined, or if you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in business courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2016 for Business Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply Excel to statistical techniques necessary in their courses and work. Each ch...
Excel 2016 for marketing statistics a guide to solving practical problems
Quirk, Thomas J
2016-01-01
This is the first book to show the capabilities of Microsoft Excel in teaching marketing statistics effectively. It is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical marketing problems. If understanding statistics isn’t your strongest suit, you are not especially mathematically-inclined, or if you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in marketing courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2016 for Marketing Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply Excel to statistical techniques necessary in their courses and work. Each chapter explains statistical formulas and directs the reader t...
Excel 2013 for engineering statistics a guide to solving practical problems
Quirk, Thomas J
2015-01-01
This is the first book to show the capabilities of Microsoft Excel to teach engineering statistics effectively. It is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical engineering problems. If understanding statistics isn’t your strongest suit, you are not especially mathematically-inclined, or if you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in engineering courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2013 for Engineering Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply Excel to statistical techniques necessary in their courses and work. Each chapter explains statistical formulas and directs...
Statistics for products of traces of high powers of the frobenius class of hyperelliptic curves
Roditty-Gershon, Edva
2011-01-01
We study the averages of products of traces of high powers of the Frobenius class of hyperelliptic curves of genus g over a fixed finite field. We show that for increasing genus g, the limiting expectation of these products equals the expectation when the curve varies over the unitary symplectic group USp(2g). We also consider the scaling limit of linear statistics for eigenphases of the Frobenius class of hyperelliptic curves, and show that their first few moments are Gaussian.
Zhu, Zhaozhong; Anttila, Verneri; Smoller, Jordan W; Lee, Phil H
2018-01-01
Recent advances in genome-wide association studies (GWAS) suggest that pleiotropic effects on human complex traits are widespread. A number of classic and recent meta-analysis methods have been used to identify genetic loci with pleiotropic effects, but the overall performance of these methods is not well understood. In this work, we use extensive simulations and case studies of GWAS datasets to investigate the power and type-I error rates of ten meta-analysis methods. We specifically focus on three conditions commonly encountered in the studies of multiple traits: (1) extensive heterogeneity of genetic effects; (2) characterization of trait-specific association; and (3) inflated correlation of GWAS due to overlapping samples. Although the statistical power is highly variable under distinct study conditions, we found several methods to have superior power under diverse heterogeneity. In particular, the classic fixed-effects model showed surprisingly good performance when a variant is associated with more than half of the study traits. As the number of traits with null effects increases, ASSET performed the best, along with competitive specificity and sensitivity. With opposite directional effects, CPASSOC showed first-rate power. However, caution is advised when using CPASSOC for studying genetically correlated traits with overlapping samples. We conclude with a discussion of unresolved issues and directions for future research.
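The classic fixed-effects model mentioned above is inverse-variance weighting; a minimal sketch with hypothetical per-trait effect sizes:

```python
import math

def fixed_effects_meta(betas, ses):
    """Inverse-variance-weighted fixed-effects meta-analysis:
    returns the pooled effect, its standard error, and the z-score."""
    weights = [1.0 / se**2 for se in ses]
    pooled = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return pooled, se_pooled, pooled / se_pooled

# Hypothetical per-trait GWAS effects for one variant (illustrative numbers).
beta, se, z = fixed_effects_meta([0.10, 0.12, 0.08], [0.03, 0.04, 0.05])
print(f"pooled beta={beta:.3f} se={se:.3f} z={z:.1f}")
```

The pooled standard error is always smaller than any single study's, which is why this model does well when most traits share the effect; methods like ASSET and CPASSOC modify the combination step to tolerate null or opposite-direction traits.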
Do we need a power exchange if there are enough power marketers ?
SMEERS, Yves; WEI, Jing-Yuan
1997-01-01
Decentralization in electricity restructuring is a growing trend that Power Marketers are expected to take advantage of. We consider a market composed of Power Marketers, an Independent System Operator (ISO), generators and retailers. Power Marketers behave à la Cournot-Nash and the ISO implements a Transmission Capacity Reservation market à la FERC. Retailers are price takers. Generators' behavior is only reflected in the purchase costs of the Power Marketers. Their behavior is thus not reall...
Modern applied statistics with S-plus
Venables, W N
1994-01-01
S-Plus is a powerful environment for statistical and graphical analysis of data. It provides the tools to implement many statistical ideas which have been made possible by the widespread availability of workstations having good graphics and computational capabilities. This book is a guide to using S-Plus to perform statistical analyses and provides both an introduction to the use of S-Plus and a course in modern statistical methods. The aim of the book is to show how to use S-Plus as a powerful statistical and graphical system. Readers are assumed to have a basic grounding in statistics, and so the book is intended for would-be users of S-Plus, and both students and researchers using statistics. Throughout, the emphasis is on presenting practical problems and full analyses of real data sets.
Fraley, R. Chris; Vazire, Simine
2014-01-01
The authors evaluate the quality of research reported in major journals in social-personality psychology by ranking those journals with respect to their N-pact Factors (NF)—the statistical power of the empirical studies they publish to detect typical effect sizes. Power is a particularly important attribute for evaluating research quality because, relative to studies that have low power, studies that have high power are more likely to (a) to provide accurate estimates of effects, (b) to produce literatures with low false positive rates, and (c) to lead to replicable findings. The authors show that the average sample size in social-personality research is 104 and that the power to detect the typical effect size in the field is approximately 50%. Moreover, they show that there is considerable variation among journals in sample sizes and power of the studies they publish, with some journals consistently publishing higher power studies than others. The authors hope that these rankings will be of use to authors who are choosing where to submit their best work, provide hiring and promotion committees with a superior way of quantifying journal quality, and encourage competition among journals to improve their NF rankings. PMID:25296159
International Nuclear Information System (INIS)
Arkoma, Asko; Hänninen, Markku; Rantamäki, Karin; Kurki, Joona; Hämäläinen, Anitta
2015-01-01
Highlights: • The number of failing fuel rods in a LB-LOCA in an EPR is evaluated. • 59 scenarios are simulated with the system code APROS. • 1000 rods per scenario are simulated with the fuel performance code FRAPTRAN-GENFLO. • All the rods in the reactor are simulated in the worst scenario. • Results suggest that the regulations set by the Finnish safety authority are met. - Abstract: In this paper, the number of failing fuel rods in a large break loss-of-coolant accident (LB-LOCA) in an EPR-type nuclear power plant is evaluated using statistical methods. For this purpose, a statistical fuel failure analysis procedure has been developed. The developed method utilizes the results of nonparametric statistics, Wilks’ formula in particular, and is based on the selection and variation of parameters that are important in accident conditions. The accident scenario is simulated with the coupled fuel performance – thermal hydraulics code FRAPTRAN-GENFLO, varying parameter values and thermal-hydraulic and power-history boundary conditions between the simulations. The number of global scenarios is 59 (given by Wilks’ formula), and 1000 rods are simulated in each scenario. The boundary conditions are obtained from a new statistical version of the system code APROS. As a result, in the worst global scenario, 1.2% of the simulated rods failed, and it can be concluded that the Finnish safety regulations are hereby met (at most 10% of the rods are allowed to fail).
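The 59 scenarios quoted above are the classic sample size from first-order, one-sided Wilks sampling at the 95%/95% level: the smallest n for which the largest of n random runs bounds the 95th percentile with 95% confidence. A minimal sketch of that count (the formula only, not the APROS/FRAPTRAN-GENFLO implementation):

```python
from math import ceil, log

def wilks_runs(coverage=0.95, confidence=0.95):
    """Smallest n with 1 - coverage**n >= confidence: the sample maximum
    of n runs then bounds the `coverage` quantile at level `confidence`."""
    return ceil(log(1.0 - confidence) / log(coverage))

print(wilks_runs())            # -> 59, the scenario count used above
print(wilks_runs(0.95, 0.99))  # -> 90, the count at the stricter 95/99 level
```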
Doeze Jager-van Vliet, Sandra; Born, Marise; Molen, Henk
2017-01-01
The present study focused on self-other agreement between employees on their Need for Achievement, Need for Power and Need for Affiliation, needs that are relevant for performance and well-being at work. The Social Relations Model was used to examine consensus between other-raters, self-other agreement and assumed similarity (seeing others as one sees oneself) on these needs. Data were collected among 168 employees from a Dutch non-profit organization, with four employees in each ...
International Nuclear Information System (INIS)
Wallin, K.; Voskamp, R.; Schmibauer, J.; Ostermeyer, H.; Nagel, G.
2011-01-01
The cost of steam generator inspections in nuclear power plants is high. A new quantitative assessment methodology for the accumulation of flaws due to stochastic causes like fretting has been developed for cases where limited inspection data are available. Additionally, a new quantitative assessment methodology has been developed for the accumulation of environment-related flaws, caused e.g. by corrosion in steam generator tubes. The method, which combines deterministic information regarding flaw initiation and growth with stochastic elements connected to environmental aspects, requires only knowledge of the experimental flaw accumulation history. Combining both flaw types, the method provides a complete description of the flaw accumulation, and there are several possible uses of the method. It can be used to evaluate the total life expectancy of the steam generator, and simple statistically defined plugging criteria can be established based on flaw behaviour. This way the inspection interval and inspection coverage can be optimized with respect to allowable flaws, and the method can recognize flaw-type subsets requiring more frequent inspection intervals. The method can also be used to develop statistically realistic safety factors accounting for uncertainties in inspection flaw sizing and detection. The statistical assessment method has been shown to be robust and insensitive to different assessments of plugged tubes. Because the procedure is re-calibrated after each inspection, it reacts effectively to possible changes in the steam generator environment. Validation of the assessment method is provided for real steam generators, both in the case of stochastic damage and of environment-related flaws. (authors)
Quantity and quality in nuclear engineering professional skills needed by the nuclear power industry
International Nuclear Information System (INIS)
Slember, R.J.
1990-01-01
This paper examines the challenge of work force requirements in the context of the full range of issues facing the nuclear power industry. The supply of skilled managers and workers may be a more serious problem if nuclear power fades away than if it is reborn in a new generation. An even greater concern, however, is the quality of education that the industry needs in all its future professionals. Both government and industry should be helping universities adapt their curricula to the needs of the future. This means building a closer relationship with schools that educate nuclear professionals, that is, providing adequate scholarships and funding for research and development programs, offering in-kind services, and encouraging internships and other opportunities for hands-on experience. The goal should not be just state-of-the-art engineering practices, but the broad range of knowledge, issues, and skills that will be required of the nuclear leadership of the twenty-first century
The Power of Information: Where it's Needed, When it's Needed, To Those Who Need It
National Research Council Canada - National Science Library
2006-01-01
Ensuring timely and trusted information is available where it is needed, when it is needed, and to those who need it is at the heart of the capability needed to conduct Net-Centric Operations (NCO...
Caregiver Statistics: Demographics
... needs and services are wide-ranging and complex, statistics may vary from study to study. Sources for ...
Evaluation of Human Resource Needs for a New Nuclear Power Plant: Armenian Case Study
International Nuclear Information System (INIS)
2011-05-01
Rising expectations of an increased role for nuclear power in providing energy for future national and global sustainable development have become a reality in many Member States of the IAEA. Over the last several years, dozens of Member States have announced plans to embark on or expand nuclear power programmes. Reflecting on these developments, the IAEA has adjusted its priorities to focus more on the nuclear power programmes of newcomers. Specifically, the IAEA has produced publications providing guidance on the development of a national infrastructure for nuclear power (IAEA Nuclear Energy Series No. NG-G-3.1) and on managing human resources in the field of nuclear energy (IAEA Nuclear Energy Series No. NG-G-2.1). Additionally, assistance to eligible Member States through new technical cooperation (TC) projects has been increased, including direct support through on-site assist visits. In 2007-2008, the IAEA carried out a TC project titled 'Feasibility study of nuclear energy development in Armenia: Evaluation of human resource needs in conjunction with new NPP build' (ARM-005). The project analysed the human resource demands required to support work at all stages of the life cycle of a new power unit planned for Armenia. This included drafting proposals for the means, conditions and requirements for development of the human resource capabilities needed to carry out the work. This report is intended to complement the previous IAEA publications by providing in-depth technical consideration of this critical area of human resource development. The report summarizes major findings of the TC project and details the tasks linked to management of the human resources that will be required by a country planning to build a new NPP. Additional guidance on the development of a national nuclear infrastructure can be found in the IAEA publication 'Milestones in the Development of a National Infrastructure for Nuclear Power', IAEA Nuclear Energy Series No. NG-G-3.1. The
The Development of On-Line Statistics Program for Radiation Oncology
International Nuclear Information System (INIS)
Kim, Yoon Jong; Lee, Dong Hoon; Ji, Young Hoon; Lee, Dong Han; Jo, Chul Ku; Kim, Mi Sook; Ru, Sung Rul; Hong, Seung Hong
2001-01-01
Purpose: To develop an on-line statistics program that records radiation oncology information and shares it over the Internet, making it possible to supply basic reference data for administrative plans to improve radiation oncology. Materials and methods: In the past, radiation oncology statistics had been collected on paper forms from about 52 hospitals. Now the data can be entered through Internet web browsers. The statistics program used the Windows NT 4.0 operating system, Internet Information Server 4.0 (IIS 4.0) as the web server, and a Microsoft Access database (MDB). We used Structured Query Language (SQL), Visual Basic, VBScript and JavaScript to display the statistics by year and by hospital. Results: This program shows the present status of manpower, research, therapy machines, techniques, brachytherapy, clinical statistics, radiation safety management, institutions, quality assurance and radioisotopes in radiation oncology departments. The database consists of 38 input and 6 output windows. Statistical output windows can be added continuously according to user needs. Conclusion: We have developed a statistics program to process all of the data in departments of radiation oncology for reference information. Users can easily enter the data through Internet web browsers and share the information.
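The by-year, by-hospital reporting such a program provides boils down to relational aggregation. A hedged sketch with an in-memory SQLite table; the schema and figures are invented for illustration and are not the actual database described above:

```python
import sqlite3

# Hypothetical stand-in for one of the program's input-window tables
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE machines (hospital TEXT, year INTEGER, linacs INTEGER)")
conn.executemany(
    "INSERT INTO machines VALUES (?, ?, ?)",
    [("A", 2000, 2), ("B", 2000, 1), ("A", 2001, 3), ("B", 2001, 1)],
)

# One output window: machine counts displayed by year
rows = conn.execute(
    "SELECT year, COUNT(DISTINCT hospital) AS hospitals, SUM(linacs) AS linacs"
    " FROM machines GROUP BY year ORDER BY year"
).fetchall()
# rows == [(2000, 2, 3), (2001, 2, 4)]
```

Adding `GROUP BY hospital, year` instead gives the per-hospital view, which is why SQL suits the "statistics according to years and hospitals" requirement.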
Excel 2013 for physical sciences statistics a guide to solving practical problems
Quirk, Thomas J; Horton, Howard F
2016-01-01
This book shows the capabilities of Microsoft Excel in teaching physical sciences statistics effectively. Similar to the previously published Excel 2010 for Physical Sciences Statistics, this book is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical science problems. If understanding statistics isn’t your strongest suit, if you are not especially mathematically inclined, or if you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in science courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2013 for Physical Sciences Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply Excel to statistical techniques necessary in their ...
Excel 2013 for social sciences statistics a guide to solving practical problems
Quirk, Thomas J
2015-01-01
This is the first book to show the capabilities of Microsoft Excel to teach social science statistics effectively. It is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical social science problems. If understanding statistics isn’t your strongest suit, if you are not especially mathematically inclined, or if you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in social science courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2013 for Social Science Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply Excel to statistical techniques necessary in their courses and work. Each chapter explains statistical formul...
Excel 2016 for social science statistics a guide to solving practical problems
Quirk, Thomas J
2016-01-01
This book shows the capabilities of Microsoft Excel in teaching social science statistics effectively. Similar to the previously published Excel 2013 for Social Sciences Statistics, this book is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical social science problems. If understanding statistics isn’t your strongest suit, if you are not especially mathematically inclined, or if you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in social science courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2016 for Social Science Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply Excel to statistical techniques necessary in ...
MIDAS: Regionally linear multivariate discriminative statistical mapping.
Varol, Erdem; Sotiras, Aristeidis; Davatzikos, Christos
2018-07-01
Statistical parametric maps formed via voxel-wise mass-univariate tests, such as the general linear model, are commonly used to test hypotheses about regionally specific effects in neuroimaging cross-sectional studies where each subject is represented by a single image. Despite being informative, these techniques remain limited as they ignore multivariate relationships in the data. Most importantly, the commonly employed local Gaussian smoothing, which is important for accounting for registration errors and making the data follow Gaussian distributions, is usually chosen in an ad hoc fashion. Thus, it is often suboptimal for the task of detecting group differences and correlations with non-imaging variables. Information mapping techniques, such as searchlight, which use pattern classifiers to exploit multivariate information and obtain more powerful statistical maps, have become increasingly popular in recent years. However, existing methods may lead to important interpretation errors in practice (i.e., misidentifying a cluster as informative, or failing to detect truly informative voxels), while often being computationally expensive. To address these issues, we introduce a novel efficient multivariate statistical framework for cross-sectional studies, termed MIDAS, seeking highly sensitive and specific voxel-wise brain maps, while leveraging the power of regional discriminant analysis. In MIDAS, locally linear discriminative learning is applied to estimate the pattern that best discriminates between two groups, or predicts a variable of interest. This pattern is equivalent to local filtering by an optimal kernel whose coefficients are the weights of the linear discriminant. By composing information from all neighborhoods that contain a given voxel, MIDAS produces a statistic that collectively reflects the contribution of the voxel to the regional classifiers as well as the discriminative power of the classifiers. Critically, MIDAS efficiently assesses the
Cafri, Guy; Kromrey, Jeffrey D.; Brannick, Michael T.
2010-01-01
This article uses meta-analyses published in "Psychological Bulletin" from 1995 to 2005 to describe meta-analyses in psychology, including examination of statistical power, Type I errors resulting from multiple comparisons, and model choice. Retrospective power estimates indicated that univariate categorical and continuous moderators, individual…
The need for nuclear power at the Cape
International Nuclear Information System (INIS)
Myburgh, R.P.A.
1980-01-01
The paper gives an indication of the growth of electrical power usage in the Western Cape. In planning to increase the supply of electricity to the Western Cape, several factors had to be taken into account. It appeared that the cost of constructing a nuclear power plant and generating nuclear power compares well with other methods of power generation. Before the construction of Koeberg was started, extensive investigations were undertaken to find a suitable site. Factors taken into account in the investigation included population density, geology and transport facilities. The safety of nuclear power plants is discussed. Mention is made of safeguards inherent in the design of a nuclear power plant. It appears that the possibility of radioactive effluent reaching the atmosphere as a result of an accident or malfunctioning of a nuclear plant is very small, as there are various safety systems designed to prevent it. Radioactive waste disposal is also discussed.
Advanced Rankine and Brayton cycle power systems: Materials needs and opportunities
Grisaffe, S. J.; Guentert, D. C.
1974-01-01
Conceptual advanced potassium Rankine and closed Brayton power conversion cycles offer the potential for improved efficiency over steam systems through higher operating temperatures. However, for utility service of at least 100,000 hours, materials technology advances will be needed for such high temperature systems. Improved alloys and surface protection must be developed and demonstrated to resist coal combustion gases as well as potassium corrosion or helium surface degradation at high temperatures. Extensions in fabrication technology are necessary to produce large components of high temperature alloys. Long time property data must be obtained under environments of interest to assure high component reliability.
International Nuclear Information System (INIS)
1993-01-01
The Electric Power Annual is prepared by the Survey Management Division; Office of Coal, Nuclear, Electric and Alternate Fuels; Energy Information Administration (EIA); US Department of Energy. The 1991 edition has been enhanced to include statistics on electric utility demand-side management and nonutility supply. ''The US Electric Power Industry at a Glance'' section presents a profile of the electric power industry ownership and performance, and a review of key statistics for the year. Subsequent sections present data on generating capability, including proposed capability additions; net generation; fossil-fuel statistics; electricity sales, revenue, and average revenue per kilowatthour sold; financial statistics; environmental statistics; electric power transactions; demand-side management; and nonutility power producers. In addition, the appendices provide supplemental data on major disturbances and unusual occurrences in US electric power systems. Each section contains related text and tables and refers the reader to the appropriate publication that contains more detailed data on the subject matter. Monetary values in this publication are expressed in nominal terms.
Griffiths, Dawn
2009-01-01
Wouldn't it be great if there were a statistics book that made histograms, probability distributions, and chi square analysis more enjoyable than going to the dentist? Head First Statistics brings this typically dry subject to life, teaching you everything you want and need to know about statistics through engaging, interactive, and thought-provoking material, full of puzzles, stories, quizzes, visual aids, and real-world examples. Whether you're a student, a professional, or just curious about statistical analysis, Head First's brain-friendly formula helps you get a firm grasp of statistics
Wind energy statistics 2012; Vindkraftsstatistik 2012
Energy Technology Data Exchange (ETDEWEB)
NONE
2013-04-15
The publication 'Wind Energy Statistics' is an annual publication. Since 2010, statistics on installed power, number of plants and regional distribution have also been reported semi-annually, in tabular form, on the Agency's website. The publication is produced in a new way this year, which means that some data differ from previous publications. Thanks to the electricity certificate system, this publication contains essentially complete statistics on wind energy, presented in several breakdowns. Here we present the regional distribution, i.e. how the number of turbines and installed capacity are allocated among counties and municipalities. Electricity production by county is also reported where confidentiality allows. Wind power is becoming increasingly important in the Swedish energy system, which creates increased demand for statistics and for other breakdowns than those presented in the official statistics. This publication, which is not official statistics, has therefore been developed.
Use of Mathematical Methods of Statistics for Analyzing Engine Characteristics
Directory of Open Access Journals (Sweden)
Aivaras Jasilionis
2012-11-01
For the development of new models, automobile manufacturers are trying to come up with optimal software for engine control in all movement modes. However, in this case a vehicle cannot achieve outstanding characteristics in any single mode. This is the main reason why modifications of engine control software that adapt the vehicle to the driver’s needs are becoming more and more popular. The article presents a short analysis of development trends in engine control software. Also, mathematical-statistical models of engine power and torque growth are created. The introduced models give an opportunity to predict the probabilities of engine power or torque growth after individual reprogramming of engine control software.
International Nuclear Information System (INIS)
Moeller, D.W.
1980-01-01
This review of occupational exposures in commercial nuclear power plants in the United States of America has revealed that, although many problem areas are being adequately addressed, there is a need for additional work. Areas relative to exposure evaluation that need attention include better data collection and analysis as to when and where exposures occur, improved information on exposures from internally deposited radionuclides, improved techniques for monitoring occupational neutron exposures, and an upgrading in quality control procedures in the manufacture, calibration, use and maintenance of monitoring instruments. Areas relative to exposure control that need attention include the development of additional design and manufacturing approaches for preventing the production and build-up of key radionuclides within reactor cooling systems, the development and testing of techniques for removing those radionuclides that do accumulate in such systems, the application of risk/benefit assessments to procedures for the maintenance, repair, modification, replacement and disposal of major nuclear power plant components, such as steam generators, and the development of design features to facilitate the decommissioning of nuclear power plants. Professional radiation workers also have to be aware of the impact that the proposed major reductions in occupational dose limits would have on their operations. This impact is compounded by the fact that the number of people receiving graduate education in radiation protection in the USA is decreasing. (author)
The need and direction of a human factors research program for the nuclear power industry
International Nuclear Information System (INIS)
Blackman, H.S.; Meyer, O.R.; Nelson, W.R.
1986-01-01
It is axiomatic that the need for a human factors program in the nuclear power industry must be based upon an examination of the process of nuclear energy production and the role that the human plays in this process. It has been pointed out by others that a large number of incidents in technology based industries can be attributed to human error, thereby demonstrating the need to understand the human in interacting with complex processes. But an emphasis upon human ''error'' is a negative approach and can be non-productive, particularly when the ''correct'' human action has not been clearly defined prior to the incident. Some industries have expended great resources in a positive attempt to maximize the performance of the human in critical roles, e.g., the man-in-space program, the commercial airlines industry, deep-sea exploration. Central to this issue of human factors in nuclear power is the question of the role that the human plays in reducing the risk of the total system. If, as in other areas of application, the nuclear industry can make substantial improvements in the performance of humans, one needs to know how much risk is really reduced
Bayesian models: A statistical primer for ecologists
Hobbs, N. Thompson; Hooten, Mevin B.
2015-01-01
Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods, in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probability and develops a step-by-step sequence of connected ideas, including basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and inference from single and multiple models. This unique book places less emphasis on computer coding, favoring instead a concise presentation of the mathematical statistics needed to understand how and why Bayesian analysis works. It also explains how to write out properly formulated hierarchical Bayesian models and use them in computing, research papers, and proposals. This primer enables ecologists to understand the statistical principles behind Bayesian modeling and apply them to research, teaching, policy, and management. • Presents the mathematical and statistical foundations of Bayesian modeling in language accessible to non-statisticians • Covers basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and more • Deemphasizes computer coding in favor of basic principles • Explains how to write out properly factored statistical expressions representing Bayesian models
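The Markov chain Monte Carlo machinery such a primer covers can be illustrated in a few lines of standard-library Python: a random-walk Metropolis sampler for the posterior of a detection probability under a flat Beta(1, 1) prior. The data and tuning values below are invented for illustration, not taken from the book.

```python
import random
from math import exp, log

def log_posterior(p, k, n):
    """Log of a binomial likelihood (k successes in n trials) times a flat prior."""
    if not 0.0 < p < 1.0:
        return float("-inf")  # outside the support: always rejected
    return k * log(p) + (n - k) * log(1.0 - p)

def metropolis(k, n, steps=20_000, width=0.1, seed=7):
    """Random-walk Metropolis over p, returning the second half of the chain."""
    rng = random.Random(seed)
    p, samples = 0.5, []
    for _ in range(steps):
        proposal = p + rng.uniform(-width, width)
        delta = log_posterior(proposal, k, n) - log_posterior(p, k, n)
        if rng.random() < exp(min(0.0, delta)):  # Metropolis acceptance rule
            p = proposal
        samples.append(p)
    return samples[steps // 2:]  # discard the first half as burn-in

draws = metropolis(k=30, n=100)
posterior_mean = sum(draws) / len(draws)  # should sit near 31/102, about 0.30
```

The same accept/reject loop, with a richer `log_posterior`, is what general-purpose samplers run under the hood for the hierarchical models the book develops.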
International Nuclear Information System (INIS)
Oelkers, E.; Heller, A.S.; Farnsworth, D.A.; Kearfott, K.J.
1978-01-01
The report describes the statistical analysis of DNBR thermal-hydraulic margin of a 3800 MWt, 205-FA core under design overpower conditions. The analysis used LYNX-generated data at predetermined values of the input variables whose uncertainties were to be statistically combined. LYNX data were used to construct an efficient response surface model in the region of interest; the statistical analysis was accomplished through the evaluation of core reliability, utilizing propagation of the uncertainty distributions of the inputs. The response surface model was implemented in both the analytical error propagation and Monte Carlo techniques. The basic structural units relating to the acceptance criteria are fuel pins. Therefore, the statistical population of pins with minimum DNBR values smaller than specified values is determined. The specified values are designated relative to the most probable and maximum design DNBR values on the power-limiting pin used in present design analysis, so that gains over the present design criteria could be assessed for specified probabilistic acceptance criteria. The results are equivalent to gains ranging from 1.2 to 4.8 percent of rated power, dependent on the acceptance criterion. The corresponding acceptance criteria range from 95 percent confidence that no pin will be in DNB to 99.9 percent of the pins being expected to avoid DNB.
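The Monte Carlo branch of such an analysis is simple to sketch: draw the uncertain inputs from their distributions, evaluate the response surface, and count the fraction of evaluations falling below a DNBR limit. Everything below is hypothetical: the quadratic surface, the standardized inputs and the 1.3 limit stand in for the actual LYNX-derived model and design values.

```python
import random

def response_surface(x1, x2):
    # Hypothetical quadratic fit standing in for the LYNX-derived surface
    return 1.8 + 0.30 * x1 - 0.20 * x2 + 0.05 * x1 * x2

def fraction_below_limit(n_trials=100_000, limit=1.3, seed=1):
    """Propagate input uncertainties through the surface by Monte Carlo."""
    rng = random.Random(seed)
    below = 0
    for _ in range(n_trials):
        x1 = rng.gauss(0.0, 1.0)  # e.g. standardized inlet-temperature uncertainty
        x2 = rng.gauss(0.0, 1.0)  # e.g. standardized local-power uncertainty
        if response_surface(x1, x2) < limit:
            below += 1
    return below / n_trials

p_dnb = fraction_below_limit()  # estimated probability of violating the limit
```

The cheap surface evaluation is the point of the response-surface construction: it makes the many thousands of samples needed for tail probabilities affordable, where running the full thermal-hydraulics code each time would not be.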
Levine-Wissing, Robin
2012-01-01
All Access for the AP® Statistics Exam: Book + Web + Mobile. Everything you need to prepare for the Advanced Placement® exam, in a study system built around you! There are many different ways to prepare for an Advanced Placement® exam. What's best for you depends on how much time you have to study and how comfortable you are with the subject matter. To score your highest, you need a system that can be customized to fit you: your schedule, your learning style, and your current level of knowledge. This book, and the online tools that come with it, will help you personalize your AP® Statistics prep
Data management and statistical analysis for environmental assessment
International Nuclear Information System (INIS)
Wendelberger, J.R.; McVittie, T.I.
1995-01-01
Data management and statistical analysis for environmental assessment are important issues on the interface of computer science and statistics. Data collection for environmental decision making can generate large quantities of various types of data. A database/GIS system developed is described which provides efficient data storage as well as visualization tools which may be integrated into the data analysis process. FIMAD is a living database and GIS system. The system has changed and developed over time to meet the needs of the Los Alamos National Laboratory Restoration Program. The system provides a repository for data which may be accessed by different individuals for different purposes. The database structure is driven by the large amount and varied types of data required for environmental assessment. The integration of the database with the GIS system provides the foundation for powerful visualization and analysis capabilities
Descriptive statistics of occupational employment in nuclear power utilities. Final working paper
International Nuclear Information System (INIS)
Little, J.R.; Johnson, R.C.
1982-10-01
The Institute of Nuclear Power Operations conducted a survey of its 58 member utilities during the Spring of 1982. This was the second such survey performed to identify employment trends and to project needs for trained personnel in the industry to 1991. The first was performed in 1981. The 1982 employment survey consisted of four questionnaires, asking for information on: (1) on-site employment; (2) on-site turnover; (3) off-site employment; and (4) off-site turnover. The survey instruments were designed to reflect approaches used by the utilities to meet the labor requirements for operation of nuclear power plants through off-site support personnel, contractors, and holding company personnel, as well as utility employees working at the plant site. On-site information was received from all 83 plants at the 58 utilities. However, employment information from Surry of VEPCO arrived too late to be included in the analysis. Therefore, their numbers are reflected in the adjusted totals. Responses to requests for off-site employment information were received from 55 of the 58 utilities
Excel 2010 for environmental sciences statistics a guide to solving practical problems
Quirk, Thomas J; Horton, Howard F
2015-01-01
This is the first book to show the capabilities of Microsoft Excel to teach environmental sciences statistics effectively. It is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical environmental sciences problems. If understanding statistics isn’t your strongest suit, you are not especially mathematically-inclined, or if you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in environmental science courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2010 for Environmental Sciences Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply Excel to statistical techniques necessary in their courses and work. Eac...
Excel 2016 for physical sciences statistics a guide to solving practical problems
Quirk, Thomas J; Horton, Howard F
2016-01-01
This book is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical physical science problems. If understanding statistics isn’t your strongest suit, you are not especially mathematically-inclined, or if you are wary of computers, this is the right book for you. Excel is an effective learning tool for quantitative analyses in environmental science courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2016 for Physical Sciences Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply Excel 2016 to statistical techniques necessary in their courses and work. Each chapter explains statistical formulas and directs the reader to use Excel commands to solve specific, easy-to-understand physical science problems. Practice problems are provided at the end of each chapter with their s...
Excel 2016 for environmental sciences statistics a guide to solving practical problems
Quirk, Thomas J; Horton, Howard F
2016-01-01
This book is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical environmental science problems. If understanding statistics isn’t your strongest suit, you are not especially mathematically-inclined, or if you are wary of computers, this is the right book for you. Excel is an effective learning tool for quantitative analyses in environmental science courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2016 for Environmental Science Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply Excel 2016 to statistical techniques necessary in their courses and work. Each chapter explains statistical formulas and directs the reader to use Excel commands to solve specific, easy-to-understand environmental science problems. Practice problems are provided at the end of each chapte...
Excel 2016 for health services management statistics a guide to solving problems
Quirk, Thomas J
2016-01-01
This book shows the capabilities of Microsoft Excel in teaching health services management statistics effectively. Similar to the previously published Excel 2013 for Health Services Management Statistics, this book is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical health service management problems. If understanding statistics isn’t your strongest suit, you are not especially mathematically-inclined, or if you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in health service courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2016 for Health Services Management Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply...
Excel 2013 for environmental sciences statistics a guide to solving practical problems
Quirk, Thomas J; Horton, Howard F
2015-01-01
This is the first book to show the capabilities of Microsoft Excel to teach environmental sciences statistics effectively. It is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical environmental science problems. If understanding statistics isn’t your strongest suit, you are not especially mathematically-inclined, or if you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in environmental science courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2013 for Environmental Sciences Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply Excel to statistical techniques necessary in their courses and work. Each chap...
A systematic approach to the training in the nuclear power industry: The need for standard
International Nuclear Information System (INIS)
Wilkinson, J.D.
1995-01-01
The five elements of a "Systematic Approach to Training" (SAT) are analysis, design, development, implementation and evaluation. These elements are also present in the effective application of basic process control. The fundamental negative feedback process control loop is therefore an excellent model for a successful, systematic approach to training in the nuclear power industry. Just as standards are required in today's manufacturing and service industries, e.g. ISO 9000, so too are control standards needed in the training industry and in particular in the training of nuclear power plant staff. The International Atomic Energy Agency (IAEA) produced its TECDOC 525 on "Training to Establish and Maintain the Qualification and Competence of Nuclear Power Plant Operations Personnel" in 1989 and the American Nuclear Society published its "Selection, Qualification, and Training of Personnel for Nuclear Power Plants, an American National Standard" in 1993. It is important that community colleges, training vendors and organizations such as the Instrument Society of America (ISA), who may be supplying basic or prerequisite training to the nuclear power industry, become aware of these and other standards relating to training in the nuclear power industry
Excel 2013 for educational and psychological statistics a guide to solving practical problems
Quirk, Thomas J
2015-01-01
This is the first book to show the capabilities of Microsoft Excel to teach educational and psychological statistics effectively. It is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical problems in education and psychology. If understanding statistics isn’t your strongest suit, you are not especially mathematically-inclined, or if you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and practitioners, is also an effective teaching and learning tool for quantitative analyses in statistics courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2013 for Educational and Psychological Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and practitioners how to apply Excel to statistical techniques necessary in their courses and work. E...
The statistical analysis of anisotropies
International Nuclear Information System (INIS)
Webster, A.
1977-01-01
One of the many uses to which a radio survey may be put is an analysis of the distribution of the radio sources on the celestial sphere to find out whether they are bunched into clusters or lie in preferred regions of space. There are many methods of testing for clustering in point processes and, since they are not all equally good, this contribution is presented as a brief guide to what seems to be the best of them. The radio sources certainly do not show very strong clustering and may well be entirely unclustered, so if a statistical method is to be useful it must be both powerful and flexible. A statistic is powerful in this context if it can efficiently distinguish a weakly clustered distribution of sources from an unclustered one, and it is flexible if it can be applied in a way which avoids mistaking defects in the survey for true peculiarities in the distribution of sources. The paper divides clustering statistics into two classes: number density statistics and log N/log S statistics. (Auth.)
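A toy example of a number-density statistic of the kind the paper classifies is counts-in-cells: for an unclustered (Poisson) point process the variance of the cell counts is close to their mean, while clustering inflates the ratio above 1. The sketch below uses a flat 1-D binning as a deliberate simplification of the celestial sphere.

```python
import random

# Counts-in-cells check for clustering: distribute sources uniformly over
# cells and compare the variance of the counts to their mean.
random.seed(1)
n_sources, n_cells = 10_000, 100
counts = [0] * n_cells
for _ in range(n_sources):
    counts[random.randrange(n_cells)] += 1  # unclustered placement

mean = sum(counts) / n_cells
var = sum((c - mean) ** 2 for c in counts) / (n_cells - 1)
print(f"variance-to-mean ratio: {var / mean:.2f}")  # near 1 means no clustering
```

A flexible real-world version would mask out survey defects (incomplete sky regions, varying sensitivity) before binning, which is exactly the flexibility requirement the abstract raises.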
Country Nuclear Power Profiles - 2009 Edition
International Nuclear Information System (INIS)
2009-08-01
The Country Nuclear Power Profiles compiles background information on the status and development of nuclear power programs in Member States. It consists of organizational and industrial aspects of nuclear power programs and provides information about the relevant legislative, regulatory, and international framework in each country. Its descriptive and statistical overview of the overall economic, energy, and electricity situation in each country, and its nuclear power framework is intended to serve as an integrated source of key background information about nuclear power programs in the world. The preparation of Country Nuclear Power Profiles (CNPP) was initiated in the 1990s. It responded to a need for a database and a technical publication containing a description of the energy and economic situation, the energy and the electricity sector, and the primary organizations involved in nuclear power in IAEA Member States. This is the 2009 edition issued on CD-ROM and Web pages. It updates the country information for 44 countries. The CNPP is updated based on information voluntarily provided by participating IAEA Member States. Participants include the 30 countries that have operating nuclear power plants, as well as 14 countries having past or planned nuclear power programmes (Bangladesh, Egypt, Ghana, Indonesia, the Islamic Republic of Iran, Italy, Kazakhstan, Nigeria, Philippines, Poland, Thailand, Tunisia, Turkey and Vietnam). For the 2009 edition, 26 countries provided updated or new profiles. For the other countries, the IAEA updated the profile statistical tables on nuclear power, energy development, and economic indicators based on information from IAEA and World Bank databases
Country Nuclear Power Profiles - 2011 Edition
International Nuclear Information System (INIS)
2011-08-01
The Country Nuclear Power Profiles compiles background information on the status and development of nuclear power programs in Member States. It consists of organizational and industrial aspects of nuclear power programs and provides information about the relevant legislative, regulatory, and international framework in each country. Its descriptive and statistical overview of the overall economic, energy, and electricity situation in each country, and its nuclear power framework is intended to serve as an integrated source of key background information about nuclear power programs in the world. The preparation of Country Nuclear Power Profiles (CNPP) was initiated in the 1990s. It responded to a need for a database and a technical publication containing a description of the energy and economic situation, the energy and the electricity sector, and the primary organizations involved in nuclear power in IAEA Member States. This is the 2011 edition issued on CD-ROM and Web pages. It updates the country information for 50 countries. The CNPP is updated based on information voluntarily provided by participating IAEA Member States. Participants include the 29 countries that have operating nuclear power plants, as well as 21 countries having past or planned nuclear power programmes (Bangladesh, Belarus, Chile, Egypt, Ghana, Indonesia, the Islamic Republic of Iran, Italy, Jordan, Kazakhstan, Kuwait, Lithuania, Morocco, Nigeria, Philippines, Poland, Syrian Arab Republic, Thailand, Tunisia, Turkey and Vietnam). For the 2011 edition, 23 countries provided updated or new profiles. For the other countries, the IAEA updated the profile statistical tables on nuclear power, energy development, and economic indicators based on information from IAEA and World Bank databases.
Excel 2016 for educational and psychological statistics a guide to solving practical problems
Quirk, Thomas J
2016-01-01
This book shows the capabilities of Microsoft Excel in teaching educational and psychological statistics effectively. Similar to the previously published Excel 2013 for Educational and Psychological Statistics, this book is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical education and psychology problems. If understanding statistics isn’t your strongest suit, you are not especially mathematically-inclined, or if you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in education and psychology courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2016 for Educational and Psychological Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and man...
Xu, Liangfei; Reimer, Uwe; Li, Jianqiu; Huang, Haiyan; Hu, Zunyan; Jiang, Hongliang; Janßen, Holger; Ouyang, Minggao; Lehnert, Werner
2018-02-01
City buses using polymer electrolyte membrane (PEM) fuel cells are considered to be the most likely fuel cell vehicles to be commercialized in China. The technical specifications of the fuel cell systems (FCSs) these buses are equipped with will differ based on the powertrain configurations and vehicle control strategies, but can generally be classified into the power-follow and soft-run modes. Each mode imposes different levels of electrochemical stress on the fuel cells. Evaluating the aging behavior of fuel cell stacks under the conditions encountered in fuel cell buses requires new durability test protocols based on statistical results obtained during actual driving tests. In this study, we propose a systematic design method for fuel cell durability test protocols that correspond to the power-follow mode based on three parameters for different fuel cell load ranges. The powertrain configurations and control strategy are described herein, followed by a presentation of the statistical data for the duty cycles of FCSs in one city bus in the demonstration project. Assessment protocols are presented based on the statistical results using mathematical optimization methods, and are compared to existing protocols with respect to common factors, such as time at open circuit voltage and root-mean-square power.
Basics of modern mathematical statistics
Spokoiny, Vladimir
2015-01-01
This textbook provides a unified and self-contained presentation of the main approaches to and ideas of mathematical statistics. It collects the basic mathematical ideas and tools needed as a basis for more serious studies or even independent research in statistics. The majority of existing textbooks in mathematical statistics follow the classical asymptotic framework. Yet, as modern statistics has changed rapidly in recent years, new methods and approaches have appeared. The emphasis is on finite sample behavior, large parameter dimensions, and model misspecifications. The present book provides a fully self-contained introduction to the world of modern mathematical statistics, collecting the basic knowledge, concepts and findings needed for doing further research in the modern theoretical and applied statistics. This textbook is primarily intended for graduate and postdoc students and young researchers who are interested in modern statistical methods.
Medical Statistics – Mathematics or Oracle? Farewell Lecture
Directory of Open Access Journals (Sweden)
Gaus, Wilhelm
2005-06-01
Full Text Available Certainty is rare in medicine. This is a direct consequence of the individuality of each and every human being and the reason why we need medical statistics. However, statistics have their pitfalls, too. Fig. 1 shows that the suicide rate peaks in youth, while in Fig. 2 the rate is highest in midlife and Fig. 3 in old age. Which of these contradictory messages is right? After an introduction to the principles of statistical testing, this lecture examines the probability with which statistical test results are correct. For this purpose the level of significance and the power of the test are compared with the sensitivity and specificity of a diagnostic procedure. The probability of obtaining correct statistical test results is the same as that for the positive and negative correctness of a diagnostic procedure and therefore depends on prevalence. The focus then shifts to the problem of multiple statistical testing. The lecture demonstrates that for each data set of reasonable size at least one test result proves to be significant - even if the data set is produced by a random number generator. It is extremely important that a hypothesis is generated independently from the data used for its testing. These considerations enable us to understand the gradation of "lame excuses, lies and statistics" and the difference between pure truth and the full truth. Finally, two historical oracles are cited.
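The lecture's multiple-testing point can be demonstrated in a few lines: run many tests on pure random numbers and several come out "significant" at alpha = 0.05. The sketch below uses a two-sided z-test on unit-variance data as a stand-in for a generic test; the sample sizes and test count are illustrative.

```python
import math
import random

random.seed(42)

def p_value(sample):
    # Two-sided z-test of mean zero for unit-variance data.
    z = sum(sample) / math.sqrt(len(sample))
    phi = 0.5 * (1 + math.erf(abs(z) / math.sqrt(2)))  # standard normal CDF
    return 2 * (1 - phi)

n_tests = 200
p_values = [p_value([random.gauss(0, 1) for _ in range(30)])
            for _ in range(n_tests)]
n_significant = sum(p < 0.05 for p in p_values)
# Even though every null hypothesis is true, about 5% of tests reject it.
print(f"{n_significant} of {n_tests} tests significant on pure random data")
```

This is why, as the lecture stresses, a hypothesis must be generated independently of the data used to test it.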
Energy Technology Data Exchange (ETDEWEB)
1994-01-06
The Electric Power Annual presents a summary of electric utility statistics at national, regional and State levels. The objective of the publication is to provide industry decisionmakers, government policymakers, analysts and the general public with historical data that may be used in understanding US electricity markets. The Electric Power Annual is prepared by the Survey Management Division; Office of Coal, Nuclear, Electric and Alternate Fuels; Energy Information Administration (EIA); US Department of Energy. "The US Electric Power Industry at a Glance" section presents a profile of the electric power industry ownership and performance, and a review of key statistics for the year. Subsequent sections present data on generating capability, including proposed capability additions; net generation; fossil-fuel statistics; retail sales; revenue; financial statistics; environmental statistics; electric power transactions; demand-side management; and nonutility power producers. In addition, the appendices provide supplemental data on major disturbances and unusual occurrences in US electricity power systems. Each section contains related text and tables and refers the reader to the appropriate publication that contains more detailed data on the subject matter. Monetary values in this publication are expressed in nominal terms.
A d-statistic for single-case designs that is equivalent to the usual between-groups d-statistic.
Shadish, William R; Hedges, Larry V; Pustejovsky, James E; Boyajian, Jonathan G; Sullivan, Kristynn J; Andrade, Alma; Barrientos, Jeannette L
2014-01-01
We describe a standardised mean difference statistic (d) for single-case designs that is equivalent to the usual d in between-groups experiments. We show how it can be used to summarise treatment effects over cases within a study, to do power analyses in planning new studies and grant proposals, and to meta-analyse effects across studies of the same question. We discuss limitations of this d-statistic, and possible remedies to them. Even so, this d-statistic is better founded statistically than other effect size measures for single-case design, and unlike many general linear model approaches such as multilevel modelling or generalised additive models, it produces a standardised effect size that can be integrated over studies with different outcome measures. SPSS macros for both effect size computation and power analysis are available.
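The power-analysis use of d mentioned above can be sketched with the textbook large-sample formula for a two-sided, two-sample mean-difference test with equal group sizes. This is a normal-approximation illustration, not the authors' SPSS macro.

```python
import math

def power_two_sample(d, n_per_group, z_crit=1.959964):
    # Approximate standard error of the estimated d with equal group sizes
    se = math.sqrt(2.0 / n_per_group)
    phi = lambda x: 0.5 * (1 + math.erf(x / math.sqrt(2)))  # normal CDF
    z = d / se
    # Probability of rejecting at two-sided alpha = 0.05 given true effect d
    return (1 - phi(z_crit - z)) + phi(-z_crit - z)

# Classic benchmark: d = 0.5 with 64 participants per group gives ~80% power.
print(f"power: {power_two_sample(0.5, 64):.2f}")
```

In planning a new single-case study or grant proposal, one would solve this relation in the other direction: fix the desired power and the plausible d, then read off the required number of cases or measurements.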
Statistical inquiry on the reliability of emergency diesel stations in German nuclear power plants
International Nuclear Information System (INIS)
1983-01-01
This statistical inquiry is based on 692 occurrences in 40 diesel stations of 10 German nuclear power plants. Various parameters influencing the failure behaviour of diesel stations were investigated; only significant plant-specific influences and the impact of diesel station circuitry on failure behaviour were established. According to the results of this inquiry, running time, start-up number and increasing operational experience do not apparently influence the failure behaviour of diesel stations. The expected failure probability of diesel stations varies with the different nuclear power plants. Taking into account both start-up and operational failures (with monthly inspections and running times of up to 2 h), this value is in the range of 1.6 × 10⁻² to 1.7 × 10⁻³ per application. Considering the failure data of all diesel stations, the failure probability (start-up and operational failures) is 8.1 × 10⁻³ per application. On account of the two common-mode failures registered, a common-mode failure probability of 10⁻³ was established. The inquiry also showed that the non-availability of diesel stations is essentially determined by maintenance intervals. (orig.) [de
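A per-demand failure probability of the kind pooled in the inquiry is just a binomial point estimate over observed demands, optionally with an approximate interval. The failure and demand counts below are illustrative stand-ins, not the report's raw data.

```python
import math

# Illustrative counts: failures observed over monthly-test demands
failures, demands = 20, 2470

p_hat = failures / demands  # per-demand failure probability estimate
# Normal-approximation 95% confidence interval (adequate here since
# failures >> 5; an exact binomial interval would be used near zero counts)
se = math.sqrt(p_hat * (1 - p_hat) / demands)
low, high = p_hat - 1.96 * se, p_hat + 1.96 * se
print(f"failure probability per demand: {p_hat:.2e} "
      f"(approx. 95% CI {low:.2e} to {high:.2e})")
```

Plant-to-plant spread like the 1.6 × 10⁻² to 1.7 × 10⁻³ range quoted above would show up as per-plant estimates whose intervals do not all overlap.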
Statistical Modeling of Large-Scale Signal Path Loss in Underwater Acoustic Networks
Directory of Open Access Journals (Sweden)
Manuel Perez Malumbres
2013-02-01
Full Text Available In an underwater acoustic channel, the propagation conditions are known to vary in time, causing the deviation of the received signal strength from the nominal value predicted by a deterministic propagation model. To facilitate a large-scale system design in such conditions (e.g., power allocation), we have developed a statistical propagation model in which the transmission loss is treated as a random variable. By applying repetitive computation to the acoustic field, using ray tracing for a set of varying environmental conditions (surface height, wave activity, small node displacements around nominal locations, etc.), an ensemble of transmission losses is compiled and later used to infer the statistical model parameters. A reasonable agreement is found with the log-normal distribution, whose mean obeys a log-distance law, and whose variance appears to be constant for a certain range of inter-node distances in a given deployment location. The statistical model is deemed useful for higher-level system planning, where simulation is needed to assess the performance of candidate network protocols under various resource allocation policies, i.e., to determine the transmit power and bandwidth allocation necessary to achieve a desired level of performance (connectivity, throughput, reliability, etc.).
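The inference step described above reduces to treating the dB transmission loss as Gaussian (i.e. the linear-scale loss as log-normal) and estimating its parameters from the ensemble. In the sketch below a synthetic Gaussian ensemble stands in for the ray-tracing outputs; the mean and spread are illustrative.

```python
import random
import statistics

random.seed(7)
# Synthetic ensemble of dB transmission losses at one inter-node distance,
# standing in for repeated ray-tracing runs under varying conditions.
ensemble_db = [random.gauss(60.0, 3.0) for _ in range(5000)]

# Fitting the log-normal model amounts to estimating the mean and standard
# deviation of the dB losses.
mu_hat = statistics.mean(ensemble_db)
sigma_hat = statistics.stdev(ensemble_db)
print(f"fitted loss model: mean {mu_hat:.1f} dB, std {sigma_hat:.1f} dB")
```

Repeating this fit over a grid of inter-node distances would expose the log-distance trend in the mean and the approximately constant variance reported above.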
Modern applied statistics with s-plus
Venables, W N
1997-01-01
S-PLUS is a powerful environment for the statistical and graphical analysis of data. It provides the tools to implement many statistical ideas which have been made possible by the widespread availability of workstations having good graphics and computational capabilities. This book is a guide to using S-PLUS to perform statistical analyses and provides both an introduction to the use of S-PLUS and a course in modern statistical methods. S-PLUS is available for both Windows and UNIX workstations, and both versions are covered in depth. The aim of the book is to show how to use S-PLUS as a powerful and graphical data analysis system. Readers are assumed to have a basic grounding in statistics, and so the book is intended for would-be users of S-PLUS, and both students and researchers using statistics. Throughout, the emphasis is on presenting practical problems and full analyses of real data sets. Many of the methods discussed are state-of-the-art approaches to topics such as linear and non-linear regression models, robust a...
Bulmer, M G
1979-01-01
There are many textbooks which describe current methods of statistical analysis, while neglecting related theory. There are equally many advanced textbooks which delve into the far reaches of statistical theory, while bypassing practical applications. But between these two approaches is an unfilled gap, in which theory and practice merge at an intermediate level. Professor M. G. Bulmer's Principles of Statistics, originally published in 1965, was created to fill that need. The new, corrected Dover edition of Principles of Statistics makes this invaluable mid-level text available once again fo
Ensemble forecasting using sequential aggregation for photovoltaic power applications
International Nuclear Information System (INIS)
Thorey, Jean
2017-01-01
Our main objective is to improve the quality of photovoltaic power forecasts deriving from weather forecasts. Such forecasts are imperfect due to meteorological uncertainties and statistical modeling inaccuracies in the conversion of weather forecasts to power forecasts. First we gather several weather forecasts, secondly we generate multiple photovoltaic power forecasts, and finally we build linear combinations of the power forecasts. The minimization of the Continuous Ranked Probability Score (CRPS) makes it possible to statistically calibrate the combination of these forecasts, and provides probabilistic forecasts in the form of a weighted empirical distribution function. We investigate the CRPS bias in this context and several properties of scoring rules which can be seen as a sum of quantile-weighted losses or a sum of threshold-weighted losses. The minimization procedure is achieved with online learning techniques. Such techniques come with theoretical guarantees of robustness on the predictive power of the combination of the forecasts. Essentially no assumptions are needed for the theoretical guarantees to hold. The proposed methods are applied to the forecast of solar radiation using satellite data, and the forecast of photovoltaic power based on high-resolution weather forecasts and standard ensembles of forecasts. (author) [fr
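The score minimized during calibration has a simple empirical form for a weighted ensemble: CRPS = Σᵢ wᵢ|xᵢ − y| − ½ Σᵢⱼ wᵢwⱼ|xᵢ − xⱼ|. The sketch below computes it for illustrative members, weights, and observation; it is a definition check, not the thesis' online-learning procedure.

```python
import itertools

def crps(members, weights, obs):
    # Empirical CRPS of a weighted ensemble forecast against one observation
    term1 = sum(w * abs(x - obs) for x, w in zip(members, weights))
    term2 = 0.5 * sum(
        wi * wj * abs(xi - xj)
        for (xi, wi), (xj, wj)
        in itertools.product(list(zip(members, weights)), repeat=2)
    )
    return term1 - term2

members = [4.0, 5.0, 6.5]       # e.g. photovoltaic power forecasts, MW
weights = [1 / 3, 1 / 3, 1 / 3]  # equal weights before calibration
print(f"CRPS: {crps(members, weights, obs=5.2):.3f}")
```

Calibration then adjusts the weights online, day after day, to drive the average of this score down over the forecast history.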
Cyber security threats in the power sector: Need for a domain specific regulatory framework in India
International Nuclear Information System (INIS)
Ananda Kumar, V.; Pandey, Krishan K.; Punia, Devendra Kumar
2014-01-01
India is poised to spend over USD 5.8 billion as part of the National Smart Grid Mission aimed to alleviate India's ailing power sector as part of its 12th Five year plan (2012–2017). The federal government sponsored Restructured Accelerated Power Development and Reforms Program (R-APDRP) is also focused on building ICT capability in the state electricity boards. Presently however, there is no power sector specific cyber security mandates or policies in India. The Stuxnet, Shamoon and Anonymous incidents have shown that cyber attacks can cause significant damage and pose a risk to National Critical Infrastructure. A lack of security planning as part of designing the Smart grids can potentially leave gaping holes in the country's power sector stability. The paper highlights key cyber security threats across the entire power sector value chain—from generation, to transmission and distribution. It is aimed at building the case for power sector specific cyber security regulations based on the experience of regulators in other critical infrastructure sectors like Banking and Telecom in India and power sector regulations internationally. - Highlights: • Cyber security in power sector is key to protecting national critical infrastructure. • Poor cyber security planning would impact the power sector in India. • A laissez-faire approach to cyber security in power sector may not yield results. • There is a need for power sector specific cyber security regulations
Energy Technology Data Exchange (ETDEWEB)
Hacke, P.; Spataru, S.
2014-08-01
We propose a method for increasing the frequency of data collection and reducing the time and cost of accelerated lifetime testing of photovoltaic modules undergoing potential-induced degradation (PID). This consists of in-situ measurements of dark current-voltage curves of the modules at elevated stress temperature, their use to determine the maximum power at 25 degrees C standard test conditions (STC), and distribution statistics for determining degradation rates as a function of stress level. The semi-continuous data obtained by this method clearly show degradation curves of the maximum power, including an incubation phase, rates and extent of degradation, precise time to failure, and partial recovery. Stress tests were performed on crystalline silicon modules at 85% relative humidity and 60 degrees C, 72 degrees C, and 85 degrees C. Activation energy for the mean time to failure (1% relative) of 0.85 eV was determined and a mean time to failure of 8,000 h at 25 degrees C and 85% relative humidity is predicted. No clear trend in maximum degradation as a function of stress temperature was observed.
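The projection to use conditions quoted above is an Arrhenius extrapolation. The sketch below uses the reported Ea = 0.85 eV; the 85 degrees C mean time to failure is an illustrative stand-in chosen to land near the reported 8,000 h projection, not a figure from the paper.

```python
import math

K_B_EV = 8.617e-5  # Boltzmann constant, eV/K

def ttf_extrapolate(ttf_ref_h, t_ref_c, t_target_c, ea_ev=0.85):
    # Arrhenius scaling of mean time to failure from a reference stress
    # temperature to a target temperature (same humidity assumed).
    t_ref = t_ref_c + 273.15
    t_target = t_target_c + 273.15
    return ttf_ref_h * math.exp(ea_ev / K_B_EV * (1 / t_target - 1 / t_ref))

# Illustrative: ~31 h to failure at 85 C projects to roughly 8,000 h at 25 C.
projected = ttf_extrapolate(ttf_ref_h=31.0, t_ref_c=85.0, t_target_c=25.0)
print(f"projected mean time to failure at 25 C: {projected:.0f} h")
```

Fitting Ea itself would use the times to failure measured at 60, 72, and 85 degrees C, regressed as log(TTF) against 1/T.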
Directory of Open Access Journals (Sweden)
David Normando
2010-02-01
Full Text Available Selecting appropriate methods for statistical analysis may be difficult, especially for graduate students and others in the early phases of a research career. On the other hand, PowerPoint presentation is a very common tool for researchers and dental students, so a statistical guide based on PowerPoint could narrow the gap between the orthodontist and Biostatistics. This guide provides objective and useful information about several statistical methods, using examples related to the dental field and, more specifically, to Orthodontics. It should be used mainly to help the user find answers to common questions regarding Biostatistics, such as the most appropriate statistical test to compare groups, to examine correlations and regressions, or to analyze the method error. Help can also be obtained for checking the distribution of the data (normal or not) and choosing the most suitable graph to present the results. This guide may also be quite useful for journal reviewers to quickly examine the adequacy of the statistical method presented in a submitted manuscript.
Excel 2010 for health services management statistics a guide to solving practical problems
Quirk, Thomas J
2014-01-01
This is the first book to show the capabilities of Microsoft Excel to teach health services management statistics effectively. It is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical health services management problems. If understanding statistics isn’t your strongest suit, you are not especially mathematically-inclined, or if you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in health services management courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2010 for Health Services Management Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply Excel to statistical techniques necessary in their courses and work....
Excel 2013 for human resource management statistics a guide to solving practical problems
Quirk, Thomas J
2016-01-01
This book shows how Microsoft Excel is able to teach human resource management statistics effectively. Similar to the previously published Excel 2010 for Human Resource Management Statistics, it is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical human resource management problems. If understanding statistics isn’t your strongest suit, you are not especially mathematically-inclined, or if you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in human resource management courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2013 for Human Resource Management Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply Excel to ...
Excel 2016 for human resource management statistics a guide to solving practical problems
Quirk, Thomas J
2016-01-01
This book shows the capabilities of Microsoft Excel in teaching human resource management statistics effectively. Similar to the previously published Excel 2013 for Human Resource Management Statistics, this book is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical human resource management problems. If understanding statistics isn’t your strongest suit, you are not especially mathematically-inclined, or if you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in human resource management courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2016 for Human Resource Management Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how ...
Excel 2013 for health services management statistics a guide to solving practical problems
Quirk, Thomas J
2016-01-01
This book shows the capabilities of Microsoft Excel to teach health services management statistics effectively. Similar to the previously published Excel 2010 for Health Services Management Statistics, this book is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical health services management problems. If understanding statistics isn’t your strongest suit, you are not especially mathematically-inclined, or if you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in health services management courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2013 for Health Services Management Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers ho...
Excel 2010 for human resource management statistics a guide to solving practical problems
Quirk, Thomas J
2014-01-01
This is the first book to show the capabilities of Microsoft Excel to teach human resource management statistics effectively. It is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical human resource management problems. If understanding statistics isn’t your strongest suit, you are not especially mathematically-inclined, or if you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in human resource management courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2010 for Human Resource Management Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply Excel to statistical techniques necessary in their courses and ...
Informing Evidence Based Decisions: Usage Statistics for Online Journal Databases
Directory of Open Access Journals (Sweden)
Alexei Botchkarev
2017-06-01
Full Text Available Abstract Objective – The primary objective was to examine online journal database usage statistics for a provincial ministry of health in the context of evidence based decision-making. In addition, the study highlights implementation of the Journal Access Centre (JAC), which is housed and powered by the Ontario Ministry of Health and Long-Term Care (MOHLTC) to inform health systems policy-making. Methods – This was a prospective case study using descriptive analysis of the JAC usage statistics of journal articles from January 2009 to September 2013. Results – JAC enables ministry employees to access approximately 12,000 journals with full-text articles. JAC usage statistics for the 2011-2012 calendar years demonstrate a steady level of search activity, with a monthly average of 5,129 searches. In 2009-2013, a total of 4,759 journal titles were accessed, including 1,675 journals with full text. Usage statistics demonstrate that actual consumption was over 12,790 downloaded full-text articles, or approximately 2,700 articles annually. Conclusion – The steady level of activity revealed by the study reflects continuous demand for JAC services and products, and shows that access to online journal databases has become part of routine government knowledge management processes. MOHLTC’s broad area of responsibilities, with dynamically changing priorities, translates into diverse information needs among its employees and a large set of required journals. Usage statistics indicate that MOHLTC information needs cannot be mapped to a reasonably compact set of “core” journals with a subsequent subscription to those.
Statistical power analysis for the behavioral sciences
National Research Council Canada - National Science Library
Cohen, Jacob
1988-01-01
... offers a unifying framework and some new data-analytic possibilities. 2. A new chapter (Chapter 11) considers some general topics in power analysis in more integrated form than is possible in the earlier...
Statistical power analysis for the behavioral sciences
National Research Council Canada - National Science Library
Cohen, Jacob
1988-01-01
.... A chapter has been added for power analysis in set correlation and multivariate methods (Chapter 10). Set correlation is a realization of the multivariate general linear model, and incorporates the standard multivariate methods...
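The two Cohen records above concern statistical power analysis. As a rough illustrative sketch (not Cohen's exact tables, which use the noncentral t distribution), the normal approximation to the power of a two-sided, two-sample comparison of means can be computed as follows; the function name and defaults are our own:

```python
from math import sqrt
from statistics import NormalDist

_N = NormalDist()

def two_sample_power(d, n_per_group, alpha=0.05):
    """Normal-approximation power of a two-sided, two-sample test of a
    mean difference, for standardized effect size d (Cohen's d)."""
    z_alpha = _N.inv_cdf(1 - alpha / 2)
    ncp = d * sqrt(n_per_group / 2)      # noncentrality under H1
    # Probability of landing in either rejection region under H1:
    return _N.cdf(ncp - z_alpha) + _N.cdf(-ncp - z_alpha)
```

Cohen's tables give power of about .80 for a medium effect (d = 0.5) with 64 subjects per group at a two-tailed alpha of .05; the normal approximation lands close to that value.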
Need for a probabilistic fire analysis at nuclear power plants
International Nuclear Information System (INIS)
Calabuig Beneyto, J. L.; Ibanez Aparicio, J.
1993-01-01
Although fire protection standards for nuclear power plants cover a wide scope and are constantly being updated, certain constraints make it difficult to precisely evaluate plant response to different postulated fires. These constraints involve limitations such as: - Physical obstacles which impede the implementation of standards in certain cases; - Absence of general standards covering all the situations which could arise in practice; - Possible temporary noncompliance with safety measures owing to unforeseen circumstances; - The fact that a fire protection standard cannot take into account additional damage occurring simultaneously with the fire. Based on the experience of the ASCO NPP PSA developed within the framework of the joint venture INITEC-INYPSA-EMPRESARIOS AGRUPADOS, this paper seeks to justify the need for a probabilistic analysis to overcome the limitations detected in general application of prevailing standards. (author)
Sex differences in discriminative power of volleyball game-related statistics.
João, Paulo Vicente; Leite, Nuno; Mesquita, Isabel; Sampaio, Jaime
2010-12-01
To identify sex differences in volleyball game-related statistics, the game-related statistics of several World Championships in 2007 (N=132) were analyzed using the software VIS from the International Volleyball Federation. Discriminant analysis was used to identify the game-related statistics which best discriminated performances by sex. The analysis emphasized fault serves (SC = -.40), shot spikes (SC = .40), and reception digs (SC = .31). Considerable variability was evident in the game-related statistics profiles: men's volleyball games were better associated with terminal actions (errors of service), whereas women's volleyball games were characterized by continuous actions (in defense and attack). These differences may be related to the anthropometric and physiological differences between women and men and their influence on performance profiles.
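The discriminant analysis reported above can be sketched for the two-group case. The function below is an illustrative implementation of a Fisher linear discriminant, not the Federation's VIS software; the structure coefficients (SCs) quoted in the abstract are the correlations between each game statistic and the discriminant scores:

```python
import numpy as np

def fisher_discriminant(X1, X2):
    """Two-group linear discriminant: weight vector w = S_pooled^{-1}(m1 - m2),
    plus structure coefficients (correlation of each variable with the
    discriminant scores)."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    n1, n2 = len(X1), len(X2)
    S1 = np.cov(X1, rowvar=False)
    S2 = np.cov(X2, rowvar=False)
    S_pooled = ((n1 - 1) * S1 + (n2 - 1) * S2) / (n1 + n2 - 2)
    w = np.linalg.solve(S_pooled, m1 - m2)
    # Structure coefficients: variable/discriminant-score correlations
    X = np.vstack([X1, X2])
    scores = X @ w
    sc = np.array([np.corrcoef(X[:, j], scores)[0, 1] for j in range(X.shape[1])])
    return w, sc
```

With synthetic data in which only the first variable differs between groups, that variable receives the largest absolute structure coefficient, mirroring how the study ranked fault serves, shot spikes, and reception digs.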
Energy statistics: Fourth quarter, 1989
International Nuclear Information System (INIS)
Anon.
1989-01-01
This volume contains 100 tables compiling data into the following broad categories: energy, drilling, natural gas, gas liquids, oil, coal, peat, electricity, uranium, and business indicators. The types of data that are given include production and consumption statistics, reserves, imports and exports, prices, fossil fuel and nuclear power generation statistics, and price indices
Safety aspects of nuclear power stations
International Nuclear Information System (INIS)
Binner, W.
1980-01-01
Psychological aspects of the fear of nuclear power are discussed, cancer deaths due to a nuclear accident are predicted and the need for nuclear accident prevention is stressed. A simplified analysis of the safety precautions in a generalised nuclear power station is offered, with reference to loss-of-coolant incidents, and developments in reactor design for fail-safe modes are explained. The importance of learning from the Three Mile Island incident is noted and failure statistics are presented. Tasks to be undertaken at the Austrian Zwentendorf nuclear power station are listed, including improved quality control and acoustic detectors. Precautions against earthquakes are also discussed and it is stated that safe operation of the Zwentendorf station will be achieved. (G.M.E.)
International Nuclear Information System (INIS)
2005-01-01
For the years 2004 and 2005, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics published in the Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time series over a longer period (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004). The applied energy units and conversion coefficients are shown on the back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: changes in GDP, energy consumption and electricity consumption; carbon dioxide emissions from fossil fuel use; coal consumption; consumption of natural gas; peat consumption; domestic oil deliveries; import prices of oil; consumer prices of principal oil products; fuel prices in heat production; fuel prices in electricity production; price of electricity by type of consumer; average monthly spot prices at the Nord Pool power exchange; total energy consumption by source and CO2 emissions; supplies and total consumption of electricity (GWh); energy imports by country of origin in January-June 2003; energy exports by recipient country in January-June 2003; consumer prices of liquid fuels; consumer prices of hard coal, natural gas and indigenous fuels; price of natural gas by type of consumer; price of electricity by type of consumer; price of district heating by type of consumer; excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and energy taxes, precautionary stock fees and oil pollution fees.
Tellinghuisen, Joel
2008-01-01
The method of least squares is probably the most powerful data analysis tool available to scientists. Toward a fuller appreciation of that power, this work begins with an elementary review of statistics fundamentals, and then progressively increases in sophistication as the coverage is extended to the theory and practice of linear and nonlinear least squares. The results are illustrated in application to data analysis problems important in the life sciences. The review of fundamentals includes the role of sampling and its connection to probability distributions, the Central Limit Theorem, and the importance of finite variance. Linear least squares is presented using matrix notation, and the significance of the key probability distributions (Gaussian, chi-square, and t) is illustrated with Monte Carlo calculations. The meaning of correlation is discussed, including its role in the propagation of error. When the data themselves are correlated, special methods are needed for the fitting, as they are also when fitting with constraints. Nonlinear fitting gives rise to nonnormal parameter distributions, but the 10% Rule of Thumb suggests that such problems will be insignificant when the parameter is sufficiently well determined. Illustrations include calibration with linear and nonlinear response functions, the dangers inherent in fitting inverted data (e.g., the Lineweaver-Burk equation), an assessment of the reliability of the van't Hoff analysis, the problem of correlated data in the Guggenheim method, and the optimization of isothermal titration calorimetry procedures using the variance-covariance matrix for experiment design. The work concludes with illustrations on assessing and presenting results.
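The matrix formulation of linear least squares described above, including the variance-covariance matrix used for parameter error estimates, can be sketched in a few lines (a minimal illustration of the standard formulas, not the article's own code):

```python
import numpy as np

def linear_lsq(X, y):
    """Ordinary linear least squares in matrix form: beta = (X'X)^{-1} X'y,
    with parameter standard errors taken from the diagonal of the
    variance-covariance matrix s^2 (X'X)^{-1}."""
    XtX = X.T @ X
    beta = np.linalg.solve(XtX, X.T @ y)
    resid = y - X @ beta
    dof = X.shape[0] - X.shape[1]      # n observations minus p parameters
    s2 = resid @ resid / dof           # residual variance estimate
    cov = s2 * np.linalg.inv(XtX)
    return beta, np.sqrt(np.diag(cov))
```

Fitting y = 2 + 3x on exact data recovers the coefficients with near-zero standard errors; with noisy data, the same covariance matrix supports the error-propagation and experiment-design uses the article describes.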
Guidelines for the design and statistical analysis of experiments in papers submitted to ATLA.
Festing, M F
2001-01-01
In vitro experiments need to be well designed and correctly analysed if they are to achieve their full potential to replace the use of animals in research. An "experiment" is a procedure for collecting scientific data in order to answer a hypothesis, or to provide material for generating new hypotheses, and differs from a survey because the scientist has control over the treatments that can be applied. Most experiments can be classified into one of a few formal designs, the most common being completely randomised, and randomised block designs. These are quite common with in vitro experiments, which are often replicated in time. Some experiments involve a single independent (treatment) variable, while other "factorial" designs simultaneously vary two or more independent variables, such as drug treatment and cell line. Factorial designs often provide additional information at little extra cost. Experiments need to be carefully planned to avoid bias, be powerful yet simple, provide for a valid statistical analysis and, in some cases, have a wide range of applicability. Virtually all experiments need some sort of statistical analysis in order to take account of biological variation among the experimental subjects. Parametric methods using the t test or analysis of variance are usually more powerful than non-parametric methods, provided the underlying assumptions of normality of the residuals and equal variances are approximately valid. The statistical analyses of data from a completely randomised design, and from a randomised-block design are demonstrated in Appendices 1 and 2, and methods of determining sample size are discussed in Appendix 3. Appendix 4 gives a checklist for authors submitting papers to ATLA.
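For a completely randomised design such as those discussed above, the analysis of variance reduces to comparing between-group and within-group mean squares. The function below is our own minimal sketch of that computation, not the worked example from the paper's Appendix 1:

```python
import numpy as np

def one_way_anova(groups):
    """F statistic for a completely randomised (one-way) design:
    ratio of between-group to within-group mean squares."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = np.concatenate([np.asarray(g, dtype=float) for g in groups]).mean()
    ss_between = sum(len(g) * (np.mean(g) - grand_mean) ** 2 for g in groups)
    ss_within = sum(((np.asarray(g, dtype=float) - np.mean(g)) ** 2).sum() for g in groups)
    df_between, df_within = k - 1, n - k
    F = (ss_between / df_between) / (ss_within / df_within)
    return F, (df_between, df_within)
```

The F statistic is then compared against the F distribution with (k - 1, n - k) degrees of freedom; a randomised-block design would additionally remove a block sum of squares from the within-group term.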
Danish electricity supply. Statistics 2003
International Nuclear Information System (INIS)
2004-01-01
The Association of Danish Electric Utilities each year issues the statistical yearbook 'Danish electricity supply'. By means of brief text, figures, and tables a description is given of the electric supply sector. The report presents data for the year 2003 for consumption, prices of electric power, power generation and transmission, and trade. (ln)
Danish electricity supply. Statistics 2000
International Nuclear Information System (INIS)
2001-07-01
The Association of Danish Electric Utilities each year issues the statistical yearbook 'Danish electricity supply'. By means of brief text, figures, and tables a description is given of the electric supply sector. The report presents data for the year 2000 for consumption, prices of electric power, power generation and transmission, and trade. (ln)
Danish electricity supply. Statistics 2002
International Nuclear Information System (INIS)
2003-01-01
The Association of Danish Electric Utilities each year issues the statistical yearbook 'Danish electricity supply'. By means of brief text, figures, and tables a description is given of the electric supply sector. The report presents data for the year 2002 for consumption, prices of electric power, power generation and transmission, and trade. (ln)
Statistics for nuclear engineers and scientists. Part 1. Basic statistical inference
Energy Technology Data Exchange (ETDEWEB)
Beggs, W.J.
1981-02-01
This report is intended for the use of engineers and scientists working in the nuclear industry, especially at the Bettis Atomic Power Laboratory. It serves as the basis for several Bettis in-house statistics courses. The objectives of the report are to introduce the reader to the language and concepts of statistics and to provide a basic set of techniques to apply to problems of the collection and analysis of data. Part 1 covers subjects of basic inference. The subjects include: descriptive statistics; probability; simple inference for normally distributed populations, and for non-normal populations as well; comparison of two populations; the analysis of variance; quality control procedures; and linear regression analysis.
Gontscharuk, Veronika; Landwehr, Sandra; Finner, Helmut
2015-01-01
The higher criticism (HC) statistic, which can be seen as a normalized version of the famous Kolmogorov-Smirnov statistic, has a long history, dating back to the mid-seventies. Originally, HC statistics were used in connection with goodness-of-fit (GOF) tests, but they recently gained some attention in the context of testing the global null hypothesis in high-dimensional data. The continuing interest in HC seems to be inspired by a series of nice asymptotic properties related to this statistic. For example, unlike Kolmogorov-Smirnov tests, GOF tests based on the HC statistic are known to be asymptotically sensitive in the moderate tails, hence they are favorably applied for detecting the presence of signals in sparse mixture models. However, some questions around the asymptotic behavior of the HC statistic are still open. We focus on two of them, namely, why a specific intermediate range is crucial for GOF tests based on the HC statistic, and why the convergence of the HC distribution to the limiting one is extremely slow. Moreover, the inconsistency between the asymptotic and finite-sample behavior of the HC statistic prompts us to provide a new HC test that has better finite-sample properties than the original HC test while showing the same asymptotics. This test is motivated by the asymptotic behavior of the so-called local levels related to the original HC test. By means of numerical calculations and simulations, we show that the new HC test is typically more powerful than the original HC test in normal mixture models. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
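A minimal sketch of the higher criticism statistic in its standard Donoho-Jin form may help fix ideas; the paper's refined variant differs in its local-level calibration, so this is only the textbook version:

```python
import numpy as np

def higher_criticism(pvals, alpha0=0.5):
    """Standard (Donoho-Jin) higher criticism: the maximum standardized gap
    between the empirical and uniform p-value CDFs, taken over the smallest
    alpha0 fraction of the sorted p-values."""
    p = np.sort(np.asarray(pvals, dtype=float))
    n = len(p)
    i = np.arange(1, n + 1)
    hc = np.sqrt(n) * (i / n - p) / np.sqrt(p * (1 - p))
    k = max(1, int(alpha0 * n))
    return hc[:k].max()
```

Planting a handful of very small p-values in an otherwise uniform sample inflates the statistic sharply, which is why HC is well suited to sparse mixture alternatives.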
Boslaugh, Sarah
2013-01-01
Need to learn statistics for your job? Want help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference for anyone new to the subject. Thoroughly revised and expanded, this edition helps you gain a solid understanding of statistics without the numbing complexity of many college texts. Each chapter presents easy-to-follow descriptions, along with graphics, formulas, solved examples, and hands-on exercises. If you want to perform common statistical analyses and learn a wide range of techniques without getting in over your head, this is your book.
Electric power annual 1995. Volume II
Energy Technology Data Exchange (ETDEWEB)
NONE
1996-12-01
This document summarizes pertinent statistics on various aspects of the U.S. electric power industry for the year and includes a graphic presentation. Data is included on electric utility retail sales and revenues, financial statistics, environmental statistics of electric utilities, demand-side management, electric power transactions, and non-utility power producers.
Basic statistics an introduction with R
Raykov, Tenko
2012-01-01
Basic Statistics provides an accessible and comprehensive introduction to statistics using the free, state-of-the-art, powerful software program R. The book is designed both to introduce students to key concepts in statistics and to provide simple instructions for using R. It teaches essential concepts in statistics, assuming little background knowledge on the part of the reader; introduces students to R with as few sub-commands as possible for ease of use; and provides practical examples from the educational, behavioral, and social sciences. Basic Statistics will appeal to students and professionals across...
Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James
2014-01-01
Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.
International Nuclear Information System (INIS)
1992-01-01
The Electric Power Annual presents a summary of electric utility statistics at the national, regional and State levels. The objective of the publication is to provide industry decision-makers, government policy-makers, analysts and the general public with historical data that may be used in understanding US electricity markets. The "Industry at a Glance" section presents a profile of the electric power industry ownership and performance; a review of key statistics for the year; and projections for various aspects of the electric power industry through 2010. Subsequent sections present data on generating capability, including proposed capability additions; net generation; fossil-fuel statistics; electricity sales, revenue, and average revenue per kilowatthour sold; financial statistics; environmental statistics; and electric power transactions. In addition, appendices provide supplemental data on major disturbances and unusual occurrences. Each section contains related text and tables and refers the reader to the appropriate publication that contains more detailed data on the subject matter.
Wind power error estimation in resource assessments.
Directory of Open Access Journals (Sweden)
Osvaldo Rodríguez
Full Text Available Estimating the power output is one of the elements that determine the techno-economic feasibility of a renewable energy project. At present, there is a need to develop reliable methods that achieve this goal, thereby contributing to wind power penetration. In this study, we propose a method for wind power error estimation based on the wind speed measurement error, the wind speed probability density function, and wind turbine power curves. The method uses actual wind speed data, without prior statistical treatment, and 28 wind turbine power curves fitted by Lagrange's method to calculate the estimated wind power output and the corresponding error propagation. We found that wind speed percentage errors of 10% were propagated into the power output estimates, yielding an error of 5%. The proposed error propagation complements traditional power resource assessments. The wind power estimation error also allows us to estimate intervals for the levelized cost of power production or the investment payback time. The implementation of this method increases the reliability of techno-economic resource assessment studies.
Wind power error estimation in resource assessments.
Rodríguez, Osvaldo; Del Río, Jesús A; Jaramillo, Oscar A; Martínez, Manuel
2015-01-01
Estimating the power output is one of the elements that determine the techno-economic feasibility of a renewable energy project. At present, there is a need to develop reliable methods that achieve this goal, thereby contributing to wind power penetration. In this study, we propose a method for wind power error estimation based on the wind speed measurement error, the wind speed probability density function, and wind turbine power curves. The method uses actual wind speed data, without prior statistical treatment, and 28 wind turbine power curves fitted by Lagrange's method to calculate the estimated wind power output and the corresponding error propagation. We found that wind speed percentage errors of 10% were propagated into the power output estimates, yielding an error of 5%. The proposed error propagation complements traditional power resource assessments. The wind power estimation error also allows us to estimate intervals for the levelized cost of power production or the investment payback time. The implementation of this method increases the reliability of techno-economic resource assessment studies.
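The propagation step behind these two records can be sketched to first order. The cubic power curve below is a hypothetical stand-in, not one of the study's Lagrange-fitted curves; note that at a single speed in the cubic region the relative power error is three times the relative speed error, whereas the study's smaller 5% figure plausibly reflects its additional weighting by the wind-speed probability density and the flat rated region of real curves, where the derivative vanishes:

```python
import numpy as np

def power_output_error(v, dv_rel, curve, h=1e-4):
    """First-order propagation of a relative wind-speed error through a
    power curve P(v): dP ~ |dP/dv| * dv, with the derivative taken
    numerically by central differences."""
    dPdv = (curve(v + h) - curve(v - h)) / (2 * h)
    return abs(dPdv) * (dv_rel * v)

def cubic_curve(v, rated_v=12.0, rated_p=2000.0):
    """Hypothetical turbine power curve: cubic below rated speed, flat above."""
    v = np.clip(v, 0.0, rated_v)
    return rated_p * (v / rated_v) ** 3
```

At v = 6 m/s this hypothetical curve yields 250 kW, and a 10% speed error propagates to a 30% power error, illustrating why averaging over the full speed distribution matters for realistic assessments.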
MacKenzie, Dana
2004-01-01
The drawbacks of using 19th-century mathematics in physics and astronomy are illustrated. To continue expanding knowledge about the cosmos, scientists will have to come to terms with modern statistics. Some researchers have deliberately started importing techniques that are used in medical research. However, physicists need to identify the brand of statistics that will be suitable for them, and make a choice between the Bayesian and frequentist approaches. (Edited abstract).
Neave, Henry R
2012-01-01
This book, designed for students taking a basic introductory course in statistical analysis, is far more than just a book of tables. Each table is accompanied by a careful but concise explanation and useful worked examples. Requiring little mathematical background, Elementary Statistics Tables is thus not just a reference book but a positive and user-friendly teaching and learning aid. The new edition contains a new and comprehensive "teach-yourself" section on a simple but powerful approach, now well known in parts of industry but less so in academia, to analysing and interpreting process data...
Meeting the Pacific Rim's changing electric power needs
International Nuclear Information System (INIS)
Hammons, T.J.
1994-01-01
This article describes the presentations made at the 1994 Asian Electric Conference. The topics discussed in detail include a successfully implemented strategy for building power projects, a review of the reforms taking place as Australia moves toward a competitive national electricity market by July 1995, the reorganizing of Japan's electric power market, and the electricity reform program in Pakistan
Excel 2016 for biological and life sciences statistics a guide to solving practical problems
Quirk, Thomas J; Horton, Howard F
2016-01-01
This book is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical biological and life science problems. If understanding statistics isn’t your strongest suit, you are not especially mathematically-inclined, or if you are wary of computers, this is the right book for you. Excel is an effective learning tool for quantitative analyses in biological and life sciences courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2016 for Biological and Life Sciences Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply Excel 2016 to statistical techniques necessary in their courses and work. Each chapter explains statistical formulas and directs the reader to use Excel commands to solve specific, easy-to-understand biological and life science problems. Practice problems are provided...
Statistical methods in nuclear material accountancy: Past, present and future
International Nuclear Information System (INIS)
Pike, D.J.; Woods, A.J.
1983-01-01
The analysis of nuclear material inventory data is motivated by the desire to detect any loss or diversion of nuclear material, insofar as such detection may be feasible by statistical analysis of repeated inventory and throughput measurements. The early regulations, which laid down the specifications for the analysis of inventory data, were framed without acknowledging the essentially sequential nature of the data. It is the broad aim of this paper to discuss the historical nature of statistical analysis of inventory data including an evaluation of why statistical methods should be required at all. If it is accepted that statistical techniques are required, then two main areas require extensive discussion. First, it is important to assess the extent to which stated safeguards aims can be met in practice. Second, there is a vital need for reassessment of the statistical techniques which have been proposed for use in nuclear material accountancy. Part of this reassessment must involve a reconciliation of the apparent differences in philosophy shown by statisticians; but, in addition, the techniques themselves need comparative study to see to what extent they are capable of meeting realistic safeguards aims. This paper contains a brief review of techniques with an attempt to compare and contrast the approaches. It will be suggested that much current research is following closely similar lines, and that national and international bodies should encourage collaborative research and practical in-plant implementations. The techniques proposed require credibility and power; but at this point in time statisticians require credibility and a greater level of unanimity in their approach. A way ahead is proposed based on a clear specification of realistic safeguards aims, and a development of a unified statistical approach with encouragement for the performance of joint research. (author)
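Among the sequential techniques such a reassessment covers is the cumulative-sum (CUSUM) test applied to a sequence of standardized material-balance (MUF) results. The sketch below is a schematic one-sided Page CUSUM; the parameter values k and h are illustrative, not regulatory thresholds:

```python
def cusum(values, k=0.5, h=4.0):
    """One-sided Page CUSUM on a sequence of standardized material-balance
    (MUF) results: raises an alarm at the first period where the cumulative
    excess over the reference value k crosses the decision threshold h."""
    s = 0.0
    for t, x in enumerate(values, start=1):
        s = max(0.0, s + x - k)   # accumulate only deviations above k
        if s > h:
            return t              # first alarm period
    return None                   # no alarm raised
```

Because the statistic accumulates across inventory periods, a sustained small loss that no single balance would flag can still trigger an alarm, which is precisely the sequential character the early regulations failed to acknowledge.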
Energy Technology Data Exchange (ETDEWEB)
Matausek, M V; Kocic, A; Marinkovic, N; Milosevic, M; Stancic, V [Boris Kidric Institute of nuclear sciences Vinca, Belgrade (Yugoslavia)
1978-07-01
Within the Nuclear Engineering Laboratory of the Boris Kidric Institute of Nuclear Sciences (NET IBK), systematic work has been performed on collecting nuclear data for reactor calculation needs, on developing our own methods and computer programs for reactor calculations, and on adapting and applying foreign methods and codes. In this way a complete library of computer programs was formed for precise prediction of nuclear fuel burnup and depletion, for evaluating the variation of power distribution with irradiation, for computing the amount of plutonium produced and its number densities, etc. Programs for site evaluation and for different types of safety and economic analysis have been developed as well. The aim of this paper is to present our capability to perform the complex computations needed for planning, constructing and operating nuclear power plants, by describing the NET IBK computer program package. (author)
Koenig, Thomas; Kottlow, Mara; Stein, Maria; Melie-García, Lester
2011-01-01
We present a program (Ragu; Randomization Graphical User interface) for statistical analyses of multichannel event-related EEG and MEG experiments. Based on measures of scalp field differences including all sensors, and using powerful, assumption-free randomization statistics, the program yields robust, physiologically meaningful conclusions based on the entire, untransformed, and unbiased set of measurements. Ragu accommodates up to two within-subject factors and one between-subject factor with multiple levels each. Significance is computed as a function of time and can be controlled for type II errors with overall analyses. Results are displayed in an intuitive visual interface that allows further exploration of the findings. A sample analysis of an ERP experiment illustrates the different possibilities offered by Ragu. The aim of Ragu is to maximize statistical power while minimizing the need for a-priori choices of models and parameters (like inverse models or sensors of interest) that interact with and bias statistics.
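A minimal sketch of the randomization logic such tools rest on: shuffle condition labels and recompute a test statistic, here a scalar difference of means (the function and data are illustrative, not part of the Ragu interface; Ragu applies the same assumption-free idea to whole scalp-field measures).

```python
import random

def permutation_pvalue(group_a, group_b, n_perm=2000, seed=0):
    """Two-sample permutation test on the absolute difference of means --
    the same label-shuffling logic that randomization statistics apply
    to multichannel scalp-field measures."""
    rng = random.Random(seed)
    mean = lambda xs: sum(xs) / len(xs)
    observed = abs(mean(group_a) - mean(group_b))
    pooled = list(group_a) + list(group_b)
    n_a = len(group_a)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # break any real group structure
        if abs(mean(pooled[:n_a]) - mean(pooled[n_a:])) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)  # add-one correction keeps p > 0

# Clearly separated groups yield a small p-value:
print(permutation_pvalue([1, 1, 2, 2, 1], [5, 6, 5, 6, 5]))
```

Because the null distribution is built from the data themselves, no distributional assumption (normality, sphericity) is needed, which is what makes the approach attractive for high-dimensional sensor data.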
Annual statistical information 1996; Informe estatistico anual 1996
Energy Technology Data Exchange (ETDEWEB)
NONE
1997-12-31
This annual statistical report aims to disseminate information on the evolution of the generation, transmission and distribution systems and on the electric power market of Parana State, Brazil, in 1996. Electric power consumption in the distribution area of the Parana Power Company (COPEL) grew by about 6.7%. Electric power production in the COPEL plants was 42.2% higher than in 1995, owing to the flows recorded in the Iguacu river and to the long period of reduced inflow experienced by the reservoirs of the Southern region during the year. This report presents statistical data on the following topics: a) electric power balance of Parana State; b) electric power balance of COPEL - own generation, interchange, electric power requirement, direct distribution and the electric system. 6 graphs, 3 maps, 61 tabs.; e-mail: splcnmr at mail.copel.br
Testing and qualification of confidence in statistical procedures
Energy Technology Data Exchange (ETDEWEB)
Serghiuta, D.; Tholammakkil, J.; Hammouda, N. [Canadian Nuclear Safety Commission (Canada); O' Hagan, A. [Sheffield Univ. (United Kingdom)
2014-07-01
This paper discusses a framework for designing artificial test problems, evaluation criteria, and two of the benchmark tests developed under a research project initiated by the Canadian Nuclear Safety Commission to investigate approaches for the qualification of tolerance limit methods and algorithms proposed for application in optimization of CANDU regional/neutron overpower protection trip setpoints for aged conditions. A significant component of this investigation has been the development of a series of benchmark problems of gradually increasing complexity, from simple 'theoretical' problems up to complex problems closer to the real application. The first benchmark problem discussed in this paper is a simplified scalar problem which does not involve the extremal (maximum or minimum) operations typically encountered in real applications. The second benchmark is a high-dimensional, but still simple, problem for statistical inference of maximum channel power during normal operation. Bayesian algorithms have been developed for each benchmark problem to provide an independent way of constructing tolerance limits from the same data, to allow assessing how well different methods make use of those data and, depending on the type of application, to evaluate what the level of 'conservatism' is. The Bayesian method is not, however, used as a reference method, or 'gold' standard, but simply as an independent review method. The approach and the tests developed can be used as a starting point for developing a generic suite (generic in the sense of potentially applying whatever the proposed statistical method) of empirical studies, with clear criteria for passing those tests. Some lessons learned, in particular concerning the need to assure the completeness of the description of the application and the role of completeness of input information, are also discussed. It is concluded that a formal process which includes extended and detailed benchmark
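For context, the best-known nonparametric tolerance-limit result in this family is the first-order Wilks formula, which fixes the sample size at which the sample maximum becomes a coverage/confidence upper bound. A generic sketch (a textbook illustration, not the benchmark code developed in the project):

```python
import math

def wilks_sample_size(coverage=0.95, confidence=0.95):
    """Smallest n such that the largest of n i.i.d. observations is a
    one-sided upper tolerance bound, i.e. 1 - coverage**n >= confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

print(wilks_sample_size(0.95, 0.95))  # the classic first-order 95/95 answer: 59
```

Raising the confidence to 99% pushes the requirement to 90 samples, which illustrates why the 'conservatism' of competing tolerance-limit methods is worth benchmarking at all.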
Zack, J. W.
2015-12-01
Predictions from Numerical Weather Prediction (NWP) models are the foundation for wind power forecasts for day-ahead and longer forecast horizons. The NWP models directly produce three-dimensional wind forecasts on their respective computational grids. These can be interpolated to the location and time of interest. However, these direct predictions typically contain significant systematic errors ("biases"). This is due to a variety of factors including the limited space-time resolution of the NWP models and shortcomings in the models' representation of physical processes. It has become common practice to attempt to improve the raw NWP forecasts by statistically adjusting them through a procedure that is widely known as Model Output Statistics (MOS). The challenge is to identify complex patterns of systematic errors and then use this knowledge to adjust the NWP predictions. The MOS-based improvements are the basis for much of the value added by commercial wind power forecast providers. There are an enormous number of statistical approaches that can be used to generate the MOS adjustments to the raw NWP forecasts. In order to obtain insight into the potential value of some of the newer and more sophisticated statistical techniques, often referred to as "machine learning methods", a MOS-method comparison experiment has been performed for wind power generation facilities in 6 wind resource areas of California. The underlying NWP models that provided the raw forecasts were the two primary operational models of the US National Weather Service: the GFS and NAM models. The focus was on 1- and 2-day ahead forecasts of the hourly wind-based generation. The statistical methods evaluated included: (1) screening multiple linear regression, which served as a baseline method, (2) artificial neural networks, (3) a decision-tree approach called random forests, (4) gradient boosted regression based upon a decision-tree algorithm, (5) support vector regression and (6) analog ensemble
Woodgate, Roberta L; Zurba, Melanie; Edwards, Marie; Ripat, Jacquie D; Rempel, Gina
2017-07-01
This paper presents research findings that advance knowledge around the power and agency of families with children with complex care needs (CCN). Our conceptual framework uses concepts from geography towards situating the experiences and social realities of family carers within the 'embodied space of care'. The data originate from a longitudinal qualitative study of Canadian families with children with CCN. Findings reveal that interactions and decision-making processes relating to health and everyday life were complex and socially interconnected, and emphasize the need for provisions for family-based decision-making, enhanced social inclusion of families, and renegotiation of power. Copyright © 2017 Elsevier Ltd. All rights reserved.
Swiss electricity statistics 2005
International Nuclear Information System (INIS)
2006-01-01
This comprehensive report made by the Swiss Federal Office of Energy (SFOE) presents the statistics on electricity production and usage in Switzerland for the year 2005. First of all, an overview of Switzerland's electricity supply in 2005 is presented. Details are noted of the proportions generated by different sources including nuclear, hydro-power, storage schemes and thermal power stations as well as energy transfer with neighbouring countries. A second chapter takes a look at the balance of imports and exports with illustrative flow diagrams along with tables for total figures from 1950 through to 2005. For the summer and winter periods, figures from 1995 to 2005 are presented. The third chapter examines the production of electricity in the various types of power stations and the developments over the years 1950 to 2005, whereby, for example, statistics on regional generation and power station type are looked at. The fourth chapter looks at electricity consumption in various sectors from 1983 to 2005 and compares the figures with international data. The fifth chapter looks at generation, consumption and loading on particular days and chapter six considers energy exchange with Switzerland's neighbours. Chapter seven takes a look at possibilities for extending generation facilities in the period up to 2012
Swiss electricity statistics 2008
International Nuclear Information System (INIS)
2009-06-01
This comprehensive report made by the Swiss Federal Office of Energy (SFOE) presents the statistics on electricity production and usage in Switzerland for the year 2008. First of all, an overview of Switzerland's electricity supply in 2008 is presented. Details are noted of the proportions generated by different sources including nuclear, hydro-power, storage schemes and thermal power stations as well as energy transfer with neighbouring countries. A second chapter takes a look at the balance of imports and exports with illustrative flow diagrams along with tables for total figures from 1950 through to 2008. For the summer and winter periods, figures from 1995 to 2008 are presented. The third chapter examines the production of electricity in the various types of power stations and the developments over the years 1950 to 2008, whereby, for example, statistics on regional generation and power station type are looked at. The fourth chapter looks at electricity consumption in various sectors from 1984 to 2008 and compares the figures with international data. The fifth chapter looks at generation, consumption and loading on particular days and chapter six considers energy exchange with Switzerland's neighbours. Chapter seven takes a look at possibilities for extending generation facilities in the period up to 2015
Swiss electricity statistics 2006
International Nuclear Information System (INIS)
2007-01-01
This comprehensive report made by the Swiss Federal Office of Energy (SFOE) presents the statistics on electricity production and usage in Switzerland for the year 2006. First of all, an overview of Switzerland's electricity supply in 2006 is presented. Details are noted of the amounts generated by different sources including nuclear, hydro-power, storage schemes and thermal power stations as well as energy transfer with neighbouring countries. A second chapter takes a look at the balance of imports and exports with illustrative flow diagrams along with tables for total figures from 1950 through to 2006. For the summer and winter periods, figures from 1995 to 2006 are presented. The third chapter examines the production of electricity in the various types of power stations and the developments over the years 1950 to 2006, whereby, for example, statistics on regional generation and power station type are looked at. The fourth chapter looks at electricity consumption in various sectors from 1983 to 2006 and compares the figures with international data. The fifth chapter looks at generation, consumption and loading on particular, selected days and chapter six considers energy exchange with Switzerland's neighbours. Chapter seven takes a look at possibilities for extending generation facilities in the period up to 2013
Dai, Mingwei; Ming, Jingsi; Cai, Mingxuan; Liu, Jin; Yang, Can; Wan, Xiang; Xu, Zongben
2017-09-15
Results from genome-wide association studies (GWAS) suggest that a complex phenotype is often affected by many variants with small effects, known as 'polygenicity'. Tens of thousands of samples are often required to ensure statistical power for identifying these variants with small effects. However, it is often the case that a research group can only get approval for access to individual-level genotype data with a limited sample size (e.g. a few hundreds or thousands). Meanwhile, summary statistics generated using single-variant-based analysis are becoming publicly available. The sample sizes associated with the summary statistics datasets are usually quite large. How to make the most efficient use of existing abundant data resources largely remains an open question. In this study, we propose a statistical approach, IGESS, to increasing statistical power of identifying risk variants and improving accuracy of risk prediction by integrating individual-level genotype data and summary statistics. An efficient algorithm based on variational inference is developed to handle the genome-wide analysis. Through comprehensive simulation studies, we demonstrated the advantages of IGESS over the methods which take either individual-level data or summary statistics data as input. We applied IGESS to perform integrative analysis of Crohn's Disease from WTCCC and summary statistics from other studies. IGESS was able to significantly increase the statistical power of identifying risk variants and improve the risk prediction accuracy from 63.2% (±0.4%) to 69.4% (±0.1%) using about 240 000 variants. The IGESS software is available at https://github.com/daviddaigithub/IGESS . zbxu@xjtu.edu.cn or xwan@comp.hkbu.edu.hk or eeyang@hkbu.edu.hk. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
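The power gain from pooling data sources can be illustrated with a far simpler device than IGESS's variational algorithm: fixed-effect inverse-variance combination of an individual-level effect estimate with a published summary statistic (a generic meta-analytic sketch, not the IGESS method; the numbers are illustrative).

```python
def combine(beta1, se1, beta2, se2):
    """Fixed-effect inverse-variance combination of two effect estimates."""
    w1, w2 = 1.0 / se1 ** 2, 1.0 / se2 ** 2
    beta = (w1 * beta1 + w2 * beta2) / (w1 + w2)
    se = (w1 + w2) ** -0.5  # combined SE is always below either input SE
    return beta, se

# Pooling two equally precise estimates shrinks the standard error by sqrt(2),
# which is where the extra statistical power comes from:
beta, se = combine(0.10, 0.05, 0.12, 0.05)
print(beta, se)
```

A smaller standard error translates directly into a larger z-statistic for the same effect size, i.e. more variants crossing the significance threshold.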
Country Nuclear Power Profiles - 2010 Edition
International Nuclear Information System (INIS)
2010-08-01
The Country Nuclear Power Profiles compiles background information on the status and development of nuclear power programs in Member States. It covers organizational and industrial aspects of nuclear power programs and provides information about the relevant legislative, regulatory, and international framework in each country. Its descriptive and statistical overview of the overall economic, energy, and electricity situation in each country, and its nuclear power framework, is intended to serve as an integrated source of key background information about nuclear power programs in the world. The preparation of Country Nuclear Power Profiles (CNPP) was initiated in the 1990s. It responded to a need for a database and a technical publication containing a description of the energy and economic situation, the energy and the electricity sector, and the primary organizations involved in nuclear power in IAEA Member States. This is the 2010 edition issued on CD-ROM and Web pages. It updates the country information for 48 countries. The CNPP is updated based on information voluntarily provided by participating IAEA Member States. Participants include the 29 countries that have operating nuclear power plants, as well as 19 countries having past or planned nuclear power programmes (Bangladesh, Belarus, Chile, Egypt, Ghana, Indonesia, the Islamic Republic of Iran, Italy, Jordan, Kazakhstan, Lithuania, Morocco, Nigeria, the Philippines, Poland, Thailand, Tunisia, Turkey and Vietnam). For the 2010 edition, 24 countries provided updated or new profiles. For the other countries, the IAEA updated the profile statistical tables on nuclear power, energy development, and economic indicators based on information from IAEA and World Bank databases. The CNPP reports have been prepared by each Member State in accordance with the IAEA format. The IAEA is not responsible for the content of these reports
International Nuclear Information System (INIS)
Gibbons, J.; Fracassi, J.
2006-01-01
In December 2005 the Ontario Power Authority (OPA) outlined its proposed blueprint for meeting Ontario's electricity needs to 2025 in the document entitled Supply Mix Advice Report. As a result of the actions taken by the current government, the OPA believes that Ontario will have adequate electricity supplies to meet the province's needs until 2013. However, it stated that Ontario will require an additional 15,000 megawatts of new generation capacity between 2013 and 2025. The OPA also recommends that a significant proportion of this new generation capacity be nuclear. The Ontario Clean Air Alliance undertook a review of the OPA report and identified several discrepancies including an over-estimation of Ontario's rate of electricity load growth from 2005 to 2025; an under-estimation of the potential for electricity productivity improvements to reduce electricity demand and raise living standards; an under-estimation of renewable energy supply potential; an under-estimation of the potential for biomass and natural gas fired combined heat and power plants to meet electricity needs and increase the competitiveness of Ontario's industries; an under-estimation of the economic costs and risks of nuclear power; and a biased recommendation for a 70 million dollar resource acquisition budget against energy efficiency investments that would reduce demand and raise living standards. This report provides the Ontario Clean Air Alliance's analysis of the OPA report and presents its own recommendations for how Ontario can increase its electricity productivity and meet its electricity supply needs until 2025. The report concluded that the Government of Ontario should direct the OPA to develop a long-term strategy to raise the price of electricity up to its full cost without raising the electricity bills of low income consumers or impairing the competitiveness of Ontario's industries. It was suggested that Ontario's electricity productivity should be increased to the same level as
Researchers’ Intuitions About Power in Psychological Research
Bakker, Marjan; Hartgerink, Chris H. J.; Wicherts, Jelte M.; van der Maas, Han L. J.
2016-01-01
Many psychology studies are statistically underpowered. In part, this may be because many researchers rely on intuition, rules of thumb, and prior practice (along with practical considerations) to determine the number of subjects to test. In Study 1, we surveyed 291 published research psychologists and found large discrepancies between their reports of their preferred amount of power and the actual power of their studies (calculated from their reported typical cell size, typical effect size, and acceptable alpha). Furthermore, in Study 2, 89% of the 214 respondents overestimated the power of specific research designs with a small expected effect size, and 95% underestimated the sample size needed to obtain .80 power for detecting a small effect. Neither researchers’ experience nor their knowledge predicted the bias in their self-reported power intuitions. Because many respondents reported that they based their sample sizes on rules of thumb or common practice in the field, we recommend that researchers conduct and report formal power analyses for their studies. PMID:27354203
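The gap the survey documents is easy to reproduce with a normal-approximation power calculation (a standard textbook approximation, not the authors' exact procedure; the cell size and effect size below are illustrative of the values the paper discusses).

```python
from statistics import NormalDist

def approx_power(d, n_per_group, alpha=0.05):
    """Normal-approximation power of a two-sample t-test for effect size d
    (Cohen's d) with n_per_group subjects in each cell."""
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)
    noncentrality = d * (n_per_group / 2) ** 0.5
    return 1 - NormalDist().cdf(z_crit - noncentrality)

# A common cell size of 24 per group with a small effect (d = 0.2)
# yields power near 0.10 -- far below the 0.80 most researchers say they want.
print(round(approx_power(0.2, 24), 2))
```

Running the same function with larger effects shows why intuitions calibrated on medium or large effects (d = 0.5 or 0.8) fail badly for small ones.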
Nuclear power- the inevitable option for future energy needs
International Nuclear Information System (INIS)
Prasad, Y.S.R.
1995-01-01
In the ensuing era, development and deployment of electrical power sources will be governed by environmental changes, energy security and economic competitiveness. In the energy-mix scenario nuclear power has the potential and will make significant contributions in the coming decades. It is certain that nuclear power will continue to play a vital role in bridging the widening gap between demand and availability of energy in the years to come. In sum and substance, with the limited energy options available to India, nuclear power must assume a greater share to meet the rapidly growing energy demands. Fortunately, the country has a sound base for achieving this goal. 14 tabs., 3 figs
International Nuclear Information System (INIS)
2002-10-01
This document summarizes in a series of tables the energy statistical data for France: consumption since 1973; energy supplies (production, imports, exports, stocks) and uses (refining, power production, internal uses, sectoral consumption) for coal, petroleum, gas, electricity, and renewable energy sources; national production and consumption of primary energy; final consumption per sector and per energy source; general indicators (energy bill, US$ exchange rate, prices, energy independence, gross domestic product); projections. Details (resources, uses, prices, imports, internal consumption) are given separately for petroleum, natural gas, electric power and solid mineral fuels. (J.S.)
Inferring Demographic History Using Two-Locus Statistics.
Ragsdale, Aaron P; Gutenkunst, Ryan N
2017-06-01
Population demographic history may be learned from contemporary genetic variation data. Methods based on aggregating the statistics of many single loci into an allele frequency spectrum (AFS) have proven powerful, but such methods ignore potentially informative patterns of linkage disequilibrium (LD) between neighboring loci. To leverage such patterns, we developed a composite-likelihood framework for inferring demographic history from aggregated statistics of pairs of loci. Using this framework, we show that two-locus statistics are more sensitive to demographic history than single-locus statistics such as the AFS. In particular, two-locus statistics escape the notorious confounding of depth and duration of a bottleneck, and they provide a means to estimate effective population size based on the recombination rather than mutation rate. We applied our approach to a Zambian population of Drosophila melanogaster. Notably, using both single- and two-locus statistics, we inferred a substantially lower ancestral effective population size than previous works and did not infer a bottleneck history. Together, our results demonstrate the broad potential for two-locus statistics to enable powerful population genetic inference. Copyright © 2017 by the Genetics Society of America.
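A minimal example of the pairwise signal such methods exploit is the r² linkage-disequilibrium statistic computed from two-locus haplotype counts (an illustrative helper, not the authors' composite-likelihood code; haplotype labels A/a and B/B denote the two alleles at each locus).

```python
def r_squared(n_AB, n_Ab, n_aB, n_ab):
    """r^2 LD statistic from counts of the four two-locus haplotypes."""
    n = n_AB + n_Ab + n_aB + n_ab
    p_A = (n_AB + n_Ab) / n        # allele frequency at locus 1
    p_B = (n_AB + n_aB) / n        # allele frequency at locus 2
    D = n_AB / n - p_A * p_B       # disequilibrium coefficient
    return D * D / (p_A * (1 - p_A) * p_B * (1 - p_B))

# Independent loci give r^2 = 0; perfectly coupled loci give r^2 = 1:
print(r_squared(25, 25, 25, 25), r_squared(50, 0, 0, 50))
```

Statistics like this decay with recombination distance, which is why pairs of loci carry information (e.g. about effective population size via the recombination rate) that the single-locus AFS cannot.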
International Nuclear Information System (INIS)
Holttinen, H.; Tammelin, B.; Hyvoenen, R.
1997-01-01
The recording, analysis and publication of statistics on wind energy production has been reorganized in cooperation between VTT Energy, the Finnish Meteorological Institute (FMI Energy) and the Finnish Wind Energy Association (STY), with support from the Ministry of Trade and Industry (KTM). VTT Energy has developed a database that contains both monthly data and information on the wind turbines, sites and operators involved. The monthly production figures, together with component failure statistics, are collected from the operators by VTT Energy, which produces the final wind energy statistics to be published in Tuulensilmae and reported to energy statistics in Finland and abroad (Statistics Finland, Eurostat, IEA). To verify the annual and monthly wind energy potential against the average wind energy climate, a production index is adopted. The index gives the expected wind energy production in various areas of Finland, calculated using real wind speed observations, air density and a power curve for a typical 500 kW wind turbine. FMI Energy has produced the average figures for four weather stations using data from 1985-1996, and produces the monthly figures. (orig.)
Power of Companies in Supply Chains and Their Effect on Network Development
Directory of Open Access Journals (Sweden)
Tamás Brányi
2015-01-01
Full Text Available A general supply chain functions as a closed cluster and consists of at least three companies: supplier, producer and buyer. In the optimal case the companies within a supply chain are well integrated and the partnership rests on trust, which results in common strategic decisions. Business practice shows that there is a stronger company within the chain that uses its power position to influence network development. The objective of the research is to measure how, and what kind of, power position is needed to influence the supply chain. The hypothesis states that power and network development are opposing effects in a supply chain. Statistical examination of data gathered from 221 companies shows that the company in the power position has advantages if the supply chain extends. SPSS analysis proves that the hypothesis is false and opens a new direction of research. Companies within the supply chain have to cope with power structures while cooperating with each other. They tend to look for solutions to ease dependency. The use or misuse of power has several factors; mainly they are inherited from the strongest link of the supply chain. This is usually a problem, but the results of the statistical analysis show that a win-win situation is still needed for the companies in order to deepen the cooperation. To conclude, the data show that the goal is to be more competitive as a chain, not just as a company.
Hoover, F. A.; Bowling, L. C.; Prokopy, L. S.
2015-12-01
Urban stormwater is an ongoing management concern in municipalities of all sizes. In both combined and separate sewer systems, pollutants from stormwater runoff enter the natural waterway system during heavy rain events. Urban flooding during frequent and more intense storms is also a growing concern. Therefore, stormwater best-management practices (BMPs) are being implemented in efforts to reduce and manage stormwater pollution and overflow. The majority of BMP water quality studies focus on the small-scale, individual effects of the BMP and the change in water quality directly from the runoff of these infrastructures. At the watershed scale, it is difficult to establish statistically whether or not these BMPs are making a difference in water quality, given that watershed-scale monitoring is often costly and time consuming, relying on significant sources of funds which a city may not have. Hence, there is a need to quantify the level of sampling needed to detect the water quality impact of BMPs at the watershed scale. In this study, a power analysis was performed on data from an urban watershed in Lafayette, Indiana, to determine the frequency of sampling required to detect a significant change in water quality measurements. Using the R platform, results indicate that detecting a significant change in watershed-level water quality would require hundreds of weekly measurements, even when improvement is present. The second part of this study investigates whether the difficulty in demonstrating water quality change represents a barrier to adoption of stormwater BMPs. Semi-structured interviews of community residents and organizations in Chicago, IL are being used to investigate residents' understanding of water quality and best management practices and to identify their attitudes and perceptions towards stormwater BMPs. Second-round interviews will examine how information on uncertainty in water quality improvements influences their BMP attitudes and perceptions.
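The order of magnitude reported (hundreds of weekly measurements) follows directly from a standard sample-size formula. A sketch under the simplifying assumption of independent, normally distributed weekly measurements before and after BMP installation (the study's actual power analysis in R may well use a different model):

```python
import math
from statistics import NormalDist

def samples_per_period(effect_size, alpha=0.05, power=0.80):
    """Samples needed in each period (before/after) to detect a mean shift
    of `effect_size` standard deviations with a two-sided z-test."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    return math.ceil(2 * ((z_a + z_b) / effect_size) ** 2)

# A modest water-quality improvement of 0.2 SD needs on the order of
# 400 samples per period -- i.e. many years of weekly monitoring:
print(samples_per_period(0.2))
```

Autocorrelation and seasonality in real stream data would push the requirement even higher, which reinforces the paper's point about the cost of watershed-scale verification.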
Statistical properties of reactor antineutrinos
Rusov, V D; Tarasov, V O; Shaaban, Y
2002-01-01
Based on the properties of the cascade statistics of reactor antineutrinos, an efficient method of searching for neutrino oscillations is proposed. The determination of the physical parameters of these statistics, i.e. the average number of fissions and the average number of antineutrinos per fission, requires no a priori knowledge of the geometry and characteristics of the detector, the reactor power, or the composition of the nuclear fuel.
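Such cascade statistics can be read as a compound Poisson model: a Poisson number of fissions, each emitting a random number of antineutrinos, with the first two moments of the total count determined by exactly the two parameters the authors estimate. A textbook sketch under that assumption (not the paper's formalism; the parameter values are illustrative):

```python
import math
import random

def cascade_moments(mu_fissions, mean_per_fission, var_per_fission):
    """First two moments of a compound Poisson cascade:
    E[S] = mu * m,  Var[S] = mu * (v + m**2)."""
    m, v = mean_per_fission, var_per_fission
    return mu_fissions * m, mu_fissions * (v + m * m)

def poisson_sample(rng, lam):
    """Knuth's multiplicative algorithm for one Poisson draw."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

# Monte Carlo check: 5 fissions on average, a fixed 6 antineutrinos each.
rng = random.Random(0)
totals = [6 * poisson_sample(rng, 5.0) for _ in range(20000)]
print(sum(totals) / len(totals), cascade_moments(5.0, 6.0, 0.0))
```

The mean-variance relation is what lets both parameters be recovered from counting data alone, without modelling the detector or the fuel composition.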
Current Research and Statistical Practices in Sport Science and a Need for Change
Directory of Open Access Journals (Sweden)
Jake R. Bernards
2017-11-01
Full Text Available Current research ideologies in sport science allow for the possibility of investigators producing statistically significant results to help fit the outcome into a predetermined theory. Additionally, under the current Neyman-Pearson statistical structure, some argue that null hypothesis significance testing (NHST) under the frequentist approach is flawed regardless. For example, a p-value is unable to measure the probability that the studied hypothesis is true, unable to measure the size of an effect or the importance of a result, and unable to provide a good measure of evidence regarding a model or hypothesis. Many of these downfalls are key questions researchers strive to answer following an investigation. Therefore, a shift towards a magnitude-based inference model, and eventually a fully Bayesian framework, is thought to be a better fit from a statistical standpoint and may be an improved way to address biases within the literature. The goal of this article is to shed light on the current research and statistical shortcomings the field of sport science faces today, and to offer potential solutions to help guide future research practices.
Well-defined power policy needed to augment power capacity
International Nuclear Information System (INIS)
Gupta, K.
1997-01-01
This paper outlines the importance of energy policy and energy development for both energy production and energy needs. A summary of key points relating to energy accounting, energy consumption, energy resources, public utilities and government plans is presented
Power Estimation for Gene-Longevity Association Analysis Using Concordant Twins
DEFF Research Database (Denmark)
Tan, Qihua; Zhao, Jing Hua; Kruse, Torben A
2014-01-01
Statistical power is one of the major concerns in genetic association studies. Related individuals such as twins are valuable samples for genetic studies because of their genetic relatedness. Phenotype similarity in twin pairs provides evidence of genetic control over the phenotype variation...... in a population. The genetic association study of human longevity, a complex trait that is under the control of both genetic and environmental factors, has been confronted by the small sample sizes of longevity subjects, which limit statistical power. Twin pairs concordant for longevity have an increased probability...... approximately a 2- to 3-fold increase in the sample sizes needed for a longevity cutoff at age 90 as compared with that at age 95. Overall, our results showed the high value of twins in genetic association studies on human longevity....
Liao, Tim Futing
2011-01-01
An incomparably useful examination of statistical methods for comparison. The nature of doing science, be it natural or social, inevitably calls for comparison. Statistical methods are at the heart of such comparison, for they not only help us gain understanding of the world around us but often define how our research is to be carried out. The need to compare between groups is best exemplified by experiments, which have clearly defined statistical methods. However, true experiments are not always possible. What complicates the matter more is a great deal of diversity in factors that are not inde
Excel 2013 for business statistics a guide to solving practical business problems
Quirk, Thomas J
2015-01-01
This is the first book to show the capabilities of Microsoft Excel to teach business statistics effectively. It is a step-by-step, exercise-driven guide for students and practitioners who need to master Excel to solve practical business problems. If understanding statistics isn't your strongest suit, you are not especially mathematically inclined, or you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in business courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2013 for Business Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply Excel to the statistical techniques necessary in their courses and work.
Kuhlmann, Ellen; Burau, Viola
2018-01-01
There is now widespread agreement on the benefits of an integrated, people-centred health workforce, but the implementation of new models is difficult. We argue that we need to think about stakeholders and power if we want to ensure change in the health workforce. We discuss these issues from a governance perspective and suggest a critical approach to stakeholder involvement as an indicator of good governance. Three models of involving stakeholders in health workforce governance can be identified: corporatist professional involvement, either in a continental European model of conservative corporatism or in a Nordic model of public corporatism; managerialist and market-centred involvement of professions as organizational agents; and a more inclusive, network-based involvement of plural professional experts at different levels of governance. The power relations embedded in these models of stakeholder involvement have different effects on capacity building for an integrated health workforce.
International Nuclear Information System (INIS)
Park, Dongwon; Park, Keunteag
2013-01-01
Many countries have been actively pursuing research in this area because of these advantages, but in fact there are no official research activities in Korea. This paper analyzes the current status of research in several countries and describes the possibilities for, and the necessity of, introducing such research in Korea. Radiation, radioactive waste, decay heat, and loss-of-coolant accidents (LOCAs), all related to safety, repeatedly emerge as problems for the social acceptance of nuclear power plants. In uranium fuel the proportion of U-235 is only 3-5%, and the remaining U-238 is ultimately discarded as nuclear waste under the 'open cycle'. Even if thorium reactors are put to practical use, nuclear waste is still generated, but the toxicity of TRU nuclides can be reduced significantly; moreover, the decay heat that was the physical cause of the Fukushima nuclear power plant accident does not occur, and there is no need to worry about a LOCA. Korean nuclear power plants have operated very safely compared with those in Japan. In the five years before the Fukushima accident, the average number of unexpected shutdowns was 0.5 per unit for Korean plants, against 0.4 for the 17 units operated by Tokyo Electric Power Co. in Japan; the figures are similar, and similar also to those of Kansai Electric Power Co. However, the total number of reported non-severe incidents in Japan is much higher than in Korea, although this cannot be interpreted as meaning that Korean operating skills are excellent compared with other countries. To address public anxiety about operating nuclear power plants after the Fukushima accident, nuclear operators have emphasized the safety of Korean plants by referring to the differences between PWRs and BWRs, and many facilities have been upgraded to address the problems revealed at the Fukushima plant. Nevertheless, there appears to be a need for long-term research and development of next-generation nuclear power plants whose safety…
Ganju, Jitendra; Yu, Xinxin; Ma, Guoguang Julie
2013-01-01
Formal inference in randomized clinical trials is based on controlling the type I error rate associated with a single pre-specified statistic. The deficiency of using just one method of analysis is that it depends on assumptions that may not be met. For robust inference, we propose pre-specifying multiple test statistics and relying on the minimum p-value for testing the null hypothesis of no treatment effect. The null hypothesis associated with the various test statistics is that the treatment groups are indistinguishable. The critical value for hypothesis testing comes from permutation distributions. Rejecting the null hypothesis when the smallest p-value is less than the critical value controls the type I error rate at its designated value. Even if one of the candidate test statistics has low power, the adverse effect on the power of the minimum p-value statistic is small. Its use is illustrated with examples. We conclude that it is better to rely on the minimum p-value than on a single statistic, particularly when that single statistic is the logrank test, because of the cost and complexity of many survival trials. Copyright © 2013 John Wiley & Sons, Ltd.
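The minimum p-value approach described above can be sketched as follows. This is an illustrative implementation, not the authors' code: the two candidate statistics (a two-sample t-test and a Wilcoxon rank-sum test), the sample sizes, and the permutation count are assumptions chosen for the example.

```python
import numpy as np
from scipy import stats

def min_p_permutation_test(x, y, n_perm=2000, seed=0):
    """Permutation test based on the minimum p-value over two
    candidate statistics (t-test and Wilcoxon rank-sum)."""
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([x, y])
    n = len(x)

    def min_p(a, b):
        p_t = stats.ttest_ind(a, b).pvalue
        p_w = stats.ranksums(a, b).pvalue
        return min(p_t, p_w)

    observed = min_p(x, y)
    # Null distribution of the minimum p-value under random relabelling
    null = np.empty(n_perm)
    for i in range(n_perm):
        perm = rng.permutation(pooled)
        null[i] = min_p(perm[:n], perm[n:])
    # Small minimum p-values are evidence against the null; the permutation
    # distribution supplies the correct critical value despite taking a minimum
    adjusted_p = (np.sum(null <= observed) + 1) / (n_perm + 1)
    return observed, adjusted_p
```

Because the critical value comes from the permutation distribution of the minimum itself, the type I error rate stays at its nominal level even though two tests are performed.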
Infrastructure needs and organizational aspect of nuclear power programme
International Nuclear Information System (INIS)
Villanueva, M.S.
1996-01-01
I. Introduction. II. Infrastructure development for nuclear power program: a) pre-requisites and requirements for a nuclear power program; b) long-term national policy for nuclear power (long-term policy reason; national commitment); c) manpower development (role of academic institutions; practical manpower training); d) laws and regulations (regulatory framework; main national laws and regulations); e) nuclear research and development implementation (research in the university; long-term nuclear R and D program; research reactors); f) functions of government organizations (Atomic Energy Commission (PNRI); Department of Science and Technology; Department of Energy; Department of Education and Culture); g) industrial infrastructure; h) technology transfer (recipient's preparedness); i) safeguards obligations; j) public acceptance activities. III. Stages of nuclear power development (stage 1: planning; stage 2: detailed study and procurement; stage 3: construction; stage 4: operation). IV. Conclusion/Recommendation. (author)
A goodness of fit statistic for the geometric distribution
J.A. Ferreira
2003-01-01
We propose a goodness-of-fit statistic for the geometric distribution and compare it, in terms of power, via simulation with the chi-square statistic. The statistic is based on the Lau-Rao theorem and can be seen as a discrete analogue of the total time on test statistic. The results…
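The Lau-Rao-based statistic itself is not reproduced in this record, but the chi-square benchmark it is compared against can be sketched. The following is a hypothetical baseline, assuming a geometric distribution on {0, 1, 2, ...} with its success probability estimated from the data and the upper tail lumped into one cell; the cell count `k_max` is an arbitrary choice for the example.

```python
import numpy as np
from scipy import stats

def chi2_gof_geometric(x, k_max=8):
    """Chi-square goodness-of-fit test for a geometric distribution
    on {0, 1, 2, ...} with success probability estimated from the data."""
    x = np.asarray(x)
    p_hat = 1.0 / (1.0 + x.mean())          # MLE for the geometric on {0,1,...}
    # Observed counts for values 0..k_max-1, with the tail lumped into one cell
    obs = np.array([np.sum(x == j) for j in range(k_max)] + [np.sum(x >= k_max)])
    # Cell probabilities under the fitted geometric; they sum to exactly 1
    probs = np.array([p_hat * (1 - p_hat) ** j for j in range(k_max)]
                     + [(1 - p_hat) ** k_max])
    exp = len(x) * probs
    # One extra degree of freedom is lost for the estimated parameter
    stat, p = stats.chisquare(obs, exp, ddof=1)
    return stat, p
```

A power simulation in the spirit of the paper would apply this test (and the competing statistic) to repeated samples from non-geometric alternatives and count rejections.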
Primordial statistical anisotropy generated at the end of inflation
International Nuclear Information System (INIS)
Yokoyama, Shuichiro; Soda, Jiro
2008-01-01
We present a new mechanism for generating primordial statistical anisotropy of curvature perturbations. We introduce a vector field which has a non-minimal kinetic term and couples with a waterfall field in a hybrid inflation model. In such a system, the vector field gives fluctuations of the end of inflation and hence induces a subcomponent of curvature perturbations. Since the vector has a preferred direction, the statistical anisotropy could appear in the fluctuations. We present the explicit formula for the statistical anisotropy in the primordial power spectrum and the bispectrum of curvature perturbations. Interestingly, there is the possibility that the statistical anisotropy does not appear in the power spectrum but does appear in the bispectrum. We also find that the statistical anisotropy provides the shape dependence to the bispectrum.
Primordial statistical anisotropy generated at the end of inflation
Energy Technology Data Exchange (ETDEWEB)
Yokoyama, Shuichiro [Department of Physics and Astrophysics, Nagoya University, Aichi 464-8602 (Japan); Soda, Jiro, E-mail: shu@a.phys.nagoya-u.ac.jp, E-mail: jiro@tap.scphys.kyoto-u.ac.jp [Department of Physics, Kyoto University, Kyoto 606-8501 (Japan)
2008-08-15
We present a new mechanism for generating primordial statistical anisotropy of curvature perturbations. We introduce a vector field which has a non-minimal kinetic term and couples with a waterfall field in a hybrid inflation model. In such a system, the vector field gives fluctuations of the end of inflation and hence induces a subcomponent of curvature perturbations. Since the vector has a preferred direction, the statistical anisotropy could appear in the fluctuations. We present the explicit formula for the statistical anisotropy in the primordial power spectrum and the bispectrum of curvature perturbations. Interestingly, there is the possibility that the statistical anisotropy does not appear in the power spectrum but does appear in the bispectrum. We also find that the statistical anisotropy provides the shape dependence to the bispectrum.
Boslaugh, Sarah
2008-01-01
Need to learn statistics as part of your job, or want some help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference that's perfect for anyone with no previous background in the subject. This book gives you a solid understanding of statistics without being too simple, yet without the numbing complexity of most college texts. You get a firm grasp of the fundamentals and a hands-on understanding of how to apply them before moving on to the more advanced material that follows. Each chapter presents you with easy-to-follow descriptions illustrat
Maximizing Statistical Power When Verifying Probabilistic Forecasts of Hydrometeorological Events
DeChant, C. M.; Moradkhani, H.
2014-12-01
Hydrometeorological events (i.e. floods, droughts, precipitation) are increasingly being forecasted probabilistically, owing to the uncertainties in the underlying causes of the phenomenon. In these forecasts, the probability of the event, over some lead time, is estimated based on some model simulations or predictive indicators. By issuing probabilistic forecasts, agencies may communicate the uncertainty in the event occurring. Assuming that the assigned probability of the event is correct, which is referred to as a reliable forecast, the end user may perform some risk management based on the potential damages resulting from the event. Alternatively, an unreliable forecast may give false impressions of the actual risk, leading to improper decision making when protecting resources from extreme events. Due to this requisite for reliable forecasts to perform effective risk management, this study takes a renewed look at reliability assessment in event forecasts. Illustrative experiments will be presented, showing deficiencies in the commonly available approaches (Brier Score, Reliability Diagram). Overall, it is shown that the conventional reliability assessment techniques do not maximize the ability to distinguish between a reliable and unreliable forecast. In this regard, a theoretical formulation of the probabilistic event forecast verification framework will be presented. From this analysis, hypothesis testing with the Poisson-Binomial distribution is the most exact model available for the verification framework, and therefore maximizes one's ability to distinguish between a reliable and unreliable forecast. Application of this verification system was also examined within a real forecasting case study, highlighting the additional statistical power provided with the use of the Poisson-Binomial distribution.
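The verification idea can be illustrated with a small sketch. Under the hypothesis of reliability, the number of observed events among n probabilistic forecasts follows a Poisson-Binomial distribution with the issued probabilities as parameters. The implementation below (the exact-PMF convolution and the two-sided p-value convention) is a choice made here for illustration, not taken from the abstract.

```python
import numpy as np

def poisson_binomial_pmf(probs):
    """Exact PMF of the number of successes among independent Bernoulli
    trials with (possibly different) probabilities, via convolution."""
    pmf = np.array([1.0])
    for p in probs:
        pmf = np.convolve(pmf, [1 - p, p])
    return pmf

def reliability_p_value(probs, n_events):
    """Two-sided p-value for observing `n_events` occurrences given the
    issued forecast probabilities, under the hypothesis of reliability."""
    pmf = poisson_binomial_pmf(probs)
    # Sum the probability of all outcomes no more likely than the observed one
    return float(np.sum(pmf[pmf <= pmf[n_events] + 1e-12]))
```

A small p-value means the observed event count is implausible under the issued probabilities, i.e. the forecast system is likely unreliable.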
Cornillon, Pierre-Andre; Husson, Francois; Jegou, Nicolas; Josse, Julie; Kloareg, Maela; Matzner-Lober, Eric; Rouviere, Laurent
2012-01-01
An Overview of R: Main Concepts; Installing R; Work Session; Help; R Objects; Functions; Packages; Exercises. Preparing Data: Reading Data from File; Exporting Results; Manipulating Variables; Manipulating Individuals; Concatenating Data Tables; Cross-Tabulation; Exercises. R Graphics: Conventional Graphical Functions; Graphical Functions with lattice; Exercises. Making Programs with R: Control Flows; Predefined Functions; Creating a Function; Exercises. Statistical Methods: Introduction to the Statistical Methods. A Quick Start with R: Installing R; Opening and Closing R; The Command Prompt; Attribution, Objects, and Function; Selection; Other Rcmdr Package; Importing (or Inputting) Data; Graphs. Statistical Analysis: Hypothesis Test; Confidence Intervals for a Mean; Chi-Square Test of Independence; Comparison of Two Means; Testing Conformity of a Proportion; Comparing Several Proportions; The Power of a Test. Regression: Simple Linear Regression; Multiple Linear Regression; Partial Least Squares (PLS) Regression. Analysis of Variance and Covariance: One-Way Analysis of Variance; Multi-Way Analysis of Varian…
Levy, R.; Mcginness, H.
1976-01-01
Investigations were performed to predict the power available from the wind at the Goldstone, California, antenna site complex. The background for power prediction was derived from a statistical evaluation of available wind speed data records at this location and at nearby locations similarly situated within the Mojave desert. In addition to a model for power prediction over relatively long periods of time, an interim simulation model that produces sample wind speeds is described. The interim model furnishes uncorrelated sample speeds at hourly intervals that reproduce the statistical wind distribution at Goldstone. A stochastic simulation model to provide speed samples representative of both the statistical speed distributions and correlations is also discussed.
Time-Dependent Statistical Analysis of Wide-Area Time-Synchronized Data
Directory of Open Access Journals (Sweden)
A. R. Messina
2010-01-01
Characterization of spatial and temporal changes in the dynamic patterns of a nonstationary process is a problem of great theoretical and practical importance. On-line monitoring of large-scale power systems by means of time-synchronized Phasor Measurement Units (PMUs) provides the opportunity to analyze and characterize inter-system oscillations. Wide-area measurement sets, however, are often relatively large, and may contain phenomena with differing temporal scales. Extracting the relevant dynamics from these measurements is a difficult problem. As the number of observations of real events continues to increase, statistical techniques are needed to help identify relevant temporal dynamics from noise or random effects in measured data. In this paper, a statistically based, data-driven framework that integrates the use of wavelet-based EOF analysis and a sliding-window-based method is proposed to identify and extract, in near-real-time, dynamically independent spatiotemporal patterns from time-synchronized data. The method deals with the information in space and time simultaneously, and allows direct tracking and characterization of the nonstationary time-frequency dynamics of oscillatory processes. The efficiency and accuracy of the developed procedures for extracting localized information on power system behavior from time-synchronized phasor measurements of a real event in Mexico are assessed.
Statistical Tutorial | Center for Cancer Research
Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data. ST is designed as a follow-up to Statistical Analysis of Research Data (SARD), held in April 2018. The tutorial will apply the general principles of statistical analysis of research data, including descriptive statistics, z- and t-tests of means, and mean…
Fundamentals of modern statistical methods substantially improving power and accuracy
Wilcox, Rand R
2001-01-01
Conventional statistical methods have a very serious flaw: they routinely miss differences among groups or associations among variables that are detected by more modern techniques, even under very small departures from normality. Hundreds of journal articles have described the reasons standard techniques can be unsatisfactory, but simple, intuitive explanations are generally unavailable. Improved methods have been derived, but they are far from obvious or intuitive given the training most researchers receive. Situations arise where even highly nonsignificant results become significant when analyzed with more modern methods. Without assuming any prior training in statistics, Part I of this book describes basic statistical principles from a point of view that makes their shortcomings intuitive and easy to understand. The emphasis is on verbal and graphical descriptions of concepts. Part II describes modern methods that address the problems covered in Part I. Using data from actual studies, many examples are included…
International Nuclear Information System (INIS)
Kaw, P.K.
1993-01-01
It is pointed out that the fusion community worldwide has not aggressively pursued a faster pace of development, which can indeed be justified on the basis of its technical accomplishments, because of certain faulty assumptions. Using relevant data on energy consumption (based on fossil fuels) and its environmental impact in projections for developing countries such as India and China, it is demonstrated that there is extreme urgency (a time-scale of less than 20-25 years) to develop technologies like fusion if stagnation of per capita energy production (and quality of life) in these countries is to be prevented. We conclude by calling for a new aggressive goal for the worldwide fusion programme, namely the development of a demonstration power plant producing electricity in an environmentally acceptable manner by the year 2015. (author). 6 refs., 5 tabs., 2 figs
Common pitfalls in statistical analysis: "No evidence of effect" versus "evidence of no effect"
Directory of Open Access Journals (Sweden)
Priya Ranganathan
2015-01-01
This article is the first in a series exploring common pitfalls in statistical analysis in biomedical research. The power of a clinical trial is its ability to detect a difference between treatments where such a difference exists. At the end of a study, the lack of a difference between treatments does not mean that the treatments can be considered equivalent. The distinction between "no evidence of effect" and "evidence of no effect" needs to be understood.
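A minimal simulation illustrates the pitfall. All numbers here (a true effect of 5 units, SD 15, 12 patients per arm) are invented for the example; with this design the power is low (roughly 10-15%), so a nonsignificant result is likely even though a real effect exists, and the wide confidence interval still contains clinically meaningful differences.

```python
import numpy as np
from scipy import stats

# A true treatment effect of 5 units exists, but the trial is underpowered.
rng = np.random.default_rng(42)
control = rng.normal(100, 15, 12)
treated = rng.normal(105, 15, 12)

t, p = stats.ttest_ind(treated, control)
diff = treated.mean() - control.mean()
se = np.sqrt(treated.var(ddof=1) / 12 + control.var(ddof=1) / 12)
ci = (diff - 1.96 * se, diff + 1.96 * se)

# With 12 per arm the power to detect this effect is low, so a
# nonsignificant p-value is the most likely outcome despite the real
# effect. That would be "no evidence of effect": the confidence interval
# is wide and still compatible with meaningful differences, so it is
# NOT "evidence of no effect".
print(p, ci)
```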
A comparison of statistical methods for identifying out-of-date systematic reviews.
Directory of Open Access Journals (Sweden)
Porjai Pattanittum
BACKGROUND: Systematic reviews (SRs) can provide accurate and reliable evidence, typically about the effectiveness of health interventions. Evidence is dynamic, and if SRs are out-of-date this information may not be useful; it may even be harmful. This study aimed to compare five statistical methods to identify out-of-date SRs. METHODS: A retrospective cohort of SRs registered in the Cochrane Pregnancy and Childbirth Group (CPCG), published between 2008 and 2010, were considered for inclusion. For each eligible CPCG review, data were extracted and "3-years previous" meta-analyses were assessed for the need to update, given the data from the most recent 3 years. Each of the five statistical methods was used, with random effects analyses throughout the study. RESULTS: Eighty reviews were included in this study; most were in the area of induction of labour. The numbers of reviews identified as being out-of-date using the Ottawa, recursive cumulative meta-analysis (CMA), and Barrowman methods were 34, 7, and 7 respectively. No reviews were identified as being out-of-date using the simulation-based power method, or the CMA for sufficiency and stability method. The overall agreement among the three discriminating statistical methods was slight (Kappa = 0.14; 95% CI 0.05 to 0.23). The recursive cumulative meta-analysis, Ottawa, and Barrowman methods were practical according to the study criteria. CONCLUSION: Our study shows that three practical statistical methods could be applied to examine the need to update SRs.
Statistical analysis of nuclear power plant pump failure rate variability: some preliminary results
International Nuclear Information System (INIS)
Martz, H.F.; Whiteman, D.E.
1984-02-01
In-Plant Reliability Data System (IPRDS) pump failure data on over 60 selected pumps in four nuclear power plants are statistically analyzed using the Failure Rate Analysis Code (FRAC). A major purpose of the analysis is to determine which environmental, system, and operating factors adequately explain the variability in the failure data. Catastrophic, degraded, and incipient failure severity categories are considered for both demand-related and time-dependent failures. For catastrophic demand-related pump failures, the variability is explained by the following factors listed in their order of importance: system application, pump driver, operating mode, reactor type, pump type, and unidentified plant-specific influences. Quantitative failure rate adjustments are provided for the effects of these factors. In the case of catastrophic time-dependent pump failures, the failure rate variability is explained by three factors: reactor type, pump driver, and unidentified plant-specific influences. Finally, point and confidence interval failure rate estimates are provided for each selected pump by considering the influential factors. Both types of estimates represent an improvement over the estimates computed exclusively from the data on each pump.
Constrained statistical inference: sample-size tables for ANOVA and regression
Directory of Open Access Journals (Sweden)
Leonard eVanbrabant
2015-01-01
Researchers in the social and behavioral sciences often have clear expectations about the order/direction of the parameters in their statistical model. For example, a researcher might expect that regression coefficient beta1 is larger than beta2 and beta3. The corresponding hypothesis is H: beta1 > {beta2, beta3}, and this is known as an order-constrained hypothesis. A major advantage of testing such a hypothesis is that power can be gained and hence a smaller sample size is needed. This article discusses this gain in sample-size reduction when an increasing number of constraints is included in the hypothesis. The main goal is to present sample-size tables for constrained hypotheses. A sample-size table contains the necessary sample size at a prespecified power (say, 0.80) for an increasing number of constraints. To obtain sample-size tables, two Monte Carlo simulations were performed, one for ANOVA and one for multiple regression. Three results are salient. First, in an ANOVA the needed sample size decreases by 30% to 50% when complete ordering of the parameters is taken into account. Second, small deviations from the imposed order have only a minor impact on the power. Third, at the maximum number of constraints, the linear regression results are comparable with the ANOVA results. However, in the case of fewer constraints, ordering the parameters (e.g., beta1 > beta2) results in a higher power than assigning a positive or a negative sign to the parameters (e.g., beta1 > 0).
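The power gain from directional constraints can be illustrated in the simplest case, two groups with an assumed ordering, where the constrained test reduces to a one-sided t-test. This is a deliberate simplification of the article's ANOVA/regression setting, with invented parameters:

```python
import numpy as np
from scipy import stats

def power(n, delta, one_sided, n_sim=2000, alpha=0.05, seed=0):
    """Monte Carlo power of a two-sample t-test with `n` per group and
    true standardized difference `delta`, with or without the order
    constraint mu1 > mu2 (i.e., a one-sided alternative)."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sim):
        x = rng.normal(delta, 1, n)   # group with the larger mean
        y = rng.normal(0, 1, n)
        res = stats.ttest_ind(
            x, y, alternative="greater" if one_sided else "two-sided")
        hits += res.pvalue < alpha
    return hits / n_sim
```

For 25 subjects per group and a standardized difference of 0.5, the constrained (one-sided) test has power of roughly 0.54 versus roughly 0.41 for the unconstrained two-sided test; this is the same mechanism that lets order-constrained hypotheses get by with smaller samples.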
HOW TO SELECT APPROPRIATE STATISTICAL TEST IN SCIENTIFIC ARTICLES
Directory of Open Access Journals (Sweden)
Vladimir TRAJKOVSKI
2016-09-01
Statistics is the mathematical science dealing with the collection, analysis, interpretation, and presentation of masses of numerical data in order to draw relevant conclusions. Statistics is a form of mathematical analysis that uses quantified models, representations, and synopses for a given set of experimental data or real-life studies. Students and young researchers in biomedical sciences and in special education and rehabilitation often declare that they chose their study program because they lack knowledge of, or interest in, mathematics. This is a sad statement, but there is much truth in it. The aim of this editorial is to help young researchers select the statistical techniques and statistical software appropriate for the purposes and conditions of a particular analysis. The most important statistical tests are reviewed in the article. Knowing how to choose the right statistical test is an important asset and decision in research data processing and in the writing of scientific papers. Young researchers and authors should know how to choose and how to use statistical methods. The competent researcher will need knowledge of statistical procedures. That might include an introductory statistics course, and it most certainly includes using a good statistics textbook. For this purpose, Statistics should be restored as a mandatory subject in the curriculum of the Institute of Special Education and Rehabilitation at the Faculty of Philosophy in Skopje. Young researchers need additional courses in statistics. They need to train themselves to use statistical software in an appropriate way.
Energy Technology Data Exchange (ETDEWEB)
Diniz, Jose H.; Leao, Sergio L.C.; Paim, Oswaldo; Martins, Jose A.; Costa, Jonas A. da [Companhia Energetica de Minas Gerais (CEMIG), Belo Horizonte, MG (Brazil)
1991-12-31
This paper describes a computerized system that adopts a statistical methodology based on simultaneous measurements of wind velocity and ambient temperature, drawing on experience with environmental monitoring systems on power transmission lines. 9 figs., 16 refs.
Mende, Denis; Böttger, Diana; Löwer, Lothar; Becker, Holger; Akbulut, Alev; Stock, Sebastian
2018-02-01
The European power grid infrastructure faces various challenges due to the expansion of renewable energy sources (RES). To investigate interactions between power generation and the power grid, models of both the power market and the power grid are necessary. This paper describes the basic functionalities and working principles of both types of models, as well as the steps to couple power market results with the power grid model. The combination of these models is beneficial in terms of gaining realistic power flow scenarios in the grid model and of being able to pass results of the power flow and its restrictions back to the market model. Focus is placed on the power grid model and possible application examples such as algorithms for grid analysis, operation, and dynamic equipment modelling.
SP-100 - The national space reactor power system program in response to future needs
Armijo, J. S.; Josloff, A. T.; Bailey, H. S.; Matteo, D. N.
The SP-100 system has been designed to meet comprehensive and demanding NASA/DOD/DOE requirements. The key requirements include: nuclear safety for all mission phases, scalability from 10's to 100's of kWe, reliable performance at full power for seven years or partial power for ten years, survivability in civil or military threat environments, capability to operate autonomously for up to six months, capability to protect payloads from excessive radiation, and compatibility with shuttle and expendable launch vehicles. The authors address major progress in terms of design, flexibility/scalability, survivability, and development. These areas, with the exception of survivability, are discussed in detail. There has been significant improvement in the generic flight system design, with substantial mass savings and simplification that enhance performance and reliability. Design activity has confirmed the scalability and flexibility of the system and its ability to efficiently meet NASA, AF, and SDIO needs. SP-100 development continues to make significant progress in all key technology areas.
A robust statistical method for association-based eQTL analysis.
Directory of Open Access Journals (Sweden)
Ning Jiang
It has been well established that the theoretical kernel for the recently surging genome-wide association study (GWAS) is statistical inference of linkage disequilibrium (LD) between a tested genetic marker and a putative locus affecting a disease trait. However, LD analysis is vulnerable to several confounding factors, of which population stratification is the most prominent. Whilst many methods have been proposed to correct for this influence, either by predicting the structure parameters or by correcting inflation in the test statistic due to the stratification, these may not be feasible or may impose further statistical problems in practical implementation. We propose here a novel statistical method to control spurious LD in GWAS arising from population structure by incorporating a control marker into testing for significance of the genetic association of a polymorphic marker with phenotypic variation of a complex trait. The method avoids the need for structure prediction, which may be infeasible or inadequate in practice, and accounts properly for a varying effect of population stratification on different regions of the genome under study. The utility and statistical properties of the new method were tested through an intensive computer simulation study and an association-based genome-wide mapping of expression quantitative trait loci in genetically divergent human populations. The analyses show that the new method confers improved statistical power for detecting genuine genetic association in subpopulations and effective control of spurious associations stemming from population structure when compared with two other popularly implemented methods in the GWAS literature.
Statistical analyses to support guidelines for marine avian sampling. Final report
Kinlan, Brian P.; Zipkin, Elise; O'Connell, Allan F.; Caldow, Chris
2012-01-01
Interest in development of offshore renewable energy facilities has led to a need for high-quality, statistically robust information on marine wildlife distributions. A practical approach is described to estimate the amount of sampling effort required to have sufficient statistical power to identify species-specific “hotspots” and “coldspots” of marine bird abundance and occurrence in an offshore environment divided into discrete spatial units (e.g., lease blocks), where “hotspots” and “coldspots” are defined relative to a reference (e.g., regional) mean abundance and/or occurrence probability for each species of interest. For example, a location with average abundance or occurrence that is three times larger than the mean (3x effect size) could be defined as a “hotspot,” and a location that is three times smaller than the mean (1/3x effect size) as a “coldspot.” The choice of the effect size used to define hot and coldspots will generally depend on a combination of ecological and regulatory considerations. A method is also developed for testing the statistical significance of possible hotspots and coldspots. Both methods are illustrated with historical seabird survey data from the USGS Avian Compendium Database. Our approach consists of five main components: 1. A review of the primary scientific literature on statistical modeling of animal group size and avian count data to develop a candidate set of statistical distributions that have been used or may be useful to model seabird counts. 2. Statistical power curves for one-sample, one-tailed Monte Carlo significance tests of differences of observed small-sample means from a specified reference distribution. These curves show the power to detect "hotspots" or "coldspots" of occurrence and abundance at a range of effect sizes, given assumptions which we discuss. 3. A model selection procedure, based on maximum likelihood fits of models in the candidate set, to determine an appropriate statistical…
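Component 2, the Monte Carlo power estimate for detecting a hotspot, can be sketched as follows. The negative binomial count model, the dispersion value, and the regional reference mean used here are assumptions made for illustration, not values from the report:

```python
import numpy as np

def hotspot_power(n_surveys, ref_mean, effect=3.0, disp=1.0,
                  alpha=0.05, n_sim=2000, seed=0):
    """Power of a one-sample, one-tailed Monte Carlo test that a site's
    mean count exceeds the regional reference mean, assuming negative
    binomial counts with dispersion `disp`."""
    rng = np.random.default_rng(seed)

    def nb(mean, size):
        # Negative binomial parameterized by its mean and dispersion
        p = disp / (disp + mean)
        return rng.negative_binomial(disp, p, size)

    # Null distribution of the sample mean under the reference mean,
    # and the one-tailed critical value at level alpha
    null_means = nb(ref_mean, (20000, n_surveys)).mean(axis=1)
    crit = np.quantile(null_means, 1 - alpha)
    # Proportion of simulated hotspot samples (mean = effect * ref_mean)
    # whose sample mean exceeds the critical value
    hot_means = nb(effect * ref_mean, (n_sim, n_surveys)).mean(axis=1)
    return float(np.mean(hot_means > crit))
```

Sweeping `n_surveys` produces the power curve: the number of surveys per lease block at which the curve crosses a target power (say 0.80) is the required sampling effort for that effect size.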
Statistically-Efficient Filtering in Impulsive Environments: Weighted Myriad Filters
Directory of Open Access Journals (Sweden)
Juan G. Gonzalez
2002-01-01
Linear filtering theory has been largely motivated by the characteristics of Gaussian signals. In the same manner, the proposed Myriad Filtering methods are motivated by the need for a flexible filter class with high statistical efficiency in the non-Gaussian impulsive environments that can appear in practice. Myriad filters have a solid theoretical basis, are inherently more powerful than median filters, and are very general, subsuming traditional linear FIR filters. The foundation of the proposed filtering algorithms lies in the definition of the myriad as a tunable estimator of location derived from the theory of robust statistics. We prove several fundamental properties of this estimator and show its optimality in practical impulsive models such as the α-stable and generalized-t. We then extend the myriad estimation framework to allow the use of weights. In the same way as linear FIR filters become a powerful generalization of the mean filter, filters based on running myriads reach all of their potential when a weighting scheme is utilized. We derive the "normal" equations for the optimal myriad filter, and introduce a suboptimal methodology for filter tuning and design. The strong potential of myriad filtering and estimation in impulsive environments is illustrated with several examples.
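The weighted sample myriad named above minimizes sum_i log(K^2 + w_i (x_i - beta)^2) over the location beta, with K acting as the tunable "linearity" parameter (large K approaches the weighted mean; small K gives strong outlier rejection). A minimal sketch follows; the grid-search minimization is a simplification chosen here for clarity, whereas practical implementations use faster fixed-point solvers:

```python
import numpy as np

def weighted_myriad(x, w=None, k=1.0, grid=2000):
    """Weighted sample myriad: argmin_beta sum_i log(k^2 + w_i*(x_i - beta)^2),
    computed by a simple grid search over the sample range."""
    x = np.asarray(x, float)
    w = np.ones_like(x) if w is None else np.asarray(w, float)
    betas = np.linspace(x.min(), x.max(), grid)
    # Myriad cost function evaluated on the grid of candidate locations
    cost = np.array([np.sum(np.log(k**2 + w * (x - b) ** 2)) for b in betas])
    return betas[np.argmin(cost)]
```

On impulsive data such as `[4.9, 5.1, 5.0, 4.8, 5.2, 100.0]`, the myriad stays near 5 while the sample mean is dragged toward the outlier, which is the robustness property the abstract refers to.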
Ion Stopping Powers and Ranges Whenever You Need Them
DEFF Research Database (Denmark)
Bassler, Niels; Christensen, Casper; Tørresø, Jesper Rosholm
A new app "Electronic Stopping Power" for Android mobile phones and tablets looks up stopping powers using the ICRU 49 (protons and alphas) and the revised ICRU 73 (lithium and heavier ions) tables. In addition, MSTAR and an implementation of the Bethe equation expanded to low energies...
Environmental accounting and statistics
International Nuclear Information System (INIS)
Bartelmus, P.L.P.
1992-01-01
The objective of sustainable development is to integrate environmental concerns with mainstream socio-economic policies. Integrated policies need to be supported by integrated data. Environmental accounting achieves this integration by incorporating environmental costs and benefits into conventional national accounts. Modified accounting aggregates can thus be used in defining and measuring environmentally sound and sustainable economic growth. Further development objectives need to be assessed by more comprehensive, though necessarily less integrative, systems of environmental statistics and indicators. Integrative frameworks for the different statistical systems in the fields of economy, environment and population would facilitate the provision of comparable data for the analysis of integrated development. (author). 19 refs, 2 figs, 2 tabs
Statistical inference for financial engineering
Taniguchi, Masanobu; Ogata, Hiroaki; Taniai, Hiroyuki
2014-01-01
This monograph provides the fundamentals of statistical inference for financial engineering and covers some selected methods suitable for analyzing financial time series data. In order to describe the actual financial data, various stochastic processes, e.g. non-Gaussian linear processes, non-linear processes, long-memory processes, locally stationary processes etc. are introduced and their optimal estimation is considered as well. This book also includes several statistical approaches, e.g., discriminant analysis, the empirical likelihood method, control variate method, quantile regression, realized volatility etc., which have been recently developed and are considered to be powerful tools for analyzing the financial data, establishing a new bridge between time series and financial engineering. This book is well suited as a professional reference book on finance, statistics and statistical financial engineering. Readers are expected to have an undergraduate-level knowledge of statistics.
Evaluation of clustering statistics with N-body simulations
International Nuclear Information System (INIS)
Quinn, T.R.
1986-01-01
Two series of N-body simulations are used to determine the effectiveness of various clustering statistics in revealing initial conditions from evolved models. All the simulations contained 16384 particles and were integrated with the PPPM code. One series is a family of models with power at only one wavelength. The family contains five models with the wavelength of the power separated by factors of √2. The second series is a family of all equal power combinations of two wavelengths taken from the first series. The clustering statistics examined are the two point correlation function, the multiplicity function, the nearest neighbor distribution, the void probability distribution, the distribution of counts in cells, and the peculiar velocity distribution. It is found that the covariance function, the nearest neighbor distribution, and the void probability distribution are relatively insensitive to the initial conditions. The distribution of counts in cells shows a little more sensitivity, but the multiplicity function is the best of the statistics considered for revealing the initial conditions.
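As a toy illustration of one statistic from this family, the counts-in-cells variance-to-mean ratio separates clustered from unclustered point sets. This 2D pure-Python sketch is unrelated to the paper's simulation code; the cell grid, cluster count, and scatter are invented for the example.

```python
import random

def counts_in_cells(points, n_cells=10):
    """Counts-in-cells on the unit square: the variance-to-mean ratio of
    cell counts is ~1 for a Poisson (unclustered) process and >1 for
    clustered distributions."""
    counts = [[0] * n_cells for _ in range(n_cells)]
    for x, y in points:
        i = min(int(x * n_cells), n_cells - 1)
        j = min(int(y * n_cells), n_cells - 1)
        counts[i][j] += 1
    flat = [c for row in counts for c in row]
    mean = sum(flat) / len(flat)
    var = sum((c - mean) ** 2 for c in flat) / len(flat)
    return var / mean

rng = random.Random(3)
uniform = [(rng.random(), rng.random()) for _ in range(2000)]
# clustered: points scattered tightly around a few centres
centres = [(rng.random(), rng.random()) for _ in range(10)]
clip = lambda v: min(max(v, 0.0), 0.999)
clustered = [(clip(cx + rng.gauss(0, 0.02)), clip(cy + rng.gauss(0, 0.02)))
             for cx, cy in (rng.choice(centres) for _ in range(2000))]
ratio_uniform = counts_in_cells(uniform)      # close to 1
ratio_clustered = counts_in_cells(clustered)  # much larger than 1
```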
Statistical inference based on divergence measures
Pardo, Leandro
2005-01-01
The idea of using functionals of Information Theory, such as entropies or divergences, in statistical inference is not new. However, in spite of the fact that divergence statistics have become a very good alternative to the classical likelihood ratio test and the Pearson-type statistic in discrete models, many statisticians remain unaware of this powerful approach. Statistical Inference Based on Divergence Measures explores classical problems of statistical inference, such as estimation and hypothesis testing, on the basis of measures of entropy and divergence. The first two chapters form an overview, from a statistical perspective, of the most important measures of entropy and divergence and study their properties. The author then examines the statistical analysis of discrete multivariate data with emphasis on problems in contingency tables and loglinear models using phi-divergence test statistics as well as minimum phi-divergence estimators. The final chapter looks at testing in general populations, prese...
Statistics for non-statisticians
Madsen, Birger Stjernholm
2016-01-01
This book was written for those who need to know how to collect, analyze and present data. It is meant to be a first course for practitioners, a book for private study or brush-up on statistics, and supplementary reading for general statistics classes. The book is untraditional, both with respect to the choice of topics and the presentation: topics were determined by what is most useful for practical statistical work, and the presentation is as non-mathematical as possible. The book contains many examples using statistical functions in spreadsheets. In this second edition, new topics have been included, e.g. within the area of statistical quality control, in order to make the book even more useful for practitioners working in industry.
Electric power annual 1997. Volume 2
Energy Technology Data Exchange (ETDEWEB)
NONE
1998-10-01
The Electric Power Annual 1997, Volume 2 contains annual summary statistics at national, regional, and state levels for the electric power industry, including information on both electric utilities and nonutility power producers. Included are data for electric utility retail sales of electricity, associated revenue, and average revenue per kilowatthour of electricity sold; financial statistics; environmental statistics; power transactions; and demand-side management. Also included are data for US nonutility power producers on installed capacity; gross generation; emissions; and supply and disposition of energy. The objective of the publication is to provide industry decisionmakers, government policymakers, analysts, and the general public with historical data that may be used in understanding US electricity markets. 15 figs., 62 tabs.
Huizingh, Eelko K. R. E.
2007-01-01
Accessibly written and easy to use, "Applied Statistics Using SPSS" is an all-in-one self-study guide to SPSS and do-it-yourself guide to statistics. What is unique about Eelko Huizingh's approach is that this book is based around the needs of undergraduate students embarking on their own research project, and its self-help style is designed to…
A goodness of fit statistic for the geometric distribution
Ferreira, J.A.
2003-01-01
We propose a goodness of fit statistic for the geometric distribution and compare it in terms of power, via simulation, with the chi-square statistic. The statistic is based on the Lau-Rao theorem and can be seen as a discrete analogue of the total time on test statistic. The results suggest that the test based on the new statistic is generally superior to the chi-square test.
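The simulation setup for such a power comparison can be sketched as below. This pure-Python example implements only the baseline chi-square goodness-of-fit test for a fitted geometric distribution (the Lau-Rao-based statistic itself is not reproduced here), with an illustrative binning choice.

```python
import random

def geom_sample(p, rng):
    # number of failures before the first success (support 0, 1, 2, ...)
    n = 0
    while rng.random() > p:
        n += 1
    return n

def chisq_geometric(xs, n_bins=5):
    """Pearson chi-square statistic for a fitted geometric distribution,
    pooling the upper tail into the last bin. This is the baseline test
    the proposed statistic is compared against."""
    n = len(xs)
    p_hat = 1.0 / (1.0 + sum(xs) / n)  # MLE for the failure-count form
    stat = 0.0
    for k in range(n_bins):
        if k < n_bins - 1:
            expected = n * p_hat * (1 - p_hat) ** k   # P(X = k)
            observed = sum(1 for x in xs if x == k)
        else:                                         # pooled tail bin
            expected = n * (1 - p_hat) ** k           # P(X >= k)
            observed = sum(1 for x in xs if x >= k)
        stat += (observed - expected) ** 2 / expected
    return stat

rng = random.Random(42)
data = [geom_sample(0.3, rng) for _ in range(200)]
stat = chisq_geometric(data)
```

Estimating power then amounts to repeating this on data drawn from an alternative distribution and counting rejections, exactly as in the mc-power pattern of simulation studies.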
Energy Technology Data Exchange (ETDEWEB)
Liukkonen, M.; Hentunen, A.; Kyyrae, J. (Department of Electrical Engineering, Helsinki University of Technology, Espoo (Finland)); Suomela, J. (Department of Automation and Systems, Helsinki University of Technology, Espoo (Finland))
2008-07-01
A method for rapid control prototyping of the series-hybrid transmission system is proposed in this paper. The rapid control prototyping needs simulation submodels from all system components in order to develop supervisory control software. The same simulation models can also be used to optimize the drive train. The target framework for the rapid control prototyping method is the original equipment manufacturer (OEM), where the objective is to build devices from subcontractor's components. The machinery industry, as a target group, uses high power ratings for the creation of motion, which leads to high voltage and current values used in the system. Therefore, prototyping is started with careful simulations. This paper also seeks to create a general idea about the structure of the series-hybrid power transmission and assists the start of the process for designing the supervisory control. (orig.)
Energy Technology Data Exchange (ETDEWEB)
Park, Dongwon; Park, Keunteag [Korea Inspection Co., Seoul (Korea, Republic of)
2013-05-15
Many countries have been actively pursuing research in this area because of these advantages, but as yet there are no official research activities in Korea. This paper analyses the current status of research in several countries and describes the possibilities for, and the necessity of, introducing such research in Korea. Radiation and radioactive waste, decay heat, and loss-of-coolant accidents (LOCA), all safety-related issues, repeatedly emerge as problems for the social acceptance of nuclear power plants. Only 3-5% of uranium fuel is U-235; the remaining U-238 is discarded entirely as nuclear waste under the 'Open Cycle.' Even if a thorium reactor is put into practical use, nuclear waste is still generated, but the toxicity of the TRU nuclides can be reduced significantly; moreover, the decay heat that was the physical cause of the Fukushima nuclear power plant accident does not occur, and there is no need to worry about a LOCA. Korean nuclear power plants have been operated very safely compared to those in Japan. In terms of unexpected operational stops over the five years before the Fukushima accident, the average value for Korean nuclear power plants was 0.5, compared with 0.4 for the 17 units of Tokyo Electric Power Co., Japan, a similar figure, and similar also to Kansai Electric Power Co. However, the total number of reported non-severe incidents there is much higher than in Korea. This cannot, however, be interpreted as meaning that Korean nuclear power plant operating skills are excellent compared to other countries. To address the public's anxiety about nuclear power plant operation after the Fukushima accident, nuclear operators emphasized the safety of Korean nuclear power plants with reference to the difference between PWRs and BWRs. Korean nuclear power plants were complemented with many facilities to address the problems revealed at the Fukushima nuclear power plant. But there appears to be a need for long-term research and development of next-generation nuclear
SOCR: Statistics Online Computational Resource
Dinov, Ivo D.
2006-01-01
The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis...
BrightStat.com: free statistics online.
Stricker, Daniel
2008-10-01
Powerful software for statistical analysis is expensive. Here I present BrightStat, statistical software running on the Internet which is free of charge. BrightStat's goals and its main capabilities and functionalities are outlined. Three different sample runs, a Friedman test, a chi-square test, and a step-wise multiple regression, are presented. The results obtained by BrightStat are compared with results computed by SPSS, one of the global leaders in statistical software, and VassarStats, a collection of scripts for data analysis running on the Internet. Elementary statistics is an inherent part of academic education and BrightStat is an alternative to commercial products.
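One of the sample runs mentioned, the Friedman test, reduces to a short rank computation. The following pure-Python sketch assumes no tied values within a block, and the data are hypothetical, invented for the example.

```python
def friedman_statistic(blocks):
    """Friedman chi-square statistic for repeated measures (no ties):
    Q = 12/(n*k*(k+1)) * sum(R_j^2) - 3*n*(k+1), where R_j is the sum of
    within-block ranks for treatment j."""
    n = len(blocks)      # subjects / blocks
    k = len(blocks[0])   # treatments
    rank_sums = [0.0] * k
    for row in blocks:
        order = sorted(range(k), key=lambda j: row[j])  # ascending values
        for rank, j in enumerate(order, start=1):       # rank 1 = smallest
            rank_sums[j] += rank
    return (12.0 / (n * k * (k + 1))) * sum(r * r for r in rank_sums) \
        - 3.0 * n * (k + 1)

# three treatments measured on four subjects (hypothetical data)
data = [[9, 7, 5], [8, 6, 4], [7, 5, 6], [9, 8, 5]]
q = friedman_statistic(data)  # compare against chi-square with k-1 df
```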
Yu, Ningle; Zhang, Yimei; Wang, Jin; Cao, Xingjiang; Fan, Xiangyong; Xu, Xiaosan; Wang, Furu
2012-01-01
The aims of this paper were to determine the level of knowledge of and attitude to nuclear power among residents around the Tianwan nuclear power plant in Jiangsu, China. A descriptive, cross-sectional design was adopted. 1,616 eligible participants, at least 18 years old and living within a 30 km radius of the Tianwan nuclear power plant, were recruited into our study and completed an epidemiological survey. Data were collected through self-administered questionnaires including a socio-demographic sheet. Inferential statistics, t-tests, ANOVA tests and multivariate regression analysis were used to compare the differences between subgroups, and correlation analysis was conducted to understand the relationship between different factors and the dependent variables. Our investigation found that the levels of awareness and acceptance of nuclear power were generally not high. Respondents' gender, age, marital status, residence, educational level, family income and distance from the nuclear power plant are important factors affecting knowledge of and attitude to nuclear power. Public concern about nuclear energy's impact is widespread. The level of awareness and acceptance of nuclear power needs to be improved urgently.
Energy statistics: A manual for developing countries
International Nuclear Information System (INIS)
1991-01-01
Considerable advances have been made by developing countries during the last 20 years in the collection and compilation of energy statistics. The present Manual is a guide, which it is hoped will be used in countries whose system of statistics is less advanced, to identify the main areas that should be developed and how this might be achieved. The generally accepted aim is for countries to be able to compile statistics annually on the main characteristics shown for each fuel, and for energy in total. These characteristics are mainly concerned with production, supply and consumption, but others relating to the size and capabilities of the different energy industries may also be of considerable importance. The initial task of collecting data from the energy industries (mines, oil producers, refineries and distributors, electrical power stations, etc.) may well fall to a number of organizations. 'Energy' from a statistical point of view is the sum of the component fuels, and good energy statistics are therefore dependent on good fuel statistics. For this reason a considerable part of this Manual is devoted to the production of regular, comprehensive and reliable statistics relating to individual fuels. Chapters V to IX of this Manual are concerned with identifying the flows of energy, from production to final consumption, for each individual fuel, and how data on these flows might be expected to be obtained. The very different problems concerned with the collection of data on the flows for biomass fuels are covered in chapter X. The data needed to complete the picture of the national scene for each individual fuel, more concerned with describing the size, capabilities and efficiency of the industries related to that fuel, are discussed in chapter XI. Annex I sets out the relationships between the classifications of the various types of fuels. The compilation of energy balances from the data obtained for individual fuels is covered in chapter XIII. Finally, chapter
Sumner, Jeremy G; Taylor, Amelia; Holland, Barbara R; Jarvis, Peter D
2017-12-01
Recently there has been renewed interest in phylogenetic inference methods based on phylogenetic invariants, alongside the related Markov invariants. Broadly speaking, both these approaches give rise to polynomial functions of sequence site patterns that, in expectation value, either vanish for particular evolutionary trees (in the case of phylogenetic invariants) or have well understood transformation properties (in the case of Markov invariants). While both approaches have been valued for their intrinsic mathematical interest, it is not clear how they relate to each other, and to what extent they can be used as practical tools for inference of phylogenetic trees. In this paper, by focusing on the special case of binary sequence data and quartets of taxa, we are able to view these two different polynomial-based approaches within a common framework. To motivate the discussion, we present three desirable statistical properties that we argue any invariant-based phylogenetic method should satisfy: (1) sensible behaviour under reordering of input sequences; (2) stability as the taxa evolve independently according to a Markov process; and (3) explicit dependence on the assumption of a continuous-time process. Motivated by these statistical properties, we develop and explore several new phylogenetic inference methods. In particular, we develop a statistically bias-corrected version of the Markov invariants approach which satisfies all three properties. We also extend previous work by showing that the phylogenetic invariants can be implemented in such a way as to satisfy property (3). A simulation study shows that, in comparison to other methods, our new proposed approach based on bias-corrected Markov invariants is extremely powerful for phylogenetic inference. The binary case is of particular theoretical interest as-in this case only-the Markov invariants can be expressed as linear combinations of the phylogenetic invariants. A wider implication of this is that, for
Applied statistics a handbook of BMDP analyses
Snell, E J
1987-01-01
This handbook is a realization of a long term goal of BMDP Statistical Software. As the software supporting statistical analysis has grown in breadth and depth to the point where it can serve many of the needs of accomplished statisticians it can also serve as an essential support to those needing to expand their knowledge of statistical applications. Statisticians should not be handicapped by heavy computation or by the lack of needed options. When Applied Statistics, Principle and Examples by Cox and Snell appeared we at BMDP were impressed with the scope of the applications discussed and felt that many statisticians eager to expand their capabilities in handling such problems could profit from having the solutions carried further, to get them started and guided to a more advanced level in problem solving. Who would be better to undertake that task than the authors of Applied Statistics? A year or two later discussions with David Cox and Joyce Snell at Imperial College indicated that a wedding of the proble...
Ma, Li-Xin; Liu, Jian-Ping
2012-01-01
To investigate whether randomized controlled trials (RCTs) of Chinese medicine for patients with type 2 diabetes mellitus (T2DM) had sample sizes adequate to give the reported effect sizes sufficient statistical power. The China Knowledge Resource Integrated Database (CNKI), VIP Database for Chinese Technical Periodicals (VIP), Chinese Biomedical Database (CBM), and Wanfang Data were systematically searched using terms like "Xiaoke" or diabetes, Chinese herbal medicine, patent medicine, traditional Chinese medicine, randomized, controlled, blinded, and placebo-controlled. A limit of an intervention course of 3 months or longer was set in order to identify the information on outcome assessment and sample size. Data collection forms were made according to the checklists in the CONSORT statement. Independent double data extraction was performed on all included trials. The statistical power of the effect size for each RCT was assessed using sample size calculation equations. (1) A total of 207 RCTs were included: 111 superiority trials and 96 non-inferiority trials. (2) Among the 111 superiority trials, the fasting plasma glucose (FPG) and glycosylated hemoglobin (HbA1c) outcome measures were reported in 9% and 12% of the RCTs, respectively, with a sample size > 150 in each trial. For the outcome HbA1c, only 10% of the RCTs had more than 80% power; for FPG, 23% of the RCTs had more than 80% power. (3) In the 96 non-inferiority trials, the outcomes FPG and HbA1c were reported in 31% and 36%, respectively, of RCTs with a sample size > 150. For HbA1c only 36% of the RCTs had more than 80% power; for FPG, only 27% of the studies had more than 80% power. The sample sizes used for statistical analysis were distressingly low and most RCTs did not achieve 80% power. In order to obtain sufficient statistical power, it is recommended that clinical trials first establish a clear research objective and hypothesis, and choose scientific and evidence
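The sample-size calculation referred to can be illustrated with a normal-approximation power formula for a two-sided, two-sample mean-difference test. This is a simplified sketch, not the equations used in the review, and the numbers below (an HbA1c difference of 0.5% with SD 1.2%) are hypothetical.

```python
from statistics import NormalDist

def power_two_sample(n_per_group, delta, sd, alpha=0.05):
    """Approximate power of a two-sided two-sample z-test for a true mean
    difference delta, with common SD sd and n subjects per group."""
    z = NormalDist()
    se = sd * (2.0 / n_per_group) ** 0.5          # SE of the difference
    z_crit = z.inv_cdf(1 - alpha / 2)
    # probability the standardized difference clears the critical value
    return 1 - z.cdf(z_crit - abs(delta) / se)

def n_for_power(delta, sd, power=0.80, alpha=0.05):
    """Smallest n per group reaching the target power (brute-force search)."""
    n = 2
    while power_two_sample(n, delta, sd, alpha) < power:
        n += 1
    return n

# hypothetical trial: detect a 0.5% HbA1c difference, SD 1.2%, 80% power
n_needed = n_for_power(delta=0.5, sd=1.2)
```

Equivalently, the closed form n = 2(z_{1-α/2} + z_{1-β})² σ²/δ² gives essentially the same answer; the search above just avoids rounding ambiguity.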
SIESE - trimestrial bulletin - Synthesis 1994. Electric power summary statistics for Brazil
International Nuclear Information System (INIS)
1994-01-01
The performance of the power system of all the Brazilian electrical utilities is presented. The data is given for each region in the country and includes, among other things, the electric power consumption and generation; the number of consumers and the electric power rates. 10 figs., 42 tabs
Mieth, Bettina; Kloft, Marius; Rodríguez, Juan Antonio; Sonnenburg, Sören; Vobruba, Robin; Morcillo-Suárez, Carlos; Farré, Xavier; Marigorta, Urko M.; Fehr, Ernst; Dickhaus, Thorsten; Blanchard, Gilles; Schunk, Daniel; Navarro, Arcadi; Müller, Klaus-Robert
2016-01-01
The standard approach to the analysis of genome-wide association studies (GWAS) is based on testing each position in the genome individually for statistical significance of its association with the phenotype under investigation. To improve the analysis of GWAS, we propose a combination of machine learning and statistical testing that takes correlation structures within the set of SNPs under investigation in a mathematically well-controlled manner into account. The novel two-step algorithm, COMBI, first trains a support vector machine to determine a subset of candidate SNPs and then performs hypothesis tests for these SNPs together with an adequate threshold correction. Applying COMBI to data from a WTCCC study (2007) and measuring performance as replication by independent GWAS published within the 2008–2015 period, we show that our method outperforms ordinary raw p-value thresholding as well as other state-of-the-art methods. COMBI presents higher power and precision than the examined alternatives while yielding fewer false (i.e. non-replicated) and more true (i.e. replicated) discoveries when its results are validated on later GWAS studies. More than 80% of the discoveries made by COMBI upon WTCCC data have been validated by independent studies. Implementations of the COMBI method are available as a part of the GWASpi toolbox 2.0. PMID:27892471
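The two-step screen-then-test idea can be sketched as follows. Note that this toy version substitutes a simple univariate correlation score for the SVM weights of the actual COMBI method, the data are simulated, and all names and thresholds are invented for the example; the point is only that testing a screened subset shrinks the multiple-testing correction.

```python
import random
from statistics import NormalDist

def corr(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def screen_then_test(genos, pheno, keep=5, alpha=0.05):
    """Step 1: rank SNPs by a screening score (|correlation| here, standing
    in for the SVM weights). Step 2: test only the retained subset with a
    Bonferroni threshold over `keep` instead of over all SNPs."""
    scores = [abs(corr(g, pheno)) for g in genos]
    top = sorted(range(len(genos)), key=lambda i: -scores[i])[:keep]
    nd, n = NormalDist(), len(pheno)
    hits = []
    for i in top:
        z = scores[i] * (n - 1) ** 0.5      # crude normal approximation
        p = 2 * (1 - nd.cdf(z))
        if p < alpha / keep:
            hits.append(i)
    return hits

rng = random.Random(7)
n = 200
causal = [rng.choice([0, 1, 2]) for _ in range(n)]       # SNP 0 is causal
pheno = [c + rng.gauss(0, 1) for c in causal]
genos = [causal] + [[rng.choice([0, 1, 2]) for _ in range(n)]
                    for _ in range(49)]                   # 49 null SNPs
hits = screen_then_test(genos, pheno)                     # should include 0
```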
Permutation statistical methods an integrated approach
Berry, Kenneth J; Johnston, Janis E
2016-01-01
This research monograph provides a synthesis of a number of statistical tests and measures, which, at first consideration, appear disjoint and unrelated. Numerous comparisons of permutation and classical statistical methods are presented, and the two methods are compared via probability values and, where appropriate, measures of effect size. Permutation statistical methods, compared to classical statistical methods, do not rely on theoretical distributions, avoid the usual assumptions of normality and homogeneity of variance, and depend only on the data at hand. This text takes a unique approach to explaining statistics by integrating a large variety of statistical methods, and establishing the rigor of a topic that to many may seem to be a nascent field in statistics. This topic is new in that it took modern computing power to make permutation methods available to people working in the mainstream of research. This research monograph addresses a statistically-informed audience, and can also easily serve as a ...
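The defining feature described here, a null distribution built from the data themselves rather than from a theoretical distribution, fits in a few lines. A minimal two-sample permutation test on the difference of means, with hypothetical data:

```python
import random

def permutation_test(a, b, n_perm=5000, seed=0):
    """Two-sample permutation test on the absolute difference of means.
    The p-value is the fraction of random relabelings of the pooled data
    producing a difference at least as extreme as the observed one."""
    rng = random.Random(seed)
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = list(a) + list(b)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        pa, pb = pooled[:len(a)], pooled[len(a):]
        if abs(sum(pa) / len(a) - sum(pb) / len(b)) >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)  # add-one (conservative) correction

p = permutation_test([5.1, 4.8, 5.6, 5.2, 4.9],
                     [6.3, 6.1, 6.8, 6.0, 6.4])
```

No normality or equal-variance assumption is used anywhere, which is exactly the contrast with classical methods that the monograph develops; the "modern computing power" remark corresponds to the `n_perm` loop.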
Quantum fluctuation theorems and power measurements
International Nuclear Information System (INIS)
Prasanna Venkatesh, B; Watanabe, Gentaro; Talkner, Peter
2015-01-01
Work in the paradigm of the quantum fluctuation theorems of Crooks and Jarzynski is determined by projective measurements of energy at the beginning and end of the force protocol. In analogy to classical systems, we consider an alternative definition of work given by the integral of the supplied power determined by integrating up the results of repeated measurements of the instantaneous power during the force protocol. We observe that such a definition of work, in spite of taking account of the process dependence, has different possible values and statistics from the work determined by the conventional two energy measurement approach (TEMA). In the limit of many projective measurements of power, the system’s dynamics is frozen in the power measurement basis due to the quantum Zeno effect leading to statistics only trivially dependent on the force protocol. In general the Jarzynski relation is not satisfied except for the case when the instantaneous power operator commutes with the total Hamiltonian at all times. We also consider properties of the joint statistics of power-based definition of work and TEMA work in protocols where both values are determined. This allows us to quantify their correlations. Relaxing the projective measurement condition, weak continuous measurements of power are considered within the stochastic master equation formalism. Even in this scenario the power-based work statistics is in general not able to reproduce qualitative features of the TEMA work statistics. (paper)
Statistics 101 for Radiologists.
Anvari, Arash; Halpern, Elkan F; Samir, Anthony E
2015-10-01
Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
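Several of the diagnostic-test quantities listed follow directly from a 2x2 confusion table. A small sketch with made-up counts (the function name and numbers are illustrative only):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Core diagnostic-test statistics from a 2x2 table:
    rows = test result, columns = true disease status."""
    sens = tp / (tp + fn)                    # sensitivity (true positive rate)
    spec = tn / (tn + fp)                    # specificity (true negative rate)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    lr_pos = sens / (1 - spec)               # positive likelihood ratio
    lr_neg = (1 - sens) / spec               # negative likelihood ratio
    return {"sensitivity": sens, "specificity": spec,
            "accuracy": accuracy, "LR+": lr_pos, "LR-": lr_neg}

# hypothetical screening study: 100 diseased, 100 healthy subjects
m = diagnostic_metrics(tp=90, fp=20, fn=10, tn=80)
```

A large LR+ (here 4.5) shifts the post-test odds of disease upward after a positive result; an LR- near zero does the converse for a negative result.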
Buildings cover part of their own power needs
International Nuclear Information System (INIS)
Schmid, W.
2010-01-01
This article takes a look at the topic of 'Smart Grids', as dealt with in the USA. The author notes that the topic is mostly the subject of work groups and the Administration, but that many interest groups are dealing with this 'innovation of the century'. It is, however, not always clear what exactly is meant by the term 'Smart Grid'. The article deals with the combination of a smart grid with a smart building, with interaction between the power supply and building technical services. In particular the BAC-Net system in the smart grid environment is dealt with in an interview with H. Michael Newmann of Cornell University, New York as well as with Steven T. Bushby and David Holmberg of the National Institute of Standards and Technology NIST. Topics discussed include interest shown by power utilities, state and market support, the integration of power-pricing data in the BAC-Net protocol, power 'stock exchanges' and electricity pricing as well as interaction between the utility and building management systems. Also, smart grids for normal households are discussed, as are a road map for the introduction of the technology and financial advantages for building managers
International Nuclear Information System (INIS)
Dato Syed Ahmad Idid, S.N. K. A.-I.
2015-01-01
Business should not be as usual in formulating strategies and plans to enhance awareness of the benefits of nuclear power as an option for the energy mix. Although the 435 nuclear power reactors presently in operation in 30 countries deliver cost-competitive electricity to consumers, create significant job, investment and business opportunities, support enterprises, and contribute significantly to these nations' economic growth, these positive impacts and benefits have not been sufficiently communicated to the various stakeholders and the population, who until recently have received only unbalanced views and news from an uninformed press. Negative and generally unbalanced press coverage of isolated nuclear incidents and accidents such as TMI, Chernobyl and, most recently, Fukushima has resulted in public protests against nuclear power, contributing to several nuclear power programmes being delayed or unable to take off. This situation is further exacerbated by uninformed politicians and policy makers who have influence but were unable to use their positions to reassure the public, owing to a lack of knowledge of the economic and social benefits of nuclear power. As the challenges to the nuclear industry now also include ageing nuclear professionals and a lack of updates on business and financing opportunities for business and financing professionals, the benefits of career, business and financing opportunities must also be disseminated to these professionals. This paper aims to highlight the fundamental need to expand the present Public Awareness Programme into a 5Ps (Politicians, Policy makers, Professionals, Public and Press) Awareness Programme on Nuclear Power. (author)
Enrichment of statistical power for genome-wide association studies
The inheritance of most human diseases and agriculturally important traits is controlled by many genes with small effects. Identifying these genes, while simultaneously controlling false positives, is challenging. Among available statistical methods, the mixed linear model (MLM) has been the most fl...
Energy Technology Data Exchange (ETDEWEB)
Batra, R.K. (ed.)
2012-12-15
This policy brief discusses the challenges of water availability and the opportunity to improve water use efficiency in industries, especially thermal power plants. It presents TERI’s experience from comprehensive water audits conducted for thermal power plants in India. The findings indicate that there is significant scope for saving water in the waste water discharge, cooling towers, ash handling systems, and the township water supply. Interventions like recycling wastewater, curbing leakages, increasing CoC (cycles of concentration) in cooling towers, and using dry ash handling can significantly reduce the specific water consumption in power plants. However, the first step is undertaking regular water audits. The policy brief highlights the need for mandatory water audits to understand current water use and losses as well as to identify opportunities for water conservation, reduction in specific water consumption, and an overall improvement in water use efficiency in industries.
Analysis of statistical misconception in terms of statistical reasoning
Maryati, I.; Priatna, N.
2018-05-01
Reasoning skill is needed by everyone in the globalization era, because every person must be able to manage and use information that can now be obtained easily from all over the world. Statistical reasoning skill is the ability to collect, group, process, and interpret information and to draw conclusions from it. This skill can be developed at various levels of education. However, the skill remains low because many people, students included, assume that statistics is merely the ability to count and use formulas. Students also still have negative attitudes toward courses related to research. The purpose of this research is to analyze students' misconceptions in a descriptive statistics course in relation to statistical reasoning skill. The observation was done by analyzing the results of a misconception test and a statistical reasoning skill test, and by observing the effect of students' misconceptions on statistical reasoning skill. The sample of this research was 32 students of a mathematics education department who had taken the descriptive statistics course. The mean score on the misconception test was 49.7 with a standard deviation of 10.6, whereas the mean score on the statistical reasoning skill test was 51.8 with a standard deviation of 8.5. If a minimum score of 65 marks the standard achievement of course competence, the students' mean scores fall below that standard. The misconception results highlight which subtopics should be given attention. Based on the assessment results, students' misconceptions occurred in: 1) writing mathematical sentences and symbols correctly, 2) understanding basic definitions, and 3) determining the concept to be used in solving a problem. Statistical reasoning skill was assessed on reasoning about: 1) data, 2) representation, 3) statistical format, 4) probability, 5) samples, and 6) association.
Phu, Jack; Bui, Bang V; Kalloniatis, Michael; Khuu, Sieu K
2018-03-01
The number of subjects needed to establish the normative limits for visual field (VF) testing is not known. Using bootstrap resampling, we determined whether the ground truth mean, distribution limits, and standard deviation (SD) could be approximated using different set size (x) levels, in order to provide guidance for the number of healthy subjects required to obtain robust VF normative data. We analyzed the 500 Humphrey Field Analyzer (HFA) SITA-Standard results of 116 healthy subjects and 100 HFA full threshold results of 100 psychophysically experienced healthy subjects. These VFs were resampled (bootstrapped) to determine mean sensitivity, distribution limits (5th and 95th percentiles), and SD for different x and numbers of resamples. We also used the VF results of 122 glaucoma patients to determine the performance of ground truth and bootstrapped results in identifying and quantifying VF defects. An x of 150 (for SITA-Standard) and 60 (for full threshold) produced bootstrapped descriptive statistics that were no longer different from the original distribution limits and SD. Removing outliers produced similar results. Differences between original and bootstrapped limits in detecting glaucomatous defects were minimized at x = 250. Ground truth statistics of VF sensitivities could be approximated using set sizes that are significantly smaller than the original cohort. Outlier removal facilitates the use of Gaussian statistics and does not significantly affect the distribution limits. We provide guidance for choosing the cohort size for different levels of error when performing normative comparisons with glaucoma patients.
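The bootstrap procedure described above can be sketched as follows. This is a minimal illustration using a synthetic cohort of mean sensitivities rather than the HFA data analyzed in the study; the cohort parameters (normal, mean 30 dB, SD 2 dB) are illustrative assumptions:

```python
import random
import statistics

def limits(values):
    """Mean and 5th/95th percentile limits of a sample (linear interpolation)."""
    vals = sorted(values)
    def pct(p):
        k = (len(vals) - 1) * p
        lo = int(k)
        hi = min(lo + 1, len(vals) - 1)
        return vals[lo] + (vals[hi] - vals[lo]) * (k - lo)
    return statistics.mean(vals), pct(0.05), pct(0.95)

def bootstrap_limits(cohort, set_size, n_resamples=500, seed=1):
    """Average the descriptive statistics of n_resamples random subsets
    of size set_size, drawn with replacement from the full cohort."""
    rng = random.Random(seed)
    stats = [limits([rng.choice(cohort) for _ in range(set_size)])
             for _ in range(n_resamples)]
    return (statistics.mean(s[0] for s in stats),
            statistics.mean(s[1] for s in stats),
            statistics.mean(s[2] for s in stats))

# Synthetic "ground truth" cohort: 500 mean sensitivities (dB), roughly N(30, 2)
rng = random.Random(0)
cohort = [rng.gauss(30.0, 2.0) for _ in range(500)]

m, p5, p95 = bootstrap_limits(cohort, set_size=150)
print(m, p5, p95)
```

In practice one would compare the bootstrapped limits at increasing set sizes against the full-cohort limits and stop once the differences are no longer meaningful, which is how the set sizes of 150 and 60 above were identified.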
Directory of Open Access Journals (Sweden)
Rochelle E. Tractenberg
2016-12-01
Statistical literacy is essential to an informed citizenry, and two emerging trends highlight a growing need for training that achieves this literacy. The first trend is towards "big" data: while automated analyses can exploit massive amounts of data, the interpretation, and perhaps more importantly the replication, of results are challenging without adequate statistical literacy. The second trend is that science and scientific publishing are struggling with insufficient or inappropriate statistical reasoning in writing, reviewing, and editing. This paper describes a model for statistical literacy (SL) and its development that can support modern scientific practice. An established curriculum development and evaluation tool, the Mastery Rubric, is integrated with a new developmental model of statistical literacy that reflects the complexity of reasoning and the habits of mind that scientists need to cultivate in order to recognize, choose, and interpret statistical methods. This developmental model provides actionable evidence and explicit opportunities for consequential assessment that serves students, instructors, developers/reviewers/accreditors of a curriculum, and institutions. By supporting the enrichment, rather than increasing the amount, of statistical training in the basic and life sciences, this approach supports curriculum development, evaluation, and delivery to promote statistical literacy for students and a collective quantitative proficiency more broadly.
Manpower requirements in the nuclear power industry, 1982-1991
International Nuclear Information System (INIS)
Johnson, R.C.
1982-09-01
The objective of this study is to project occupational employment needs, created by growth and employee turnover, for the nuclear power industry over the next decade. Employment data for 1981 were collected in a survey conducted by the Institute of Nuclear Power Operations of its 60 member utilities. The data were analyzed statistically to identify factors that account for variations in power plant staffing and the number of off-site nuclear support personnel employed by a utility. Total employment in the nuclear power industry is predicted to increase from 54,400 in 1981 to 73,600 in 1991. Nuclear generating capacity will increase from 58 to 124 gigawatts, based on the midline forecast of the Energy Information Administration. The projections assume that current regulations will remain in effect and that no new plans for additional generating facilities will be initiated.
Electric power annual 1989. [Contains glossary
Energy Technology Data Exchange (ETDEWEB)
1991-01-17
This publication presents a summary of electric utility statistics at the national, regional and state levels. The "Industry At A Glance" section presents a profile of the electric power industry ownership and performance; a review of key statistics for the year; and projections for various aspects of the electric power industry through 2010. Subsequent sections present data on generating capability, including proposed capability additions; net generation; fossil-fuel statistics; electricity sales, revenue and average revenue per kilowatthour sold; financial statistics; environmental statistics; and electric power transactions. In addition, the appendices provide supplemental data on major disturbances and unusual occurrences. Each section contains related text and tables and refers the reader to the appropriate publication that contains more detailed data on the subject matter. 24 figs., 57 tabs.
Jana, Madhusudan
2015-01-01
This self-contained book presents statistical mechanics in a lucid manner, keeping in mind university examination systems. The need to study the subject and its relation to thermodynamics are discussed in detail. Starting from the Liouville theorem, statistical mechanics is developed gradually and thoroughly. All three types of statistical distribution functions are derived separately, with their range of applications and limitations. Non-interacting ideal Bose and Fermi gases are discussed thoroughly. Properties of liquid He-II and the corresponding models are depicted. White dwarfs, condensed matter physics, and transport phenomena (thermal and electrical conductivity, the Hall effect, magnetoresistance, viscosity, diffusion, etc.) are discussed. A basic understanding of the Ising model is given to explain phase transitions. The book ends with detailed coverage of the method of ensembles (microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar...
Dunn, Karee
2014-01-01
Online graduate education programs are expanding rapidly. Many of these programs require a statistics course, resulting in an increasing need for online statistics courses. The study reported here grew from experiences teaching online, graduate statistics courses. In seeking answers on how to improve this class, I discovered that research has yet…
Energy statistics of pre-revolutionary Russia
Directory of Open Access Journals (Sweden)
N. S. Simonov
2017-01-01
The article is devoted to the creation and development of the energy statistics of the Russian Empire during the initial stage of electrification and the formation of the energy economy, which is related to: 1) the economic upsurge of the 1890s; 2) the new economic recovery of 1907-1913; and 3) the militarization of industry in 1914-1916. The real technical and economic indicators and comprehensive statistical data of the pre-revolutionary electric power industry were either hushed up or deliberately distorted during the Soviet era. Even in the encyclopaedic literature it was held that pre-revolutionary Russia "was in one of the last places in the world" in the production of electricity. The author analyzes the statistical surveys (censuses) of the manufacturing industry for 1900 and 1908 (the "Varzar censuses"), which gave the first material on the state of its energy sector, namely the composition, quantity and power of primary engines and electric motors. For the first time in the historiography, data are cited from the "energy censuses" of the Ministry of Finance for 1905 and 1913 on the number and capacity of central public power stations and of private power stations (block stations) of industrial enterprises, organizations and institutions. The censuses were conducted with the participation of the factory inspection apparatus in 1906 and 1916 in all provinces of the Russian Empire, with the exception of six provinces in the frontline zone. A great deal of work on recording electricity production and consumption was done by the Russian electrotechnical community. According to incomplete data published in 1917 by the Secretariat of the Standing Committee of the VII All-Russia Electrotechnical Congress, from 1905 to 1913 (that is, over 8 years) the total number of power stations in the Russian Empire increased 1.7-fold, and the amount of electricity they produced increased 3.8-fold.
Directory of Open Access Journals (Sweden)
Necka Krzysztof
2018-01-01
The aim of the study was to analyse the energy needs of selected consumers and to select the installed PV capacity for an east-west (E-W) panel orientation. The analysis was carried out for two variants. The first concerned the impact of the installed power of a PV source on the contracted power with symmetric east and west panel positions. The second variant followed from the authors' previous observations and studies: the unbalanced power of PV E-W relative to the S azimuth was analyzed, taking into account the characteristics of the consumers' needs and the variation of the installed PV power. The analyses show that an increase in meeting the energy demand for two of the tested plants with symmetrical power distribution to the east and west occurred only when the installed power was increased to approximately 1.4-1.6 times the contracted power. However, the power distribution in the E-W direction has a very strong effect on the amount of energy generated in the power plant that cannot be used owing to the lack of demand in the plant.
Gibbs' theorem for open systems with incomplete statistics
International Nuclear Information System (INIS)
Bagci, G.B.
2009-01-01
Gibbs' theorem, which was originally intended for canonical ensembles with complete statistics, has been generalized to open systems with incomplete statistics. As a result of this generalization, it is shown that the stationary equilibrium distribution of inverse power law form associated with the incomplete statistics has maximum entropy even for open systems with energy or matter influx. The renormalized entropy definition given in this paper can also serve as a measure of self-organization in open systems described by incomplete statistics.
State analysis of BOP using statistical and heuristic methods
International Nuclear Information System (INIS)
Heo, Gyun Young; Chang, Soon Heung
2003-01-01
Under the deregulated environment, the performance enhancement of the balance of plant (BOP) in nuclear power plants is being highlighted. To analyze the performance level of the BOP, the performance test procedures provided by an authorized institution such as ASME are used. However, plant investigation showed that the requirements of the performance test procedures on the reliability and quantity of sensors were difficult to satisfy. As a solution, a state analysis method, an expanded concept of signal validation, was proposed on the basis of statistical and heuristic approaches. The authors recommend a statistical linear regression model, obtained by analyzing correlations among BOP parameters, as the reference state analysis method. Its advantages are that its derivation is not heuristic, its model uncertainty can be calculated, and it is easy to apply to an actual plant. The error of the statistical linear regression model is below 3% under normal as well as abnormal system states. Additionally, a neural network model is recommended, since the statistical model cannot be applied to validate all of the sensors and is sensitive to outliers, i.e. signals lying outside the statistical distribution. Because many sensors in the BOP need to be validated, wavelet analysis (WA) was applied as a pre-processor to reduce the input dimension and to enhance training accuracy. The outlier localization capability of WA enhanced the robustness of the neural network. The trained neural network restored the degraded signals to values within ±3% of the true signals.
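The regression-based state analysis described above can be sketched in miniature as follows. The sensor names, the single-predictor fit, and the 3% residual threshold are all illustrative assumptions standing in for the full multi-parameter BOP model:

```python
# Predict one BOP parameter from a correlated one with ordinary least squares,
# then flag readings whose residual exceeds a relative tolerance band.

def fit_ols(x, y):
    """Return (slope, intercept) of the least-squares line y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    return a, my - a * mx

# Hypothetical training data: feedwater flow (kg/s) vs. generator output (MWe)
flow  = [480.0, 500.0, 520.0, 540.0, 560.0]
power = [960.0, 1000.0, 1040.0, 1080.0, 1120.0]
a, b = fit_ols(flow, power)

def validate(flow_reading, power_reading, tol=0.03):
    """True if the measured power is within tol of the model prediction."""
    predicted = a * flow_reading + b
    return abs(power_reading - predicted) / predicted <= tol

print(validate(510.0, 1021.0))   # consistent reading -> True
print(validate(510.0, 1150.0))   # drifted sensor, flagged -> False
```

The actual method regresses each validated parameter on several correlated BOP parameters and also carries a model uncertainty term; this single-predictor version only illustrates the residual-band idea.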
Directory of Open Access Journals (Sweden)
CHEN, Z.
2014-11-01
Impulse noise in the power line communication (PLC) channel seriously degrades the performance of Multiple-Input Multiple-Output (MIMO) systems. To remedy this problem, a MIMO detection method based on fractional lower order statistics (FLOS) is proposed in this paper for PLC channels with impulse noise. The alpha-stable distribution is used to model the impulse noise, and FLOS is applied to construct the MIMO detection criteria. The optimal detection solution is then obtained by a recursive least squares algorithm. Finally, the transmitted signals in the PLC MIMO system are restored with the obtained detection matrix. The proposed method does not require channel estimation and has low computational complexity. The simulation results show that the proposed method outperforms existing PLC MIMO detection methods in impulsive noise environments.
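The intuition behind FLOS can be sketched as follows, under simplifying assumptions: for alpha-stable impulse noise the second moment diverges, but fractional lower-order moments E|X|^p with p below alpha stay finite, which is why detection criteria built on them remain stable. The example uses standard Cauchy noise (alpha = 1) as a stand-in for PLC impulse noise; it is not the paper's detector, only a demonstration of the moment behaviour:

```python
import math
import random

def cauchy_sample(rng):
    """Standard Cauchy variate via inverse CDF (alpha-stable, alpha = 1)."""
    return math.tan(math.pi * (rng.random() - 0.5))

def flom(samples, p):
    """Fractional lower-order moment estimate of E|X|^p."""
    return sum(abs(x) ** p for x in samples) / len(samples)

rng = random.Random(42)
noise = [cauchy_sample(rng) for _ in range(20000)]

# The second-order moment (variance proxy) is unstable for Cauchy noise...
print("E|X|^2 estimate:", flom(noise, 2.0))   # typically huge; grows with n
# ...but a fractional moment with p < alpha is well behaved.
# For standard Cauchy, E|X|^p = 1 / cos(pi*p/2); with p = 0.4 that is about 1.236.
print("E|X|^0.4 estimate:", flom(noise, 0.4))
```

A FLOS-based detector replaces second-order correlations in the cost function with such p-th order quantities, which is what makes it robust where least-squares criteria break down.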
Nonparametric statistics a step-by-step approach
Corder, Gregory W
2014-01-01
"…a very useful resource for courses in nonparametric statistics in which the emphasis is on applications rather than on theory. It also deserves a place in libraries of all institutions where introductory statistics courses are taught." - CHOICE. This Second Edition presents a practical and understandable approach that enhances and expands the statistical toolset for readers. This book includes: new coverage of the sign test and the Kolmogorov-Smirnov two-sample test in an effort to offer a logical and natural progression to statistical power; SPSS® (Version 21) software and updated screen ca
D'Alessio, Michael
2012-01-01
AP Statistics Crash Course - Gets You a Higher Advanced Placement Score in Less Time Crash Course is perfect for the time-crunched student, the last-minute studier, or anyone who wants a refresher on the subject. AP Statistics Crash Course gives you: Targeted, Focused Review - Study Only What You Need to Know Crash Course is based on an in-depth analysis of the AP Statistics course description outline and actual Advanced Placement test questions. It covers only the information tested on the exam, so you can make the most of your valuable study time. Our easy-to-read format covers: exploring da
Introduction to Statistics for Biomedical Engineers
Ropella, Kristina
2007-01-01
There are many books written about statistics, some brief, some detailed, some humorous, some colorful, and some quite dry. Each of these texts is designed for a specific audience. Too often, texts about statistics have been rather theoretical and intimidating for those not practicing statistical analysis on a routine basis. Thus, many engineers and scientists, who need to use statistics much more frequently than calculus or differential equations, lack sufficient knowledge of the use of statistics. The audience that is addressed in this text is the university-level biomedical engineering stud
Situation Aware Assessment of Regulating Power Need and Resource
DEFF Research Database (Denmark)
Heussen, Kai
2009-01-01
Distributed generation and renewable energy sources are both a new disturbance and a new regulation resource. Which one they are depends to a large extent on the facilitation of control capabilities that, for example, modern wind turbines can provide. Most renewable energy sources are quite unlike classical power plants, but have the capability to provide a number of ancillary services. It is envisioned that wind power may at times provide a certain share of system stabilization, but it must also be seen that this contribution is limited to only a part of the required functions and that it fluctuates with the available wind. The approach proposed in this paper uses a functional classification to sort out the control requirements of a power system with a high share of fluctuating renewable and distributed energy sources and aims to combine it with a structured quantitative assessment.
International Nuclear Information System (INIS)
Maskewitz, B.F.; Trubey, D.K.; Roussin, R.W.; McGill, B.L.
1976-04-01
The Radiation Shielding Information Center (RSIC) is engaged in a program to seek out, organize, and disseminate information in the area of radiation transport, shielding, and radiation protection. This information consists of published literature, nuclear data, and computer codes and advanced analytical techniques required by ERDA, its contractors, and the nuclear power industry to improve radiation analysis and computing capability. Information generated in this effort becomes a part of the RSIC collection and/or data base. The purpose of this report on project 219-1 is to document the results of the survey of information and computer code needs of the nuclear power industry in the area of radiation analysis and protection
Statistical thermodynamics of nonequilibrium processes
Keizer, Joel
1987-01-01
The structure of the theory of thermodynamics has changed enormously since its inception in the middle of the nineteenth century. Shortly after Thomson and Clausius enunciated their versions of the Second Law, Clausius, Maxwell, and Boltzmann began actively pursuing the molecular basis of thermodynamics, work that culminated in the Boltzmann equation and the theory of transport processes in dilute gases. Much later, Onsager undertook the elucidation of the symmetry of transport coefficients and, thereby, established himself as the father of the theory of nonequilibrium thermodynamics. Combining the statistical ideas of Gibbs and Langevin with the phenomenological transport equations, Onsager and others went on to develop a consistent statistical theory of irreversible processes. The power of that theory is in its ability to relate measurable quantities, such as transport coefficients and thermodynamic derivatives, to the results of experimental measurements. As powerful as that theory is, it is linear and...
Why do we need nuclear power? Energy policy in the light of history of civilization
International Nuclear Information System (INIS)
Yoda, Susumu.
1996-01-01
With the population explosion as a background, economic growth needs massive consumption of energy and resources. This massive consumption of energy and resources will deteriorate the global environment. It is a complicated chain of causes and effects. The problems of economic growth, resources and energy, and environment must be solved at the same time. Here the so-called "Trilemma" problem emerges. To overcome the Trilemma and assure a sustainable development of the whole world, approaches and actions are needed from various viewpoints including technology, socio-economic system and civilization. From the viewpoint of energy, it will be necessary to introduce all energy technologies which will not deteriorate the global environment. Energy conservation and efficiency are an important part of this process. It is also important to introduce renewable energy as much as possible. Even with these efforts, the energy needed by mankind in the 21st century will be tremendous. An energy source is needed which is adequate in terms of quantity, price, and environment. It is nuclear energy that meets these requirements. Several problems must be solved before the fundamental important merit of nuclear power can be realized. These issues are discussed here. They are divided into the following categories: economic issues; technical issues; social issues; political issues; and the issues in Asia.
Market oriented and rational use of energy and power. Level of competence and need of skill
International Nuclear Information System (INIS)
Morch, Andrei Z.; Groenli, Helle
2001-03-01
This report surveys the existing research and development (R and D) skill in Norway in the field of "Market oriented and rational use of energy and power". The need for skills upgrading and future R and D is discussed. Four areas for R and D are identified as especially important: (1) external conditions, (2) end user behavior, (3) the interplay of all issues related to the end user's competence, and (4) information and communication technology.
Comparison of power curve monitoring methods
Directory of Open Access Journals (Sweden)
Cambron Philippe
2017-01-01
Performance monitoring is an important aspect of operating wind farms. This can be done through power curve monitoring (PCM) of wind turbines (WT). In past years, important work has been conducted on PCM and various methodologies have been proposed, each with interesting results. However, it is difficult to compare these methods because each has been developed using its own data set. The objective of the present work is to compare some of the proposed PCM methods using common data sets. The metric used to compare the PCM methods is the time needed to detect a change in the power curve. Two power curve models are covered to establish the effect the model type has on the monitoring outcomes. Each model was tested with two control charts. Other methodologies and metrics proposed in the literature for power curve monitoring, such as areas under the power curve and the use of statistical copulas, are also covered. Results demonstrate that model-based PCM methods are more reliable at detecting a performance change than other methodologies and that the effectiveness of the control chart depends on the type of shift observed.
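The control-chart side of such a comparison can be sketched as follows. This is a minimal one-sided CUSUM on standardized power-curve residuals, with illustrative slack and threshold parameters rather than the tuned ones used in the paper:

```python
import random

def cusum_detect(residuals, slack=0.5, threshold=5.0):
    """One-sided CUSUM on standardized model residuals.

    Accumulates excess negative deviation (under-performance) and returns
    the index at which the statistic crosses the threshold, or None if no
    shift is detected.
    """
    s = 0.0
    for i, r in enumerate(residuals):
        s = max(0.0, s + (-r) - slack)   # track downward shifts only
        if s > threshold:
            return i
    return None

# Hypothetical standardized residuals: in control for 50 samples, then the
# turbine under-performs relative to its power curve model by ~1.5 sigma.
rng = random.Random(7)
resid = [rng.gauss(0.0, 1.0) for _ in range(50)] + \
        [rng.gauss(-1.5, 1.0) for _ in range(50)]

alarm = cusum_detect(resid)
print("shift detected at sample:", alarm)
```

The comparison metric in the paper, time to detection, corresponds here to the gap between the true change point (sample 50) and the returned alarm index; different model/chart pairs are ranked by that gap.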
IBM SPSS statistics 19 made simple
Gray, Colin D
2012-01-01
This new edition of one of the most widely read textbooks in its field introduces the reader to data analysis with the most powerful and versatile statistical package on the market: IBM SPSS Statistics 19. Each new release of SPSS Statistics features new options and other improvements. There remains a core of fundamental operating principles and techniques which have continued to apply to all releases issued in recent years and have been proved to be worth communicating in a small volume. This practical and informal book combines simplicity and clarity of presentation with a comprehensive trea
Ector, Hugo
2010-12-01
I still remember my first book on statistics: "Elementary statistics with applications in medicine and the biological sciences" by Frederick E. Croxton. For me, it was the start of a pursuit of understanding statistics in daily life and in medical practice. It was the first volume in a long row of books. In his introduction, Croxton claims that "nearly everyone involved in any aspect of medicine needs to have some knowledge of statistics". The reality is that for many clinicians, statistics are limited to a "P value". They have never had the opportunity to learn concise and clear descriptions of the key features of statistical methods. I have experienced how some authors can describe difficult methods in a well understandable language. Others fail completely. As a teacher, I tell my students that life is impossible without a basic knowledge of statistics. This feeling has resulted in an annual seminar of 90 minutes. This tutorial is the summary of that seminar: a summary and a transcription of the best pages I have detected.
Virginia Power thermal-hydraulics methods
International Nuclear Information System (INIS)
Anderson, R.C.; Basehore, K.L.; Harrell, J.R.
1987-01-01
Virginia Power's nuclear safety analysis group is responsible for the safety analysis of reload cores for the Surry and North Anna power stations, including the area of core thermal-hydraulics. Postulated accidents are evaluated for potential departure from nucleate boiling violations. In support of these tasks, Virginia Power has employed the COBRA code and the W-3 and WRB-1 DNB correlations. A statistical DNBR methodology has also been developed. The code, correlations and statistical methodology are discussed
International Nuclear Information System (INIS)
Anon.
1989-01-01
World data from the United Nations' latest Energy Statistics Yearbook, first published in our last issue, are completed here. The 1984-86 data were revised and 1987 data added for world commercial energy production and consumption, world natural gas plant liquids production, world LP-gas production, imports, exports, and consumption, world residual fuel oil production, imports, exports, and consumption, world lignite production, imports, exports, and consumption, world peat production and consumption, world electricity production, imports, exports, and consumption (Table 80), and world nuclear electric power production
Nuclear power and other thermal power
International Nuclear Information System (INIS)
Bakke, J.
1978-01-01
Some philosophical aspects of mortality statistics are first briefly mentioned, then the environmental problems of nuclear power plants and of fossil fuelled power plants are summarised. The effects of releases of carbon dioxide, sulphur dioxide and nitrogen oxides are briefly discussed. The possible health effects of radiation from nuclear power plants and those of gaseous and particulate effluents from fossil fuel plants are also discussed. It is pointed out that in choosing between alternative evils the worst course is to make no choice at all: failure to install thermal power plants will lead to isolated domestic burning of fossil fuels, which is clearly the worst situation regarding pollution. (JIW)
Powerful Inference With the D-Statistic on Low-Coverage Whole-Genome Data
DEFF Research Database (Denmark)
Soraggi, Samuele; Wiuf, Carsten; Albrechtsen, Anders
2018-01-01
The detection of ancient gene flow between human populations is an important issue in population genetics. A common tool for detecting ancient admixture events is the D-statistic. The D-statistic is based on the hypothesis of a genetic relationship that involves four populations, whose correctness is assessed by evaluating specific coincidences of alleles between the groups. When working with high-throughput sequencing data, calling genotypes accurately is not always possible; therefore the D-statistic currently samples a single base from the reads of one individual per population. This implies ignoring much of the information in the data, an issue especially striking in the case of ancient genomes. We provide a significant improvement to overcome the problems of the D-statistic by considering all reads from multiple individuals in each population. We also apply type-specific error correction.
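The allele-coincidence counting behind the D-statistic can be sketched as follows, using made-up biallelic site patterns (0 = ancestral, 1 = derived) for populations H1, H2, H3 and an outgroup; the read-based extension described in the abstract replaces these hard calls with per-read allele counts:

```python
def d_statistic(sites):
    """D = (nABBA - nBABA) / (nABBA + nBABA) over biallelic site patterns.

    Each site is a tuple (h1, h2, h3, outgroup) of 0/1 alleles.
    ABBA: h1 matches the outgroup, h2 matches h3 -> (0, 1, 1, 0)
    BABA: h2 matches the outgroup, h1 matches h3 -> (1, 0, 1, 0)
    Under the null (no admixture) both patterns are equally likely, so
    a D far from zero suggests gene flow involving H3.
    """
    abba = sum(1 for s in sites if s == (0, 1, 1, 0))
    baba = sum(1 for s in sites if s == (1, 0, 1, 0))
    return (abba - baba) / (abba + baba)

# Toy data: 6 ABBA sites vs. 2 BABA sites (all other patterns are ignored)
sites = [(0, 1, 1, 0)] * 6 + [(1, 0, 1, 0)] * 2 + [(1, 1, 1, 0)] * 4
print(d_statistic(sites))   # -> 0.5
```

In real analyses the significance of D is then assessed with a block jackknife over the genome; the toy count above only shows the core statistic.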
Nuclear power needs international solidarity and cooperation
International Nuclear Information System (INIS)
Anon.
1987-01-01
A report by Dr. Blix, director-general of IAEA, to the General Assembly of the United Nations is summarized. Some 15% of the world's entire power requirements are produced from nuclear energy. Thorough inspections, carried out at regular intervals, have not detected removal of any of this nuclear material for military purposes. Cooperation between governments is essential to prevent accidents and improve the safety of nuclear technology. (J.S.)
International Nuclear Information System (INIS)
Isaksson, S.
1996-01-01
This literature study was made on behalf of the Swedish Nuclear Power Inspectorate. The aim is to describe different aspects of fire protection in nuclear power plants. Detection and extinguishing systems in Swedish nuclear power plants have only to a limited extent been designed according to functional requirements, such as a maximum acceptable damage or a maximum time to detect a fire. The availability of detection systems is difficult to assess, partly because of a lack of statistics. The user interface is very important in complex systems such as nuclear plants. An extinguishing system designed according to the insurance companies' regulations will only fulfil the basic demands. It should be noted that normal sprinkler design does not aim to extinguish fires; the objective is to control a fire until manual extinguishment is possible. There is a great amount of statistics on wet- and dry-pipe sprinkler systems, while statistics are more scarce for deluge systems, and statistics on the reliability of gaseous extinguishing systems are very scarce. A drawback of gaseous systems is that they are normally designed for one shot only. There are both traditional and more recent extinguishing systems that can replace halons. From now on there will be a greater need for a thorough examination of the properties required for the individual application and a quantification of the acceptable damage. There are several indications of the importance of a high-quality maintenance program, as well as carefully developed routines for testing and surveillance, to ensure the reliability of detection and extinguishing systems. 78 refs, 8 figs, 10 tabs
Common pitfalls in statistical analysis: “No evidence of effect” versus “evidence of no effect”
Ranganathan, Priya; Pramesh, C. S.; Buyse, Marc
2015-01-01
This article is the first in a series exploring common pitfalls in statistical analysis in biomedical research. The power of a clinical trial is its ability to detect a difference between treatments when such a difference exists. At the end of a study, the lack of a difference between treatments does not mean that the treatments can be considered equivalent. The distinction between “no evidence of effect” and “evidence of no effect” needs to be understood. PMID:25657905
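The power concept in the abstract above can be illustrated with a short calculation. A minimal sketch using the standard normal approximation for a two-sided, two-sample test; the effect size, sample sizes and function names are illustrative, not from the article:

```python
from math import erf, sqrt

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def two_sample_power(d, n_per_group):
    """Approximate power of a two-sided, two-sample test at alpha = 0.05.

    d           : standardized mean difference (Cohen's d), assumed known
    n_per_group : sample size in each arm
    Normal approximation: power ~ Phi(d * sqrt(n/2) - z_crit).
    """
    z_crit = 1.959964  # two-sided 5% critical value
    return normal_cdf(d * sqrt(n_per_group / 2.0) - z_crit)

# An underpowered trial: a true moderate effect (d = 0.5) with only
# 20 patients per arm is detected barely a third of the time.
print(f"power with n=20 per arm:  {two_sample_power(0.5, 20):.2f}")
# With 100 per arm the same effect is detected almost always.
print(f"power with n=100 per arm: {two_sample_power(0.5, 100):.2f}")
```

With a true moderate effect, the underpowered trial fails to reach significance about two times in three, which is exactly why its negative result is "no evidence of effect" rather than "evidence of no effect".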
12 CFR 268.601 - EEO group statistics.
2010-01-01
... 12 Banks and Banking 3 2010-01-01 2010-01-01 false EEO group statistics. 268.601 Section 268.601... RULES REGARDING EQUAL OPPORTUNITY Matters of General Applicability § 268.601 EEO group statistics. (a... solely statistical purpose for which the data is being collected, the need for accuracy, the Board's...
An estimator for statistical anisotropy from the CMB bispectrum
International Nuclear Information System (INIS)
Bartolo, N.; Dimastrogiovanni, E.; Matarrese, S.; Liguori, M.; Riotto, A.
2012-01-01
Various data analyses of the Cosmic Microwave Background (CMB) provide observational hints of statistical isotropy breaking. Some of these features can be studied within the framework of primordial vector fields in inflationary theories, which generally display some level of statistical anisotropy both in the power spectrum and in higher-order correlation functions. Motivated by these observations and the recent theoretical developments in the study of primordial vector fields, we develop the formalism necessary to extract statistical anisotropy information from the three-point function of the CMB temperature anisotropy. We employ a simplified vector field model and parametrize the bispectrum of curvature fluctuations in such a way that all the information about statistical anisotropy is encoded in some parameters λ_LM (which measure the ratio of the anisotropic to the isotropic bispectrum amplitudes). For such a template bispectrum, we compute an optimal estimator for λ_LM and the expected signal-to-noise ratio. We estimate that, for f_NL ≅ 30, an experiment like Planck can be sensitive to a ratio of the anisotropic to the isotropic amplitudes of the bispectrum as small as 10%. Our results are complementary to the information coming from a power spectrum analysis and particularly relevant for those models where statistical anisotropy turns out to be suppressed in the power spectrum but not negligible in the bispectrum.
Statistical calculation of hot channel factors
International Nuclear Information System (INIS)
Farhadi, K.
2007-01-01
It is a conventional practice in the design of nuclear reactors to introduce hot channel factors to allow for spatial variations of power generation and flow distribution. Consequently, it is not enough to be able to calculate the nominal temperature distributions of fuel element, cladding, coolant, and central fuel. Indeed, one must be able to calculate the probability that the imposed temperature or heat flux limits in the entire core are not exceeded. In this paper, statistical methods are used to calculate hot channel factors for a particular case of a heterogeneous Material Testing Reactor (MTR), and the results obtained from different statistical methods are compared. It is shown that among the statistical methods available, the semi-statistical method is the most reliable one.
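The difference between a purely multiplicative (cumulative) combination of hot channel subfactors and a statistical combination can be sketched as follows. The subfactor names and values are hypothetical, and the root-sum-square rule shown is the generic statistical combination for independent uncertainties, not necessarily the semi-statistical method of the paper:

```python
from math import sqrt

# Hypothetical hot channel subfactors (each a local/nominal ratio > 1);
# the values are illustrative, not taken from the paper.
subfactors = {
    "fuel loading tolerance":  1.03,
    "channel width tolerance": 1.05,
    "flow distribution":       1.08,
    "power measurement":       1.05,
}

def cumulative(factors):
    """Deterministic (multiplicative) combination: assumes every
    worst-case tolerance occurs in the same channel simultaneously."""
    f = 1.0
    for v in factors.values():
        f *= v
    return f

def statistical(factors):
    """Statistical combination: independent uncertainties combined as
    the root sum of squares of the individual excesses over 1."""
    return 1.0 + sqrt(sum((v - 1.0) ** 2 for v in factors.values()))

print(f"multiplicative hot channel factor: {cumulative(subfactors):.3f}")
print(f"statistical hot channel factor:    {statistical(subfactors):.3f}")
```

The statistical combination is smaller because it does not assume that all worst-case tolerances coincide in one channel; semi-statistical methods combine some subfactors multiplicatively and the rest statistically, landing between the two values.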
U.S. nuclear plant statistics, 8th Edition
International Nuclear Information System (INIS)
Anon.
1993-01-01
Wolf Creek was the lowest cost nuclear plant in 1992, according to the annual plant rankings in UDI's comprehensive annual statistical factbook for US nuclear power plants (operating, under construction, deferred, canceled or retired). The book covers operating and maintenance expenses for the past year (1992), annual and lifetime performance statistics, capitalization expenses and changes in capitalization, construction cost information, joint ownership of plants, and canceled plants. First published for CY1984 statistics.
Contribution of nuclear power to India's energy needs
Energy Technology Data Exchange (ETDEWEB)
Kati, S. L.
1980-03-15
The development of nuclear power in India is reviewed. The following plants are discussed: Tarapur, Rajasthan, Madras, and Narora. Performance of Tarapur and Rajasthan is also reviewed briefly. Cost, safety, and future program are discussed. 4 tables. (DLC)
The need for nuclear power. Viewpoint on the world's challenging energy future
International Nuclear Information System (INIS)
Rhodes, R.; Beller, D.
2000-01-01
To meet the world's growing need for energy, the Royal Society and Royal Academy report proposes 'the formation of an international body for energy research and development, funded by contributions from individual nations on the basis of Gross Domestic Product (GDP) or total national energy consumption'. The body would be 'a funding agency supporting research, development and demonstrators elsewhere, not a research center itself'. Its budget might build to an annual level of some $25 billion, 'roughly 1% of the total global energy budget'. If it truly wants to develop efficient and responsible energy supplies, such a body should focus on the nuclear option, on establishing a secure international nuclear-fuel storage and reprocessing system, and on providing expertise for siting, financing, and licensing modular nuclear power systems to developing nations. According to authors, who study the dynamics of energy technologies, 'the share of energy supplied by electricity is growing rapidly in most countries and worldwide'. Throughout history, humankind has gradually decarbonized its dominant fuels, moving steadily away from the more polluting, carbon-rich sources. Thus the world has gone from coal (which has one hydrogen atom per carbon atom and was dominant from 1880 to 1950) to oil (with two hydrogens per carbon, dominant from 1950 to today). Natural gas (four hydrogens per carbon) is steadily increasing its market share. But nuclear fission produces no carbon at all. Physical reality - not arguments about corporate greed, hypothetical risks, radiation exposure, or waste disposal - ought to inform decisions vital to the future of the world. Because diversity and redundancy are important for safety and security, renewable energy sources ought to retain a place in the energy economy of the century to come. But nuclear power should be central. Despite its outstanding record, it has instead been relegated by its opponents to the same twilight zone of contentious
International Nuclear Information System (INIS)
Caetano de Souza, Antonio Carlos
2008-01-01
The Brazilian relief, predominantly composed of small mountains and plateaus, contributed to the formation of rivers with a large number of falls. With the exception of North-eastern Brazil, the climate of the country is rainy, which helps to keep water flows high. These elements are essential to a high hydroelectric potential, and they contributed to the choice of hydroelectric power plants as the main technology for electricity generation in Brazil. Though this is a renewable source whose resource is free, dams must be established, which generates a high environmental and social impact. The objective of this study is to evaluate the impact caused by these dams through the use of environmental indexes. These indexes are the ratio of a hydro power plant's installed power to its dam (reservoir) area, and the ratio of its firm power to that same area. In this study, the greatest mean values were found in the South, Southeast and Northeast regions, respectively, and the smallest mean values were found in the North and Mid-West regions, respectively. The greatest mean indexes were also found for dams established in the 1950s, while, over the last six decades, the smallest indexes were registered by dams established in the 1980s. These indexes could be utilized as important instruments for environmental impact assessments, and could enable a dam to be established that depletes an ecosystem as little as possible. (author)
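The two indexes used in the study are simple ratios and can be computed directly; the plant names and figures below are hypothetical, not taken from the study:

```python
def environmental_index(power_mw, reservoir_area_km2):
    """Index = power (MW) per unit of flooded area (km^2); a higher
    value means less flooded area per MW, i.e. lower impact per MW."""
    return power_mw / reservoir_area_km2

plants = [
    # (name, installed power MW, firm power MW, reservoir area km^2)
    ("Plant A", 1200.0, 700.0, 150.0),
    ("Plant B",  300.0, 180.0, 900.0),
]

for name, installed, firm, area in plants:
    print(f"{name}: installed/area = {environmental_index(installed, area):.2f} "
          f"MW/km2, firm/area = {environmental_index(firm, area):.2f} MW/km2")
```

Plant A floods far less area per megawatt than Plant B, so by either index it would be the environmentally preferable design.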
Calculation Software versus Illustration Software for Teaching Statistics
DEFF Research Database (Denmark)
Mortensen, Peter Stendahl; Boyle, Robin G.
1999-01-01
As personal computers have become more and more powerful, so have the software packages available to us for teaching statistics. This paper investigates what software packages are currently being used by progressive statistics instructors at university level, examines some of the deficiencies...... of such software, and indicates features that statistics instructors wish to have incorporated in software in the future. The basis of the paper is a survey of participants at ICOTS-5 (the Fifth International Conference on Teaching Statistics). These survey results, combined with the software based papers...
International Nuclear Information System (INIS)
2004-01-01
For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption; Carbon dioxide emissions from fossil fuel use; Coal consumption; Consumption of natural gas; Peat consumption; Domestic oil deliveries; Import prices of oil; Consumer prices of principal oil products; Fuel prices in heat production; Fuel prices in electricity production; Price of electricity by type of consumer; Average monthly spot prices at the Nord Pool power exchange; Total energy consumption by source and CO2 emissions; Supplies and total consumption of electricity (GWh); Energy imports by country of origin in January-March 2004; Energy exports by recipient country in January-March 2004; Consumer prices of liquid fuels; Consumer prices of hard coal, natural gas and indigenous fuels; Price of natural gas by type of consumer; Price of electricity by type of consumer; Price of district heating by type of consumer; Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and Excise taxes, precautionary stock fees and oil pollution fees
International Nuclear Information System (INIS)
2003-01-01
For the year 2002, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption; Carbon dioxide emissions from fossil fuel use; Coal consumption; Consumption of natural gas; Peat consumption; Domestic oil deliveries; Import prices of oil; Consumer prices of principal oil products; Fuel prices in heat production; Fuel prices in electricity production; Price of electricity by type of consumer; Average monthly spot prices at the Nord Pool power exchange; Total energy consumption by source and CO2 emissions; Supply and total consumption of electricity (GWh); Energy imports by country of origin in January-June 2003; Energy exports by recipient country in January-June 2003; Consumer prices of liquid fuels; Consumer prices of hard coal, natural gas and indigenous fuels; Price of natural gas by type of consumer; Price of electricity by type of consumer; Price of district heating by type of consumer; Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and Excise taxes, precautionary stock fees and oil pollution fees on energy products
Energy Technology Data Exchange (ETDEWEB)
NONE
2010-07-01
Detailed, complete, timely and reliable statistics are essential to monitor the energy situation at a country level as well as at an international level. Energy statistics on supply, trade, stocks, transformation and demand are indeed the basis for any sound energy policy decision. For instance, the market of oil -- which is the largest traded commodity worldwide -- needs to be closely monitored in order for all market players to know at any time what is produced, traded, stocked and consumed and by whom. In view of the role and importance of energy in world development, one would expect basic energy information to be readily available and reliable. This is not always the case, and one can even observe a decline in the quality, coverage and timeliness of energy statistics over the last few years.
Statistics for clinical nursing practice: an introduction.
Rickard, Claire M
2008-11-01
Difficulty in understanding statistics is one of the most frequently reported barriers to nurses applying research results in their practice. Yet the amount of nursing research published each year continues to grow, as does the expectation that nurses will undertake practice based on this evidence. Critical care nurses do not need to be statisticians, but they do need to develop a working knowledge of statistics so they can be informed consumers of research and so practice can evolve and improve. For those undertaking a research project, statistical literacy is required to interact with other researchers and statisticians, so as to best design and undertake the project. This article is the first in a series that guides critical care nurses through statistical terms and concepts relevant to their practice.
The Big Mac Standard: A Statistical Illustration
Yukinobu Kitamura; Hiroshi Fujiki
2004-01-01
We demonstrate a statistical procedure for selecting the most suitable empirical model to test an economic theory, using the example of the test for purchasing power parity based on the Big Mac Index. Our results show that supporting evidence for purchasing power parity, conditional on the Balassa-Samuelson effect, depends crucially on the selection of models, sample periods and economies used for estimations.
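The Big Mac Index underlying the test is a one-line formula: the implied PPP exchange rate is the local Big Mac price divided by the US price, compared with the market rate. A sketch with made-up prices and market rate (the function name and all numbers are illustrative):

```python
def big_mac_ppp(local_price, us_price, market_rate):
    """Implied PPP rate from Big Mac prices, plus the implied
    misvaluation of the local currency (< 0 means undervalued)."""
    implied = local_price / us_price
    misvaluation = implied / market_rate - 1.0
    return implied, misvaluation

# Hypothetical: a burger costs 390 units locally, $5.00 in the US,
# and the market exchange rate is 110 units per dollar.
implied, mis = big_mac_ppp(local_price=390.0, us_price=5.0, market_rate=110.0)
print(f"implied PPP rate: {implied:.1f}, misvaluation: {mis:+.1%}")
```

Here the implied PPP rate of 78 sits well below the market rate of 110, so the local currency would be read as roughly 29% undervalued against the dollar on this single-good measure.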
The Effect of Medical Socialization on Medical Students' Need for Power.
Kressin, Nancy R.
1996-01-01
Examines whether the individual personality characteristic of power motivation increases during medical school. Recorded interviews with a diverse group of medical students at two points in time were coded for power motivation. Results showed that white students' power motivation decreased, whereas minority students' levels remained the same,…
Nuclear power plant performance statistics. Comparison with fossil-fired units
International Nuclear Information System (INIS)
Tabet, C.; Laue, H.J.; Qureshi, A.; Skjoeldebrand, R.; White, D.
1983-01-01
The joint UNIPEDE/World Energy Conference Committee on Availability of Thermal Generating Plants has a mandate to study the availability of thermal plants and the different factors that influence it. This has led to the collection and publication at the Congress of the World Energy Conference (WEC) every third year of availability and unavailability factors to be used in systems reliability studies and operations and maintenance planning. For nuclear power plants the joint UNIPEDE/WEC Committee relies on the IAEA to provide availability and unavailability data. The IAEA has published an annual report with operating data from nuclear plants in its Member States since 1971, covering in addition back data from the early 1960s. These reports have developed over the years and in the early 1970s the format was brought into close conformity with that used by UNIPEDE and WEC to report performance of fossil-fired generating plants. Since 1974 an annual analytical summary report has been prepared. In 1981 all information on operating experience with nuclear power plants was placed in a computer file for easier reference. The computerized Power Reactor Information System (PRIS) ensures that data are easily retrievable and at its present level it remains compatible with various national systems. The objectives for the IAEA data collection and evaluation have developed significantly since 1970. At first, the IAEA primarily wanted to enable the individual power plant operator to compare the performance of his own plant with that of others of the same type; when enough data had been collected, they provided the basis for assessment of the fundamental performance parameters used in economic project studies; now, the data base merits being used in setting availability objectives for power plant operations. (author)
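Two fundamental performance parameters of the kind collected in PRIS can be sketched as follows. This is a simplified illustration with hypothetical annual figures, ignoring refinements such as reference-power changes during the year or outage attribution:

```python
HOURS_IN_YEAR = 8760.0

def load_factor(net_generation_mwh, reference_power_mw):
    """Load (capacity) factor: actual net generation over the energy
    the unit would produce at reference power all year, in percent."""
    return 100.0 * net_generation_mwh / (reference_power_mw * HOURS_IN_YEAR)

def energy_availability_factor(available_energy_mwh, reference_power_mw):
    """Energy availability factor: energy the unit was capable of
    supplying over the reference energy generation, in percent."""
    return 100.0 * available_energy_mwh / (reference_power_mw * HOURS_IN_YEAR)

# Hypothetical 1000 MW unit over one year.
ref_mw = 1000.0
print(f"load factor:          {load_factor(7.5e6, ref_mw):.1f}%")
print(f"energy availability:  {energy_availability_factor(8.0e6, ref_mw):.1f}%")
```

The gap between the two indicators separates what the plant could have supplied from what the grid actually took, which is one reason availability data rather than raw generation data are used in systems reliability studies.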
Power-Law Statistics of Driven Reconnection in the Magnetically Closed Corona
Klimchuk, J. A.; DeVore, C. R.; Knizhnik, K. J.; Uritskiy, V. M.
2018-01-01
Numerous observations have revealed that power-law distributions are ubiquitous in energetic solar processes. Hard X-rays, soft X-rays, extreme ultraviolet radiation, and radio waves all display power-law frequency distributions. Since magnetic reconnection is the driving mechanism for many energetic solar phenomena, it is likely that reconnection events themselves display such power-law distributions. In this work, we perform numerical simulations of the solar corona driven by simple convective motions at the photospheric level. Using temperature changes, current distributions, and Poynting fluxes as proxies for heating, we demonstrate that energetic events occurring in our simulation display power-law frequency distributions, with slopes in good agreement with observations. We suggest that the braiding-associated reconnection in the corona can be understood in terms of a self-organized criticality model driven by convective rotational motions similar to those observed at the photosphere.
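Power-law frequency distributions like those found in the simulations are commonly characterized by their slope, estimated by maximum likelihood. A self-contained sketch on synthetic event sizes (all names, the slope value and the sample size are illustrative, not from the paper):

```python
import random
from math import log

def sample_power_law(alpha, x_min, n, rng):
    """Inverse-transform sampling from p(x) ~ x^(-alpha), x >= x_min."""
    return [x_min * (1.0 - rng.random()) ** (-1.0 / (alpha - 1.0))
            for _ in range(n)]

def fit_power_law_slope(xs, x_min):
    """Maximum-likelihood slope estimate for a continuous power law:
    alpha_hat = 1 + n / sum(ln(x_i / x_min))."""
    return 1.0 + len(xs) / sum(log(x / x_min) for x in xs)

rng = random.Random(42)
events = sample_power_law(alpha=1.8, x_min=1.0, n=50_000, rng=rng)
print(f"recovered slope: {fit_power_law_slope(events, 1.0):.2f}")
```

The fit recovers the generating slope to within its sampling error; applied to observed event energies, the same estimator gives the slopes that are compared against the observational distributions.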
International Nuclear Information System (INIS)
Anon.
1994-01-01
For the years 1992 and 1993, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period. The tables and figures shown in this publication are: Changes in the volume of GNP and energy consumption; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Price development of principal oil products; Fuel prices for power production; Total energy consumption by source; Electricity supply; Energy imports by country of origin in 1993; Energy exports by recipient country in 1993; Consumer prices of liquid fuels; Consumer prices of hard coal and natural gas, prices of indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer and Excise taxes and turnover taxes included in consumer prices of some energy sources
Generation-distribution of electric power in France and in its regions in 2004 and 2005
International Nuclear Information System (INIS)
2007-09-01
This report presents the provisional results for the year 2005 of power generation and transportation/distribution in France. Power generation data come from the official exhaustive annual inquiry acknowledged by the national council of statistical information (CNIS). It covers all power producers, such as EDF, Compagnie Nationale du Rhone (CNR), Societe Nationale d'Electricite et de Thermique (SNET), Societe Hydroelectrique du Midi (SHEM), and some independent producers who generate electricity for their own needs or for supply to the grid (about 3000 companies). Data on power transportation and distribution are established by another administrative annual inquiry addressed to utilities (EDF-Reseau de Distribution, local distribution companies) and to the power transportation network manager (RTE EDF-Transport). The statistical results presented in this document are established from data settled on June 11, 2007. In a context of market liberalization, these two inquiries are of particular importance: they represent a measurement tool of the public authorities' action in favor of the security of supplies, and they make available a detailed, reliable and regularly updated description of France's power generation and transportation facilities by energy source and by geographical area. (J.S.)
International Nuclear Information System (INIS)
Mittendorfer, J.; Zwanziger, P.
2000-01-01
High-power bipolar semiconductor devices (thyristors and diodes) in a disc-type shape are key components (semiconductor switches) for high-power electronic systems. These systems are important for the economic design of energy transmission systems, i.e. high-power drive systems, static compensation and high-voltage DC transmission lines. In its factory located in Pretzfeld, Germany, the company eupec GmbH+Co.KG (eupec) produces disc-type devices with ceramic encapsulation in the high-end range for the world market. These elements have to fulfill special customer requirements and therefore deliver tailor-made trade-offs between their on-state voltage and dynamic switching behaviour. This can be achieved by applying a dedicated electron irradiation to the semiconductor pellets, which tunes the trade-off. In this paper, the requirements placed on the irradiation company, Mediscan GmbH, are described from the point of view of the semiconductor manufacturer. The actual strategy for controlling the irradiation results to fulfill these requirements is presented, together with the choice of relevant parameters from the viewpoint of the irradiation company. The set of process parameters monitored using statistical process control (SPC) techniques includes beam current and energy, conveyor speed and irradiation geometry. The results are highlighted and show the successful cooperation in this business. Viewing this process in reverse, an idea is presented and discussed for developing a highly sensitive dose detection device based on modified diodes, which could serve as accurate yet cheap and easy-to-use routine dosimeters for irradiation institutes. (author)
Tips and Tricks for Successful Application of Statistical Methods to Biological Data.
Schlenker, Evelyn
2016-01-01
This chapter discusses experimental design and the use of statistics to describe characteristics of data (descriptive statistics) and inferential statistics that test the hypothesis posed by the investigator. Inferential statistics, based on probability distributions, depend upon the type and distribution of the data. For data that are continuous, randomly and independently selected, and normally distributed, more powerful parametric tests such as Student's t test and analysis of variance (ANOVA) can be used. For non-normally distributed or skewed data, transformation of the data (using logarithms) may normalize the data, allowing use of parametric tests. Alternatively, with skewed data nonparametric tests can be utilized, some of which rely on data that are ranked prior to statistical analysis. Experimental designs and analyses need to balance between committing type 1 errors (false positives) and type 2 errors (false negatives). For a variety of clinical studies that determine risk or benefit, relative risk ratios (random clinical trials and cohort studies) or odds ratios (case-control studies) are utilized. Although both use 2 × 2 tables, their premise and calculations differ. Finally, special statistical methods are applied to microarray and proteomics data, since the large number of genes or proteins evaluated increases the likelihood of false discoveries. Additional studies in separate samples are used to verify microarray and proteomic data. Examples in this chapter and references are available to help continued investigation of experimental designs and appropriate data analysis.
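The difference between the relative risk and the odds ratio computed from the same 2 × 2 table can be shown in a few lines; the counts below are hypothetical:

```python
# Hypothetical 2x2 table:
#                 outcome   no outcome
# exposed            a=30        b=70
# unexposed          c=10        d=90
a, b, c, d = 30, 70, 10, 90

risk_exposed   = a / (a + b)                 # 0.30
risk_unexposed = c / (c + d)                 # 0.10
relative_risk  = risk_exposed / risk_unexposed   # ratio of risks
odds_ratio     = (a * d) / (b * c)               # cross-product ratio

print(f"RR = {relative_risk:.2f}, OR = {odds_ratio:.2f}")
```

With a common outcome (30% among the exposed), the odds ratio (3.86) noticeably overstates the relative risk (3.00); the two converge only when the outcome is rare, which is why the study design dictates which measure is valid.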
Securing wide appreciation of health statistics.
PYRRAIT A M DO, A; AUBENQUE, M J; BENJAMIN, B; DE GROOT, M J; KOHN, R
1954-01-01
All the authors are agreed on the need for a certain publicizing of health statistics, but do Amaral Pyrrait points out that the medical profession prefers to convince itself rather than to be convinced. While there is great utility in articles and reviews in the professional press (especially for paramedical personnel) Aubenque, de Groot, and Kohn show how appreciation can effectively be secured by making statistics more easily understandable to the non-expert by, for instance, including readable commentaries in official publications, simplifying charts and tables, and preparing simple manuals on statistical methods. Aubenque and Kohn also stress the importance of linking health statistics to other economic and social information. Benjamin suggests that the principles of market research could to advantage be applied to health statistics to determine the precise needs of the "consumers". At the same time, Aubenque points out that the value of the ultimate results must be clear to those who provide the data; for this, Kohn suggests that the enumerators must know exactly what is wanted and why. There is general agreement that some explanation of statistical methods and their uses should be given in the curricula of medical schools and that lectures and postgraduate courses should be arranged for practising physicians.
Connection between recurrence time statistics and anomalous transport
International Nuclear Information System (INIS)
Zaslavsky, G.M.; Tippett, M.K.
1991-01-01
For a model stationary flow with hexagonal symmetry, the recurrence time statistics are studied. The model has been shown to have a sharp transition from normal to anomalous transport. Here it is shown that this transition is accompanied by a corresponding change of the recurrence time statistics from normal to anomalous, the latter displaying a power-law tail. Recurrence time statistics thus provide a local measurement of anomalous transport that is of practical interest
The insignificance of statistical significance testing
Johnson, Douglas H.
1999-01-01
Despite their use in scientific journals such as The Journal of Wildlife Management, statistical hypothesis tests add very little value to the products of research. Indeed, they frequently confuse the interpretation of data. This paper describes how statistical hypothesis tests are often viewed, and then contrasts that interpretation with the correct one. I discuss the arbitrariness of P-values, conclusions that the null hypothesis is true, power analysis, and distinctions between statistical and biological significance. Statistical hypothesis testing, in which the null hypothesis about the properties of a population is almost always known a priori to be false, is contrasted with scientific hypothesis testing, which examines a credible null hypothesis about phenomena in nature. More meaningful alternatives are briefly outlined, including estimation and confidence intervals for determining the importance of factors, decision theory for guiding actions in the face of uncertainty, and Bayesian approaches to hypothesis testing and other statistical practices.
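The arbitrariness of P-values with respect to sample size, and the resulting gap between statistical and biological significance, can be demonstrated directly. A sketch using a one-sample z test on a standardized effect (the effect size and sample sizes are illustrative):

```python
from math import erf, sqrt

def two_sided_p_from_z(z):
    """Two-sided P-value from a standard normal test statistic."""
    return 2.0 * (1.0 - 0.5 * (1.0 + erf(abs(z) / sqrt(2.0))))

def one_sample_z(effect_d, n):
    """z statistic for a one-sample test of a standardized effect d."""
    return effect_d * sqrt(n)

# The same biologically trivial effect (d = 0.02) at two sample sizes.
for n in (100, 1_000_000):
    p = two_sided_p_from_z(one_sample_z(0.02, n))
    print(f"n = {n:>9}: p = {p:.4f}")
```

The identical, negligible effect is wildly non-significant at n = 100 and overwhelmingly "significant" at n = 1,000,000, which is why the paper argues that estimation and confidence intervals, not the test outcome, should determine the importance of factors.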
The energy behind the power. Southwestern Power Administration 1994 annual report
Energy Technology Data Exchange (ETDEWEB)
NONE
1994-12-31
This is the Southwestern Power Administration 1994 annual report. The topics of the report include a letter to the secretary; an overview including the mission statement, a description of the Southwestern Federal Power System, financial statement, performance measurements, national performance review; year in review, summary of results, financial and statistical data and the Southwestern Power Administration Organization.
An analysis of nuclear power plant operating costs
International Nuclear Information System (INIS)
1988-01-01
This report presents the results of a statistical analysis of nonfuel operating costs for nuclear power plants. Most studies of the economic costs of nuclear power have focused on the rapid escalation in the cost of constructing a nuclear power plant. The present analysis found that there has also been substantial escalation in real (inflation-adjusted) nonfuel operating costs. It is important to determine the factors contributing to the escalation in operating costs, not only to understand what has occurred but also to gain insights about future trends in operating costs. There are two types of nonfuel operating costs. The first is routine operating and maintenance expenditures (O and M costs), and the second is large postoperational capital expenditures, typically called "capital additions". O and M costs consist mainly of expenditures on labor, and according to one recently completed study, the majority of employees at a nuclear power plant perform maintenance activities. It is generally thought that capital additions costs consist of large maintenance expenditures needed to keep the plants operational, and to make plant modifications (backfits) required by the Nuclear Regulatory Commission (NRC). Many discussions of nuclear power plant operating costs have not considered these capital additions costs, and a major finding of the present study is that these costs are substantial. The objective of this study was to determine why nonfuel operating costs have increased over the past decade. The statistical analysis examined a number of factors that have influenced the escalation in real nonfuel operating costs, and these are discussed in this report. 4 figs, 19 tabs
Basic elements of computational statistics
Härdle, Wolfgang Karl; Okhrin, Yarema
2017-01-01
This textbook on computational statistics presents tools and concepts of univariate and multivariate statistical data analysis with a strong focus on applications and implementations in the statistical software R. It covers mathematical, statistical as well as programming problems in computational statistics and contains a wide variety of practical examples. In addition to the numerous R snippets presented in the text, all computer programs (quantlets) and data sets for the book are available on GitHub and referred to in the book. This enables the reader to fully reproduce as well as modify and adjust all examples to their needs. The book is intended for advanced undergraduate and first-year graduate students as well as for data analysts new to the job who would like a tour of the various statistical tools in a data analysis workshop. The experienced reader with a good knowledge of statistics and programming might skip some sections on univariate models and enjoy the various mathematical roots of multivariate ...
The new statistics: why and how.
Cumming, Geoff
2014-01-01
We need to make substantial changes to how we conduct research. First, in response to heightened concern that our published research literature is incomplete and untrustworthy, we need new requirements to ensure research integrity. These include prespecification of studies whenever possible, avoidance of selection and other inappropriate data-analytic practices, complete reporting, and encouragement of replication. Second, in response to renewed recognition of the severe flaws of null-hypothesis significance testing (NHST), we need to shift from reliance on NHST to estimation and other preferred techniques. The new statistics refers to recommended practices, including estimation based on effect sizes, confidence intervals, and meta-analysis. The techniques are not new, but adopting them widely would be new for many researchers, as well as highly beneficial. This article explains why the new statistics are important and offers guidance for their use. It describes an eight-step new-statistics strategy for research with integrity, which starts with formulation of research questions in estimation terms, has no place for NHST, and is aimed at building a cumulative quantitative discipline.
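The estimation techniques recommended in the abstract above can be sketched in a few lines. The data, function name, and parameter choices below are illustrative assumptions, not taken from the article; the sketch computes Cohen's d and a 95% confidence interval for a two-sample mean difference under an equal-variance assumption.

```python
import numpy as np
from scipy import stats

def cohens_d_and_ci(a, b, alpha=0.05):
    # Pooled-SD Cohen's d and a (1 - alpha) CI for the raw mean difference
    a, b = np.asarray(a, float), np.asarray(b, float)
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    pooled_sd = np.sqrt(pooled_var)
    diff = a.mean() - b.mean()
    d = diff / pooled_sd
    se = pooled_sd * np.sqrt(1 / na + 1 / nb)
    t_crit = stats.t.ppf(1 - alpha / 2, df=na + nb - 2)
    return d, (diff - t_crit * se, diff + t_crit * se)

rng = np.random.default_rng(0)
group_a = rng.normal(1.0, 1.0, 50)   # synthetic data for illustration
group_b = rng.normal(0.5, 1.0, 50)
d, ci = cohens_d_and_ci(group_a, group_b)
print(f"d = {d:.2f}, 95% CI for the difference = ({ci[0]:.2f}, {ci[1]:.2f})")
```

Reporting the interval rather than a p-value keeps the focus on the size and precision of the effect, which is the core of the "new statistics" approach.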
Are expert systems needed on the power exchange?
International Nuclear Information System (INIS)
Nilsson, David.
1991-09-01
The goal of this master's thesis is to investigate whether it is possible to improve the activities at the power production unit (PDP) at Sydkraft with the help of expert systems. Another goal is that the produced result should serve as education material for the production operators. Therefore, the work contains a detailed description of the power exchange and the factors that influence it, in addition to a description of the work tasks of the operators. The conclusions are, first, that an embedded critiquing system would enable quicker training of new operators, while also giving the more experienced operators feedback on their decisions. Secondly, the water storage operation should be documented, to create a basis for the development of an expert system, and to create more material for education. The thesis also suggests a new model for the development of expert systems in areas where it is not known whether such a solution is possible. (au)
statistical analysis of wind speed for electrical power generation
African Journals Online (AJOL)
HOD
In order to predict and model the potential of any site, ... gamma, and Rayleigh distributions for 8 locations in. Nigeria. ... probability density function is used to model the average power in ... mathematical expression of the Weibull distribution is.
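The Weibull-based approach named in the snippet above can be sketched as follows. The wind-speed sample is synthetic (shape 2, scale 6 m/s), not data from the Nigerian sites, and the use of SciPy for the fit is an assumption about tooling, not the authors' code.

```python
import numpy as np
from scipy import stats
from scipy.special import gamma

# Synthetic wind-speed sample (m/s); the paper's site data are not reproduced here
rng = np.random.default_rng(1)
speeds = stats.weibull_min.rvs(2.0, scale=6.0, size=2000, random_state=rng)

# Fit Weibull shape and scale, location fixed at zero as is usual for wind speeds
shape, loc, scale = stats.weibull_min.fit(speeds, floc=0)

# Mean wind power density 0.5 * rho * E[v^3], with E[v^3] = scale^3 * Gamma(1 + 3/shape)
rho = 1.225  # air density, kg/m^3
mean_v3 = scale**3 * gamma(1.0 + 3.0 / shape)
power_density = 0.5 * rho * mean_v3  # W/m^2
print(shape, scale, power_density)
```

The fitted shape and scale characterize the site; the third moment of the fitted distribution gives the average power available per square meter of rotor area.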
An Update on Statistical Boosting in Biomedicine.
Mayr, Andreas; Hofner, Benjamin; Waldmann, Elisabeth; Hepp, Tobias; Meyer, Sebastian; Gefeller, Olaf
2017-01-01
Statistical boosting algorithms have triggered a lot of research during the last decade. They combine a powerful machine learning approach with classical statistical modelling, offering various practical advantages like automated variable selection and implicit regularization of effect estimates. They are extremely flexible, as the underlying base-learners (regression functions defining the type of effect for the explanatory variables) can be combined with any kind of loss function (target function to be optimized, defining the type of regression setting). In this review article, we highlight the most recent methodological developments on statistical boosting regarding variable selection, functional regression, and advanced time-to-event modelling. Additionally, we provide a short overview on relevant applications of statistical boosting in biomedicine.
Introductory statistics for engineering experimentation
Nelson, Peter R; Coffin, Marie
2003-01-01
The Accreditation Board for Engineering and Technology (ABET) introduced a criterion starting with their 1992-1993 site visits that "Students must demonstrate a knowledge of the application of statistics to engineering problems." Since most engineering curricula are filled with requirements in their own discipline, they generally do not have time for a traditional two semesters of probability and statistics. Attempts to condense that material into a single semester often result in so much time being spent on probability that the statistics useful for designing and analyzing engineering/scientific experiments is never covered. In developing a one-semester course whose purpose was to introduce engineering/scientific students to the most useful statistical methods, this book was created to satisfy those needs. - Provides the statistical design and analysis of engineering experiments & problems - Presents a student-friendly approach through providing statistical models for advanced learning techniques - Cove...
Considerations on the need for electricity storage requirements: Power versus energy
International Nuclear Information System (INIS)
Belderbos, Andreas; Virag, Ana; D’haeseleer, William; Delarue, Erik
2017-01-01
Highlights: • General storage principles are analyzed. • Storage units have different limitations (power versus energy). • Storage power and energy are required, dependent on residual profile. • Relationship between residual profile and optimal storage portfolio is derived. • Broadly applicable rules regarding optimal storage investments are presented. - Abstract: Different storage technologies enable an increasing share of variable renewable generation in the electricity system by reducing the temporal mismatch between generation and demand. Two storage ratings are essential to time-shift delivery of electricity to loads: electric power, or instantaneous electricity flow [W], and electric energy, or power integrated over time [Wh]. An optimal storage portfolio is likely composed of multiple technologies, each having specific power and energy ratings. This paper derives and explains the link between the shape of the time-varying demand and generation profiles and the desirable amount of installed storage capacity, in terms of both energy and power. An analysis is performed for individual storage technologies first, showing a link between the necessary power and energy capacity and the demand and generation profile. Then combinations of storage technologies are analyzed to reveal their mutual interaction in a storage portfolio. Results show an increase in desirability for storage technologies with low-cost power ratings when the mismatch between generation and demand occurs in daily to weekly cycles. Storage technologies with low-cost energy ratings are preferred when this mismatch occurs in monthly to seasonal cycles. The findings of this work can help energy system planners and policy makers to explain results from generation expansion planning studies and to isolate the storage benefits accountable to temporal arbitrage in broader electricity storage studies.
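The power-versus-energy distinction in the abstract above can be made concrete with a small sketch. The residual profile below is a synthetic daily sine cycle, not one of the paper's profiles; the two ratings fall out directly from the profile and its cumulative sum.

```python
import numpy as np

# Synthetic hourly residual profile (demand minus variable generation, MW)
# over two days; a pure daily cycle is assumed purely for illustration
t = np.arange(48)
residual = 10.0 * np.sin(2.0 * np.pi * t / 24.0)

# Power rating: the largest instantaneous charge or discharge required (MW)
power_rating = np.max(np.abs(residual))

# Energy rating: the largest swing of the cumulative mismatch (MWh, 1 h steps)
cumulative = np.cumsum(residual)
energy_rating = cumulative.max() - cumulative.min()
print(power_rating, energy_rating)
```

A slowly varying (e.g., seasonal) residual of the same amplitude would leave the power rating unchanged but inflate the energy rating, which is the mechanism behind the paper's daily-versus-seasonal conclusions.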
A κ-generalized statistical mechanics approach to income analysis
Clementi, F.; Gallegati, M.; Kaniadakis, G.
2009-02-01
This paper proposes a statistical mechanics approach to the analysis of income distribution and inequality. A new distribution function, having its roots in the framework of κ-generalized statistics, is derived that is particularly suitable for describing the whole spectrum of incomes, from the low-middle income region up to the high income Pareto power-law regime. Analytical expressions for the shape, moments and some other basic statistical properties are given. Furthermore, several well-known econometric tools for measuring inequality, which all exist in a closed form, are considered. A method for parameter estimation is also discussed. The model is shown to fit remarkably well the data on personal income for the United States, and the analysis of inequality performed in terms of its parameters is revealed as very powerful.
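The κ-generalized framework described above can be sketched numerically. The functional form of the κ-exponential is standard, but the distribution parameters below (α, β, κ) are hypothetical illustration values, not the paper's US income estimates.

```python
import numpy as np

def exp_kappa(x, kappa):
    """kappa-exponential: (sqrt(1 + k^2 x^2) + k x)^(1/k); -> exp(x) as k -> 0."""
    x = np.asarray(x, dtype=float)
    if kappa == 0:
        return np.exp(x)
    return (np.sqrt(1.0 + (kappa * x) ** 2) + kappa * x) ** (1.0 / kappa)

def survival(x, alpha=2.0, beta=1.0, kappa=0.5):
    """P(X > x) for the kappa-generalized distribution (hypothetical parameters)."""
    return exp_kappa(-beta * np.asarray(x, dtype=float) ** alpha, kappa)

# Small kappa recovers the ordinary exponential ...
print(exp_kappa(1.0, 1e-8))  # ~ e ~ 2.71828
# ... while the upper tail follows a Pareto power law ~ x^(-alpha/kappa) = x^(-4):
print(survival(np.array([10.0, 100.0])))
```

This interpolation between exponential bulk and power-law tail is exactly what makes the distribution suitable for the whole income range, from low-middle incomes to the Pareto regime.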
International statistical bulletin, minero-energetic 1998-2003
International Nuclear Information System (INIS)
2004-12-01
It contains information on the mining and energy sectors of Colombia, of Latin American countries and of countries worldwide; it includes statistics on gas, mining, electric power and primary energy
R statistical application development by example : beginner's guide
Tattar, Narayanachart Prabhanjan
2013-01-01
Full of screenshots and examples, this Beginner's Guide by Example will teach you practically everything you need to know about R statistical application development from scratch. You will begin by learning the first concepts of statistics in R, which is vital in this fast-paced era, and no preliminary course on the subject is needed.
Directions for new developments on statistical design and analysis of small population group trials.
Hilgers, Ralf-Dieter; Roes, Kit; Stallard, Nigel
2016-06-14
Most statistical design and analysis methods for clinical trials have been developed and evaluated where at least several hundreds of patients could be recruited. These methods may not be suitable to evaluate therapies if the sample size is unavoidably small, which is usually termed small populations. The specific sample size cut off, where the standard methods fail, needs to be investigated. In this paper, the authors present their view on new developments for design and analysis of clinical trials in small population groups, where conventional statistical methods may be inappropriate, e.g., because of lack of power or poor adherence to asymptotic approximations due to sample size restrictions. Following the EMA/CHMP guideline on clinical trials in small populations, we consider directions for new developments in the area of statistical methodology for design and analysis of small population clinical trials. We relate the findings to the research activities of three projects, Asterix, IDeAl, and InSPiRe, which have received funding since 2013 within the FP7-HEALTH-2013-INNOVATION-1 framework of the EU. As not all aspects of the wide research area of small population clinical trials can be addressed, we focus on areas where we feel advances are needed and feasible. The general framework of the EMA/CHMP guideline on small population clinical trials stimulates a number of research areas. These serve as the basis for the three projects, Asterix, IDeAl, and InSPiRe, which use various approaches to develop new statistical methodology for design and analysis of small population clinical trials. Small population clinical trials refer to trials with a limited number of patients. Small populations may result from rare diseases or specific subtypes of more common diseases. New statistical methodology needs to be tailored to these specific situations. The main results from the three projects will constitute a useful toolbox for improved design and analysis of small population clinical trials.
CCS Retrofit: Analysis of the Global Installed Power Plant Fleet
Energy Technology Data Exchange (ETDEWEB)
NONE
2012-07-01
Electricity generation from coal is still growing rapidly and energy scenarios from the IEA expect a possible increase from today’s 1 600 GW of coal-fired power plants to over 2 600 GW by 2035. This trend will increase the lock-in of carbon intensive electricity sources, while IEA assessments show that two-thirds of total abatement from all sectors should come from the power sector alone to support a least-cost abatement strategy. Since coal-fired power plants have a fairly long lifetime, and in order to meet climate constraints, there is a need either to apply CCS retrofit to some of today’s installed coal-fired power plants once the technology becomes available, or to retire some plants before the end of their lifetime. This working paper discusses criteria relevant to differentiating between the technical, cost-effective and realistic potential for CCS retrofit. The paper then discusses today’s coal-fired power plant fleet from a statistical perspective, by looking at age, size and the expected performance of today’s plant across several countries. The working paper also highlights the growing demand for applying CCS retrofitting to the coal-fired power plant fleet of the future. In doing so, this paper aims at emphasising the need for policy makers, innovators and power plant operators to quickly complete the development of the CCS technology and to identify key countries where retrofit applications will have the greatest extent and impact.
Using statistics to understand the environment
Cook, Penny A
2000-01-01
Using Statistics to Understand the Environment covers all the basic tests required for environmental practicals and projects and points the way to the more advanced techniques that may be needed in more complex research designs. Following an introduction to project design, the book covers methods to describe data, to examine differences between samples, and to identify relationships and associations between variables.Featuring: worked examples covering a wide range of environmental topics, drawings and icons, chapter summaries, a glossary of statistical terms and a further reading section, this book focuses on the needs of the researcher rather than on the mathematics behind the tests.
Search Databases and Statistics
DEFF Research Database (Denmark)
Refsgaard, Jan C; Munk, Stephanie; Jensen, Lars J
2016-01-01
…having strengths and weaknesses that must be considered for the individual needs. These are reviewed in this chapter. Equally critical for generating highly confident output datasets is the application of sound statistical criteria to limit the inclusion of incorrect peptide identifications from database searches. Additionally, careful filtering and use of appropriate statistical tests on the output datasets affect the quality of all downstream analyses and interpretation of the data. Our considerations and general practices on these aspects of phosphoproteomics data processing are presented here.
Power generation needs and opportunities in Eastern Europe
International Nuclear Information System (INIS)
Gadomski, C.R.; Hon, M.
1990-01-01
This article examines the market for power generation and pollution control equipment in Eastern Europe. The topics of the article include financing equipment and services, financial and political incentives, capacity, environmental impacts, energy consumption and efficiency, energy prices, energy diversification, renewable energy opportunities, strategy for the market, and the example of Poland
SOCR: Statistics Online Computational Resource
Directory of Open Access Journals (Sweden)
Ivo D. Dinov
2006-10-01
The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build students' intuition and enhance their learning.
Statistical Literacy: Data Tell a Story
Sole, Marla A.
2016-01-01
Every day, students collect, organize, and analyze data to make decisions. In this data-driven world, people need to assess how much trust they can place in summary statistics. The results of every survey and the safety of every drug that undergoes a clinical trial depend on the correct application of appropriate statistics. Recognizing the…
Energy Technology Data Exchange (ETDEWEB)
A.G. Crook Company
1993-04-01
This report was prepared by the A.G. Crook Company, under contract to Bonneville Power Administration, and provides statistics of seasonal volumes and streamflow for 28 selected sites in the Columbia River Basin.
Nuclear power: the need, the myths, the realities
International Nuclear Information System (INIS)
Shankar, Ravi
2016-01-01
Dr. H.J. Bhabha in his presidential address of the 1st International Conference on the peaceful uses of Atomic Energy held at Geneva in 1955 said 'In the broad view of human history it is possible to discern three great epochs. The first is marked by the emergence of the early civilizations in the valleys of Euphrates, the Indus and the Nile, the second by the industrial revolution, leading to the civilization in which we live, and the third by the discovery of atomic energy and the dawn of the atomic age, which we are just entering. Each epoch marks a change in the energy pattern of society'. The journey of development from the early man to today's technological man has been possible due to ever-increasing energy consumption. Today, we consume around 100 times more energy per capita as compared to the early man. Even after so many decades of independence, the big question we face today is how to make our country totally self-reliant. For this, we will have to ensure security and self-sufficiency for every citizen of the country, in the areas of food, shelter, primary education, clean and sufficient potable water and high-level health care. For achieving all this, the availability of cheap and abundant energy is a must. Amongst others, nuclear power is a primary source of energy with a lot of scope for development and is free of greenhouse gas emissions. Nuclear power therefore has its own place in any energy policy of India. Our uranium resources are modest, but the country has abundant resources of thorium. The three-stage Indian nuclear power programme is essentially based on this fact
Statistics and finance an introduction
Ruppert, David
2004-01-01
This textbook emphasizes the applications of statistics and probability to finance. Students are assumed to have had a prior course in statistics, but no background in finance or economics. The basics of probability and statistics are reviewed and more advanced topics in statistics, such as regression, ARMA and GARCH models, the bootstrap, and nonparametric regression using splines, are introduced as needed. The book covers the classical methods of finance such as portfolio theory, CAPM, and the Black-Scholes formula, and it introduces the somewhat newer area of behavioral finance. Applications and use of MATLAB and SAS software are stressed. The book will serve as a text in courses aimed at advanced undergraduates and masters students in statistics, engineering, and applied mathematics as well as quantitatively oriented MBA students. Those in the finance industry wishing to know more statistics could also use it for self-study. David Ruppert is the Andrew Schultz, Jr. Professor of Engineering, School of Oper...
Impact of Wind Power Generation on European Cross-Border Power Flows
DEFF Research Database (Denmark)
Zugno, Marco; Pinson, Pierre; Madsen, Henrik
2013-01-01
A statistical analysis is performed in order to investigate the relationship between wind power production and cross-border power transmission in Europe. A dataset including physical hourly cross-border power exchanges between European countries as dependent variables is used. Principal component analysis is employed in order to reduce the problem dimension. Then, nonlinear relationships between forecast wind power production as well as spot price in Germany, by far the largest wind power producer in Europe, and power flows are modeled using local polynomial regression. We find that both forecast wind power production and spot price in Germany have substantial nonlinear effects on power transmission on a European scale.
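The dimension-reduction step described in this abstract can be sketched as follows. The flow matrix is synthetic (two latent factors plus noise), standing in for the real hourly exchange data, and the subsequent local polynomial regression on the component scores is not shown.

```python
import numpy as np

rng = np.random.default_rng(5)
# Synthetic stand-in for the flow dataset: 1000 hours x 12 border connections,
# driven by two latent factors plus noise (the real data are hourly exchanges)
hours, borders = 1000, 12
latent = rng.normal(size=(hours, 2))
loadings = rng.normal(size=(2, borders))
flows = latent @ loadings + 0.1 * rng.normal(size=(hours, borders))

# PCA via SVD of the centered data reduces the dimension before regression
Xc = flows - flows.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)      # variance share per component
scores = Xc @ Vt[:2].T               # first two principal-component scores
print(np.round(explained[:3], 3))
```

Once most of the variance is captured by a few scores, nonparametric regression of those scores on forecast wind production and spot price becomes tractable.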
A nonparametric spatial scan statistic for continuous data.
Jung, Inkyung; Cho, Ho Jin
2015-10-20
Spatial scan statistics are widely used for spatial cluster detection, and several parametric models exist. For continuous data, a normal-based scan statistic can be used. However, the performance of the model has not been fully evaluated for non-normal data. We propose a nonparametric spatial scan statistic based on the Wilcoxon rank-sum test statistic and compare the performance of the method with parametric models via a simulation study under various scenarios. The nonparametric method outperforms the normal-based scan statistic in terms of power and accuracy in almost all cases under consideration in the simulation study. The proposed nonparametric spatial scan statistic is therefore an excellent alternative to the normal model for continuous data and is especially useful for data following skewed or heavy-tailed distributions.
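The core ingredient of the statistic described above, a rank-sum comparison of values inside a candidate window against those outside, can be sketched with synthetic data. All locations, values, and the window are fabricated for illustration; a real scan would maximize over many windows.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Continuous observations at random spatial locations; lognormal = skewed, non-normal
xy = rng.uniform(0.0, 1.0, size=(300, 2))
values = rng.lognormal(mean=0.0, sigma=1.0, size=300)

# Plant an elevated cluster inside a circular window around (0.3, 0.3)
inside = np.hypot(xy[:, 0] - 0.3, xy[:, 1] - 0.3) < 0.15
values[inside] *= 3.0

# Rank-sum comparison of one candidate window against the rest. A full scan
# statistic would maximize this over many windows and calibrate the maximum
# by Monte Carlo replication; this sketch evaluates a single window only.
stat, p = stats.ranksums(values[inside], values[~inside])
print(stat, p)
```

Because only ranks enter the statistic, the heavy right tail of the lognormal values does not distort the comparison, which is the paper's motivation for preferring it over the normal-based scan.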
Green, Jeffrey J.; Stone, Courtenay C.; Zegeye, Abera; Charles, Thomas A.
2009-01-01
Because statistical analysis requires the ability to use mathematics, students typically are required to take one or more prerequisite math courses prior to enrolling in the business statistics course. Despite these math prerequisites, however, many students find it difficult to learn business statistics. In this study, we use an ordered probit…
Study on reactor power change and ambiguous control of third Qinshan Nuclear Power Plant
International Nuclear Information System (INIS)
Wang Gongzhan
2006-01-01
The phenomenon of the average power reduction during long-term full-power operation at the Third Qinshan nuclear power plant is analyzed. Based on the basic conclusions about reactor power fluctuation derived from probability statistics and calculation, a corresponding ambiguous control scheme is proposed. The operating performance that could be achieved by the proposed control scheme is also predicted. (authors)
Applied multivariate statistics with R
Zelterman, Daniel
2015-01-01
This book brings the power of multivariate statistics to graduate-level practitioners, making these analytical methods accessible without lengthy mathematical derivations. Using the open source, shareware program R, Professor Zelterman demonstrates the process and outcomes for a wide array of multivariate statistical applications. Chapters cover graphical displays, linear algebra, univariate, bivariate and multivariate normal distributions, factor methods, linear regression, discrimination and classification, clustering, time series models, and additional methods. Zelterman uses practical examples from diverse disciplines to welcome readers from a variety of academic specialties. Those with backgrounds in statistics will learn new methods while they review more familiar topics. Chapters include exercises, real data sets, and R implementations. The data are interesting, real-world topics, particularly from health and biology-related contexts. As an example of the approach, the text examines a sample from the B...
Chains, Shops and Networks: Official Statistics and the Creation of Public Value
Directory of Open Access Journals (Sweden)
Asle Rolland
2015-06-01
The paper concerns official statistics, particularly as produced by the NSIs. Their contribution to society is considered well captured by the concept of public value. Official statistics create value for the democracy as a foundation for evidence-based politics. Democracies and autocracies alike need statistics to govern the public. Unique for the democracy is the need for statistics to govern the governors, for which the independence of the NSI is crucial. Three ways of creating public value are the value chain, the value shop and the value network. The chain is appropriate for the production, the shop for the interpretation and the network for the dissemination of statistics. Automation reduces the need to rely on the value chain as core business model. In addition, automation increases the statistical output, which in turn increases the need for shop and network activities. Replacing the chain with the shop as core model will elevate the NSIs from commodity producers to a processing industry.
International Nuclear Information System (INIS)
Murarka, I.P.; Policastro, A.J.; Ferrante, J.G.; Daniels, E.W.; Marmer, G.J.
1976-11-01
An analysis of the fuel data gathered during the monitoring programs at seven nuclear power plant sites shows no significant ecological impacts. This conclusion is within the constraints of the quality of available data from these sites. Current monitoring programs are, however, not designed to meet the needs for statistical analysis and, consequently, the monitoring data are often ill-suited for modern statistical procedures. Recommendations are proposed for revising monitoring schemes so that more precise conclusions can be made from fewer field measurements
GeoGebra for Mathematical Statistics
Hewson, Paul
2009-01-01
The GeoGebra software is attracting a lot of interest in the mathematical community, consequently there is a wide range of experience and resources to help use this application. This article briefly outlines how GeoGebra will be of great value in statistical education. The release of GeoGebra is an excellent example of the power of free software…
Xu, Cheng-Jian; van der Schaaf, Arjen; Schilstra, Cornelis; Langendijk, Johannes A; van't Veld, Aart A
2012-03-15
To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended. Copyright © 2012 Elsevier Inc. All rights reserved.
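The variable-selection behavior of the LASSO recommended in the abstract above can be sketched on a toy problem. The sketch uses plain linear regression with a hand-rolled proximal-gradient (ISTA) solver on synthetic data, not the authors' logistic NTCP models or their clinical data; the point is only that the L1 penalty drives irrelevant coefficients to (near) zero.

```python
import numpy as np

def lasso_ista(X, y, lam, iters=2000):
    """LASSO via proximal gradient (ISTA): min ||y - Xb||^2/(2n) + lam*||b||_1."""
    n, p = X.shape
    lr = n / (np.linalg.norm(X, 2) ** 2)  # 1 / Lipschitz constant of the gradient
    b = np.zeros(p)
    for _ in range(iters):
        grad = X.T @ (X @ b - y) / n
        b = b - lr * grad
        b = np.sign(b) * np.maximum(np.abs(b) - lr * lam, 0.0)  # soft threshold
    return b

rng = np.random.default_rng(3)
n, p = 200, 10
X = rng.normal(size=(n, p))
true_b = np.zeros(p)
true_b[:2] = [2.0, -1.5]                 # only two informative predictors
y = X @ true_b + rng.normal(scale=0.5, size=n)

b = lasso_ista(X, y, lam=0.1)
print(np.round(b, 2))  # weights on the eight noise predictors shrink toward zero
```

Unlike stepwise selection, the penalty performs selection and shrinkage in one convex optimization, which is part of why it tends to generalize better in the cross-validation comparison the paper reports.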
A novel statistic for genome-wide interaction analysis.
Directory of Open Access Journals (Sweden)
Xuesen Wu
2010-09-01
Full Text Available Although great progress in genome-wide association studies (GWAS) has been made, the significant SNP associations identified by GWAS account for only a few percent of the genetic variance, leading many to question where and how we can find the missing heritability. There is increasing interest in genome-wide interaction analysis as a possible source of the heritability unexplained by current GWAS. However, the existing statistics for testing interaction have low power for genome-wide interaction analysis. To meet the challenges raised by genome-wide interaction analysis, we have developed a novel statistic for testing interaction between two loci (either linked or unlinked). The null distribution and the type I error rates of the new statistic for testing interaction are validated using simulations. Extensive power studies show that the developed statistic has much higher power to detect interaction than classical logistic regression. The results identified 44 and 211 pairs of SNPs showing significant evidence of interaction with FDR < 0.001 and 0.001 …
Fuel rod design by statistical methods for MOX fuel
International Nuclear Information System (INIS)
Heins, L.; Landskron, H.
2000-01-01
Statistical methods in fuel rod design have received increasing attention in recent years. One possible way to use statistical methods in fuel rod design can be described as follows: Monte Carlo calculations are performed using the fuel rod code CARO. For each CARO run, the set of input data is modified: parameters describing the design of the fuel rod (geometrical data, density, etc.) and modeling parameters are randomly selected according to their individual distributions. Power histories are varied systematically so that each power history of the relevant core management calculation is represented in the Monte Carlo calculations with equal frequency. The frequency distributions of the results, such as rod internal pressure and cladding strain, generated by the Monte Carlo calculation are evaluated and compared with the design criteria. To date, this methodology has been applied to licensing calculations for PWRs and BWRs, and for UO2 and MOX fuel, in three countries. Especially for the insertion of MOX fuel, which results in power histories with relatively high linear heat generation rates at higher burnup, the statistical methodology is an appropriate approach to demonstrate compliance with licensing requirements. (author)
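The sampling scheme described above can be sketched schematically. The real analysis evaluates the CARO fuel rod code per sample; the surrogate model, parameter distributions, and design limit below are purely illustrative placeholders, not actual CARO inputs.

```python
# Schematic Monte Carlo fuel rod evaluation: draw inputs from their
# distributions, sample power histories with equal frequency, evaluate a
# toy surrogate model, and compare the output distribution to a limit.
import numpy as np

rng = np.random.default_rng(7)
N = 10_000

# Illustrative input uncertainties (hypothetical, not real CARO parameters)
gap_um = rng.normal(80.0, 5.0, N)               # pellet-clad gap [micrometres]
density = rng.normal(10.4, 0.05, N)             # fuel density [g/cm^3]
power_kw_m = rng.choice([25.0, 30.0, 35.0], N)  # power histories, equal weight

# Toy surrogate for rod internal pressure [bar]
pressure = 40.0 + 0.8 * power_kw_m + 5.0 * (density - 10.4) - 0.02 * gap_um

limit = 65.0  # hypothetical design criterion
frac_exceeding = np.mean(pressure > limit)
print(f"fraction of runs exceeding limit: {frac_exceeding:.4f}")
print(f"95th percentile pressure: {np.percentile(pressure, 95):.1f} bar")
```

The licensing argument then rests on the empirical frequency distribution of the result (e.g. its upper percentile) staying within the design criterion.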
Statistical methods for astronomical data analysis
Chattopadhyay, Asis Kumar
2014-01-01
This book introduces “Astrostatistics” as a subject in its own right, with rewarding examples, including the authors' work with galaxy and Gamma Ray Burst data, to engage the reader. It offers a comprehensive blending of Astrophysics and Statistics. The first chapter's coverage of preliminary concepts and terminology for astronomical phenomena provides helpful context for both Statistics and Astrophysics readers. The statistical concepts covered in the book provide a methodological framework. A unique feature is the inclusion of different possible sources of astronomical data, as well as software packages for converting the raw data into forms appropriate for analysis; readers can then use the appropriate statistical packages for their particular needs. The ideas of statistical inference discussed in the book help readers determine how to apply statistical tests. The authors cover different applications of statistical techniques already developed or specifically introduced for ...
Statistical Methods for Environmental Pollution Monitoring
Energy Technology Data Exchange (ETDEWEB)
Gilbert, Richard O. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)
1987-01-01
The application of statistics to environmental pollution monitoring studies requires a knowledge of statistical analysis methods particularly well suited to pollution data. This book fills that need by providing sampling plans, statistical tests, parameter estimation procedures, and references to pertinent publications. Most of the statistical techniques are relatively simple, and examples, exercises, and case studies are provided to illustrate procedures. The book is logically divided into three parts. Chapters 1, 2, and 3 are introductory chapters. Chapters 4 through 10 discuss field sampling designs, and Chapters 11 through 18 deal with a broad range of statistical analysis procedures. Some statistical techniques given here are not commonly seen in statistics books. For example, see methods for handling correlated data (Sections 4.5 and 11.12), for detecting hot spots (Chapter 10), and for estimating a confidence interval for the mean of a lognormal distribution (Section 13.2). Also, Appendix B lists a computer code that estimates and tests for trends over time at one or more monitoring stations using nonparametric methods (Chapters 16 and 17). Unfortunately, some important topics could not be included because of their complexity and the need to limit the length of the book. For example, only brief mention could be made of time series analysis using Box-Jenkins methods and of kriging techniques for estimating spatial and spatial-time patterns of pollution, although multiple references on these topics are provided. Also, no discussion of methods for assessing risks from environmental pollution could be included.
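One of the topics singled out above, a confidence interval for the mean of a lognormal distribution, can be illustrated with Cox's approximation on the log scale. Note this is a common approximation, not necessarily the exact procedure of Section 13.2 (the book presents Land's method, which uses tabulated H-factors).

```python
# Cox's approximate CI for E[X] when log(X) is normal: work on the log
# scale, where the estimator of log E[X] = mu + sigma^2/2 is approximately
# normal, then exponentiate the interval endpoints.
import numpy as np
from scipy.stats import norm

def lognormal_mean_ci(x, alpha=0.05):
    y = np.log(x)
    n, ybar, s2 = len(y), y.mean(), y.var(ddof=1)
    est = ybar + s2 / 2                           # estimate of log E[X]
    se = np.sqrt(s2 / n + s2**2 / (2 * (n - 1)))  # its approximate std error
    z = norm.ppf(1 - alpha / 2)
    return np.exp(est), np.exp(est - z * se), np.exp(est + z * se)

rng = np.random.default_rng(3)
x = rng.lognormal(mean=1.0, sigma=0.8, size=50)   # synthetic pollutant data
point, lo, hi = lognormal_mean_ci(x)
print(f"estimated mean {point:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```

The point of working on the log scale is that pollution concentrations are typically right-skewed, so a naive normal-theory interval on the raw data covers poorly.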
A Statistical Toolkit for Data Analysis
International Nuclear Information System (INIS)
Donadio, S.; Guatelli, S.; Mascialino, B.; Pfeiffer, A.; Pia, M.G.; Ribon, A.; Viarengo, P.
2006-01-01
The present project aims to develop an open-source and object-oriented software Toolkit for statistical data analysis. Its statistical testing component contains a variety of Goodness-of-Fit tests, from Chi-squared to Kolmogorov-Smirnov, to less known but generally much more powerful tests such as Anderson-Darling, Goodman, Fisz-Cramer-von Mises, Kuiper, and Tiku. Thanks to the component-based design and the use of standard abstract interfaces for data analysis, this tool can be used by other data analysis systems or integrated into experimental software frameworks. The Toolkit has been released and is downloadable from the web. In this paper we describe the statistical details of the algorithms, the computational features of the Toolkit, and the code validation.
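The Toolkit itself is a C++ component; as a rough analogue of the tests it offers, SciPy exposes several of the same goodness-of-fit statistics. This sketch only illustrates the comparison the abstract alludes to: the k-sample Anderson-Darling test is generally more sensitive to distributional differences than Kolmogorov-Smirnov.

```python
# Two-sample goodness-of-fit: Kolmogorov-Smirnov vs the (generally more
# powerful) k-sample Anderson-Darling test, on slightly shifted samples.
import numpy as np
from scipy.stats import ks_2samp, anderson_ksamp

rng = np.random.default_rng(11)
a = rng.normal(0.0, 1.0, 500)
b = rng.normal(0.3, 1.0, 500)   # small location shift

ks = ks_2samp(a, b)
ad = anderson_ksamp([a, b])
print(f"KS p-value: {ks.pvalue:.4f}")
print(f"AD statistic: {ad.statistic:.2f}")
```

(Anderson-Darling weights the tails of the empirical distribution more heavily, which is the source of its extra power for many alternatives.)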
Directory of Open Access Journals (Sweden)
Turnbull Arran K
2012-08-01
Full Text Available Abstract Background Affymetrix GeneChips and Illumina BeadArrays are the most widely used commercial single-channel gene expression microarrays. Public data repositories are an extremely valuable resource, providing array-derived gene expression measurements from many thousands of experiments. Unfortunately, many of these studies are underpowered, and it is desirable to improve power by combining data from more than one study; we sought to determine whether platform-specific bias precludes direct integration of probe intensity signals for combined reanalysis. Results Using Affymetrix and Illumina data from the microarray quality control project, from our own clinical samples, and from additional publicly available datasets, we evaluated several approaches to directly integrate intensity-level expression data from the two platforms. After mapping probe sequences to Ensembl genes, we demonstrate that ComBat and cross-platform normalisation (XPN) significantly outperform mean-centering and distance-weighted discrimination (DWD) in terms of minimising inter-platform variance. In particular we observed that DWD, a popular method used in a number of previous studies, removed systematic bias at the expense of genuine biological variability, potentially removing legitimate biological differences from integrated datasets. Conclusion Normalised and batch-corrected intensity-level data from Affymetrix and Illumina microarrays can be directly combined to generate biologically meaningful results with improved statistical power for robust, integrated reanalysis.
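The simplest of the integration approaches compared above, per-platform mean-centering of each gene, can be sketched as follows. The study found ComBat and XPN preferable; this only shows the mean-centering baseline, on synthetic expression matrices with an assumed systematic platform offset.

```python
# Mean-centering baseline for cross-platform integration: subtract each
# gene's per-platform mean so both platforms share a common location,
# then concatenate samples column-wise.
import numpy as np

rng = np.random.default_rng(5)
genes = 1000
affy = rng.normal(8.0, 1.0, size=(genes, 20))   # platform A, 20 samples
illu = rng.normal(6.5, 1.0, size=(genes, 15))   # platform B, systematic offset

def mean_center(mat):
    return mat - mat.mean(axis=1, keepdims=True)

combined = np.hstack([mean_center(affy), mean_center(illu)])
print(combined.shape)   # one matrix of 1000 genes x 35 samples
```

The paper's observation is that such location-only correction leaves platform-specific scale and probe effects behind, which ComBat and XPN model explicitly.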
International Nuclear Information System (INIS)
Weathers, J.B.; Luck, R.; Weathers, J.W.
2009-01-01
The complexity of mathematical models used by practicing engineers is increasing due to the growing availability of sophisticated mathematical modeling tools and ever-improving computational power. For this reason, the need to define a well-structured process for validating these models against experimental results has become a pressing issue in the engineering community. This validation process is partially characterized by the uncertainties associated with the modeling effort as well as the experimental results. The net impact of the uncertainties on the validation effort is assessed through the 'noise level of the validation procedure', which can be defined as an estimate of the 95% confidence uncertainty bounds for the comparison error between actual experimental results and model-based predictions of the same quantities of interest. Although general descriptions of the construction of the noise level using multivariate statistics exist in the literature, a detailed procedure outlining how to account for the systematic and random uncertainties is not available. In this paper, the methodology used to derive the covariance matrix associated with the multivariate normal pdf based on random and systematic uncertainties is examined, and a procedure used to estimate this covariance matrix using Monte Carlo analysis is presented. The covariance matrices are then used to construct approximate 95% confidence constant-probability contours associated with comparison error results for a practical example. In addition, the example is used to show the drawbacks of using a first-order sensitivity analysis when nonlinear local sensitivity coefficients exist. Finally, the example is used to show the connection between the noise level of the validation exercise calculated using multivariate and univariate statistics.
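The Monte Carlo step described above can be sketched in miniature: sample the random and systematic error contributions, estimate the covariance matrix of the comparison errors, and test an observed error vector against the chi-square quantile that defines the approximate 95% constant-probability contour. All quantities below are synthetic stand-ins, not the paper's example.

```python
# Monte Carlo estimate of the comparison-error covariance and the
# corresponding 95% constant-probability (Mahalanobis) contour check.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(8)
M = 20_000  # Monte Carlo samples
k = 2       # two quantities of interest

systematic = rng.normal(0.0, 0.5, size=(M, 1))         # shared bias (correlates the errors)
random_err = rng.normal(0.0, [0.3, 0.4], size=(M, k))  # independent per quantity
errors = systematic + random_err                       # sampled comparison errors E

cov = np.cov(errors, rowvar=False)   # estimated covariance of E
r95 = chi2.ppf(0.95, df=k)           # squared Mahalanobis radius of the 95% contour

# The model is consistent with the data at the noise level if the observed
# comparison error falls inside the contour: E^T cov^{-1} E <= r95.
E_obs = np.array([0.4, -0.2])
inside = E_obs @ np.linalg.solve(cov, E_obs) <= r95
print("observed error inside 95% contour:", bool(inside))
```

The shared systematic term is what makes the multivariate treatment matter: it induces off-diagonal covariance that a univariate, per-quantity noise level would ignore.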