The Brazilian relief, predominantly composed of low mountains and plateaus, has contributed to the formation of rivers with a large number of falls. With the exception of North-eastern Brazil, the climate of the country is rainy, which helps keep water flows high. These elements are essential to a high hydroelectric potential, and they contributed to the choice of hydroelectric power plants as the main technology for electricity generation in Brazil. Although hydroelectricity is a renewable source whose resource is free, dams must be established, which generates high environmental and social impacts. The objective of this study is to evaluate the impact caused by these dams through the use of environmental indexes: the ratio of a hydro power plant's installed power to its dam area, and the ratio of its firm power to that area. In this study, the greatest mean values were found in the South, Southeast, and Northeast regions, respectively, and the smallest mean values were found in the North and Mid-West regions, respectively. The greatest mean indexes were also found in dams established in the 1950s; over the last six decades, the smallest indexes were registered by dams established in the 1980s. These indexes could be used as important instruments for environmental impact assessments, and could enable dams to be established that degrade an ecosystem as little as possible. (author)
Ladoni, Moslem; Kravchenko, Sasha
2014-05-01
Conservation agricultural management practices have the potential to increase soil organic carbon sequestration. However, due to the typically slow response of soil organic C to management and to its large spatial variability, many researchers fail to detect statistically significant management effects on soil organic carbon in their studies. One solution that has been commonly applied is to use active fractions of soil organic C for treatment comparisons. Active pools of soil organic C have been shown to respond to management changes faster than total C; however, it is possible that the larger variability associated with these pools makes their use for treatment comparisons more difficult. The objectives of this study are to assess the variability of total C and active C pools and then to use power analysis to investigate the probability of detecting significant differences among the treatments for total C and for different active pools of C. We also explored the benefit of applying additional soil and landscape data as covariates to explain some of the variability and to enhance the statistical power for different pools of C. We collected 66 soil cores from 10 agricultural fields under three different management treatments, namely corn-soybean-wheat rotation systems with 1) conventional chemical inputs, 2) low chemical inputs with cover crops and 3) organic management with cover crops. The cores were analyzed for total organic carbon (TOC) and for two active C pool characteristics, particulate organic carbon (POC) and short-term mineralizable carbon (SMC). In addition, for each core we determined the values of potential covariates including soil particle size distribution, bulk density and topographical terrain attributes. Power analysis was conducted using the estimates of variances from the obtained data and a series of hypothesized management effects. The range of considered hypothesized effects consisted of 10-100% increases under low-input, 10
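The kind of power analysis described above can be sketched in a few lines. This is not the authors' code, just a minimal normal-approximation example for a two-sample comparison under a range of hypothesized percentage increases; the control mean, standard deviation, and group size are invented for illustration.

```python
from math import erf, sqrt

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def two_sample_power(mean_control, sd, n_per_group, pct_increase, alpha=0.05):
    """Approximate power of a two-sided two-sample z-test to detect a
    hypothesized percentage increase over the control mean."""
    delta = mean_control * pct_increase / 100.0   # hypothesized effect size
    se = sd * sqrt(2.0 / n_per_group)             # SE of the mean difference
    z_crit = 1.959964                             # two-sided, alpha = 0.05
    return normal_cdf(delta / se - z_crit)

# Illustrative values: control mean 20 g C/kg, SD 6 g C/kg, 10 plots per treatment
for pct in (10, 50, 100):
    print(f"{pct:3d}% increase -> power {two_sample_power(20.0, 6.0, 10, pct):.2f}")
```

With these invented numbers, only the largest hypothesized effects are detectable with high probability, which mirrors the paper's point that small management effects on highly variable C pools are easy to miss.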
Frantál Bohumil
2016-03-01
The effect of geographical distance on the extent of socioeconomic impacts of the Dukovany nuclear power plant in the Czech Republic is assessed by combining two different research approaches. First, we survey how people living in municipalities in the vicinity of the power plant perceive impacts on their personal quality of life. Second, we explore the effects of the power plant on regional development by analysing long-term statistical data about the unemployment rate, the share of workers in the energy sector and overall job opportunities in the respective municipalities. The results indicate that the power plant has had significant positive impacts on surrounding communities both as perceived by residents and as evidenced by the statistical data. The level of impacts is, however, significantly influenced by the spatial and social distances of communities and individuals from the power plant. The perception of positive impacts correlates with geographical proximity to the power plant, while the hypothetical distance where positive effects on the quality of life are no longer perceived was estimated at about 15 km. Positive effects are also more likely to be reported by highly educated, young and middle-aged and economically active persons, whose work is connected to the power plant.
The frost in February increased the power demand in Finland significantly. The total power consumption in Finland during January-February 2001 was about 4% higher than a year before. In January 2001 the average temperature in Finland was only about -4 deg C, which is nearly 2 degrees higher than in 2000 and about 6 degrees higher than the long-term average. Power demand in January was slightly less than 7.9 TWh, about 0.5% less than in 2000. The power consumption in Finland during the past 12 months exceeded 79.3 TWh, which is less than 2% higher than during the previous 12 months. In February 2001 the average temperature was -10 deg C, about 5 degrees lower than in February 2000. Because of this, the power consumption in February 2001 increased by 5%. Power consumption in February was 7.5 TWh. The maximum hourly output of power plants in Finland was 13310 MW. Power consumption of Finnish households in February 2001 was about 10% higher than in February 2000, while in industry the increase was nearly zero. The utilization rate in the forest industry in February 2001 decreased by 5% from the value of February 2000, to only about 89%. The power consumption of the past 12 months (Feb. 2000 - Feb. 2001) was 79.6 TWh. Generation of hydroelectric power in Finland during January - February 2001 was 10% higher than a year before. The generation of hydroelectric power in Jan. - Feb. 2001 was nearly 2.7 TWh, corresponding to 17% of the power demand in Finland. The output of hydroelectric power in Finland during the past 12 months was 14.7 TWh. The increase from the previous 12 months was 17%, corresponding to over 18% of the power demand in Finland. Wind power generation in Jan. - Feb. 2001 slightly exceeded 10 GWh, while in 2000 the corresponding output was 20 GWh. The degree of utilization of Finnish nuclear power plants in Jan. - Feb. 2001 was high. The output of these plants was 3.8 TWh, about 1% less than in Jan. - Feb. 2000. The main cause for the
This study assessed the pollutant emission offset potential of distributed grid-connected photovoltaic (PV) power systems. Computer-simulated performance results were utilized for 211 PV systems located across the U.S. The PV systems' monthly electrical energy outputs were based ...
NONSTRUCTURAL AND STATISTICAL NONPARAMETRIC MARKET POWER TESTS: AN EMPIRICAL INVESTIGATION
Noelke, Corinna M.; Raper, Kellie Curry
1999-01-01
We use Monte Carlo experiments to assess the accuracy of two nonstructural and two statistical nonparametric market power tests. We implement these monopoly and monopsony market power tests using data from ten known market structures. The objective is to determine which test is most able to distinguish between market structures. The statistical nonparametric market power tests appear to be promising.
Voet, van der H.; Goedhart, P.W.
2015-01-01
Publications on power analyses for field trial count data comparing transgenic and conventional crops have reported widely varying requirements for the replication needed to obtain statistical tests with adequate power. These studies are critically reviewed and complemented with a new simulation stu
Assessing statistical significance of periodogram peaks
Baluev, Roman V
2007-01-01
The least-squares (or Lomb-Scargle) periodogram is a powerful tool which is used routinely in many branches of astronomy to search for periodicities in observational data. The problem of assessing statistical significance of candidate periodicities for different periodograms is considered. Based on results in extreme value theory, improved analytic estimations of false alarm probabilities are given. They include an upper limit to the false alarm probability (or a lower limit to the significance). These estimations are tested numerically in order to establish regions of their practical applicability.
Assessing the statistical significance of periodogram peaks
Baluev, R. V.
2008-04-01
The least-squares (or Lomb-Scargle) periodogram is a powerful tool that is routinely used in many branches of astronomy to search for periodicities in observational data. The problem of assessing the statistical significance of candidate periodicities for a number of periodograms is considered. Based on results in extreme value theory, improved analytic estimations of false alarm probabilities are given. These include an upper limit to the false alarm probability (or a lower limit to the significance). The estimations are tested numerically in order to establish regions of their practical applicability.
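As a point of reference for the improved analytic estimates discussed above, the classical "independent frequencies" approximation can be sketched in a few lines. This is not Baluev's extreme-value bound, just the textbook baseline it refines; the peak power and number of independent frequencies are chosen purely for illustration.

```python
import random
from math import exp, log

def fap_independent(z, m):
    """Classical false-alarm probability estimate: the chance that the
    largest of m independent, unit-exponential periodogram peaks
    (the pure-noise case) exceeds an observed power z."""
    return 1.0 - (1.0 - exp(-z)) ** m

def fap_monte_carlo(z, m, trials=200_000, seed=1):
    """Check the formula by simulating the noise-only maximum directly,
    sampling it via the inverse CDF F(x) = (1 - e**-x)**m."""
    rng = random.Random(seed)
    hits = sum(-log(1.0 - rng.random() ** (1.0 / m)) > z
               for _ in range(trials))
    return hits / trials

z, m = 10.0, 100   # illustrative peak power and independent-frequency count
print(fap_independent(z, m))    # ~4.5e-3
print(fap_monte_carlo(z, m))
```

The weakness that motivates the analytic bounds in the paper is visible here: the answer depends on m, the effective number of independent frequencies, which is generally unknown for unevenly sampled data.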
Calculating statistical power in Mendelian randomization studies.
Brion, Marie-Jo A; Shakhbazov, Konstantin; Visscher, Peter M
2013-10-01
In Mendelian randomization (MR) studies, where genetic variants are used as proxy measures for an exposure trait of interest, obtaining adequate statistical power is frequently a concern due to the small amount of variation in a phenotypic trait that is typically explained by genetic variants. A range of power estimates based on simulations and specific parameters for two-stage least squares (2SLS) MR analyses based on continuous variables has previously been published. However, there are presently no specific equations or software tools one can implement for calculating the power of a given MR study. Using asymptotic theory, we show that in the case of continuous variables and a single instrument, for example a single-nucleotide polymorphism (SNP) or a multiple-SNP predictor, statistical power for a fixed sample size is a function of two parameters: the proportion of variation in the exposure variable explained by the genetic predictor and the true causal association between the exposure and outcome variable. We demonstrate that power for 2SLS MR can be derived using the non-centrality parameter (NCP) of the statistical test that is employed to test whether the 2SLS regression coefficient is zero. We show that the previously published power estimates from simulations can be represented theoretically using this NCP-based approach, with similar estimates observed when the simulation-based estimates are compared with our NCP-based approach. General equations for calculating statistical power for 2SLS MR using the NCP are provided in this note, and we implement the calculations in a web-based application. PMID:24159078
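The NCP-based calculation described in the abstract can be sketched roughly as follows. This is not the authors' published tool, just a minimal normal-approximation illustration assuming standardized exposure and outcome, with all parameter values invented for the example.

```python
from math import erf, sqrt

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def mr_power(n, r2_gx, beta, alpha=0.05):
    """Approximate power of a two-sided test that the 2SLS coefficient is zero.

    Assumes standardized exposure and outcome, so the non-centrality
    parameter is roughly NCP = n * r2_gx * beta**2, where r2_gx is the
    proportion of exposure variance explained by the genetic instrument
    and beta is the true causal effect (assumption for this sketch).
    """
    ncp = n * r2_gx * beta ** 2
    z_crit = 1.959964                      # two-sided, alpha = 0.05
    return (normal_cdf(sqrt(ncp) - z_crit)
            + normal_cdf(-sqrt(ncp) - z_crit))

# Illustration: 5,000 samples, instrument explaining 2% of the exposure
# variance, true causal effect of 0.2 SD outcome per SD exposure
print(round(mr_power(5000, 0.02, 0.2), 2))   # → 0.52
```

The example reproduces the qualitative message of the paper: even a respectable sample size yields only modest power when the instrument explains little exposure variance.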
Statistics review 11: Assessing risk
Bewick, Viv; Cheek, Liz; Ball, Jonathan
2004-01-01
Relative risk and odds ratio have been introduced in earlier reviews (see Statistics reviews 3, 6 and 8). This review describes the calculation and interpretation of their confidence intervals. The different circumstances in which the use of either the relative risk or odds ratio is appropriate and their relative merits are discussed. A method of measuring the impact of exposure to a risk factor is introduced. Measures of the success of a treatment using data from clinical trials are also con...
Power performance assessment. Final report
In the increasingly commercialised wind power marketplace, the lack of precise assessment methods for the output of an investment is becoming a barrier for wider penetration of wind power. Thus, addressing this problem, the overall objectives of the project are to reduce the financial risk in investment in wind power projects by significantly improving the power performance assessment methods. Ultimately, if this objective is successfully met, the project may also result in improved tuning of the individual wind turbines and in optimisation methods for wind farm operation. The immediate, measurable objectives of the project are: To prepare a review of existing contractual aspects of power performance verification procedures of wind farms; to provide information on production sensitivity to specific terrain characteristics and wind turbine parameters by analyses of a larger number of wind farm power performance data available to the proposers; to improve the understanding of the physical parameters connected to power performance in complex environment by comparing real-life wind farm power performance data with 3D computational flow models and 3D-turbulence wind turbine models; to develop the statistical framework including uncertainty analysis for power performance assessment in complex environments; and to propose one or more procedures for power performance evaluation of wind power plants in complex environments to be applied in contractual agreements between purchasers and manufacturers on production warranties. Although the focus in this project is on power performance assessment the possible results will also be of benefit to energy yield forecasting, since the two tasks are strongly related. (au) JOULE III. 66 refs.; In Co-operation Renewable Energy System Ltd. (GB); Centre for Renewable Energy (GR); Aeronautic Research Centre (SE); National Engineering Lab. (GB); Public Power Cooperation (GR)
Availability statistics for thermal power plants
Denmark, Finland and Sweden have adopted almost the same methods of recording and calculation of availability data. For a number of years comparable availability and outage data for thermal power have been summarized and published in one report. The purpose of the report now presented for 1989 containing general statistical data is to produce basic information on existing kinds of thermal power in the countries concerned. With this information as a basis additional and more detailed information can be exchanged in direct contacts between bodies in the above mentioned countries according to forms established for that purpose. The report includes fossil steam power, nuclear power and gas turbines. The information is presented in separate diagrams for each country, but for plants burning fossil fuel also in a joint NORDEL statistics with data grouped according to type of fuel used. The grouping of units into classes of capacity has been made in accordance with the classification adopted by UNIPEDE/WEC. Values based on energy have been adopted as basic availability data. The same applies to the preference made in the definitions outlined by UNIPEDE and UNIPEDE/WEC. Some data based on time have been included to make possible comparisons with certain international values and for further illustration of the performance. For values given in the report, the definitions in the NORDEL document ''Concepts of Availability for Thermal Power, September 1977'', have been applied. (author)
Availability statistics for thermal power plants
Denmark, Finland and Sweden have adopted almost the same methods of recording and calculation of availability data. For a number of years comparable availability and outage data for thermal power have been summarized and published in one report. The purpose of the report now presented for 1991 containing general statistical data is to produce basic information on existing kinds of thermal power in the countries concerned. With this information as a basis additional and more detailed information can be exchanged in direct contacts between bodies in the above mentioned countries according to forms established for that purpose. The report includes fossil steam power, nuclear power and gas turbines. The information is presented in separate diagrams for each country, but for plants burning fossil fuel also in a joint NORDEL statistics with data grouped according to type of fuel used. The grouping of units into classes of capacity has been made in accordance with the classification adopted by UNIPEDE/WEC. Values based on energy have been adopted as basic availability data. The same applies to the preference made in the definitions outlined by UNIPEDE and UNIPEDE/WEC. Some data based on time have been included to make possible comparisons with certain international values and for further illustration of the performance. For values given in the report, the definitions in the NORDEL document ''Concepts of Availability for Thermal Power'', September 1977, have been applied. (au)
Availability statistics for thermal power plants
Denmark, Finland and Sweden have adopted almost the same methods of recording and calculation of availability data. For a number of years comparable availability and outage data for thermal power have been summarized and published in one report. The purpose of the report now presented for 1988 containing general statistical data is to produce basic information on existing kinds of thermal power in the countries concerned. With this information as a basis additional and more detailed information can be exchanged in direct contacts between bodies in the above mentioned countries according to forms established for that purpose. The report includes fossil steam power, nuclear power and gas turbines. The information is presented in separate diagrams for each country, but for plants burning fossil fuel also in a joint NORDEL statistics with data grouped according to type of fuel used. The grouping of units into classes of capacity has been made in accordance with the classification adopted by UNIPEDE/WEC. Values based on energy have been adopted as basic availability data. The same applies to the preference made in the definitions outlined by UNIPEDE and UNIPEDE/WEC. Some data based on time have been included to make possible comparisons with certain international values and for further illustration of the performance. For values given in the report, the definitions in the NORDEL document ''Concepts of Availability for Thermal Power, September 1977'', have been applied. (author)
Electrical power systems are exposed to different types of power quality disturbance problems. Assessment of power quality is necessary for maintaining the accurate operation of sensitive equipment, especially in nuclear installations; it also ensures that unnecessary energy losses in a power system are kept to a minimum, which leads to greater profits. With advances in technology and the growth of industrial and commercial facilities in many regions, power quality problems have been a major concern among engineers, particularly in industrial environments with much large-scale equipment. Thus, it is useful to investigate and mitigate power quality problems. Assessment of power quality requires the identification of any anomalous behaviour on a power system which adversely affects the normal operation of electrical or electronic equipment. The choice of monitoring equipment in a survey is also important to ascertain a solution to these power quality problems. A power quality assessment involves gathering data resources; analyzing the data (with reference to power quality standards); and then, if problems exist, recommending mitigation techniques. The main objective of the present work is to investigate and mitigate power quality problems in nuclear installations. Normally, electrical power is supplied to the installations via two sources to maintain good reliability, each designed to carry the full load. The assessment of power quality was performed at the nuclear installations for both sources at different operating conditions. The thesis begins with a discussion of power quality definitions and the results of previous studies in power quality monitoring. The assessment determined that one source of electricity had relatively good power quality, although there were several disturbances which exceeded the thresholds, among them the fifth harmonic, voltage swell, overvoltage and flicker. While the second
Availability statistics for thermal power plants
Denmark, Finland and Sweden have adopted almost the same methods of recording and calculation of availability data. For a number of years comparable availability and outage data for thermal power have been summarized and published in one report. The purpose of the report now presented for 1990 containing general statistical data is to produce basic information on existing kinds of thermal power in the countries concerned. With this information as a basis additional and more detailed information can be exchanged in direct contacts between bodies in the above mentioned countries according to forms established for that purpose. The report includes fossil steam power, nuclear power and gas turbines. The information is presented in separate diagrams for each country, but for plants burning fossil fuel also in a joint NORDEL statistics with data grouped according to type of fuel used. The grouping of units into classes of capacity has been made in accordance with the classification adopted by UNIPEDE/WEC. Values based on energy have been adopted as basic availability data. The same applies to the preference made in the definitions outlined by UNIPEDE and UNIPEDE/WEC. Some data based on time have been included to make possible comparisons with certain international values and for further illustration of the performance. (au)
Power and environmental assessment
Cashmore, Matthew Asa; Richardson, Tim
2013-01-01
The significance of politics and power dynamics has long been recognised in environmental assessment (EA) research, but there has not been sustained attention to power, either theoretically or empirically. The aim of this special issue is to encourage the EA community to engage more consistently...
Cappon, Gregg D; Bowman, Christopher J; Hurtt, Mark E; Grantham, Lonnie E
2012-10-01
An important aspect of the enhanced pre- and postnatal developmental (ePPND) toxicity study in nonhuman primates (NHP) is that it combines in utero and postnatal assessments in a single study. However, it is unclear if NHP ePPND studies are suitable to perform all of the evaluations incorporated into rodent PPND studies. To understand the value of including cognitive assessment in a NHP ePPND toxicity study, we performed a power analysis of object discrimination reversal task data using a modified Wisconsin General Testing Apparatus (ODR-WGTA) from two NHP ePPND studies. ODR-WGTA endpoints evaluated were days to learning and to first reversal, and number of reversals. With α = 0.05 and a one-sided t-test, a sample of seven provided 80% power to predict a 100% increase in all three of the ODR-WGTA endpoints; a sample of 25 provided 80% power to predict a 50% increase. Similar power analyses were performed with data from the Cincinnati Water Maze (CWM) and passive avoidance tests from three rat PPND toxicity studies. Groups of 5 and 15 in the CWM and passive avoidance test, respectively, provided 80% power to detect a 100% change. While the power of the CWM is not far superior to the NHP ODR-WGTA, a clear advantage is the routine use of larger sample size, with a group of 20 rats the CWM provides ~90% power to detect a 50% change. Due to the limitations on the number of animals, the ODR-WGTA may not be suitable for assessing cognitive impairment in NHP ePPND studies. PMID:22930561
Availability statistics for thermal power plants 1992
Denmark, Finland and Sweden have adopted almost the same methods of recording and calculation of availability data. For a number of years comparable availability and outage data for thermal power have been summarized and published in one report. The purpose of the report is to produce basic information on existing kinds of thermal power in the countries concerned. With this information as a basis additional and more detailed information can be exchanged in direct contacts between bodies in the above mentioned countries according to forms established for that purpose. The report includes fossil steam power, nuclear power and gas turbines. The information is presented in separate diagrams for each country, but for plants burning fossil fuel also in a joint NORDEL statistics with data grouped according to type of fuel used. The grouping of units into classes of capacity has been made in accordance with the classification adopted by UNIPEDE/WEC. Values based on energy have been adopted as basic availability data. The same applies to the preference made in the definitions outlined by UNIPEDE and UNIPEDE/WEC. Some data based on time have been included to make possible comparisons with certain international values and for further illustration of the performance. For values given in the report, the definitions in the NORDEL document ''Concepts of Availability for Thermal Power'', September 1977, have been applied. (au)
Statistical Performances of Resistive Active Power Splitter
Lalléchère, Sébastien; Ravelo, Blaise; Thakur, Atul
2016-03-01
In this paper, the synthesis and sensitivity analysis of an active power splitter (PWS) is proposed. It is based on an active cell composed of a field-effect transistor in cascade with a shunted resistor at the input and the output (resistive amplifier topology). The PWS uncertainty versus resistance tolerances is assessed using a stochastic method. Furthermore, with the proposed topology, the device gain can easily be controlled by varying a resistance. This provides a useful tool to analyse the statistical sensitivity of the system in an uncertain environment.
Evaluating and Reporting Statistical Power in Counseling Research
Balkin, Richard S.; Sheperis, Carl J.
2011-01-01
Despite recommendations from the "Publication Manual of the American Psychological Association" (6th ed.) to include information on statistical power when publishing quantitative results, authors seldom include analysis or discussion of statistical power. The rationale for discussing statistical power is addressed, approaches to using "G*Power" to…
Practical Uses of Statistical Power in Business Research Studies.
Markowski, Edward P.; Markowski, Carol A.
1999-01-01
Proposes the use of statistical power subsequent to the results of hypothesis testing in business research. Describes how posttest use of power might be integrated into business statistics courses. (SK)
Statistical aspects of fish stock assessment
Berg, Casper Willestofte
Fish stock assessments are conducted for two main purposes: 1) to estimate past and present fish abundances and their commercial exploitation rates, and 2) to predict the consequences of different management strategies in order to ensure a sustainable fishery in the future. This thesis concerns statistical aspects of fish stock assessment, which includes topics such as time series analysis, generalized additive models (GAMs), and non-linear state-space/mixed models capable of handling missing data and a high number of latent states and parameters. The aim is to improve the existing methods for stock assessment by application of state-of-the-art statistical methodology. The main contributions are presented in the form of six research papers. The major part of the thesis deals with age-structured assessment models, which is the most common approach. Conversion from length to age distributions...
PRIS-STATISTICS: Power Reactor Information System Statistical Reports. User's Manual
The IAEA developed the Power Reactor Information System (PRIS)-Statistics application to assist PRIS end users with generating statistical reports from PRIS data. Statistical reports provide an overview of the status, specification and performance results of every nuclear power reactor in the world. This user's manual was prepared to facilitate the use of the PRIS-Statistics application and to provide guidelines and detailed information for each report in the application. Statistical reports support analyses of nuclear power development and strategies, and the evaluation of nuclear power plant performance. The PRIS database can be used for comprehensive trend analyses and benchmarking against best performers and industrial standards.
When Mathematics and Statistics Collide in Assessment Tasks
Bargagliotti, Anna; Groth, Randall
2016-01-01
Because the disciplines of mathematics and statistics are naturally intertwined, designing assessment questions that disentangle mathematical and statistical reasoning can be challenging. We explore the writing of statistics assessment tasks that take into consideration the mathematical reasoning they may inadvertently activate.
Statistical methods for assessment of blend homogeneity
Madsen, Camilla
2002-01-01
In this thesis the use of various statistical methods to address some of the problems related to assessment of the homogeneity of powder blends in tablet production is discussed. It is not straight forward to assess the homogeneity of a powder blend. The reason is partly that in bulk materials as...... shown how to set up parametric acceptance criteria for the batch that gives a high confidence that future samples with a probability larger than a specified value will pass the USP threeclass criteria. Properties and robustness of proposed changes to the USP test for content uniformity are investigated...
The Role of Atmospheric Measurements in Wind Power Statistical Models
Wharton, S.; Bulaevskaya, V.; Irons, Z.; Newman, J. F.; Clifton, A.
2015-12-01
The simplest wind power generation curves model power only as a function of the wind speed at turbine hub-height. While the latter is an essential predictor of power output, it is widely accepted that wind speed information in other parts of the vertical profile, as well as additional atmospheric variables including atmospheric stability, wind veer, and hub-height turbulence are also important factors. The goal of this work is to determine the gain in predictive ability afforded by adding additional atmospheric measurements to the power prediction model. In particular, we are interested in quantifying any gain in predictive ability afforded by measurements taken from a laser detection and ranging (lidar) instrument, as lidar provides high spatial and temporal resolution measurements of wind speed and direction at 10 or more levels throughout the rotor-disk and at heights well above. Co-located lidar and meteorological tower data as well as SCADA power data from a wind farm in Northern Oklahoma will be used to train a set of statistical models. In practice, most wind farms continue to rely on atmospheric measurements taken from less expensive, in situ instruments mounted on meteorological towers to assess turbine power response to a changing atmospheric environment. Here, we compare a large suite of atmospheric variables derived from tower measurements to those taken from lidar to determine if remote sensing devices add any competitive advantage over tower measurements alone to predict turbine power response.
Asking Sensitive Questions: A Statistical Power Analysis of Randomized Response Models
Ulrich, Rolf; Schroter, Hannes; Striegel, Heiko; Simon, Perikles
2012-01-01
This article derives the power curves for a Wald test that can be applied to randomized response models when small prevalence rates must be assessed (e.g., detecting doping behavior among elite athletes). These curves enable the assessment of the statistical power that is associated with each model (e.g., Warner's model, crosswise model, unrelated…
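The shape of such a power curve can be sketched for Warner's model under a normal approximation to the Wald statistic. The variance formula is the standard one for Warner's design; the randomization probability and prevalence values are illustrative, not taken from the article:

```python
from math import erf, sqrt

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def warner_var(pi, p, n):
    """Sampling variance of the prevalence estimator in Warner's model:
    P(yes) = lam = p*pi + (1-p)*(1-pi), Var(pi_hat) = lam(1-lam)/(n(2p-1)^2)."""
    lam = p * pi + (1.0 - p) * (1.0 - pi)
    return lam * (1.0 - lam) / (n * (2.0 * p - 1.0) ** 2)

def wald_power(pi0, pi1, p, n, alpha=0.05):
    """Approximate power of a one-sided Wald test of H0: pi = pi0
    against an alternative prevalence pi1 > pi0."""
    z_alpha = 1.6448536269514722          # 95th percentile of N(0,1)
    se0 = sqrt(warner_var(pi0, p, n))
    se1 = sqrt(warner_var(pi1, p, n))
    return phi((pi1 - pi0 - z_alpha * se0) / se1)

# Detecting a doping prevalence of 10% against H0: pi = 0.05,
# with randomization probability p = 0.7:
for n in (500, 2000, 8000):
    print(n, round(wald_power(0.05, 0.10, 0.7, n), 3))
```

The randomization device inflates the estimator's variance by the factor 1/(2p-1)^2, which is why very large samples are needed for small prevalence rates.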
The Power and Robustness of Maximum LOD Score Statistics
YOO, Y. J.; MENDELL, N.R.
2008-01-01
The maximum LOD score statistic is extremely powerful for gene mapping when calculated using the correct genetic parameter value. When the mode of genetic transmission is unknown, the maximum of the LOD scores obtained using several genetic parameter values is reported. This latter statistic requires a higher critical value than the maximum LOD score statistic calculated from a single genetic parameter value.
Assessment Methods in Statistical Education An International Perspective
Bidgood, Penelope; Jolliffe, Flavia
2010-01-01
This book is a collaboration by leading figures in statistical education and is designed primarily for academic audiences involved in teaching statistics and mathematics. The book is divided into four sections: (1) assessment using real-world problems, (2) assessment of statistical thinking, (3) individual assessment, and (4) successful assessment strategies.
Editor's note: The uncorrupted statistical power
Jean Descôteaux
2007-09-01
Full Text Available In 1999, Wilkinson and the Task Force on Statistical Inference published a number of recommendations concerning testing-related issues, most importantly statistical power. These recommendations are discussed prior to the presentation of the structure and the various articles of this special issue on statistical power. The contents of these articles will most certainly prove quite useful to those wishing to follow the Task Force's recommendations.
Data management and statistical analysis for environmental assessment
Data management and statistical analysis for environmental assessment are important issues at the interface of computer science and statistics. Data collection for environmental decision making can generate large quantities of various types of data. A database/GIS system is described that provides efficient data storage as well as visualization tools that may be integrated into the data analysis process. FIMAD is a living database and GIS system. The system has changed and developed over time to meet the needs of the Los Alamos National Laboratory Restoration Program. The system provides a repository for data which may be accessed by different individuals for different purposes. The database structure is driven by the large amount and varied types of data required for environmental assessment. The integration of the database with the GIS system provides the foundation for powerful visualization and analysis capabilities.
Statistical tests for power-law cross-correlated processes.
Podobnik, Boris; Jiang, Zhi-Qiang; Zhou, Wei-Xing; Stanley, H Eugene
2011-12-01
For stationary time series, the cross-covariance and the cross-correlation as functions of time lag n serve to quantify the similarity of two time series. The latter measure is also used to assess whether the cross-correlations are statistically significant. For nonstationary time series, the analogous measures are detrended cross-correlation analysis (DCCA) and the recently proposed detrended cross-correlation coefficient, ρ(DCCA)(T,n), where T is the total length of the time series and n the window size. For ρ(DCCA)(T,n), we numerically verified the Cauchy inequality -1 ≤ ρ(DCCA)(T,n) ≤ 1. Here we derive -1 ≤ ρ(DCCA)(T,n) ≤ 1 for a standard variance-covariance approach and for a detrending approach. For overlapping windows, we find the range of ρ(DCCA) within which the cross-correlations become statistically significant. For overlapping windows we numerically determine, and for nonoverlapping windows we derive, that the standard deviation of ρ(DCCA)(T,n) tends with increasing T to 1/T. Using ρ(DCCA)(T,n) we show that the Chinese financial market's tendency to follow the U.S. market is extremely weak. We also propose an additional statistical test that can be used to quantify the existence of cross-correlations between two power-law correlated time series. PMID:22304166
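A minimal numpy sketch of the ρDCCA coefficient, using nonoverlapping windows and local linear detrending. Normalization details vary across implementations, so this is an assumption-laden illustration, not the authors' code:

```python
import numpy as np

def rho_dcca(x, y, n):
    """Detrended cross-correlation coefficient rho_DCCA(T, n): detrended
    covariance of the integrated series, normalized by the two DFA
    fluctuation functions."""
    X = np.cumsum(x - np.mean(x))   # integrated profiles
    Y = np.cumsum(y - np.mean(y))
    T = len(X)
    t = np.arange(n)
    covs, var_x, var_y = [], [], []
    for start in range(0, T - n + 1, n):       # nonoverlapping windows
        xs, ys = X[start:start + n], Y[start:start + n]
        # local linear detrending in each window
        rx = xs - np.polyval(np.polyfit(t, xs, 1), t)
        ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
        covs.append(np.mean(rx * ry))
        var_x.append(np.mean(rx * rx))
        var_y.append(np.mean(ry * ry))
    return np.mean(covs) / (np.sqrt(np.mean(var_x)) * np.sqrt(np.mean(var_y)))

rng = np.random.default_rng(1)
a = rng.normal(size=1000)
b = 0.5 * a + rng.normal(size=1000)   # partially shared innovations
print(rho_dcca(a, a, 50))             # identical series give exactly 1.0
print(rho_dcca(a, b, 50))             # positive, strictly inside [-1, 1]
```

The boundedness the paper derives shows up directly here: the coefficient is a normalized covariance, so it cannot leave [-1, 1].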
Previous European guidance for environmental risk assessment of genetically-modified plants emphasized the concepts of statistical power but provided no explicit requirements for the provision of statistical power analyses. Similarly, whilst the need for good experimental designs was stressed, no m...
IAEA releases nuclear power statistics for 2002
Full text: A total of 441 nuclear power plants were operating around the world at the end of 2002, according to data reported to the IAEA's Power Reactor Information System (PRIS). World nuclear electricity generation was about 2574 TWh. Also during 2002, six nuclear power plants representing 5013 MW(e) were connected to the grid, four in China, one in the Czech Republic and one in the Republic of Korea. In addition, construction of seven new nuclear reactors commenced in 2002 - six in India and one in the Democratic People's Republic of Korea, bringing the total number of nuclear reactors reported as being under construction to 32. Four nuclear reactors were shut down in 2002, two in Bulgaria and two in the United Kingdom. The ten countries with the highest reliance on nuclear power in 2002 were: Lithuania, 80.1 per cent; France, 78 per cent; Slovakia, 65.4 per cent; Belgium, 57.3 per cent; Bulgaria, 47.3 per cent; Ukraine, 45.7 per cent; Sweden, 45.7 per cent; Slovenia, 40.7 per cent; Armenia, 40.5 per cent; Switzerland, 39.5 per cent. During 2002, six new nuclear power plants were connected to the electricity grid: Qinshan 2-1, a 610 MW(e) pressurized water reactor (PWR) in China; Qinshan 3-1, a 655 MW(e) pressurized heavy water reactor (PHWR) in China; Lingao 1, a 938 MW(e) PWR in China; Lingao 2, a 938 MW(e) PWR in China; Temelin 2, a 912 MW(e) water-cooled and water-moderated reactor (WWER) in the Czech Republic; Yonggwang 6, a 950 MW(e) PWR in the Republic of Korea. Also, in 2002 construction started on seven plants: Kaiga 3, a 202 MW(e) PHWR in India; Kaiga 4, a 202 MW(e) PHWR in India; Rajasthan 5, a 202 MW(e) PHWR in India; Rajasthan 6, a 202 MW(e) PHWR in India; Kudankulam 1, a 905 MW(e) WWER in India; Kudankulam 2, a 905 MW(e) WWER in India; LWR - Project Unit 1, a 1040 MW(e) PWR in the Dem. P. R. of Korea. A table showing nuclear power reactors in operation and under construction at 31 Dec. 2002 is available. (IAEA)
Chan, Shiau Wei; Ismail, Zaleha
2014-01-01
The focus of assessment in statistics has gradually shifted from traditional assessment towards alternative assessment where more attention has been paid to the core statistical concepts such as center, variability, and distribution. In spite of this, there are comparatively few assessments that combine the significant three types of statistical…
Statistical modelling of mitochondrial power supply.
James, A T; Wiskich, J T; Conyers, R A
1989-01-01
By experiment and theory, formulae are derived to calculate the response of mitochondrial power supply, in flux and potential, to an ATP-consuming enzyme load, incorporating effects of varying amounts of (i) enzyme, (ii) total circulating adenylate, and (iii) inhibition of the ATP/ADP translocase. The formulae, which apply between about 20% and 80% of maximum respiration, are the same as for the current and voltage of an electrical circuit in which a battery with potential linear in the logarithm of the total adenylate charges another battery, whose opposing potential is also linear in the same logarithm, through three resistances. These resistances produce loss of potential due to disequilibrium of (i) intramitochondrial oxidative phosphorylation, (ii) the ATP/ADP translocase, and (iii) the ATP-consuming enzyme load. The model is represented geometrically by the following configuration: when potential is plotted against flux, the points lie on two pencils of lines, each concurrent at zero respiration, the two pencils describing the respective characteristics of the mitochondrion and the enzyme. Control coefficients and elasticities are calculated from the formulae. PMID:2708917
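A hypothetical numeric rendering of the battery-and-resistor analogy described above. Every constant below is invented for illustration; the paper derives its coefficients from experiment:

```python
import math

def supply_curve(total_adenylate, load_resistance,
                 e0_src=2.0, k_src=0.5, e0_load=0.5, k_load=0.3,
                 r_oxphos=0.4, r_translocase=0.3):
    """Flux ("current") and delivered potential for the two-battery,
    three-resistance analogy. All parameter values are hypothetical."""
    ln_a = math.log(total_adenylate)
    e_src = e0_src + k_src * ln_a      # source battery, linear in ln(adenylate)
    e_load = e0_load + k_load * ln_a   # opposing load battery
    r_total = r_oxphos + r_translocase + load_resistance
    flux = (e_src - e_load) / r_total
    # potential seen by the load, after losses in the first two resistances
    potential = e_src - flux * (r_oxphos + r_translocase)
    return flux, potential

flux_lo, pot_lo = supply_curve(5.0, load_resistance=1.0)
flux_hi, pot_hi = supply_curve(5.0, load_resistance=2.0)
print(flux_lo, pot_lo)
print(flux_hi, pot_hi)
```

Increasing the load resistance (analogous to less enzyme, or translocase inhibition shifted into the load) lowers the flux and raises the delivered potential, tracing out the linear potential-versus-flux characteristic the abstract describes.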
Using Tree Diagrams as an Assessment Tool in Statistics Education
Yin, Yue
2012-01-01
This study examines the potential of the tree diagram, a type of graphic organizer, as an assessment tool to measure students' knowledge structures in statistics education. Students' knowledge structures in statistics have not been sufficiently assessed in statistics, despite their importance. This article first presents the rationale and method…
WHAT IS THE MAJOR POWER LINKING STATISTICS & DATA MINING?
M.E. Abd El-Monsef
2013-11-01
Full Text Available In recent years, numerous scientific research studies standing at the intersection of statistics and data mining (DM) have appeared [17, 18, 19, 24, 27, 30, 35]. This paper is devoted to answering the question posed in the title on the basis of five reply trends. The 1st trend is based on an updated historical vision for each of statistics and DM. The 2nd trend is concerned with a modern theoretical significant reply between statistics and DM. The major power linking statistics and DM is established in the 3rd trend. The 4th trend represents a significant comparison between statistics and DM. A conceptual classification of the Statistical Data Mining (SDM) process in Egypt is presented in the 5th reply trend. Finally, the conclusion and future work are presented.
Replication unreliability in psychology: elusive phenomena or elusive statistical power?
Patrizio E Tressoldi
2012-07-01
Full Text Available The focus of this paper is to analyse whether the unreliability of results related to certain controversial psychological phenomena may be a consequence of their low statistical power. Under Null Hypothesis Significance Testing (NHST), still the most widely used statistical approach, unreliability derives from the failure to refute the null hypothesis, in particular when exact or quasi-exact replications of experiments are carried out. Taking as examples the results of meta-analyses related to four controversial phenomena, subliminal semantic priming, incubation effect for problem solving, unconscious thought theory, and non-local perception, it was found that, except for semantic priming on categorization, the statistical power to detect the expected effect size of the typical study is low or very low. The low power in most studies undermines the use of NHST to study phenomena with moderate or low effect sizes. We conclude by providing some suggestions on how to increase statistical power or use different statistical approaches to help discriminate whether the results obtained may or may not be used to support or refute the reality of a phenomenon with a small effect size.
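The kind of power calculation this argument rests on can be sketched with a normal approximation for a two-sample comparison (the approximation slightly overstates power relative to the exact noncentral-t calculation):

```python
from math import erf, sqrt

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def power_two_sample(d, n_per_group, alpha=0.05):
    """Approximate power of a two-sided two-sample test (normal
    approximation) for standardized effect size d with n per group."""
    z_crit = 1.959963984540054            # 97.5th percentile of N(0,1)
    ncp = d * sqrt(n_per_group / 2.0)     # noncentrality under H1
    return phi(ncp - z_crit) + phi(-ncp - z_crit)

# A "small" effect (d = 0.2) with a typical n = 30 per group yields
# very low power, while roughly 394 per group is needed for ~0.80:
print(round(power_two_sample(0.2, 30), 3))
print(round(power_two_sample(0.2, 394), 3))
```

This is exactly the failure mode the paper describes: with small effect sizes and typical sample sizes, a non-significant replication is the expected outcome under NHST even when the effect is real.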
Statistical method for scientific projects risk assessment
Бедрій, Дмитро Іванович
2013-01-01
This article discusses the use of statistical methods for risk evaluation of the activity of scientific institutions in the public sector of the Ukrainian economy during the planning and execution of scientific projects; some of the results of our research in this area are presented. The main objective of the study is to determine the possibility of using the statistical method in the process of evaluating the risks of research projects. The use of risk evaluation methods allows the manag...
New Dynamical-Statistical Techniques for Wind Power Prediction
Stathopoulos, C.; Kaperoni, A.; Galanis, G.; Kallos, G.
2012-04-01
The increased use of renewable energy sources, and especially of wind power, has revealed the significance of accurate environmental and wind power predictions over wind farms, which critically affect the integration of the produced power into the general grid. This issue is studied in the present paper by means of high-resolution physical and statistical models. Two numerical weather prediction (NWP) systems, namely SKIRON and RAMS, are used to simulate the flow characteristics in selected wind farms in Greece. The NWP model output is post-processed by utilizing Kalman and Kolmogorov statistics in order to remove systematic errors. Modeled wind predictions in combination with available on-site observations are used for estimation of the wind power potential by utilizing a variety of statistical power prediction models based on non-linear and hyperbolic functions. The obtained results reveal the strong dependence of the forecast uncertainty on the wind variation, the limited influence of previously recorded power values and the advantages that nonlinear, non-polynomial functions could have in the successful control of power curve characteristics. This methodology is developed in the framework of the FP7 projects WAUDIT and MARINA PLATFORM.
Statistical analysis of power ramp PCI test data
Data from power ramp tests of reference standard fuel rods and PCI resistant fuel designs were analyzed statistically using the STATPAC computer program. Effects of design variations in the reference fuel are described. The significantly improved performance of zirconium liner fuel over copper barrier fuel and reference fuel is also shown. (author)
Multivariate statistical assessment of coal properties
Klika, Z.; Serenčíšová, J.; Kožušníková, Alena; Kolomazník, I.; Študentová, S.; Vontorová, J.
2014-01-01
Vol. 128, No. 128 (2014), pp. 119-127. ISSN 0378-3820 R&D Projects: GA MŠk ED2.1.00/03.0082 Institutional support: RVO:68145535 Keywords: coal properties * structural, chemical and petrographical properties * multivariate statistics Subject RIV: DH - Mining, incl. Coal Mining Impact factor: 3.352, year: 2014 http://dx.doi.org/10.1016/j.fuproc.2014.06.029
Power Curve Modeling in Complex Terrain Using Statistical Models
Bulaevskaya, V.; Wharton, S.; Clifton, A.; Qualley, G.; Miller, W.
2014-12-01
Traditional power output curves typically model power only as a function of the wind speed at the turbine hub height. While the latter is an essential predictor of power output, wind speed information in other parts of the vertical profile, as well as additional atmospheric variables, are also important determinants of power. The goal of this work was to determine the gain in predictive ability afforded by adding wind speed information at other heights, as well as other atmospheric variables, to the power prediction model. Using data from a wind farm with a moderately complex terrain in the Altamont Pass region in California, we trained three statistical models, a neural network, a random forest and a Gaussian process model, to predict power output from various sets of aforementioned predictors. The comparison of these predictions to the observed power data revealed that considerable improvements in prediction accuracy can be achieved both through the addition of predictors other than the hub-height wind speed and the use of statistical models. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under contract DE-AC52-07NA27344 and was funded by Wind Uncertainty Quantification Laboratory Directed Research and Development Project at LLNL under project tracking code 12-ERD-069.
Robust Statistical Detection of Power-Law Cross-Correlation
Blythe, Duncan A. J.; Nikulin, Vadim V.; Müller, Klaus-Robert
2016-06-01
We show that widely used approaches in statistical physics incorrectly indicate the existence of power-law cross-correlations between financial stock market fluctuations measured over several years and the neuronal activity of the human brain lasting for only a few minutes. While such cross-correlations are nonsensical, no current methodology allows them to be reliably discarded, leaving researchers at greater risk when the spurious nature of cross-correlations is not clear from the unrelated origin of the time series and rather requires careful statistical estimation. Here we propose a theory and method (PLCC-test) which allows us to rigorously and robustly test for power-law cross-correlations, correctly detecting genuine and discarding spurious cross-correlations, thus establishing meaningful relationships between processes in complex physical systems. Our method reveals for the first time the presence of power-law cross-correlations between amplitudes of the alpha and beta frequency ranges of the human electroencephalogram.
Power laws statistics of cliff failures, scaling and percolation
Baldassarri, Andrea
2014-01-01
The size of large cliff failures may be described in several ways, for instance by considering the horizontal eroded area at the cliff top and the maximum local retreat of the coastline. Field studies suggest that, for large failures, the frequencies of these two quantities decrease as power laws of the respective magnitudes, defining two different decay exponents. Moreover, the horizontal area increases as a power law of the maximum local retreat, identifying a third exponent. This observation suggests that the geometry of cliff failures is statistically similar across different magnitudes. Power laws are familiar in the physics of critical systems. The corresponding exponents satisfy precise relations and are proven to be universal features, common to very different systems. Following the approach typical of statistical physics, we propose a "scaling hypothesis" resulting in a relation between the three above exponents: there is a precise, mathematical relation between the distributions of magnitudes of erosion ...
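The kind of exponent relation implied by such a scaling hypothesis can be sketched as follows. Writing P(R) ~ R^(-beta) for the retreat distribution, A ~ R^gamma for the area-retreat relation, and P(A) ~ A^(-alpha) for the area distribution (the labels alpha, beta, gamma are ours, not necessarily the paper's), conservation of probability under the change of variables gives:

```latex
P(A)\,dA = P(R)\,dR
\;\Longrightarrow\;
P(A) \sim R^{-\beta}\,\frac{dR}{dA} \sim R^{-\beta+1-\gamma}
     = A^{-(\beta-1+\gamma)/\gamma},
\qquad\text{so}\qquad
\alpha \;=\; 1 + \frac{\beta-1}{\gamma}.
```

Under this sketch, measuring any two of the three exponents fixes the third, which is one way such a scaling hypothesis can be tested against field data.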
Assessing photographer competence using face statistics
Greig, Darryl; Gao, Yuli
2010-02-01
The rapid growth of photo sharing websites has resulted in some new problems around the management of a large (and quickly increasing) number of photographers with different needs and usage characteristics. Despite significant advances in the field of computer vision, little has been done to leverage these technologies for photographer understanding and management, partly due to the high computational cost of extracting application-specific image features. Recently, robust multi-view face detection technologies have been widely adopted by many photo sharing sites. This affords a limited but "standard" pre-computed set of face features with which to tackle these administrative problems in large-scale settings. In this paper we present a principled statistical model to alleviate one such administrative task - the automatic analysis of photographer competency given only face detection results on a set of their photos. The model uses summary statistics to estimate the probability that a given individual belongs to a population of high-competence photographers versus a second population of lower-competence photographers. Using this model, we have achieved high classification accuracy (respectively 84.3% and 90.9%) on two large image datasets. We discuss an application of this approach to assist in managing a photo-sharing website.
Effect size, confidence intervals and statistical power in psychological research.
Téllez A.
2015-07-01
Full Text Available Quantitative psychological research is focused on detecting the occurrence of certain population phenomena by analyzing data from a sample, and statistics is a particularly helpful mathematical tool that is used by researchers to evaluate hypotheses and make decisions to accept or reject such hypotheses. In this paper, the various statistical tools in psychological research are reviewed. The limitations of null hypothesis significance testing (NHST and the advantages of using effect size and its respective confidence intervals are explained, as the latter two measurements can provide important information about the results of a study. These measurements also can facilitate data interpretation and easily detect trivial effects, enabling researchers to make decisions in a more clinically relevant fashion. Moreover, it is recommended to establish an appropriate sample size by calculating the optimum statistical power at the moment that the research is designed. Psychological journal editors are encouraged to follow APA recommendations strictly and ask authors of original research studies to report the effect size, its confidence intervals, statistical power and, when required, any measure of clinical significance. Additionally, we must account for the teaching of statistics at the graduate level. At that level, students do not receive sufficient information concerning the importance of using different types of effect sizes and their confidence intervals according to the different types of research designs; instead, most of the information is focused on the various tools of NHST.
"Using Power Tables to Compute Statistical Power in Multilevel Experimental Designs"
Konstantopoulos, Spyros
2009-01-01
Power computations for one-level experimental designs that assume simple random samples are greatly facilitated by power tables such as those presented in Cohen's book about statistical power analysis. However, in education and the social sciences experimental designs have naturally nested structures and multilevel models are needed to compute the…
Statistical reliability assessment of software-based systems
Plant vendors nowadays propose software-based systems even for the most critical safety functions. The reliability estimation of safety-critical software-based systems is difficult since the conventional modeling techniques do not necessarily apply to the analysis of these systems, and the quantification seems to be impossible. Due to lack of operational experience and due to the nature of software faults, the conventional reliability estimation methods cannot be applied. New methods are therefore needed for the safety assessment of software-based systems. In the research project Programmable automation systems in nuclear power plants (OHA), financed jointly by the Finnish Centre for Radiation and Nuclear Safety (STUK), the Ministry of Trade and Industry and the Technical Research Centre of Finland (VTT), various safety assessment methods and tools for software-based systems are developed and evaluated. This volume in the OHA report series deals with the statistical reliability assessment of software-based systems on the basis of dynamic test results and qualitative evidence from the system design process. Other reports to be published later in the OHA report series will handle the diversity requirements in safety-critical software-based systems, generation of test data from operational profiles and handling of programmable automation in plant PSA studies. (orig.) (25 refs.)
Statistical quality assessment of a fingerprint
Hwang, Kyungtae
2004-08-01
The quality of a fingerprint is essential to the performance of AFIS (Automatic Fingerprint Identification System). Such quality may be classified by the clarity and regularity of ridge-valley structures [1,2]. One may calculate the thickness of ridges and valleys to measure clarity and regularity. However, calculating a thickness is not feasible in a poor-quality image, especially in severely damaged images that contain broken ridges (or valleys). In order to overcome this difficulty, the proposed approach employs statistical properties in a local block, namely the mean and spread of the thickness of both ridge and valley. The mean value is used for determining whether a fingerprint is wet or dry. For example, black pixels are dominant if a fingerprint is wet, so the average thickness of the ridges is larger than that of the valleys, and vice versa for a dry fingerprint. In addition, a standard deviation is used for determining the severity of damage. In this study, the quality is divided into three categories based on the two statistical properties mentioned above: wet, good, and dry. The number of low-quality blocks is used to measure the global quality of a fingerprint. In addition, the distribution of poor blocks is also measured using Euclidean distances between groups of poor blocks. With this scheme, locally condensed poor blocks decrease the overall quality of an image. Experimental results on fingerprint images captured by optical devices as well as by a rolling method show that the wet and dry parts of images were successfully captured. Enhancing an image by employing morphology techniques that modify the detected poor-quality blocks is illustrated in section 3. However, more work needs to be done on designing a scheme to incorporate the number of poor blocks and their distributions into a global quality measure.
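The ridge/valley thickness statistics can be sketched as run-length statistics over a binarized block. The thresholds and the toy block below are illustrative assumptions, not the paper's calibrated values:

```python
import numpy as np

def run_length_stats(block):
    """Mean and standard deviation of ridge (1) and valley (0) run lengths
    along the rows of a binary block, a stand-in for 'thickness'."""
    runs = {0: [], 1: []}
    for row in block:
        edges = np.flatnonzero(np.diff(row)) + 1   # value-change positions
        starts = np.r_[0, edges]
        ends = np.r_[edges, len(row)]
        for s, e in zip(starts, ends):
            runs[int(row[s])].append(e - s)
    return {v: (float(np.mean(r)), float(np.std(r)))
            for v, r in runs.items() if r}

def classify(block):
    """'wet' if ridges are much thicker than valleys, 'dry' if much
    thinner, else 'good' (the 1.5 ratio is an arbitrary threshold)."""
    stats = run_length_stats(block)
    ridge_mean = stats.get(1, (0.0, 0.0))[0]
    valley_mean = stats.get(0, (0.0, 0.0))[0]
    if ridge_mean > 1.5 * valley_mean:
        return "wet"
    if valley_mean > 1.5 * ridge_mean:
        return "dry"
    return "good"

# A toy block with fat ridges (wet-looking):
wet = np.array([[1, 1, 1, 0, 1, 1, 1, 0]] * 4)
print(classify(wet))
```

Counting blocks classified as wet or dry, and measuring how clustered they are, would then give the global quality score described in the abstract.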
Statistical analyses support power law distributions found in neuronal avalanches.
Andreas Klaus
Full Text Available The size distribution of neuronal avalanches in cortical networks has been reported to follow a power law distribution with exponent close to -1.5, which is a reflection of long-range spatial correlations in spontaneous neuronal activity. However, identifying power law scaling in empirical data can be difficult and sometimes controversial. In the present study, we tested the power law hypothesis for neuronal avalanches by using more stringent statistical analyses. In particular, we performed the following steps: (i) analysis of finite-size scaling to identify scale-free dynamics in neuronal avalanches, (ii) model parameter estimation to determine the specific exponent of the power law, and (iii) comparison of the power law to alternative model distributions. Consistent with critical state dynamics, avalanche size distributions exhibited robust scaling behavior in which the maximum avalanche size was limited only by the spatial extent of sampling (a "finite size" effect). This scale-free dynamics suggests the power law as a model for the distribution of avalanche sizes. Using both the Kolmogorov-Smirnov statistic and a maximum likelihood approach, we found the slope to be close to -1.5, which is in line with previous reports. Finally, the power law model for neuronal avalanches was compared to the exponential and to various heavy-tail distributions based on the Kolmogorov-Smirnov distance and by using a log-likelihood ratio test. Both the power law distribution without and with exponential cut-off provided significantly better fits to the cluster size distributions in neuronal avalanches than the exponential, the lognormal and the gamma distribution. In summary, our findings strongly support the power law scaling in neuronal avalanches, providing further evidence for critical state dynamics in superficial layers of cortex.
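The maximum-likelihood and Kolmogorov-Smirnov machinery used in step (ii) can be sketched for a continuous power law (the exponent 2.5 and sample size here are illustrative, not the avalanche data):

```python
import numpy as np

def fit_power_law(x, x_min):
    """Continuous MLE for P(x) ~ x^(-alpha), x >= x_min, plus the
    Kolmogorov-Smirnov distance between empirical and fitted CDFs."""
    x = np.asarray(x, dtype=float)
    x = x[x >= x_min]
    n = len(x)
    alpha = 1.0 + n / np.sum(np.log(x / x_min))   # MLE exponent
    xs = np.sort(x)
    emp = np.arange(1, n + 1) / n                 # empirical CDF
    model = 1.0 - (xs / x_min) ** (1.0 - alpha)   # fitted CDF
    ks = float(np.max(np.abs(emp - model)))
    return alpha, ks

# Draw samples from a known power law via inverse-CDF sampling,
# then check that the MLE recovers the exponent.
rng = np.random.default_rng(2)
true_alpha, x_min = 2.5, 1.0
u = rng.uniform(size=20000)
samples = x_min * (1.0 - u) ** (-1.0 / (true_alpha - 1.0))
alpha_hat, ks = fit_power_law(samples, x_min)
print(round(alpha_hat, 2), round(ks, 4))
```

A full analysis in the paper's spirit would additionally compare this fit against exponential, lognormal, and gamma alternatives with a log-likelihood ratio test.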
Assessment of alternatives to correct inventory difference statistical treatment deficiencies
This document presents an analysis of alternatives to correct deficiencies in the statistical treatment of inventory differences in the NRC guidance documents and licensee practice. Pacific Northwest Laboratory's objective for this study was to assess alternatives developed by the NRC and a panel of safeguards statistical experts. Criteria were developed for the evaluation and the assessment was made considering the criteria. The results of this assessment are PNL recommendations, which are intended to provide NRC decision makers with a logical and statistically sound basis for correcting the deficiencies
Cross-Cultural Instrument Translation: Assessment, Translation, and Statistical Applications
Mason, Teresa Crowe
2005-01-01
This article has four major sections: (a) general issues of assessment; (b) assessment of ethnic-group members, including those who are deaf; (c) translation of assessment tools, emphasizing translation into American Sign Language (ASL); and (d) statistical applications for translated instruments. The purpose of the article is to provide insight…
Self-assessed performance improves statistical fusion of image labels
Bryan, Frederick W., E-mail: frederick.w.bryan@vanderbilt.edu; Xu, Zhoubing; Asman, Andrew J.; Allen, Wade M. [Electrical Engineering, Vanderbilt University, Nashville, Tennessee 37235 (United States); Reich, Daniel S. [Translational Neuroradiology Unit, National Institute of Neurological Disorders and Stroke, National Institutes of Health, Bethesda, Maryland 20892 (United States); Landman, Bennett A. [Electrical Engineering, Vanderbilt University, Nashville, Tennessee 37235 (United States); Biomedical Engineering, Vanderbilt University, Nashville, Tennessee 37235 (United States); and Radiology and Radiological Sciences, Vanderbilt University, Nashville, Tennessee 37235 (United States)
2014-03-15
Purpose: Expert manual labeling is the gold standard for image segmentation, but this process is difficult, time-consuming, and prone to inter-individual differences. While fully automated methods have successfully targeted many anatomies, automated methods have not yet been developed for numerous essential structures (e.g., the internal structure of the spinal cord as seen on magnetic resonance imaging). Collaborative labeling is a new paradigm that offers a robust alternative that may realize both the throughput of automation and the guidance of experts. Yet, distributing manual labeling expertise across individuals and sites introduces potential human factors concerns (e.g., training, software usability) and statistical considerations (e.g., fusion of information, assessment of confidence, bias) that must be further explored. During the labeling process, it is simple to ask raters to self-assess the confidence of their labels, but this is rarely done and has not been previously quantitatively studied. Herein, the authors explore the utility of self-assessment in relation to automated assessment of rater performance in the context of statistical fusion. Methods: The authors conducted a study of 66 volumes manually labeled by 75 minimally trained human raters recruited from the university undergraduate population. Raters were given 15 min of training during which they were shown examples of correct segmentation, and the online segmentation tool was demonstrated. The volumes were labeled 2D slice-wise, and the slices were unordered. A self-assessed quality metric was produced by raters for each slice by marking a confidence bar superimposed on the slice. Volumes produced by both voting and statistical fusion algorithms were compared against a set of expert segmentations of the same volumes. Results: Labels for 8825 distinct slices were obtained. Simple majority voting resulted in statistically poorer performance than voting weighted by self-assessed performance
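The contrast between simple majority voting and confidence-weighted voting can be sketched on synthetic labels. The data, the assumption that self-assessed confidence tracks rater skill, and the linear weighting are all simplifying stand-ins, not the study's fusion algorithm:

```python
import numpy as np

rng = np.random.default_rng(3)
n_slices, n_raters = 400, 7

truth = rng.integers(0, 2, n_slices)   # binary ground-truth label per slice

# Per-rater skill; better raters err less and (by assumption) also
# report higher self-assessed confidence.
skill = rng.uniform(0.55, 0.95, n_raters)
labels = np.where(rng.uniform(size=(n_raters, n_slices)) < skill[:, None],
                  truth, 1 - truth)
confidence = skill[:, None] + rng.normal(0.0, 0.05, (n_raters, n_slices))

# Simple majority vote across raters.
majority = (labels.mean(axis=0) > 0.5).astype(int)

# Confidence-weighted vote: each label counts in proportion to the
# confidence the rater reported for that slice.
score = (confidence * (2 * labels - 1)).sum(axis=0)
weighted = (score > 0).astype(int)

acc_majority = float((majority == truth).mean())
acc_weighted = float((weighted == truth).mean())
print(acc_majority, acc_weighted)
```

When self-assessment is informative about reliability, weighting by it down-weights poor raters, which is the effect the study quantifies against expert segmentations.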
Self-assessed performance improves statistical fusion of image labels
Purpose: Expert manual labeling is the gold standard for image segmentation, but this process is difficult, time-consuming, and prone to inter-individual differences. While fully automated methods have successfully targeted many anatomies, automated methods have not yet been developed for numerous essential structures (e.g., the internal structure of the spinal cord as seen on magnetic resonance imaging). Collaborative labeling is a new paradigm that offers a robust alternative that may realize both the throughput of automation and the guidance of experts. Yet, distributing manual labeling expertise across individuals and sites introduces potential human factors concerns (e.g., training, software usability) and statistical considerations (e.g., fusion of information, assessment of confidence, bias) that must be further explored. During the labeling process, it is simple to ask raters to self-assess the confidence of their labels, but this is rarely done and has not been previously quantitatively studied. Herein, the authors explore the utility of self-assessment in relation to automated assessment of rater performance in the context of statistical fusion. Methods: The authors conducted a study of 66 volumes manually labeled by 75 minimally trained human raters recruited from the university undergraduate population. Raters were given 15 min of training during which they were shown examples of correct segmentation, and the online segmentation tool was demonstrated. The volumes were labeled 2D slice-wise, and the slices were unordered. A self-assessed quality metric was produced by raters for each slice by marking a confidence bar superimposed on the slice. Volumes produced by both voting and statistical fusion algorithms were compared against a set of expert segmentations of the same volumes. Results: Labels for 8825 distinct slices were obtained. Simple majority voting resulted in statistically poorer performance than voting weighted by self-assessed performance
Enrichment of statistical power for genome-wide association studies
Li, Meng; Liu, Xiaolei; Bradbury, Peter; Yu, Jianming; Zhang, Yuan-Ming; Todhunter, Rory J.; Buckler, Edward S; Zhang, Zhiwu
2014-01-01
Background: The inheritance of most human diseases and agriculturally important traits is controlled by many genes with small effects. Identifying these genes, while simultaneously controlling false positives, is challenging. Among available statistical methods, the mixed linear model (MLM) has been the most flexible and powerful for controlling population structure and individual unequal relatedness (kinship), the two common causes of spurious associations. The introduction of the compressed ...
Quantitative statistical methods for image quality assessment.
Dutta, Joyita; Ahn, Sangtae; Li, Quanzheng
2013-01-01
Quantitative measures of image quality and reliability are critical for both qualitative interpretation and quantitative analysis of medical images. While, in theory, it is possible to analyze reconstructed images by means of Monte Carlo simulations using a large number of noise realizations, the associated computational burden makes this approach impractical. Additionally, this approach is less meaningful in clinical scenarios, where multiple noise realizations are generally unavailable. The practical alternative is to compute closed-form analytical expressions for image quality measures. The objective of this paper is to review statistical analysis techniques that enable us to compute two key metrics: resolution (determined from the local impulse response) and covariance. The underlying methods include fixed-point approaches, which compute these metrics at a fixed point (the unique and stable solution) independent of the iterative algorithm employed, and iteration-based approaches, which yield results that are dependent on the algorithm, initialization, and number of iterations. We also explore extensions of some of these methods to a range of special contexts, including dynamic and motion-compensated image reconstruction. While most of the discussed techniques were developed for emission tomography, the general methods are extensible to other imaging modalities as well. In addition to enabling image characterization, these analysis techniques allow us to control and enhance imaging system performance. We review practical applications where performance improvement is achieved by applying these ideas to the contexts of both hardware (optimizing scanner design) and image reconstruction (designing regularization functions that produce uniform resolution or maximize task-specific figures of merit). PMID:24312148
Quality Assessment and Improvement Methods in Statistics – what Works?
Hans Viggo Sæbø
2014-12-01
Several methods for quality assessment and assurance in statistics have been developed in a European context. Data Quality Assessment Methods (DatQAM) were considered in a Eurostat handbook in 2007. These methods comprise quality reports and indicators, measurement of process variables, user surveys, self-assessments, audits, labelling and certification. The entry point for the paper is the development of systematic quality work in European statistics with regard to good practices such as those described in the DatQAM handbook. Assessment is one issue; following up recommendations and implementing improvement actions is another. This leads to a discussion of the effect of approaches and tools: which work well, which have turned out to be more of a challenge, and why? Examples are mainly from Statistics Norway, but these are believed to be representative of several statistical institutes.
Development and testing of improved statistical wind power forecasting methods.
Mendes, J.; Bessa, R.J.; Keko, H.; Sumaili, J.; Miranda, V.; Ferreira, C.; Gama, J.; Botterud, A.; Zhou, Z.; Wang, J. (Decision and Information Sciences); (INESC Porto)
2011-12-06
Wind power forecasting (WPF) provides important inputs to power system operators and electricity market participants. It is therefore not surprising that WPF has attracted increasing interest within the electric power industry. In this report, we document our research on improving statistical WPF algorithms for point, uncertainty, and ramp forecasting. Below, we provide a brief introduction to the research presented in the following chapters. For a detailed overview of the state-of-the-art in wind power forecasting, we refer to [1]. Our related work on the application of WPF in operational decisions is documented in [2]. Point forecasts of wind power are highly dependent on the training criteria used in the statistical algorithms that are used to convert weather forecasts and observational data to a power forecast. In Chapter 2, we explore the application of information theoretic learning (ITL) as opposed to the classical minimum square error (MSE) criterion for point forecasting. In contrast to the MSE criterion, ITL criteria do not assume a Gaussian distribution of the forecasting errors. We investigate to what extent ITL criteria yield better results. In addition, we analyze time-adaptive training algorithms and how they enable WPF algorithms to cope with non-stationary data and, thus, to adapt to new situations without requiring additional offline training of the model. We test the new point forecasting algorithms on two wind farms located in the U.S. Midwest. Although there have been advancements in deterministic WPF, a single-valued forecast cannot provide information on the dispersion of observations around the predicted value. We argue that it is essential to generate, together with (or as an alternative to) point forecasts, a representation of the wind power uncertainty. Wind power uncertainty representation can take the form of probabilistic forecasts (e.g., probability density function, quantiles), risk indices (e.g., prediction risk index) or scenarios
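The contrast between the classical MSE criterion and an ITL-style criterion can be illustrated with a small sketch. This is not the report's algorithm: the linear wind-to-power model, the maximum-correntropy loss used here as the ITL stand-in, and all parameter values are assumptions for illustration only.

```python
import numpy as np

def fit_mse(X, y):
    """Ordinary least squares: minimizes mean squared error."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

def fit_mcc(X, y, sigma=0.5, lr=0.2, iters=3000):
    """Maximum correntropy criterion: maximize the mean Gaussian kernel of the
    errors by gradient ascent. Large (non-Gaussian) errors are exponentially
    downweighted, unlike under the MSE criterion."""
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        e = y - X @ w
        g = np.exp(-e**2 / (2 * sigma**2)) * e   # kernel-weighted errors
        w += lr * X.T @ g / len(y)
    return w

rng = np.random.default_rng(5)
wind = rng.uniform(0, 1, 300)                    # normalized wind speed
X = np.column_stack([np.ones(300), wind])
power = 0.2 + 0.6 * wind + rng.normal(0, 0.05, 300)
power[:30] += 2.0                                # gross non-Gaussian outliers

w_mse = fit_mse(X, power)
w_mcc = fit_mcc(X, power)
```

Because the correntropy kernel effectively ignores the 10% of contaminated samples, `w_mcc` stays near the inlier relationship (intercept 0.2, slope 0.6), while the OLS fit is pulled upward by the outliers.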
Asal, F. F.
2012-07-01
Digital elevation data obtained from different engineering surveying techniques are utilized in generating a Digital Elevation Model (DEM), which is employed in many engineering and environmental applications. These data are usually in discrete point format, making it necessary to utilize an interpolation approach for creation of the DEM. Quality assessment of the DEM is a vital issue controlling its use in different applications; however, this assessment relies heavily on statistical methods while neglecting visual methods. This research applies visual analysis to DEMs generated using IDW interpolators of varying powers in order to examine their potential in assessing the effects of variation of the IDW power on DEM quality. Real elevation data were collected in the field using a total station instrument in corrugated terrain. DEMs were generated from the data at a unified cell size using the IDW interpolator with power values ranging from one to ten. Visual analysis was undertaken using 2D and 3D views of the DEM; in addition, statistical analysis was performed to assess the validity of the visual techniques. The visual analysis showed that smoothing of the DEM decreases as the power value increases, up to a power of four; increasing the power beyond four, however, leaves no noticeable changes in 2D and 3D views of the DEM. The statistical analysis supported these results, as the standard deviation (SD) of the DEM increased with increasing power. More specifically, changing the power from one to two produced 36% of the total increase in SD (the increase due to changing the power from one to ten), and changing to powers of three and four gave 60% and 75%, respectively. This reflects a decrease in DEM smoothing with increasing IDW power. The study also showed that applying visual methods supported by statistical
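The behaviour described above, with DEM smoothing decreasing (and spread increasing) as the IDW power rises, can be reproduced on synthetic data. The sample points, synthetic terrain, grid, and power values below are illustrative assumptions, not the paper's survey data:

```python
import numpy as np

def idw(xy_known, z_known, xy_query, power):
    """Inverse Distance Weighting: each known point weighted by 1/d**power."""
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    d = np.maximum(d, 1e-12)          # avoid division by zero at sample points
    w = 1.0 / d**power
    return (w * z_known).sum(axis=1) / w.sum(axis=1)

rng = np.random.default_rng(0)
pts = rng.uniform(0, 100, size=(50, 2))            # surveyed (x, y) positions
z = np.sin(pts[:, 0] / 10) * 5 + pts[:, 1] * 0.1   # synthetic corrugated terrain

gx, gy = np.meshgrid(np.linspace(0, 100, 40), np.linspace(0, 100, 40))
grid = np.column_stack([gx.ravel(), gy.ravel()])

# Standard deviation of the interpolated DEM for several IDW powers
sd_by_power = {p: idw(pts, z, grid, p).std() for p in (1, 2, 4, 10)}
```

Low powers average over many distant points and pull the surface toward the mean (low SD, heavy smoothing); high powers approach nearest-neighbour behaviour, so the SD of the interpolated grid rises with the power, as in the paper's statistical analysis.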
K.Fujiyama; T.Fujiwara; Y.Nakatani; K.Saito; A.Sakuma; Y.Akikuni; S.Hayashi; S.Matsumoto
2004-01-01
Statistical manipulation of material data was conducted for probabilistic life assessment and risk-based design and maintenance of high-temperature components of power plants. To obtain the statistical distributions of material properties, dominant parameters affecting those properties are introduced for normalization of the statistical variables. These parameters include hardness, chemical composition, characteristic microstructural features, and so on. Creep and fatigue properties are expressed by the normalized parameters, and unified statistical distributions are obtained. These probability distribution functions agree well statistically with the field database of steam turbine components. It was concluded that the unified statistical baseline approach is useful for the risk management of components in power plants.
GNSS Spoofing Detection Based on Signal Power Measurements: Statistical Analysis
V. Dehghanian
2012-01-01
A threat to GNSS receivers is posed by a spoofing transmitter that emulates authentic signals but with randomized code phase and Doppler values over a small range. Such spoofing signals can result in large navigation solution errors that are passed on to the unsuspecting user, with potentially dire consequences. An effective spoofing detection technique based on signal power measurements is developed in this paper; it can be readily applied to present consumer-grade GNSS receivers with minimal firmware changes. An extensive statistical analysis is carried out by formulating a multihypothesis detection problem. Expressions are developed to devise the set of thresholds required for signal detection and identification. The detection processing methods are further manipulated to exploit incidental antenna motion arising from user interaction with a GNSS handheld receiver, further enhancing the detection performance of the proposed algorithm. The statistical analysis supports the effectiveness of the proposed spoofing detection technique under various multipath conditions.
Robustness of Spacing-based Power Divergence Statistics
Boček, Pavel
Praha : ÚTIA AVČR, v.v.i, 2011 - (Janžura, M.; Ivánek, J.). s. 23-23 [7th International Workshop on Data - Algorithms - Decision Making. 27.11.2011-29.11.2011, Mariánská] R&D Projects: GA MŠk 1M0572; GA ČR GAP202/10/0618 Institutional research plan: CEZ:AV0Z10750506 Keywords : alpha-divergence * goodness-of-fit Subject RIV: BD - Theory of Information http://library.utia.cas.cz/separaty/2011/SI/bocek- robustness of spacing-based power divergence statistics.pdf
Demographic statistics pertaining to nuclear power reactor sites
Population statistics are presented for 145 nuclear power plant sites. Summary tables and figures are included that were developed to aid in the evaluation of trends and general patterns associated with the various parameters of interest, such as the proximity of nuclear plant sites to centers of population. The primary reason for publishing this information at this time is to provide a factual basis for use in discussions on the subject of reactor siting policy. The report is a revised and updated version of a draft report published in December 1977. Errors in the population data base have been corrected and new data tabulations added
HVDC power transmission technology assessment
Hauth, R.L.; Tatro, P.J.; Railing, B.D. [New England Power Service Co., Westborough, MA (United States); Johnson, B.K.; Stewart, J.R. [Power Technologies, Inc., Schenectady, NY (United States); Fink, J.L.
1997-04-01
The purpose of this study was to develop an assessment of the national utility system's needs for electric transmission during the period 1995-2020 that could be met by future reduced-cost HVDC systems. The assessment was to include an economic evaluation of HVDC as a means for meeting those needs as well as a comparison with competing technologies such as ac transmission with and without Flexible AC Transmission System (FACTS) controllers. The role of force-commutated dc converters was to be assumed where appropriate. The assessment begins by identifying the general needs for transmission in the U.S. in the context of a future deregulated power industry. The possible roles for direct current transmission are then postulated in terms of representative scenarios. A few of the scenarios are illustrated with the help of actual U.S. system examples. Non-traditional applications as well as traditional applications such as long lines and asynchronous interconnections are discussed. The classical "break-even distance" concept for comparing HVDC and ac lines is used to assess the selected scenarios. The impact of reduced-cost converters is reflected in terms of the break-even distance. This report presents a comprehensive review of the functional benefits of HVDC transmission and updated cost data for both ac and dc system components. It also provides some provocative thoughts on how direct current transmission might be applied to better utilize and expand our nation's increasingly stressed transmission assets.
Statistical analysis applied to safety culture self-assessment
Interviews and opinion surveys are instruments used to assess the safety culture in an organization as part of the Safety Culture Enhancement Programme. Specific statistical tools are used to analyse the survey results. This paper presents an example of an opinion survey with the corresponding application of the statistical analysis and the conclusions obtained. Survey validation, frequency statistics, the Kolmogorov-Smirnov non-parametric test, Student's t-test and ANOVA mean-comparison tests, and the LSD post-hoc multiple comparison test are discussed. (author)
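With SciPy, the tests listed above can be run in a few lines. The three "department" survey samples below are simulated for illustration; they are not the paper's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Hypothetical 1-to-5 survey scores from three departments (illustrative only)
dept_a = rng.normal(3.8, 0.6, 40)
dept_b = rng.normal(3.5, 0.6, 40)
dept_c = rng.normal(3.1, 0.6, 40)

# Kolmogorov-Smirnov: are dept_a's scores compatible with a normal model?
ks_stat, ks_p = stats.kstest(dept_a, "norm",
                             args=(dept_a.mean(), dept_a.std(ddof=1)))

# Student's t-test: do two departments differ in mean score?
t_stat, t_p = stats.ttest_ind(dept_a, dept_b)

# One-way ANOVA: do mean scores differ across all three departments?
f_stat, anova_p = stats.f_oneway(dept_a, dept_b, dept_c)
```

A post-hoc procedure such as Fisher's LSD would then compare pairs of groups only when the ANOVA rejects the global null.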
Model output statistics applied to wind power prediction
Joensen, A.; Giebel, G.; Landberg, L. [Risoe National Lab., Roskilde (Denmark); Madsen, H.; Nielsen, H.A. [The Technical Univ. of Denmark, Dept. of Mathematical Modelling, Lyngby (Denmark)
1999-03-01
Being able to predict the output of a wind farm online for a day or two in advance has significant advantages for utilities, such as a better possibility to schedule fossil-fuelled power plants and a better position on electricity spot markets. In this paper prediction methods based on Numerical Weather Prediction (NWP) models are considered. The spatial resolution used in NWP models implies that these predictions are not valid locally at a specific wind farm. Furthermore, due to the non-stationary nature and complexity of the processes in the atmosphere, and occasional changes of NWP models, the deviation between the predicted and the measured wind will be time dependent. If observational data is available, and if the deviation between the predictions and the observations exhibits systematic behavior, this should be corrected for; if statistical methods are used, this approach is usually referred to as MOS (Model Output Statistics). The influence of atmospheric turbulence intensity, topography, prediction horizon length and auto-correlation of wind speed and power is considered, and to take the time-variations into account, adaptive estimation methods are applied. Three estimation techniques are considered and compared: extended Kalman filtering, recursive least squares and a new modified recursive least squares algorithm. (au) EU-JOULE-3. 11 refs.
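As a sketch of the adaptive MOS idea, the recursive least squares loop below corrects a hypothetical systematic NWP bias online; the linear bias model, the forgetting factor, and the noise levels are illustrative assumptions, not the paper's setup:

```python
import numpy as np

def rls_step(theta, P, x, y, lam=0.99):
    """One recursive-least-squares update with forgetting factor lam.
    theta: coefficient estimates; P: inverse-covariance-like matrix."""
    Px = P @ x
    k = Px / (lam + x @ Px)                # gain vector
    theta = theta + k * (y - x @ theta)    # correct by prediction error
    P = (P - np.outer(k, Px)) / lam        # discount old information
    return theta, P

rng = np.random.default_rng(1)
theta = np.zeros(2)                        # [bias, gain] of the MOS correction
P = np.eye(2) * 100.0
true_bias, true_gain = 1.5, 0.8            # hypothetical systematic NWP error

for _ in range(500):
    nwp = rng.uniform(3, 15)               # NWP-predicted wind speed (m/s)
    obs = true_bias + true_gain * nwp + rng.normal(0, 0.3)   # observed speed
    theta, P = rls_step(theta, P, np.array([1.0, nwp]), obs)
```

The forgetting factor (0.99 here) lets the estimates track slow drifts, e.g. after an NWP model change, without retraining offline.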
Statistical problems in the assessment of nuclear risks
Information on nuclear power plant risk assessment is presented, covering attitudinal problems and methodological problems involving expert opinions, human error probabilities, non-independent events, uncertainty analysis, and acceptable-risk criteria
Toward improved statistical treatments of wind power forecast errors
Hart, E.; Jacobson, M. Z.
2011-12-01
The ability of renewable resources to reliably supply electric power demand is of considerable interest in the context of growing renewable portfolio standards and the potential for future carbon markets. Toward this end, a number of probabilistic models have been applied to the problem of grid integration of intermittent renewables, such as wind power. Most of these models rely on simple Markov or autoregressive models of wind forecast errors. While these models generally capture the bulk statistics of wind forecast errors, they often fail to reproduce accurate ramp rate distributions and do not accurately describe extreme forecast error events, both of which are of considerable interest to those seeking to comment on system reliability. The problem often lies in characterizing and reproducing not only the magnitude of wind forecast errors, but also the timing or phase errors (i.e., when a front passes over a wind farm). Here we compare time series wind power data produced using different forecast error models to determine the best approach for capturing errors in both magnitude and phase. Additionally, new metrics are presented to characterize forecast quality with respect to both considerations.
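A minimal first-order autoregressive model of forecast errors, of the kind the abstract says is commonly used, can be sketched as follows. The persistence and magnitude parameters are assumed values for illustration, and ramp statistics follow from differencing the error series:

```python
import numpy as np

def simulate_ar1_errors(n, phi=0.8, sigma=0.1, seed=0):
    """First-order autoregressive wind forecast errors (per unit of capacity).
    phi: hour-to-hour persistence; sigma: marginal standard deviation."""
    rng = np.random.default_rng(seed)
    e = np.zeros(n)
    innov_sd = sigma * np.sqrt(1 - phi**2)   # keeps the marginal std at sigma
    for t in range(1, n):
        e[t] = phi * e[t - 1] + rng.normal(0, innov_sd)
    return e

errors = simulate_ar1_errors(10_000)
ramps = np.diff(errors)                      # hour-to-hour error changes
```

An AR(1) model reproduces the bulk statistics (marginal spread, lag-1 correlation) but, as the abstract notes, its Gaussian innovations tend to understate extreme ramps and cannot represent timing/phase errors of weather fronts.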
Alternative Assessment in Higher Education: An Experience in Descriptive Statistics
Libman, Zipora
2010-01-01
Assessment-led reform is now one of the most widely favored strategies to promote higher standards of teaching, more powerful learning and more credible forms of public accountability. Within this context of change, higher education in many countries is increasingly subjected to demands to implement alternative assessment strategies that provide…
Improved power performance assessment methods
Frandsen, S.; Antoniou, I.; Dahlberg, J.A. [and others]
1999-03-01
The uncertainty of presently-used methods for retrospective assessment of the productive capacity of wind farms is unacceptably large. The possibilities of improving the accuracy have been investigated and are reported. A method is presented that includes an extended power curve and site calibration. In addition, blockage effects with respect to reference wind speed measurements are analysed. It is found that significant accuracy improvements are possible by the introduction of more input variables such as turbulence and wind shear, in addition to mean wind speed and air density. Also, the testing of several or all machines in the wind farm - instead of only one or two - may provide a better estimate of the average performance. (au)
Prediction of lacking control power in power plants using statistical models
Odgaard, Peter Fogh; Mataji, B.; Stoustrup, Jakob
2007-01-01
Prediction of the performance of plants like power plants is of interest, since the plant operator can use these predictions to optimize the plant production. In this paper the focus is on a special case where a combination of high coal moisture content and a high load limits the possible plant load, meaning that the requested plant load cannot be met. The available models are in this case uncertain. Instead, statistical methods are used to predict upper and lower uncertainty bounds on the prediction. Two different methods are used. The first relies on statistics of recent prediction errors; the second uses operating-point-dependent statistics of prediction errors. Using these methods on the previously mentioned case, it can be concluded that the second method can be used to predict the power plant performance, while the first method has problems predicting the uncertain performance of...
Statistical Analysis of the Impact of Wind Power on Market Quantities and Power Flows
Pinson, Pierre; Jónsson, Tryggvi; Zugno, Marco;
2012-01-01
In view of the increasing penetration of wind power in a number of power systems and markets worldwide, we discuss some of the impacts that wind energy may have on market quantities and cross-border power flows. These impacts are uncovered through statistical analyses of actual market and flow data in Europe. Due to the dimensionality and nonlinearity of these effects, the necessary concepts of dimension reduction using Principal Component Analysis (PCA), as well as nonlinear regression, are described. Example application results are given for European cross-border flows, as well as for the...
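The PCA step can be sketched directly via the SVD of the centered data matrix. The toy data below assume a single common wind-related factor driving several cross-border flows; this is an illustration of the dimension-reduction idea, not the paper's European dataset:

```python
import numpy as np

def pca(X, k):
    """Principal components via SVD of the centered data matrix."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:k].T                 # projections on first k components
    explained = s**2 / (s**2).sum()        # variance ratio per component
    return scores, explained

rng = np.random.default_rng(7)
# Hypothetical hourly flows on 6 interconnections, driven by one common
# wind-related factor plus independent noise
common = rng.normal(0, 1, (1000, 1))
loadings = rng.normal(1, 0.2, (1, 6))
X = common @ loadings + rng.normal(0, 0.3, (1000, 6))

scores, explained = pca(X, 2)
```

When one latent driver dominates, the first component absorbs most of the variance, which is exactly what makes PCA useful before fitting nonlinear regressions on high-dimensional flow data.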
Statistical analysis of occupational exposure in nuclear power plants
Occupational doses vary widely, from zero to high values on a logarithmic scale, according to the workers' jobs. However, as radiation control programmes constrain higher exposures more, the variation of higher doses changes from a logarithmic to a linear scale, while the structure of lower doses remains. In this paper we analyse the annual effective doses of workers in three nuclear power plants at Jaslovske Bohunice using various distribution models. The hybrid-lognormal description of the annual dose distribution makes it possible to assess also the annual collective doses below the adopted recording level. Two methods of analysing the 'lost' occupational collective doses are presented
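For a pure lognormal description (a simplification of the hybrid-lognormal model in the abstract), the collective dose contributed by doses below a recording level has a closed form: the partial expectation of the lognormal. The parameter values below are hypothetical, chosen only to illustrate the calculation:

```python
import numpy as np
from scipy.stats import norm

mu, sigma = -1.0, 1.2        # hypothetical lognormal parameters (ln of mSv)
L = 0.1                      # recording level, mSv

# Partial expectation of a lognormal:
#   E[D; D < L] = exp(mu + sigma^2/2) * Phi((ln L - mu - sigma^2) / sigma)
mean_dose = np.exp(mu + sigma**2 / 2)
below = mean_dose * norm.cdf((np.log(L) - mu - sigma**2) / sigma)
lost_fraction = below / mean_dose      # share of collective dose unrecorded

# Monte Carlo check of the closed-form expression
rng = np.random.default_rng(11)
d = rng.lognormal(mu, sigma, 1_000_000)
mc_fraction = d[d < L].sum() / d.sum()
```

With these assumed parameters, roughly 14% of workers fall below the recording level, yet they carry only on the order of 1% of the collective dose, which is the kind of 'lost' collective dose the abstract sets out to quantify.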
Earthquake accelerogram simulation with statistical law of evolutionary power spectrum
ZHANG Cui-ran; CHEN Hou-qun; LI Min
2007-01-01
By using the technique for evolutionary power spectra proposed by Nakayama and with reference to the Kameda formula, an evolutionary spectrum prediction model for given earthquake magnitude and distance is established, based on 80 near-source, large-magnitude acceleration records at rock surface from the ground motion database of the western U.S. A new iteration method is then developed for generating random accelerograms, non-stationary in both amplitude and frequency, that are compatible with a target evolutionary spectrum. The phase spectra of the simulated accelerograms are also non-stationary in the time and frequency domains, since the interaction between amplitude and phase angle is considered during generation. Furthermore, the sign of the phase spectrum increment is identified to accelerate the iteration. With the proposed statistical model for predicting evolutionary power spectra and the new method for generating compatible time histories, artificial random earthquake accelerograms, non-stationary in both amplitude and frequency, can be provided for a given magnitude and distance.
For the year 2002, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supply and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Excise taxes and precautionary stock fees and oil pollution fees on energy products
For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Excise taxes and precautionary stock fees and oil pollution fees
Efficient computation and statistical assessment of transfer entropy
Patrick eBoba
2015-03-01
The analysis of complex systems frequently poses the challenge of distinguishing correlation from causation. Statistical physics has inspired very promising approaches to search for correlations in time series; the transfer entropy in particular (Hlavackova-Schindler et al., 2007). Now, methods from computational statistics can quantitatively assign significance to such correlation measures. In this study, we propose and apply a procedure to statistically assess transfer entropies by one-sided tests. We introduce two null models of vanishing correlations for time series with memory. We implemented them in an OpenMP-based, parallelized C++ package for multi-core CPUs. Using template meta-programming, we enable a compromise between memory and run-time efficiency.
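A minimal version of the idea, a plug-in transfer entropy with a one-sided permutation test as the null model, can be sketched in Python. This is a simplified stand-in, not the authors' C++ package: binary discretization, history length 1, and simple shuffling (which ignores the memory-preserving null models of the paper) are all assumptions made to keep the sketch short:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in transfer entropy TE(x -> y) in bits, binary-discretized,
    history length 1: I(y_t ; x_{t-1} | y_{t-1})."""
    xb = (x > np.median(x)).astype(int)
    yb = (y > np.median(y)).astype(int)
    yn, yp, xp = yb[1:], yb[:-1], xb[:-1]       # y_next, y_prev, x_prev
    n = len(yn)
    c_yyx = Counter(zip(yn, yp, xp))
    c_yx = Counter(zip(yp, xp))
    c_yy = Counter(zip(yn, yp))
    c_y = Counter(yp)
    te = 0.0
    for (a, b, c), k in c_yyx.items():
        p_cond_full = k / c_yx[(b, c)]          # p(y_t | y_{t-1}, x_{t-1})
        p_cond_self = c_yy[(a, b)] / c_y[b]     # p(y_t | y_{t-1})
        te += (k / n) * np.log2(p_cond_full / p_cond_self)
    return te

def permutation_pvalue(x, y, n_perm=200, seed=0):
    """One-sided test: shuffling x destroys any directed coupling."""
    rng = np.random.default_rng(seed)
    observed = transfer_entropy(x, y)
    null = [transfer_entropy(rng.permutation(x), y) for _ in range(n_perm)]
    return (1 + sum(t >= observed for t in null)) / (n_perm + 1)

rng = np.random.default_rng(3)
x = rng.normal(size=2000)
y = np.roll(x, 1) + 0.5 * rng.normal(size=2000)  # y driven by lagged x
p_val = permutation_pvalue(x, y)
```

Because `y` is driven by the lagged `x`, the observed transfer entropy sits far above the shuffled null distribution and the one-sided p-value is small.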
Caveats for using statistical significance tests in research assessments
Schneider, Jesper Wiborg
2013-01-01
controversial, and numerous criticisms have been leveled against their use. Based on examples from articles by proponents of the use of statistical significance tests in research assessments, we address some of the numerous problems with such tests. The issues specifically discussed are the ritual practice of such tests, their dichotomous application in decision making, the difference between statistical and substantive significance, the implausibility of most null hypotheses, the crucial assumption of randomness, as well as the utility of standard errors and confidence intervals for inferential purposes. We argue that applying statistical significance tests and mechanically adhering to their results are highly problematic and detrimental to critical thinking. We claim that the use of such tests does not provide any advantages in relation to deciding whether differences between citation indicators are...
R. Eric Heidel
2016-01-01
Statistical power is the ability to detect a significant effect, given that the effect actually exists in a population. Like most statistical concepts, statistical power tends to induce cognitive dissonance in hepatology researchers. However, planning for statistical power by an a priori sample size calculation is of paramount importance when designing a research study. There are five specific empirical components that make up an a priori sample size calculation: the scale of measurement of t...
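As a worked example of an a priori calculation, the normal-approximation sample size formula for a two-sample comparison is short enough to compute directly; the effect size and standard deviation below are hypothetical values chosen for illustration:

```python
from scipy.stats import norm

def n_per_group(delta, sd, alpha=0.05, power=0.80):
    """Normal-approximation sample size per group for a two-sample t-test:
    n = 2 * ((z_{1-alpha/2} + z_{power}) * sd / delta)^2."""
    z_a = norm.ppf(1 - alpha / 2)    # two-sided significance threshold
    z_b = norm.ppf(power)            # desired power
    return 2 * ((z_a + z_b) * sd / delta) ** 2

# Hypothetical example: detect a 5-unit mean difference when sd = 10,
# at alpha = 0.05 with 80% power -- about 63 subjects per group
n = n_per_group(delta=5, sd=10)
```

The five components the abstract mentions map directly onto the arguments: the effect (`delta`), its variability (`sd`), the significance level, the target power, and the test/scale of measurement implicit in the formula. Raising the target power or shrinking the detectable effect drives the required n up quickly.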
A New Statistic Approach towards Landslide Hazard Risk Assessment
George Gaprindashvili; Jianping Guo; Panisara Daorueang; Tian Xin; Pooyan Rahimy
2014-01-01
To quantitatively assess the landslide hazard in Khelvachauri, Georgia, the statistical hazard-index method was applied. A spatial database was constructed in a Geographic Information System (GIS) including topographic data, geologic maps, land-use, and active landslide events (extracted from the landslide inventory). After that, causal factors of landslides (such as slope, aspect, lithology, geomorphology, land-use and soil depth) were produced to calculate the correspo...
Assessing Budget Support with Statistical Impact Evaluation: a Methodological Proposal
Elbers, Chris; Gunning, Jan Willem; de Hoop, Kobus
2007-01-01
Donor agencies and recipient governments want to assess the effectiveness of aid-supported sector policies. Unfortunately, existing methods for impact evaluation are designed for the evaluation of homogeneous interventions (‘projects’) where those with and without ‘treatment’ can be compared. The lack of a methodology for evaluations of sector-wide programs is a serious constraint in the debate on aid effectiveness. We propose a method of statistical impact evaluation in situations with heter...
Reliability assessment for safety critical systems by statistical random testing
In this report we present an overview of reliability assessment for software and focus on some basic aspects of assessing reliability for safety critical systems by statistical random testing. We also discuss possible deviations from some essential assumptions on which the general methodology is based. These deviations appear quite likely in practical applications. We present and discuss possible remedies and adjustments and then apply this methodology to a portion of the SDS1 software. We also indicate shortcomings of the methodology and possible avenues to follow to address these problems. (author). 128 refs., 11 tabs., 31 figs
A statistical model of uplink inter-cell interference with slow and fast power control mechanisms
Tabassum, Hina
2013-09-01
Uplink power control is in essence an interference mitigation technique that aims at minimizing the inter-cell interference (ICI) in cellular networks by reducing the transmit power levels of the mobile users while maintaining their target received signal quality levels at base stations. Power control mechanisms directly impact the interference dynamics and, thus, affect the overall achievable capacity and consumed power in cellular networks. Due to the stochastic nature of wireless channels and mobile users' locations, it is important to derive theoretical models for ICI that can capture the impact of design alternatives related to power control mechanisms. To this end, we derive and verify a novel statistical model for uplink ICI in Generalized-K composite fading environments as a function of various slow and fast power control mechanisms. The derived expressions are then utilized to quantify numerically key network performance metrics that include average resource fairness, average reduction in power consumption, and ergodic capacity. The accuracy of the derived expressions is validated via Monte-Carlo simulations. Results are generated for multiple network scenarios, and insights are extracted to assess various power control mechanisms as a function of system parameters. © 1972-2012 IEEE.
Statistical analysis about corrosion in nuclear power plants
Investigations are currently being carried out into the degradation mechanisms of structures, systems and components in nuclear power plants, since many of the processes involved determine the reliability of these items, the integrity of their components, and related safety aspects. This work presents statistics from studies of materials corrosion in its wide variety of specific mechanisms, as observed worldwide in PWR, BWR and WWER reactors, based on an analysis of the AIRS (Advanced Incident Reporting System) for the period 1993-1998 for the first two plant types and 1982-1995 for the WWER. The identification of factors allows them to be characterized as those which apply, i.e. events that occurred through the presence of some corrosion mechanism, and those which do not apply, i.e. incidents due to natural factors, mechanical failures or human errors. The total number of cases analysed corresponds to the sum of the cases which apply and those which do not. (Author)
Allen, Kirk
The Statistics Concept Inventory (SCI) is a multiple choice test designed to assess students' conceptual understanding of topics typically encountered in an introductory statistics course. This dissertation documents the development of the SCI from Fall 2002 up to Spring 2006. The first phase of the project essentially sought to answer the question: "Can you write a test to assess topics typically encountered in introductory statistics?" Book One presents the results utilized in answering this question in the affirmative. The bulk of the results present the development and evolution of the items, primarily relying on objective metrics to gauge effectiveness but also incorporating student feedback. The second phase boils down to: "Now that you have the test, what else can you do with it?" This includes an exploration of Cronbach's alpha, the most commonly-used measure of test reliability in the literature. An online version of the SCI was designed, and its equivalency to the paper version is assessed. Adding an extra wrinkle to the online SCI, subjects rated their answer confidence. These results show a general positive trend between confidence and correct responses. However, some items buck this trend, revealing potential sources of misunderstandings, with comparisons offered to the extant statistics and probability educational research. The third phase is a re-assessment of the SCI: "Are you sure?" A factor analytic study favored a uni-dimensional structure for the SCI, although maintaining the likelihood of a deeper structure if more items can be written to tap similar topics. A shortened version of the instrument is proposed, demonstrated to be able to maintain a reliability nearly identical to that of the full instrument. Incorporating student feedback and a faculty topics survey, improvements to the items and recommendations for further research are proposed. The state of the concept inventory movement is assessed, to offer a comparison to the work presented
Yin Y. Shugart; Bing-Jian Feng; Andrew Collins
2002-11-01
We have evaluated the power for detecting a common trait determined by two loci, using seven statistics, five of which are implemented in the computer program SimWalk2 and two in GENEHUNTER. Unlike most previous reports, which evaluate the power of allele-sharing statistics for a single disease locus, we used a simulated data set of general pedigrees in which a two-locus disease is segregating and evaluated several non-parametric linkage statistics implemented in the two programs. We found that the power for detecting linkage using the $S_{\text{all}}$ statistic in GENEHUNTER (GH, version 2.1) differs from that of the corresponding statistic implemented in SimWalk2 (version 2.82). The values associated with the statistic output by SimWalk2 are consistently more conservative than those from GENEHUNTER, except when the underlying model includes heterogeneity at a level of 50%, where the values output are very comparable. On the other hand, when the thresholds are determined empirically under the null hypothesis, $S_{\text{all}}$ in GENEHUNTER and the SimWalk2 statistic have similar power.
A statistical proposal for environmental impact assessment of development projects
Environmental impact assessment of development projects is a fundamental process whose main goal is to prevent their construction and operation from leading to serious negative consequences for the environment. Some of the most important limitations of the models employed to assess environmental impacts are the subjectivity of their parameters and weights, and the multicollinearity among variables that carry large amounts of similar information. This work presents a multivariate statistical method that attempts to reduce these limitations. For this purpose, environmental impact is evaluated through different impact attributes and environmental elements, synthesized in an environmental quality index (ICA, for its Spanish acronym). The ICA can be applied at different levels, such as at the level of a whole project, or partially to one or several environmental components.
Environmental Assessment for power marketing policy for Southwestern Power Administration
1993-12-01
Southwestern Power Administration (Southwestern) needs to renew expiring power sales contracts with new term (10 year) sales contracts. The existing contracts have been in place for several years and many will expire over the next ten years. Southwestern completed an Environmental Assessment on the existing power allocation in June, 1979 (a copy of the EA is attached), and there are no proposed additions of any major new generation resources, service to discrete major new loads, or major changes in operating parameters, beyond those included in the existing power allocation. Impacts from a no action plan, proposed alternative, and market power for less than 10 years are described.
Statistical analysis of fire events at US nuclear power plants
Concern about fires as a potential agent of common-cause failure in nuclear power plants (NPPs) has greatly increased since the Browns Ferry NPP fire, and several regulatory actions were initiated following that incident. In investigating the chances of a fire incident leading to core melt, the unconditional frequency is found to be about 1x10 incidents per reactor-year. Detailed reviews of fire events at nuclear plants are used to quantify the fire occurrence frequency required to carry out fire risk assessment. In this work the results of a statistical analysis of 354 fire incidents at US NPPs in the period from January 1965 to June 1985 are presented to quantify fire occurrence frequency. The distribution of fire incidents across the different types of NPPs (PWR, BWR or HTGR), the mode of plant operation, the probable cause of fire, the type of detector that detected the incident, who extinguished the fire, the suppression equipment and agent, the initiating combustible, and the components affected by fire are all analysed for the 354 incidents studied. More than 50% of the incidents occurred during the construction phase; many of these posed neither a nuclear nor a safety problem, yet they delayed the startup of the units by up to 2 years, as happened at Indian Point unit 2 (1971). There were four major fire incidents at US NPPs in the first period of the study (1965-1978) and none in the last seven years (1979-1985), which reflects the development of fire protection measures and technology. In summary, the fire events at US NPPs comprise about 354 incidents at 33 locations due to 38 causes of fire, with a rate of 0.17 fire events/plant/year
The number of Guttman errors as a simple and powerful person-fit statistic
Meijer, Rob R.
1994-01-01
A number of studies have examined the power of several statistics that can be used to detect examinees with unexpected (nonfitting) item score patterns, or to determine person fit. This study compared the power of the U3 statistic with the power of one of the simplest person-fit statistics, the sum of the number of Guttman errors. In most cases studied, (a weighted version of) the latter statistic performed as well as the U3 statistic. Counting the number of Guttman errors thus seems to be a useful person-fit statistic.
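For dichotomous items ordered by difficulty, the statistic discussed above is simple to compute: a Guttman error is any pair in which a harder item is answered correctly while an easier item is missed. A minimal sketch of the unweighted count:

```python
def guttman_errors(score_pattern, item_difficulty_order=None):
    """Count Guttman errors in a dichotomous item-score pattern.

    Items are assumed ordered from easiest to hardest (or an explicit
    ordering of indices is given). A Guttman error is a (0, 1) pair:
    an easier item answered incorrectly while a harder item is correct.
    """
    if item_difficulty_order is not None:
        score_pattern = [score_pattern[i] for i in item_difficulty_order]
    errors = 0
    for i in range(len(score_pattern)):
        for j in range(i + 1, len(score_pattern)):
            if score_pattern[i] == 0 and score_pattern[j] == 1:
                errors += 1
    return errors

# A perfect Guttman pattern has no errors; a fully reversed one is maximal.
print(guttman_errors([1, 1, 1, 0, 0]))  # 0
print(guttman_errors([0, 0, 1, 1, 1]))  # 6
```

The weighted version mentioned in the abstract would attach a weight (e.g. a difficulty difference) to each offending pair instead of counting 1.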
Statistical methods for assessing agreement between continuous measurements
Sokolowski, Ineta; Hansen, Rikke Pilegaard; Vedsted, Peter
Background: Clinical research often involves the study of agreement amongst observers. Agreement can be measured in different ways, and one can obtain quite different values depending on which method one uses. Objective: We review the approaches that have been proposed to assess agreement between continuous measures and discuss their strengths and weaknesses. Different methods are illustrated using actual data from the 'Delay in diagnosis of cancer in general practice' project in Aarhus, Denmark. Subjects and Methods: We use the weighted kappa statistic, intraclass correlation coefficient (ICC), concordance coefficient, Bland-Altman limits of agreement and percentage of agreement to assess the agreement between patient-reported delay and doctor-reported delay in diagnosis of cancer in general practice. Key messages: The correct statistical approach is not obvious. Many studies give the product-moment correlation coefficient (r) between the results of the two measurement methods as an indicator of agreement, which is wrong. Several alternative methods have been proposed, which we describe together with the preconditions for their use.
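Of the methods listed, the Bland-Altman limits of agreement are the easiest to sketch: compute the paired differences, their mean (the bias), and bias ± 1.96 SD. The paired delays below are hypothetical, not data from the Aarhus project:

```python
import statistics

def bland_altman_limits(x, y):
    """Bland-Altman 95% limits of agreement for two paired measurements."""
    diffs = [a - b for a, b in zip(x, y)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)  # sample SD of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired delays (days) reported by patient vs. doctor.
patient = [30, 14, 60, 21, 45, 10, 90, 33]
doctor  = [28, 20, 55, 25, 40, 12, 80, 30]
bias, (lo, hi) = bland_altman_limits(patient, doctor)
print(f"bias = {bias:.2f}, limits of agreement = ({lo:.2f}, {hi:.2f})")
```

Unlike the correlation coefficient, this directly answers the clinical question: how far apart can two methods be expected to fall for an individual patient?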
Nuclear power plant insurance - experience and loss statistics
Nuclear power plants are treated separately when insurance contracts are concluded. In industrial countries, national insurance pools co-operating on an international basis have been established for insuring nuclear power plants. In combined property insurance, the nuclear risk is combined with the fire risk. In addition, there are the engineering insurances; of these, the one of significance for nuclear power plants is machinery insurance, which can be covered on the free insurance market. Nuclear power plants have had fewer instances of damage than other, conventional installations. (orig.)
Fundamentals of modern statistical methods substantially improving power and accuracy
Wilcox, Rand R
2001-01-01
Conventional statistical methods have a very serious flaw: they routinely miss differences among groups or associations among variables that are detected by more modern techniques, even under very small departures from normality. Hundreds of journal articles have described the reasons standard techniques can be unsatisfactory, but simple, intuitive explanations are generally unavailable. Improved methods have been derived, but they are far from obvious or intuitive given the training most researchers receive. Situations arise where even highly nonsignificant results become significant when analyzed with more modern methods. Without assuming any prior training in statistics, Part I of this book describes basic statistical principles from a point of view that makes their shortcomings intuitive and easy to understand. The emphasis is on verbal and graphical descriptions of concepts. Part II describes modern methods that address the problems covered in Part I. Using data from actual studies, many examples are included...
Comparative environmental assessment of unconventional power installations
Sosnina, E. N.; Masleeva, O. V.; Kryukov, E. V.
2015-08-01
A procedure for the strategic environmental assessment of power installations operating on renewable energy sources (RES) was developed and is described. The procedure takes into account not only the operation of the power installation but its whole life cycle: from the production and distribution of the power resources needed to manufacture the installations to the process of their recovery. Such an approach allows a more comprehensive assessment of the influence of power installations on the environment and may be used when adapting current regulations and developing new regulations for different types of unconventional power installations with due account of the ecological factor. Application of this integrated environmental assessment procedure to a mini-HPP (hydro power plant); to wind, solar, and biogas power installations; and to a traditional power installation operating on natural gas was considered. The comparison of environmental influence revealed advantages of the new energy technologies over traditional ones. It is shown that solar energy installations hardly pollute the environment during operation, but the negative influence of the mining operations and of manufacturing and utilizing the materials used for solar modules is the largest. Biogas power installations rank second in environmental impact owing to the considerable mass of the biogas installation and its gas reciprocating engine, while the minimum impact on the environment is exerted by the mini-HPP. Consumption of material and energy resources for producing a traditional power installation is smaller than for power installations on RES; however, this advantage disappears once fuel extraction and transfer are taken into account, and the greatest impact on the environment is exerted by the operation of the traditional power installations.
Statistical utility theory for comparison of nuclear versus fossil power plant alternatives
A statistical formulation of utility theory is developed for decision problems concerned with the choice among alternative strategies in electric energy production. Four alternatives are considered: nuclear power, fossil power, solar energy, and a conservation policy. Attention is focused on a public electric utility regarded as a rational decision-maker. A framework for decisions is then suggested in which the admissible strategies and their possible consequences represent the information available to the decision-maker. Once the objectives of the decision process are assessed, consequences can be quantified in terms of measures of effectiveness. Maximum expected utility is the criterion of choice among alternatives. Steps toward expected values are the evaluation of the multidimensional utility function and the assessment of subjective probabilities for the consequences. In this respect, the multiplicative form of the utility function seems less restrictive than the additive form and almost as manageable to implement. Probabilities are expressed through subjective marginal probability density functions given at a discrete number of points. The final stage of the decision model is to establish the value of each strategy. To this end, expected utilities are computed and scaled. The result is that nuclear power offers the best alternative. 8 figures, 9 tables, 32 references
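The multiplicative utility form mentioned above can be sketched as follows. In the standard multiattribute formulation, U = (prod_i(1 + K*k_i*u_i) - 1)/K, where the scaling constant K is the nonzero root of 1 + K = prod_i(1 + K*k_i). The attribute names and scaling constants below are hypothetical, chosen only to exercise the arithmetic:

```python
import math

def solve_K(k, tol=1e-12):
    """Find the scaling constant K of the multiplicative utility form:
    the nonzero root of 1 + K = prod(1 + K*k_i), with K > -1.
    Assumes each k_i lies in (0, 1)."""
    h = lambda K: math.prod(1.0 + K * ki for ki in k) - (1.0 + K)
    s = sum(k)
    if abs(s - 1.0) < 1e-12:
        return 0.0                  # degenerates to the additive form
    if s < 1.0:                     # root lies in (0, inf)
        lo, hi = 1e-9, 1.0
        while h(hi) < 0:
            hi *= 2.0
    else:                           # root lies in (-1, 0)
        lo, hi = -1.0 + 1e-9, -1e-9
    while hi - lo > tol:            # bisection on the bracketed root
        mid = (lo + hi) / 2.0
        if (h(mid) < 0) == (h(lo) < 0):
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

def multiplicative_utility(u, k, K):
    """u_i: single-attribute utilities in [0, 1]; k_i: scaling constants."""
    if K == 0.0:
        return sum(ki * ui for ki, ui in zip(k, u))
    return (math.prod(1.0 + K * ki * ui for ki, ui in zip(k, u)) - 1.0) / K

# Hypothetical attributes: cost, environmental impact, supply reliability.
k = [0.5, 0.3, 0.4]  # sum of k_i > 1, so K falls in (-1, 0)
K = solve_K(k)
print(f"K = {K:.6f}")
print(f"U(best on all attributes)  = {multiplicative_utility([1, 1, 1], k, K):.4f}")
print(f"U(worst on all attributes) = {multiplicative_utility([0, 0, 0], k, K):.4f}")
```

By construction the utility is scaled so that the all-best consequence scores 1 and the all-worst scores 0, which is what makes expected utilities of different strategies comparable.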
Hayslett, H T
1991-01-01
Statistics covers the basic principles of statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution, and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained...
Thermohydraulic assessment of the RP-10 reactor core to determine the maximum power
Assessment of the thermohydraulic parameters of the RP-10 reactor core based on the most thermally demanding channel (hot channel), and determination of the maximum thermal operating power considering safety margins and a statistical treatment of the uncertainty factors
Assessing Landslide Risk Areas Using Statistical Models and Land Cover
Kim, H. G.; Lee, D. K.; Park, C.; Ahn, Y.; Sung, S.; Park, J. H.
2015-12-01
Recently, damage due to landslides has increased in the Republic of Korea. Extreme weather events such as typhoons and heavy rainfall related to climate change are the main factors behind the damage. Inje-gun, Gangwon-do, in particular suffered severe landslide damage in 2006 and 2007. In Inje-gun, 91% of the area is forest; therefore, many land covers related to human activities are adjacent to forest land, and the establishment of adaptation plans for landslides was urgently needed. Landslide risk assessment can serve as good information for policy makers. The objective of this study was to assess landslide risk areas to support the establishment of adaptation plans that reduce landslide damage. Statistical distribution models (SDMs) were used to evaluate the probability of landslide occurrence. Various SDMs were used to make landslide probability maps, taking the uncertainty of the SDMs into account. The types of land cover were classified into 5 grades according to their vulnerability to landslides. The landslide probability maps were overlaid with the land cover map to calculate landslide risk. As a result of the overlay analysis, landslide risk areas were derived; agricultural and transportation areas in particular showed high risk over large areas in the risk map. In conclusion, policy makers in Inje-gun should consider the landslide risk map in order to establish adaptation plans effectively.
Heidel, R Eric
2016-01-01
Statistical power is the ability to detect a significant effect, given that the effect actually exists in a population. Like most statistical concepts, statistical power tends to induce cognitive dissonance in hepatology researchers. However, planning for statistical power by an a priori sample size calculation is of paramount importance when designing a research study. There are five specific empirical components that make up an a priori sample size calculation: the scale of measurement of the outcome, the research design, the magnitude of the effect size, the variance of the effect size, and the sample size. A framework grounded in the phenomenon of isomorphism, or interdependencies amongst different constructs with similar forms, will be presented to understand the isomorphic effects of decisions made on each of the five aforementioned components of statistical power. PMID:27073717
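The a priori sample size calculation described above can be sketched, under a normal approximation, for the common case of comparing two group means with Cohen's d as the effect size (a simplification of the five-component framework, using only the Python standard library):

```python
from math import ceil
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Normal-approximation sample size per group for a two-sided,
    two-sample comparison of means (effect_size = Cohen's d)."""
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)   # critical value for the two-sided test
    z_beta = z(power)            # quantile corresponding to desired power
    return ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# Medium effect (d = 0.5), alpha = .05, power = .80:
print(n_per_group(0.5))  # 63 per group (the exact t-based answer is 64)
```

The formula makes the isomorphism in the abstract concrete: the required n is driven jointly by the effect magnitude, its variance (folded into d), the alpha level and the target power.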
Using the statistical analysis method to assess the landslide susceptibility
Chan, Hsun-Chuan; Chen, Bo-An; Wen, Yo-Ting
2015-04-01
This study assessed the landslide susceptibility of the Jing-Shan River upstream watershed in central Taiwan. The landslide inventories during typhoons Toraji in 2001, Mindulle in 2004, Kalmaegi and Sinlaku in 2008, Morakot in 2009, and the 0719 rainfall event in 2011, established by the Taiwan Central Geological Survey, were used as landslide data. This study assessed landslide susceptibility using different statistical methods: logistic regression, the instability index method and support vector machines (SVM). After evaluation, elevation, slope, slope aspect, lithology, terrain roughness, slope roughness, plan curvature, profile curvature, total curvature and average rainfall were chosen as the landslide factors. The validity of the three established models was further examined using the receiver operating characteristic (ROC) curve. The result of logistic regression showed that terrain roughness and slope roughness had the strongest impact on the susceptibility value; the instability index method indicated that terrain roughness and lithology did. The instability index method, however, may underestimate susceptibility near the river side, and it raises a potential issue regarding the number of factor classes: increasing the number of classes may cause an excessive coefficient of variation of the factor, whereas decreasing it may place a wide range of nearby cells in the same susceptibility level. Finally, the ROC curve was used to discriminate among the three models. SVM proved preferable to the other methods for assessing landslide susceptibility, and its results were close to those of logistic regression in recognizing the medium-high and high susceptibility levels.
Safety assessment of emergency power systems for nuclear power plants
This publication is intended to assist the safety assessor within a regulatory body, or one working as a consultant, in assessing the safety of a given design of the emergency power systems (EPS) for a nuclear power plant. The present publication refers closely to the NUSS Safety Guide 50-SG-D7 (Rev. 1), Emergency Power Systems at Nuclear Power Plants. It covers therefore exactly the same technical subject as that Safety Guide. In view of its objective, however, it attempts to help in the evaluation of possible technical solutions which are intended to fulfill the safety requirements. Section 2 clarifies the scope further by giving an outline of the assessment steps in the licensing process. After a general outline of the assessment process in relation to the licensing of a nuclear power plant, the publication is divided into two parts. First, all safety issues are presented in the form of questions that have to be answered in order for the assessor to be confident of a safe design. The second part presents the same topics in tabulated form, listing the required documentation which the assessor has to consult and those international and national technical standards pertinent to the topics. An extensive reference list provides information on standards. 1 tab
Karuppiah, R.; Faldi, A.; Laurenzi, I.; Usadi, A.; Venkatesh, A.
2014-12-01
An increasing number of studies are focused on assessing the environmental footprint of different products and processes, especially using life cycle assessment (LCA). This work shows how combining statistical methods and Geographic Information Systems (GIS) with environmental analyses can help improve the quality of results and their interpretation. Most environmental assessments in the literature yield single numbers that characterize the environmental impact of a process or product - typically global or country averages, often unchanging in time. In this work, we show how statistical analysis and GIS can help address these limitations. For example, we demonstrate a method to separately quantify uncertainty and variability in the results of LCA models using a power generation case study. This is important for rigorous comparisons between the impacts of different processes. Another challenge is the lack of data, which can affect the rigor of LCAs. We have developed an approach to estimate the environmental impacts of incompletely characterized processes using predictive statistical models. This method is applied to estimate unreported coal power plant emissions in several world regions. There is also a general lack of spatio-temporal characterization of the results in environmental analyses. For instance, studies that focus on water usage do not put in context where and when water is withdrawn. Through the use of hydrological modeling combined with GIS, we quantify water stress on a regional and seasonal basis to understand water supply and demand risks for multiple users. Another example where it is important to consider the regional dependency of impacts is when characterizing how agricultural land occupation affects biodiversity in a region. We developed a data-driven methodology used in conjunction with GIS to determine if there is a statistically significant difference between the impacts of growing different crops on different species in various biomes of the world.
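The separation of uncertainty from variability mentioned above is often implemented as a nested (two-dimensional) Monte Carlo: an outer loop samples epistemic uncertainty about a parameter, and an inner loop samples unit-to-unit variability conditional on each outer draw. The distributions and numbers below are illustrative assumptions, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical life-cycle emission factor for a power plant fleet
# (kg CO2e / kWh). Variability: plant-to-plant spread (lognormal).
# Uncertainty: imperfect knowledge of the fleet median (normal on log scale).
N_UNCERT, N_VAR = 500, 1000
log_median = rng.normal(np.log(0.9), 0.05, size=N_UNCERT)  # epistemic draws
sigma_var = 0.25                                           # aleatory spread

# Inner dimension: plant-to-plant variability, conditional on each
# epistemic draw of the fleet median.
samples = rng.lognormal(log_median[:, None], sigma_var, size=(N_UNCERT, N_VAR))

# A variability band for each epistemic draw, then uncertainty about the band.
p5, p95 = np.percentile(samples, [5, 95], axis=1)
print(f"5th pct across plants:  {p5.mean():.3f} +/- {p5.std():.3f}")
print(f"95th pct across plants: {p95.mean():.3f} +/- {p95.std():.3f}")
```

Keeping the two dimensions distinct is what lets a comparison say whether two processes truly differ, or whether apparent differences are within the analyst's own uncertainty.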
Near and Far from Equilibrium Power-Law Statistics
Biro, Tamas S; Biro, Gabor; Shen, Ke Ming
2016-01-01
We analyze the connection between $p_T$ and multiplicity distributions in a statistical framework. We connect the Tsallis parameters, $T$ and $q$, to physical properties like the average energy per particle and the second scaled factorial moment, $F_2=\langle n(n-1) \rangle / {\langle n \rangle}^2$, measured in multiplicity distributions. Near- and far-from-equilibrium scenarios with master equations for the probability of having $n$ particles, $P_n$, are reviewed based on hadronization transition rates, $\mu_n$, from $n$ to $n+1$ particles.
The issue of statistical power for overall model fit in evaluating structural equation models
Richard HERMIDA
2015-06-01
Statistical power is an important concept for psychological research. However, examining the power of a structural equation model (SEM is rare in practice. This article provides an accessible review of the concept of statistical power for the Root Mean Square Error of Approximation (RMSEA index of overall model fit in structural equation modeling. By way of example, we examine the current state of power in the literature by reviewing studies in top Industrial-Organizational (I/O Psychology journals using SEMs. Results indicate that in many studies, power is very low, which implies acceptance of invalid models. Additionally, we examined methodological situations which may have an influence on statistical power of SEMs. Results showed that power varies significantly as a function of model type and whether or not the model is the main model for the study. Finally, results indicated that power is significantly related to model fit statistics used in evaluating SEMs. The results from this quantitative review imply that researchers should be more vigilant with respect to power in structural equation modeling. We therefore conclude by offering methodological best practices to increase confidence in the interpretation of structural equation modeling results with respect to statistical power issues.
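Power for the RMSEA test of close fit is conventionally computed from noncentral chi-square distributions, following MacCallum, Browne and Sugawara's approach. A sketch, with illustrative degrees of freedom and sample sizes (not values from the review):

```python
from scipy.stats import ncx2

def rmsea_power(df, n, eps0=0.05, eps_a=0.08, alpha=0.05):
    """Power of the test of close fit: H0 RMSEA = eps0 vs Ha RMSEA = eps_a."""
    ncp0 = (n - 1) * df * eps0**2         # noncentrality under H0
    ncp_a = (n - 1) * df * eps_a**2       # noncentrality under Ha
    crit = ncx2.ppf(1 - alpha, df, ncp0)  # critical chi-square under H0
    return ncx2.sf(crit, df, ncp_a)       # rejection probability under Ha

print(f"power at N=200,  df=50: {rmsea_power(50, 200):.3f}")
print(f"power at N=1000, df=50: {rmsea_power(50, 1000):.3f}")
```

The steep dependence on N and df is exactly why SEM studies with modest samples can end up, as the review finds, effectively unable to reject badly misspecified models.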
Statistical Power of Psychological Research: What Have We Gained in 20 Years?
Rossi, Joseph S.
1990-01-01
Calculated power for 6,155 statistical tests in 221 journal articles published in 1982 volumes of "Journal of Abnormal Psychology,""Journal of Consulting and Clinical Psychology," and "Journal of Personality and Social Psychology." Power to detect small, medium, and large effects was .17, .57, and .83, respectively. Concluded that power of…
Azuaje Francisco
2006-06-01
Background: The analysis of large-scale gene expression data is a fundamental approach to functional genomics and the identification of potential drug targets. Results derived from such studies cannot be trusted unless they are adequately designed and reported. The purpose of this study is to assess current practices in the reporting of experimental design and statistical analyses in gene expression-based studies. Methods: We reviewed hundreds of MEDLINE-indexed papers involving gene expression data analysis, published between 2003 and 2005. These papers were examined for their reporting of several factors, such as sample size, statistical power and software availability. Results: Among the examined papers, we concentrated on 293 papers consisting of applications and new methodologies. These papers did not report approaches to sample size and statistical power estimation. Explicit statements on data transformation and descriptions of the normalisation techniques applied prior to data analyses (e.g. classification) were not reported in 57 (37.5%) and 104 (68.4%) of the methodology papers, respectively. Among papers presenting biomedically relevant applications, 41 (29.1%) did not report on data normalisation and 83 (58.9%) did not describe the normalisation technique applied. Clustering-based analysis, the t-test and ANOVA represent the most widely applied techniques in microarray data analysis, but remarkably, only 5 (3.5%) of the application papers included statements or references to the assumption of variance homogeneity underlying the t-test and ANOVA. There is still a need to promote the reporting of the software packages applied and their availability. Conclusion: Recently published gene expression data analysis studies may lack key information required for properly assessing their design quality and potential impact. There is a need for more rigorous reporting of important experimental
Power-law distributions in economics: a nonextensive statistical approach
Queiros, S M D; Tsallis, C; Queiros, Silvio M. Duarte; Anteneodo, Celia; Tsallis, Constantino
2005-01-01
The cornerstone of Boltzmann-Gibbs ($BG$) statistical mechanics is the Boltzmann-Gibbs-Jaynes-Shannon entropy $S_{BG} \equiv -k\int dx f(x)\ln f(x)$, where $k$ is a positive constant and $f(x)$ a probability density function. This theory has exhibited, over more than a century, great success in the treatment of systems where short spatio-temporal correlations dominate. There are, however, anomalous natural and artificial systems that violate the basic requirements for its applicability. Different physical entropies, other than the standard one, appear to be necessary in order to satisfactorily deal with such anomalies. One such entropy is $S_q \equiv k (1-\int dx [f(x)]^q)/(1-q)$ (with $S_1=S_{BG}$), where the entropic index $q$ is a real parameter. It has been proposed as the basis for a generalization, referred to as {\it nonextensive statistical mechanics}, of the $BG$ theory. $S_q$ shares with $S_{BG}$ four remarkable properties, namely {\it concavity} ($\forall q>0$), {\it Lesche-stability} ($\for...
Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.
Statistical modeling and analysis of the influence of antenna polarization error on received power
Anonymous
2002-01-01
The problem of statistically modeling antenna polarization error is studied, and the statistical characteristics of the antenna's received power are analyzed. A novel Stokes-vector-based method is presented to describe the concept of antenna polarization purity, and a statistical model of the antenna's polarization error in the polarization domain is then built up. When an antenna with a uniformly distributed polarization error is illuminated by an arbitrarily polarized incident field, the probability density of the antenna's received power is derived analytically. Finally, a group of curves of the deviation and standard deviation of the received power is plotted numerically.
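The Stokes-vector description lends itself to a quick numerical sketch: for fully polarized fields, the received power fraction is (1 + s_ant · s_wave)/2, and a uniformly distributed tilt of the antenna's Stokes vector on the Poincaré sphere can stand in for the polarization error. The error model and bound below are illustrative assumptions, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(7)

# Nominal antenna: right-hand circular, normalized Stokes vector (0, 0, 1);
# the incident wave is co-polarized with the nominal antenna state.
s_wave = np.array([0.0, 0.0, 1.0])

# Polarization error: tilt the antenna's Stokes vector by a random angle,
# uniform in [0, eps_max], with uniform azimuth on the Poincare sphere.
eps_max = np.deg2rad(20.0)
n = 100_000
eps = eps_max * rng.random(n)
phi = 2 * np.pi * rng.random(n)
s_ant = np.column_stack([np.sin(eps) * np.cos(phi),
                         np.sin(eps) * np.sin(phi),
                         np.cos(eps)])

# Received power fraction per draw: (1 + s_ant . s_wave) / 2.
p_rx = 0.5 * (1.0 + s_ant @ s_wave)
print(f"mean power fraction: {p_rx.mean():.5f}")
print(f"min  power fraction: {p_rx.min():.5f}")
```

A histogram of `p_rx` approximates the received-power density that the paper derives in closed form.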
Estimating statistical power for open-enrollment group treatment trials.
Morgan-Lopez, Antonio A; Saavedra, Lissette M; Hien, Denise A; Fals-Stewart, William
2011-01-01
Modeling turnover in group membership has been identified as a key barrier contributing to a disconnect between the manner in which behavioral treatment is conducted (open-enrollment groups) and the designs of substance abuse treatment trials (closed-enrollment groups, individual therapy). Latent class pattern mixture models (LCPMMs) are emerging tools for modeling data from open-enrollment groups with membership turnover in recently proposed treatment trials. The current article illustrates an approach to conducting power analyses for open-enrollment designs based on the Monte Carlo simulation of LCPMM models using parameters derived from published data from a randomized controlled trial comparing Seeking Safety to a Community Care condition for women presenting with comorbid posttraumatic stress disorder and substance use disorders. The example addresses discrepancies between the analysis framework assumed in power analyses of many recently proposed open-enrollment trials and the proposed use of LCPMM for data analysis. PMID:20832971
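Monte Carlo power analysis of the kind described (though for a far simpler design than an LCPMM) amounts to simulating data under an assumed effect and counting how often the test rejects. A minimal two-group sketch using a z-test and the standard library:

```python
import random
import statistics
from statistics import NormalDist

def mc_power(n_per_group, effect_size, n_sims=2000, alpha=0.05, seed=1):
    """Estimate power by simulation: generate data under the assumed effect
    and count how often a two-sided two-sample z-test rejects the null."""
    rng = random.Random(seed)
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)
    rejections = 0
    for _ in range(n_sims):
        a = [rng.gauss(0.0, 1.0) for _ in range(n_per_group)]
        b = [rng.gauss(effect_size, 1.0) for _ in range(n_per_group)]
        se = (statistics.variance(a) / n_per_group
              + statistics.variance(b) / n_per_group) ** 0.5
        z = (statistics.mean(b) - statistics.mean(a)) / se
        rejections += abs(z) > z_crit
    return rejections / n_sims

# 64 per group at d = 0.5 should land near the conventional .80 target.
print(f"estimated power: {mc_power(64, 0.5):.3f}")
```

For open-enrollment designs, the data-generating step would be replaced by simulating group sessions with membership turnover and fitting the LCPMM to each replicate; the reject-and-count logic is unchanged.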
Ground assessment methods for nuclear power plant
It goes without saying that a nuclear power plant must be constructed on the most stable and safe ground, and a reliable assessment method is required for this purpose. The Ground Integrity Sub-committee of the Committee of Civil Engineering of Nuclear Power Plant started five working groups, whose purpose is to systematize the assessment procedures, including geological survey, ground examination and construction design. The tasks of the working groups are to establish a method for assessing fault activity, to standardize the rock classification method, to standardize the assessment and presentation of ground properties, to standardize test methods, and to establish the application standard for design and construction. Flow diagrams were established for the procedures of geological survey and for the investigation of fault activity and ground properties in areas where nuclear reactors and important outdoor equipment are scheduled to be constructed. Further, flow diagrams for applying the investigation results to plant design and construction, and for determining the liquefaction susceptibility of the ground, were also established. These systematized and standardized investigation methods are expected to yield reliable data for assessing construction sites of nuclear power plants and to lead to safe construction and operation in the future. In addition, carrying out these systematized and detailed preliminary investigations when determining the construction site of a nuclear power plant will contribute much toward gaining nation-wide understanding of and confidence in the project. (Ishimitsu, A.)
Garfield, Joan; delMas, Robert
2010-01-01
The Assessment Resource Tools for Improving Statistical Thinking (ARTIST) Web site was developed to provide high-quality assessment resources for faculty who teach statistics at the tertiary level, but the resources are also useful to statistics teachers at the secondary level. This article describes some of the numerous ARTIST resources and suggests…
Replication Unreliability in Psychology: Elusive Phenomena or “Elusive” Statistical Power?
Tressoldi, Patrizio E.
2012-01-01
The focus of this paper is to analyze whether the unreliability of results related to certain controversial psychological phenomena may be a consequence of their low statistical power. Under Null Hypothesis Significance Testing (NHST), still the most widely used statistical approach, unreliability derives from the failure to refute the null hypothesis, in particular when exact or quasi-exact replications of experiments are carried out. Taking as example the results of meta-analyses related t...
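The paper's thesis, that low power alone can make exact replications "fail", is easy to demonstrate by simulation. A minimal sketch, assuming a real but small effect (d = 0.3), a small sample, and a two-sample z-test standing in for whatever test a given study uses:

```python
import numpy as np

rng = np.random.default_rng(1)

def replication_success_rate(d, n_per_group, n_reps=5000):
    """Fraction of exact replications that reach p < .05 (two-sample
    z-test, unit variance). With a small true effect and a small sample,
    most replications 'fail' even though the effect is real."""
    se = np.sqrt(2.0 / n_per_group)
    z = np.abs(rng.normal(d, 1, (n_reps, n_per_group)).mean(axis=1)
               - rng.normal(0, 1, (n_reps, n_per_group)).mean(axis=1)) / se
    return float((z > 1.96).mean())

print(replication_success_rate(0.3, 20))   # low power: only ~0.16 succeed
```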
Statistical study of high energy radiation from rotation-powered pulsars
(no author listed)
2000-01-01
Based on our self-consistent outer gap model for high energy emission from rotation-powered pulsars, we study the statistical properties of X-ray and γ-ray emission from rotation-powered pulsars; other statistical properties related to γ-ray pulsars in our Galaxy and nearby galaxies (e.g. the diffuse γ-ray background and unidentified γ-ray point sources) are also considered.
For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g., Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2-emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
Violation of statistical isotropy and homogeneity in the 21-cm power spectrum
Shiraishi, Maresuke; Kamionkowski, Marc; Raccanelli, Alvise
2016-01-01
Most inflationary models predict primordial perturbations to be statistically isotropic and homogeneous. Cosmic-Microwave-Background (CMB) observations, however, indicate a possible departure from statistical isotropy in the form of a dipolar power modulation at large angular scales. Alternative models of inflation, beyond the simplest single-field slow-roll models, can generate a small power asymmetry, consistent with these observations. Observations of clustering of quasars show, however, agreement with statistical isotropy at much smaller angular scales. Here we propose to use off-diagonal components of the angular power spectrum of the 21-cm fluctuations during the dark ages to test this power asymmetry. We forecast results for the planned SKA radio array, a future radio array, and the cosmic-variance-limited case as a theoretical proof of principle. Our results show that the 21-cm-line power spectrum will enable access to information at very small scales and at different redshift slices, thus improving u...
SIESE - trimestrial bulletin - Synthesis 1995. Electric power summary statistics for Brazil
This bulletin presents the electric power summary statistics, which cover the performance of the power system for the whole of the utilities in 1995. It offers tables with revised data concerning the last two years based on updated information supplied by both the electric utilities and the SIESE's responsibility centers. 6 figs., 36 tabs
The Statistical Power of the Cluster Randomized Block Design with Matched Pairs--A Simulation Study
Dong, Nianbo; Lipsey, Mark
2010-01-01
This study uses simulation techniques to examine the statistical power of the group- randomized design and the matched-pair (MP) randomized block design under various parameter combinations. Both nearest neighbor matching and random matching are used for the MP design. The power of each design for any parameter combination was calculated from…
Jones, Allan; Sommerlund, Bo
2007-01-01
The uses of null hypothesis significance testing (NHST) and statistical power analysis within psychological research are critically discussed. The article looks at the problems of relying solely on NHST when dealing with small and large sample sizes. The use of power analysis in estimating the...
On the power for linkage detection using a test based on scan statistics.
Hernández, Sonia; Siegmund, David O; de Gunst, Mathisca
2005-04-01
We analyze some aspects of scan statistics, which have been proposed to help for the detection of weak signals in genetic linkage analysis. We derive approximate expressions for the power of a test based on moving averages of the identity by descent allele sharing proportions for pairs of relatives at several contiguous markers. We confirm these approximate formulae by simulation. The results show that when there is a single trait-locus on a chromosome, the test based on the scan statistic is slightly less powerful than that based on the customary allele sharing statistic. On the other hand, if two genes having a moderate effect on a trait lie close to each other on the same chromosome, scan statistics improve power to detect linkage. PMID:15772104
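A moving-average scan statistic of the kind described can be sketched in a few lines. The sharing values, window length, and the location of the excess-sharing region below are hypothetical, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

def scan_statistic(sharing, window):
    """Scan statistic: the maximum moving average of per-marker IBD
    allele-sharing proportions over `window` contiguous markers."""
    kernel = np.ones(window) / window
    return np.convolve(sharing, kernel, mode="valid").max()

# Illustrative data: 100 markers with null sharing ~0.5, and excess
# sharing (~0.65) at markers 40-49 mimicking a linkage signal.
sharing = rng.normal(0.5, 0.05, 100)
sharing[40:50] += 0.15
print(round(scan_statistic(sharing, 10), 2))   # peak near the signal region
```

In practice the observed maximum would be compared against its null distribution (e.g. by simulation) rather than read off directly.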
Ten-year statistics of the electric power supply. Status and tendencies
The ten-year statistics of the electric power supply in Denmark for 1992-2001 presents in tables and figures the trend of the electric power supply sector during the last ten years. The tables and figures present information on total energy consumption, combined heat and power generation, fuel consumption and the environment, the technical systems, economy and pricing, organization of the electricity supply, and information on electricity prices and taxes for households and industry in various countries. (LN)
On detection and assessment of statistical significance of Genomic Islands
Chaudhuri Probal
2008-04-01
Background: Many of the available methods for detecting Genomic Islands (GIs) in prokaryotic genomes use markers such as transposons, proximal tRNAs, flanking repeats etc., or they use other supervised techniques requiring training datasets. Most of these methods are primarily based on the biases in GC content or codon and amino acid usage of the islands. However, these methods either do not use any formal statistical test of significance, or use statistical tests for which the critical values and the P-values are not adequately justified. We propose a method which is unsupervised in nature and uses Monte-Carlo statistical tests based on randomly selected segments of a chromosome. Such tests are supported by precise statistical distribution theory, and consequently, the resulting P-values are quite reliable for making the decision. Results: Our algorithm (named Design-Island, an acronym for Detection of Statistically Significant Genomic Island) runs in two phases. Some 'putative GIs' are identified in the first phase, and those are refined into smaller segments containing horizontally acquired genes in the refinement phase. This method is applied to the Salmonella typhi CT18 genome, leading to the discovery of several new pathogenicity, antibiotic resistance and metabolic islands that were missed by earlier methods. Many of these islands contain mobile genetic elements like phage-mediated genes, transposons, integrase and IS elements, confirming their horizontal acquisition. Conclusion: The proposed method is based on statistical tests supported by precise distribution theory and reliable P-values, along with a technique for visualizing statistically significant islands. The performance of our method is better than that of many other well known methods in terms of sensitivity and accuracy, and in terms of specificity it is comparable to other methods.
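The core Monte Carlo device (compare a candidate segment's composition against randomly selected segments of the same length) can be sketched as follows. This is a bare-bones stand-in for Design-Island, using GC content on a toy 0/1 genome; all sizes and proportions are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

def monte_carlo_pvalue(genome, start, end, n_rand=999):
    """Monte Carlo test in the spirit of Design-Island: compare the GC
    content of a candidate segment against segments of the same length
    drawn at random from the chromosome."""
    seg_len = end - start
    obs = genome[start:end].mean()
    starts = rng.integers(0, len(genome) - seg_len, n_rand)
    null = np.array([genome[s:s + seg_len].mean() for s in starts])
    # one-sided p-value with the +1 correction for Monte Carlo tests
    return (1 + (null >= obs).sum()) / (n_rand + 1)

# Toy chromosome: 1 per G/C base, 0 per A/T base, with a GC-rich island
genome = rng.random(50_000) < 0.45                  # background GC ~ 45%
genome[20_000:22_000] = rng.random(2_000) < 0.70    # island GC ~ 70%
genome = genome.astype(float)
print(monte_carlo_pvalue(genome, 20_000, 22_000))   # small p-value
```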
Nuclear power plant security assessment technical manual.
O' Connor, Sharon L.; Whitehead, Donnie Wayne; Potter, Claude S., III
2007-09-01
This report (Nuclear Power Plant Security Assessment Technical Manual) is a revision to NUREG/CR-1345 (Nuclear Power Plant Design Concepts for Sabotage Protection) that was published in January 1981. It provides conceptual and specific technical guidance for U.S. Nuclear Regulatory Commission nuclear power plant design certification and combined operating license applicants as they: (1) develop the layout of a facility (i.e., how buildings are arranged on the site property and how they are arranged internally) to enhance protection against sabotage and facilitate the use of physical security features; (2) design the physical protection system to be used at the facility; and (3) analyze the effectiveness of the PPS against the design basis threat. It should be used as a technical manual in conjunction with the 'Nuclear Power Plant Security Assessment Format and Content Guide'. The opportunity to optimize physical protection in the design of a nuclear power plant is obtained when an applicant utilizes both documents when performing a security assessment. This document provides a set of best practices that incorporates knowledge gained from more than 30 years of physical protection system design and evaluation activities at Sandia National Laboratories and insights derived from U.S. Nuclear Regulatory Commission technical staff into a manual that describes a development and analysis process of physical protection systems suitable for future nuclear power plants. In addition, selected security system technologies that may be used in a physical protection system are discussed. The scope of this document is limited to the identification of a set of best practices associated with the design and evaluation of physical security at future nuclear power plants in general. As such, it does not provide specific recommendations for the design and evaluation of physical security for any specific reactor design. These best practices should be applicable to the design and
Statistical-Based Joint Power Control for Wireless Ad Hoc CDMA Networks
ZHANGShu; RONGMongtian; CHENBo
2005-01-01
The current power control algorithm for CDMA-based ad hoc networks involves SIR and interference measurement based on historical information. However, an important statistical property of traffic in today's and future networks is burstiness. As a consequence, the interference at a given receiving node may fluctuate dramatically, so power control converges slowly and performance degrades. This paper presents a joint power control model: for a given receiving node, all transmitting nodes assigned to the same time slot adjust their transmitter power based on current information, taking into account the power adjustments of the other transmitting nodes. Based on this joint power control model, the paper proposes a statistics-based power control algorithm through which the interference is estimated more accurately. Simulation results indicate that the proposed power control algorithm outperforms the old one.
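For contrast with the measurement-driven scheme discussed here, a classic distributed SIR-balancing iteration (in the spirit of the Foschini-Miljanic update, not the paper's joint algorithm) can be sketched; the gain matrix, noise level, and SIR target are hypothetical.

```python
import numpy as np

# Each transmitter repeatedly scales its power by target_SIR / measured_SIR.
G = np.array([[1.0, 0.1, 0.2],     # G[i, j]: link gain from transmitter j
              [0.2, 1.0, 0.1],     # to receiver i (hypothetical values)
              [0.1, 0.3, 1.0]])
noise, target = 0.01, 2.0
p = np.ones(3)                      # initial transmit powers

for _ in range(100):
    signal = np.diag(G) * p
    sir = signal / (G @ p - signal + noise)
    p = p * target / sir            # each node's local update

signal = np.diag(G) * p
sir = signal / (G @ p - signal + noise)
print(np.round(sir, 3))             # every link meets the target SIR
```

The iteration converges here because the target times the spectral radius of the normalized interference matrix is below one; at the fixed point every link attains the target SIR exactly.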
Development and Assessment of a Preliminary Randomization-Based Introductory Statistics Curriculum
Tintle, Nathan; VanderStoep, Jill; Holmes, Vicki-Lynn; Quisenberry, Brooke; Swanson, Todd
2011-01-01
The algebra-based introductory statistics course is the most popular undergraduate course in statistics. While there is a general consensus for the content of the curriculum, the recent Guidelines for Assessment and Instruction in Statistics Education (GAISE) have challenged the pedagogy of this course. Additionally, some arguments have been made…
Assessing Knowledge Structures in a Constructive Statistical Learning Environment
P.P.J.L. Verkoeijen (Peter); Tj. Imbos; M.W.J. van de Wiel (Margje); M.P.F. Berger; H.G. Schmidt (Henk)
2002-01-01
In this report, the method of free recall is put forward as a tool to evaluate a prototypical statistical learning environment. A number of students from the faculty of Health Sciences, Maastricht University, the Netherlands, were required to write down whatever they could remember of a
Computer-aided assessment in statistics: the CAMPUS project
Hunt, Neville
1998-01-01
The relentless drive for 'efficiency' in higher education, and the consequent increase in workloads, has given university teachers a compelling incentive to investigate alternative forms of assessment. Some forms of assessment with a clear educational value can no longer be entertained because of the burden placed on the teacher. An added concern is plagiarism, which anecdotal evidence would suggest is on the increase yet which is difficult to detect in large modules with more than one assess...
Accuracy of Estimates and Statistical Power for Testing Mediation in Latent Growth Curve Modeling
Cheong, JeeWon
2016-01-01
The latent growth curve modeling (LGCM) approach has been increasingly utilized to investigate longitudinal mediation. However, little is known about the accuracy of the estimates and statistical power when mediation is evaluated in the LGCM framework. A simulation study was conducted to address these issues under various conditions including sample size, effect size of mediated effect, number of measurement occasions, and R2 of measured variables. In general, the results showed that relatively large samples were needed to accurately estimate the mediated effects and to have adequate statistical power, when testing mediation in the LGCM framework. Guidelines for designing studies to examine longitudinal mediation and ways to improve the accuracy of the estimates and statistical power were discussed.
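The qualitative finding, that mediation tests need relatively large samples for adequate power, can be illustrated with a much simpler cross-sectional mediation model and a Sobel test. This is a stand-in for the LGCM simulation; the path coefficients and sample sizes are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(11)

def ols_slope(x, y):
    """Slope and its standard error for the simple regression y ~ x."""
    x = x - x.mean()
    y = y - y.mean()
    b = (x @ y) / (x @ x)
    resid = y - b * x
    se = np.sqrt(resid @ resid / (len(x) - 2) / (x @ x))
    return b, se

def mediation_power(a, b, n, n_sim=2000):
    """Power of the Sobel z-test for the indirect effect a*b in a simple
    mediation chain X -> M -> Y with unit-variance normal errors."""
    hits = 0
    for _ in range(n_sim):
        x = rng.normal(size=n)
        m = a * x + rng.normal(size=n)
        y = b * m + rng.normal(size=n)
        ah, se_a = ols_slope(x, m)
        bh, se_b = ols_slope(m, y)
        z = ah * bh / np.sqrt(ah**2 * se_b**2 + bh**2 * se_a**2)
        if abs(z) > 1.96:
            hits += 1
    return hits / n_sim

for n in (50, 100, 200):
    print(n, mediation_power(0.3, 0.3, n))   # power grows slowly with n
```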
Using DEWIS and R for Multi-Staged Statistics e-Assessments
Gwynllyw, D. Rhys; Weir, Iain S.; Henderson, Karen L.
2016-01-01
We demonstrate how the DEWIS e-Assessment system may use embedded R code to facilitate the assessment of students' ability to perform involved statistical analyses. The R code has been written to emulate SPSS output and thus the statistical results for each bespoke data set can be generated efficiently and accurately using standard R routines.…
Geotechnical assessments of upgrading power transmission lines
Smith, Andrew [Coffey Geotechnics Ltd., Harrogate (United Kingdom)
2012-11-01
One of the consequences of increasing demand for energy is a corresponding requirement for increased energy distribution. This trend is likely to be magnified by the current tendency to generate power in locations remote from centres of population. New power transmission routes are expensive and awkward to develop, and there are therefore benefits to be gained by upgrading existing routes. However, this in turn raises problems of a different nature. The re-use of any structure must necessarily imply the acceptance of unknowns. The upgrading of transmission lines is no exception to this, particularly when assessing foundations, which in their nature are not visible. A risk-based approach is therefore used. This paper describes some of the geotechnical aspects of the assessment of electric power transmission lines for upgrading. It briefly describes the background, then discusses some of the problems encountered and the methods used to address them. These methods are based mainly on information obtained from desk studies and walkover surveys, with a limited amount of intrusive investigation. (orig.)
Statistical power analysis: a simple and general model for traditional and modern hypothesis tests
Murphy, Kevin R; Wolach, Allen
2014-01-01
Noted for its accessible approach, this text applies the latest approaches of power analysis to both null hypothesis and minimum-effect testing using the same basic unified model. Through the use of a few simple procedures and examples, the authors show readers with little expertise in statistical analysis how to obtain the values needed to carry out the power analysis for their research. Illustrations of how these analyses work and how they can be used to choose the appropriate criterion for defining statistically significant outcomes are sprinkled throughout. The book presents a simple and g
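The kind of calculation such a power analysis produces can be sketched with the normal approximation to the noncentral t distribution. This is a textbook-style approximation, not code from the book.

```python
from math import erf, sqrt

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1 + erf(x / sqrt(2)))

def approx_power(d, n_per_group, z_crit=1.96):
    """Approximate power of a two-sided two-sample test for effect size d
    (Cohen's d), via the normal approximation to the noncentral t:
    power = Phi(ncp - z) + Phi(-ncp - z), where ncp = d * sqrt(n/2)."""
    ncp = d * sqrt(n_per_group / 2)
    return phi(ncp - z_crit) + phi(-ncp - z_crit)

print(round(approx_power(0.5, 64), 3))   # 0.807 (exact noncentral t: ~0.801)
```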
Statistical and RBF NN models : providing forecasts and risk assessment
Marček, Milan
2009-01-01
Forecast accuracy of economic and financial processes is a popular measure for quantifying the risk in decision making. In this paper, we develop forecasting models based on statistical (stochastic) methods, sometimes called hard computing, and on a soft method using granular computing. We consider the accuracy of forecasting models as a measure for risk evaluation. It is found that the risk estimation process based on soft methods is simplified and less critical to the question w...
JRC Statistical Assessment of the 2015 ICT Development Index
SAISANA Michaela; DOMINGUEZ TORREIRO MARCOS
2015-01-01
Since 2009, the International Telecommunication Union (ITU) has been publishing its annual ICT Development Index (IDI), which benchmarks countries’ performance with regard to ICT infrastructure, use and skills. The JRC analysis, conducted at ITU’s invitation, suggests that the conceptualized three-level structure of the 2015 IDI is statistically sound in terms of coherence and balance, with the overall index as well as the three sub-indices – on ICT access, use and skills – being driven ...
Assessing the South African Brain Drain, a Statistical Comparison
Jean-Baptiste Meyer; Mercy Brown; David Kaplan
2000-01-01
For several decades the analysis of the so-called brain drain has been hampered by measurement problems. It is now recognised that the official figures significantly underestimate the extent of the brain drain phenomenon and its increase since the political changes in the mid-1990s. This paper, using data from various reliable sources, provides new statistical evidence on the size of the brain drain from South Africa. It compares two methods used to arrive at a more realistic picture of the ...
Climate change assessment for Mediterranean agricultural areas by statistical downscaling
Palatella, L.; Miglietta, M. M.; Paradisi, P.; Lionello, P.
2010-01-01
In this paper we produce projections of seasonal precipitation for four Mediterranean areas: Apulia region (Italy), Ebro river basin (Spain), Po valley (Italy) and Antalya province (Turkey). We performed the statistical downscaling using Canonical Correlation Analysis (CCA) in two versions: in one case a Principal Component Analysis (PCA) filter is applied only to the predictor, and in the other to both predictor and predictand. After performing a validation test, CCA after PCA filter on both ...
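The "PCA filter then CCA" pipeline can be sketched with plain SVD-based linear algebra. The toy predictor/predictand fields below are synthetic and share one common mode; this is an illustration of the technique, not the paper's data or exact procedure.

```python
import numpy as np

rng = np.random.default_rng(5)

def pca_filter(X, k):
    """Keep the first k principal components (the 'PCA filter' step)."""
    Xc = X - X.mean(0)
    _u, _s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

def cca_correlations(X, Y):
    """Canonical correlations between the column spaces of X and Y,
    computed as singular values of the product of orthonormal bases."""
    def basis(A):
        A = A - A.mean(0)
        U, _s, _vt = np.linalg.svd(A, full_matrices=False)
        return U
    return np.linalg.svd(basis(X).T @ basis(Y), compute_uv=False)

# Toy example: large-scale predictor field X and local predictand Y
# sharing one common mode plus noise (hypothetical synthetic data).
n, t = 200, 30
signal = rng.normal(size=(n, 1))
X = signal @ rng.normal(size=(1, t)) + 0.3 * rng.normal(size=(n, t))
Y = signal @ rng.normal(size=(1, 8)) + 0.3 * rng.normal(size=(n, 8))
rho = cca_correlations(pca_filter(X, 3), pca_filter(Y, 3))
print(np.round(rho, 2))   # leading canonical correlation near 1
```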
Statistical tests, P values, confidence intervals, and power: a guide to misinterpretations.
Greenland, Sander; Senn, Stephen J; Rothman, Kenneth J; Carlin, John B; Poole, Charles; Goodman, Steven N; Altman, Douglas G
2016-04-01
Misinterpretation and abuse of statistical tests, confidence intervals, and statistical power have been decried for decades, yet remain rampant. A key problem is that there are no interpretations of these concepts that are at once simple, intuitive, correct, and foolproof. Instead, correct use and interpretation of these statistics requires an attention to detail which seems to tax the patience of working scientists. This high cognitive demand has led to an epidemic of shortcut definitions and interpretations that are simply wrong, sometimes disastrously so-and yet these misinterpretations dominate much of the scientific literature. In light of this problem, we provide definitions and a discussion of basic statistics that are more general and critical than typically found in traditional introductory expositions. Our goal is to provide a resource for instructors, researchers, and consumers of statistics whose knowledge of statistical theory and technique may be limited but who wish to avoid and spot misinterpretations. We emphasize how violation of often unstated analysis protocols (such as selecting analyses for presentation based on the P values they produce) can lead to small P values even if the declared test hypothesis is correct, and can lead to large P values even if that hypothesis is incorrect. We then provide an explanatory list of 25 misinterpretations of P values, confidence intervals, and power. We conclude with guidelines for improving statistical interpretation and reporting. PMID:27209009
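One misinterpretation the authors highlight, that selecting analyses for presentation by the P values they produce yields small P values even under a true null, can be demonstrated directly; the simulation settings below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2016)

def false_positive_rate(n_tests, n_sim=4000, n=30):
    """Chance of obtaining at least one p < .05 when the null is true in
    every one of n_tests analyses (here: one-sample z-tests on pure noise).
    Reporting only the best-looking analysis makes small P values routine."""
    z = np.abs(rng.normal(size=(n_sim, n_tests, n)).mean(-1) * np.sqrt(n))
    return float((z > 1.96).any(-1).mean())

for k in (1, 5, 20):
    print(k, round(false_positive_rate(k), 2))   # rises from ~0.05 toward ~0.64
```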
Condition assessment of electrical power plant
The large investments associated with main equipment items in electric power plants, both in terms of acquisition and conservation and in aspects such as safety, make it increasingly necessary and profitable to implement techniques for the monitoring and predictive assessment of the state of these equipment items. This paper highlights the benefits of applying such a programme to large electric equipment, describing in detail the technologies available for the evaluation and followup of the state of insulation, and the mechanical characteristics of large transformer windings. There is also a description of real cases where these technologies are used, showing the results obtained on equipment items which are in good condition and those which are damaged. The paper finally addresses actions resulting from these evaluation programmes, and applicable conclusions based on the large number of inspection techniques and tools that power plants can use nowadays to ensure continuous, reliable operation with optimised performance and reduced operating costs. (Author)
Probabilistic assessment of fatigue life including statistical uncertainties in the S-N curve
A probabilistic framework is set up to assess the fatigue life of components of nuclear power plants. It intends to incorporate all kinds of uncertainties such as those appearing in the specimen fatigue life, design sub-factor, mechanical model and applied loading. This paper details the first step, which corresponds to the statistical treatment of the fatigue specimen test data. The specimen fatigue life at stress amplitude S is represented by a lognormal random variable whose mean and standard deviation depend on S. This characterization is then used to compute the random fatigue life of a component submitted to a single kind of cycles. Precisely the mean and coefficient of variation of this quantity are studied, as well as the reliability associated with the (deterministic) design value. (author)
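The lognormal specimen-life characterization can be sketched as follows. The S-N parameters and the design sub-factor below are invented for illustration, not the paper's fitted values.

```python
import numpy as np

rng = np.random.default_rng(9)

# Hypothetical S-N characterization: log10(N) at stress amplitude S is
# normal with mean mu(S) and standard deviation sigma(S).
def mu_log_life(S):
    return 12.0 - 3.0 * np.log10(S)      # Basquin-type trend (illustrative)

def sigma_log_life(S):
    return 0.20                           # scatter of the specimen data

def design_reliability(S, design_factor=20.0, n_sim=200_000):
    """Probability that a random specimen life exceeds the deterministic
    design life N_design = N_median / design_factor."""
    log_n = rng.normal(mu_log_life(S), sigma_log_life(S), n_sim)
    n_design = 10 ** mu_log_life(S) / design_factor
    return float((10 ** log_n > n_design).mean())

print(design_reliability(200.0))   # close to 1 for a factor-of-20 margin
```

Analytically the reliability is Phi(log10(design_factor) / sigma), so a factor of 20 with sigma = 0.2 sits about 6.5 standard deviations into the safe side.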
Narayan, Manjari; Allen, Genevera I
2016-01-01
Many complex brain disorders, such as autism spectrum disorders, exhibit a wide range of symptoms and disability. To understand how brain communication is impaired in such conditions, functional connectivity studies seek to understand individual differences in brain network structure in terms of covariates that measure symptom severity. In practice, however, functional connectivity is not observed but estimated from complex and noisy neural activity measurements. Imperfect subject network estimates can compromise subsequent efforts to detect covariate effects on network structure. We address this problem in the case of Gaussian graphical models of functional connectivity, by proposing novel two-level models that treat both subject level networks and population level covariate effects as unknown parameters. To account for imperfectly estimated subject level networks when fitting these models, we propose two related approaches: R^2, based on resampling and random effects test statistics, and R^3, which additionally employs random adaptive penalization. Simulation studies using realistic graph structures reveal that R^2 and R^3 have superior statistical power to detect covariate effects compared to existing approaches, particularly when the number of within subject observations is comparable to the size of subject networks. Using our novel models and methods to study parts of the ABIDE dataset, we find evidence of hypoconnectivity associated with symptom severity in autism spectrum disorders, in frontoparietal and limbic systems as well as in anterior and posterior cingulate cortices. PMID:27147940
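The resampling idea (though not the R^2/R^3 machinery itself) can be sketched with a permutation test for a covariate effect on a single estimated edge; the data below are synthetic and the effect size is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(21)

def permutation_pvalue(edge_strength, severity, n_perm=4999):
    """Resampling-style test: is the correlation between a network edge's
    estimated strength and a symptom-severity covariate larger than
    chance? Permute the covariate to build the null distribution."""
    obs = np.corrcoef(edge_strength, severity)[0, 1]
    null = np.array([
        np.corrcoef(edge_strength, rng.permutation(severity))[0, 1]
        for _ in range(n_perm)
    ])
    # two-sided p-value with the +1 Monte Carlo correction
    return (1 + (np.abs(null) >= abs(obs)).sum()) / (n_perm + 1)

# Hypothetical subject-level edge estimates with a real covariate effect
severity = rng.normal(size=80)
edge_strength = 0.8 * severity + rng.normal(scale=1.0, size=80)
print(permutation_pvalue(edge_strength, severity))   # small p-value
```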
The power of alternative assessments (AAs)
张千茜
2013-01-01
This article starts by discussing the potential disadvantages of traditional assessment for young English as a Second Language (ESL) learners within the American public school education system. In response to such disadvantages, researchers' call for the implementation of alternative assessments (AAs) is introduced, along with the various benefits of AAs. However, the current mainstream education policy in the US, namely the No Child Left Behind (NCLB) policy, is still largely based on traditional ways of testing, making policy-oriented implementation of AAs on a large scale remarkably difficult. After careful analysis, the author points out several implications concerning how, under the existing NCLB policy, practitioners can effectively accommodate young ESL learners by applying the power of AAs.
Kunio Nakamura
2014-01-01
Gray matter atrophy provides important insights into neurodegeneration in multiple sclerosis (MS) and can be used as a marker of neuroprotection in clinical trials. Jacobian integration is a method for measuring volume change that uses integration of the local Jacobian determinants of the nonlinear deformation field registering two images, and is a promising tool for measuring gray matter atrophy. Our main objective was to compare the statistical power of the Jacobian integration method to commonly used methods in terms of the sample size required to detect a treatment effect on gray matter atrophy. We used multi-center longitudinal data from relapsing-remitting MS patients and evaluated combinations of cross-sectional and longitudinal pre-processing with SIENAX/FSL, SPM, and FreeSurfer, as well as the Jacobian integration method. The Jacobian integration method outperformed these other commonly used methods, reducing the required sample size by a factor of 4-5. The results demonstrate the advantage of using the Jacobian integration method to assess neuroprotection in MS clinical trials.
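The sample-size arithmetic behind such method comparisons can be sketched with the standard two-arm normal-approximation formula; the numbers below are illustrative assumptions, not values from the study:

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_arm(delta, sigma, alpha=0.05, power=0.80):
    """Two-sample normal-approximation sample size per arm for detecting
    a mean difference `delta` when the outcome standard deviation is `sigma`."""
    z = NormalDist()
    z_a = z.inv_cdf(1 - alpha / 2)   # two-sided significance quantile
    z_b = z.inv_cdf(power)           # desired-power quantile
    return ceil(2 * ((z_a + z_b) * sigma / delta) ** 2)

# Halving measurement noise cuts the required n by roughly 4x, which is
# the mechanism behind the paper's reported 4-5x sample-size reduction.
print(sample_size_per_arm(delta=0.5, sigma=1.0))   # 63 subjects per arm
print(sample_size_per_arm(delta=0.5, sigma=0.5))   # 16 subjects per arm
```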
Socol, Yehoshua; Dobrzyński, Ludwik
2015-01-01
The atomic-bomb survivors' life-span study (LSS) is often claimed to support the linear no-threshold hypothesis (LNTH) of radiation carcinogenesis. This paper shows that this claim is baseless. The LSS data are equally well or better described by an S-shaped dependence on radiation exposure with a threshold of about 0.3 Sievert (Sv) and a saturation level at about 1.5 Sv. A Monte-Carlo simulation of possible LSS outcomes demonstrates that, given the weak statistical power, the LSS cannot provide support for the LNTH. Even if the LNTH is used at low doses and dose rates, its estimate of excess cancer mortality should be communicated as 2.5% per Sv, i.e., an increase of cancer mortality from about 20% spontaneous mortality to about 22.5% per Sv, which is about half of the usually cited value. The impact of the "neutron discrepancy problem" (the apparent difference between the calculated and measured values of neutron flux in Hiroshima) was studied and found to be marginal. A major revision of the radiation risk assessment paradigm is required. PMID:26673526
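The contrast between the two dose-response descriptions can be sketched as simple functions. Only the 0.3 Sv threshold, 1.5 Sv saturation, and 2.5%/Sv slope come from the abstract; the piecewise-linear shape and the plateau value `max_excess` are illustrative assumptions:

```python
def lnt_excess(dose_sv, risk_per_sv=0.025):
    """Linear no-threshold: excess cancer mortality proportional to dose."""
    return risk_per_sv * dose_sv

def s_shaped_excess(dose_sv, threshold=0.3, saturation=1.5, max_excess=0.05):
    """Piecewise-linear stand-in for an S-shaped response: no excess below
    the threshold, a linear rise, then a plateau above saturation.
    `max_excess` is an assumed plateau, not a value from the paper."""
    if dose_sv <= threshold:
        return 0.0
    if dose_sv >= saturation:
        return max_excess
    return max_excess * (dose_sv - threshold) / (saturation - threshold)
```

Below 0.3 Sv the two models disagree qualitatively: LNT predicts a small but nonzero excess, while the threshold model predicts none.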
Statistically Based Approach to Broadband Liner Design and Assessment
Nark, Douglas M. (Inventor); Jones, Michael G. (Inventor)
2016-01-01
A broadband liner design optimization includes utilizing in-duct attenuation predictions with a statistical fan source model to obtain optimum impedance spectra over a number of flow conditions for one or more liner locations in a bypass duct. The predicted optimum impedance information is then used with acoustic liner modeling tools to design liners having impedance spectra that most closely match the predicted optimum values. Design selection is based on an acceptance criterion that provides the ability to apply increasing weighting to specific frequencies and/or operating conditions. One or more broadband design approaches are utilized to produce a broadband liner that targets a full range of frequencies and operating conditions.
In vivo Comet assay – statistical analysis and power calculations of mice testicular cells
Hansen, Merete Kjær; Sharma, Anoop Kumar; Dybdahl, Marianne;
2014-01-01
The in vivo Comet assay is a sensitive method for evaluating DNA damage. A recurrent concern is how to analyze the data appropriately and efficiently. A popular approach is to summarize the raw data into a summary statistic prior to the statistical analysis. However, consensus on which summary … statistic to use has yet to be reached. Another important consideration concerns the assessment of proper sample sizes in the design of Comet assay studies. This study aims to identify a statistic suitably summarizing the % tail DNA of mice testicular samples in Comet assay studies. A second aim is to … -97-5, CAS no. 85-28-9, CAS no. 13674-87-8, CAS no. 43100-38-5 and CAS no. 60965-26-6. Testicular cells were examined using the alkaline version of the Comet assay and the DNA damage was quantified as % tail DNA using a fully automatic scoring system. From the raw data 23 summary statistics were examined. A …
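The kind of per-sample summary the study compares can be sketched in a few lines; the paper examined 23 candidate statistics, while the three below and the example % tail DNA scores are illustrative only:

```python
from statistics import mean, median

def summary_statistics(tail_dna):
    """A few candidate summaries of per-cell % tail DNA scores from one
    animal/slide (illustrative; the study compared 23 such statistics)."""
    xs = sorted(tail_dna)
    n = len(xs)
    k = max(1, n // 10)                     # trim 10% from each tail
    return {
        "mean": mean(xs),
        "median": median(xs),
        "trimmed_mean": mean(xs[k:n - k]),  # robust to extreme comets
    }

# One heavily damaged "comet" inflates the mean but not the robust summaries.
cells = [1.2, 2.0, 2.4, 3.1, 3.5, 4.0, 4.8, 5.5, 6.9, 48.0]
print(summary_statistics(cells))
```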
A Statistical Model for Uplink Intercell Interference with Power Adaptation and Greedy Scheduling
Tabassum, Hina
2012-10-03
This paper deals with the statistical modeling of uplink inter-cell interference (ICI) considering greedy scheduling with power adaptation based on channel conditions. The derived model is implicitly generalized for any kind of shadowing and fading environments. More precisely, we develop a generic model for the distribution of ICI based on the locations of the allocated users and their transmit powers. The derived model is utilized to evaluate important network performance metrics such as ergodic capacity, average fairness and average power preservation numerically. Monte-Carlo simulation details are included to support the analysis and show the accuracy of the derived expressions. Consistent with the literature, we show that greedy scheduling with power adaptation reduces the ICI and the average power consumption of users, and enhances the average fairness among users, compared to the case without power adaptation. © 2012 IEEE.
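The qualitative claim that greedy scheduling with channel-based power adaptation lowers average transmit power can be illustrated with a toy Monte-Carlo sketch. The exponential (Rayleigh-fading) gains, the channel-inversion rule, and all parameter values below are assumptions for illustration, not the paper's model:

```python
import random

random.seed(1)

def simulate_uplink(n_users=8, n_slots=2000, p_max=1.0, target=0.1):
    """Toy simulation: each slot, greedy scheduling picks the user with the
    best channel gain; with power adaptation that user inverts its channel
    to hit a target received power, otherwise it transmits at full power."""
    tx_fixed, tx_adapted = [], []
    for _ in range(n_slots):
        gains = [random.expovariate(1.0) for _ in range(n_users)]
        g = max(gains)                             # greedy: best channel wins
        tx_fixed.append(p_max)                     # no adaptation
        tx_adapted.append(min(p_max, target / g))  # channel inversion, capped
    return sum(tx_fixed) / n_slots, sum(tx_adapted) / n_slots

fixed, adapted = simulate_uplink()
print(fixed, adapted)  # adaptation cuts average transmit power (hence ICI)
```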
Computer-aided assessment in statistics: the CAMPUS project
Neville Hunt
1998-12-01
The relentless drive for 'efficiency' in higher education, and the consequent increase in workloads, has given university teachers a compelling incentive to investigate alternative forms of assessment. Some forms of assessment with a clear educational value can no longer be entertained because of the burden placed on the teacher. An added concern is plagiarism, which anecdotal evidence suggests is on the increase, yet which is difficult to detect in large modules with more than one assessor. While computer-aided assessment (CAA) has an enthusiastic following, it is not clear to many teachers that it either reduces workloads or reduces the risk of cheating. In an ideal world, most teachers would prefer to give individual attention and personal feedback to each student when marking their work. In this sense CAA must be seen as second best and will therefore be used only if it is seen to offer significant benefits in terms of reduced workloads or increased validity.
Accuracy of Estimates and Statistical Power for Testing Mediation in Latent Growth Curve Modeling
Cheong, JeeWon
2011-01-01
The latent growth curve modeling (LGCM) approach has been increasingly utilized to investigate longitudinal mediation. However, little is known about the accuracy of the estimates and statistical power when mediation is evaluated in the LGCM framework. A simulation study was conducted to address these issues under various conditions including…
Using Classroom Assessment Techniques in an Introductory Statistics Class
Goldstein, Gary S.
2007-01-01
College instructors often provide students with only summative evaluations of their work, typically in the form of exam scores or paper grades. Formative evaluation, such as classroom assessment techniques (CATs), are rarer in higher education and provide an ongoing evaluation of students' progress. In this article, the author summarizes the use…
A Teacher's Guide to Assessment Concepts and Statistics
Newman, Carole; Newman, Isadore
2013-01-01
The concept of teacher accountability assumes teachers will use data-driven decision making to plan and deliver appropriate and effective instruction to their students. In order to do so, teachers must be able to accurately interpret the data that is given to them, and that requires the knowledge of some basic concepts of assessment and…
Environmental assessment of submarine power cables
Isus, Daniel; Martinez, Juan D. [Grupo General Cable Sistemas, S.A., 08560-Manlleu, Barcelona (Spain); Arteche, Amaya; Del Rio, Carmen; Madina, Virginia [Tecnalia Research and Innovation, 20009 San Sebastian (Spain)
2011-03-15
Extensive analyses conducted by the European Community revealed that offshore wind energy has relatively benign effects on the marine environment by comparison to other forms of electric power generation [1]. However, the materials employed in offshore wind power farms undergo major changes when confined to the marine environment under extreme conditions (saline medium, hydrostatic pressure...), which can produce an important corrosion effect. This phenomenon can affect, on the one hand, the material from the structural viewpoint and, on the other hand, the marine environment. In this sense, to better understand the environmental impacts of generating electricity from offshore wind energy, this study evaluated the life cycle assessment for some new designs of submarine power cables developed by General Cable. To achieve this goal, three approaches have been carried out: leaching tests, eco-toxicity tests and Life Cycle Assessment (LCA) methodologies. All of them are aimed at obtaining quantitative data for the environmental assessment of selected submarine cables. LCA is a method used to assess environmental aspects and potential impacts of a product or activity. LCA does not include financial and social factors, which means that the results of an LCA cannot exclusively form the basis for assessment of a product's sustainability. Leaching test results allowed us to conclude that the pH of seawater was not significantly changed by the presence of submarine three-core cables. Although it was slightly higher in the case of the broken cable, pH values were nearly equal. Concerning the heavy metals which could migrate to the aquatic medium, there were significant differences between the two scenarios. The leaching of zinc is the major environmental concern during undersea operation of undamaged cables, whereas the fully sectioned three-core cable produced the migration of significant quantities of copper and iron apart from the zinc migrated from the galvanized steel. Thus, the tar
A comprehensive statistical assessment of star-planet interaction
Miller, Brendan P; Wright, Jason T; Pearson, Elliott G
2014-01-01
We investigate whether magnetic interaction between close-in giant planets and their host stars produces observable statistical enhancements in stellar coronal or chromospheric activity. New Chandra observations of 12 nearby (d < 60 pc) planet-hosting solar analogs are combined with archival Chandra, XMM-Newton, and ROSAT coverage of 11 similar stars to construct a sample inoculated against inherent stellar class and planet-detection biases. Survival analysis and Bayesian regression methods (incorporating both measurement errors and X-ray upper limits) are used to test whether "hot Jupiter" hosts are systematically more X-ray luminous than comparable stars with more distant or smaller planets. No significant correlations are present between common proxies for interaction strength (M_P/a^2 or 1/a) versus coronal activity (L_X or L_X/L_bol). In contrast, a sample of 198 FGK main-sequence stars does show a significant (~99% confidence) increase in X-ray luminosity with M_P/a^2. While selection biases are present within the main-sequence sample, the effect is primarily driven by a handful of extreme hot-Jupiter systems with M_P/a^2 > 450 M_Jup/AU^2, which here are all X-ray luminous but to a degree commensurate with their Ca II H and K activity, in contrast to presented magnetic star-planet interaction scenarios that predict enhancements relatively larger in L_X. We discuss these results in the context of cumulative tidal spin-up of stars hosting close-in gas giants (potentially followed by planetary infall and destruction). We also test our main-sequence sample for correlations between planetary properties and UV luminosity or Ca II H and K emission, and find no significant dependence.
Violation of statistical isotropy and homogeneity in the 21-cm power spectrum
Shiraishi, Maresuke; Muñoz, Julian B.; Kamionkowski, Marc; Raccanelli, Alvise
2016-05-01
Most inflationary models predict primordial perturbations to be statistically isotropic and homogeneous. Cosmic microwave background (CMB) observations, however, indicate a possible departure from statistical isotropy in the form of a dipolar power modulation at large angular scales. Alternative models of inflation, beyond the simplest single-field slow-roll models, can generate a small power asymmetry, consistent with these observations. Observations of clustering of quasars show, however, agreement with statistical isotropy at much smaller angular scales. Here, we propose to use off-diagonal components of the angular power spectrum of the 21-cm fluctuations during the dark ages to test this power asymmetry. We forecast results for the planned SKA radio array, a future radio array, and the cosmic-variance-limited case as a theoretical proof of principle. Our results show that the 21-cm line power spectrum will enable access to information at very small scales and at different redshift slices, thus improving upon the current CMB constraints by ~2 orders of magnitude for a dipolar asymmetry and by ~1-3 orders of magnitude for a quadrupolar asymmetry case.
Statistical analysis of human maintenance failures of a nuclear power plant
In this paper, a statistical study of faults caused by maintenance activities is presented. The objective of the study was to draw conclusions on the unplanned effects of maintenance on nuclear power plant safety and system availability. More than 4400 maintenance history reports from the years 1992-1994 of the Olkiluoto BWR nuclear power plant (NPP) were analysed together with the maintenance personnel. The human action induced faults were classified, e.g., according to their multiplicity and effects. This paper presents and discusses the results of a statistical analysis of the data. Instrumentation and electrical components are especially prone to human failures. Many human failures were found in safety related systems. Similarly, several failures remained latent from outages to power operation. The safety significance was generally small. Modifications are an important source of multiple human failures. Plant maintenance data is a good source of human reliability data and should be used more in the future. (orig.)
Statistical aspects of bioequivalence assessment in the pharmaceutical industry.
Patterson, S. D.
2003-01-01
Since the early 1990s, average bioequivalence studies have served as the international standard for demonstrating that two formulations of drug product will provide the same therapeutic benefit and safety profile when used in the marketplace. Population (PBE) and Individual (IBE) bioequivalence have been the subject of intense international debate since methods for their assessment were proposed in the late 1980s. Guidance has been proposed by the Food and Drug Administration...
Schneider, Jesper Wiborg
… By use of statistical power analyses and demonstration of effect sizes, we emphasize that the importance of empirical findings lies in "differences that make a difference" and not statistical significance tests per se. Finally, we discuss the crucial assumption of randomness and question the presumption …
Brewster, Zachary W
2012-01-01
Despite popular claims that racism and discrimination are no longer salient issues in contemporary society, racial minorities continue to experience disparate treatment in everyday public interactions. The context of full-service restaurants is one such public setting wherein racial minority patrons, African Americans in particular, encounter racial prejudices and discriminatory treatment. To further understand the causes of such discriminatory treatment within the restaurant context, this article analyzes primary survey data derived from a community sample of servers (N = 200) to assess the explanatory power of one posited explanation—statistical discrimination. Taken as a whole, findings suggest that while a statistical discrimination framework toward understanding variability in servers' discriminatory behaviors should not be disregarded, the framework's explanatory utility is limited. Servers' inferences about the potential profitability of waiting on customers across racial groups explain little of the overall variation in subjects' self-reported discriminatory behaviors, thus suggesting that other factors not explored in this research are clearly operating and should be the focus of future inquiries. PMID:22379609
Climate change assessment for Mediterranean agricultural areas by statistical downscaling
Palatella, L.; Miglietta, M. M.; Paradisi, P.; Lionello, P.
2010-07-01
In this paper we produce projections of seasonal precipitation for four Mediterranean areas: Apulia region (Italy), Ebro river basin (Spain), Po valley (Italy) and Antalya province (Turkey). We performed the statistical downscaling using Canonical Correlation Analysis (CCA) in two versions: in one case the Principal Component Analysis (PCA) filter is applied only to the predictor and in the other to both predictor and predictand. After performing a validation test, CCA after PCA filtering of both predictor and predictand has been chosen. Sea level pressure (SLP) is used as predictor. Downscaling has been carried out for the scenarios A2 and B2 on the basis of three GCMs: the CCCma-GCM2, the Csiro-MK2 and HadCM3. Three consecutive 30-year periods have been considered. For Summer precipitation in the Apulia region we also use the 500 hPa temperature (T500) as predictor, obtaining comparable results. Results show different climate change signals in the four areas and confirm the need of an analysis that is capable of resolving internal differences within the Mediterranean region. The most robust signal is the reduction of Summer precipitation in the Ebro river basin. Other significant results are the increase of precipitation over Apulia in Summer, the reduction over the Po valley in Spring and Autumn and the increase over the Antalya province in Summer and Autumn.
Climate change assessment for Mediterranean agricultural areas by statistical downscaling
L. Palatella
2010-07-01
In this paper we produce projections of seasonal precipitation for four Mediterranean areas: Apulia region (Italy), Ebro river basin (Spain), Po valley (Italy) and Antalya province (Turkey). We performed the statistical downscaling using Canonical Correlation Analysis (CCA) in two versions: in one case the Principal Component Analysis (PCA) filter is applied only to the predictor and in the other to both predictor and predictand. After performing a validation test, CCA after PCA filtering of both predictor and predictand has been chosen. Sea level pressure (SLP) is used as predictor. Downscaling has been carried out for the scenarios A2 and B2 on the basis of three GCMs: the CCCma-GCM2, the Csiro-MK2 and HadCM3. Three consecutive 30-year periods have been considered. For Summer precipitation in the Apulia region we also use the 500 hPa temperature (T500) as predictor, obtaining comparable results. Results show different climate change signals in the four areas and confirm the need of an analysis that is capable of resolving internal differences within the Mediterranean region. The most robust signal is the reduction of Summer precipitation in the Ebro river basin. Other significant results are the increase of precipitation over Apulia in Summer, the reduction over the Po valley in Spring and Autumn and the increase over the Antalya province in Summer and Autumn.
A COMPREHENSIVE STATISTICAL ASSESSMENT OF STAR-PLANET INTERACTION
Miller, Brendan P.; Gallo, Elena; Pearson, Elliott G. [Department of Astronomy, University of Michigan, Ann Arbor, MI 48109 (United States); Wright, Jason T. [Department of Astronomy and Astrophysics, The Pennsylvania State University, University Park, PA 16802 (United States)
2015-02-01
We investigate whether magnetic interaction between close-in giant planets and their host stars produces observable statistical enhancements in stellar coronal or chromospheric activity. New Chandra observations of 12 nearby (d < 60 pc) planet-hosting solar analogs are combined with archival Chandra, XMM-Newton, and ROSAT coverage of 11 similar stars to construct a sample inoculated against inherent stellar class and planet-detection biases. Survival analysis and Bayesian regression methods (incorporating both measurement errors and X-ray upper limits; 13/23 stars have secure detections) are used to test whether "hot Jupiter" hosts are systematically more X-ray luminous than comparable stars with more distant or smaller planets. No significant correlations are present between common proxies for interaction strength (M_P/a^2 or 1/a) versus coronal activity (L_X or L_X/L_bol). In contrast, a sample of 198 FGK main-sequence stars does show a significant (~99% confidence) increase in X-ray luminosity with M_P/a^2. While selection biases are incontrovertibly present within the main-sequence sample, we demonstrate that the effect is primarily driven by a handful of extreme hot-Jupiter systems with M_P/a^2 > 450 M_Jup AU^-2, which here are all X-ray luminous but to a degree commensurate with their Ca II H and K activity, in contrast to presented magnetic star-planet interaction scenarios that predict enhancements relatively larger in L_X. We discuss these results in the context of cumulative tidal spin-up of stars hosting close-in gas giants (potentially followed by planetary infall and destruction). We also test our main-sequence sample for correlations between planetary properties and UV luminosity or Ca II H and K emission, and find no significant dependence.
Waste Heat to Power Market Assessment
Elson, Amelia [ICF International, Fairfax, VA (United States); Tidball, Rick [ICF International, Fairfax, VA (United States); Hampson, Anne [ICF International, Fairfax, VA (United States)
2015-03-01
Waste heat to power (WHP) is the process of capturing heat discarded by an existing process and using that heat to generate electricity. In the industrial sector, waste heat streams are generated by kilns, furnaces, ovens, turbines, engines, and other equipment. In addition to processes at industrial plants, waste heat streams suitable for WHP are generated at field locations, including landfills, compressor stations, and mining sites. Waste heat streams are also produced in the residential and commercial sectors, but compared to industrial sites these waste heat streams typically have lower temperatures and much lower volumetric flow rates. The economic feasibility for WHP declines as the temperature and flow rate decline, and most WHP technologies are therefore applied in industrial markets where waste heat stream characteristics are more favorable. This report provides an assessment of the potential market for WHP in the industrial sector in the United States.
How Many Words Do You Know? An Integrated Assessment Task for Introductory Statistics Students
Warton, David I.
2007-01-01
A novel assignment exercise is described, in which students use a dictionary to estimate the size of their vocabulary. This task was developed for an introductory statistics service course, although it can be modified for use in survey sampling courses. The exercise can be used to simultaneously assess a range of core statistics skills: sample…
Assessing the performance of statistical validation tools for megavariate metabolomics data
Rubingh, C.M.; Bijlsma, S.; Derks, E.P.P.A.; Bobeldijk, I.; Verheij, E.R.; Kochhar, S.; Smilde, A.K.
2006-01-01
Statistical model validation tools such as cross-validation, jack-knifing model parameters and permutation tests are meant to obtain an objective assessment of the performance and stability of a statistical model. However, little is known about the performance of these tools for megavariate data set
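One of the validation tools named above, the permutation test, can be sketched in a few lines; the difference-of-means statistic and the toy data are illustrative, not the paper's metabolomics setting:

```python
import random

random.seed(0)

def permutation_test(a, b, n_perm=999):
    """Two-sample permutation test on the absolute difference of means:
    shuffle group labels many times and count statistics at least as
    extreme as the observed one (illustrative sketch)."""
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = a + b
    hits = 0
    for _ in range(n_perm):
        random.shuffle(pooled)
        pa, pb = pooled[:len(a)], pooled[len(a):]
        if abs(sum(pa) / len(pa) - sum(pb) / len(pb)) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)   # add-one correction for validity

# Clearly separated groups should yield a small p-value.
p = permutation_test([10.0] * 5, [0.0] * 5)
print(p)
```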
Use of Statistical Information for Damage Assessment of Civil Engineering Structures
Kirkegaard, Poul Henning; Andersen, P.
This paper considers the problem of damage assessment of civil engineering structures using statistical information. The aim of the paper is to review how researchers recently have tried to solve the problem. It is pointed out that the problem consists of not only how to use the statistical...
Robust Statistical Tests of Dragon-Kings beyond Power Law Distributions
Pisarenko, V. F.; Sornette, D.
2011-01-01
We ask the question whether it is possible to diagnose the existence of "Dragon-Kings" (DK), namely anomalous observations compared to a power law background distribution of event sizes. We present two new statistical tests, the U-test and the DK-test, aimed at identifying the existence of even a single anomalous event in the tail of the distribution of just a few tens of observations. The DK-test in particular is derived such that the p-value of its statistic is independent of the exponent c...
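A rough sketch of the underlying idea, asking whether the largest observation is plausible under a power-law tail fitted to the remaining data, can be written as follows. This is a simplified stand-in for illustration, not the authors' U-test or DK-test:

```python
from math import log

def dk_pvalue(data):
    """Is the largest observation anomalous relative to a Pareto
    (power-law) tail fitted by maximum likelihood to the rest?
    Returns P(max of n Pareto draws >= observed maximum)."""
    xs = sorted(data)
    body, x_max = xs[:-1], xs[-1]
    xm = body[0]                                        # tail cutoff
    alpha = len(body) / sum(log(x / xm) for x in body)  # Pareto MLE
    n = len(xs)
    return 1 - (1 - (xm / x_max) ** alpha) ** n

body = [1.0, 1.1, 1.3, 1.6, 2.0, 2.7, 4.0, 8.0]
print(dk_pvalue(body + [1000.0]))  # small p: dragon-king candidate
print(dk_pvalue(body + [12.0]))    # large p: consistent with the tail
```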
Statistical power of model selection strategies for genome-wide association studies.
Zheyang Wu
2009-07-01
Genome-wide association studies (GWAS) aim to identify genetic variants related to diseases by examining the associations between phenotypes and hundreds of thousands of genotyped markers. Because many genes are potentially involved in common diseases and a large number of markers are analyzed, it is crucial to devise an effective strategy to identify truly associated variants that have individual and/or interactive effects, while controlling false positives at the desired level. Although a number of model selection methods have been proposed in the literature, including marginal search, exhaustive search, and forward search, their relative performance has only been evaluated through limited simulations due to the lack of an analytical approach to calculating the power of these methods. This article develops a novel statistical approach for power calculation, derives accurate formulas for the power of different model selection strategies, and then uses the formulas to evaluate and compare these strategies in genetic model spaces. In contrast to previous studies, our theoretical framework allows for random genotypes, correlations among test statistics, and a false-positive control based on GWAS practice. After the accuracy of our analytical results is validated through simulations, they are utilized to systematically evaluate and compare the performance of these strategies in a wide class of genetic models. For a specific genetic model, our results clearly reveal how different factors, such as effect size, allele frequency, and interaction, jointly affect the statistical power of each strategy. An example is provided for the application of our approach to empirical research. The statistical approach used in our derivations is general and can be employed to address the model selection problems in other random predictor settings. We have developed an R package markerSearchPower to implement our formulas, which can be downloaded from the
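The flavor of such an analytical power calculation for a single marker can be sketched with a normal approximation; the non-centrality expression (additive coding, unit-variance phenotype) and all parameter values are illustrative assumptions, not the paper's derivations:

```python
from math import sqrt
from statistics import NormalDist

def marginal_power(effect, maf, n, alpha=5e-8):
    """Approximate power of a two-sided z-test for one additive marker.
    Non-centrality ~ effect * sqrt(n * 2*maf*(1-maf)), assuming a
    unit-variance phenotype and genome-wide alpha (illustrative)."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)
    ncp = effect * sqrt(n * 2 * maf * (1 - maf))
    # P(|Z + ncp| > z_alpha) under the alternative
    return (1 - z.cdf(z_alpha - ncp)) + z.cdf(-z_alpha - ncp)

# Power grows with effect size at fixed allele frequency and sample size.
print(marginal_power(0.1, 0.3, 5000))
print(marginal_power(0.2, 0.3, 5000))
```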
Hacke, Peter; Spataru, Sergiu
We propose a method for increasing the frequency of data collection and reducing the time and cost of accelerated lifetime testing of photovoltaic modules undergoing potential-induced degradation (PID). This consists of in-situ measurements of dark current-voltage curves of the modules at elevated … stress temperature, their use to determine the maximum power at 25°C standard test conditions (STC), and distribution statistics for determining degradation rates as a function of stress level. The semi-continuous data obtained by this method clearly show degradation curves of the maximum power …
Development of nuclear power plant online monitoring system using statistical quality control
Statistical Quality Control techniques have been applied to many aspects of industrial engineering. An application to nuclear power plant maintenance and control is also presented that can greatly improve plant safety. As a demonstration of such an approach, a specific system is analyzed: the reactor coolant pumps (RCP) and the fouling resistance of the heat exchanger. This research uses Shewhart X-bar and R charts, Cumulative Sum (CUSUM) charts, and the Sequential Probability Ratio Test (SPRT) to analyze the process for the state of statistical control. We also developed a Control Chart Analyzer (CCA) to support these analyses by flagging when the process departs from statistical control. The analysis shows that statistical process control methods can be applied as an early warning system capable of identifying significant equipment problems well in advance of traditional control room alarm indicators. Such a system would provide operators with enough time to respond to possible emergency situations and thus improve plant safety and reliability.
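A one-sided tabular CUSUM of the kind mentioned above can be sketched in a few lines; the reference value k = 0.5 and decision interval h = 4 (in standard-deviation units) are conventional textbook choices, not values from this work:

```python
def tabular_cusum(xs, mu0=0.0, k=0.5, h=4.0):
    """One-sided upper tabular CUSUM (standardized units): accumulate
    deviations above mu0 + k and signal when the sum exceeds h."""
    s, alarms = 0.0, []
    for i, x in enumerate(xs):
        s = max(0.0, s + (x - mu0) - k)
        if s > h:
            alarms.append(i)   # indices where the chart signals
    return alarms

# In-control readings, then a 1.5-sigma mean shift starting at index 20:
data = [0.0] * 20 + [1.5] * 10
print(tabular_cusum(data))  # first alarm a few samples after the shift
```

The CUSUM accumulates small persistent shifts that a Shewhart chart (which looks at points one at a time) would miss.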
A Statistical Approach to Planning Reserved Electric Power for Railway Infrastructure Administration
M. Brabec; Pelikán, E. (Emil); Konár, O. (Ondřej); Kasanický, I.; Juruš, P. (Pavel); Sadil, J.; Blažek, P.
2013-01-01
One of the requirements on railway infrastructure administration is to provide electricity for day-to-day operation of railways. We propose a statistically based approach for the estimation of maximum 15-minute power within a calendar month for a given region. This quantity serves as a basis of contracts between railway infrastructure administration and electricity distribution system operator. We show that optimization of the prediction is possible, based on underlying loss function deriv...
Lee, Chaeyoung
2012-01-01
Epistasis that may explain a large portion of the phenotypic variation for complex economic traits of animals has been ignored in many genetic association studies. A Bayesian method was introduced to draw inferences about multilocus genotypic effects based on their marginal posterior distributions by a Gibbs sampler. A simulation study was conducted to provide statistical powers under various unbalanced designs by using this method. Data were simulated by combined designs of number of loci, wi...
Assessment of ceramic composites for MMW space nuclear power systems
Proposed multimegawatt nuclear power systems which operate at high temperatures, high levels of stress, and in hostile environments, including corrosive working fluids, have created interest in the use of ceramic composites as structural materials. This report assesses the applicability of several ceramic composites in both Brayton and Rankine cycle power systems. This assessment considers an equilibrium thermodynamic analysis and also a nonequilibrium assessment. (FI)
The physical mechanisms that contribute to atmospheric breakdown induced by high power microwaves (HPMs) are of particular interest for the further development of high power microwave systems and related technologies. For a system in which HPM is produced in a vacuum environment for the purpose of radiating into atmosphere, it is necessary to separate the atmospheric environment from the vacuum environment with a dielectric interface. Breakdown across this interface on the atmospheric side and plasma development to densities prohibiting further microwave propagation are of special interest. In this paper, the delay time between microwave application and plasma emergence is investigated. Various external parameters, such as UV illumination or the presence of small metallic points on the surface, provide sources for electron field emission and influence the delay time which yields crucial information on the breakdown mechanisms involved. Due to the inherent statistical appearance of initial electrons and the statistics of the charge carrier amplification mechanisms, the flashover delay times deviate by as much as ±50% from the average, for the investigated case of discharges in N2 at pressures of 60-140 Torr and a microwave frequency of 2.85 GHz with 3 μs pulse duration, 50 ns pulse risetime, and MW/cm2 power densities. The statistical model described in this paper demonstrates how delay times for HPM surface flashover events can be effectively predicted for various conditions given sufficient knowledge about ionization rate coefficients as well as the production rate for breakdown initiating electrons.
Statistical Design Model (SDM) of power supply and communication subsystem's Satellite
Mirshams, Mehran; Zabihian, Ehsan; Zabihian, Ahmadreza
In designing the power supply and communication subsystems for satellites, most approaches and relations are empirical and statistical. Moreover, because aerospace science and its links with other engineering fields such as electrical engineering are relatively young, there are no analytic or fully proven empirical relations in many areas. We therefore consider the statistical design of these subsystems. The approach presented in this paper is entirely innovative, and all parts of the satellite's power supply and communication subsystems are specified. In developing this approach, data from 602 satellites and software such as SPSS were used. After proposing the design procedure, the total power needed by the satellite, the mass of the power supply and communication subsystems, the communication subsystem's power requirement, the working band, the type of antenna, the number of transponders, the material of the solar arrays, and the placement of these arrays on the satellite are designed. All of these are determined by the mission of the satellite and its weight class. This procedure increases the performance rate, avoids wasting energy, and reduces costs. Keywords: database, statistical model, design procedure, power supply subsystem, communication subsystem
Wind Power Assessment Based on a WRF Wind Simulation with Developed Power Curve Modeling Methods
Zhenhai Guo; Xia Xiao
2014-01-01
The accurate assessment of wind power potential requires not only the detailed knowledge of the local wind resource but also an equivalent power curve with good effect for a local wind farm. Although the probability distribution functions (pdfs) of the wind speed are commonly used, their seemingly good performance for distribution may not always translate into an accurate assessment of power generation. This paper contributes to the development of wind power assessment based on the wind speed...
Calibrating the Difficulty of an Assessment Tool: The Blooming of a Statistics Examination
Dunham, Bruce; Yapa, Gaitri; Yu, Eugenia
2015-01-01
Bloom's taxonomy is proposed as a tool by which to assess the level of complexity of assessment tasks in statistics. Guidelines are provided for how to locate tasks at each level of the taxonomy, along with descriptions and examples of suggested test questions. Through the "Blooming" of an examination--that is, locating its constituent…
Read, S.; Bath, P.A.; Willett, P.; Maheswaran, R.
2009-01-01
This paper concerns the Bernoulli version of Kulldorff’s spatial scan statistic, and how accurately it identifies the exact centre of approximately circular regions of increased spatial density in point data. We present an alternative method of selecting circular regions that appears to give greater accuracy. Performance is tested in an epidemiological context using many synthetic case-control datasets. A small, but statistically significant, improvement is reported. The power of the alte...
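The Bernoulli scan statistic underlying this comparison scores each candidate circle by a log-likelihood ratio. A minimal sketch of that score follows (the circle search and the Monte Carlo significance testing are omitted):

```python
import math

def xlogy(x, y):
    """x * log(y) with the convention 0 * log(0) = 0."""
    return 0.0 if x == 0 else x * math.log(y)

def bernoulli_llr(c, n, C, N):
    """Log-likelihood ratio of Kulldorff's Bernoulli scan statistic for
    a candidate region containing c cases among n points, out of C cases
    among N points overall. Returns 0 unless the rate inside the region
    exceeds the rate outside (the usual one-sided convention)."""
    if n == 0 or n == N or c / n <= (C - c) / (N - n):
        return 0.0
    return (xlogy(c, c / n) + xlogy(n - c, (n - c) / n)
            + xlogy(C - c, (C - c) / (N - n))
            + xlogy(N - n - (C - c), (N - n - (C - c)) / (N - n))
            - xlogy(C, C / N) - xlogy(N - C, (N - C) / N))
```

The region maximizing this score over all candidate circles is the most likely cluster; a region whose case rate matches the outside rate scores zero.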
LUO Zuying
2007-01-01
With technology scaling into the nanometer regime, rampant process variations have a visible influence on leakage power estimation of very large scale integrations (VLSIs). To deal with large inter- and intra-die variations, we introduce a novel theoretical prototype of statistical leakage power analysis (SLPA) for function blocks. Because inter-die variations can be pinned down to a small range while the number of gates in a function block is large (>1000), we further simplify the prototype and derive an efficient SLPA methodology. The method saves considerable running time in low-power design owing to its local-updating property. A large body of experimental data shows that the method takes feasible running time (0.32 s) to obtain accurate results (3σ error <0.5% at maximum) as function block circuits simultaneously suffer from 7.5% (3σ/mean) inter-die and 7.5% intra-die length variations, which demonstrates that our method is suitable for statistical leakage power analysis of VLSIs under rampant process variations.
Dose assessments in nuclear power plant siting
This document is mainly intended to provide information on dose estimations and assessments for the purpose of nuclear power plant (NPP) siting. It is not aimed at giving radiation protection guidance, criteria or procedures to be applied during the process of NPP siting, nor at providing recommendations on this subject matter. The document may however be of help for implementing some of the Nuclear Safety Standards (NUSS) documents on siting. The document was prepared before April 26, 1986, when a severe accident occurred at Unit 4 of the Chernobyl NPP in the USSR. It should be emphasized that this document does not bridge the gap which exists in the NUSS programme as far as radiation protection guidance for the specific case of siting of NPPs is concerned. The Agency will continue to work on this subject with the aim of preparing a safety series document on radiation protection requirements for NPP siting. This document could serve as a working document for this purpose. Refs, figs and tabs
Zhai, Weiwei; Nielsen, Rasmus; Slatkin, Montgomery
2009-01-01
In this report, we investigate the statistical power of several tests of selective neutrality based on patterns of genetic diversity within and between species. The goal is to compare tests based solely on population genetic data with tests using comparative data or a combination of comparative and population genetic data. We show that in the presence of repeated selective sweeps on relatively neutral background, tests based on the d(N)/d(S) ratios in comparative data almost always have more power to detect selection than tests based on population genetic data, even if the overall level of divergence is low. Tests based solely on the distribution of allele frequencies or the site frequency spectrum, such as the Ewens-Watterson test or Tajima's D, have less power in detecting both positive and negative selection because of the transient nature of positive selection and the weak signal left by negative...
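Tajima's D, one of the site-frequency-spectrum tests mentioned above, is computable from three summaries of the sample. A sketch using the standard constants from Tajima (1989):

```python
import math

def tajimas_d(n, S, pi):
    """Tajima's D from sample size n, number of segregating sites S,
    and mean pairwise nucleotide diversity pi. Uses the standard
    normalizing constants a1, a2, b1, b2, c1, c2, e1, e2."""
    a1 = sum(1.0 / i for i in range(1, n))
    a2 = sum(1.0 / i ** 2 for i in range(1, n))
    b1 = (n + 1) / (3.0 * (n - 1))
    b2 = 2.0 * (n ** 2 + n + 3) / (9.0 * n * (n - 1))
    c1 = b1 - 1.0 / a1
    c2 = b2 - (n + 2) / (a1 * n) + a2 / a1 ** 2
    e1 = c1 / a1
    e2 = c2 / (a1 ** 2 + a2)
    # D contrasts pi with the Watterson estimate S / a1.
    return (pi - S / a1) / math.sqrt(e1 * S + e2 * S * (S - 1))
```

D is zero when diversity matches the neutral expectation S/a1 and negative after a recent sweep (an excess of rare alleles), which is the weak, transient signal the abstract refers to.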
Efficient statistical analysis method of power/ground (P/G) network
Zuying Luo; Sheldon X.D. Tan
2008-01-01
In this paper, we propose an incremental statistical analysis method with complexity reduction as a pre-process for on-chip power/ground (P/G) networks. The new method exploits locality of P/G network analyses and aims at P/G networks with a large number of strongly connected subcircuits (called strong connects) such as trees and chains. The method consists of three steps. First, it compresses P/G circuits by removing strong connects. As a result, current variations (CVs) of nodes in strong connects are transferred to some remaining nodes. Then, based on the locality of power grid voltage responses to its current inputs, it efficiently calculates the correlative resistor (CR) matrix in a local way to directly compute the voltage variations by using small parts of the remaining circuit. Last, it statistically recovers voltage variations of the suppressed nodes inside strong connects. This new method for statistically compressing and expanding strong connects in terms of current or voltage variations in a closed form is very efficient owing to its property of incremental analysis. Experimental results demonstrate that the method can efficiently compute lower bounds of voltage variations for P/G networks and that it has two to three orders of magnitude speedup over the traditional Monte-Carlo-based simulation method, with only 2.0% accuracy loss.
Assessment - A Powerful Lever for Learning
Lorna Earl
2010-05-01
Classroom assessment practices have been part of schooling for hundreds of years. There are, however, new findings about the nature of learning and about the roles that assessment can play in enhancing learning for all students. This essay provides a brief history of the changing role of assessment in schooling, describes three different purposes for assessment and foreshadows some implications that shifting to a more differentiated view of assessment can have for policy, practice and research.
Chris J. Skinner
2007-01-01
The paper establishes a correspondence between statistical disclosure control and forensic statistics regarding their common use of the concept of ‘probability of identification’. The paper then seeks to investigate what lessons for disclosure control can be learnt from the forensic identification literature. The main lesson that is considered is that disclosure risk assessment cannot, in general, ignore the search method that is employed by an intruder seeking to achieve disclosure. The effe...
New statistical potential for quality assessment of protein models and a survey of energy functions
Rykunov Dmitry; Fiser Andras
2010-01-01
Abstract Background Scoring functions, such as molecular mechanic forcefields and statistical potentials are fundamentally important tools in protein structure modeling and quality assessment. Results The performances of a number of publicly available scoring functions are compared with a statistical rigor, with an emphasis on knowledge-based potentials. We explored the effect on accuracy of alternative choices for representing interaction center types and other features of scoring functions,...
Theoretical Foundations and Mathematical Formalism of the Power-Law Tailed Statistical Distributions
Giorgio Kaniadakis
2013-09-01
We present the main features of the mathematical theory generated by the κ-deformed exponential function exp_κ(x) = (√(1 + κ²x²) + κx)^(1/κ), with 0 ≤ κ < 1, developed in the last twelve years, which turns out to be a continuous one-parameter deformation of the ordinary mathematics generated by the Euler exponential function. The κ-mathematics has its roots in special relativity and furnishes the theoretical foundations of the κ-statistical mechanics predicting power-law tailed statistical distributions, which have been observed experimentally in many physical, natural and artificial systems. After introducing the κ-algebra, we present the associated κ-differential and κ-integral calculus. Then, we obtain the corresponding κ-exponential and κ-logarithm functions and give the κ-version of the main functions of the ordinary mathematics.
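A quick numerical sketch of the two deformed functions (function names are ours), showing that the κ-logarithm inverts the κ-exponential and that both reduce to the ordinary exponential as κ → 0:

```python
import math

def exp_kappa(x, k):
    """kappa-deformed exponential: (sqrt(1 + k^2 x^2) + k x)^(1/k),
    for 0 < k < 1."""
    return (math.sqrt(1.0 + k * k * x * x) + k * x) ** (1.0 / k)

def ln_kappa(x, k):
    """kappa-deformed logarithm, the inverse of exp_kappa:
    (x^k - x^(-k)) / (2 k), for x > 0."""
    return (x ** k - x ** (-k)) / (2.0 * k)
```

The inverse relation follows because exp_κ(x)^κ = √(1 + κ²x²) + κx and exp_κ(x)^(−κ) = √(1 + κ²x²) − κx, whose difference is exactly 2κx; for large x, exp_κ behaves as a power law x^(1/κ) rather than an exponential, which is the origin of the power-law tails.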
The objective of the Coastal Habitat Injury Assessment study was to document and quantify injury to biota of the shallow subtidal, intertidal, and supratidal zones throughout the shoreline affected by oil or cleanup activity associated with the Exxon Valdez oil spill. The results of these studies were to be used to support the Trustee's Type B Natural Resource Damage Assessment under the Comprehensive Environmental Response, Compensation, and Liability Act of 1980 (CERCLA). A probability based stratified random sample of shoreline segments was selected with probability proportional to size from each of 15 strata (5 habitat types crossed with 3 levels of potential oil impact) based on those data available in July, 1989. Three study regions were used: Prince William Sound, Cook Inlet/Kenai Peninsula, and Kodiak/Alaska Peninsula. A Geographic Information System was utilized to combine oiling and habitat data and to select the probability sample of study sites. Quasi-experiments were conducted where randomly selected oiled sites were compared to matched reference sites. Two levels of statistical inferences, philosophical bases, and limitations are discussed and illustrated with example data from the resulting studies. 25 refs., 4 figs., 1 tab
Generation of statistical scenarios of short-term wind power production
Pinson, Pierre; Papaefthymiou, George; Klockl, Bernd; Nielsen, Henrik Aalborg
2007-01-01
Short-term (up to 2-3 days ahead) probabilistic forecasts of wind power provide forecast users with paramount information on the uncertainty of expected wind generation. Whatever the type of these probabilistic forecasts, they are produced on a per-horizon basis, and hence do not inform on the development of the forecast uncertainty through forecast series. This issue is addressed here by describing a method that permits generating statistical scenarios of wind generation that account for the interdependence structure of prediction errors, in addition to respecting the predictive distributions of wind...
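The interdependence structure described above can be sketched with a Gaussian copula: draw inter-horizon-correlated standard normals and map them through each horizon's predictive marginals. Everything below (the exponential correlation with scale `tau`, the Gaussian marginals) is an illustrative assumption, not the paper's model.

```python
import math
import random

def cholesky(a):
    """Lower-triangular Cholesky factor L of a symmetric
    positive-definite matrix a, so that L L^T = a."""
    n = len(a)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(a[i][i] - s)
            else:
                L[i][j] = (a[i][j] - s) / L[j][j]
    return L

def wind_scenarios(mu, sigma, tau, n_scen, rng):
    """Generate scenarios respecting per-horizon predictive marginals
    (mu[h], sigma[h]) and an assumed inter-horizon correlation
    exp(-|h1 - h2| / tau) of the prediction errors."""
    H = len(mu)
    cov = [[math.exp(-abs(i - j) / tau) for j in range(H)] for i in range(H)]
    L = cholesky(cov)
    scenarios = []
    for _ in range(n_scen):
        z = [rng.gauss(0.0, 1.0) for _ in range(H)]
        # Correlate the independent draws through the Cholesky factor.
        corr = [sum(L[i][k] * z[k] for k in range(i + 1)) for i in range(H)]
        scenarios.append([mu[h] + sigma[h] * corr[h] for h in range(H)])
    return scenarios
```

Each scenario is then a plausible trajectory through the forecast series, rather than an independent draw per horizon.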
Application of probabilistic safety assessment for Macedonian electric power system
Due to the complex and integrated nature of a power system, failures in any part of the system can cause interruptions, which range from inconveniencing a small number of local residents to a major and widespread catastrophic disruption of supply known as a blackout. The objective of the paper is to show that the methods and tools of probabilistic safety assessment are applicable for assessment and improvement of real power systems. The method used in this paper is based on fault tree analysis and is adapted for power system reliability analysis. A particular power system, the Macedonian power system, is the object of the analysis. The results show that the method is suitable for application to real systems. The reliability of the Macedonian power system, assumed to be a static system, is assessed. The components which can significantly impact the power system are identified and analysed in more detail. (author)
Hybrid algorithm for rotor angle security assessment in power systems
D. Prasad Wadduwage; Udaya D. Annakkage; Christine Qiong Wu
2015-01-01
Transient rotor angle stability assessment and oscillatory rotor angle stability assessment subsequent to a contingency are integral components of dynamic security assessment (DSA) in power systems. This study proposes a hybrid algorithm to determine whether the post-fault power system is secure due to both transient rotor angle stability and oscillatory rotor angle stability subsequent to a set of known contingencies. The hybrid algorithm first uses a new security measure developed based on ...
Quan Hude; Li Bing; Fong Andrew; Lu Mingshan
2006-01-01
Abstract Background We assessed the linkage and correct linkage rate using deterministic record linkage among three commonly used Canadian databases, namely, the population registry, hospital discharge data and Vital Statistics registry. Methods Three combinations of four personal identifiers (surname, first name, sex and date of birth) were used to determine the optimal combination. The correct linkage rate was assessed using a unique personal health number available in all three databases. ...
Lombard, Martani J; Steyn, Nelia P; Charlton, Karen E; Senekal, Marjanne
2015-01-01
Background Several statistical tests are currently applied to evaluate the validity of dietary intake assessment methods. However, they provide information on different facets of validity. There is also no consensus on the types and combinations of tests that should be applied to reflect acceptable validity for intakes. We aimed to 1) conduct a review to identify the tests and interpretation criteria used where dietary assessment methods were validated against a reference method and 2) illustrate the ...
Assessing power grid reliability using rare event simulation
Wadman, Wander
2015-01-01
Renewable energy generators such as wind turbines and solar panels supply more and more power in modern electrical grids. Although the transition to a sustainable power supply is desirable, considerable implementation of distributed and intermittent generators may strain the power grid. Since grid operators are responsible for a highly reliable power grid, they want to estimate to what extent violations of grid stability constraints occur. To assess grid reliability over a period of interest,...
Grounding Locations Assessment of Practical Power System
Kousay Abdul Sattar; Ahmed M.A. Haidar; Nadheer A. Shalash
2012-01-01
Grounding Points (GPs) are installed in electrical power systems to drive protective devices and ensure personnel safety. The general grounding problem is to find the optimal locations of these points so that the security and reliability of the power system can be improved. This paper presents a practical approach to find the optimal location of GPs based on the ratios of zero sequence reactance with positive sequence reactance (X0/X1), zero sequence resistance with positive sequence rea...
Assessing Statistical Change Indices in Selected Social Work Intervention Research Studies
Ham, Amanda D.; Huggins-Hoyt, Kimberly Y.; Pettus, Joelle
2016-01-01
Objectives: This study examined how evaluation and intervention research (IR) studies assessed statistical change to ascertain effectiveness. Methods: Studies from six core social work journals (2009-2013) were reviewed (N = 1,380). Fifty-two evaluation (n= 27) and intervention (n = 25) studies met the inclusion criteria. These studies were…
Sebastianelli, Rose; Tamimi, Nabil
2011-01-01
Given the expected rise in the number of online business degrees, issues regarding quality and assessment in online courses will become increasingly important. The authors focus on the suitability of online delivery for quantitative business courses, specifically business statistics and management science. They use multiple approaches to assess…
One of the main uses of biomarker measurements is to compare different populations to each other and to assess risk in comparison to established parameters. This is most often done using summary statistics such as central tendency, variance components, confidence intervals, excee...
Air-chemistry "turbulence": power-law scaling and statistical regularity
H.-m. Hsu
2011-08-01
With the intent to gain further knowledge on the spectral structures and statistical regularities of surface atmospheric chemistry, the chemical gases (NO, NO2, NOx, CO, SO2, and O3) and aerosol (PM10) measured at 74 air quality monitoring stations over the island of Taiwan are analyzed for the year 2004 at hourly resolution. They represent a range of surface air quality with a mixed combination of geographic settings, and include urban/rural, coastal/inland, plain/hill, and industrial/agricultural locations. In addition to the well-known semi-diurnal and diurnal oscillations, weekly and intermediate (20~30 day) peaks are also identified with the continuous wavelet transform (CWT). The spectra indicate power-law scaling regions for the frequencies higher than the diurnal and those lower than the diurnal, with average exponents of −5/3 and −1, respectively. These dual exponents are corroborated by those from detrended fluctuation analysis in the corresponding time-lag regions. These exponents are mostly independent of the averages and standard deviations of time series measured at various geographic settings, i.e., the spatial inhomogeneities. In other words, they possess dominant universal structures. After spectral coefficients from the CWT decomposition are grouped according to the spectral bands and inverted separately, the PDFs of the reconstructed time series for the high-frequency band consistently demonstrate an interesting statistical regularity: −3 power-law scaling for the heavy tails. Such spectral peaks, dual-exponent structures, and power-law scaling in heavy tails are important structural information, but their relations to turbulence and mesoscale variability require further investigation. This could lead to a better understanding of the processes controlling air quality.
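Detrended fluctuation analysis, used above to corroborate the spectral exponents, fits the scaling of detrended fluctuations against window size. A minimal first-order (DFA1) sketch, not the study's implementation:

```python
import math
import random

def linfit(xs, ys):
    """Least-squares slope and intercept of ys regressed on xs."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    return b, my - b * mx

def dfa_exponent(series, scales):
    """DFA1: slope of log F(s) versus log s, detrending each
    non-overlapping window of size s with a straight line."""
    mean = sum(series) / len(series)
    profile, acc = [], 0.0
    for x in series:          # integrate the mean-removed series
        acc += x - mean
        profile.append(acc)
    logs, logF = [], []
    for s in scales:
        n_win = len(profile) // s
        sq = 0.0
        for w in range(n_win):
            seg = profile[w * s:(w + 1) * s]
            b, a = linfit(list(range(s)), seg)
            sq += sum((seg[i] - (a + b * i)) ** 2 for i in range(s))
        logs.append(math.log(s))
        logF.append(math.log(math.sqrt(sq / (n_win * s))))
    alpha, _ = linfit(logs, logF)
    return alpha
```

For uncorrelated noise the exponent is near 0.5; persistent long-range correlations push it higher, mirroring the spectral exponents via alpha = (1 + |spectral slope|) / 2 for power-law spectra.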
Specification of life cycle assessment in nuclear power plants
Life cycle assessment is an environmental management tool for assessing the environmental impacts of a product or a process. Life cycle assessment involves the evaluation of environmental impacts through all stages of the life cycle of a product or process; in other words, it takes a cradle-to-grave approach. Outcomes of life cycle assessment include pollution prevention, energy-efficient systems, material conservation, economic benefits and sustainable development. All power generation technologies affect the environment in one way or another, and the main environmental impact does not always occur during operation of the power plant. The life cycle assessment of nuclear power has entailed studying the entire fuel cycle from mine to deep repository, as well as the construction, operation and demolition of the power station. Nuclear power plays an important role in electricity production for several countries, even though its use remains controversial, and due to the shortage of fossil fuel energy resources many countries have started to seek alternative sources of energy production. A life cycle assessment can identify all environmental impacts of nuclear power, from extracting resources, building facilities and transporting material through the final conversion to useful energy services
Knowledge based system for fouling assessment of power plant boiler
The paper presents the design of an expert system for fouling assessment in power plant boilers. It is an on-line expert system based on selected criteria for fouling assessment. Using criteria based on 'clean' and 'not-clean' radiation heat flux measurements, diagnostic variables are defined for the boiler heat transfer surface. The development of the prototype knowledge-based system for fouling assessment in power plant boilers comprises the integration of elements including the knowledge base, the inference procedure and the prototype configuration. Demonstration of the prototype knowledge-based system for fouling assessment was performed on the Sines power plant, a 300 MW coal-fired power plant. Twelve fields are used, with three on each side of the boiler
Safety Assessment - Swedish Nuclear Power Plants
After the reactor accident at Three Mile Island, the Swedish nuclear power plants were equipped with filtered venting of the containment. Several types of accidents can be identified where the filtered venting has no effect on the radioactive release. The probability for such accidents is hopefully very small. It is not possible however to estimate the probability accurately. Experiences gained in the last years, which have been documented in official reports from the Nuclear Power Inspectorate indicate that the probability for core melt accidents in Swedish reactors can be significantly larger than estimated earlier. A probability up to one in a thousand operating years can not be excluded. There are so far no indications that aging of the plants has contributed to an increased accident risk. Maintaining the safety level with aging nuclear power plants can however be expected to be increasingly difficult. It is concluded that the 12 Swedish plants remain a major threat for severe radioactive pollution of the Swedish environment despite measures taken since 1980 to improve their safety. Closing of the nuclear power plants is the only possibility to eliminate this threat. It is recommended that until this is done, quantitative safety goals, same for all Swedish plants, shall be defined and strictly enforced. It is also recommended that utilities distributing misleading information about nuclear power risks shall have their operating license withdrawn. 37 refs
Safety Assessment - Swedish Nuclear Power Plants
Kjellstroem, B. [Luleaa Univ. of Technology (Sweden)
1996-12-31
Methods of assessing nuclear power plant risks
The concept of safety evaluation is based on safety criteria - standards or set qualitative values of parameters and indices used in designing nuclear power plants, incorporating demands on the quality of equipment and operation of the plant, its siting and the technical means for achieving nuclear safety. The concepts of basic and optimal risk values are presented. Factors indispensable for the evaluation of nuclear power plant risk are summed up, and the present world trend toward probability-based evaluation is discussed. (J.C.)
Jensen, P; Krogsgaard, M R; Christiansen, J;
1995-01-01
PURPOSE: The aim of this study was to establish the intraobserver and interobserver variability in the assessment of histologic type (tubular, villous, and tubulovillous) and grade of cytologic dysplasia (mild, moderate, and severe) in colorectal adenomas. METHODS: One hundred eighty-seven slides of adenomas were assessed twice by three experienced pathologists, with an interval of two months. Results were analyzed using kappa statistics. RESULTS: For agreement between first and second assessment (both type and grade of dysplasia), kappa values for the three specialists were 0.5345, 0...
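The kappa statistic used here measures chance-corrected agreement. A minimal sketch for one rater pair (weighted kappa variants, often preferred for ordinal grades such as dysplasia, are omitted):

```python
def cohens_kappa(table):
    """Cohen's kappa from a square confusion matrix of counts, where
    table[i][j] is the number of items rated category i by rater 1
    and category j by rater 2."""
    n = sum(sum(row) for row in table)
    # Observed agreement: proportion on the diagonal.
    po = sum(table[i][i] for i in range(len(table))) / n
    # Expected agreement under independence of the two raters.
    pe = sum((sum(table[i]) / n) * (sum(row[i] for row in table) / n)
             for i in range(len(table)))
    return (po - pe) / (1.0 - pe)
```

Kappa is 1 for perfect agreement and 0 when agreement is no better than chance; intraobserver values around 0.5, as reported above, are conventionally read as moderate agreement.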
Statistical analysis of regional capital and operating costs for electric power generation
Sanchez, L.R.; Myers, M.G.; Herrman, J.A.; Provanizano, A.J.
1977-10-01
This report presents the results of a three and one-half-month study conducted for Brookhaven National Lab. to develop capital and operating cost relationships for seven electric power generating technologies: oil-, coal-, gas-, and nuclear-fired steam-electric plants, hydroelectric plants, and gas-turbine plants. The methodology is based primarily on statistical analysis of Federal Power Commission data for plant construction and annual operating costs. The development of cost-output relationships for electric power generation is emphasized, considering the effects of scale, technology, and location on each of the generating processes investigated. The regional effects on cost are measured at the Census Region level to be consistent with the Brookhaven Multi-Regional Energy and Interindustry Regional Model of the United States. Preliminary cost relationships for system-wide costs - transmission, distribution, and general expenses - were also derived. These preliminary results cover the demand for transmission and distribution capacity and operating and maintenance costs in terms of system-service characteristics. 15 references, 6 figures, 23 tables.
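Cost-output relationships of the kind described above are commonly estimated in log-log form, where the fitted exponent measures returns to scale. A sketch with hypothetical data (the report's actual specifications and Federal Power Commission data are not reproduced here):

```python
import math

def scale_elasticity(capacities_mw, costs):
    """Fit ln(cost) = a + b ln(MW) by ordinary least squares and
    return (b, exp(a)). An exponent b < 1 indicates economies of
    scale: unit cost falls as plant capacity rises."""
    xs = [math.log(c) for c in capacities_mw]
    ys = [math.log(c) for c in costs]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return b, math.exp(a)
```

Regional and technology effects would enter such a regression as additional terms (e.g. dummy variables per Census Region), which this sketch leaves out.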
JiYeoun Lee
2009-01-01
A preprocessing scheme based on the linear prediction coefficient (LPC) residual is applied to higher-order statistics (HOSs) for automatic assessment of overall pathological voice quality. The normalized skewness and kurtosis are estimated from the LPC residual and show statistically meaningful distributions that characterize pathological voice quality. 83 voice samples of the sustained vowel /a/ phonation are used in this study and are independently assessed by a speech and language therapist (SALT) according to the grade of severity of dysphonia on the GRBAS scale. These are used to train and test a classification and regression tree (CART). The best result is obtained using an optimal decision tree implemented by a combination of the normalized skewness and kurtosis, with an accuracy of 92.9%. It is concluded that the method can be used as an assessment tool, providing a valuable aid to the SALT during clinical evaluation of overall pathological voice quality.
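The two HOS features named above are moment ratios of the residual signal. A minimal sketch of their computation (the LPC inverse filtering and the CART stage are not shown):

```python
def normalized_moments(x):
    """Normalized skewness m3 / m2^1.5 and kurtosis m4 / m2^2 of a
    sequence, using population central moments without bias
    correction."""
    n = len(x)
    mean = sum(x) / n
    m2 = sum((v - mean) ** 2 for v in x) / n
    m3 = sum((v - mean) ** 3 for v in x) / n
    m4 = sum((v - mean) ** 4 for v in x) / n
    return m3 / m2 ** 1.5, m4 / m2 ** 2
```

A clean voiced residual is close to symmetric with pronounced glottal pulses; deviations in these two ratios are the kind of distributional change the classifier exploits.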
Kim, Man Cheol; Jang, Seung Cheol [KAERI, Daejeon (Korea, Republic of)]
2011-08-15
For the purpose of developing a consensus method for the reliability assessment of safety-critical digital instrumentation and control systems in nuclear power plants, several high-level issues in the reliability assessment of safety-critical software based on Bayesian belief network modeling and statistical testing are discussed. Related to Bayesian belief network modeling, we discuss the relation between the assessment approach and the sources of evidence, the relation between qualitative and quantitative evidence, and how to consider qualitative evidence. Related to statistical testing, we discuss the need to consider context-specific software failure probabilities and the inability to perform a huge number of tests in the real world. The discussions in this paper are expected to provide a common basis for future discussions on the reliability assessment of safety-critical software.
Microphone array power ratio for quality assessment of reverberated speech
Berkun, Reuven; Cohen, Israel
2015-12-01
Speech signals in enclosed environments are often distorted by reverberation and noise. In speech communication systems with several randomly distributed microphones, involving a dynamic speaker and an unknown source location, it is of great interest to monitor the perceived quality at each microphone and select the signal with the best quality. Most existing approaches to quality estimation require prior information or a clean reference signal, which is unfortunately seldom available. In this paper, a practical non-intrusive method for quality assessment of reverberated speech signals is proposed. Using a statistical model of the reverberation process, we examine the energies measured by unidirectional elements in a microphone array. By measuring the power ratio, we obtain a measure of the amount of reverberation in the received acoustic signals. This measure is then utilized to derive a blind estimate of the direct-to-reverberation energy ratio in the room. The proposed approach attains a simple, reliable, and robust quality measure, demonstrated here through persuasive simulation results.
Data base of accident and agricultural statistics for transportation risk assessment
Saricks, C.L.; Williams, R.G.; Hopf, M.R.
1989-11-01
A state-level data base of accident and agricultural statistics has been developed to support risk assessment for transportation of spent nuclear fuels and high-level radioactive wastes. This data base will enhance the modeling capabilities for more route-specific analyses of potential risks associated with transportation of these wastes to a disposal site. The data base and methodology used to develop state-specific accident and agricultural data bases are described, and summaries of accident and agricultural statistics are provided. 27 refs., 9 tabs.
Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands)]; Schaaf, Arjen van der; Schilstra, Cornelis; Langendijk, Johannes A.; Veld, Aart A. van 't [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands)]
2012-03-15
Purpose: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. Methods and Materials: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. Results: It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. Conclusions: The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended.
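The key property behind the LASSO's advantage here, interpretable models via exact variable selection, can be sketched with a numpy-only proximal-gradient (ISTA) solver. The data are synthetic, not the xerostomia cohort, and for simplicity a Gaussian linear model replaces the logistic NTCP likelihood:

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of the L1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def lasso_ista(X, y, lam, n_iter=3000):
    """LASSO via iterative soft-thresholding (proximal gradient descent),
    minimizing 0.5*||y - X b||^2 + lam*||b||_1."""
    beta = np.zeros(X.shape[1])
    step = 1.0 / np.linalg.norm(X, 2) ** 2      # 1 / Lipschitz constant
    for _ in range(n_iter):
        beta = soft_threshold(beta - step * (X.T @ (X @ beta - y)), lam * step)
    return beta

rng = np.random.default_rng(42)
n, p = 300, 10                       # patients, candidate predictors
X = rng.standard_normal((n, p))
true_beta = np.zeros(p)
true_beta[:2] = [1.5, -1.0]          # only two predictors truly matter
y = X @ true_beta + 0.5 * rng.standard_normal(n)

beta_hat = lasso_ista(X, y, lam=30.0)
selected = np.flatnonzero(np.abs(beta_hat) > 1e-8)
```

Unlike stepwise selection, the L1 penalty drives irrelevant coefficients exactly to zero in a single convex fit, which is what makes the resulting model both sparse and easy to interpret.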
Langousis, Andreas; Mamalakis, Antonios; Deidda, Roberto; Marrocu, Marino
2016-01-01
To improve the skill of climate models (CMs) in reproducing the statistics of daily rainfall at the basin level, two types of statistical approaches have been suggested. One is statistical correction of CM rainfall outputs based on historical series of precipitation. The other, usually referred to as statistical rainfall downscaling, is the use of stochastic models to conditionally simulate rainfall series, based on large-scale atmospheric forcing from CMs. While promising, the latter approach has attracted less attention in recent years, since the downscaling schemes developed involved complex weather identification procedures and demonstrated limited success in reproducing several statistical features of rainfall. In a recent effort, Langousis and Kaleris () developed a statistical framework for simulation of daily rainfall intensities conditional on upper-air variables, which is simpler to implement and more accurately reproduces several statistical properties of actual rainfall records. Here we study the relative performance of (a) direct statistical correction of CM rainfall outputs using nonparametric distribution mapping, and (b) the statistical downscaling scheme of Langousis and Kaleris (), in reproducing historical rainfall statistics, including rainfall extremes, at a regional level. This is done for an intermediate-sized catchment in Italy, the Flumendosa catchment, using rainfall and atmospheric data from four CMs of the ENSEMBLES project. The obtained results are promising, since the proposed downscaling scheme is more accurate and robust in reproducing a number of historical rainfall statistics, independent of the CM used and the characteristics of the calibration period. This is particularly the case for yearly rainfall maxima.
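Approach (a), nonparametric distribution mapping, can be sketched in a few lines: each simulated value is replaced by the observed value at the same empirical quantile. The gamma-distributed "rainfall" below is synthetic, not the Flumendosa data:

```python
import numpy as np

def quantile_map(cm_hist, obs_hist, cm_series):
    """Nonparametric distribution mapping: replace each simulated value
    by the observed value at the same empirical quantile."""
    cm_sorted = np.sort(cm_hist)
    q = np.searchsorted(cm_sorted, cm_series, side="right") / len(cm_sorted)
    return np.quantile(np.sort(obs_hist), np.clip(q, 0.0, 1.0))

rng = np.random.default_rng(1)
obs = rng.gamma(shape=0.8, scale=10.0, size=5000)  # "observed" daily rainfall
cm = rng.gamma(shape=0.8, scale=6.0, size=5000)    # biased CM rainfall (too dry)
corrected = quantile_map(cm, obs, cm)
```

By construction the corrected series inherits the observed marginal distribution; what it cannot improve, and what downscaling schemes target, is the temporal structure of the simulated series.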
Assessment and financing of electric power projects
The aim of the appraisal of a project is to examine the economic need which a project is designed to meet, to judge whether the project is likely to meet this need in an efficient way, and to conclude what conditions should be attached to eventual Bank financing. Bank involvement continues throughout the life of the project helping to ensure that each project is carried out at the least possible cost and that it makes the expected contribution to the country's development. This paper gives an idea about the origin, nature and functions of the World Bank Group, describes the criteria used by the Bank in its power project appraisals, discusses the Bank's views on nuclear power, and concludes with a review of past lending and probable future sources of financing of electrical expansion in the less developed countries. (orig./UA)
TIDAL POWER: Economic and Technological Assessment
Montllonch Araquistain, Tatiana
2010-01-01
At the present time there is concern over global climate change, as well as a growing awareness among the worldwide population of the need to reduce greenhouse gas emissions. This, in fact, has led to an increase in power generation from renewable sources. Tidal energy has the potential to play a valuable role in a sustainable energy future. Its main advantage over other renewable sources is its predictability; tides can be predicted years in advance. The energy extracted from the tides can come fr...
Robust Statistical Tests of Dragon-Kings beyond Power Law Distributions
Pisarenko, V F
2011-01-01
We ask the question whether it is possible to diagnose the existence of "Dragon-Kings" (DK), namely anomalous observations compared to a power law background distribution of event sizes. We present two new statistical tests, the U-test and the DK-test, aimed at identifying the existence of even a single anomalous event in the tail of the distribution of just a few tens of observations. The DK-test in particular is derived such that the p-value of its statistic is independent of the exponent characterizing the null hypothesis. We demonstrate how to apply these two tests on the distributions of cities and of agglomerations in a number of countries. We find the following evidence for Dragon-Kings: London in the distribution of city sizes of Great Britain; Moscow and St-Petersburg in the distribution of city sizes in the Russian Federation; and Paris in the distribution of agglomeration sizes in France. True negatives are also reported, for instance the absence of Dragon-Kings in the distribution of cities in Ger...
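The flavor of such outlier tests can be conveyed with a crude screen in the same spirit (this is an illustrative simplification, not the paper's U-test or DK-test): fit a Pareto tail by the Hill estimator excluding the maximum, then compute the p-value of the maximum under that fit. The "city sizes" are synthetic:

```python
import numpy as np

def hill_exponent(x, k):
    """Hill estimator of the power-law tail exponent from the k largest values."""
    xs = np.sort(x)[::-1]
    return 1.0 / np.mean(np.log(xs[:k] / xs[k]))

def max_outlier_pvalue(x, k):
    """p-value of the largest observation under a Pareto tail fitted to the
    k largest values excluding the maximum -- a crude Dragon-King screen."""
    xs = np.sort(x)[::-1]
    alpha = 1.0 / np.mean(np.log(xs[1:k] / xs[k]))   # fit without the max
    tail = (xs[0] / xs[k]) ** (-alpha)               # P(one sample >= max)
    return 1.0 - (1.0 - tail) ** (k - 1)

rng = np.random.default_rng(7)
sizes = (rng.pareto(1.2, 500) + 1.0) * 10.0      # synthetic "city sizes"
sizes_dk = np.append(sizes, sizes.max() * 100)   # implant an anomalous "King"
alpha_hat = hill_exponent(sizes, 100)
p_dk = max_outlier_pvalue(sizes_dk, 100)
```

A small p-value flags the implanted observation as incompatible with the fitted power-law background; the paper's DK-test additionally makes the p-value independent of the unknown exponent.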
Decision tree approach to power systems security assessment
Wehenkel, Louis; Pavella, Mania
1993-01-01
An overview of the general decision tree approach to power system security assessment is presented. The general decision tree methodology is outlined, modifications proposed in the context of transient stability assessment are embedded, and further refinements are considered. The approach is then suitably tailored to handle other specifics of power systems security, relating to both preventive and emergency voltage control, in addition to transient stability. Trees are accordingly built in th...
A statistical model for seismic hazard assessment of hydraulic-fracturing-induced seismicity
Hajati, T.; Langenbruch, C.; Shapiro, S. A.
2015-12-01
We analyze the interevent time distribution of hydraulic-fracturing-induced seismicity collected during 18 stages at four different regions. We identify a universal statistical process describing the distribution of hydraulic-fracturing-induced events in time. The distribution of waiting times between subsequently occurring events is given by the exponential probability density function of the homogeneous Poisson process. Our findings suggest that hydraulic-fracturing-induced seismicity is directly triggered by the relaxation of stress and pore pressure perturbation initially created by the injection. Therefore, compared to this relaxation, the stress transfer caused by the occurrence of preceding seismic events is mainly insignificant for the seismogenesis of subsequently occurring events. We develop a statistical model to compute the occurrence probability of hydraulic-fracturing-induced seismicity. This model can be used to assess the seismic hazard associated with hydraulic fracturing operations. No aftershock triggering has to be included in the statistical model.
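The core statistical claim, exponentially distributed interevent times, and the resulting occurrence probability can be sketched with scipy. The event catalog and the rate are synthetic assumptions, not the paper's 18-stage data set:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
rate = 0.5                                           # events per minute (assumed)
times = np.cumsum(rng.exponential(1.0 / rate, 400))  # synthetic event catalog
dt = np.diff(times)

# a homogeneous Poisson process has exponential interevent times;
# test the empirical waiting times against that null
ks = stats.kstest(dt, "expon", args=(0.0, dt.mean()))

# occurrence probability of at least one event in a window of T minutes
lam = 1.0 / dt.mean()
T = 10.0
p_event = 1.0 - np.exp(-lam * T)
```

Under the paper's model, the same `1 - exp(-lam*T)` formula, with a rate tied to the injection-induced stress relaxation, yields the hazard estimate, with no aftershock-triggering term needed.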
Employment of kernel methods on wind turbine power performance assessment
Skrimpas, Georgios Alexandros; Sweeney, Christian Walsted; Marhadi, Kun S.; Jensen, Bogi Bech; Mijatovic, Nenad; Holbøll, Joachim
2015-01-01
A power performance assessment technique is developed for the detection of power production discrepancies in wind turbines. The method employs a widely used nonparametric pattern recognition technique, the kernel methods. The evaluation is based on the trending of an extracted feature from the ke...
Transient stability risk assessment of power systems incorporating wind farms
Miao, Lu; Fang, Jiakun; Wen, Jinyu
2013-01-01
Large-scale wind farm integration has brought several aspects of challenges to the transient stability of power systems. This paper focuses on the research of the transient stability of power systems incorporating with wind farms by utilizing risk assessment methods. The detailed model of double ...
Windfarm Generation Assessment for Reliability Analysis of Power Systems
Negra, Nicola Barberis; Holmstrøm, Ole; Bak-Jensen, Birgitte; Sørensen, Poul
2007-01-01
Due to the fast development of wind generation in the past ten years, increasing interest has been paid to techniques for assessing different aspects of power systems with a large amount of installed wind generation. One of these aspects concerns power system reliability. Windfarm modelling plays a...
Statistical analysis of wind power in the region of Veracruz (Mexico)
Cancino-Solorzano, Yoreley [Departamento de Ing Electrica-Electronica, Instituto Tecnologico de Veracruz, Calzada Miguel A. de Quevedo 2779, 91860 Veracruz (Mexico); Xiberta-Bernat, Jorge [Departamento de Energia, Escuela Tecnica Superior de Ingenieros de Minas, Universidad de Oviedo, C/Independencia, 13, 2a Planta, 33004 Oviedo (Spain)
2009-06-15
The capacity of the Mexican electricity sector faces the challenge of satisfying a demand of 80 GW forecast by 2016. This value supposes a steady yearly average increase of some 4.9%. The capacity additions for the next eight years will be made up mainly of combined cycle power plants, which could be a threat to the energy supply of the country because the country is not self-sufficient in natural gas. As an alternative, wind energy could be a more suitable option than combined cycle power plants. This option is backed by market trends indicating that wind technology costs will continue to decrease in the near future, as has happened in recent years. Evaluation of the wind potential in different areas of the country must be carried out in order to make the best possible use of this option. This paper gives a statistical analysis of the wind characteristics in the region of Veracruz. The daily, monthly and annual wind speed values have been studied together with their prevailing direction. The data analyzed correspond to five meteorological stations and two anemometric stations located in the aforementioned area. (author)
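Wind-speed statistics of this kind are conventionally summarized by a two-parameter Weibull fit, from which mean speed and mean power density follow in closed form. The sketch below uses synthetic hourly speeds with assumed shape and scale, not the Veracruz measurements:

```python
import math
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
k_true, c_true = 2.0, 7.0                # shape, scale (m/s) -- assumed values
v = c_true * rng.weibull(k_true, 8760)   # one synthetic year of hourly speeds

# two-parameter Weibull fit (location fixed at zero), the usual wind model
k_fit, _, c_fit = stats.weibull_min.fit(v, floc=0)

# mean speed and mean power density (air density rho = 1.225 kg/m^3)
v_mean = c_fit * math.gamma(1.0 + 1.0 / k_fit)
power_density = 0.5 * 1.225 * c_fit ** 3 * math.gamma(1.0 + 3.0 / k_fit)  # W/m^2
```

Because power scales with the cube of wind speed, the power density depends on `gamma(1 + 3/k)`, not just the mean speed, which is why the full distribution (and not only averages) matters for siting.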
Hacke, P.; Spataru, S.
2014-08-01
We propose a method for increasing the frequency of data collection and reducing the time and cost of accelerated lifetime testing of photovoltaic modules undergoing potential-induced degradation (PID). This consists of in-situ measurements of dark current-voltage curves of the modules at elevated stress temperature, their use to determine the maximum power at 25 degrees C standard test conditions (STC), and distribution statistics for determining degradation rates as a function of stress level. The semi-continuous data obtained by this method clearly show degradation curves of the maximum power, including an incubation phase, rates and extent of degradation, precise time to failure, and partial recovery. Stress tests were performed on crystalline silicon modules at 85% relative humidity and 60 degrees C, 72 degrees C, and 85 degrees C. Activation energy for the mean time to failure (1% relative) of 0.85 eV was determined and a mean time to failure of 8,000 h at 25 degrees C and 85% relative humidity is predicted. No clear trend in maximum degradation as a function of stress temperature was observed.
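The extrapolation from stress temperatures to 25 degrees C rests on an Arrhenius fit of ln(MTTF) against inverse thermal energy. The sketch below uses illustrative MTTF values at the three stress temperatures, not the paper's measurements (the paper reports Ea = 0.85 eV and a predicted MTTF of 8,000 h):

```python
import numpy as np

K_B = 8.617e-5  # Boltzmann constant, eV/K

def fit_activation_energy(temps_c, mttf_h):
    """Arrhenius fit: ln(MTTF) = ln(A) + Ea / (kB * T)."""
    inv_kt = 1.0 / (K_B * (np.asarray(temps_c) + 273.15))
    ea, ln_a = np.polyfit(inv_kt, np.log(mttf_h), 1)
    return ea, np.exp(ln_a)

def predict_mttf(temp_c, ea, a):
    """Extrapolate mean time to failure to another temperature."""
    return a * np.exp(ea / (K_B * (temp_c + 273.15)))

temps = [60.0, 72.0, 85.0]      # stress temperatures from the test matrix
mttf = [900.0, 330.0, 110.0]    # hours -- hypothetical, for illustration only
ea, a = fit_activation_energy(temps, mttf)
mttf_25 = predict_mttf(25.0, ea, a)
```

The fitted slope is the activation energy in eV, and lower temperatures yield exponentially longer predicted lifetimes, which is what makes accelerated testing at 60-85 degrees C informative about field conditions.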
Arkadov, G V; Rodionov, A N
2012-01-01
Probabilistic safety assessment methods are used to calculate nuclear power plant durability and resource lifetime. Successful calculation of the reliability and ageing of components is critical for forecasting safety and directing preventative maintenance, and Probabilistic safety assessment for optimum nuclear power plant life management provides a comprehensive review of the theory and application of these methods. Part one reviews probabilistic methods for predicting the reliability of equipment. Following an introduction to key terminology, concepts and definitions, formal-statistical and various physico-statistical approaches are discussed. Approaches based on the use of defect-free models are considered, along with those using binomial distribution and models bas...
Nuclear power plant performance statistics. Comparison with fossil-fired units
The joint UNIPEDE/World Energy Conference Committee on Availability of Thermal Generating Plants has a mandate to study the availability of thermal plants and the different factors that influence it. This has led to the collection and publication at the Congress of the World Energy Conference (WEC) every third year of availability and unavailability factors to be used in systems reliability studies and operations and maintenance planning. For nuclear power plants the joint UNIPEDE/WEC Committee relies on the IAEA to provide availability and unavailability data. The IAEA has published an annual report with operating data from nuclear plants in its Member States since 1971, covering in addition back data from the early 1960s. These reports have developed over the years and in the early 1970s the format was brought into close conformity with that used by UNIPEDE and WEC to report performance of fossil-fired generating plants. Since 1974 an annual analytical summary report has been prepared. In 1981 all information on operating experience with nuclear power plants was placed in a computer file for easier reference. The computerized Power Reactor Information System (PRIS) ensures that data are easily retrievable and at its present level it remains compatible with various national systems. The objectives for the IAEA data collection and evaluation have developed significantly since 1970. At first, the IAEA primarily wanted to enable the individual power plant operator to compare the performance of his own plant with that of others of the same type; when enough data had been collected, they provided the basis for assessment of the fundamental performance parameters used in economic project studies; now, the data base merits being used in setting availability objectives for power plant operations. (author)
Chu, Ying; Mou, Xuanqin; Fu, Hong; Ji, Zhen
2015-11-01
We present a general purpose blind image quality assessment (IQA) method using the statistical independence hidden in the joint distributions of divisive normalization transform (DNT) representations for natural images. The DNT simulates the redundancy reduction process of the human visual system and has good statistical independence for natural undistorted images; meanwhile, this statistical independence changes as the images suffer from distortion. Inspired by this, we investigate the changes in statistical independence between neighboring DNT outputs across the space and scale for distorted images and propose an independence uncertainty index as a blind IQA (BIQA) feature to measure the image changes. The extracted features are then fed into a regression model to predict the image quality. The proposed BIQA metric is called statistical independence (STAIND). We evaluated STAIND on five public databases: LIVE, CSIQ, TID2013, IRCCyN/IVC Art IQA, and intentionally blurred background images. The performances are relatively high for both single- and cross-database experiments. When compared with the state-of-the-art BIQA algorithms, as well as representative full-reference IQA metrics, such as SSIM, STAIND shows fairly good performance in terms of quality prediction accuracy, stability, robustness, and computational costs.
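The statistical-independence property that STAIND exploits can be demonstrated on a one-dimensional toy model: a Gaussian scale mixture (a stand-in for natural-image wavelet coefficients, not actual image data) has strongly dependent neighboring magnitudes, and divisive normalization largely removes that dependence. The window and saturation constant below are arbitrary assumptions:

```python
import numpy as np

def divisive_normalization(x, win=5, sigma=0.1):
    """DNT: divide each coefficient by the pooled energy of its neighborhood."""
    local_energy = np.convolve(x ** 2, np.ones(win) / win, mode="same")
    return x / np.sqrt(sigma ** 2 + local_energy)

def neighbor_sq_corr(x):
    """Correlation between squared magnitudes of adjacent coefficients."""
    return float(np.corrcoef(x[:-1] ** 2, x[1:] ** 2)[0, 1])

rng = np.random.default_rng(4)
n = 50000
# Gaussian scale mixture: a locally shared multiplier couples the magnitudes
# of neighboring coefficients, mimicking wavelet statistics of natural images
z = np.repeat(rng.uniform(0.2, 3.0, n // 8), 8)
raw = z * rng.standard_normal(n)
dnt = divisive_normalization(raw)

corr_raw = neighbor_sq_corr(raw)   # substantial magnitude dependence
corr_dnt = neighbor_sq_corr(dnt)   # DNT largely removes it
```

Distortion disturbs exactly this restored independence, which is why changes in the joint statistics of neighboring DNT outputs can serve as a no-reference quality feature.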
Perles, Stephanie J.; Wagner, Tyler; Irwin, Brian J.; Manning, Douglas R.; Callahan, Kristina K.; Marshall, Matthew R.
2014-09-01
Forests are socioeconomically and ecologically important ecosystems that are exposed to a variety of natural and anthropogenic stressors. As such, monitoring forest condition and detecting temporal changes therein remain critical to sound public and private forestland management. The National Park Service's Vital Signs monitoring program collects information on many forest health indicators, including species richness, cover by exotics, browse pressure, and forest regeneration. We applied a mixed-model approach to partition variability in data for 30 forest health indicators collected from several national parks in the eastern United States. We then used the estimated variance components in a simulation model to evaluate trend detection capabilities for each indicator. We investigated the extent to which the following factors affected ability to detect trends: (a) sample design: using simple panel versus connected panel design, (b) effect size: increasing trend magnitude, (c) sample size: varying the number of plots sampled each year, and (d) stratified sampling: post-stratifying plots into vegetation domains. Statistical power varied among indicators; however, indicators that measured the proportion of a total yielded higher power when compared to indicators that measured absolute or average values. In addition, the total variability for an indicator appeared to influence power to detect temporal trends more than how total variance was partitioned among spatial and temporal sources. Based on these analyses and the monitoring objectives of the Vital Signs program, the current sampling design is likely overly intensive for detecting a 5% trend per year for all indicators and is appropriate for detecting a 1% trend per year in most indicators.
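The simulation-based power analysis described above can be sketched as a Monte Carlo loop: simulate monitoring data with a known trend plus spatial and residual variance components, then count how often a regression detects the trend. The variance magnitudes and design below are illustrative assumptions, not the Vital Signs estimates:

```python
import numpy as np
from scipy import stats

def trend_power(n_years, n_plots, trend_pct, sd_spatial, sd_resid,
                alpha=0.05, n_sim=500, seed=0):
    """Monte Carlo power: fraction of simulated monitoring data sets in which
    linear regression detects the imposed positive trend at level alpha."""
    rng = np.random.default_rng(seed)
    years = np.arange(n_years)
    x = np.tile(years, n_plots)
    detected = 0
    for _ in range(n_sim):
        plot_eff = rng.normal(0.0, sd_spatial, n_plots)         # spatial variance
        y = (100.0 * (1.0 + trend_pct / 100.0 * years)           # mean trend
             + plot_eff[:, None]
             + rng.normal(0.0, sd_resid, (n_plots, n_years)))    # residual noise
        fit = stats.linregress(x, y.ravel())
        if fit.pvalue < alpha and fit.slope > 0:
            detected += 1
    return detected / n_sim

power_strong = trend_power(10, 30, trend_pct=5.0, sd_spatial=10.0, sd_resid=10.0)
power_weak = trend_power(10, 30, trend_pct=0.3, sd_spatial=10.0, sd_resid=10.0)
```

Running the same loop across indicator-specific variance components is what lets a monitoring program judge whether its sampling intensity is matched to the trend magnitudes it needs to detect.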
Application of advanced statistical methods in assessment of the late phase of a nuclear accident
Hofman, Radek
Praha: ČVUT, 2008, s. 1-4. [Dny radiacni ochrany /30/, Liptovsky Jan (SK), 10.11.2008-14.11.2008]. R&D Projects: GA ČR(CZ) GA102/07/1596. Institutional research plan: CEZ:AV0Z10750506. Keywords: radiation protection. Subject RIV: DI - Air Pollution; Quality. http://library.utia.cas.cz/separaty/2008/AS/hofman-application of advanced statistical methods in assessment of the late phase of a nuclear accident.pdf
Multivariable statistical process control to situation assessment of a sequencing batch reactor
Ruiz Ordóñez, Magda; Colomer, Joan; Colprim, Jesus; Meléndez, Joaquim
2004-01-01
In this work, a combination of Multivariate Statistical Process Control (MSPC) and an automatic classification algorithm is developed for application in a wastewater treatment plant. Multiway Principal Component Analysis is used as the MSPC method. The goal is to create a model that describes the batch direction and helps to fix the limits used to determine abnormal situations. Then, an automatic classification algorithm is used for situation assessment of the process.
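The MSPC idea, a PCA model of normal batches plus a control limit on the residual, can be sketched with numpy. The batch data below are synthetic, and for brevity ordinary PCA on already-unfolded data stands in for the full multiway analysis:

```python
import numpy as np

rng = np.random.default_rng(11)
n_batches, n_vars, n_comp = 100, 20, 3

# unfolded batch-wise data: each row is one batch's flattened trajectory,
# driven by a few latent directions plus measurement noise (all synthetic)
loadings_true = rng.standard_normal((n_comp, n_vars))
scores_true = rng.standard_normal((n_batches, n_comp))
X = scores_true @ loadings_true + 0.2 * rng.standard_normal((n_batches, n_vars))

mu = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
P = Vt[:n_comp].T                        # retained principal directions

def spe(x):
    """Squared prediction error of a batch against the PCA model."""
    r = (x - mu) - P @ (P.T @ (x - mu))
    return float(r @ r)

# empirical 95% control limit from the training batches
limit = np.quantile([spe(x) for x in X], 0.95)

normal_batch = scores_true[0] @ loadings_true + 0.2 * rng.standard_normal(n_vars)
faulty_batch = normal_batch + 3.0        # a gross offset in every variable
```

A batch whose SPE exceeds the limit no longer fits the correlation structure of normal operation; those flagged batches are what the subsequent classification step would assign to specific abnormal situations.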
Statistical issues in the assessment of health outcomes in children : methodological review.
Lancaster, Gillian
2009-01-01
The lack of outcome measures that are validated for use on children limits the effectiveness and generalizability of paediatric health care interventions. Statistical epidemiology is a broad concept encompassing a wide range of useful techniques for use in child health outcome assessment and development. However, the range of techniques that are available is often confusing and prohibits their adoption. In the paper an overview of methodology is provided within the paediatric context. It is d...
A statistical assessment of population trends for data deficient Mexican amphibians
Esther Quintero; Thessen, Anne E.; Paulina Arias-Caballero; Bárbara Ayala-Orozco
2014-01-01
Background. Mexico has the world's fifth largest population of amphibians and is the country with the second highest number of threatened amphibian species. About 10% of Mexican amphibians lack enough data to be assigned to a risk category by the IUCN, so in this paper we test a statistical tool that, in the absence of specific demographic data, can assess a species' risk of extinction and population trend, and improve understanding of which variables increase their vulnerability. Recent stud...
Jiwen Ge; Guihua Ran; Wenjie Miao; Huafeng Cao; Shuyuan Wu; Lamei Cheng
2013-01-01
To provide a reasonable basis for the scientific management of water resources, with directive significance for sustaining the health of the Gufu River and maintaining the stability of the water ecosystem of the Three Gorges Reservoir of the Yangtze River, central China, multiple statistical methods including Cluster Analysis (CA), Discriminant Analysis (DA) and Principal Component Analysis (PCA) were performed to assess the spatial-temporal variations and interpret the water quality data. The data were ...
Power source life cycle assessment by the Bilan Carbone method
Bilan Carbone is a method for assessing the energy spent in the form of CO2 formation and its impact on climate change (carbon footprint). The method assesses each step in power production and finds hidden energy flows for modelling future energy scenarios. The principles of the method are outlined and an example of its application is presented. (orig.)
Evaluation and assessment of nuclear power plant seismic methodology
Bernreuter, D.; Tokarz, F.; Wight, L.; Smith, P.; Wells, J.; Barlow, R.
1977-03-01
The major emphasis of this study is to develop a methodology that can be used to assess the current methods used for assuring the seismic safety of nuclear power plants. The proposed methodology makes use of system-analysis techniques and Monte Carlo schemes. Also, in this study, we evaluate previous assessments of the current seismic-design methodology.
Amzal Billy
2011-02-01
Abstract Background Safety assessment of genetically modified organisms is currently often performed by comparative evaluation. However, natural variation of plant characteristics between commercial varieties is usually not considered explicitly in the statistical computations underlying the assessment. Results Statistical methods are described for the assessment of the difference between a genetically modified (GM) plant variety and a conventional non-GM counterpart, and for the assessment of the equivalence between the GM variety and a group of reference plant varieties which have a history of safe use. It is proposed to present the results of both difference and equivalence testing for all relevant plant characteristics simultaneously in one or a few graphs, as an aid for further interpretation in safety assessment. A procedure is suggested to derive equivalence limits from the observed results for the reference plant varieties using a specific implementation of the linear mixed model. Three different equivalence tests are defined to classify any result in one of four equivalence classes. The performance of the proposed methods is investigated by a simulation study, and the methods are illustrated on compositional data from a field study on maize grain. Conclusions A clear distinction of practical relevance is shown between difference and equivalence testing. The proposed tests are shown to have appropriate performance characteristics by simulation, and the proposed simultaneous graphical representation of results was found to be helpful for the interpretation of results from a practical field trial data set.
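The distinction between difference and equivalence testing can be made concrete with a standard two one-sided tests (TOST) sketch. The data and the equivalence margin below are synthetic assumptions, and the plain two-sample setup stands in for the paper's linear mixed model:

```python
import numpy as np
from scipy import stats

def difference_and_equivalence(gm, ref, margin):
    """Two-sample t-test for difference plus two one-sided tests (TOST)
    for equivalence within +/- margin; equivalence is declared when both
    one-sided tests reject, i.e. the returned TOST p-value is small."""
    n1, n2 = len(gm), len(ref)
    d = gm.mean() - ref.mean()
    se = np.sqrt(gm.var(ddof=1) / n1 + ref.var(ddof=1) / n2)
    df = n1 + n2 - 2
    _, p_diff = stats.ttest_ind(gm, ref)
    p_lower = 1.0 - stats.t.cdf((d + margin) / se, df)  # H0: d <= -margin
    p_upper = stats.t.cdf((d - margin) / se, df)        # H0: d >= +margin
    return p_diff, max(p_lower, p_upper)

rng = np.random.default_rng(2)
ref = rng.normal(10.0, 1.0, 60)   # reference varieties (e.g. a compositional trait)
gm = rng.normal(10.1, 1.0, 60)    # GM variety: tiny, practically irrelevant shift
p_diff, p_tost = difference_and_equivalence(gm, ref, margin=1.0)
```

A variety can simultaneously fail to show a significant difference and be demonstrably equivalent; only the second statement positively bounds the effect, which is the point the paper's graphical presentation makes for all traits at once.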
Technology assessment Jordan Nuclear Power Plant Project
Preliminary regional analysis was carried out to identify potential sites for an NPP, followed by screening of these sites and selection of candidate sites. Sites near Aqaba, where sea water can be used for cooling, are proposed: (i) Site 1, on the coast, where sea water can be used for direct cooling; (ii) Site 2, 10 km east of the Gulf of Aqaba shoreline at the Saudi Arabian border; (iii) Site 3, 4 km east of the Gulf of Aqaba shoreline. Only the granitic basement in the east of the 6 km² site should be considered as a potential site for an NPP. Preliminary probabilistic seismic hazard assessment gives: an Operating-Basis Earthquake (OBE, 475-year return period) in the range of 0.163-0.182 g; a Safe Shutdown Earthquake (SSE, 10,000-year return period) in the range of 0.333-0.502 g. The process also includes the setting up of a nuclear company and other organizational matters. Regulations in development cover: site approval; construction permitting; overall licensing; safety (design, construction, training, operations, QA); emergency planning; decommissioning; and spent fuel and radioactive waste management. JAEC's technology assessment strategy and evaluation methodology are presented
Hybrid algorithm for rotor angle security assessment in power systems
D. Prasad Wadduwage
2015-08-01
Transient rotor angle stability assessment and oscillatory rotor angle stability assessment subsequent to a contingency are integral components of dynamic security assessment (DSA) in power systems. This study proposes a hybrid algorithm to determine whether the post-fault power system is secure with respect to both transient rotor angle stability and oscillatory rotor angle stability subsequent to a set of known contingencies. The hybrid algorithm first uses a new security measure developed based on the concept of Lyapunov exponents (LEs) to determine the transient security of the post-fault power system. Later, the transiently secure power swing curves are analysed using an improved Prony algorithm which extracts the dominant oscillatory modes and estimates their damping ratios. The damping ratio is a security measure of the oscillatory security of the post-fault power system subsequent to the contingency. The suitability of the proposed hybrid algorithm for DSA in power systems is illustrated using different contingencies of a 16-generator 68-bus test system and a 50-generator 470-bus test system. The accuracy of the stability conclusions and the acceptable computational burden indicate that the proposed hybrid algorithm is suitable for real-time security assessment with respect to both transient rotor angle stability and oscillatory rotor angle stability under multiple contingencies of the power system.
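The Prony step, extracting oscillatory modes and damping ratios from a power swing curve, can be sketched with basic linear prediction (a minimal classical Prony fit, not the paper's improved algorithm). The swing signal below is a synthetic single-mode example:

```python
import numpy as np

def prony_modes(y, dt, order):
    """Prony analysis via linear prediction: fit y[n] as a combination of
    its past values; the roots of the characteristic polynomial give the
    oscillatory modes and their damping ratios."""
    N = len(y)
    A = np.column_stack([y[order - k - 1:N - k - 1] for k in range(order)])
    coeffs, *_ = np.linalg.lstsq(A, y[order:], rcond=None)
    roots = np.roots(np.concatenate(([1.0], -coeffs)))
    s = np.log(roots.astype(complex)) / dt     # continuous-time poles
    freq = np.abs(s.imag) / (2.0 * np.pi)      # modal frequencies (Hz)
    damping = -s.real / np.abs(s)              # damping ratios
    return freq, damping

# synthetic post-fault power swing: a single 0.5 Hz mode with 5% damping
dt = 0.02
t = np.arange(0.0, 20.0, dt)
zeta, f = 0.05, 0.5
wn = 2.0 * np.pi * f / np.sqrt(1.0 - zeta ** 2)
y = np.exp(-zeta * wn * t) * np.cos(2.0 * np.pi * f * t)

freq, damp = prony_modes(y, dt, order=2)
```

A recovered damping ratio above a chosen threshold is precisely the oscillatory-security criterion the hybrid algorithm applies to the transiently secure swing curves.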
National-Scale Wind Resource Assessment for Power Generation (Presentation)
Baring-Gould, E. I.
2013-08-01
This presentation describes the current standards for conducting a national-scale wind resource assessment for power generation, along with the risk/benefit considerations to be considered when beginning a wind resource assessment. The presentation describes changes in turbine technology and viable wind deployment due to more modern turbine technology and taller towers and shows how the Philippines national wind resource assessment evolved over time to reflect changes that arise from updated technologies and taller towers.