WorldWideScience

Sample records for assessment statistical power

  1. Assessment and statistics of Brazilian hydroelectric power plants: Dam areas versus installed and firm power

    International Nuclear Information System (INIS)

    The Brazilian relief, predominantly composed of small mountains and plateaus, contributed to the formation of rivers with many waterfalls. With the exception of North-eastern Brazil, the climate of the country is rainy, which helps keep water flows high. These elements are essential to a high hydroelectric potential, contributing to the choice of hydroelectric power plants as the main technology of electricity generation in Brazil. Though this is a renewable source whose resource is free, dams must be established, which generates high environmental and social impacts. The objective of this study is to evaluate the impact caused by these dams through the use of environmental indexes. These indexes are the ratio of the installed power of a hydro power plant to its dam area, and the ratio of its firm power to that dam area. In this study, the greatest mean values were found in the South, Southeast, and Northeast regions, respectively, and the smallest mean values were found in the North and Mid-West regions, respectively. The greatest mean indexes were also found in dams established in the 1950s. In the last six decades, the smallest indexes were registered by dams established in the 1980s. These indexes could be utilized as important instruments for environmental impact assessments, and could enable a dam to be established that depletes an ecosystem as little as possible. (author)
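    The index construction described above can be illustrated in a few lines. The plant names and figures below are hypothetical, not data from the study:

```python
# Hypothetical plants: names and figures are illustrative only,
# not data from the study described above.
plants = {
    "Plant A": {"installed_mw": 1200.0, "firm_mw": 700.0, "dam_area_km2": 150.0},
    "Plant B": {"installed_mw": 300.0, "firm_mw": 180.0, "dam_area_km2": 20.0},
}

def power_density_indexes(plant):
    """Return (installed MW per km2 of dam area, firm MW per km2)."""
    area = plant["dam_area_km2"]
    return plant["installed_mw"] / area, plant["firm_mw"] / area

for name, plant in plants.items():
    installed_idx, firm_idx = power_density_indexes(plant)
    print(f"{name}: {installed_idx:.2f} MW/km2 installed, {firm_idx:.2f} MW/km2 firm")
```

    A higher index means more power per unit of flooded area, i.e. less environmental impact per megawatt.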

  2. Distance matters. Assessing socioeconomic impacts of the Dukovany nuclear power plant in the Czech Republic: Local perceptions and statistical evidence

    Directory of Open Access Journals (Sweden)

    Frantál Bohumil

    2016-03-01

    Full Text Available The effect of geographical distance on the extent of socioeconomic impacts of the Dukovany nuclear power plant in the Czech Republic is assessed by combining two different research approaches. First, we survey how people living in municipalities in the vicinity of the power plant perceive impacts on their personal quality of life. Second, we explore the effects of the power plant on regional development by analysing long-term statistical data about the unemployment rate, the share of workers in the energy sector and overall job opportunities in the respective municipalities. The results indicate that the power plant has had significant positive impacts on surrounding communities both as perceived by residents and as evidenced by the statistical data. The level of impacts is, however, significantly influenced by the spatial and social distances of communities and individuals from the power plant. The perception of positive impacts correlates with geographical proximity to the power plant, while the hypothetical distance where positive effects on the quality of life are no longer perceived was estimated at about 15 km. Positive effects are also more likely to be reported by highly educated, young and middle-aged and economically active persons, whose work is connected to the power plant.

  3. DISTRIBUTED GRID-CONNECTED PHOTOVOLTAIC POWER SYSTEM EMISSION OFFSET ASSESSMENT: STATISTICAL TEST OF SIMULATED- AND MEASURED-BASED DATA

    Science.gov (United States)

    This study assessed the pollutant emission offset potential of distributed grid-connected photovoltaic (PV) power systems. Computer-simulated performance results were utilized for 211 PV systems located across the U.S. The PV systems' monthly electrical energy outputs were based ...

  4. The power of statistical tests using field trial count data of non-target organisms in environmental risk assessment of genetically modified plants

    NARCIS (Netherlands)

    Voet, van der H.; Goedhart, P.W.

    2015-01-01

    Publications on power analyses for field trial count data comparing transgenic and conventional crops have reported widely varying requirements for the replication needed to obtain statistical tests with adequate power. These studies are critically reviewed and complemented with a new simulation study
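    A power analysis for field trial count data of the kind reviewed here can be approximated by simulation. The sketch below assumes simple Poisson counts and a normal-approximation test on plot means; the paper's own models (e.g. overdispersed counts) would generally yield lower power than this idealized setup:

```python
import math
import random

random.seed(7)

def poisson(lam):
    """Poisson sampler (Knuth's algorithm); adequate for small means."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        k += 1
        p *= random.random()
        if p <= limit:
            return k - 1

def simulated_power(mean_control, ratio, n_plots, trials=2000):
    """Fraction of simulated trials in which a z-test on plot means rejects H0."""
    rejections = 0
    for _ in range(trials):
        a = [poisson(mean_control) for _ in range(n_plots)]
        b = [poisson(mean_control * ratio) for _ in range(n_plots)]
        ma, mb = sum(a) / n_plots, sum(b) / n_plots
        # Poisson variance equals the mean; standard error of the difference.
        se = math.sqrt(ma / n_plots + mb / n_plots) or 1.0
        if abs(mb - ma) / se > 1.96:
            rejections += 1
    return rejections / trials
```

    Varying `n_plots` and `ratio` in this sketch reproduces the qualitative point of the review: replication requirements depend strongly on the effect size and the assumed count distribution.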

  5. Availability statistics for thermal power plants

    International Nuclear Information System (INIS)

    Denmark, Finland and Sweden have adopted almost the same methods of recording and calculation of availability data. For a number of years comparable availability and outage data for thermal power have been summarized and published in one report. The purpose of the report now presented for 1989 containing general statistical data is to produce basic information on existing kinds of thermal power in the countries concerned. With this information as a basis additional and more detailed information can be exchanged in direct contacts between bodies in the above mentioned countries according to forms established for that purpose. The report includes fossil steam power, nuclear power and gas turbines. The information is presented in separate diagrams for each country, but for plants burning fossil fuel also in a joint NORDEL statistics with data grouped according to type of fuel used. The grouping of units into classes of capacity has been made in accordance with the classification adopted by UNIPEDE/WEC. Values based on energy have been adopted as basic availability data. The same applies to the preference made in the definitions outlined by UNIPEDE and UNIPEDE/WEC. Some data based on time have been included to make possible comparisons with certain international values and for further illustration of the performance. For values given in the report, the definitions in the NORDEL document ''Concepts of Availability for Thermal Power, September 1977'', have been applied. (author)

  6. Availability statistics for thermal power plants

    International Nuclear Information System (INIS)

    Denmark, Finland and Sweden have adopted almost the same methods of recording and calculation of availability data. For a number of years comparable availability and outage data for thermal power have been summarized and published in one report. The purpose of the report now presented for 1991 containing general statistical data is to produce basic information on existing kinds of thermal power in the countries concerned. With this information as a basis additional and more detailed information can be exchanged in direct contacts between bodies in the above mentioned countries according to forms established for that purpose. The report includes fossil steam power, nuclear power and gas turbines. The information is presented in separate diagrams for each country, but for plants burning fossil fuel also in a joint NORDEL statistics with data grouped according to type of fuel used. The grouping of units into classes of capacity has been made in accordance with the classification adopted by UNIPEDE/WEC. Values based on energy have been adopted as basic availability data. The same applies to the preference made in the definitions outlined by UNIPEDE and UNIPEDE/WEC. Some data based on time have been included to make possible comparisons with certain international values and for further illustration of the performance. For values given in the report, the definitions in the NORDEL document ''Concepts of Availability for Thermal Power'', September 1977, have been applied. (au)

  7. Availability statistics for thermal power plants

    International Nuclear Information System (INIS)

    Denmark, Finland and Sweden have adopted almost the same methods of recording and calculation of availability data. For a number of years comparable availability and outage data for thermal power have been summarized and published in one report. The purpose of the report now presented for 1988 containing general statistical data is to produce basic information on existing kinds of thermal power in the countries concerned. With this information as a basis additional and more detailed information can be exchanged in direct contacts between bodies in the above mentioned countries according to forms established for that purpose. The report includes fossil steam power, nuclear power and gas turbines. The information is presented in separate diagrams for each country, but for plants burning fossil fuel also in a joint NORDEL statistics with data grouped according to type of fuel used. The grouping of units into classes of capacity has been made in accordance with the classification adopted by UNIPEDE/WEC. Values based on energy have been adopted as basic availability data. The same applies to the preference made in the definitions outlined by UNIPEDE and UNIPEDE/WEC. Some data based on time have been included to make possible comparisons with certain international values and for further illustration of the performance. For values given in the report, the definitions in the NORDEL document ''Concepts of Availability for Thermal Power, September 1977'', have been applied. (author)

  8. Power quality assessment

    International Nuclear Information System (INIS)

    Electrical power systems are exposed to different types of power quality disturbance problems. Assessment of power quality is necessary for maintaining the accurate operation of sensitive equipment, especially in nuclear installations; it also ensures that unnecessary energy losses in a power system are kept at a minimum, which leads to more profit. With advances in technology, industrial and commercial facilities are growing in many regions. Power quality problems have been a major concern among engineers, particularly in an industrial environment, where there is much large-scale equipment. Thus, it would be useful to investigate and mitigate the power quality problems. Assessment of power quality requires the identification of any anomalous behaviour on a power system which adversely affects the normal operation of electrical or electronic equipment. The choice of monitoring equipment in a survey is also important to ascertain a solution to these power quality problems. A power quality assessment involves gathering data resources; analyzing the data (with reference to power quality standards); then, if problems exist, recommending mitigation techniques. The main objective of the present work is to investigate and mitigate power quality problems in nuclear installations. Normally, electrical power is supplied to the installations via two sources to keep reliability good. Each source is designed to carry the full load. The assessment of power quality was performed at the nuclear installations for both sources at different operating conditions. The thesis begins with a discussion of power quality definitions and the results of previous studies in power quality monitoring. The assessment determines that one source of electricity was deemed to have relatively good power quality; there were several disturbances, which exceeded the thresholds. Among them are the fifth harmonic, voltage swell, overvoltage and flicker. While the second

  9. Availability statistics for thermal power plants

    International Nuclear Information System (INIS)

    Denmark, Finland and Sweden have adopted almost the same methods of recording and calculation of availability data. For a number of years comparable availability and outage data for thermal power have been summarized and published in one report. The purpose of the report now presented for 1990 containing general statistical data is to produce basic information on existing kinds of thermal power in the countries concerned. With this information as a basis additional and more detailed information can be exchanged in direct contacts between bodies in the above mentioned countries according to forms established for that purpose. The report includes fossil steam power, nuclear power and gas turbines. The information is presented in separate diagrams for each country, but for plants burning fossil fuel also in a joint NORDEL statistics with data grouped according to type of fuel used. The grouping of units into classes of capacity has been made in accordance with the classification adopted by UNIPEDE/WEC. Values based on energy have been adopted as basic availability data. The same applies to the preference made in the definitions outlined by UNIPEDE and UNIPEDE/WEC. Some data based on time have been included to make possible comparisons with certain international values and for further illustration of the performance. (au)

  10. Power and environmental assessment

    DEFF Research Database (Denmark)

    Cashmore, Matthew Asa; Richardson, Tim

    2013-01-01

    The significance of politics and power dynamics has long been recognised in environmental assessment (EA) research, but there has not been sustained attention to power, either theoretically or empirically. The aim of this special issue is to encourage the EA community to engage more consistently...

  11. Availability statistics for thermal power plants 1992

    International Nuclear Information System (INIS)

    Denmark, Finland and Sweden have adopted almost the same methods of recording and calculation of availability data. For a number of years comparable availability and outage data for thermal power have been summarized and published in one report. The purpose of the report is to produce basic information on existing kinds of thermal power in the countries concerned. With this information as a basis additional and more detailed information can be exchanged in direct contacts between bodies in the above mentioned countries according to forms established for that purpose. The report includes fossil steam power, nuclear power and gas turbines. The information is presented in separate diagrams for each country, but for plants burning fossil fuel also in a joint NORDEL statistics with data grouped according to type of fuel used. The grouping of units into classes of capacity has been made in accordance with the classification adopted by UNIPEDE/WEC. Values based on energy have been adopted as basic availability data. The same applies to the preference made in the definitions outlined by UNIPEDE and UNIPEDE/WEC. Some data based on time have been included to make possible comparisons with certain international values and for further illustration of the performance. For values given in the report, the definitions in the NORDEL document ''Concepts of Availability for Thermal Power'', September 1977, have been applied. (au)

  12. Statistical Performances of Resistive Active Power Splitter

    Science.gov (United States)

    Lalléchère, Sébastien; Ravelo, Blaise; Thakur, Atul

    2016-03-01

    In this paper, the synthesis and sensitivity analysis of an active power splitter (PWS) is proposed. It is based on an active cell composed of a Field Effect Transistor in cascade with a shunted resistor at the input and the output (resistive amplifier topology). The PWS uncertainty versus resistance tolerances is assessed by using a stochastic method. Furthermore, with the proposed topology, we can easily control the device gain by varying a resistance. This provides a useful tool for analysing the statistical sensitivity of the system in an uncertain environment.
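    The tolerance-sensitivity analysis described can be mimicked with a simple Monte Carlo sweep. The gain model below (gain proportional to a load resistance through a transconductance `GM`) is a generic stand-in for illustration, not the authors' actual PWS cell:

```python
import random

random.seed(42)

GM = 0.05  # illustrative transconductance, in siemens (an assumption)

def gain(resistance):
    """Toy gain model for a resistively loaded amplifier stage: gm * R."""
    return GM * resistance

def tolerance_sweep(r_nominal, tolerance, trials=10_000):
    """Mean and standard deviation of the gain when R is uniformly
    distributed within +/- tolerance of its nominal value."""
    gains = [gain(r_nominal * random.uniform(1 - tolerance, 1 + tolerance))
             for _ in range(trials)]
    mean = sum(gains) / trials
    std = (sum((g - mean) ** 2 for g in gains) / trials) ** 0.5
    return mean, std
```

    Comparing, say, `tolerance_sweep(1000.0, 0.10)` against `tolerance_sweep(1000.0, 0.01)` shows how the gain spread scales with the resistor tolerance class.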

  13. Evaluating and Reporting Statistical Power in Counseling Research

    Science.gov (United States)

    Balkin, Richard S.; Sheperis, Carl J.

    2011-01-01

    Despite recommendations from the "Publication Manual of the American Psychological Association" (6th ed.) to include information on statistical power when publishing quantitative results, authors seldom include analysis or discussion of statistical power. The rationale for discussing statistical power is addressed, approaches to using "G*Power" to…

  14. Practical Uses of Statistical Power in Business Research Studies.

    Science.gov (United States)

    Markowski, Edward P.; Markowski, Carol A.

    1999-01-01

    Proposes the use of statistical power subsequent to the results of hypothesis testing in business research. Describes how posttest use of power might be integrated into business statistics courses. (SK)

  15. Statistical aspects of fish stock assessment

    DEFF Research Database (Denmark)

    Berg, Casper Willestofte

    for stock assessment by application of state-of-the-art statistical methodology. The main contributions are presented in the form of six research papers. The major part of the thesis deals with age-structured assessment models, which is the most common approach. Conversion from length to age distributions...... statistical aspects of fish stock assessment, which includes topics such as time series analysis, generalized additive models (GAMs), and non-linear state-space/mixed models capable of handling missing data and a high number of latent states and parameters. The aim is to improve the existing methods...... in the catches is a necessary step in age-based stock assessment models. For this purpose, GAMs and continuation ratio logits are combined to model the probability of age as a smooth function of length and spatial coordinates, which constitutes an improvement over traditional methods based on area...

  16. PRIS-STATISTICS: Power Reactor Information System Statistical Reports. User's Manual

    International Nuclear Information System (INIS)

    The IAEA developed the Power Reactor Information System (PRIS)-Statistics application to assist PRIS end users with generating statistical reports from PRIS data. Statistical reports provide an overview of the status, specification and performance results of every nuclear power reactor in the world. This user's manual was prepared to facilitate the use of the PRIS-Statistics application and to provide guidelines and detailed information for each report in the application. Statistical reports support analyses of nuclear power development and strategies, and the evaluation of nuclear power plant performance. The PRIS database can be used for comprehensive trend analyses and benchmarking against best performers and industrial standards.

  17. Assessing statistical significance in causal graphs

    Directory of Open Access Journals (Sweden)

    Chindelevitch Leonid

    2012-02-01

    Full Text Available Abstract Background Causal graphs are an increasingly popular tool for the analysis of biological datasets. In particular, signed causal graphs--directed graphs whose edges additionally have a sign denoting upregulation or downregulation--can be used to model regulatory networks within a cell. Such models allow prediction of downstream effects of regulation of biological entities; conversely, they also enable inference of causative agents behind observed expression changes. However, due to their complex nature, signed causal graph models present special challenges with respect to assessing statistical significance. In this paper we frame and solve two fundamental computational problems that arise in practice when computing appropriate null distributions for hypothesis testing. Results First, we show how to compute a p-value for agreement between observed and model-predicted classifications of gene transcripts as upregulated, downregulated, or neither. Specifically, how likely are the classifications to agree to the same extent under the null distribution of the observed classification being randomized? This problem, which we call "Ternary Dot Product Distribution" owing to its mathematical form, can be viewed as a generalization of Fisher's exact test to ternary variables. We present two computationally efficient algorithms for computing the Ternary Dot Product Distribution and investigate its combinatorial structure analytically and numerically to establish computational complexity bounds. Second, we develop an algorithm for efficiently performing random sampling of causal graphs. This enables p-value computation under a different, equally important null distribution obtained by randomizing the graph topology but keeping fixed its basic structure: connectedness and the positive and negative in- and out-degrees of each vertex. We provide an algorithm for sampling a graph from this distribution uniformly at random. We also highlight theoretical
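    A crude Monte Carlo analogue of the first null distribution described above: score agreement between two ternary (+1/0/-1) vectors by their dot product, then estimate a p-value by shuffling the observed classification. The paper computes this distribution exactly; the sketch below only approximates it by sampling:

```python
import random

random.seed(3)

def agreement(a, b):
    """Dot product of two ternary vectors (+1 up, -1 down, 0 neither)."""
    return sum(x * y for x, y in zip(a, b))

def permutation_pvalue(observed, predicted, trials=5000):
    """Estimate P(agreement >= observed score) when the observed
    classification is randomly shuffled (labels kept, order randomized)."""
    score = agreement(observed, predicted)
    shuffled = list(observed)
    hits = 0
    for _ in range(trials):
        random.shuffle(shuffled)
        if agreement(shuffled, predicted) >= score:
            hits += 1
    return (hits + 1) / (trials + 1)  # add-one to avoid a zero p-value
```

    The exact algorithms in the paper replace this sampling loop with a direct computation over the combinatorial structure of the ternary dot product.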

  18. The Role of Atmospheric Measurements in Wind Power Statistical Models

    Science.gov (United States)

    Wharton, S.; Bulaevskaya, V.; Irons, Z.; Newman, J. F.; Clifton, A.

    2015-12-01

    The simplest wind power generation curves model power only as a function of the wind speed at turbine hub-height. While the latter is an essential predictor of power output, it is widely accepted that wind speed information in other parts of the vertical profile, as well as additional atmospheric variables including atmospheric stability, wind veer, and hub-height turbulence are also important factors. The goal of this work is to determine the gain in predictive ability afforded by adding additional atmospheric measurements to the power prediction model. In particular, we are interested in quantifying any gain in predictive ability afforded by measurements taken from a laser detection and ranging (lidar) instrument, as lidar provides high spatial and temporal resolution measurements of wind speed and direction at 10 or more levels throughout the rotor-disk and at heights well above. Co-located lidar and meteorological tower data as well as SCADA power data from a wind farm in Northern Oklahoma will be used to train a set of statistical models. In practice, most wind farms continue to rely on atmospheric measurements taken from less expensive, in situ instruments mounted on meteorological towers to assess turbine power response to a changing atmospheric environment. Here, we compare a large suite of atmospheric variables derived from tower measurements to those taken from lidar to determine if remote sensing devices add any competitive advantage over tower measurements alone to predict turbine power response.
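    The gain quantification described can be illustrated with synthetic data: generate power that truly depends on both hub-height speed and a shear term, then compare the RMSE of a speed-only predictor against one that also uses shear. The cubic power model and all coefficients below are invented for the illustration, not taken from the study:

```python
import math
import random

random.seed(1)

def true_power(speed, shear):
    """Invented ground truth: cubic in hub-height speed, modulated by shear."""
    return 0.5 * speed**3 * (1.0 + 0.3 * shear)

# Synthetic "observations": atmospheric inputs plus measurement noise.
inputs = [(random.uniform(4, 12), random.uniform(-0.5, 0.5)) for _ in range(500)]
observed = [true_power(s, sh) + random.gauss(0, 5) for s, sh in inputs]

# Model 1 uses hub-height speed only; model 2 adds the lidar-derived shear.
pred_speed_only = [0.5 * s**3 for s, _ in inputs]
pred_with_shear = [true_power(s, sh) for s, sh in inputs]

def rmse(pred, obs):
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

print("speed only:", rmse(pred_speed_only, observed))
print("with shear:", rmse(pred_with_shear, observed))
```

    The RMSE gap between the two predictors is the kind of "gain in predictive ability" the study quantifies with real tower, lidar and SCADA data.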

  19. Statistical methods for assessment of blend homogeneity

    DEFF Research Database (Denmark)

    Madsen, Camilla

    2002-01-01

    In this thesis the use of various statistical methods to address some of the problems related to assessment of the homogeneity of powder blends in tablet production is discussed. It is not straightforward to assess the homogeneity of a powder blend. The reason is partly that in bulk materials......, it is shown how to set up parametric acceptance criteria for the batch that gives a high confidence that future samples with a probability larger than a specified value will pass the USP threeclass criteria. Properties and robustness of proposed changes to the USP test for content uniformity are investigated...

  20. When Mathematics and Statistics Collide in Assessment Tasks

    Science.gov (United States)

    Bargagliotti, Anna; Groth, Randall

    2016-01-01

    Because the disciplines of mathematics and statistics are naturally intertwined, designing assessment questions that disentangle mathematical and statistical reasoning can be challenging. We explore the writing of statistics assessment tasks that take into consideration the potential mathematical reasoning they may inadvertently activate.

  1. Asking Sensitive Questions: A Statistical Power Analysis of Randomized Response Models

    Science.gov (United States)

    Ulrich, Rolf; Schroter, Hannes; Striegel, Heiko; Simon, Perikles

    2012-01-01

    This article derives the power curves for a Wald test that can be applied to randomized response models when small prevalence rates must be assessed (e.g., detecting doping behavior among elite athletes). These curves enable the assessment of the statistical power that is associated with each model (e.g., Warner's model, crosswise model, unrelated…
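    A hedged sketch of the kind of power curve derived in the article, for Warner's model only: a one-sided Wald test of a small prevalence, using the normal approximation and the standard Warner estimator. The concrete prevalence and design values used in the usage note are illustrative, not the article's:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def warner_yes_prob(pi, p):
    """P("yes") in Warner's design: truthful with probability p, negated otherwise."""
    return p * pi + (1 - p) * (1 - pi)

def wald_power(pi0, pi1, p, n, z_alpha=1.6448536269514722):
    """One-sided power for H0: pi = pi0 against H1: pi = pi1 > pi0.

    The Warner estimator pi-hat = (lambda-hat + p - 1) / (2p - 1) has
    standard error sqrt(lambda(1-lambda)/n) / |2p - 1|.
    """
    lam0 = warner_yes_prob(pi0, p)
    lam1 = warner_yes_prob(pi1, p)
    se0 = math.sqrt(lam0 * (1 - lam0) / n) / abs(2 * p - 1)
    se1 = math.sqrt(lam1 * (1 - lam1) / n) / abs(2 * p - 1)
    critical = pi0 + z_alpha * se0  # reject H0 when pi-hat exceeds this
    return 1.0 - norm_cdf((critical - pi1) / se1)
```

    For example, `wald_power(0.05, 0.15, 0.7, 200)` versus `wald_power(0.05, 0.15, 0.7, 2000)` shows why randomized response designs need large samples: the randomization that protects privacy (p near 0.5) inflates the standard error by 1/|2p - 1|.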

  2. Designing Intervention Studies: Selected Populations, Range Restrictions, and Statistical Power

    Science.gov (United States)

    Miciak, Jeremy; Taylor, W. Pat; Stuebing, Karla K.; Fletcher, Jack M.; Vaughn, Sharon

    2016-01-01

    An appropriate estimate of statistical power is critical for the design of intervention studies. Although the inclusion of a pretest covariate in the test of the primary outcome can increase statistical power, samples selected on the basis of pretest performance may demonstrate range restriction on the selection measure and other correlated…

  3. The Power and Robustness of Maximum LOD Score Statistics

    OpenAIRE

    YOO, Y. J.; MENDELL, N.R.

    2008-01-01

    The maximum LOD score statistic is extremely powerful for gene mapping when calculated using the correct genetic parameter value. When the mode of genetic transmission is unknown, the maximum of the LOD scores obtained using several genetic parameter values is reported. This latter statistic requires a higher critical value than the maximum LOD score statistic calculated from a single genetic parameter value.

  4. Assessment Methods in Statistical Education An International Perspective

    CERN Document Server

    Bidgood, Penelope; Jolliffe, Flavia

    2010-01-01

    This book is a collaboration by leading figures in statistical education and is designed primarily for academic audiences involved in teaching statistics and mathematics. The book is divided into four sections: (1) assessment using real-world problems, (2) assessing statistical thinking, (3) individual assessment, and (4) successful assessment strategies.

  5. Statistical tests for power-law cross-correlated processes

    Science.gov (United States)

    Podobnik, Boris; Jiang, Zhi-Qiang; Zhou, Wei-Xing; Stanley, H. Eugene

    2011-12-01

    For stationary time series, the cross-covariance and the cross-correlation as functions of time lag n serve to quantify the similarity of two time series. The latter measure is also used to assess whether the cross-correlations are statistically significant. For nonstationary time series, the analogous measures are detrended cross-correlations analysis (DCCA) and the recently proposed detrended cross-correlation coefficient, ρDCCA(T,n), where T is the total length of the time series and n the window size. For ρDCCA(T,n), we numerically calculated the Cauchy inequality -1≤ρDCCA(T,n)≤1. Here we derive -1≤ρDCCA(T,n)≤1 for a standard variance-covariance approach and for a detrending approach. For overlapping windows, we find the range of ρDCCA within which the cross-correlations become statistically significant. For overlapping windows we numerically determine—and for nonoverlapping windows we derive—that the standard deviation of ρDCCA(T,n) tends with increasing T to 1/T. Using ρDCCA(T,n) we show that the Chinese financial market's tendency to follow the U.S. market is extremely weak. We also propose an additional statistical test that can be used to quantify the existence of cross-correlations between two power-law correlated time series.
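    The coefficient ρDCCA(T,n) discussed above can be sketched in pure Python: integrate both series into profiles, detrend each overlapping window of size n by a least-squares line, and form the ratio of the detrended cross-covariance to the product of the two detrended variances. This is a minimal illustration of the definition, not the authors' implementation:

```python
def _linfit_residuals(y):
    """Residuals of an ordinary least-squares line fit to y versus 0..len(y)-1."""
    n = len(y)
    mx = (n - 1) / 2.0
    my = sum(y) / n
    sxx = sum((x - mx) ** 2 for x in range(n))
    sxy = sum((x - mx) * (yv - my) for x, yv in enumerate(y))
    b = sxy / sxx
    a = my - b * mx
    return [yv - (a + b * x) for x, yv in enumerate(y)]

def rho_dcca(x, y, n):
    """Detrended cross-correlation coefficient for window size n (sketch)."""
    # Integrated (profile) series.
    profile_x, profile_y, sx, sy = [], [], 0.0, 0.0
    for xv, yv in zip(x, y):
        sx += xv
        sy += yv
        profile_x.append(sx)
        profile_y.append(sy)
    f2xy = f2xx = f2yy = 0.0
    for start in range(len(profile_x) - n + 1):  # overlapping windows
        rx = _linfit_residuals(profile_x[start:start + n])
        ry = _linfit_residuals(profile_y[start:start + n])
        f2xy += sum(a * b for a, b in zip(rx, ry))
        f2xx += sum(a * a for a in rx)
        f2yy += sum(b * b for b in ry)
    return f2xy / (f2xx * f2yy) ** 0.5
```

    By the Cauchy-Schwarz inequality applied to the concatenated window residuals, this ratio always lies in [-1, 1], which is the bound the paper derives.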

  6. Statistical tests for power-law cross-correlated processes.

    Science.gov (United States)

    Podobnik, Boris; Jiang, Zhi-Qiang; Zhou, Wei-Xing; Stanley, H Eugene

    2011-12-01

    For stationary time series, the cross-covariance and the cross-correlation as functions of time lag n serve to quantify the similarity of two time series. The latter measure is also used to assess whether the cross-correlations are statistically significant. For nonstationary time series, the analogous measures are detrended cross-correlations analysis (DCCA) and the recently proposed detrended cross-correlation coefficient, ρ(DCCA)(T,n), where T is the total length of the time series and n the window size. For ρ(DCCA)(T,n), we numerically calculated the Cauchy inequality -1 ≤ ρ(DCCA)(T,n) ≤ 1. Here we derive -1 ≤ ρ(DCCA)(T,n) ≤ 1 for a standard variance-covariance approach and for a detrending approach. For overlapping windows, we find the range of ρ(DCCA) within which the cross-correlations become statistically significant. For overlapping windows we numerically determine, and for nonoverlapping windows we derive, that the standard deviation of ρ(DCCA)(T,n) tends with increasing T to 1/T. Using ρ(DCCA)(T,n) we show that the Chinese financial market's tendency to follow the U.S. market is extremely weak. We also propose an additional statistical test that can be used to quantify the existence of cross-correlations between two power-law correlated time series. PMID:22304166

  7. Statistical aspects of environmental risk assessment of GM plants for effects on non-target organisms

    Science.gov (United States)

    Previous European guidance for environmental risk assessment of genetically-modified plants emphasized the concepts of statistical power but provided no explicit requirements for the provision of statistical power analyses. Similarly, whilst the need for good experimental designs was stressed, no m...

  8. A Technology-Based Statistical Reasoning Assessment Tool in Descriptive Statistics for Secondary School Students

    Science.gov (United States)

    Chan, Shiau Wei; Ismail, Zaleha

    2014-01-01

    The focus of assessment in statistics has gradually shifted from traditional assessment towards alternative assessment where more attention has been paid to the core statistical concepts such as center, variability, and distribution. In spite of this, there are comparatively few assessments that combine the significant three types of statistical…

  9. Wind power error estimation in resource assessments.

    Science.gov (United States)

    Rodríguez, Osvaldo; Del Río, Jesús A; Jaramillo, Oscar A; Martínez, Manuel

    2015-01-01

    Estimating the power output is one of the elements that determine the techno-economic feasibility of a renewable project. At present, there is a need to develop reliable methods that achieve this goal, thereby contributing to wind power penetration. In this study, we propose a method for wind power error estimation based on the wind speed measurement error, probability density function, and wind turbine power curves. This method uses the actual wind speed data, without prior statistical treatment, and 28 wind turbine power curves fitted by Lagrange's method to calculate the estimated wind power output and the corresponding error propagation. We found that wind speed percentage errors of 10% were propagated into the power output estimates, thereby yielding an error of 5%. The proposed error propagation complements the traditional power resource assessments. The wind power estimation error also allows us to estimate intervals for the levelized cost of power production or the investment return time. The implementation of this method increases the reliability of techno-economic resource assessment studies.
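    The error-propagation idea can be sketched with a toy power curve and Monte Carlo sampling of the speed error. The curve parameters and the Gaussian error model below are assumptions of this sketch, not the paper's 28 Lagrange-fitted curves:

```python
import math
import random

random.seed(0)

def power_curve(v, rated=2000.0, v_in=3.0, v_rated=12.0, v_out=25.0):
    """Toy turbine power curve (kW): cubic ramp between cut-in and rated speed."""
    if v < v_in or v > v_out:
        return 0.0
    if v >= v_rated:
        return rated
    return rated * ((v - v_in) / (v_rated - v_in)) ** 3

def propagated_power_error(v, rel_speed_err, trials=20_000):
    """Monte Carlo relative standard deviation of power output, given a
    relative (Gaussian) error on the measured wind speed."""
    base = power_curve(v)
    sims = [power_curve(random.gauss(v, rel_speed_err * v)) for _ in range(trials)]
    mean = sum(sims) / trials
    var = sum((s - mean) ** 2 for s in sims) / trials
    return math.sqrt(var) / base
```

    In the cubic region the speed error is strongly amplified, while near and above rated speed the flat part of the curve damps it; averaging over a real wind speed distribution is how the study arrives at its overall 10% to 5% figure.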

  10. Wind power error estimation in resource assessments.

    Directory of Open Access Journals (Sweden)

    Osvaldo Rodríguez

    Full Text Available Estimating the power output is one of the elements that determine the techno-economic feasibility of a renewable project. At present, there is a need to develop reliable methods that achieve this goal, thereby contributing to wind power penetration. In this study, we propose a method for wind power error estimation based on the wind speed measurement error, probability density function, and wind turbine power curves. This method uses the actual wind speed data, without prior statistical treatment, and 28 wind turbine power curves fitted by Lagrange's method to calculate the estimated wind power output and the corresponding error propagation. We found that wind speed percentage errors of 10% were propagated into the power output estimates, thereby yielding an error of 5%. The proposed error propagation complements the traditional power resource assessments. The wind power estimation error also allows us to estimate intervals for the levelized cost of power production or the investment return time. The implementation of this method increases the reliability of techno-economic resource assessment studies.

  11. Replication unreliability in psychology: elusive phenomena or elusive statistical power?

    Directory of Open Access Journals (Sweden)

    Patrizio E Tressoldi

    2012-07-01

    Full Text Available The focus of this paper is to analyse whether the unreliability of results related to certain controversial psychological phenomena may be a consequence of their low statistical power. Under Null Hypothesis Significance Testing (NHST), still the most widely used statistical approach, unreliability derives from the failure to refute the null hypothesis, in particular when exact or quasi-exact replications of experiments are carried out. Taking as examples the results of meta-analyses related to four different controversial phenomena (subliminal semantic priming, the incubation effect in problem solving, unconscious thought theory, and non-local perception), we found that, except for semantic priming on categorization, the statistical power to detect the expected effect size of the typical study is low or very low. The low power in most studies undermines the use of NHST to study phenomena with moderate or low effect sizes. We conclude by providing some suggestions on how to increase statistical power, or to use different statistical approaches, to help discriminate whether the results obtained may or may not be used to support or to refute the reality of a phenomenon with a small effect size.
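    The power calculation that this argument turns on can be illustrated with a standard normal-approximation sketch; the effect size and sample size below are illustrative, not values taken from the four meta-analyses.

```python
from math import sqrt
from statistics import NormalDist

def power_two_sample(d, n_per_group, alpha=0.05):
    """Approximate power of a two-sided two-sample test for effect size d
    (Cohen's d), using the normal approximation to the t distribution."""
    z = NormalDist()
    z_crit = z.inv_cdf(1 - alpha / 2)
    ncp = d * sqrt(n_per_group / 2)          # noncentrality parameter
    return 1 - z.cdf(z_crit - ncp) + z.cdf(-z_crit - ncp)

# A "typical" small-effect study: d = 0.2 with 30 participants per group
print(round(power_two_sample(0.2, 30), 2))   # low power, well under 0.8
```

With these numbers the power is around 0.12, illustrating how easily a replication of a small-effect phenomenon can "fail" under NHST even when the effect is real.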

  12. Using Tree Diagrams as an Assessment Tool in Statistics Education

    Science.gov (United States)

    Yin, Yue

    2012-01-01

    This study examines the potential of the tree diagram, a type of graphic organizer, as an assessment tool to measure students' knowledge structures in statistics education. Students' knowledge structures have not been sufficiently assessed in statistics education, despite their importance. This article first presents the rationale and method…

  13. New Dynamical-Statistical Techniques for Wind Power Prediction

    Science.gov (United States)

    Stathopoulos, C.; Kaperoni, A.; Galanis, G.; Kallos, G.

    2012-04-01

    The increased use of renewable energy sources, and especially of wind power, has revealed the significance of accurate environmental and wind power predictions over wind farms, which critically affect the integration of the produced power into the general grid. This issue is studied in the present paper by means of high resolution physical and statistical models. Two numerical weather prediction (NWP) systems, namely SKIRON and RAMS, are used to simulate the flow characteristics in selected wind farms in Greece. The NWP model output is post-processed by utilizing Kalman and Kolmogorov statistics in order to remove systematic errors. Modeled wind predictions in combination with available on-site observations are used for estimation of the wind power potential by utilizing a variety of statistical power prediction models based on non-linear and hyperbolic functions. The obtained results reveal the strong dependence of the forecast uncertainty on the wind variation, the limited influence of previously recorded power values, and the advantages that nonlinear, non-polynomial functions can have in the successful control of power curve characteristics. This methodology was developed within the framework of the FP7 projects WAUDIT and MARINA PLATFORM.

  14. Statistical method for scientific projects risk assessment

    OpenAIRE

    Бедрій, Дмитро Іванович

    2013-01-01

    This article discusses the use of statistical methods for evaluating the risks of the activities of scientific institutions in the public sector of the Ukrainian economy during the planning and execution of scientific projects; some results of our research in this area are presented. The main objective of the study is to determine the possibility of using the statistical method in the process of evaluating research project risks. The use of risk evaluation methods allows the manag...

  15. Teaching, Learning and Assessing Statistical Problem Solving

    Science.gov (United States)

    Marriott, John; Davies, Neville; Gibson, Liz

    2009-01-01

    In this paper we report the results from a major UK government-funded project, started in 2005, to review statistics and handling data within the school mathematics curriculum for students up to age 16. As a result of a survey of teachers we developed new teaching materials that explicitly use a problem-solving approach for the teaching and…

  16. Power Curve Modeling in Complex Terrain Using Statistical Models

    Science.gov (United States)

    Bulaevskaya, V.; Wharton, S.; Clifton, A.; Qualley, G.; Miller, W.

    2014-12-01

    Traditional power output curves typically model power only as a function of the wind speed at the turbine hub height. While the latter is an essential predictor of power output, wind speed information in other parts of the vertical profile, as well as additional atmospheric variables, are also important determinants of power. The goal of this work was to determine the gain in predictive ability afforded by adding wind speed information at other heights, as well as other atmospheric variables, to the power prediction model. Using data from a wind farm with a moderately complex terrain in the Altamont Pass region in California, we trained three statistical models, a neural network, a random forest and a Gaussian process model, to predict power output from various sets of aforementioned predictors. The comparison of these predictions to the observed power data revealed that considerable improvements in prediction accuracy can be achieved both through the addition of predictors other than the hub-height wind speed and the use of statistical models. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under contract DE-AC52-07NA27344 and was funded by Wind Uncertainty Quantification Laboratory Directed Research and Development Project at LLNL under project tracking code 12-ERD-069.
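    The gain from predictors beyond hub-height wind speed can be reproduced with synthetic data and plain least squares. Everything below is invented for illustration (the data-generating model, coefficients, and noise level); the paper itself trained neural networks, random forests, and Gaussian processes on Altamont Pass data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for wind-farm data: power depends on hub-height speed,
# speed higher up the profile (shear), and turbulence intensity.
n = 2000
v_hub = rng.uniform(3, 15, n)
v_top = v_hub * rng.uniform(1.0, 1.3, n)   # speed at a higher measurement level
ti = rng.uniform(0.05, 0.25, n)            # turbulence intensity
power = 0.6 * v_hub**3 + 0.2 * v_top**3 - 800 * ti + rng.normal(0, 50, n)

def rmse_linear_fit(X, y):
    """RMSE of an ordinary least-squares fit (with intercept) on features X."""
    A = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return float(np.sqrt(np.mean((A @ coef - y) ** 2)))

# Cubic-speed features make the relationship linear in the parameters.
rmse_hub = rmse_linear_fit(np.column_stack([v_hub**3]), power)
rmse_all = rmse_linear_fit(np.column_stack([v_hub**3, v_top**3, ti]), power)
print(f"hub-height only: {rmse_hub:.0f}, full profile: {rmse_all:.0f}")
```

The richer feature set drives the error down toward the irreducible noise floor, mirroring the paper's finding that profile and atmospheric predictors improve power-curve models.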

  17. Power laws statistics of cliff failures, scaling and percolation

    CERN Document Server

    Baldassarri, Andrea

    2014-01-01

    The size of large cliff failures may be described in several ways, for instance considering the horizontal eroded area at the cliff top and the maximum local retreat of the coastline. Field studies suggest that, for large failures, the frequencies of these two quantities decrease as power laws of the respective magnitudes, defining two different decay exponents. Moreover, the horizontal area increases as a power law of the maximum local retreat, identifying a third exponent. This observation suggests that the geometry of cliff failures is statistically similar across different magnitudes. Power laws are familiar in the physics of critical systems. The corresponding exponents satisfy precise relations and are proven to be universal features, common to very different systems. Following the approach typical of statistical physics, we propose a "scaling hypothesis" resulting in a relation between the three above exponents: there is a precise, mathematical relation between the distributions of magnitudes of erosion ...

  18. Robust Statistical Detection of Power-Law Cross-Correlation

    Science.gov (United States)

    Blythe, Duncan A. J.; Nikulin, Vadim V.; Müller, Klaus-Robert

    2016-06-01

    We show that widely used approaches in statistical physics incorrectly indicate the existence of power-law cross-correlations between financial stock market fluctuations measured over several years and the neuronal activity of the human brain lasting for only a few minutes. While such cross-correlations are nonsensical, no current methodology allows them to be reliably discarded, leaving researchers at greater risk when the spurious nature of cross-correlations is not clear from the unrelated origin of the time series and rather requires careful statistical estimation. Here we propose a theory and method (PLCC-test) which allows us to rigorously and robustly test for power-law cross-correlations, correctly detecting genuine and discarding spurious cross-correlations, thus establishing meaningful relationships between processes in complex physical systems. Our method reveals for the first time the presence of power-law cross-correlations between amplitudes of the alpha and beta frequency ranges of the human electroencephalogram.
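    The spuriousness the authors warn about is easy to reproduce: two completely independent random walks routinely show large sample cross-correlations. The sketch below is a generic illustration of that pitfall, not an implementation of the proposed PLCC-test.

```python
import numpy as np

rng = np.random.default_rng(42)

def corr_of_independent_walks(n_steps, n_trials=200):
    """Distribution of |Pearson r| between pairs of independent random walks.
    The increments are independent, yet the integrated (cumulative) series
    often look strongly correlated."""
    rs = []
    for _ in range(n_trials):
        x = np.cumsum(rng.normal(size=n_steps))
        y = np.cumsum(rng.normal(size=n_steps))
        rs.append(abs(np.corrcoef(x, y)[0, 1]))
    return np.array(rs)

rs = corr_of_independent_walks(1000)
print(f"median |r| between independent walks: {np.median(rs):.2f}")
```

The median absolute correlation is far from zero, which is why a dedicated statistical test, rather than the raw correlation estimate, is needed to discard such spurious relationships.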

  19. "Using Power Tables to Compute Statistical Power in Multilevel Experimental Designs"

    Science.gov (United States)

    Konstantopoulos, Spyros

    2009-01-01

    Power computations for one-level experimental designs that assume simple random samples are greatly facilitated by power tables such as those presented in Cohen's book about statistical power analysis. However, in education and the social sciences experimental designs have naturally nested structures and multilevel models are needed to compute the…

  20. Effect size, confidence intervals and statistical power in psychological research.

    Directory of Open Access Journals (Sweden)

    Téllez A.

    2015-07-01

    Full Text Available Quantitative psychological research is focused on detecting the occurrence of certain population phenomena by analyzing data from a sample, and statistics is a particularly helpful mathematical tool that is used by researchers to evaluate hypotheses and make decisions to accept or reject such hypotheses. In this paper, the various statistical tools used in psychological research are reviewed. The limitations of null hypothesis significance testing (NHST) and the advantages of using effect size and its respective confidence intervals are explained, as the latter two measurements can provide important information about the results of a study. These measurements can also facilitate data interpretation and easily detect trivial effects, enabling researchers to make decisions in a more clinically relevant fashion. Moreover, it is recommended to establish an appropriate sample size by calculating the optimum statistical power at the moment the research is designed. Psychological journal editors are encouraged to follow APA recommendations strictly and ask authors of original research studies to report the effect size, its confidence intervals, statistical power and, when required, any measure of clinical significance. Additionally, attention must be paid to the teaching of statistics at the graduate level, where students do not receive sufficient information concerning the importance of using different types of effect sizes and their confidence intervals according to the different types of research designs; instead, most of the information is focused on the various tools of NHST.
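    A concrete sketch of the recommended reporting follows; the group means, standard deviations, and sample sizes are illustrative, and the confidence interval uses the common large-sample standard error for Cohen's d.

```python
from math import sqrt
from statistics import NormalDist

def cohens_d(mean1, mean2, sd1, sd2, n1, n2):
    """Cohen's d with a pooled standard deviation."""
    sp = sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / sp

def d_confidence_interval(d, n1, n2, level=0.95):
    """Approximate CI for d using the large-sample standard error."""
    se = sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    z = NormalDist().inv_cdf(0.5 + level / 2)
    return d - z * se, d + z * se

d = cohens_d(105.0, 100.0, 10.0, 10.0, 50, 50)   # illustrative study data
lo, hi = d_confidence_interval(d, 50, 50)
print(f"d = {d:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```

Reporting the interval alongside the point estimate makes the precision of the effect, and hence its practical relevance, immediately visible to the reader.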

  1. Statistical analyses support power law distributions found in neuronal avalanches.

    Directory of Open Access Journals (Sweden)

    Andreas Klaus

    Full Text Available The size distribution of neuronal avalanches in cortical networks has been reported to follow a power law distribution with exponent close to -1.5, which is a reflection of long-range spatial correlations in spontaneous neuronal activity. However, identifying power law scaling in empirical data can be difficult and sometimes controversial. In the present study, we tested the power law hypothesis for neuronal avalanches by using more stringent statistical analyses. In particular, we performed the following steps: (i) analysis of finite-size scaling to identify scale-free dynamics in neuronal avalanches, (ii) model parameter estimation to determine the specific exponent of the power law, and (iii) comparison of the power law to alternative model distributions. Consistent with critical state dynamics, avalanche size distributions exhibited robust scaling behavior in which the maximum avalanche size was limited only by the spatial extent of sampling (a "finite size" effect). This scale-free dynamics suggests the power law as a model for the distribution of avalanche sizes. Using both the Kolmogorov-Smirnov statistic and a maximum likelihood approach, we found the slope to be close to -1.5, which is in line with previous reports. Finally, the power law model for neuronal avalanches was compared to the exponential and to various heavy-tail distributions based on the Kolmogorov-Smirnov distance and by using a log-likelihood ratio test. Both the power law distribution without and with exponential cut-off provided significantly better fits to the cluster size distributions in neuronal avalanches than the exponential, the lognormal and the gamma distribution. In summary, our findings strongly support the power law scaling in neuronal avalanches, providing further evidence for critical state dynamics in superficial layers of cortex.
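    The maximum likelihood step can be sketched as follows. This is a generic continuous power-law estimator applied to a synthetic sample, not the authors' full pipeline (no finite-size scaling or model comparison is attempted here).

```python
import math
import random

def mle_exponent(sizes, xmin=1):
    """Continuous power-law MLE (Hill-type estimator):
    alpha = 1 + n / sum(ln(x / xmin)) for samples x >= xmin."""
    xs = [x for x in sizes if x >= xmin]
    return 1 + len(xs) / sum(math.log(x / xmin) for x in xs)

# Draw avalanche-like sizes from P(x) ~ x^(-1.5), x >= 1, by inverse-transform
# sampling: x = (1 - u)^(-1 / (alpha - 1)) for uniform u.
random.seed(1)
alpha_true = 1.5
samples = [(1 - random.random()) ** (-1 / (alpha_true - 1)) for _ in range(50000)]

print(round(mle_exponent(samples), 2))  # close to the true exponent 1.5
```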

  2. Statistical analysis of cascading failures in power grids

    Energy Technology Data Exchange (ETDEWEB)

    Chertkov, Michael [Los Alamos National Laboratory; Pfitzner, Rene [Los Alamos National Laboratory; Turitsyn, Konstantin [Los Alamos National Laboratory

    2010-12-01

    We introduce a new microscopic model of cascading failures in transmission power grids. This model accounts for automatic response of the grid to load fluctuations that take place on the scale of minutes, when optimum power flow adjustments and load shedding controls are unavailable. We describe extreme events, caused by load fluctuations, which cause cascading failures of loads, generators and lines. Our model is quasi-static in the causal, discrete time and sequential resolution of individual failures. The model, in its simplest realization based on the Direct Current description of the power flow problem, is tested on three standard IEEE systems consisting of 30, 39 and 118 buses. Our statistical analysis suggests a straightforward classification of cascading and islanding phases in terms of the ratios between average number of removed loads, generators and links. The analysis also demonstrates sensitivity to variations in line capacities. Future research challenges in modeling and control of cascading outages over real-world power networks are discussed.
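    The DC description of power flow that such models build on reduces to a single linear solve. A minimal 3-bus sketch follows; the network, susceptances, and injections are toy numbers for illustration, not the IEEE 30-, 39-, or 118-bus systems used in the paper.

```python
import numpy as np

# Toy 3-bus DC power flow. Bus 0 is the slack bus; buses 1 and 2 carry
# net injections (generation minus load). Lines: (from, to, susceptance).
lines = [(0, 1, 10.0), (1, 2, 10.0), (0, 2, 10.0)]
injections = np.array([0.0, 1.5, -1.5])

def dc_flows(lines, injections, slack=0):
    """Solve B * theta = P for bus angles, then recover per-line flows."""
    n = len(injections)
    B = np.zeros((n, n))
    for i, j, b in lines:
        B[i, i] += b; B[j, j] += b
        B[i, j] -= b; B[j, i] -= b
    keep = [k for k in range(n) if k != slack]   # remove slack row/column
    theta = np.zeros(n)
    theta[keep] = np.linalg.solve(B[np.ix_(keep, keep)], injections[keep])
    return {(i, j): b * (theta[i] - theta[j]) for i, j, b in lines}

flows = dc_flows(lines, injections)
for line, f in flows.items():
    print(line, round(f, 3))
```

A cascade simulation repeats this solve after each line removal: any line whose flow exceeds its capacity is tripped, injections are rebalanced, and the flows are recomputed until no further overloads occur.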

  3. Statistical quality assessment of a fingerprint

    Science.gov (United States)

    Hwang, Kyungtae

    2004-08-01

    The quality of a fingerprint is essential to the performance of AFIS (Automatic Fingerprint Identification System). Such quality may be classified by the clarity and regularity of ridge-valley structures.1,2 One may calculate the thickness of ridges and valleys to measure clarity and regularity. However, calculating thickness is not feasible in a poor quality image, especially in severely damaged images that contain broken ridges (or valleys). To overcome this difficulty, the proposed approach employs statistical properties of a local block, namely the mean and spread of the thickness of both ridge and valley. The mean value is used to determine whether a fingerprint is wet or dry. For example, black pixels are dominant if a fingerprint is wet, so the average thickness of ridges is larger than that of valleys, and vice versa for a dry fingerprint. In addition, the standard deviation is used to determine the severity of damage. In this study, quality is divided into three categories based on the two statistical properties mentioned above: wet, good, and dry. The number of low quality blocks is used to measure the global quality of a fingerprint. In addition, the distribution of poor blocks is measured using Euclidean distances between groups of poor blocks. With this scheme, locally condensed poor blocks decrease the overall quality of an image. Experimental results on fingerprint images captured by optical devices as well as by a rolling method show that the wet and dry parts of images were successfully captured. Enhancing an image by employing morphology techniques that modify the detected poor quality blocks is illustrated in section 3. However, more work needs to be done on designing a scheme to incorporate the number of poor blocks and their distributions into a global quality measure.
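    The ridge and valley thickness statistics can be sketched on a binarized block as run lengths along each row. The wet/good/dry decision margin below is an invented illustration, not the paper's calibrated threshold.

```python
import statistics

def run_lengths(row, value):
    """Lengths of consecutive runs of `value` in a binary sequence."""
    runs, count = [], 0
    for px in row:
        if px == value:
            count += 1
        elif count:
            runs.append(count); count = 0
    if count:
        runs.append(count)
    return runs

def classify_block(rows, margin=1.0):
    """Crude wet/good/dry call: compare mean ridge (1) vs valley (0) thickness.
    The margin of 1 pixel is a hypothetical choice for this sketch."""
    ridges = [r for row in rows for r in run_lengths(row, 1)]
    valleys = [v for row in rows for v in run_lengths(row, 0)]
    mean_r, mean_v = statistics.mean(ridges), statistics.mean(valleys)
    if mean_r - mean_v > margin:
        return "wet"     # black (ridge) pixels dominate
    if mean_v - mean_r > margin:
        return "dry"     # white (valley) pixels dominate
    return "good"

wet_block = [[1, 1, 1, 0, 1, 1, 1, 1, 0, 1, 1, 1]] * 4
print(classify_block(wet_block))
```

The spread (standard deviation) of the same run lengths would then grade damage severity, following the block-level scheme the abstract describes.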

  4. Enrichment of statistical power for genome-wide association studies

    OpenAIRE

    Li, Meng; Liu, Xiaolei; Bradbury, Peter; Yu, Jianming; Zhang, Yuan-Ming; Todhunter, Rory J.; Buckler, Edward S; Zhang, Zhiwu

    2014-01-01

    Background The inheritance of most human diseases and agriculturally important traits is controlled by many genes with small effects. Identifying these genes, while simultaneously controlling false positives, is challenging. Among available statistical methods, the mixed linear model (MLM) has been the most flexible and powerful for controlling population structure and individual unequal relatedness (kinship), the two common causes of spurious associations. The introduction of the compressed ...

  5. Assessment of alternatives to correct inventory difference statistical treatment deficiencies

    International Nuclear Information System (INIS)

    This document presents an analysis of alternatives to correct deficiencies in the statistical treatment of inventory differences in the NRC guidance documents and licensee practice. Pacific Northwest Laboratory's objective for this study was to assess alternatives developed by the NRC and a panel of safeguards statistical experts. Criteria were developed for the evaluation and the assessment was made considering the criteria. The results of this assessment are PNL recommendations, which are intended to provide NRC decision makers with a logical and statistically sound basis for correcting the deficiencies

  6. Cross-Cultural Instrument Translation: Assessment, Translation, and Statistical Applications

    Science.gov (United States)

    Mason, Teresa Crowe

    2005-01-01

    This article has four major sections: (a) general issues of assessment; (b) assessment of ethnic-group members, including those who are deaf; (c) translation of assessment tools, emphasizing translation into American Sign Language (ASL); and (d) statistical applications for translated instruments. The purpose of the article is to provide insight…

  7. Statistics of the Sunyaev-Zel'dovich Effect power spectrum

    CERN Document Server

    Peel, Michael W; Kay, Scott T

    2009-01-01

    Using large numbers of simulations of the microwave sky, incorporating the Cosmic Microwave Background (CMB) and the Sunyaev-Zel'dovich (SZ) effect due to clusters, we investigate the statistics of the power spectrum at microwave frequencies between spherical multipoles of 1000 and 10000. From these virtual sky maps, we find that the spectrum of the SZ effect has a larger standard deviation by a factor of 3 than would be expected from purely Gaussian realizations, and has a distribution that is significantly skewed towards higher values, especially when small map sizes are used. The standard deviation is also increased by around 10 percent compared to the trispectrum calculation due to the clustering of galaxy clusters. We also consider the effects of including residual point sources and uncertainties in the gas physics. This has implications for the excess power measured in the CMB power spectrum by the Cosmic Background Imager and BIMA experiments. Our results indicate that the observed excess could be expl...

  8. Development and testing of improved statistical wind power forecasting methods.

    Energy Technology Data Exchange (ETDEWEB)

    Mendes, J.; Bessa, R.J.; Keko, H.; Sumaili, J.; Miranda, V.; Ferreira, C.; Gama, J.; Botterud, A.; Zhou, Z.; Wang, J. (Decision and Information Sciences); (INESC Porto)

    2011-12-06

    Wind power forecasting (WPF) provides important inputs to power system operators and electricity market participants. It is therefore not surprising that WPF has attracted increasing interest within the electric power industry. In this report, we document our research on improving statistical WPF algorithms for point, uncertainty, and ramp forecasting. Below, we provide a brief introduction to the research presented in the following chapters. For a detailed overview of the state-of-the-art in wind power forecasting, we refer to [1]. Our related work on the application of WPF in operational decisions is documented in [2]. Point forecasts of wind power are highly dependent on the training criteria used in the statistical algorithms that are used to convert weather forecasts and observational data to a power forecast. In Chapter 2, we explore the application of information theoretic learning (ITL) as opposed to the classical minimum square error (MSE) criterion for point forecasting. In contrast to the MSE criterion, ITL criteria do not assume a Gaussian distribution of the forecasting errors. We investigate to what extent ITL criteria yield better results. In addition, we analyze time-adaptive training algorithms and how they enable WPF algorithms to cope with non-stationary data and, thus, to adapt to new situations without requiring additional offline training of the model. We test the new point forecasting algorithms on two wind farms located in the U.S. Midwest. Although there have been advancements in deterministic WPF, a single-valued forecast cannot provide information on the dispersion of observations around the predicted value. We argue that it is essential to generate, together with (or as an alternative to) point forecasts, a representation of the wind power uncertainty. 
Wind power uncertainty representation can take the form of probabilistic forecasts (e.g., probability density function, quantiles), risk indices (e.g., prediction risk index) or scenarios

  9. Quality Assessment and Improvement Methods in Statistics – what Works?

    Directory of Open Access Journals (Sweden)

    Hans Viggo Sæbø

    2014-12-01

    Full Text Available Several methods for quality assessment and assurance in statistics have been developed in a European context. Data Quality Assessment Methods (DatQAM) were considered in a Eurostat handbook in 2007. These methods comprise quality reports and indicators, measurement of process variables, user surveys, self-assessments, audits, labelling and certification. The entry point for the paper is the development of systematic quality work in European statistics with regard to good practices such as those described in the DatQAM handbook. Assessment is one issue; following up recommendations and implementation of improvement actions is another. This leads to a discussion on the effect of approaches and tools: Which work well, which have turned out to be more of a challenge, and why? Examples are mainly from Statistics Norway, but these are believed to be representative of several statistical institutes.

  10. Model of risk assessment under ballistic statistical tests

    Science.gov (United States)

    Gabrovski, Ivan; Karakaneva, Juliana

    This paper presents the application of a mathematical method for risk assessment in the statistical determination of the ballistic limits of protective equipment. The authors have implemented a mathematical model based on Pierson's criteria. The software implementation of the model allows evaluation of the V50 indicator and assessment of the reliability of the statistical hypotheses. The results supply specialists with information about the interval estimates of the probability determined during the testing process.

  11. Visual and Statistical Analysis of Digital Elevation Models Generated Using IDW Interpolator with Varying Powers

    Science.gov (United States)

    Asal, F. F.

    2012-07-01

    Digital elevation data obtained from different Engineering Surveying techniques is utilized in generating a Digital Elevation Model (DEM), which is employed in many Engineering and Environmental applications. This data is usually in discrete point format, making it necessary to utilize an interpolation approach for the creation of the DEM. Quality assessment of the DEM is a vital issue controlling its use in different applications; however, this assessment relies heavily on statistical methods while neglecting visual methods. This research applies visual analysis to DEMs generated using the IDW interpolator with varying powers in order to examine its potential for assessing the effects of the variation of the IDW power on the quality of the DEMs. Real elevation data was collected in the field using a total station instrument in corrugated terrain. DEMs were generated from the data at a unified cell size using the IDW interpolator with power values ranging from one to ten. Visual analysis was undertaken using 2D and 3D views of the DEM; in addition, statistical analysis was performed to assess the validity of the visual techniques in this analysis. Visual analysis showed that smoothing of the DEM decreases as the power value increases up to a power of four; however, increasing the power beyond four does not leave noticeable changes in the 2D and 3D views of the DEM. The statistical analysis supported these results, where the Standard Deviation (SD) of the DEM increased with increasing power. More specifically, changing the power from one to two produced 36% of the total increase in SD (the increase due to changing the power from one to ten), and changing to powers of three and four gave 60% and 75% respectively. This reflects a decrease in DEM smoothing as the IDW power increases. The study also has shown that applying visual methods supported by statistical
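    The interpolator itself is compact. A plain-Python sketch below shows how the power parameter trades smoothing for locality: low powers average broadly, high powers lock onto the nearest sample. The sample points and query location are invented for illustration.

```python
def idw(points, x, y, power=2.0, eps=1e-12):
    """Inverse Distance Weighting: estimate z at (x, y) from (xi, yi, zi)
    samples, with weights w = 1 / d^power."""
    num = den = 0.0
    for xi, yi, zi in points:
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 < eps:                # query coincides with a sample point
            return zi
        w = d2 ** (-power / 2.0)    # equivalent to 1 / d**power
        num += w * zi
        den += w
    return num / den

pts = [(0, 0, 10.0), (1, 0, 20.0), (0, 1, 30.0)]
for p in (1, 2, 4, 10):
    # As the power rises, the estimate approaches the nearest sample (z = 10).
    print(p, round(idw(pts, 0.4, 0.4, power=p), 2))
```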

  12. Robust Statistical Detection of Power-Law Cross-Correlation.

    Science.gov (United States)

    Blythe, Duncan A J; Nikulin, Vadim V; Müller, Klaus-Robert

    2016-01-01

    We show that widely used approaches in statistical physics incorrectly indicate the existence of power-law cross-correlations between financial stock market fluctuations measured over several years and the neuronal activity of the human brain lasting for only a few minutes. While such cross-correlations are nonsensical, no current methodology allows them to be reliably discarded, leaving researchers at greater risk when the spurious nature of cross-correlations is not clear from the unrelated origin of the time series and rather requires careful statistical estimation. Here we propose a theory and method (PLCC-test) which allows us to rigorously and robustly test for power-law cross-correlations, correctly detecting genuine and discarding spurious cross-correlations, thus establishing meaningful relationships between processes in complex physical systems. Our method reveals for the first time the presence of power-law cross-correlations between amplitudes of the alpha and beta frequency ranges of the human electroencephalogram. PMID:27250630

  13. GNSS Spoofing Detection Based on Signal Power Measurements: Statistical Analysis

    Directory of Open Access Journals (Sweden)

    V. Dehghanian

    2012-01-01

    Full Text Available A threat to GNSS receivers is posed by a spoofing transmitter that emulates authentic signals but with randomized code phase and Doppler values over a small range. Such spoofing signals can result in large navigational solution errors that are passed on to the unsuspecting user with potentially dire consequences. An effective spoofing detection technique based on signal power measurements is developed in this paper, which can be readily applied to present consumer-grade GNSS receivers with minimal firmware changes. An extensive statistical analysis is carried out based on formulating a multihypothesis detection problem. Expressions are developed to devise the set of thresholds required for signal detection and identification. The detection processing methods developed are further manipulated to exploit incidental antenna motion arising from user interaction with a GNSS handheld receiver to further enhance the detection performance of the proposed algorithm. The statistical analysis supports the effectiveness of the proposed spoofing detection technique under various multipath conditions.
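    The threshold-setting logic can be sketched for a single hypothesis pair. The Gaussian-in-dB model of nominal received power and all numbers below are assumptions for illustration; they are not taken from the paper's multihypothesis formulation.

```python
from statistics import NormalDist

def power_threshold(mean_dbm, sigma_db, p_false_alarm):
    """Detection threshold on received power for a target false-alarm rate,
    assuming nominal power is Gaussian in dB (a common simplification).
    Power above the threshold flags a possible spoofing transmitter."""
    z = NormalDist().inv_cdf(1 - p_false_alarm)
    return mean_dbm + z * sigma_db

# Illustrative nominal GPS-like power level and spread:
thr = power_threshold(-130.0, 2.0, 1e-3)
print(round(thr, 2))  # about -123.8 dBm
```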

  14. Demographic statistics pertaining to nuclear power reactor sites

    International Nuclear Information System (INIS)

    Population statistics are presented for 145 nuclear power plant sites. Summary tables and figures are included that were developed to aid in the evaluation of trends and general patterns associated with the various parameters of interest, such as the proximity of nuclear plant sites to centers of population. The primary reason for publishing this information at this time is to provide a factual basis for use in discussions on the subject of reactor siting policy. The report is a revised and updated version of a draft report published in December 1977. Errors in the population data base have been corrected and new data tabulations added

  15. APPLICATION OF THE UNIFIED STATISTICAL MATERIAL DATABASE FOR DESIGN AND LIFE/RISK ASSESSMENT OF HIGH TEMPERATURE COMPONENTS

    Institute of Scientific and Technical Information of China (English)

    K.Fujiyama; T.Fujiwara; Y.Nakatani; K.Saito; A.Sakuma; Y.Akikuni; S.Hayashi; S.Matsumoto

    2004-01-01

    Statistical manipulation of material data was conducted for probabilistic life assessment and risk-based design and maintenance of high temperature components of power plants. To obtain the statistical distribution of material properties, dominant parameters affecting material properties are introduced to normalize the statistical variables. These parameters include hardness, chemical composition, characteristic microstructural features and so on. Creep and fatigue properties are expressed by the normalized parameters and unified statistical distributions are obtained. These probability distribution functions show good statistical agreement with the field database of steam turbine components. It was concluded that the unified statistical baseline approach is useful for the risk management of components in power plants.

  16. Model output statistics applied to wind power prediction

    Energy Technology Data Exchange (ETDEWEB)

    Joensen, A.; Giebel, G.; Landberg, L. [Risoe National Lab., Roskilde (Denmark); Madsen, H.; Nielsen, H.A. [The Technical Univ. of Denmark, Dept. of Mathematical Modelling, Lyngby (Denmark)

    1999-03-01

    Being able to predict the output of a wind farm online for a day or two in advance has significant advantages for utilities, such as better possibility to schedule fossil fuelled power plants and a better position on electricity spot markets. In this paper prediction methods based on Numerical Weather Prediction (NWP) models are considered. The spatial resolution used in NWP models implies that these predictions are not valid locally at a specific wind farm. Furthermore, due to the non-stationary nature and complexity of the processes in the atmosphere, and occasional changes of NWP models, the deviation between the predicted and the measured wind will be time dependent. If observational data is available, and if the deviation between the predictions and the observations exhibits systematic behavior, this should be corrected for; if statistical methods are used, this approach is usually referred to as MOS (Model Output Statistics). The influence of atmospheric turbulence intensity, topography, prediction horizon length and auto-correlation of wind speed and power is considered, and to take the time-variations into account, adaptive estimation methods are applied. Three estimation techniques are considered and compared, Extended Kalman Filtering, recursive least squares and a new modified recursive least squares algorithm. (au) EU-JOULE-3. 11 refs.
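    A MOS correction with recursive least squares can be sketched as follows. The linear bias model and all numbers are invented for illustration, and the paper's modified RLS variant is not reproduced here; this is the textbook algorithm with a forgetting factor, which is what lets it track time-dependent NWP bias online.

```python
import numpy as np

class RecursiveLeastSquares:
    """Plain recursive least squares (RLS) with forgetting factor `lam`,
    of the kind used for online MOS-style correction of NWP output."""
    def __init__(self, n_features, lam=0.99, delta=100.0):
        self.w = np.zeros(n_features)          # coefficient estimates
        self.P = delta * np.eye(n_features)    # inverse-covariance estimate
        self.lam = lam

    def update(self, x, y):
        x = np.asarray(x, dtype=float)
        Px = self.P @ x
        k = Px / (self.lam + x @ Px)            # gain vector
        self.w = self.w + k * (y - x @ self.w)  # prediction-error correction
        self.P = (self.P - np.outer(k, Px)) / self.lam

    def predict(self, x):
        return float(np.asarray(x, dtype=float) @ self.w)

# Toy systematic deviation between raw forecast and observed power:
# obs = 0.8 * forecast + 50 + noise (numbers invented for illustration).
rng = np.random.default_rng(3)
rls = RecursiveLeastSquares(2)
for _ in range(500):
    f = rng.uniform(0, 1000)
    obs = 0.8 * f + 50 + rng.normal(0, 5)
    rls.update([f, 1.0], obs)       # features: forecast and an intercept
print(rls.w.round(2), round(rls.predict([500.0, 1.0]), 1))
```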

  17. HVDC power transmission technology assessment

    Energy Technology Data Exchange (ETDEWEB)

    Hauth, R.L.; Tatro, P.J.; Railing, B.D. [New England Power Service Co., Westborough, MA (United States); Johnson, B.K.; Stewart, J.R. [Power Technologies, Inc., Schenectady, NY (United States); Fink, J.L.

    1997-04-01

The purpose of this study was to develop an assessment of the national utility system's needs for electric transmission during the period 1995-2020 that could be met by future reduced-cost HVDC systems. The assessment was to include an economic evaluation of HVDC as a means for meeting those needs as well as a comparison with competing technologies such as ac transmission with and without Flexible AC Transmission System (FACTS) controllers. The role of force-commutated dc converters was to be assumed where appropriate. The assessment begins by identifying the general needs for transmission in the U.S. in the context of a future deregulated power industry. The possible roles for direct current transmission are then postulated in terms of representative scenarios. A few of the scenarios are illustrated with the help of actual U.S. system examples. Non-traditional applications as well as traditional applications such as long lines and asynchronous interconnections are discussed. The classical "break-even distance" concept for comparing HVDC and ac lines is used to assess the selected scenarios. The impact of reduced-cost converters is reflected in terms of the break-even distance. This report presents a comprehensive review of the functional benefits of HVDC transmission and updated cost data for both ac and dc system components. It also provides some provocative thoughts on how direct current transmission might be applied to better utilize and expand our nation's increasingly stressed transmission assets.

  18. Statistical Analysis of the Impact of Wind Power on Market Quantities and Power Flows

    DEFF Research Database (Denmark)

    Pinson, Pierre; Jónsson, Tryggvi; Zugno, Marco;

    2012-01-01

In view of the increasing penetration of wind power in a number of power systems and markets worldwide, we discuss some of the impacts that wind energy may have on market quantities and cross-border power flows. These impacts are uncovered through statistical analyses of actual market and flow data in Europe. Due to the dimensionality and nonlinearity of these effects, the necessary concepts of dimension reduction using Principal Component Analysis (PCA), as well as nonlinear regression, are described. Example application results are given for European cross-border flows, as well as for the impact of load and wind power forecasts on Danish and German electricity markets.
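To make the PCA step concrete, here is a minimal sketch that extracts the first principal component of synthetic "cross-border flow" data by power iteration on the sample covariance matrix; the three-border setup and the single common wind driver are invented for illustration.

```python
import random

def first_pc(data, iters=200):
    """First principal component via power iteration on the sample
    covariance matrix (a minimal stand-in for a full PCA)."""
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    x = [[row[j] - means[j] for j in range(d)] for row in data]
    cov = [[sum(x[k][i] * x[k][j] for k in range(n)) / n for j in range(d)]
           for i in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]           # normalize each iteration
    return v

# Hypothetical hourly flows on 3 borders, all driven by one wind pattern
rng = random.Random(2)
rows = []
for _ in range(1000):
    wind = rng.gauss(0, 1)
    rows.append([2.0 * wind + rng.gauss(0, 0.1),
                 -1.0 * wind + rng.gauss(0, 0.1),
                 0.5 * wind + rng.gauss(0, 0.1)])
v = first_pc(rows)
print([round(c, 2) for c in v])   # direction ≈ ±[0.87, -0.44, 0.22]
```

The leading component recovers the common driver, which is the sense in which PCA reduces the dimensionality of correlated flow series before a nonlinear regression step.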

  19. The role of previous experience and attitudes toward statistics in statistics assessment outcomes among undergraduate psychology students

    OpenAIRE

    Dempster, Martin; McCorry, Noleen

    2009-01-01

    Previous research has demonstrated that students’ cognitions about statistics are related to their performance in statistics assessments. The purpose of this research is to examine the nature of the relationships between undergraduate psychology students’ previous experiences of maths, statistics and computing; their attitudes toward statistics; and assessment on a statistics course. Of the variables examined, the strongest predictor of assessment outcome was students’ attitude about their in...

  20. Toward improved statistical treatments of wind power forecast errors

    Science.gov (United States)

    Hart, E.; Jacobson, M. Z.

    2011-12-01

The ability of renewable resources to reliably supply electric power demand is of considerable interest in the context of growing renewable portfolio standards and the potential for future carbon markets. Toward this end, a number of probabilistic models have been applied to the problem of grid integration of intermittent renewables, such as wind power. Most of these models rely on simple Markov or autoregressive models of wind forecast errors. While these models generally capture the bulk statistics of wind forecast errors, they often fail to reproduce accurate ramp rate distributions and do not accurately describe extreme forecast error events, both of which are of considerable interest to those seeking to comment on system reliability. The problem often lies in characterizing and reproducing not only the magnitude of wind forecast errors, but also the timing or phase errors (i.e., when a front passes over a wind farm). Here we compare time series wind power data produced using different forecast error models to determine the best approach for capturing errors in both magnitude and phase. Additionally, new metrics are presented to characterize forecast quality with respect to both considerations.
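A sketch of the kind of autoregressive error model the abstract refers to: an AR(1) process reproduces the bulk error variance, but its ramp (one-step change) distribution is rigidly tied to the same two parameters, which is one reason such models can misrepresent ramps and extremes. The values of phi and sigma are arbitrary illustrative choices.

```python
import random
import statistics

def ar1_errors(n, phi=0.8, sigma=1.0, seed=0):
    """Generate an AR(1) forecast-error series e[t] = phi*e[t-1] + noise."""
    rng = random.Random(seed)
    e, out = 0.0, []
    for _ in range(n):
        e = phi * e + rng.gauss(0, sigma)
        out.append(e)
    return out

errs = ar1_errors(20000)
# Bulk statistic: stationary variance of AR(1) is sigma^2 / (1 - phi^2)
print(round(statistics.pvariance(errs), 2))    # close to 1/(1-0.64) ≈ 2.78
# Ramp-rate proxy: one-step changes have variance 2*sigma^2 / (1 + phi)
ramps = [b - a for a, b in zip(errs, errs[1:])]
print(round(statistics.pvariance(ramps), 2))   # close to 2/1.8 ≈ 1.11
```

Once phi is fitted to match the error variance, the ramp variance is fixed too, so magnitude and ramp behaviour cannot be tuned independently in this model class.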

  1. Statistical analysis applied to safety culture self-assessment

    International Nuclear Information System (INIS)

Interviews and opinion surveys are instruments used to assess the safety culture in an organization as part of the Safety Culture Enhancement Programme. Specific statistical tools are used to analyse the survey results. This paper presents an example of an opinion survey with the corresponding application of the statistical analysis and the conclusions obtained. Survey validation, frequency statistics, the Kolmogorov-Smirnov non-parametric test, Student's t-test and ANOVA means-comparison tests, and the LSD post-hoc multiple comparison test are discussed. (author)

  2. Improved power performance assessment methods

    Energy Technology Data Exchange (ETDEWEB)

Frandsen, S.; Antoniou, I.; Dahlberg, J.A. [and others]

    1999-03-01

    The uncertainty of presently-used methods for retrospective assessment of the productive capacity of wind farms is unacceptably large. The possibilities of improving the accuracy have been investigated and are reported. A method is presented that includes an extended power curve and site calibration. In addition, blockage effects with respect to reference wind speed measurements are analysed. It is found that significant accuracy improvements are possible by the introduction of more input variables such as turbulence and wind shear, in addition to mean wind speed and air density. Also, the testing of several or all machines in the wind farm - instead of only one or two - may provide a better estimate of the average performance. (au)

  3. Prediction of lacking control power in power plants using statistical models

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Mataji, B.; Stoustrup, Jakob

    2007-01-01

Prediction of the performance of plants like power plants is of interest, since the plant operator can use these predictions to optimize the plant production. In this paper the focus is on a special case where a combination of high coal moisture content and a high load limits the possible plant load, meaning that the requested plant load cannot be met. The available models are in this case uncertain. Instead, statistical methods are used to predict upper and lower uncertainty bounds on the prediction. Two different methods are used: the first relies on statistics of recent prediction errors; the second uses operating-point-dependent statistics of prediction errors. Using these methods on the previously mentioned case, it can be concluded that the second method can be used to predict the power plant performance, while the first method has problems predicting the uncertain performance of the plant.

  4. Alternative Assessment in Higher Education: An Experience in Descriptive Statistics

    Science.gov (United States)

    Libman, Zipora

    2010-01-01

    Assessment-led reform is now one of the most widely favored strategies to promote higher standards of teaching, more powerful learning and more credible forms of public accountability. Within this context of change, higher education in many countries is increasingly subjected to demands to implement alternative assessment strategies that provide…

  5. Earthquake accelerogram simulation with statistical law of evolutionary power spectrum

    Institute of Scientific and Technical Information of China (English)

    ZHANG Cui-ran; CHEN Hou-qun; LI Min

    2007-01-01

By using the technique for evolutionary power spectra proposed by Nakayama and with reference to the Kameda formula, an evolutionary spectrum prediction model for given earthquake magnitude and distance is established based on 80 large-magnitude, near-source acceleration records at rock surface from the ground motion database of the western U.S. A new iteration method is then developed for the generation of random accelerograms, non-stationary in both amplitude and frequency, which are compatible with a target evolutionary spectrum. The phase spectra of these simulated accelerograms are also non-stationary in the time and frequency domains, since the interaction between amplitude and phase angle is considered during the generation. Furthermore, the sign of the phase spectrum increment is identified to accelerate the iteration. With the proposed statistical model for predicting evolutionary power spectra and the new method for generating compatible time histories, artificial random earthquake accelerograms, non-stationary in both amplitude and frequency, can be provided for a given magnitude and distance.

  6. Gene set analysis for GWAS: assessing the use of modified Kolmogorov-Smirnov statistics.

    Science.gov (United States)

    Debrabant, Birgit; Soerensen, Mette

    2014-10-01

We discuss the use of modified Kolmogorov-Smirnov (KS) statistics in the context of gene set analysis and review the corresponding null and alternative hypotheses. In particular, we show that, when enhancing the impact of highly significant genes in the calculation of the test statistic, the corresponding test can be considered to infer the classical self-contained null hypothesis. We use simulations to estimate the power for different kinds of alternatives and to assess the impact of the weight parameter of the modified KS statistic on the power. Finally, we show the analogy between the weight parameter and the genesis and distribution of the gene-level statistics, and illustrate the effects of differential weighting in a real-life example.
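The modified KS statistic discussed here can be sketched as the familiar GSEA-style running sum, where a weight exponent p controls how strongly highly significant genes dominate (p = 0 recovers the classical unweighted KS form). The toy gene-level statistics below are invented.

```python
def weighted_ks(gene_stats, gene_set, p=1.0):
    """Weighted Kolmogorov-Smirnov enrichment score (GSEA-style).
    gene_stats: dict gene -> gene-level statistic (larger = stronger signal).
    gene_set: genes in the set; p: weight exponent (p=0 gives the classical
    unweighted KS statistic, p>0 enhances highly significant genes)."""
    ranked = sorted(gene_stats, key=gene_stats.get, reverse=True)
    hits = set(gene_set)
    nr = sum(abs(gene_stats[g]) ** p for g in ranked if g in hits)
    miss_step = 1.0 / (len(ranked) - len(hits))
    running, extreme = 0.0, 0.0
    for g in ranked:
        if g in hits:
            running += abs(gene_stats[g]) ** p / nr   # weighted step up at hits
        else:
            running -= miss_step                      # uniform step down at misses
        if abs(running) > abs(extreme):
            extreme = running
    return extreme

# Ten genes ranked by a hypothetical association statistic; the set holds
# the three strongest genes, so the running sum peaks at its maximum of 1.0
stats_ = {f"g{i}": 10 - i for i in range(10)}
print(round(weighted_ks(stats_, {"g0", "g1", "g2"}, p=1.0), 3))  # → 1.0
```

Raising p concentrates the score on the top-ranked genes, which is exactly the design choice whose effect on power the abstract investigates.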

  7. The effect of cluster size variability on statistical power in cluster-randomized trials.

    Directory of Open Access Journals (Sweden)

    Stephen A Lauer

Full Text Available The frequency of cluster-randomized trials (CRTs) in peer-reviewed literature has increased exponentially over the past two decades. CRTs are a valuable tool for studying interventions that cannot be effectively implemented or randomized at the individual level. However, some aspects of the design and analysis of data from CRTs are more complex than those for individually randomized controlled trials. One of the key components of designing a successful CRT is calculating the proper sample size (i.e., the number of clusters) needed to attain an acceptable level of statistical power. In order to do this, a researcher must make assumptions about the value of several variables, including a fixed mean cluster size. In practice, cluster size can often vary dramatically. Few studies account for the effect of cluster size variation when assessing the statistical power for a given trial. We conducted a simulation study to investigate how the statistical power of CRTs changes with variable cluster sizes. In general, we observed that increases in cluster size variability lead to a decrease in power.
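The simulation design can be sketched roughly as follows: draw cluster sizes with a chosen coefficient of variation, simulate outcomes with a given intraclass correlation, and estimate power from a cluster-level test. All parameter values (ICC, effect size, cluster counts) are illustrative assumptions, and the normal-approximation z-test is a stand-in rather than the authors' analysis.

```python
import random
import statistics

def crt_power(n_clusters, mean_size, cv, effect, icc=0.05,
              sims=500, seed=7):
    """Estimate the power of a two-arm CRT by simulation.
    cv is the coefficient of variation of cluster size (0 = fixed sizes)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(sims):
        arm_means = []
        for arm in (0, 1):
            means = []
            for _ in range(n_clusters // 2):
                size = max(2, round(rng.gauss(mean_size, cv * mean_size)))
                u = rng.gauss(arm * effect, icc ** 0.5)        # cluster effect
                y = [u + rng.gauss(0, (1 - icc) ** 0.5) for _ in range(size)]
                means.append(statistics.fmean(y))
            arm_means.append(means)
        d = statistics.fmean(arm_means[1]) - statistics.fmean(arm_means[0])
        se = (statistics.variance(arm_means[0]) / len(arm_means[0])
              + statistics.variance(arm_means[1]) / len(arm_means[1])) ** 0.5
        if abs(d / se) > 1.96:                # normal-approximation z-test
            hits += 1
    return hits / sims

p_fixed = crt_power(20, 30, cv=0.0, effect=0.25)
p_varied = crt_power(20, 30, cv=0.6, effect=0.25)
print(p_fixed, p_varied)   # variable cluster sizes tend to give lower power
```

Small clusters contribute much noisier cluster means, so size variability inflates the variance of the arm averages, which is the mechanism behind the power loss the study reports.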

  8. The Role of Previous Experience and Attitudes toward Statistics in Statistics Assessment Outcomes among Undergraduate Psychology Students

    Science.gov (United States)

    Dempster, Martin; McCorry, Noleen K.

    2009-01-01

    Previous research has demonstrated that students' cognitions about statistics are related to their performance in statistics assessments. The purpose of this research is to examine the nature of the relationships between undergraduate psychology students' previous experiences of maths, statistics and computing; their attitudes toward statistics;…

  9. A statistical model of uplink inter-cell interference with slow and fast power control mechanisms

    KAUST Repository

    Tabassum, Hina

    2013-09-01

Uplink power control is in essence an interference mitigation technique that aims at minimizing the inter-cell interference (ICI) in cellular networks by reducing the transmit power levels of the mobile users while maintaining their target received signal quality levels at base stations. Power control mechanisms directly impact the interference dynamics and, thus, affect the overall achievable capacity and consumed power in cellular networks. Due to the stochastic nature of wireless channels and mobile users' locations, it is important to derive theoretical models for ICI that can capture the impact of design alternatives related to power control mechanisms. To this end, we derive and verify a novel statistical model for uplink ICI in Generalized-K composite fading environments as a function of various slow and fast power control mechanisms. The derived expressions are then utilized to quantify numerically key network performance metrics that include average resource fairness, average reduction in power consumption, and ergodic capacity. The accuracy of the derived expressions is validated via Monte-Carlo simulations. Results are generated for multiple network scenarios, and insights are extracted to assess various power control mechanisms as a function of system parameters. © 1972-2012 IEEE.

  10. Statistical analysis about corrosion in nuclear power plants

    International Nuclear Information System (INIS)

Investigations have been carried out into the degradation mechanisms of structures, systems and components in nuclear power plants, since many of the processes involved determine the reliability of these plants, the integrity of their components, and their safety, among other aspects. This work presents the statistics of studies related to materials corrosion, in its wide variety of specific mechanisms, existing at the world level for PWR, BWR and WWER reactors, analysing the AIRS (Advanced Incident Reporting System) for the period 1993-1998 for the first two reactor types and for the period 1982-1995 for the WWER. The identification of factors allows them to be characterized as those which apply, i.e. events that occurred through the presence of some corrosion mechanism, and those which do not apply, i.e. incidents due to natural factors, mechanical failures or human errors. Finally, the total number of cases analysed corresponds to the total of cases which apply and which do not apply. (Author)

  11. The power and statistical behaviour of allele-sharing statistics when applied to models with two disease loci

    Indian Academy of Sciences (India)

    Yin Y. Shugart; Bing-Jian Feng; Andrew Collins

    2002-11-01

We have evaluated the power for detecting a common trait determined by two loci, using seven statistics, of which five are implemented in the computer program SimWalk2 and two are implemented in GENEHUNTER. Unlike most previous reports, which involve evaluations of the power of allele-sharing statistics for a single disease locus, we have used a simulated data set of general pedigrees in which a two-locus disease is segregating and evaluated several non-parametric linkage statistics implemented in the two programs. We found that the power for detecting linkage using the $S_{\text{all}}$ statistic in GENEHUNTER (GH, version 2.1), implemented as a statistic in SimWalk2 (version 2.82), is different in the two. The values associated with the statistic output by SimWalk2 are consistently more conservative than those from GENEHUNTER, except when the underlying model includes heterogeneity at a level of 50%, where the values output are very comparable. On the other hand, when the thresholds are determined empirically under the null hypothesis, $S_{\text{all}}$ in GENEHUNTER and the statistic have similar power.

  12. Caveats for using statistical significance tests in research assessments

    DEFF Research Database (Denmark)

    Schneider, Jesper Wiborg

    2013-01-01

The use of statistical significance tests in research assessments is controversial and numerous criticisms have been leveled against it. Based on examples from articles by proponents of the use of statistical significance tests in research assessments, we address some of the numerous problems with such tests. The issues specifically discussed are the ritual practice of such tests, their dichotomous application in decision making, the difference between statistical and substantive significance, the implausibility of most null hypotheses, the crucial assumption of randomness, as well as the utility of standard errors and confidence intervals for inferential purposes. We argue that applying statistical significance tests and mechanically adhering to their results are highly problematic and detrimental to critical thinking. We claim that the use of such tests does not provide any advantages in relation to deciding whether differences between citation indicators are...

  13. Assessing agreement with multiple raters on correlated kappa statistics.

    Science.gov (United States)

    Cao, Hongyuan; Sen, Pranab K; Peery, Anne F; Dellon, Evan S

    2016-07-01

    In clinical studies, it is often of interest to see the diagnostic agreement among clinicians on certain symptoms. Previous work has focused on the agreement between two clinicians under two different conditions or the agreement among multiple clinicians under one condition. Few have discussed the agreement study with a design where multiple clinicians examine the same group of patients under two different conditions. In this paper, we use the intraclass kappa statistic for assessing nominal scale agreement with such a design. We derive an explicit variance formula for the difference of correlated kappa statistics and conduct hypothesis testing for the equality of kappa statistics. Simulation studies show that the method performs well with realistic sample sizes and may be superior to a method that did not take into account the measurement dependence structure. The practical utility of the method is illustrated on data from an eosinophilic esophagitis (EoE) study.
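For orientation, the simplest member of this family of agreement measures is Cohen's kappa for two raters; the intraclass kappa used in the paper, and the variance of a difference of correlated kappas, build on the same observed-versus-chance-agreement idea. The ratings below are invented.

```python
from collections import Counter

def cohen_kappa(r1, r2):
    """Cohen's kappa for two raters on a nominal scale:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n           # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    pe = sum(c1[k] * c2[k] for k in c1) / n ** 2           # chance agreement
    return (po - pe) / (1 - pe)

# Two hypothetical clinicians rating 10 patients (p = present, a = absent)
a = ["p", "p", "a", "a", "p", "a", "p", "p", "a", "a"]
b = ["p", "p", "a", "p", "p", "a", "p", "a", "a", "a"]
print(round(cohen_kappa(a, b), 3))   # → 0.6
```

Here the raters agree on 8 of 10 patients (0.8) against a chance agreement of 0.5, giving kappa = 0.3/0.5 = 0.6; the paper's contribution is handling the correlation induced when the same raters and patients appear under two conditions.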

  14. Efficient computation and statistical assessment of transfer entropy

    Directory of Open Access Journals (Sweden)

    Patrick eBoba

    2015-03-01

Full Text Available The analysis of complex systems frequently poses the challenge of distinguishing correlation from causation. Statistical physics has inspired very promising approaches to the search for correlations in time series, the transfer entropy in particular (Hlavackova-Schindler et al., 2007). Now, methods from computational statistics can quantitatively assign significance to such correlation measures. In this study, we propose and apply a procedure to statistically assess transfer entropies by one-sided tests. We introduce two null models of vanishing correlations for time series with memory. We implemented them in an OpenMP-based, parallelized C++ package for multi-core CPUs. Using template meta-programming, we enable a compromise between memory and run-time efficiency.
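A plug-in transfer entropy estimator for binary series, with a naive one-sided permutation test standing in for the paper's null models (plain shuffling destroys the memory that the authors' null models preserve, so this is only schematic); the lag-1 coupling between the series is invented.

```python
import math
import random
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in transfer entropy (bits) from x to y with history length 1:
    TE = sum p(y1,y0,x0) * log2[ p(y1|y0,x0) / p(y1|y0) ]."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))
    pairs_yx = Counter(zip(y[:-1], x[:-1]))
    pairs_yy = Counter(zip(y[1:], y[:-1]))
    singles = Counter(y[:-1])
    n = len(y) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_cond_full = c / pairs_yx[(y0, x0)]               # p(y1 | y0, x0)
        p_cond_hist = pairs_yy[(y1, y0)] / singles[y0]     # p(y1 | y0)
        te += (c / n) * math.log2(p_cond_full / p_cond_hist)
    return te

rng = random.Random(3)
x = [rng.randint(0, 1) for _ in range(5000)]
y = [0] + x[:-1]                      # y copies x with lag 1 -> TE ≈ 1 bit
obs = transfer_entropy(x, y)
print(round(obs, 2))                  # close to 1.0
# One-sided test: shuffle x to build a (memoryless) null distribution
null = [transfer_entropy(rng.sample(x, len(x)), y) for _ in range(200)]
pval = sum(t >= obs for t in null) / len(null)
print(pval)                           # → 0.0
```

Because y is a deterministic lagged copy of x, knowing x removes one full bit of uncertainty about the next y, and every shuffled surrogate scores near zero.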

  15. Reliability assessment for safety critical systems by statistical random testing

    International Nuclear Information System (INIS)

In this report we present an overview of reliability assessment for software and focus on some basic aspects of assessing reliability for safety critical systems by statistical random testing. We also discuss possible deviations from some essential assumptions on which the general methodology is based. These deviations appear quite likely in practical applications. We present and discuss possible remedies and adjustments and then undertake applying this methodology to a portion of the SDS1 software. We also indicate shortcomings of the methodology and possible avenues to follow to address these problems. (author). 128 refs., 11 tabs., 31 figs

  16. The Statistics Concept Inventory: Development and analysis of a cognitive assessment instrument in statistics

    Science.gov (United States)

    Allen, Kirk

The Statistics Concept Inventory (SCI) is a multiple choice test designed to assess students' conceptual understanding of topics typically encountered in an introductory statistics course. This dissertation documents the development of the SCI from Fall 2002 up to Spring 2006. The first phase of the project essentially sought to answer the question: "Can you write a test to assess topics typically encountered in introductory statistics?" Book One presents the results utilized in answering this question in the affirmative. The bulk of the results present the development and evolution of the items, primarily relying on objective metrics to gauge effectiveness but also incorporating student feedback. The second phase boils down to: "Now that you have the test, what else can you do with it?" This includes an exploration of Cronbach's alpha, the most commonly-used measure of test reliability in the literature. An online version of the SCI was designed, and its equivalency to the paper version is assessed. Adding an extra wrinkle to the online SCI, subjects rated their answer confidence. These results show a general positive trend between confidence and correct responses. However, some items buck this trend, revealing potential sources of misunderstandings, with comparisons offered to the extant statistics and probability educational research. The third phase is a re-assessment of the SCI: "Are you sure?" A factor analytic study favored a uni-dimensional structure for the SCI, although maintaining the likelihood of a deeper structure if more items can be written to tap similar topics. A shortened version of the instrument is proposed, demonstrated to be able to maintain a reliability nearly identical to that of the full instrument. Incorporating student feedback and a faculty topics survey, improvements to the items and recommendations for further research are proposed. The state of the concept inventory movement is assessed, to offer a comparison to the work presented.

  17. Environmental Assessment for power marketing policy for Southwestern Power Administration

    Energy Technology Data Exchange (ETDEWEB)

    1993-12-01

Southwestern Power Administration (Southwestern) needs to renew expiring power sales contracts with new term (10 year) sales contracts. The existing contracts have been in place for several years and many will expire over the next ten years. Southwestern completed an Environmental Assessment on the existing power allocation in June 1979 (a copy of the EA is attached), and there are no proposed additions of any major new generation resources, service to discrete major new loads, or major changes in operating parameters beyond those included in the existing power allocation. Impacts from a no-action plan, the proposed alternative, and marketing power for less than 10 years are described.

  18. Environmental Assessment for power marketing policy for Southwestern Power Administration

    International Nuclear Information System (INIS)

Southwestern Power Administration (Southwestern) needs to renew expiring power sales contracts with new term (10 year) sales contracts. The existing contracts have been in place for several years and many will expire over the next ten years. Southwestern completed an Environmental Assessment on the existing power allocation in June 1979 (a copy of the EA is attached), and there are no proposed additions of any major new generation resources, service to discrete major new loads, or major changes in operating parameters beyond those included in the existing power allocation. Impacts from a no-action plan, the proposed alternative, and marketing power for less than 10 years are described.

  19. Statistical analysis of fire events at US nuclear power plants

    International Nuclear Information System (INIS)

The concern about fires as a potential agent of common cause failure in NPPs has greatly increased since the Browns Ferry NPP fire. Several regulatory actions were initiated following this incident. In investigating the chances of a fire incident leading to core melt, it is found that the unconditional frequency is about 1x10 incidents per reactor-year. Detailed reviews of fire events at nuclear plants are used in quantifying the fire occurrence frequency required to carry out fire risk assessment. In this work the results of a statistical analysis of 354 fire incidents at US NPPs in the period from January 1965 to June 1985 are presented to quantify fire occurrence frequency. The distribution of fire incidents between the different types of NPPs (PWR, BWR or HTGR), the mode of plant operation, the probable cause of fire, the type of detectors detecting the incident, who extinguished the fire, the suppression equipment, the suppression agent, the initiating combustible, and the component or components affected by fire are all analysed for the 354 fire incidents studied. More than 50% of the incidents occurred during the construction phase; many of them posed neither a nuclear problem nor any safety problem, yet these incidents delayed the startup of the units by up to 2 years, as happened in Indian Point unit 2 (1971). There were four major fire incidents at US NPPs in the first period of the study (1965-1978), and not one in the last seven years (1979-1985), which reflects the development of fire protection measures and technology. The fire events at US NPPs can be summarized as about 354 incidents at 33 locations due to 38 causes of fire, with 0.17 fire events/plant/year

  20. The number of Guttman errors as a simple and powerful person-fit statistic

    OpenAIRE

    Meijer, Rob R.

    1994-01-01

    A number of studies have examined the power of several statistics that can be used to detect examinees with unexpected (nonfitting) item score patterns, or to determine person fit. This study compared the power of the U3 statistic with the power of one of the simplest person-fit statistics, the sum of the number of Guttman errors. In most cases studied, (a weighted version of) the latter statistic performed as well as the U3 statistic. Counting the number of Guttman errors seems to be a usefu...
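Counting Guttman errors is simple enough to sketch in a few lines: with items ordered from easiest to hardest, every pair in which a harder item is answered correctly while an easier one is missed counts as one error. The score patterns below are invented.

```python
def guttman_errors(scores):
    """Count Guttman errors in a dichotomous item-score pattern whose items
    are ordered from easiest to hardest: a pair (i, j) with i easier than j
    is an error when item i is wrong (0) but item j is right (1)."""
    return sum(1
               for i in range(len(scores))
               for j in range(i + 1, len(scores))
               if scores[i] < scores[j])

print(guttman_errors([1, 1, 1, 0, 0]))   # → 0 (perfect Guttman pattern)
print(guttman_errors([0, 0, 1, 1, 1]))   # → 6 (highly nonfitting pattern)
```

The weighted variant mentioned in the abstract attaches a weight to each discordant pair instead of counting them equally, but the underlying pairwise comparison is the same.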

  1. Combining heuristic and statistical techniques in landslide hazard assessments

    Science.gov (United States)

    Cepeda, Jose; Schwendtner, Barbara; Quan, Byron; Nadim, Farrokh; Diaz, Manuel; Molina, Giovanni

    2014-05-01

    As a contribution to the Global Assessment Report 2013 - GAR2013, coordinated by the United Nations International Strategy for Disaster Reduction - UNISDR, a drill-down exercise for landslide hazard assessment was carried out by entering the results of both heuristic and statistical techniques into a new but simple combination rule. The data available for this evaluation included landslide inventories, both historical and event-based. In addition to the application of a heuristic method used in the previous editions of GAR, the availability of inventories motivated the use of statistical methods. The heuristic technique is largely based on the Mora & Vahrson method, which estimates hazard as the product of susceptibility and triggering factors, where classes are weighted based on expert judgment and experience. Two statistical methods were also applied: the landslide index method, which estimates weights of the classes for the susceptibility and triggering factors based on the evidence provided by the density of landslides in each class of the factors; and the weights of evidence method, which extends the previous technique to include both positive and negative evidence of landslide occurrence in the estimation of weights for the classes. One key aspect during the hazard evaluation was the decision on the methodology to be chosen for the final assessment. Instead of opting for a single methodology, it was decided to combine the results of the three implemented techniques using a combination rule based on a normalization of the results of each method. The hazard evaluation was performed for both earthquake- and rainfall-induced landslides. The country chosen for the drill-down exercise was El Salvador. The results indicate that highest hazard levels are concentrated along the central volcanic chain and at the centre of the northern mountains.
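The weights-of-evidence step can be sketched as follows: for each class of a susceptibility factor map, compare the landslide density inside the class against outside, on a log scale, yielding a positive weight W+ for evidence inside the class and a negative weight W- outside. The cell counts below are invented.

```python
import math

def weights_of_evidence(n_class, n_slides_class, n_total, n_slides_total):
    """W+ and W- for one class of a susceptibility factor.
    n_class: cells in the class; n_slides_class: landslide cells in it;
    n_total / n_slides_total: the same counts over the whole study area."""
    p_b_s = n_slides_class / n_slides_total                       # P(class | slide)
    p_b_ns = (n_class - n_slides_class) / (n_total - n_slides_total)
    w_plus = math.log(p_b_s / p_b_ns)        # evidence for occurrence in class
    w_minus = math.log((1 - p_b_s) / (1 - p_b_ns))   # evidence outside class
    return w_plus, w_minus

# Hypothetical: a steep-slope class covers 2000 of 10000 cells
# but contains 60 of the 100 mapped landslide cells
wp, wm = weights_of_evidence(2000, 60, 10000, 100)
print(round(wp, 2), round(wm, 2))   # W+ ≈ 1.12, W- ≈ -0.70
```

The landslide index method in the abstract uses only the positive-evidence side (landslide density per class); weights of evidence extends it with the negative term, which is the distinction the text draws between the two statistical methods.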

  2. Comparisons of power of statistical methods for gene-environment interaction analyses.

    Science.gov (United States)

    Ege, Markus J; Strachan, David P

    2013-10-01

    Any genome-wide analysis is hampered by reduced statistical power due to multiple comparisons. This is particularly true for interaction analyses, which have lower statistical power than analyses of associations. To assess gene-environment interactions in population settings we have recently proposed a statistical method based on a modified two-step approach, where first genetic loci are selected by their associations with disease and environment, respectively, and subsequently tested for interactions. We have simulated various data sets resembling real world scenarios and compared single-step and two-step approaches with respect to true positive rate (TPR) in 486 scenarios and (study-wide) false positive rate (FPR) in 252 scenarios. Our simulations confirmed that in all two-step methods the two steps are not correlated. In terms of TPR, two-step approaches combining information on gene-disease association and gene-environment association in the first step were superior to all other methods, while preserving a low FPR in over 250 million simulations under the null hypothesis. Our weighted modification yielded the highest power across various degrees of gene-environment association in the controls. An optimal threshold for step 1 depended on the interacting allele frequency and the disease prevalence. In all scenarios, the least powerful method was to proceed directly to an unbiased full interaction model, applying conventional genome-wide significance thresholds. This simulation study confirms the practical advantage of two-step approaches to interaction testing over more conventional one-step designs, at least in the context of dichotomous disease outcomes and other parameters that might apply in real-world settings.
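The two-step logic can be illustrated schematically (this is the general screening idea, not the authors' weighted modification): step 1 selects markers on a marginal association p-value, and step 2 applies the interaction test with a multiplicity correction restricted to the survivors, which is far less severe than correcting for all markers. All numbers are invented.

```python
import random

def two_step_gxe(step1_p, inter_p, alpha=0.05, screen=0.05):
    """Two-step interaction scan: screen markers on a marginal p-value,
    then Bonferroni-correct the interaction test over the survivors only."""
    selected = [i for i, p in enumerate(step1_p) if p < screen]
    if not selected:
        return []
    cutoff = alpha / len(selected)    # vs alpha/M for a single-step scan
    return [i for i in selected if inter_p[i] < cutoff]

# 100000 hypothetical markers; marker 0 carries a true interaction signal
rng = random.Random(0)
M = 100000
step1 = [1e-6] + [rng.random() for _ in range(M - 1)]
inter = [1e-7] + [rng.random() for _ in range(M - 1)]
hits = two_step_gxe(step1, inter)
print(hits)   # includes marker 0
# A single-step scan would demand p < 0.05/100000 = 5e-7 genome-wide
```

The validity of this design rests on the independence of the two steps, which is exactly the property the simulations above confirm before power is compared.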

  3. Statistical assessment of quality of credit activity of Ukrainian banks

    Directory of Open Access Journals (Sweden)

    Moldavska Olena V.

    2013-03-01

Full Text Available The article conducts an economic and statistical analysis of the current state of credit activity of Ukrainian banks and the main tendencies of its development. It justifies the urgency of a statistical study of the credit activity of banks. It offers a comprehensive system for assessing bank lending at two levels: the level of the banking system and the level of an individual bank. The use of system analysis allows the interconnection between the effectiveness of functioning of the banking system and the quality of the credit portfolio to be reflected. The article considers the main aspects of managing the quality of the credit portfolio: the level of troubled debt and credit risk. It touches on the problem of an adequate quantitative assessment of troubled loans in the credit portfolios of banks, since the methodologies for its calculation used by the National Bank of Ukraine and international rating agencies are quite different. The article presents a system of methods for managing credit risk, both theoretically and with specific examples, in the context of preventing the occurrence of risk situations or eliminating their consequences.

  4. Enrichment of statistical power for genome-wide association studies

    Science.gov (United States)

    The inheritance of most human diseases and agriculturally important traits is controlled by many genes with small effects. Identifying these genes, while simultaneously controlling false positives, is challenging. Among available statistical methods, the mixed linear model (MLM) has been the most fl...

  5. Comparative environmental assessment of unconventional power installations

    Science.gov (United States)

    Sosnina, E. N.; Masleeva, O. V.; Kryukov, E. V.

    2015-08-01

    A procedure for the strategic environmental assessment of power installations operating on the basis of renewable energy sources (RES) was developed and described. This procedure takes into account not only the operation of the power installation but also the whole life cycle: from the production and distribution of power resources for manufacturing of the power installations to the process of their recovery. Such an approach makes it possible to assess more comprehensively the influence of power installations on the environment and may be used when adapting current regulations and developing new regulations for the application of different types of unconventional power installations with due account of the ecological factor. Application of the procedure of integrated environmental assessment to mini-HPPs (hydro power plants); wind, solar, and biogas power installations; and a traditional power installation operating on natural gas was considered. Comparison of environmental influence revealed advantages of the new energy technologies over traditional ones. It is shown that solar energy installations hardly pollute the environment during operation, but the negative influence of the mining operations and of the manufacturing and utilization of the materials used for solar modules is the largest. Biogas power installations are in second place as concerns the impact on the environment, owing to the considerable mass of the biogas installation and gas reciprocating engine. The minimum impact on the environment is exerted by the mini-HPP. Consumption of material and energy resources for the production of a traditional power installation is less than for power installations on RES; however, this factor increases incomparably when the fuel extraction and transfer are taken into account. The greatest impact on the environment is exerted by the operation of traditional power installations.


  6. Afterglow Light Curves and Broken Power Laws: A Statistical Study

    CERN Document Server

    Jóhannesson, Gudlaugur; Björnsson, Gunnlaugur; Gudmundsson, Einar H.

    2006-01-01

    In gamma-ray burst research it is quite common to fit the afterglow light curves with a broken power law to interpret the data. We apply this method to a computer simulated population of afterglows and find systematic differences between the known model parameters of the population and the ones derived from the power law fits. In general, the slope of the electron energy distribution is overestimated from the pre-break light curve slope while being underestimated from the post-break slope. We also find that the jet opening angle derived from the fits is overestimated in narrow jets and underestimated in wider ones. Results from fitting afterglow light curves with broken power laws must therefore be interpreted with caution since the uncertainties in the derived parameters might be larger than estimated from the fit. This may have implications for Hubble diagrams constructed using gamma-ray burst data.
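As a minimal illustration of the fitting procedure discussed here, the sketch below generates a noiseless, sharply broken power-law light curve with assumed decay indices and recovers the two slopes by linear regression in log-log space; the biases reported in the paper arise once noise and a smooth break are added, which this idealized example omits:

```python
import math

# Least-squares slope of y on x (fit in log-log space equals the power-law index).
def slope(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

a1, a2, t_break = 1.0, 2.0, 10.0  # assumed pre/post-break decay indices, break time
times = [10 ** (k / 50) for k in range(-50, 151)]  # 0.1 to 1000 (e.g. days)

def flux(t):
    if t < t_break:
        return t ** -a1
    return (t_break ** (a2 - a1)) * t ** -a2  # continuous at the break

pre = [(math.log10(t), math.log10(flux(t))) for t in times if t < t_break]
post = [(math.log10(t), math.log10(flux(t))) for t in times if t >= t_break]
pre_slope = slope(*zip(*pre))
post_slope = slope(*zip(*post))
print(f"fitted pre-break slope  {pre_slope:.2f} (true {-a1})")
print(f"fitted post-break slope {post_slope:.2f} (true {-a2})")
```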

  7. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the

  8. Notes on the Statistical Power of the Binary State Speciation and Extinction (BiSSE) Model.

    Science.gov (United States)

    Gamisch, Alexander

    2016-01-01

    The Binary State Speciation and Extinction (BiSSE) method is one of the most popular tools for investigating the rates of diversification and character evolution. Yet, based on previous simulation studies, it is commonly held that the BiSSE method requires phylogenetic trees of fairly large sample sizes (>300 taxa) in order to distinguish between the different models of speciation, extinction, or transition rate asymmetry. Here, the power of the BiSSE method is reevaluated by simulating trees of both small and large sample sizes (30, 60, 90, and 300 taxa) under various asymmetry models and root state assumptions. Results show that the power of the BiSSE method can be much higher, also in trees of small sample size, for detecting differences in speciation rate asymmetry than anticipated earlier. This, however, is not a consequence of any conceptual or mathematical flaw in the method per se but rather of assumptions about the character state at the root of the simulated trees and thus the underlying macroevolutionary model, which led to biased results and conclusions in earlier power assessments. As such, these earlier simulation studies used to determine the power of BiSSE were not incorrect but biased, leading to an overestimation of type-II statistical error for detecting differences in speciation rate but not for extinction and transition rates. PMID:27486297

  9. Mathematical Power: Exploring Critical Pedagogy in Mathematics and Statistics

    Science.gov (United States)

    Lesser, Lawrence M.; Blake, Sally

    2007-01-01

    Though traditionally viewed as value-free, mathematics is actually one of the most powerful, yet underutilized, venues for working towards the goals of critical pedagogy--social, political and economic justice for all. This emerging awareness is due to how critical mathematics educators such as Frankenstein, Skovsmose and Gutstein have applied the…

  10. Near and Far from Equilibrium Power-Law Statistics

    CERN Document Server

    Biro, Tamas S; Biro, Gabor; Shen, Ke Ming

    2016-01-01

    We analyze the connection between $p_T$ and multiplicity distributions in a statistical framework. We connect the Tsallis parameters, $T$ and $q$, to physical properties like the average energy per particle and the second scaled factorial moment, $F_2 = \langle n(n-1) \rangle / \langle n \rangle^2$, measured in multiplicity distributions. Near and far from equilibrium scenarios with master equations for the probability of having $n$ particles, $P_n$, are reviewed based on hadronization transition rates, $\mu_n$, from $n$ to $n+1$ particles.

  11. Safety assessment of emergency power systems for nuclear power plants

    International Nuclear Information System (INIS)

    This publication is intended to assist the safety assessor within a regulatory body, or one working as a consultant, in assessing the safety of a given design of the emergency power systems (EPS) for a nuclear power plant. The present publication refers closely to the NUSS Safety Guide 50-SG-D7 (Rev. 1), Emergency Power Systems at Nuclear Power Plants. It covers therefore exactly the same technical subject as that Safety Guide. In view of its objective, however, it attempts to help in the evaluation of possible technical solutions which are intended to fulfill the safety requirements. Section 2 clarifies the scope further by giving an outline of the assessment steps in the licensing process. After a general outline of the assessment process in relation to the licensing of a nuclear power plant, the publication is divided into two parts. First, all safety issues are presented in the form of questions that have to be answered in order for the assessor to be confident of a safe design. The second part presents the same topics in tabulated form, listing the required documentation which the assessor has to consult and those international and national technical standards pertinent to the topics. An extensive reference list provides information on standards. 1 tab

  12. Thermohydraulic assessment of the RP-10 reactor core to determine the maximum power

    International Nuclear Information System (INIS)

    Thermohydraulic parameters of the RP-10 reactor core were assessed on the basis of the most thermally demanding channel (the hot channel). The maximum operating thermal power was determined, taking into account safety margins and a statistical treatment of the uncertainty factors.

  13. Statistical Power of Psychological Research: What Have We Gained in 20 Years?

    Science.gov (United States)

    Rossi, Joseph S.

    1990-01-01

    Calculated power for 6,155 statistical tests in 221 journal articles published in 1982 volumes of "Journal of Abnormal Psychology,""Journal of Consulting and Clinical Psychology," and "Journal of Personality and Social Psychology." Power to detect small, medium, and large effects was .17, .57, and .83, respectively. Concluded that power of…
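Power figures of this kind can be approximated analytically. The sketch below uses the common normal approximation for a two-sided, two-sample test, power ≈ Φ(d·√(n/2) − z_{1−α/2}), with Cohen's conventional small/medium/large effect sizes and an assumed per-group sample size of 50:

```python
import math

# Standard normal CDF via the error function.
def phi(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2)))

Z_975 = 1.959964  # z_{1 - alpha/2} for two-sided alpha = 0.05

# Normal approximation to the power of a two-sample test with per-group size n.
def power(d, n_per_group):
    return phi(d * math.sqrt(n_per_group / 2) - Z_975)

for label, d in [("small", 0.2), ("medium", 0.5), ("large", 0.8)]:
    print(f"{label:6s} (d={d}): power = {power(d, 50):.2f}")
```

Under these assumptions, power to detect a small effect is far below the conventional 0.80 target, which is the pattern such surveys of the literature repeatedly document.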

  14. The issue of statistical power for overall model fit in evaluating structural equation models

    Directory of Open Access Journals (Sweden)

    Richard HERMIDA

    2015-06-01

    Full Text Available Statistical power is an important concept for psychological research. However, examining the power of a structural equation model (SEM is rare in practice. This article provides an accessible review of the concept of statistical power for the Root Mean Square Error of Approximation (RMSEA index of overall model fit in structural equation modeling. By way of example, we examine the current state of power in the literature by reviewing studies in top Industrial-Organizational (I/O Psychology journals using SEMs. Results indicate that in many studies, power is very low, which implies acceptance of invalid models. Additionally, we examined methodological situations which may have an influence on statistical power of SEMs. Results showed that power varies significantly as a function of model type and whether or not the model is the main model for the study. Finally, results indicated that power is significantly related to model fit statistics used in evaluating SEMs. The results from this quantitative review imply that researchers should be more vigilant with respect to power in structural equation modeling. We therefore conclude by offering methodological best practices to increase confidence in the interpretation of structural equation modeling results with respect to statistical power issues.

  15. Estimating statistical power for open-enrollment group treatment trials.

    Science.gov (United States)

    Morgan-Lopez, Antonio A; Saavedra, Lissette M; Hien, Denise A; Fals-Stewart, William

    2011-01-01

    Modeling turnover in group membership has been identified as a key barrier contributing to a disconnect between the manner in which behavioral treatment is conducted (open-enrollment groups) and the designs of substance abuse treatment trials (closed-enrollment groups, individual therapy). Latent class pattern mixture models (LCPMMs) are emerging tools for modeling data from open-enrollment groups with membership turnover in recently proposed treatment trials. The current article illustrates an approach to conducting power analyses for open-enrollment designs based on the Monte Carlo simulation of LCPMM models using parameters derived from published data from a randomized controlled trial comparing Seeking Safety to a Community Care condition for women presenting with comorbid posttraumatic stress disorder and substance use disorders. The example addresses discrepancies between the analysis framework assumed in power analyses of many recently proposed open-enrollment trials and the proposed use of LCPMM for data analysis. PMID:20832971
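The general Monte Carlo recipe, simulate many datasets under an assumed effect and count how often the test reaches significance, can be sketched with a far simpler two-sample test than the LCPMMs used in the article (all parameter values below are illustrative assumptions):

```python
import math
import random
import statistics

random.seed(7)

# One simulated trial: two groups of size n, true mean difference `effect`,
# analyzed with a simple two-sample z-test at the 5% level.
def simulate_once(n, effect):
    a = [random.gauss(0.0, 1.0) for _ in range(n)]
    b = [random.gauss(effect, 1.0) for _ in range(n)]
    se = math.sqrt(statistics.variance(a) / n + statistics.variance(b) / n)
    z = (statistics.mean(b) - statistics.mean(a)) / se
    return abs(z) > 1.96

reps = 2000
power = sum(simulate_once(60, 0.5) for _ in range(reps)) / reps
print(f"estimated power: {power:.2f}")
```

In the article the same loop is run with a far richer data-generating model (open-enrollment groups with membership turnover, analyzed by LCPMM), but the logic of the power estimate is identical.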

  16. Power-law distributions in economics: a nonextensive statistical approach

    CERN Document Server

    Queiros, S M D; Tsallis, C; Queiros, Silvio M. Duarte; Anteneodo, Celia; Tsallis, Constantino

    2005-01-01

    The cornerstone of Boltzmann-Gibbs ($BG$) statistical mechanics is the Boltzmann-Gibbs-Jaynes-Shannon entropy $S_{BG} \equiv -k\int dx f(x)\ln f(x)$, where $k$ is a positive constant and $f(x)$ a probability density function. This theory has exhibited, over more than one century, great success in the treatment of systems where short spatio/temporal correlations dominate. There are, however, anomalous natural and artificial systems that violate the basic requirements for its applicability. Different physical entropies, other than the standard one, appear to be necessary in order to satisfactorily deal with such anomalies. One such entropy is $S_q \equiv k (1-\int dx [f(x)]^q)/(1-q)$ (with $S_1=S_{BG}$), where the entropic index $q$ is a real parameter. It has been proposed as the basis for a generalization, referred to as {\it nonextensive statistical mechanics}, of the $BG$ theory. $S_q$ shares with $S_{BG}$ four remarkable properties, namely {\it concavity} ($\forall q>0$), {\it Lesche-stability} ($\for...
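A quick numerical check of the discrete analogue of S_q (with k = 1, and a made-up probability vector) shows that it reduces to the Boltzmann-Gibbs entropy as q approaches 1:

```python
import math

# Discrete Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1), with k = 1.
# At q = 1 it is defined by its limit, the Boltzmann-Gibbs-Shannon entropy.
def s_q(p, q):
    if abs(q - 1.0) < 1e-12:
        return -sum(x * math.log(x) for x in p if x > 0)
    return (1.0 - sum(x ** q for x in p)) / (q - 1.0)

p = [0.5, 0.25, 0.125, 0.125]  # assumed example distribution
s_bg = s_q(p, 1.0)
for q in (0.5, 0.9, 0.999, 1.001, 2.0):
    print(f"q={q:>5}: S_q = {s_q(p, q):.4f}")
print(f"S_BG = {s_bg:.4f}")
```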

  17. Assessing Landslide Risk Areas Using Statistical Models and Land Cover

    Science.gov (United States)

    Kim, H. G.; Lee, D. K.; Park, C.; Ahn, Y.; Sung, S.; Park, J. H.

    2015-12-01

    Recently, damage due to landslides has increased in the Republic of Korea. Extreme weather events, such as typhoons and heavy rainfall related to climate change, are the main cause of the damage. Inje-gun, Gangwon-do, in particular suffered severe landslide damage in 2006 and 2007. In Inje-gun, 91% of the area is forest; therefore, many land covers related to human activities are adjacent to forest land. Thus, the establishment of adaptation plans for landslides was urgently needed. Landslide risk assessment can serve as good information for policy makers. The objective of this study was to assess landslide risk areas to support the establishment of adaptation plans that reduce landslide damage. Statistical distribution models (SDMs) were used to evaluate the probability of landslide occurrence. Various SDMs were used to make landslide probability maps that account for the uncertainty of the SDMs. The types of land cover were classified into five grades according to their vulnerability to landslides. The landslide probability maps were overlaid with the land cover map to calculate landslide risk. As a result of the overlay analysis, landslide risk areas were derived. Agricultural and transportation areas in particular showed high risk and large areas in the risk map. In conclusion, policy makers in Inje-gun should use the landslide risk map to establish adaptation plans effectively.

  18. Efficiency statistics at all times: Carnot limit at finite power.

    Science.gov (United States)

    Polettini, M; Verley, G; Esposito, M

    2015-02-01

    We derive the statistics of the efficiency under the assumption that thermodynamic fluxes fluctuate with normal law, parametrizing it in terms of time, macroscopic efficiency, and a coupling parameter ζ. It has a peculiar behavior: no moments, one sub-, and one super-Carnot maxima corresponding to reverse operating regimes (engine or pump), the most probable efficiency decreasing in time. The limit ζ→0 where the Carnot bound can be saturated gives rise to two extreme situations, one where the machine works at its macroscopic efficiency, with Carnot limit corresponding to no entropy production, and one where for a transient time scaling like 1/ζ microscopic fluctuations are enhanced in such a way that the most probable efficiency approaches the Carnot limit at finite entropy production.

  19. Statistical modeling and analysis of the influence of antenna polarization error on received power

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The problem of statistical modeling of antenna polarization error is studied and the statistical characteristics of antenna's received power are analyzed. A novel Stokes-vector-based method is presented to describe the conception of antenna's polarization purity. Statistical model of antenna's polarization error in polarization domain is then built up. When an antenna with polarization error of uniform distribution is illuminated by an arbitrary polarized incident field, the probability density of antenna's received power is derived analytically. Finally, a group of curves of deviation and standard deviation of received power are plotted numerically.
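A toy model, much simpler than the Stokes-vector formulation of the paper, already shows how a uniformly distributed polarization-angle error spreads the received power: for a linearly polarized wave the normalized received power is cos²θ, and its mean and standard deviation follow from Monte Carlo sampling (the maximum error below is an assumed value):

```python
import math
import random
import statistics

# Hypothetical toy model: polarization-angle error theta uniform on
# [-DELTA, DELTA] reduces normalized received power to cos^2(theta).
random.seed(42)
DELTA = math.radians(20)  # assumed maximum polarization error

samples = [math.cos(random.uniform(-DELTA, DELTA)) ** 2 for _ in range(100_000)]
mean_p = statistics.fmean(samples)
std_p = statistics.stdev(samples)
# Closed-form mean for comparison: E[cos^2 theta] = 1/2 + sin(2*DELTA)/(4*DELTA)
analytic = 0.5 + math.sin(2 * DELTA) / (4 * DELTA)
print(f"Monte Carlo mean {mean_p:.4f}, analytic {analytic:.4f}, std {std_p:.4f}")
```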

  20. Statistics

    Science.gov (United States)

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  1. Ground assessment methods for nuclear power plant

    International Nuclear Information System (INIS)

    It goes without saying that a nuclear power plant must be constructed on the most stable and safe ground, and a reliable assessment method is required for this purpose. The Ground Integrity Sub-committee of the Committee of Civil Engineering of Nuclear Power Plant started five working groups whose purpose is to systematize the assessment procedures, including geological survey, ground examination and construction design. The tasks of the working groups are to establish an assessment method for fault activity, standardize the rock classification method, standardize the assessment and presentation of ground properties, standardize test methods, and establish application standards for design and construction. Flow diagrams were established for the procedures of geological survey and for the investigation of fault activity and ground properties in areas where the nuclear reactor and important outdoor equipment are scheduled to be constructed. Further flow diagrams were established for applying the investigation results to the design and construction of the plant and for determining the liquefaction behavior of the ground. These systematized and standardized methods of investigation are expected to yield reliable data for the assessment of nuclear power plant construction sites and lead to safe construction and operation in the future. In addition, the execution of such systematized and detailed preliminary investigations for selecting the construction site of a nuclear power plant will contribute much to obtaining nationwide understanding and trust in the project. (Ishimitsu, A.)

  2. An assessment of recently published gene expression data analyses: reporting experimental design and statistical factors

    Directory of Open Access Journals (Sweden)

    Azuaje Francisco

    2006-06-01

    Full Text Available Abstract Background The analysis of large-scale gene expression data is a fundamental approach to functional genomics and the identification of potential drug targets. Results derived from such studies cannot be trusted unless they are adequately designed and reported. The purpose of this study is to assess current practices on the reporting of experimental design and statistical analyses in gene expression-based studies. Methods We reviewed hundreds of MEDLINE-indexed papers involving gene expression data analysis, which were published between 2003 and 2005. These papers were examined on the basis of their reporting of several factors, such as sample size, statistical power and software availability. Results Among the examined papers, we concentrated on 293 papers consisting of applications and new methodologies. These papers did not report approaches to sample size and statistical power estimation. Explicit statements on data transformation and descriptions of the normalisation techniques applied prior to data analyses (e.g. classification were not reported in 57 (37.5% and 104 (68.4% of the methodology papers respectively. With regard to papers presenting biomedical-relevant applications, 41(29.1 % of these papers did not report on data normalisation and 83 (58.9% did not describe the normalisation technique applied. Clustering-based analysis, the t-test and ANOVA represent the most widely applied techniques in microarray data analysis. But remarkably, only 5 (3.5% of the application papers included statements or references to assumption about variance homogeneity for the application of the t-test and ANOVA. There is still a need to promote the reporting of software packages applied or their availability. Conclusion Recently-published gene expression data analysis studies may lack key information required for properly assessing their design quality and potential impact. There is a need for more rigorous reporting of important experimental

  3. Statistical study of high energy radiation from rotation-powered pulsars

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Based on our self-consistent outer gap model for high energy emission from rotation-powered pulsars, we study the statistical properties of X-ray and γ-ray emission from these pulsars; other statistical properties related to γ-ray pulsars in our Galaxy and nearby galaxies (e.g. the diffuse γ-ray background and unidentified γ-ray point sources) are also considered.

  4. Violation of statistical isotropy and homogeneity in the 21-cm power spectrum

    CERN Document Server

    Shiraishi, Maresuke; Kamionkowski, Marc; Raccanelli, Alvise

    2016-01-01

    Most inflationary models predict primordial perturbations to be statistically isotropic and homogeneous. Cosmic-Microwave-Background (CMB) observations, however, indicate a possible departure from statistical isotropy in the form of a dipolar power modulation at large angular scales. Alternative models of inflation, beyond the simplest single-field slow-roll models, can generate a small power asymmetry, consistent with these observations. Observations of clustering of quasars show, however, agreement with statistical isotropy at much smaller angular scales. Here we propose to use off-diagonal components of the angular power spectrum of the 21-cm fluctuations during the dark ages to test this power asymmetry. We forecast results for the planned SKA radio array, a future radio array, and the cosmic-variance-limited case as a theoretical proof of principle. Our results show that the 21-cm-line power spectrum will enable access to information at very small scales and at different redshift slices, thus improving u...

  5. A Web Site that Provides Resources for Assessing Students' Statistical Literacy, Reasoning and Thinking

    Science.gov (United States)

    Garfield, Joan; delMas, Robert

    2010-01-01

    The Assessment Resource Tools for Improving Statistical Thinking (ARTIST) Web site was developed to provide high-quality assessment resources for faculty who teach statistics at the tertiary level but resources are also useful to statistics teachers at the secondary level. This article describes some of the numerous ARTIST resources and suggests…

  6. The Statistical Power of the Cluster Randomized Block Design with Matched Pairs--A Simulation Study

    Science.gov (United States)

    Dong, Nianbo; Lipsey, Mark

    2010-01-01

    This study uses simulation techniques to examine the statistical power of the group-randomized design and the matched-pair (MP) randomized block design under various parameter combinations. Both nearest neighbor matching and random matching are used for the MP design. The power of each design for any parameter combination was calculated from…

  7. Effects of A Simulated Power Cut in AMS on Milk Yield Valued by Statistics Model

    Directory of Open Access Journals (Sweden)

    Anja Gräff

    2015-12-01

    Full Text Available A statistical model was developed in order to determine the effects of a simulated power cut in an Automatic Milking System on the milk output. Measurable and relevant factors, such as power cuts, milk yield, lactation days, average two-day digestion and rumination, and time, were considered in the calculation tool.

  8. On the power for linkage detection using a test based on scan statistics.

    Science.gov (United States)

    Hernández, Sonia; Siegmund, David O; de Gunst, Mathisca

    2005-04-01

    We analyze some aspects of scan statistics, which have been proposed to help for the detection of weak signals in genetic linkage analysis. We derive approximate expressions for the power of a test based on moving averages of the identity by descent allele sharing proportions for pairs of relatives at several contiguous markers. We confirm these approximate formulae by simulation. The results show that when there is a single trait-locus on a chromosome, the test based on the scan statistic is slightly less powerful than that based on the customary allele sharing statistic. On the other hand, if two genes having a moderate effect on a trait lie close to each other on the same chromosome, scan statistics improve power to detect linkage. PMID:15772104
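The core of the scan statistic is a moving average of allele-sharing proportions across contiguous markers, with the maximum window average taken as the test statistic. A minimal sketch with made-up sharing values:

```python
# Scan statistic: maximum moving average over windows of `window` markers.
def scan_statistic(sharing, window):
    return max(
        sum(sharing[i:i + window]) / window
        for i in range(len(sharing) - window + 1)
    )

# Hypothetical identity-by-descent sharing proportions for 10 markers;
# excess sharing (above the null mean of 0.5) is concentrated at markers 4-6.
sharing = [0.50, 0.48, 0.52, 0.55, 0.68, 0.71, 0.66, 0.53, 0.49, 0.51]
print(f"single-marker max: {max(sharing):.2f}")
print(f"scan (window=3):   {scan_statistic(sharing, 3):.3f}")
```

Averaging over a window trades a slight loss of power at a single trait locus for a gain when the linkage signal is spread over nearby loci, which is the trade-off the paper quantifies.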

  9. A critical discussion of null hypothesis significance testing and statistical power analysis within psychological research

    DEFF Research Database (Denmark)

    Jones, Allan; Sommerlund, Bo

    2007-01-01

    The uses of null hypothesis significance testing (NHST) and statistical power analysis within psychological research are critically discussed. The article looks at the problems of relying solely on NHST when dealing with small and large sample sizes. The use of power analysis in estimating the potential error introduced by small and large samples is advocated. Power analysis is not recommended as a replacement for NHST but as an additional source of information about the phenomena under investigation. Moreover, the importance of conceptual analysis in relation to statistical analysis of hypothesis...

  10. Statistical-Based Joint Power Control for Wireless Ad Hoc CDMA Networks

    Institute of Scientific and Technical Information of China (English)

    ZHANG Shu; RONG Mongtian; CHEN Bo

    2005-01-01

    Current power control algorithms for CDMA-based ad hoc networks rely on SIR and interference measurements, which are based on historical information. However, an important statistical property of traffic in today's and future networks is burstiness. As a consequence, the interference at a given receiving node may fluctuate dramatically, so the convergence of power control is slow and performance degrades. This paper presents a joint power control model: for a receiving node, all transmitting nodes assigned to the same time slot adjust their transmitter power based on current information, taking into account the power adjustments of the other transmitting nodes. Based on the joint power control model, this paper proposes a statistical-based power control algorithm through which the interference is estimated more exactly. The simulation results indicate that the proposed power control algorithm outperforms the old algorithm.
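The flavor of iterative power control can be conveyed with the classical distributed iteration (Foschini-Miljanic style), in which each transmitter scales its power by the ratio of target to measured SIR; this is a standard textbook scheme, not the statistical-based algorithm proposed in the paper, and the link gains and noise below are assumed values:

```python
# Assumed 3-link network: G[i][j] is the gain from transmitter j to receiver i.
G = [[1.0, 0.1, 0.2],
     [0.1, 1.0, 0.1],
     [0.2, 0.1, 1.0]]
NOISE, TARGET = 0.1, 2.0  # receiver noise power and target SIR (assumed)

def sir(p, i):
    interference = sum(G[i][j] * p[j] for j in range(len(p)) if j != i)
    return G[i][i] * p[i] / (interference + NOISE)

p = [1.0, 1.0, 1.0]
for _ in range(50):
    # Each node scales its power by target/measured SIR (synchronous update).
    p = [p[i] * TARGET / sir(p, i) for i in range(len(p))]

print("powers:", [round(x, 3) for x in p])
print("SIRs:  ", [round(sir(p, i), 3) for i in range(len(p))])
```

When the target is feasible, the iteration converges to the minimal power vector meeting the target SIR on every link; estimating interference from stale measurements, the problem the paper addresses, slows exactly this kind of convergence.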

  11. Nuclear power plant security assessment technical manual.

    Energy Technology Data Exchange (ETDEWEB)

    O'Connor, Sharon L.; Whitehead, Donnie Wayne; Potter, Claude S., III

    2007-09-01

    This report (Nuclear Power Plant Security Assessment Technical Manual) is a revision to NUREG/CR-1345 (Nuclear Power Plant Design Concepts for Sabotage Protection) that was published in January 1981. It provides conceptual and specific technical guidance for U.S. Nuclear Regulatory Commission nuclear power plant design certification and combined operating license applicants as they: (1) develop the layout of a facility (i.e., how buildings are arranged on the site property and how they are arranged internally) to enhance protection against sabotage and facilitate the use of physical security features; (2) design the physical protection system to be used at the facility; and (3) analyze the effectiveness of the PPS against the design basis threat. It should be used as a technical manual in conjunction with the 'Nuclear Power Plant Security Assessment Format and Content Guide'. The opportunity to optimize physical protection in the design of a nuclear power plant is obtained when an applicant utilizes both documents when performing a security assessment. This document provides a set of best practices that incorporates knowledge gained from more than 30 years of physical protection system design and evaluation activities at Sandia National Laboratories and insights derived from U.S. Nuclear Regulatory Commission technical staff into a manual that describes a development and analysis process of physical protection systems suitable for future nuclear power plants. In addition, selected security system technologies that may be used in a physical protection system are discussed. The scope of this document is limited to the identification of a set of best practices associated with the design and evaluation of physical security at future nuclear power plants in general. As such, it does not provide specific recommendations for the design and evaluation of physical security for any specific reactor design. These best practices should be applicable to the design and

  12. Assessing Knowledge Structures in a Constructive Statistical Learning Environment

    OpenAIRE

    Verkoeijen, Peter; Imbos, Tj.; Van de Wiel, Margje; Berger, M.P.F.; Schmidt, Henk

    2002-01-01

    In this report, the method of free recall is put forward as a tool to evaluate a prototypical statistical learning environment. A number of students from the faculty of Health Sciences, Maastricht University, the Netherlands, were required to write down whatever they could remember of a statistics course in which they had participated. By means of examining the free recall protocols of the participants, insight can be obtained into the mental representations they had formed with r...

  13. On detection and assessment of statistical significance of Genomic Islands

    Directory of Open Access Journals (Sweden)

    Chaudhuri Probal

    2008-04-01

    Full Text Available Abstract Background Many of the available methods for detecting Genomic Islands (GIs in prokaryotic genomes use markers such as transposons, proximal tRNAs, flanking repeats etc., or they use other supervised techniques requiring training datasets. Most of these methods are primarily based on the biases in GC content or codon and amino acid usage of the islands. However, these methods either do not use any formal statistical test of significance or use statistical tests for which the critical values and the P-values are not adequately justified. We propose a method, which is unsupervised in nature and uses Monte-Carlo statistical tests based on randomly selected segments of a chromosome. Such tests are supported by precise statistical distribution theory, and consequently, the resulting P-values are quite reliable for making the decision. Results Our algorithm (named Design-Island, an acronym for Detection of Statistically Significant Genomic Island runs in two phases. Some 'putative GIs' are identified in the first phase, and those are refined into smaller segments containing horizontally acquired genes in the refinement phase. This method is applied to Salmonella typhi CT18 genome leading to the discovery of several new pathogenicity, antibiotic resistance and metabolic islands that were missed by earlier methods. Many of these islands contain mobile genetic elements like phage-mediated genes, transposons, integrase and IS elements confirming their horizontal acquirement. Conclusion The proposed method is based on statistical tests supported by precise distribution theory and reliable P-values along with a technique for visualizing statistically significant islands. The performance of our method is better than many other well known methods in terms of their sensitivity and accuracy, and in terms of specificity, it is comparable to other methods.
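The Monte Carlo idea behind such tests, comparing a candidate segment against randomly placed segments of the same length to obtain an empirical P-value, can be sketched in miniature on a synthetic genome (GC content stands in for the richer composition measures used by Design-Island):

```python
import random

# Hypothetical miniature: build a synthetic genome and implant a GC-rich
# "island" as a stand-in for a horizontally acquired region.
random.seed(3)
genome = "".join(random.choice("ACGT") for _ in range(50_000))
island = "".join(random.choice("GGCCAT") for _ in range(2_000))  # GC-rich
genome = genome[:10_000] + island + genome[12_000:]

def gc(seq):
    return (seq.count("G") + seq.count("C")) / len(seq)

# Empirical P-value: fraction of randomly placed same-length segments whose
# GC content is at least that of the candidate segment.
def empirical_p(genome, start, length, n_random=1000):
    observed = gc(genome[start:start + length])
    hits = sum(
        gc(genome[s:s + length]) >= observed
        for s in (random.randrange(len(genome) - length) for _ in range(n_random))
    )
    return (hits + 1) / (n_random + 1)

print("island P-value:    ", empirical_p(genome, 10_000, 2_000))
print("background P-value:", empirical_p(genome, 30_000, 2_000))
```

Because the reference distribution is built from the genome itself, the resulting P-values have a precise Monte Carlo interpretation, which is the property the abstract emphasizes.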

  14. Accuracy of Estimates and Statistical Power for Testing Mediation in Latent Growth Curve Modeling

    Science.gov (United States)

    Cheong, JeeWon

    2016-01-01

    The latent growth curve modeling (LGCM) approach has been increasingly utilized to investigate longitudinal mediation. However, little is known about the accuracy of the estimates and statistical power when mediation is evaluated in the LGCM framework. A simulation study was conducted to address these issues under various conditions including sample size, effect size of mediated effect, number of measurement occasions, and R2 of measured variables. In general, the results showed that relatively large samples were needed to accurately estimate the mediated effects and to have adequate statistical power, when testing mediation in the LGCM framework. Guidelines for designing studies to examine longitudinal mediation and ways to improve the accuracy of the estimates and statistical power were discussed.
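The power questions this abstract raises are typically answered by simulation. The sketch below is not the LGCM framework; it illustrates the same workflow on the simplest mediation model X -> M -> Y: generate data under assumed path coefficients, test the mediated effect a*b with a Sobel z-test, and count rejections. All coefficients and sample sizes are made-up values.

```python
import random
from math import sqrt
from statistics import NormalDist

def slope_and_se(x, y):
    """OLS slope of y on x and its standard error (simple regression)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
    resid = [yi - my - b * (xi - mx) for xi, yi in zip(x, y)]
    se = sqrt(sum(r * r for r in resid) / (n - 2) / sxx)
    return b, se

def mediation_power(a, b, n, reps=400, alpha=0.05, seed=0):
    """Monte Carlo power of the Sobel z-test for the mediated effect a*b
    in X -> M -> Y with standard-normal errors (a deliberately simple
    stand-in for the longitudinal LGCM setting)."""
    rng = random.Random(seed)
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)
    hits = 0
    for _ in range(reps):
        x = [rng.gauss(0, 1) for _ in range(n)]
        m = [a * xi + rng.gauss(0, 1) for xi in x]
        y = [b * mi + rng.gauss(0, 1) for mi in m]
        a_hat, se_a = slope_and_se(x, m)
        b_hat, se_b = slope_and_se(m, y)
        sobel_se = sqrt(a_hat**2 * se_b**2 + b_hat**2 * se_a**2)
        hits += abs(a_hat * b_hat) / sobel_se > z_crit
    return hits / reps

p100 = mediation_power(0.4, 0.4, 100)
p400 = mediation_power(0.4, 0.4, 400)
print(p100, p400)
```

Quadrupling the sample size pushes the power of this test from roughly 0.8 toward 1, in line with the abstract's point that relatively large samples are needed for adequate power.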

  15. New statistic for financial return distributions: power-law or exponential?

    CERN Document Server

    Pisarenko, V F

    2004-01-01

    We introduce a new statistical tool (the TP-statistic and TE-statistic) designed specifically to compare the behavior of the sample tail of distributions with power-law and exponential tails as a function of the lower threshold u. One important property of these statistics is that they converge to zero for power-law or exponential tails, respectively, regardless of the value of the exponent or of the form parameter. This is particularly useful for testing the structure of a distribution (power law or not, exponential or not) independently of the possibility of quantifying the values of the parameters. We apply these statistics to the distribution of returns of one century of daily data for the Dow Jones Industrial Average and over one year of 5-minute data of the Nasdaq Composite index. Our analysis confirms previous work showing the tendency for the tails to resemble more and more a power law for the highest quantiles, but we can detect clear deviations that suggest that the structure of the tails of the ...
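The TP/TE statistics are specific constructions from the paper, but the property they exploit is standard: above a threshold u, the mean excess E[X - u | X > u] is flat in u for an exponential tail and grows linearly for a power-law (Pareto) tail. A small simulation illustrating that diagnostic (distributional parameters chosen arbitrarily):

```python
import random

def mean_excess(sample, u):
    """Estimate E[X - u | X > u] from a sample."""
    exceed = [x - u for x in sample if x > u]
    return sum(exceed) / len(exceed)

rng = random.Random(42)
n = 100_000
exp_sample = [rng.expovariate(1.0) for _ in range(n)]    # exponential tail
par_sample = [rng.paretovariate(3.0) for _ in range(n)]  # power-law tail, alpha = 3

# exponential: mean excess is flat (memorylessness);
# Pareto: it grows like u / (alpha - 1)
me_exp = {u: mean_excess(exp_sample, u) for u in (1.0, 2.0, 4.0)}
me_par = {u: mean_excess(par_sample, u) for u in (1.0, 2.0, 4.0)}
print(me_exp)
print(me_par)
```

Plotting the estimated mean excess against the threshold u is a quick visual check before applying more formal tail statistics of the TP/TE kind.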

  16. Geotechnical assessments of upgrading power transmission lines

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Andrew [Coffey Geotechnics Ltd., Harrogate (United Kingdom)

    2012-11-01

    One of the consequences of increasing demand for energy is a corresponding requirement for increased energy distribution. This trend is likely to be magnified by the current tendency to generate power in locations remote from centres of population. New power transmission routes are expensive and awkward to develop, and there are therefore benefits to be gained by upgrading existing routes. However, this in turn raises problems of a different nature. The re-use of any structure must necessarily imply the acceptance of unknowns. The upgrading of transmission lines is no exception to this, particularly when assessing foundations, which in their nature are not visible. A risk-based approach is therefore used. This paper describes some of the geotechnical aspects of the assessment of electric power transmission lines for upgrading. It briefly describes the background, then discusses some of the problems encountered and the methods used to address them. These methods are based mainly on information obtained from desk studies and walkover surveys, with a limited amount of intrusive investigation. (orig.)

  17. Statistical power analysis a simple and general model for traditional and modern hypothesis tests

    CERN Document Server

    Murphy, Kevin R; Wolach, Allen

    2014-01-01

    Noted for its accessible approach, this text applies the latest approaches to power analysis to both null-hypothesis and minimum-effect testing using the same basic unified model. Through the use of a few simple procedures and examples, the authors show readers with little expertise in statistical analysis how to obtain the values needed to carry out the power analysis for their research. Illustrations of how these analyses work and how they can be used to choose the appropriate criterion for defining statistically significant outcomes are sprinkled throughout. The book presents a simple and g
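As a taste of what such a power analysis involves, here is a normal-approximation calculation of the power of a two-sided one-sample test for a standardized effect size d (the t-distribution refinement a full treatment would use is omitted for brevity):

```python
from math import sqrt
from statistics import NormalDist

def power_one_sample(d, n, alpha=0.05):
    """Approximate power of a two-sided one-sample z-test for a
    standardized effect size d with n observations."""
    nd = NormalDist()
    z_crit = nd.inv_cdf(1 - alpha / 2)
    shift = d * sqrt(n)   # noncentrality of the test statistic
    return (1 - nd.cdf(z_crit - shift)) + nd.cdf(-z_crit - shift)

p32 = power_one_sample(0.5, 32)   # "medium" effect, n = 32
print(round(p32, 3))
```

For d = 0.5 and n = 32 this comes out near 0.8, the conventional power target; doubling n pushes it well above that.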

  18. Assessing Knowledge Structures in a Constructive Statistical Learning Environment

    NARCIS (Netherlands)

    P.P.J.L. Verkoeijen (Peter); Tj. Imbos; M.W.J. van de Wiel (Margje); M.P.F. Berger; H.G. Schmidt (Henk)

    2002-01-01

    textabstractIn this report, the method of free recall is put forward as a tool to evaluate a prototypical statistical learning environment. A number of students from the faculty of Health Sciences, Maastricht University, the Netherlands, were required to write down whatever they could remember of a

  19. Using Pre-Statistical Analysis to Streamline Monitoring Assessments

    Energy Technology Data Exchange (ETDEWEB)

    Reed, J.K.

    1999-10-20

    A variety of statistical methods exist to aid evaluation of groundwater quality and subsequent decision making in regulatory programs. These methods are applied because of large temporal and spatial extrapolations commonly applied to these data. In short, statistical conclusions often serve as a surrogate for knowledge. However, facilities with mature monitoring programs that have generated abundant data have inherently less uncertainty because of the sheer quantity of analytical results. In these cases, statistical tests can be less important, and 'expert' data analysis should assume an important screening role. The WSRC Environmental Protection Department, working with the General Separations Area BSRI Environmental Restoration project team, has developed a method for an Integrated Hydrogeological Analysis (IHA) of historical water quality data from the F and H Seepage Basins groundwater remediation project. The IHA combines common-sense analytical techniques and a GIS presentation that force direct interactive evaluation of the data. The IHA can perform multiple data analysis tasks required by the RCRA permit. These include: (1) Development of a groundwater quality baseline prior to remediation startup, (2) Targeting of constituents for removal from RCRA GWPS, (3) Targeting of constituents for removal from UIC permit, (4) Targeting of constituents for reduced, (5) Targeting of monitoring wells not producing representative samples, (6) Reduction in statistical evaluation, and (7) Identification of contamination from other facilities.

  20. Statistical tests, P values, confidence intervals, and power: a guide to misinterpretations.

    Science.gov (United States)

    Greenland, Sander; Senn, Stephen J; Rothman, Kenneth J; Carlin, John B; Poole, Charles; Goodman, Steven N; Altman, Douglas G

    2016-04-01

    Misinterpretation and abuse of statistical tests, confidence intervals, and statistical power have been decried for decades, yet remain rampant. A key problem is that there are no interpretations of these concepts that are at once simple, intuitive, correct, and foolproof. Instead, correct use and interpretation of these statistics requires an attention to detail which seems to tax the patience of working scientists. This high cognitive demand has led to an epidemic of shortcut definitions and interpretations that are simply wrong, sometimes disastrously so, and yet these misinterpretations dominate much of the scientific literature. In light of this problem, we provide definitions and a discussion of basic statistics that are more general and critical than typically found in traditional introductory expositions. Our goal is to provide a resource for instructors, researchers, and consumers of statistics whose knowledge of statistical theory and technique may be limited but who wish to avoid and spot misinterpretations. We emphasize how violation of often unstated analysis protocols (such as selecting analyses for presentation based on the P values they produce) can lead to small P values even if the declared test hypothesis is correct, and can lead to large P values even if that hypothesis is incorrect. We then provide an explanatory list of 25 misinterpretations of P values, confidence intervals, and power. We conclude with guidelines for improving statistical interpretation and reporting. PMID:27209009
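One of the abstract's points, that selecting analyses by their P values produces small P values even when the null is true, is easy to demonstrate by simulation. The sketch below compares the false-positive rate of a single pre-specified z-test with that of reporting the smallest of 20 null tests (all settings arbitrary):

```python
import random
from statistics import NormalDist

rng = random.Random(7)
nd = NormalDist()

def null_pvalue(n=30):
    """Two-sided z-test P-value for data drawn from the NULL (mean 0)."""
    xs = [rng.gauss(0, 1) for _ in range(n)]
    z = (sum(xs) / n) * n ** 0.5
    return 2 * (1 - nd.cdf(abs(z)))

reps = 2000
# one pre-specified test per dataset: false-positive rate near alpha
single = sum(null_pvalue() < 0.05 for _ in range(reps)) / reps
# report the best of 20 analyses per dataset: rate near 1 - 0.95**20
selected = sum(min(null_pvalue() for _ in range(20)) < 0.05
               for _ in range(reps)) / reps
print(single, selected)
```

The single pre-specified test rejects at close to the nominal 5% rate, while picking the smallest of 20 P values rejects at roughly 1 - 0.95^20 ≈ 0.64, despite every null hypothesis being true.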

  1. Computer-aided assessment in statistics: the CAMPUS project

    OpenAIRE

    Hunt, Neville

    1998-01-01

    The relentless drive for 'efficiency' in higher education, and the consequent increase in workloads, has given university teachers a compelling incentive to investigate alternative forms of assessment. Some forms of assessment with a clear educational value can no longer be entertained because of the burden placed on the teacher. An added concern is plagiarism, which anecdotal evidence would suggest is on the increase yet which is difficult to detect in large modules with more than one assess...

  2. Mixed Effects Models for Resampled Network Statistics Improves Statistical Power to Find Differences in Multi-Subject Functional Connectivity.

    Science.gov (United States)

    Narayan, Manjari; Allen, Genevera I

    2016-01-01

    Many complex brain disorders, such as autism spectrum disorders, exhibit a wide range of symptoms and disability. To understand how brain communication is impaired in such conditions, functional connectivity studies seek to understand individual differences in brain network structure in terms of covariates that measure symptom severity. In practice, however, functional connectivity is not observed but estimated from complex and noisy neural activity measurements. Imperfect subject network estimates can compromise subsequent efforts to detect covariate effects on network structure. We address this problem in the case of Gaussian graphical models of functional connectivity, by proposing novel two-level models that treat both subject level networks and population level covariate effects as unknown parameters. To account for imperfectly estimated subject level networks when fitting these models, we propose two related approaches: R^2, based on resampling and random effects test statistics, and R^3, which additionally employs random adaptive penalization. Simulation studies using realistic graph structures reveal that R^2 and R^3 have superior statistical power to detect covariate effects compared to existing approaches, particularly when the number of within subject observations is comparable to the size of subject networks. Using our novel models and methods to study parts of the ABIDE dataset, we find evidence of hypoconnectivity associated with symptom severity in autism spectrum disorders, in frontoparietal and limbic systems as well as in anterior and posterior cingulate cortices. PMID:27147940

  3. Mixed Effects Models for Resampled Network Statistics Improves Statistical Power to Find Differences in Multi-Subject Functional Connectivity

    Directory of Open Access Journals (Sweden)

    Manjari eNarayan

    2016-04-01

    Full Text Available Many complex brain disorders such as Autism Spectrum Disorders exhibit a wide range of symptoms and disability. To understand how brain communication is impaired in such conditions, functional connectivity studies seek to understand individual differences in brain network structure in terms of covariates that measure symptom severity. In practice, however, functional connectivity is not observed but estimated from complex and noisy neural activity measurements. Imperfect subject network estimates can compromise subsequent efforts to detect covariate effects on network structure. We address this problem in the case of Gaussian graphical models of functional connectivity, by proposing novel two-level models that treat both subject level networks and population level covariate effects as unknown parameters. To account for imperfectly estimated subject level networks when fitting these models, we propose two related approaches --- R^2 based on resampling and random effects test statistics, and R^3 that additionally employs random adaptive penalization. Simulation studies using realistic graph structures reveal that R^2 and R^3 have superior statistical power to detect covariate effects compared to existing approaches, particularly when the number of within subject observations is comparable to the size of subject networks. Using our novel models and methods to study parts of the ABIDE dataset, we find evidence of hypoconnectivity associated with symptom severity in Autism Spectrum Disorders, in frontoparietal and limbic systems as well as in anterior and posterior cingulate cortices.

  4. A Statistical Model for Uplink Intercell Interference with Power Adaptation and Greedy Scheduling

    KAUST Repository

    Tabassum, Hina

    2012-10-03

    This paper deals with the statistical modeling of uplink inter-cell interference (ICI) considering greedy scheduling with power adaptation based on channel conditions. The derived model is implicitly generalized for any kind of shadowing and fading environments. More precisely, we develop a generic model for the distribution of ICI based on the locations of the allocated users and their transmit powers. The derived model is utilized to evaluate important network performance metrics such as ergodic capacity, average fairness and average power preservation numerically. Monte-Carlo simulation details are included to support the analysis and show the accuracy of the derived expressions. Consistent with the literature, we show that greedy scheduling with power adaptation reduces the ICI and the average power consumption of users, and enhances the average fairness among users, compared to the case without power adaptation. © 2012 IEEE.

  5. Jacobian integration method increases the statistical power to measure gray matter atrophy in multiple sclerosis

    Directory of Open Access Journals (Sweden)

    Kunio Nakamura

    2014-01-01

    Full Text Available Gray matter atrophy provides important insights into neurodegeneration in multiple sclerosis (MS and can be used as a marker of neuroprotection in clinical trials. Jacobian integration is a method for measuring volume change that uses integration of the local Jacobian determinants of the nonlinear deformation field registering two images, and is a promising tool for measuring gray matter atrophy. Our main objective was to compare the statistical power of the Jacobian integration method to commonly used methods in terms of the sample size required to detect a treatment effect on gray matter atrophy. We used multi-center longitudinal data from relapsing–remitting MS patients and evaluated combinations of cross-sectional and longitudinal pre-processing with SIENAX/FSL, SPM, and FreeSurfer, as well as the Jacobian integration method. The Jacobian integration method outperformed these other commonly used methods, reducing the required sample size by a factor of 4–5. The results demonstrate the advantage of using the Jacobian integration method to assess neuroprotection in MS clinical trials.
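The reported factor 4-5 reduction in required sample size follows directly from the standard two-arm sample-size formula, in which n scales with the variance of the outcome measure. A sketch with hypothetical atrophy numbers (the effect size and SDs below are invented for illustration, not taken from the study):

```python
from statistics import NormalDist

def n_per_arm(delta, sigma, alpha=0.05, power=0.80):
    """Normal-approximation sample size per arm to detect a mean
    difference delta with outcome standard deviation sigma."""
    nd = NormalDist()
    z = nd.inv_cdf(1 - alpha / 2) + nd.inv_cdf(power)
    return 2 * (z * sigma / delta) ** 2

# invented numbers: 0.4% treatment effect on annual atrophy, measured with
# SD 1.2% by a noisy pipeline vs SD 0.55% by a more precise one
n_noisy = n_per_arm(0.4, 1.2)
n_precise = n_per_arm(0.4, 0.55)
print(round(n_noisy), round(n_precise), round(n_noisy / n_precise, 2))
```

Because n is proportional to sigma squared, halving the measurement noise cuts the required sample size by a factor of four; the ratio here is (1.2/0.55)^2 ≈ 4.76, the same order as the study reports.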

  6. JRC Statistical Assessment of the 2015 ICT Development Index

    OpenAIRE

    SAISANA Michaela; DOMINGUEZ TORREIRO MARCOS

    2015-01-01

    Since 2009, the International Telecommunication Union (ITU) has been publishing its annual ICT Development Index (IDI), which benchmarks countries’ performance with regard to ICT infrastructure, use and skills. The JRC analysis, conducted at ITU’s invitation, suggests that the conceptualized three-level structure of the 2015 IDI is statistically sound in terms of coherence and balance, with the overall index as well as the three sub-indices – on ICT access, use and skills – being driven ...

  7. Statistical and RBF NN models : providing forecasts and risk assessment

    OpenAIRE

    Marček, Milan

    2009-01-01

    Forecast accuracy of economic and financial processes is a popular measure for quantifying the risk in decision making. In this paper, we develop forecasting models based on statistical (stochastic) methods, sometimes called hard computing, and on a soft method using granular computing. We consider the accuracy of forecasting models as a measure for risk evaluation. It is found that the risk estimation process based on soft methods is simplified and less critical to the question w...

  8. Assessing the South African Brain Drain, a Statistical Comparison

    OpenAIRE

    Jean-Baptiste Meyer; Mercy Brown; David Kaplan

    2000-01-01

    For several decades the analysis of the so-called brain drain has been hampered by measurement problems. It is now recognised that the official figures significantly underestimate the extent of the brain drain phenomenon and its increase since the political changes in the mid-1990's. This paper, using data from various reliable sources, provides new statistical evidence on the size of the brain drain from South Africa. It compares two methods used to arrive at a more realistic picture of the ...

  9. The power of alternative assessments (AAs)

    Institute of Scientific and Technical Information of China (English)

    张千茜

    2013-01-01

    This article starts by discussing the potential disadvantages of traditional assessment for young English as a Second Language (ESL) learners within the American public school education system. In response to such disadvantages, researchers' calls for the implementation of alternative assessments (AAs) are introduced, along with the various benefits of AAs. However, the current mainstream education policy in the US, namely the No Child Left Behind (NCLB) policy, is still largely based on traditional ways of testing, making policy-oriented implementation of AAs on a large scale remarkably difficult. After careful analysis, the author points out several implications concerning how, under the existing NCLB policy, practitioners can effectively accommodate young ESL learners by applying the power of AAs.

  10. Using DEWIS and R for Multi-Staged Statistics e-Assessments

    Science.gov (United States)

    Gwynllyw, D. Rhys; Weir, Iain S.; Henderson, Karen L.

    2016-01-01

    We demonstrate how the DEWIS e-Assessment system may use embedded R code to facilitate the assessment of students' ability to perform involved statistical analyses. The R code has been written to emulate SPSS output and thus the statistical results for each bespoke data set can be generated efficiently and accurately using standard R routines.…

  11. Accuracy of Estimates and Statistical Power for Testing Mediation in Latent Growth Curve Modeling

    Science.gov (United States)

    Cheong, JeeWon

    2011-01-01

    The latent growth curve modeling (LGCM) approach has been increasingly utilized to investigate longitudinal mediation. However, little is known about the accuracy of the estimates and statistical power when mediation is evaluated in the LGCM framework. A simulation study was conducted to address these issues under various conditions including…

  12. Comparison of Three Common Experimental Designs to Improve Statistical Power When Data Violate Parametric Assumptions.

    Science.gov (United States)

    Porter, Andrew C.; McSweeney, Maryellen

    A Monte Carlo technique was used to investigate the small sample goodness of fit and statistical power of several nonparametric tests and their parametric analogues when applied to data which violate parametric assumptions. The motivation was to facilitate choice among three designs, simple random assignment with and without a concomitant variable…

  13. [Effect sizes, statistical power and sample sizes in "the Japanese Journal of Psychology"].

    Science.gov (United States)

    Suzukawa, Yumi; Toyoda, Hideki

    2012-04-01

    This study analyzed the statistical power of research studies published in the "Japanese Journal of Psychology" in 2008 and 2009. Sample effect sizes and sample statistical powers were calculated for each statistical test and analyzed with respect to the analytical methods and the fields of the studies. The results show that in fields such as perception, cognition or learning, the effect sizes were relatively large, although the sample sizes were small. At the same time, because of the small sample sizes, some meaningful effects could not be detected. In the other fields, because of the large sample sizes, meaningless effects could be detected. This implies that researchers who could not get large enough effect sizes would use larger samples to obtain significant results.
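The interplay the abstract describes, sizeable effects that go undetected in small samples and trivial effects that reach significance in large ones, can be made concrete with a standardized effect size and a normal-approximation power calculation (the summary statistics below are made up):

```python
from math import sqrt
from statistics import NormalDist

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Standardized mean difference with pooled standard deviation."""
    pooled = sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled

def approx_power(d, n1, n2, alpha=0.05):
    """Normal-approximation power of a two-sided two-sample test."""
    nd = NormalDist()
    z_crit = nd.inv_cdf(1 - alpha / 2)
    ncp = abs(d) * sqrt(n1 * n2 / (n1 + n2))   # noncentrality
    return 1 - nd.cdf(z_crit - ncp) + nd.cdf(-z_crit - ncp)

# made-up summary data: a medium-to-large effect observed with 20 per group
d = cohens_d(10.2, 2.0, 20, 9.0, 2.1, 20)
pw_small = approx_power(d, 20, 20)
print(round(d, 2), round(pw_small, 2))
```

With only 20 participants per group this medium-to-large effect has well under 60% power, exactly the "meaningful effect that cannot be detected" situation the abstract highlights; the same d with 80 per group is comfortably powered.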

  14. Statistically Based Approach to Broadband Liner Design and Assessment

    Science.gov (United States)

    Nark, Douglas M. (Inventor); Jones, Michael G. (Inventor)

    2016-01-01

    A broadband liner design optimization includes utilizing in-duct attenuation predictions with a statistical fan source model to obtain optimum impedance spectra over a number of flow conditions for one or more liner locations in a bypass duct. The predicted optimum impedance information is then used with acoustic liner modeling tools to design liners having impedance spectra that most closely match the predicted optimum values. Design selection is based on an acceptance criterion that provides the ability to apply increasing weighting to specific frequencies and/or operating conditions. One or more broadband design approaches are utilized to produce a broadband liner that targets a full range of frequencies and operating conditions.

  15. Environmental assessment of submarine power cables

    International Nuclear Information System (INIS)

    Extensive analyses conducted by the European Community revealed that offshore wind energy has relatively benign effects on the marine environment by comparison to other forms of electric power generation [1]. However, the materials employed in offshore wind power farms undergo major changes when confined to the marine environment under extreme conditions (saline medium, hydrostatic pressure...), which can produce significant corrosion. This phenomenon can affect, on the one hand, the material from a structural viewpoint and, on the other hand, the marine environment. In this sense, to better understand the environmental impacts of generating electricity from offshore wind energy, this study evaluated the life cycle assessment for some new designs of submarine power cables developed by General Cable. To achieve this goal, three approaches have been carried out: leaching tests, eco-toxicity tests and Life Cycle Assessment (LCA) methodologies. All of them are aimed at obtaining quantitative data for the environmental assessment of selected submarine cables. LCA is a method used to assess environmental aspects and potential impacts of a product or activity. LCA does not include financial and social factors, which means that the results of an LCA cannot exclusively form the basis for assessment of a product's sustainability. The leaching test results allowed the conclusion that the pH of seawater was not significantly changed by the presence of submarine three-core cables. Although it was slightly higher in the case of the broken cable, pH values were nearly equal. Concerning the heavy metals that could migrate to the aquatic medium, there were significant differences between the two scenarios. The leaching of zinc is the major environmental concern during undersea operation of undamaged cables, whereas the fully sectioned three-core cable produced the migration of significant quantities of copper and iron apart from the zinc migrated from the galvanized steel. Thus, the tar

  16. Environmental assessment of submarine power cables

    Energy Technology Data Exchange (ETDEWEB)

    Isus, Daniel; Martinez, Juan D. [Grupo General Cable Sistemas, S.A., 08560-Manlleu, Barcelona (Spain); Arteche, Amaya; Del Rio, Carmen; Madina, Virginia [Tecnalia Research and Innovation, 20009 San Sebastian (Spain)

    2011-03-15

    Extensive analyses conducted by the European Community revealed that offshore wind energy has relatively benign effects on the marine environment by comparison to other forms of electric power generation [1]. However, the materials employed in offshore wind power farms undergo major changes when confined to the marine environment under extreme conditions (saline medium, hydrostatic pressure...), which can produce significant corrosion. This phenomenon can affect, on the one hand, the material from a structural viewpoint and, on the other hand, the marine environment. In this sense, to better understand the environmental impacts of generating electricity from offshore wind energy, this study evaluated the life cycle assessment for some new designs of submarine power cables developed by General Cable. To achieve this goal, three approaches have been carried out: leaching tests, eco-toxicity tests and Life Cycle Assessment (LCA) methodologies. All of them are aimed at obtaining quantitative data for the environmental assessment of selected submarine cables. LCA is a method used to assess environmental aspects and potential impacts of a product or activity. LCA does not include financial and social factors, which means that the results of an LCA cannot exclusively form the basis for assessment of a product's sustainability. The leaching test results allowed the conclusion that the pH of seawater was not significantly changed by the presence of submarine three-core cables. Although it was slightly higher in the case of the broken cable, pH values were nearly equal. Concerning the heavy metals that could migrate to the aquatic medium, there were significant differences between the two scenarios. The leaching of zinc is the major environmental concern during undersea operation of undamaged cables, whereas the fully sectioned three-core cable produced the migration of significant quantities of copper and iron apart from the zinc migrated from the galvanized steel. Thus, the tar

  17. Statistical analysis of human maintenance failures of a nuclear power plant

    International Nuclear Information System (INIS)

    In this paper, a statistical study of faults caused by maintenance activities is presented. The objective of the study was to draw conclusions on the unplanned effects of maintenance on nuclear power plant safety and system availability. More than 4400 maintenance history reports from the years 1992-1994 of the Olkiluoto BWR nuclear power plant (NPP) were analysed together with the maintenance personnel. The faults induced by human action were classified, e.g., according to their multiplicity and effects. This paper presents and discusses the results of a statistical analysis of the data. Instrumentation and electrical components are especially prone to human failures. Many human failures were found in safety-related systems. Similarly, several failures remained latent from outages to power operation. The safety significance was generally small. Modifications are an important source of multiple human failures. Plant maintenance data is a good source of human reliability data and should be used more in the future. (orig.)

  18. Computer-aided assessment in statistics: the CAMPUS project

    Directory of Open Access Journals (Sweden)

    Neville Hunt

    1998-12-01

    Full Text Available The relentless drive for 'efficiency' in higher education, and the consequent increase in workloads, has given university teachers a compelling incentive to investigate alternative forms of assessment. Some forms of assessment with a clear educational value can no longer be entertained because of the burden placed on the teacher. An added concern is plagiarism, which anecdotal evidence would suggest is on the increase yet which is difficult to detect in large modules with more than one assessor. While computer-aided assessment (CAA) has an enthusiastic following, it is not clear to many teachers that it either reduces workloads or reduces the risk of cheating. In an ideal world, most teachers would prefer to give individual attention and personal feedback to each student when marking their work. In this sense CAA must be seen as second best and will therefore be used only if it is seen to offer significant benefits in terms of reduced workloads or increased validity.

  19. A Teacher's Guide to Assessment Concepts and Statistics

    Science.gov (United States)

    Newman, Carole; Newman, Isadore

    2013-01-01

    The concept of teacher accountability assumes teachers will use data-driven decision making to plan and deliver appropriate and effective instruction to their students. In order to do so, teachers must be able to accurately interpret the data that is given to them, and that requires the knowledge of some basic concepts of assessment and…

  20. Statistical assessment of groundwater resources in Washim district (India).

    Science.gov (United States)

    Rajankar, P N; Tambekar, D H; Ramteke, D S; Wate, S R

    2011-01-01

    Groundwater quality of Washim district of Maharashtra (India) was assessed using quality parameters and a water quality index (WQI). In this study, the WQI was calculated using pH, turbidity, temperature, nitrates, total phosphates, dissolved oxygen, biochemical oxygen demand, total solids, total coliforms and faecal coliforms, respectively, for residential and commercial uses. All the parameters were analyzed in both pre-monsoon and post-monsoon seasons to assess the groundwater quality and its seasonal variations. Parameters such as turbidity, solids and coliforms showed seasonal variations. The WQI varied from 72 to 88 in the pre-monsoon season and 64 to 88 in the post-monsoon season. The results indicate that all groundwater samples in the study area had good water quality in the pre-monsoon season, but in the post-monsoon season 9 percent of samples indicated a change in water quality from good to medium, which reveals seasonal variation and deterioration of groundwater quality.
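The abstract does not give the study's exact WQI formulation, so the sketch below uses one common textbook form: a weighted average of 0-100 sub-index ratings (higher = better). All ratings and weights are hypothetical, and the rating curves that map raw measurements to scores are omitted:

```python
def wqi(subindex_scores, weights):
    """Weighted-average water quality index from 0-100 sub-index
    ratings (higher = better water quality)."""
    total = sum(weights.values())
    return sum(weights[p] * subindex_scores[p] for p in weights) / total

# hypothetical sub-index ratings for one pre-monsoon sample
scores  = {"pH": 90, "turbidity": 70, "DO": 85, "coliforms": 60}
weights = {"pH": 1, "turbidity": 2, "DO": 3, "coliforms": 4}
value = wqi(scores, weights)
print(value)
```

This example comes out at 72.5, inside the 64-88 range the abstract reports, but the correspondence is purely illustrative.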

  1. Water Polo Game-Related Statistics in Women’s International Championships: Differences and Discriminatory Power

    Science.gov (United States)

    Escalante, Yolanda; Saavedra, Jose M.; Tella, Victor; Mansilla, Mirella; García-Hermoso, Antonio; Dominguez, Ana M.

    2012-01-01

    The aims of this study were (i) to compare women’s water polo game-related statistics by match outcome (winning and losing teams) and phase (preliminary, classificatory, and semi-final/bronze medal/gold medal), and (ii) to identify characteristics that discriminate performances in each phase. The game-related statistics of the 124 women’s matches played in five International Championships (World and European Championships) were analyzed. Differences between winning and losing teams in each phase were determined using the chi-squared statistic. A discriminant analysis was then performed according to context in each of the three phases. It was found that the game-related statistics differentiate the winning from the losing teams in each phase of an international championship. The differentiating variables were both offensive (centre goals, power-play goals, counterattack goals, assists, offensive fouls, steals, blocked shots, and won sprints) and defensive (goalkeeper-blocked shots, goalkeeper-blocked inferiority shots, and goalkeeper-blocked 5-m shots). The discriminant analysis showed the game-related statistics to discriminate performance in all phases: preliminary, classificatory, and final (92%, 90%, and 83%, respectively). Two variables were discriminatory by match outcome (winning or losing teams) in all three phases: goals and goalkeeper-blocked shots. Key points: In the preliminary phase, more than one variable was involved in this differentiation, including both offensive and defensive aspects of the game. The game-related statistics were found to have high discriminatory power in predicting the result of matches, with shots and goalkeeper-blocked shots being discriminatory variables in all three phases. Knowledge of the characteristics of the winning teams’ game-related statistics and their power to predict match outcomes will allow coaches to take these characteristics into account when planning training and match preparation. PMID
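Differences between winning and losing teams were assessed with chi-squared tests. For a 2x2 contingency table this statistic can be computed by hand; the counts below are invented to illustrate the mechanics, not taken from the study:

```python
def chi_squared_2x2(table):
    """Pearson chi-squared statistic for a 2x2 contingency table [[a, b], [c, d]]."""
    (a, b), (c, d) = table
    n = a + b + c + d
    rows = (a + b, c + d)
    cols = (a + c, b + d)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = rows[i] * cols[j] / n   # expected count under independence
            stat += (obs - exp) ** 2 / exp
    return stat

# invented counts: matches in which the team scored a counterattack goal
#                yes  no
# winning teams [ 80, 44]
# losing teams  [ 50, 74]
stat = chi_squared_2x2([[80, 44], [50, 74]])
print(round(stat, 2))   # compare to 3.84, the 5% critical value at 1 df
```

A statistic well above 3.84 would mark the variable as differentiating winners from losers at the 5% level; the study ran such comparisons for each game-related statistic in each phase.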

  2. Waste Heat to Power Market Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Elson, Amelia [ICF International, Fairfax, VA (United States); Tidball, Rick [ICF International, Fairfax, VA (United States); Hampson, Anne [ICF International, Fairfax, VA (United States)

    2015-03-01

    Waste heat to power (WHP) is the process of capturing heat discarded by an existing process and using that heat to generate electricity. In the industrial sector, waste heat streams are generated by kilns, furnaces, ovens, turbines, engines, and other equipment. In addition to processes at industrial plants, waste heat streams suitable for WHP are generated at field locations, including landfills, compressor stations, and mining sites. Waste heat streams are also produced in the residential and commercial sectors, but compared to industrial sites these waste heat streams typically have lower temperatures and much lower volumetric flow rates. The economic feasibility for WHP declines as the temperature and flow rate decline, and most WHP technologies are therefore applied in industrial markets where waste heat stream characteristics are more favorable. This report provides an assessment of the potential market for WHP in the industrial sector in the United States.

  3. Statistical evaluation of malfunctions in wind power plants; Statistische Fehlerauswertungen beim Windkraftwerksbetrieb zur Optimierung der Verfuegbarkeit

    Energy Technology Data Exchange (ETDEWEB)

    Fleischer, C.; Sucrow, W. [E.ON Energy Projects GmbH, Muenchen (Germany)

    2007-07-01

    Wind energy poses new challenges: the availability of wind power plants has to be increased and breakdowns minimised, so that operational management costs can ultimately be reduced. The article reviews the efforts made by operational management to adjust manufacturers' frequently inadequate documentation and to provide operations, after careful classification, with statistical evaluations of incoming error messages. These statistical evaluations identify breakdown times as well as idle times. Finally, operating costs can be monitored in cents per kilowatt hour. (orig.)

  4. Robust Statistical Tests of Dragon-Kings beyond Power Law Distributions

    OpenAIRE

    Pisarenko, V. F.; Sornette, D.

    2011-01-01

    We ask the question whether it is possible to diagnose the existence of "Dragon-Kings" (DK), namely anomalous observations compared to a power law background distribution of event sizes. We present two new statistical tests, the U-test and the DK-test, aimed at identifying the existence of even a single anomalous event in the tail of the distribution of just a few tens of observations. The DK-test in particular is derived such that the p-value of its statistic is independent of the exponent c...

  5. Power Systems Development Facility. Environmental Assessment

    Energy Technology Data Exchange (ETDEWEB)

    1993-06-01

    The objective of the PSDF would be to provide a modular facility which would support the development of advanced, pilot-scale, coal-based power systems and hot gas clean-up components. These pilot-scale components would be designed to be large enough so that the results can be related and projected to commercial systems. The facility would use a modular approach to enhance the flexibility and capability for testing; consequently, overall capital and operating costs would be reduced compared with stand-alone facilities by sharing resources common to different modules. The facility would identify and resolve technical barriers, as well as provide a structure for long-term testing and performance assessment. It is also intended that the facility would evaluate the operational and performance characteristics of the advanced power systems with both bituminous and subbituminous coals. Five technology-based experimental modules are proposed for the PSDF: (1) an advanced gasifier module, (2) a fuel cell test module, (3) a PFBC module, (4) a combustion gas turbine module, and (5) a module comprised of five hot gas cleanup particulate control devices. The final module, the PCD, would capture coal-derived ash and particles from both the PFBC and advanced gasifier gas streams to provide for overall particulate emission control, as well as to protect the combustion turbine and the fuel cell.

  6. Statistical assessment of trophic conditions: squared Euclidean distance approach

    Directory of Open Access Journals (Sweden)

    Chatchai Ratanachai

    2003-05-01

    Full Text Available The classification of trophic conditions of water bodies may often face contradictory cases where a given lake is classified into a trophic category from one trophic variable, whereas it is classified into another trophic category from other trophic variables. To solve this problem, this paper proposes a new methodology based on the concepts of squared Euclidean distance and the boundary values recommended by the OECD (Organization for Economic Co-operation and Development). This methodology requires that a trophic variable data set of a water body under consideration and such boundary values be compared by a measure of similarity computed using basic statistical techniques to determine the trophic condition of a given water body. The methodology has been tested by applying it to two sample data sets: the Pattani Dam Reservoir and the North Adriatic Sea data sets, which were taken from Kietpawpan (2002) and Zurlini (1996), respectively. The squared Euclidean distance analysis was then applied to the above data sets in order to classify trophic conditions, based on four trophic variables comprising total nitrogen, total phosphorus, chlorophyll-a, and Secchi depth. Our results show that the squared Euclidean distance analysis is a useful methodology for preliminarily classifying trophic conditions and solving contradictory classifications, which often arise when applying the present OECD methodology alone.
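The record's distance-based classification can be sketched as follows. This is a minimal illustration, assuming hypothetical boundary values (not the OECD's published ones) and z-scoring to make the four variables comparable:

```python
import numpy as np

# Hypothetical OECD-style boundary vectors for four trophic variables
# (total N, total P, chlorophyll-a, Secchi depth) -- illustrative only.
boundaries = {
    "oligotrophic": np.array([660.0, 8.0, 1.7, 9.9]),
    "mesotrophic":  np.array([750.0, 26.7, 4.7, 4.2]),
    "eutrophic":    np.array([1900.0, 84.4, 14.3, 2.5]),
}

def classify(sample: np.ndarray) -> str:
    """Assign the trophic class whose boundary vector is nearest in
    squared Euclidean distance, after z-scoring each variable across
    the candidate boundary vectors so the scales are comparable."""
    B = np.array(list(boundaries.values()))
    mu, sigma = B.mean(axis=0), B.std(axis=0)
    dists = (((B - mu) / sigma - (sample - mu) / sigma) ** 2).sum(axis=1)
    return list(boundaries.keys())[int(np.argmin(dists))]

print(classify(np.array([800.0, 30.0, 5.0, 4.0])))  # prints "mesotrophic"
```

Because all four variables enter one distance, a lake whose variables point to different categories still receives a single, least-contradictory classification.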

  7. Implications of Monte Carlo Statistical Errors in Criticality Safety Assessments

    Energy Technology Data Exchange (ETDEWEB)

    Pevey, Ronald E.

    2005-09-15

    Most criticality safety calculations are performed using Monte Carlo techniques because of Monte Carlo's ability to handle complex three-dimensional geometries. For Monte Carlo calculations, the more histories sampled, the lower the standard deviation of the resulting estimates. The common intuition is, therefore, that the more histories, the better; as a result, analysts tend to run Monte Carlo analyses as long as possible (or at least to a minimum acceptable uncertainty). For Monte Carlo criticality safety analyses, however, the optimization situation is complicated by the fact that procedures usually require that an extra margin of safety be added because of the statistical uncertainty of the Monte Carlo calculations. This additional safety margin affects the impact of the choice of the calculational standard deviation, both on production and on safety. This paper shows that, under the assumptions of normally distributed benchmarking calculational errors and exact compliance with the upper subcritical limit (USL), the standard deviation that optimizes production is zero, but there is a non-zero value of the calculational standard deviation that minimizes the risk of inadvertently labeling a supercritical configuration as subcritical. Furthermore, this value is shown to be a simple function of the typical benchmarking step outcomes--the bias, the standard deviation of the bias, the upper subcritical limit, and the number of standard deviations added to calculated k-effectives before comparison to the USL.
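The trade-off described above can be made concrete with a small sketch. This is an illustrative acceptance test only, with hypothetical numbers; the paper's full treatment also folds in the benchmarking bias and its uncertainty when deriving the USL:

```python
from math import erf, sqrt

def accept_prob(k_true, sigma_calc, usl=0.95, bias=0.0, n_sigma=2.0):
    """Probability that a configuration with true k-effective `k_true`
    passes the acceptance test  k_calc + n_sigma * sigma_calc <= USL,
    assuming k_calc ~ Normal(k_true + bias, sigma_calc)."""
    if sigma_calc == 0:
        return 1.0 if k_true + bias <= usl else 0.0
    # P(k_calc <= usl - n_sigma * sigma_calc) under the normal model
    z = (usl - n_sigma * sigma_calc - (k_true + bias)) / sigma_calc
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Risk of accepting a slightly supercritical configuration (true
# k-effective just above the USL) for several calculational sigmas.
for s in (0.0005, 0.001, 0.002, 0.004):
    print(s, accept_prob(k_true=0.951, sigma_calc=s))
```

The sketch shows how the choice of calculational standard deviation moves both the production cost (longer runs for smaller sigma) and the misclassification risk that the paper optimizes.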

  8. Climate change assessment for Mediterranean agricultural areas by statistical downscaling

    Directory of Open Access Journals (Sweden)

    L. Palatella

    2010-07-01

    Full Text Available In this paper we produce projections of seasonal precipitation for four Mediterranean areas: Apulia region (Italy), Ebro river basin (Spain), Po valley (Italy) and Antalya province (Turkey). We performed the statistical downscaling using Canonical Correlation Analysis (CCA) in two versions: in one case the Principal Component Analysis (PCA) filter is applied only to the predictor and in the other to both predictor and predictand. After performing a validation test, CCA with the PCA filter on both predictor and predictand was chosen. Sea level pressure (SLP) is used as predictor. Downscaling has been carried out for the scenarios A2 and B2 on the basis of three GCMs: the CCCma-GCM2, the Csiro-MK2 and HadCM3. Three consecutive 30-year periods have been considered. For Summer precipitation in the Apulia region we also use the 500 hPa temperature (T500) as predictor, obtaining comparable results. Results show different climate change signals in the four areas and confirm the need for an analysis that is capable of resolving internal differences within the Mediterranean region. The most robust signal is the reduction of Summer precipitation in the Ebro river basin. Other significant results are the increase of precipitation over Apulia in Summer, the reduction over the Po valley in Spring and Autumn, and the increase over the Antalya province in Summer and Autumn.

  9. A statistical simulation model for field testing of non-target organisms in environmental risk assessment of genetically modified plants.

    Science.gov (United States)

    Goedhart, Paul W; van der Voet, Hilko; Baldacchino, Ferdinando; Arpaia, Salvatore

    2014-04-01

    Genetic modification of plants may result in unintended effects causing potentially adverse effects on the environment. A comparative safety assessment is therefore required by authorities, such as the European Food Safety Authority, in which the genetically modified plant is compared with its conventional counterpart. Part of the environmental risk assessment is a comparative field experiment in which the effect on non-target organisms is compared. Statistical analysis of such trials comes in two flavors: difference testing and equivalence testing. It is important to know the statistical properties of these tests, for example, the power to detect environmental change of a given magnitude, before the start of an experiment. Such prospective power analysis can best be studied by means of a statistical simulation model. This paper describes a general framework for simulating data typically encountered in environmental risk assessment of genetically modified plants. The simulation model, available as Supplementary Material, can be used to generate count data having different statistical distributions, possibly with excess zeros. In addition the model employs completely randomized or randomized block experiments, can be used to simulate single or multiple trials across environments, enables genotype by environment interaction by adding random variety effects, and finally includes repeated measures in time following a constant, linear or quadratic pattern in time, possibly with some form of autocorrelation. The model also allows adding a set of reference varieties to the GM plant and its comparator to assess the natural variation, which can then be used to set limits of concern for equivalence testing. The different count distributions are described in some detail and some examples of how to use the simulation model to study various aspects, including a prospective power analysis, are provided.
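A minimal sketch of such a prospective power analysis, assuming a zero-inflated Poisson count model and a simple two-sample z-test rather than the paper's full framework:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(1)

def zip_counts(n, mean, pzero, rng):
    """Zero-inflated Poisson sample: with probability `pzero` a
    structural zero, otherwise a Poisson(mean) count."""
    counts = rng.poisson(mean, size=n)
    counts[rng.random(n) < pzero] = 0
    return counts

def power(effect, n=8, mean=10.0, pzero=0.2, alpha=0.05, nsim=2000):
    """Monte Carlo power of a two-sample z-test (normal approximation)
    to detect a multiplicative `effect` on the comparator mean."""
    hits = 0
    for _ in range(nsim):
        a = zip_counts(n, mean, pzero, rng)           # comparator plots
        b = zip_counts(n, mean * effect, pzero, rng)  # GM plots
        se = sqrt(a.var(ddof=1) / n + b.var(ddof=1) / n)
        if se == 0:
            continue
        z = (b.mean() - a.mean()) / se
        p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
        hits += p < alpha
    return hits / nsim

print(power(1.0))  # no true effect: rejection rate near alpha
print(power(2.0))  # doubling of abundance: substantially higher power
```

Running such a simulation before the trial shows how many plots are needed to detect an ecologically relevant change with acceptable power.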

  10. A COMPREHENSIVE STATISTICAL ASSESSMENT OF STAR-PLANET INTERACTION

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Brendan P.; Gallo, Elena; Pearson, Elliott G. [Department of Astronomy, University of Michigan, Ann Arbor, MI 48109 (United States); Wright, Jason T. [Department of Astronomy and Astrophysics, The Pennsylvania State University, University Park, PA 16802 (United States)

    2015-02-01

    We investigate whether magnetic interaction between close-in giant planets and their host stars produces observable statistical enhancements in stellar coronal or chromospheric activity. New Chandra observations of 12 nearby (d < 60 pc) planet-hosting solar analogs are combined with archival Chandra, XMM-Newton, and ROSAT coverage of 11 similar stars to construct a sample inoculated against inherent stellar class and planet-detection biases. Survival analysis and Bayesian regression methods (incorporating both measurement errors and X-ray upper limits; 13/23 stars have secure detections) are used to test whether ''hot Jupiter'' hosts are systematically more X-ray luminous than comparable stars with more distant or smaller planets. No significant correlations are present between common proxies for interaction strength (M {sub P}/a {sup 2} or 1/a) versus coronal activity (L {sub X} or L {sub X}/L {sub bol}). In contrast, a sample of 198 FGK main-sequence stars does show a significant (∼99% confidence) increase in X-ray luminosity with M {sub P}/a {sup 2}. While selection biases are incontrovertibly present within the main-sequence sample, we demonstrate that the effect is primarily driven by a handful of extreme hot-Jupiter systems with M {sub P}/a {sup 2} > 450 M {sub Jup} AU{sup –2}, which here are all X-ray luminous but to a degree commensurate with their Ca II H and K activity, in contrast to proposed magnetic star-planet interaction scenarios that predict relatively larger enhancements in L {sub X}. We discuss these results in the context of cumulative tidal spin-up of stars hosting close-in gas giants (potentially followed by planetary infall and destruction). We also test our main-sequence sample for correlations between planetary properties and UV luminosity or Ca II H and K emission, and find no significant dependence.

  11. A new non-invasive statistical method to assess the spontaneous cardiac baroreflex in humans.

    Science.gov (United States)

    Ducher, M; Fauvel, J P; Gustin, M P; Cerutti, C; Najem, R; Cuisinaud, G; Laville, M; Pozet, N; Paultre, C Z

    1995-06-01

    1. A new method was developed to evaluate cardiac baroreflex sensitivity. The association of a high systolic blood pressure with a low heart rate or the converse is considered to be under the influence of cardiac baroreflex activity. This method is based on the determination of the statistical dependence between systolic blood pressure and heart rate values obtained non-invasively by a Finapres device. Our computerized analysis selects the associations with the highest statistical dependence. A 'Z-coefficient' quantifies the strength of the statistical dependence. The slope of the linear regression, computed on these selected associations, is used to estimate baroreflex sensitivity. 2. The present study was carried out in 11 healthy resting male subjects. The results obtained by the 'Z-coefficient' method were compared with those obtained by cross-spectrum analysis, which has already been validated in humans. Furthermore, the reproducibility of both methods was checked after 1 week. 3. The results obtained by the two methods were significantly correlated (r = 0.78 for the first and r = 0.76 for the second experiment, P < 0.01). When repeated after 1 week, the average results were not significantly different. Considering individual results, test-retest correlation coefficients were higher with the Z-analysis (r = 0.79, P < 0.01) than with the cross-spectrum analysis (r = 0.61, P < 0.05). 4. In conclusion, as the Z-method gives results similar to but more reproducible than the cross-spectrum method, it might be a powerful and reliable tool to assess baroreflex sensitivity in humans.
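The selection-then-regression idea can be illustrated with simulated beat-to-beat data. The selection rule below is a hypothetical stand-in for the paper's Z-coefficient, which is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated beat-to-beat data: RR interval lengthens with systolic
# blood pressure (slope ~9 ms/mmHg, a plausible baroreflex sensitivity).
n = 300
sbp = 120 + 10 * rng.standard_normal(n)                    # mmHg
rr = 800 + 9 * (sbp - 120) + 25 * rng.standard_normal(n)   # ms

# Hypothetical selection step standing in for the Z-coefficient: keep
# the beats whose SBP and RR jointly deviate most from their medians,
# i.e. the associations with the strongest statistical dependence.
score = np.abs(sbp - np.median(sbp)) * np.abs(rr - np.median(rr))
keep = score >= np.quantile(score, 0.75)

# Baroreflex sensitivity = slope of the regression on selected beats.
slope = np.polyfit(sbp[keep], rr[keep], 1)[0]
print(round(slope, 1))  # ms/mmHg, near the simulated 9
```

The point of the selection step is to estimate the slope only from associations plausibly under baroreflex control, rather than from all beats indiscriminately.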

  12. A Statistical Approach to Planning Reserved Electric Power for Railway Infrastructure Administration

    OpenAIRE

    M. Brabec; Pelikán, E. (Emil); Konár, O. (Ondřej); Kasanický, I.; Juruš, P. (Pavel); Sadil, J.; Blažek, P.

    2013-01-01

    One of the requirements on railway infrastructure administration is to provide electricity for day-to-day operation of railways. We propose a statistically based approach for the estimation of maximum 15-minute power within a calendar month for a given region. This quantity serves as a basis of contracts between railway infrastructure administration and electricity distribution system operator. We show that optimization of the prediction is possible, based on underlying loss function deriv...

  13. Empirical Statistical Power for Testing Multilocus Genotypic Effects under Unbalanced Designs Using a Gibbs Sampler

    OpenAIRE

    Lee, Chaeyoung

    2012-01-01

    Epistasis that may explain a large portion of the phenotypic variation for complex economic traits of animals has been ignored in many genetic association studies. A Bayesian method was introduced to draw inferences about multilocus genotypic effects based on their marginal posterior distributions by a Gibbs sampler. A simulation study was conducted to provide statistical powers under various unbalanced designs by using this method. Data were simulated by combined designs of number of loci, wi...

  14. Statistical Design Model (SDM) of power supply and communication subsystem's Satellite

    Science.gov (United States)

    Mirshams, Mehran; Zabihian, Ehsan; Zabihian, Ahmadreza

    In designing the power supply and communication subsystems of satellites, most approaches and relations are empirical and statistical; moreover, since aerospace science and its related engineering fields, such as electrical engineering, are young, there are no analytic or fully proven empirical relations in many areas. We therefore consider the statistical design of these subsystems. The approach presented in this paper is entirely innovative and specifies all parts of the power supply and communication subsystems of the satellite. In developing this approach, data from 602 satellites and software such as SPSS were used. After proposing the design procedure, the total power needed by the satellite, the mass of the power supply and communication subsystems, the power needed by the communication subsystem, the working band, the type of antenna, the number of transponders, the material of the solar array, and finally the placement of the arrays on the satellite are designed. All these parts are designed based on the mission of the satellite and its weight class. This procedure increases the performance rate, avoids wasting energy, and reduces costs. Keywords: database, statistical model, design procedure, power supply subsystem, communication subsystem

  15. Use of Statistical Information for Damage Assessment of Civil Engineering Structures

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Andersen, P.

    This paper considers the problem of damage assessment of civil engineering structures using statistical information. The aim of the paper is to review how researchers recently have tried to solve the problem. It is pointed out that the problem consists of not only how to use the statistical...

  16. Assessing the performance of statistical validation tools for megavariate metabolomics data

    NARCIS (Netherlands)

    Rubingh, C.M.; Bijlsma, S.; Derks, E.P.P.A.; Bobeldijk, I.; Verheij, E.R.; Kochhar, S.; Smilde, A.K.

    2006-01-01

    Statistical model validation tools such as cross-validation, jack-knifing model parameters and permutation tests are meant to obtain an objective assessment of the performance and stability of a statistical model. However, little is known about the performance of these tools for megavariate data set
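A permutation test, one of the validation tools assessed above, can be sketched for a megavariate setting (many variables, few samples), where apparent model performance is easily inflated. This is a minimal nearest-centroid illustration, not the record's actual models:

```python
import numpy as np

rng = np.random.default_rng(0)

# Megavariate worst case: 20 samples, 500 variables, and labels with
# NO real class difference (pure noise).
X = rng.standard_normal((20, 500))
y = np.array([0] * 10 + [1] * 10)

def loo_accuracy(X, y):
    """Leave-one-out accuracy of a nearest-centroid classifier."""
    hits = 0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        c0 = X[mask & (y == 0)].mean(axis=0)
        c1 = X[mask & (y == 1)].mean(axis=0)
        pred = int(np.sum((X[i] - c1) ** 2) < np.sum((X[i] - c0) ** 2))
        hits += pred == y[i]
    return hits / len(y)

# Permutation test: refit under shuffled labels to get the null
# distribution of the cross-validated accuracy.
obs = loo_accuracy(X, y)
perm = [loo_accuracy(X, rng.permutation(y)) for _ in range(200)]
p_value = np.mean([a >= obs for a in perm])
print(obs, p_value)  # with noise data the p-value should not be small
```

Reporting the permutation p-value alongside the cross-validated accuracy guards against declaring a spurious megavariate model predictive.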

  17. Implementation of a Model Output Statistics based on meteorological variable screening for short‐term wind power forecast

    DEFF Research Database (Denmark)

    Ranaboldo, Matteo; Giebel, Gregor; Codina, Bernat

    2013-01-01

    A combination of physical and statistical treatments to post‐process numerical weather predictions (NWP) outputs is needed for successful short‐term wind power forecasts. One of the most promising and effective approaches for statistical treatment is the Model Output Statistics (MOS) technique...

  18. Wind Power Assessment Based on a WRF Wind Simulation with Developed Power Curve Modeling Methods

    OpenAIRE

    Zhenhai Guo; Xia Xiao

    2014-01-01

    The accurate assessment of wind power potential requires not only the detailed knowledge of the local wind resource but also an equivalent power curve with good effect for a local wind farm. Although the probability distribution functions (pdfs) of the wind speed are commonly used, their seemingly good performance for distribution may not always translate into an accurate assessment of power generation. This paper contributes to the development of wind power assessment based on the wind speed...

  19. Efficient Statistical Leakage Power Analysis Method for Function Blocks Considering All Process Variations

    Institute of Scientific and Technical Information of China (English)

    LUO Zuying

    2007-01-01

    With technology scaling into the nanometer regime, rampant process variations have a visible influence on leakage power estimation of very large scale integrations (VLSIs). In order to deal with large inter- and intra-die variations, we introduce a novel theoretical prototype of statistical leakage power analysis (SLPA) for function blocks. Because inter-die variations can be pinned down to a small range while the number of gates in function blocks is large (>1000), we further simplify the prototype and derive an efficient SLPA methodology. The method can save much running time for SLPA in low-power design since it has the advantage of local updating. A large amount of experimental data shows that the method takes feasible running time (0.32 s) to obtain accurate results (3σ-error <0.5% at maximum) when function block circuits simultaneously suffer from 7.5% (3σ/mean) inter-die and 7.5% intra-die length variations, which demonstrates that our method is suitable for statistical leakage power analysis of VLSIs under rampant process variations.

  20. Comparison and validation of statistical methods for predicting power outage durations in the event of hurricanes.

    Science.gov (United States)

    Nateghi, Roshanak; Guikema, Seth D; Quiring, Steven M

    2011-12-01

    This article compares statistical methods for modeling power outage durations during hurricanes and examines the predictive accuracy of these methods. Being able to make accurate predictions of power outage durations is valuable because the information can be used by utility companies to plan their restoration efforts more efficiently. This information can also help inform customers and public agencies of the expected outage times, enabling better collective response planning, and coordination of restoration efforts for other critical infrastructures that depend on electricity. In the long run, outage duration estimates for future storm scenarios may help utilities and public agencies better allocate risk management resources to balance the disruption from hurricanes with the cost of hardening power systems. We compare the out-of-sample predictive accuracy of five distinct statistical models for estimating power outage duration times caused by Hurricane Ivan in 2004. The methods compared include both regression models (accelerated failure time (AFT) and Cox proportional hazard models (Cox PH)) and data mining techniques (regression trees, Bayesian additive regression trees (BART), and multivariate additive regression splines). We then validate our models against two other hurricanes. Our results indicate that BART yields the best prediction accuracy and that it is possible to predict outage durations with reasonable accuracy.
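The out-of-sample comparison idea can be sketched with synthetic data and deliberately simple models (a climatological baseline versus a log-linear regression, not the record's AFT, Cox PH, or BART models):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic outage durations (hours) that grow with gust speed and
# tree density, with log-normal noise. Purely illustrative.
n = 400
wind = rng.uniform(20, 60, n)     # gust speed, m/s
trees = rng.uniform(0, 1, n)      # tree-density index
dur = np.exp(0.04 * wind + 1.0 * trees + 0.4 * rng.standard_normal(n))

train, test_idx = np.arange(0, 300), np.arange(300, n)

# Model 1: climatological baseline (predict the training mean).
base_pred = np.full(len(test_idx), dur[train].mean())

# Model 2: log-linear regression on the two covariates.
A = np.column_stack([np.ones(n), wind, trees])
coef, *_ = np.linalg.lstsq(A[train], np.log(dur[train]), rcond=None)
reg_pred = np.exp(A[test_idx] @ coef)

mae = lambda p: np.mean(np.abs(p - dur[test_idx]))
print(mae(base_pred), mae(reg_pred))  # the covariate model should win
```

Holding out data that the models never saw, as the article does by validating against two other hurricanes, is what separates genuine predictive accuracy from in-sample fit.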

  1. Predicting future wind power generation and power demand in France using statistical downscaling methods developed for hydropower applications

    Science.gov (United States)

    Najac, Julien

    2014-05-01

    For many applications in the energy sector, it is crucial to have downscaling methods that conserve space-time dependences at very fine spatial and temporal scales between variables affecting electricity production and consumption. For climate change impact studies, this is an extremely difficult task, particularly as reliable climate information is usually found at regional and monthly scales at best, although many industry-oriented applications need further refined information (hydropower production model, wind energy production model, power demand model, power balance model…). Here we thus propose to investigate the question of how to predict and quantify the influence of climate change on climate-related energies and the energy demand. To do so, statistical downscaling methods originally developed for studying climate change impacts on hydrological cycles in France (and which have been used to compute hydropower production in France) have been applied to predicting wind power generation in France and an air temperature indicator commonly used for predicting power demand in France. We show that these methods provide satisfactory results over the recent past and apply this methodology to several climate model runs from the ENSEMBLES project.

  2. In vivo Comet assay – statistical analysis and power calculations of mice testicular cells

    DEFF Research Database (Denmark)

    Hansen, Merete Kjær; Sharma, Anoop Kumar; Dybdahl, Marianne;

    2014-01-01

    The in vivo Comet assay is a sensitive method for evaluating DNA damage. A recurrent concern is how to analyze the data appropriately and efficiently. A popular approach is to summarize the raw data into a summary statistic prior to the statistical analysis. However, consensus on which summary statistic to use has yet to be reached. Another important consideration concerns the assessment of proper sample sizes in the design of Comet assay studies. This study aims to identify a statistic suitably summarizing the % tail DNA of mice testicular samples in Comet assay studies. A second aim is to provide curves for this statistic outlining the number of animals and gels to use. The current study was based on 11 compounds administered via oral gavage in three doses to male mice: CAS no. 110-26-9, CAS no. 512-56-1, CAS no. 111873-33-7, CAS no. 79-94-7, CAS no. 115-96-8, CAS no. 598-55-0, CAS no. 636...

  3. A spatial accuracy assessment of an alternative circular scan method for Kulldorff's spatial scan statistic

    OpenAIRE

    Read, S.; Bath, P.A.; Willett, P.; Maheswaran, R.

    2009-01-01

    This paper concerns the Bernoulli version of Kulldorff’s spatial scan statistic, and how accurately it identifies the exact centre of approximately circular regions of increased spatial density in point data. We present an alternative method of selecting circular regions that appears to give greater accuracy. Performance is tested in an epidemiological context using manifold synthetic case-control datasets. A small, but statistically significant, improvement is reported. The power of the alte...

  4. Efficient statistical analysis method of power/ground (P/G) network

    Institute of Scientific and Technical Information of China (English)

    Zuying Luo; Sheldon X.D. Tan

    2008-01-01

    In this paper, we propose an incremental statistical analysis method with complexity reduction as a pre-process for on-chip power/ground (P/G) networks. The new method exploits the locality of P/G network analyses and targets P/G networks with a large number of strongly connected subcircuits (called strong connects) such as trees and chains. The method consists of three steps. First it compresses P/G circuits by removing strong connects. As a result, current variations (CVs) of nodes in strong connects are transferred to some remaining nodes. Then, based on the locality of power grid voltage responses to its current inputs, it efficiently calculates the correlative resistor (CR) matrix in a local way to directly compute the voltage variations by using small parts of the remaining circuit. Last it statistically recovers voltage variations of the suppressed nodes inside strong connects. This new method for statistically compressing and expanding strong connects in terms of current or voltage variations in a closed form is very efficient owing to its property of incremental analysis. Experimental results demonstrate that the method can efficiently compute lower bounds of voltage variations for P/G networks and that it achieves two to three orders of magnitude speedup over the traditional Monte-Carlo-based simulation method, with only 2.0% accuracy loss.

  5. Calibrating the Difficulty of an Assessment Tool: The Blooming of a Statistics Examination

    Science.gov (United States)

    Dunham, Bruce; Yapa, Gaitri; Yu, Eugenia

    2015-01-01

    Bloom's taxonomy is proposed as a tool by which to assess the level of complexity of assessment tasks in statistics. Guidelines are provided for how to locate tasks at each level of the taxonomy, along with descriptions and examples of suggested test questions. Through the "Blooming" of an examination--that is, locating its constituent…

  6. Assessment - A Powerful Lever for Learning

    Directory of Open Access Journals (Sweden)

    Lorna Earl

    2010-05-01

    Full Text Available Classroom assessment practices have been part of schooling for hundreds of years. There are, however, new findings about the nature of learning and about the roles that assessment can play in enhancing learning for all students. This essay provides a brief history of the changing role of assessment in schooling, describes three different purposes for assessment and foreshadows some implications that shifting to a more differentiated view of assessment can have for policy, practice and research.

  7. Theoretical Foundations and Mathematical Formalism of the Power-Law Tailed Statistical Distributions

    Directory of Open Access Journals (Sweden)

    Giorgio Kaniadakis

    2013-09-01

    Full Text Available We present the main features of the mathematical theory generated by the κ-deformed exponential function exp_κ(x) = (√(1 + κ²x²) + κx)^(1/κ), with 0 ≤ κ < 1, developed in the last twelve years, which turns out to be a continuous one-parameter deformation of the ordinary mathematics generated by the Euler exponential function. The κ-mathematics has its roots in special relativity and furnishes the theoretical foundations of the κ-statistical mechanics predicting power-law tailed statistical distributions, which have been observed experimentally in many physical, natural and artificial systems. After introducing the κ-algebra, we present the associated κ-differential and κ-integral calculus. Then, we obtain the corresponding κ-exponential and κ-logarithm functions and give the κ-version of the main functions of the ordinary mathematics.
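The κ-exponential and κ-logarithm can be checked numerically. A short sketch verifying that they are mutual inverses and reduce to the ordinary exponential as κ → 0:

```python
import numpy as np

def exp_k(x, k):
    """kappa-deformed exponential: (sqrt(1 + k^2 x^2) + k x)^(1/k)."""
    return (np.sqrt(1 + k**2 * x**2) + k * x) ** (1 / k)

def ln_k(x, k):
    """kappa-deformed logarithm, inverse of exp_k: (x^k - x^-k)/(2k)."""
    return (x**k - x**(-k)) / (2 * k)

x = np.linspace(-3, 3, 7)
k = 0.3
print(np.allclose(ln_k(exp_k(x, k), k), x))    # True: mutual inverses
print(np.allclose(exp_k(x, 1e-8), np.exp(x)))  # True: ordinary exp as k -> 0
# For large x, exp_k(x) grows like (2 k x)^(1/k), i.e. a power law
# rather than the exponential tail of e^x.
```

The inverse property follows because (√(1 + κ²x²) + κx)(√(1 + κ²x²) − κx) = 1, so exp_κ(x)^κ − exp_κ(x)^(−κ) = 2κx.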

  8. Type I error and statistical power of the Mantel-Haenszel procedure for detecting DIF: a meta-analysis.

    Science.gov (United States)

    Guilera, Georgina; Gómez-Benito, Juana; Hidalgo, Maria Dolores; Sánchez-Meca, Julio

    2013-12-01

    This article presents a meta-analysis of studies investigating the effectiveness of the Mantel-Haenszel (MH) procedure when used to detect differential item functioning (DIF). Studies were located electronically in the main databases, representing the codification of 3,774 different simulation conditions, 1,865 related to Type I error and 1,909 to statistical power. The homogeneity of effect-size distributions was assessed by the Q statistic. The extremely high heterogeneity in both error rates (I² = 94.70) and power (I² = 99.29), due to the fact that numerous studies test the procedure in extreme conditions, means that the main interest of the results lies in explaining the variability in detection rates. One-way analysis of variance was used to determine the effects of each variable on detection rates, showing that the MH test was more effective when purification procedures were used, when the data fitted the Rasch model, when test contamination was below 20%, and with sample sizes above 500. The results imply a series of recommendations for practitioners who wish to study DIF with the MH test. A limitation, one inherent to all meta-analyses, is that not all the possible moderator variables, or the levels of variables, have been explored. This serves to remind us of certain gaps in the scientific literature (i.e., regarding the direction of DIF or variances in ability distribution) and is an aspect that methodologists should consider in future simulation studies. PMID:24127986
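The MH procedure itself reduces to a chi-square statistic accumulated over ability strata. A minimal sketch with hypothetical 2×2 tables (rows: reference/focal group; columns: correct/incorrect responses to the studied item):

```python
def mantel_haenszel_chi2(tables, continuity=True):
    """MH chi-square over K strata of 2x2 tables ((a, b), (c, d)),
    rows = reference/focal group, cols = correct/incorrect."""
    a = e = v = 0.0
    for (A, B), (C, D) in tables:
        n = A + B + C + D
        a += A                                    # observed count
        e += (A + B) * (A + C) / n                # expected under H0
        v += (A + B) * (C + D) * (A + C) * (B + D) / (n**2 * (n - 1))
    num = (abs(a - e) - (0.5 if continuity else 0.0)) ** 2
    return num / v

# Three ability strata with no DIF: both groups have similar odds of a
# correct response, so the statistic stays small (~chi2_1 under H0).
no_dif = [((30, 10), (28, 12)), ((20, 20), (22, 18)), ((8, 30), (10, 28))]
# Same strata, but the focal group is disadvantaged on the item.
dif = [((30, 10), (18, 22)), ((20, 20), (10, 30)), ((8, 30), (3, 35))]

print(mantel_haenszel_chi2(no_dif))  # small, below the 3.84 critical value
print(mantel_haenszel_chi2(dif))     # large, well above 3.84
```

The meta-analytic findings above concern exactly this statistic's Type I error (how often the no-DIF case exceeds the critical value) and power (how often the DIF case does) across simulation conditions.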

  9. Methodology for Assessment of Inertial Response from Wind Power Plants

    DEFF Research Database (Denmark)

    Altin, Müfit; Teodorescu, Remus; Bak-Jensen, Birgitte;

    2012-01-01

    High wind power penetration levels result in additional requirements from wind power in order to improve frequency stability. Replacement of conventional power plants with wind power plants reduces the power system inertia due to the wind turbine technology. Consequently, the rate of change...... of frequency and the maximum frequency deviation increase after a disturbance such as generation loss, load increase, etc. Having no inherent inertial response, wind power plants need additional control concepts in order to provide an additional active power following a disturbance. Several control concepts...... have been implemented in the literature, but the assessment of these control concepts with respect to power system requirements has not been specified. In this paper, a methodology to assess the inertial response from wind power plants is proposed. Accordingly, the proposed methodology is applied...
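    The link between reduced inertia and the rate of change of frequency mentioned above can be sketched with the aggregated swing equation, df/dt = ΔP·f0/(2H). The function below is our illustrative single-machine-equivalent simplification, not the assessment methodology of the paper:

    ```python
    def rocof_hz_per_s(power_deficit_pu, f0_hz, inertia_h_s):
        # aggregated swing equation evaluated at nominal frequency:
        # df/dt = dP * f0 / (2 * H); halving the system inertia H doubles the ROCOF
        return power_deficit_pu * f0_hz / (2.0 * inertia_h_s)

    # e.g. a 10% generation loss at 50 Hz with H = 4 s gives a ROCOF of 0.625 Hz/s
    ```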

  10. Observer variability in the assessment of type and dysplasia of colorectal adenomas, analyzed using kappa statistics

    DEFF Research Database (Denmark)

    Jensen, P; Krogsgaard, M R; Christiansen, J;

    1995-01-01

    of adenomas were assessed twice by three experienced pathologists, with an interval of two months. Results were analyzed using kappa statistics. RESULTS: For agreement between first and second assessment (both type and grade of dysplasia), kappa values for the three specialists were 0.5345, 0.9022, and 0...... agreement was only fair to moderate. A simpler classification system or a centralization of assessments would probably increase kappa values....
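    For reference, the unweighted Cohen's kappa used in such intra-observer agreement studies can be computed from a k×k agreement table; a small self-contained sketch (our naming):

    ```python
    def cohens_kappa(table):
        """table[i][j]: count of items rated category i at the first assessment
        and category j at the second; kappa = (p_obs - p_chance) / (1 - p_chance)."""
        n = sum(sum(row) for row in table)
        k = len(table)
        po = sum(table[i][i] for i in range(k)) / n            # observed agreement
        row = [sum(table[i][j] for j in range(k)) / n for i in range(k)]
        col = [sum(table[i][j] for i in range(k)) / n for j in range(k)]
        pe = sum(row[i] * col[i] for i in range(k))            # chance agreement
        return (po - pe) / (1 - pe)
    ```

    On the usual interpretation scale, values around 0.4-0.6 are "moderate" agreement, which is why kappas like those reported above prompt calls for a simpler classification system.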

  11. The probability of identification: applying ideas from forensic statistics to disclosure risk assessment

    OpenAIRE

    Chris J. Skinner

    2007-01-01

    The paper establishes a correspondence between statistical disclosure control and forensic statistics regarding their common use of the concept of ‘probability of identification’. The paper then seeks to investigate what lessons for disclosure control can be learnt from the forensic identification literature. The main lesson that is considered is that disclosure risk assessment cannot, in general, ignore the search method that is employed by an intruder seeking to achieve disclosure. The effe...

  12. Classification of Underlying Causes of Power Quality Disturbances: Deterministic versus Statistical Methods

    Directory of Open Access Journals (Sweden)

    Emmanouil Styvaktakis

    2007-01-01

    Full Text Available This paper presents the two main types of classification methods for power quality disturbances based on underlying causes: deterministic classification, with an expert system as an example, and statistical classification, with support vector machines (a novel method) as an example. An expert system is suitable when one has a limited amount of data and sufficient power system expert knowledge; however, its application requires a set of threshold values. Statistical methods are suitable when a large amount of data is available for training. Two issues that are important to guarantee the effectiveness of a classifier, data segmentation and feature extraction, are discussed. Segmentation is a preprocessing step that partitions a recorded data sequence into segments, each representing a duration containing either an event or a transition between two events. Feature extraction is then applied to each segment individually. Some useful features and their effectiveness are discussed, and experimental results are included to demonstrate the effectiveness of both systems. Finally, conclusions are given together with a discussion of some future research directions.

  13. Hybrid algorithm for rotor angle security assessment in power systems

    OpenAIRE

    D. Prasad Wadduwage; Udaya D. Annakkage; Christine Qiong Wu

    2015-01-01

    Transient rotor angle stability assessment and oscillatory rotor angle stability assessment subsequent to a contingency are integral components of dynamic security assessment (DSA) in power systems. This study proposes a hybrid algorithm to determine whether the post-fault power system is secure due to both transient rotor angle stability and oscillatory rotor angle stability subsequent to a set of known contingencies. The hybrid algorithm first uses a new security measure developed based on ...

  14. Assessing power grid reliability using rare event simulation

    OpenAIRE

    Wadman, Wander

    2015-01-01

    Renewable energy generators such as wind turbines and solar panels supply more and more power in modern electrical grids. Although the transition to a sustainable power supply is desirable, considerable implementation of distributed and intermittent generators may strain the power grid. Since grid operators are responsible for a highly reliable power grid, they want to estimate to what extent violations of grid stability constraints occur. To assess grid reliability over a period of interest,...
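    Rare-event simulation of the kind described can be sketched with a toy importance-sampling estimator for a Gaussian tail probability (the actual thesis concerns grid-constraint violations; the density shift below is the standard exponential-tilting trick, not the author's specific method):

    ```python
    import math, random

    def rare_event_prob(threshold, n_samples, shift, seed=1):
        """Importance-sampling estimate of P(X > threshold) for X ~ N(0, 1):
        sample from the shifted density N(shift, 1) so the rare region is hit often,
        and reweight each hit by the likelihood ratio
        phi(x) / phi(x - shift) = exp(-shift * x + shift**2 / 2)."""
        rng = random.Random(seed)
        total = 0.0
        for _ in range(n_samples):
            x = rng.gauss(shift, 1.0)
            if x > threshold:
                total += math.exp(-shift * x + 0.5 * shift * shift)
        return total / n_samples
    ```

    Crude Monte Carlo would need billions of samples to see a single event at this probability level (~3e-7); with the proposal mean placed at the threshold, a few hundred thousand samples give a tight estimate.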

  15. Air-chemistry "turbulence": power-law scaling and statistical regularity

    Directory of Open Access Journals (Sweden)

    H.-m. Hsu

    2011-08-01

    Full Text Available With the intent to gain further knowledge of the spectral structures and statistical regularities of surface atmospheric chemistry, the chemical gases (NO, NO2, NOx, CO, SO2, and O3) and aerosol (PM10) measured at 74 air quality monitoring stations over the island of Taiwan are analyzed for the year 2004 at hourly resolution. They represent a range of surface air quality with a mixed combination of geographic settings, including urban/rural, coastal/inland, plain/hill, and industrial/agricultural locations. In addition to the well-known semi-diurnal and diurnal oscillations, weekly and intermediate (20-30 day) peaks are also identified with the continuous wavelet transform (CWT). The spectra indicate power-law scaling regions for frequencies higher and lower than the diurnal, with average exponents of -5/3 and -1, respectively. These dual exponents are corroborated by those from detrended fluctuation analysis in the corresponding time-lag regions. The exponents are mostly independent of the means and standard deviations of the time series measured at the various geographic settings, i.e., of the spatial inhomogeneities; in other words, they possess dominant universal structures. After the spectral coefficients from the CWT decomposition are grouped according to spectral band and inverted separately, the PDFs of the reconstructed time series for the high-frequency band consistently exhibit an interesting statistical regularity: -3 power-law scaling in the heavy tails. Such spectral peaks, dual-exponent structures, and power-law scaling in heavy tails are important structural information, but their relation to turbulence and mesoscale variability requires further investigation. This could lead to a better understanding of the processes controlling air quality.
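    A crude way to estimate such spectral exponents from a time series is a log-log least-squares fit to the periodogram; this is a simplification of the CWT/DFA machinery used in the study (names and details are ours):

    ```python
    import numpy as np

    def spectral_exponent(x, dt=1.0):
        """Least-squares slope of log power vs log frequency of the periodogram,
        so that spectrum ~ f**slope (e.g. about -5/3 in an inertial range).
        A crude estimator: real applications would bin/average the spectrum first."""
        x = np.asarray(x, dtype=float)
        freqs = np.fft.rfftfreq(x.size, dt)[1:]              # drop the DC bin
        power = np.abs(np.fft.rfft(x - x.mean()))[1:] ** 2   # raw periodogram
        slope, _ = np.polyfit(np.log(freqs), np.log(power), 1)
        return slope
    ```

    Fitting separate slopes above and below the diurnal frequency would mimic the dual-exponent structure reported in the abstract.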

  16. A generalized model to estimate the statistical power in mitochondrial disease studies involving 2×k tables.

    Directory of Open Access Journals (Sweden)

    Jacobo Pardo-Seco

    Full Text Available BACKGROUND: Mitochondrial DNA (mtDNA) variation (i.e., haplogroups) has been analyzed in regard to a number of multifactorial diseases. The statistical power of a case-control study determines the a priori probability of rejecting the null hypothesis of homogeneity between cases and controls. METHODS/PRINCIPAL FINDINGS: We critically review previous approaches to the estimation of statistical power, which are based on the restricted scenario where the number of cases equals the number of controls, and propose a methodology that broadens procedures to more general situations. We developed statistical procedures that consider different disease scenarios, variable sample sizes in cases and controls, and variable numbers of haplogroups and effect sizes. The results indicate that the statistical power of a particular study can improve substantially by increasing the number of controls with respect to cases. In the opposite direction, the power decreases substantially when testing a growing number of haplogroups. We developed mitPower (http://bioinformatics.cesga.es/mitpower/), a web-based interface that implements the new statistical procedures and allows for the computation of the a priori statistical power in variable scenarios of case-control study designs, or, e.g., of the number of controls needed to reach fixed effect sizes. CONCLUSIONS/SIGNIFICANCE: The present study provides statistical procedures for the computation of statistical power in common as well as complex case-control study designs involving 2×k tables, with special (but not exclusive) application to mtDNA studies. In order to reach a wide range of researchers, we also provide a friendly web-based tool, mitPower, that can be used in both retrospective and prospective case-control disease studies.
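    The a priori power computation for a 2×k table can be sketched with the usual noncentral chi-square approximation (our parameterization, using Cohen's effect size w; mitPower's internals may differ):

    ```python
    from scipy.stats import chi2, ncx2

    def power_2xk(effect_w, n_cases, n_controls, k, alpha=0.05):
        """Approximate power of the Pearson chi-square test on a 2 x k table,
        using a noncentral chi-square with noncentrality lambda = w**2 * N."""
        df = k - 1
        ncp = effect_w**2 * (n_cases + n_controls)
        crit = chi2.ppf(1.0 - alpha, df)          # rejection threshold under H0
        return 1.0 - ncx2.cdf(crit, df, ncp)      # P(reject) under the alternative
    ```

    Consistent with the abstract, holding w and the total sample size fixed while increasing k (more haplogroups) raises the degrees of freedom and lowers the power.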

  17. Safety Assessment - Swedish Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Kjellstroem, B. [Luleaa Univ. of Technology (Sweden)

    1996-12-31

    After the reactor accident at Three Mile Island, the Swedish nuclear power plants were equipped with filtered venting of the containment. Several types of accidents can be identified in which the filtered venting has no effect on the radioactive release. The probability of such accidents is hopefully very small; it is not possible, however, to estimate this probability accurately. Experience gained in recent years, documented in official reports from the Nuclear Power Inspectorate, indicates that the probability of core melt accidents in Swedish reactors may be significantly larger than estimated earlier. A probability of up to one in a thousand operating years cannot be excluded. There are so far no indications that aging of the plants has contributed to an increased accident risk. Maintaining the safety level of aging nuclear power plants can, however, be expected to be increasingly difficult. It is concluded that the 12 Swedish plants remain a major threat of severe radioactive pollution of the Swedish environment despite the measures taken since 1980 to improve their safety. Closing the nuclear power plants is the only way to eliminate this threat. It is recommended that until this is done, quantitative safety goals, the same for all Swedish plants, be defined and strictly enforced. It is also recommended that utilities distributing misleading information about nuclear power risks have their operating licenses withdrawn. 37 refs.

  18. Safety Assessment - Swedish Nuclear Power Plants

    International Nuclear Information System (INIS)

    After the reactor accident at Three Mile Island, the Swedish nuclear power plants were equipped with filtered venting of the containment. Several types of accidents can be identified in which the filtered venting has no effect on the radioactive release. The probability of such accidents is hopefully very small; it is not possible, however, to estimate this probability accurately. Experience gained in recent years, documented in official reports from the Nuclear Power Inspectorate, indicates that the probability of core melt accidents in Swedish reactors may be significantly larger than estimated earlier. A probability of up to one in a thousand operating years cannot be excluded. There are so far no indications that aging of the plants has contributed to an increased accident risk. Maintaining the safety level of aging nuclear power plants can, however, be expected to be increasingly difficult. It is concluded that the 12 Swedish plants remain a major threat of severe radioactive pollution of the Swedish environment despite the measures taken since 1980 to improve their safety. Closing the nuclear power plants is the only way to eliminate this threat. It is recommended that until this is done, quantitative safety goals, the same for all Swedish plants, be defined and strictly enforced. It is also recommended that utilities distributing misleading information about nuclear power risks have their operating licenses withdrawn. 37 refs

  19. Case Studies for the Statistical Design of Experiments Applied to Powered Rotor Wind Tunnel Tests

    Science.gov (United States)

    Overmeyer, Austin D.; Tanner, Philip E.; Martin, Preston B.; Commo, Sean A.

    2015-01-01

    The application of statistical Design of Experiments (DOE) to helicopter wind tunnel testing was explored during two powered rotor wind tunnel entries during the summers of 2012 and 2013. These tests were performed jointly by the U.S. Army Aviation Development Directorate Joint Research Program Office and the NASA Rotary Wing Project Office, currently the Revolutionary Vertical Lift Project, at NASA Langley Research Center in Hampton, Virginia. Both entries were conducted in the 14- by 22-Foot Subsonic Tunnel, with a small portion of the overall tests devoted to developing case studies of the DOE approach as it applies to powered rotor testing. A 16- to 47-fold reduction in the number of data points required was estimated by comparing the DOE approach to conventional testing methods. The average error of the DOE surface response model for the OH-58F test was 0.95 percent for drag and 4.06 percent for download. The DOE surface response model of the Active Flow Control test captured the drag within 4.1 percent of measured data. The operational differences between the two testing approaches are identified; they did not prevent the safe operation of the powered rotor model throughout the DOE test matrices.

  20. Factors influencing the statistical power of complex data analysis protocols for molecular signature development from microarray data.

    Directory of Open Access Journals (Sweden)

    Constantin F Aliferis

    Full Text Available BACKGROUND: Critical to the development of molecular signatures from microarray and other high-throughput data is testing the statistical significance of the produced signature in order to ensure its statistical reproducibility. While current best practices emphasize sufficiently powered univariate tests of differential expression, little is known about the factors that affect the statistical power of complex multivariate analysis protocols for high-dimensional molecular signature development. METHODOLOGY/PRINCIPAL FINDINGS: We show that choices of specific components of the analysis (i.e., error metric, classifier, error estimator and event balancing) have large and compounding effects on statistical power. The effects are demonstrated empirically by an analysis of 7 of the largest microarray cancer outcome prediction datasets and supplementary simulations, and by contrasting them to prior analyses of the same data. CONCLUSIONS/SIGNIFICANCE: The findings of the present study have two important practical implications. First, by avoiding under-powered data analysis protocols, high-throughput studies can achieve substantial economies in the sample sizes required to demonstrate statistical significance of a predictive signal. Factors that affect power are identified and studied, and much smaller samples than previously thought may be sufficient for exploratory studies as long as these factors are taken into consideration when designing and executing the analysis. Second, previous highly cited claims that microarray assays may not be able to predict disease outcomes better than chance are shown by our experiments to be due to under-powered data analysis combined with inappropriate statistical tests.

  1. Knowledge based system for fouling assessment of power plant boiler

    International Nuclear Information System (INIS)

    The paper presents the design of an expert system for fouling assessment in power plant boilers. It is an on-line expert system based on selected criteria for the fouling assessment. Using criteria for fouling assessment based on 'clean' and 'not-clean' radiation heat flux measurements, the diagnostic variable are defined for the boiler heat transfer surface. The development of the prototype knowledge-based system for fouling assessment in power plants boiler comprise the integrations of the elements including knowledge base, inference procedure and prototype configuration. Demonstration of the prototype knowledge-based system for fouling assessment was performed on the Sines power plant. It is a 300 MW coal fired power plant. 12 fields are used with 3 on each side of boiler

  2. Assessing record linkage between health care and Vital Statistics databases using deterministic methods

    OpenAIRE

    Quan Hude; Li Bing; Fong Andrew; Lu Mingshan

    2006-01-01

    Abstract Background We assessed the linkage and correct linkage rate using deterministic record linkage among three commonly used Canadian databases, namely, the population registry, hospital discharge data and Vital Statistics registry. Methods Three combinations of four personal identifiers (surname, first name, sex and date of birth) were used to determine the optimal combination. The correct linkage rate was assessed using a unique personal health number available in all three databases. ...
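    Deterministic linkage of this kind reduces to exact agreement on a chosen combination of identifiers; a minimal sketch with hypothetical field names (real studies would also normalize spellings and validate against a gold-standard key such as the personal health number):

    ```python
    def deterministic_link(left, right, keys):
        """Link records that agree exactly on all identifiers in `keys`.
        Returns a list of (left_record, right_record) pairs."""
        index = {}
        for rec in right:
            index.setdefault(tuple(rec[k] for k in keys), []).append(rec)
        pairs = []
        for rec in left:
            for match in index.get(tuple(rec[k] for k in keys), []):
                pairs.append((rec, match))
        return pairs
    ```

    Trying several key combinations (e.g. surname + date of birth + sex versus first name + date of birth + sex) and comparing the resulting linkage rates is essentially the experiment the abstract describes.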

  3. Business Statistics and Management Science Online: Teaching Strategies and Assessment of Student Learning

    Science.gov (United States)

    Sebastianelli, Rose; Tamimi, Nabil

    2011-01-01

    Given the expected rise in the number of online business degrees, issues regarding quality and assessment in online courses will become increasingly important. The authors focus on the suitability of online delivery for quantitative business courses, specifically business statistics and management science. They use multiple approaches to assess…

  4. Assessing Statistical Change Indices in Selected Social Work Intervention Research Studies

    Science.gov (United States)

    Ham, Amanda D.; Huggins-Hoyt, Kimberly Y.; Pettus, Joelle

    2016-01-01

    Objectives: This study examined how evaluation and intervention research (IR) studies assessed statistical change to ascertain effectiveness. Methods: Studies from six core social work journals (2009-2013) were reviewed (N = 1,380). Fifty-two evaluation (n= 27) and intervention (n = 25) studies met the inclusion criteria. These studies were…

  5. QQ-plots for assessing distributions of biomarker measurements and generating defensible summary statistics

    Science.gov (United States)

    One of the main uses of biomarker measurements is to compare different populations to each other and to assess risk in comparison to established parameters. This is most often done using summary statistics such as central tendency, variance components, confidence intervals, excee...

  6. Sample Size Requirements for Assessing Statistical Moments of Simulated Crop Yield Distributions

    NARCIS (Netherlands)

    Lehmann, N.; Finger, R.; Klein, T.; Calanca, P.

    2013-01-01

    Mechanistic crop growth models are becoming increasingly important in agricultural research and are extensively used in climate change impact assessments. In such studies, statistics of crop yields are usually evaluated without the explicit consideration of sample size requirements. The purpose of t

  7. Impact of Statistical Learning Methods on the Predictive Power of Multivariate Normal Tissue Complication Probability Models

    International Nuclear Information System (INIS)

    Purpose: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. Methods and Materials: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. Results: It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. Conclusions: The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended.

  8. Statistical power calculation and sample size determination for environmental studies with data below detection limits

    Science.gov (United States)

    Shao, Quanxi; Wang, You-Gan

    2009-09-01

    Power calculation and sample size determination are critical in designing environmental monitoring programs. The traditional approach based on comparing the mean values may become statistically inappropriate and even invalid when substantial proportions of the response values are below the detection limits or censored because strong distributional assumptions have to be made on the censored observations when implementing the traditional procedures. In this paper, we propose a quantile methodology that is robust to outliers and can also handle data with a substantial proportion of below-detection-limit observations without the need of imputing the censored values. As a demonstration, we applied the methods to a nutrient monitoring project, which is a part of the Perth Long-Term Ocean Outlet Monitoring Program. In this example, the sample size required by our quantile methodology is, in fact, smaller than that by the traditional t-test, illustrating the merit of our method.
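    A quantile-based test is robust to below-detection-limit censoring because only an observation's position relative to the threshold matters, not its exact value. The Monte Carlo power sketch below illustrates this with a simple sign test on lognormal data; it is our simplified stand-in for the paper's quantile methodology, not the authors' procedure:

    ```python
    import math, random

    def sign_test_power(n, log_median, limit, detection_limit, n_sim=2000, seed=7):
        """Monte Carlo power of a one-sided sign test that the population median
        of a lognormal exceeds `limit`. Values below `detection_limit` are censored,
        but the test is unaffected as long as detection_limit <= limit."""
        rng = random.Random(seed)
        # normal-approximation critical count for Binomial(n, 1/2), ~5% one-sided
        crit = int(n / 2 + 1.645 * math.sqrt(n / 4.0)) + 1
        rejections = 0
        for _ in range(n_sim):
            above = 0
            for _ in range(n):
                x = rng.lognormvariate(log_median, 1.0)  # true median = exp(log_median)
                x_obs = max(x, detection_limit)          # censoring: only "< DL" is known
                if x_obs > limit:
                    above += 1
            if above >= crit:
                rejections += 1
        return rejections / n_sim
    ```

    Running the function under the null (log_median = 0, so the median equals the limit) recovers roughly the nominal 5% rejection rate, while a shifted median gives high power, with no imputation of the censored values required.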

  9. Impact of Statistical Learning Methods on the Predictive Power of Multivariate Normal Tissue Complication Probability Models

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Schilstra, Cornelis; Langendijk, Johannes A.; Veld, Aart A. van' t [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands)

    2012-03-15

    Purpose: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. Methods and Materials: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. Results: It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. Conclusions: The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended.

  10. Assessment and financing of electric power projects

    International Nuclear Information System (INIS)

    The aim of the appraisal of a project is to examine the economic need which a project is designed to meet, to judge whether the project is likely to meet this need in an efficient way, and to conclude what conditions should be attached to eventual Bank financing. Bank involvement continues throughout the life of the project helping to ensure that each project is carried out at the least possible cost and that it makes the expected contribution to the country's development. This paper gives an idea about the origin, nature and functions of the World Bank Group, describes the criteria used by the Bank in its power project appraisals, discusses the Bank's views on nuclear power, and concludes with a review of past lending and probable future sources of financing of electrical expansion in the less developed countries. (orig./UA)

  11. TIDAL POWER: Economic and Technological Assessment

    OpenAIRE

    Montllonch Araquistain, Tatiana

    2010-01-01

    At present there is concern over global climate change, as well as growing awareness among the worldwide population of the need to reduce greenhouse gas emissions. This, in fact, has led to an increase in power generation from renewable sources. Tidal energy has the potential to play a valuable role in a sustainable energy future. Its main advantage over other renewable sources is its predictability; tides can be predicted years in advance. The energy extracted from the tides can come fr...

  12. Robust Statistical Tests of Dragon-Kings beyond Power Law Distributions

    CERN Document Server

    Pisarenko, V F

    2011-01-01

    We ask the question whether it is possible to diagnose the existence of "Dragon-Kings" (DK), namely anomalous observations compared to a power law background distribution of event sizes. We present two new statistical tests, the U-test and the DK-test, aimed at identifying the existence of even a single anomalous event in the tail of the distribution of just a few tens of observations. The DK-test in particular is derived such that the p-value of its statistic is independent of the exponent characterizing the null hypothesis. We demonstrate how to apply these two tests on the distributions of cities and of agglomerations in a number of countries. We find the following evidence for Dragon-Kings: London in the distribution of city sizes of Great Britain; Moscow and St-Petersburg in the distribution of city sizes in the Russian Federation; and Paris in the distribution of agglomeration sizes in France. True negatives are also reported, for instance the absence of Dragon-Kings in the distribution of cities in Ger...

  13. Detecting temporal change in freshwater fisheries surveys: statistical power and the important linkages between management questions and monitoring objectives

    Science.gov (United States)

    Wagner, Tyler; Irwin, Brian J.; James R. Bence,; Daniel B. Hayes,

    2016-01-01

    Monitoring to detect temporal trends in biological and habitat indices is a critical component of fisheries management. Thus, it is important that management objectives are linked to monitoring objectives. This linkage requires a definition of what constitutes a management-relevant “temporal trend.” It is also important to develop expectations for the amount of time required to detect a trend (i.e., statistical power) and for choosing an appropriate statistical model for analysis. We provide an overview of temporal trends commonly encountered in fisheries management, review published studies that evaluated statistical power of long-term trend detection, and illustrate dynamic linear models in a Bayesian context, as an additional analytical approach focused on shorter term change. We show that monitoring programs generally have low statistical power for detecting linear temporal trends and argue that often management should be focused on different definitions of trends, some of which can be better addressed by alternative analytical approaches.
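    The statistical power of linear-trend detection can be explored by simulation. The sketch below estimates the rejection rate of the OLS slope test for an annual index; it is our illustration (using a normal critical value as a simple stand-in for the exact t quantile), not the analyses reviewed in the paper:

    ```python
    import math, random

    def trend_power(n_years, slope, sigma, n_sim=2000, seed=3):
        """Monte Carlo power of the two-sided OLS slope test for a linear trend
        in an annual index with i.i.d. Gaussian noise of s.d. `sigma`."""
        rng = random.Random(seed)
        t = list(range(n_years))
        t_mean = (n_years - 1) / 2.0
        sxx = sum((ti - t_mean) ** 2 for ti in t)
        crit = 1.96  # ~5% two-sided, normal approximation to the t quantile
        hits = 0
        for _ in range(n_sim):
            y = [slope * ti + rng.gauss(0.0, sigma) for ti in t]
            y_mean = sum(y) / n_years
            b = sum((ti - t_mean) * (yi - y_mean) for ti, yi in zip(t, y)) / sxx
            resid = sum((yi - y_mean - b * (ti - t_mean)) ** 2 for ti, yi in zip(t, y))
            se = math.sqrt(resid / (n_years - 2) / sxx)
            if abs(b / se) > crit:
                hits += 1
        return hits / n_sim
    ```

    With a modest trend relative to interannual noise, a 10-year series has very low power while a 40-year series detects the same slope reliably, which is consistent with the paper's argument that monitoring programs generally have low power for short-term trend detection.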

  14. Automatic Assessment of Pathological Voice Quality Using Higher-Order Statistics in the LPC Residual Domain

    Directory of Open Access Journals (Sweden)

    JiYeoun Lee

    2009-01-01

    Full Text Available A preprocessing scheme based on the linear prediction coefficient (LPC) residual is applied to higher-order statistics (HOSs) for automatic assessment of overall pathological voice quality. The normalized skewness and kurtosis are estimated from the LPC residual and show statistically meaningful distributions that characterize pathological voice quality. A total of 83 voice samples of the sustained vowel /a/ phonation are used in this study, each independently assessed by a speech and language therapist (SALT) according to the grade of severity of dysphonia on the GRBAS scale. These are used to train and test a classification and regression tree (CART). The best result is obtained using an optimal decision tree implemented with a combination of the normalized skewness and kurtosis, with an accuracy of 92.9%. It is concluded that the method can be used as an assessment tool, providing a valuable aid to the SALT during clinical evaluation of overall pathological voice quality.
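    The normalized skewness and kurtosis features are ordinary standardized central moments; a minimal sketch computing them from an already-extracted residual sequence (the LPC inverse-filtering step that produces the residual is omitted here):

    ```python
    def skew_kurtosis(xs):
        """Normalized third and fourth central moments of a sequence.
        For a Gaussian signal, skewness -> 0 and kurtosis -> 3."""
        n = len(xs)
        m = sum(xs) / n
        m2 = sum((x - m) ** 2 for x in xs) / n
        m3 = sum((x - m) ** 3 for x in xs) / n
        m4 = sum((x - m) ** 4 for x in xs) / n
        return m3 / m2 ** 1.5, m4 / m2 ** 2
    ```

    Departures of the residual from Gaussian-like symmetry (nonzero skewness, shifted kurtosis) are the kind of distributional cue such classifiers exploit.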

  15. Decision tree approach to power systems security assessment

    OpenAIRE

    Wehenkel, Louis; Pavella, Mania

    1993-01-01

    An overview of the general decision tree approach to power system security assessment is presented. The general decision tree methodology is outlined, modifications proposed in the context of transient stability assessment are embedded, and further refinements are considered. The approach is then suitably tailored to handle other specifics of power systems security, relating to both preventive and emergency voltage control, in addition to transient stability. Trees are accordingly built in th...

  16. Nuclear power plant performance statistics. Comparison with fossil-fired units

    International Nuclear Information System (INIS)

    The joint UNIPEDE/World Energy Conference (WEC) Committee on Availability of Thermal Generating Plants has a mandate to study the availability of thermal plants and the different factors that influence it. This has led to the collection of availability and unavailability factors, published every third year at the Congress of the WEC, for use in systems reliability studies and in operations and maintenance planning. For nuclear power plants the joint UNIPEDE/WEC Committee relies on the IAEA to provide availability and unavailability data. The IAEA has published an annual report with operating data from nuclear plants in its Member States since 1971, covering in addition back data from the early 1960s. These reports have developed over the years, and in the early 1970s the format was brought into close conformity with that used by UNIPEDE and the WEC to report the performance of fossil-fired generating plants. Since 1974 an annual analytical summary report has been prepared. In 1981 all information on operating experience with nuclear power plants was placed in a computer file for easier reference. The computerized Power Reactor Information System (PRIS) ensures that data are easily retrievable, and at its present level it remains compatible with various national systems. The objectives of the IAEA data collection and evaluation have developed significantly since 1970. At first, the IAEA primarily wanted to enable the individual power plant operator to compare the performance of his own plant with that of others of the same type; when enough data had been collected, they provided the basis for assessment of the fundamental performance parameters used in economic project studies; now, the database merits being used in setting availability objectives for power plant operations. (author)

  17. Statistical analysis of wind power in the region of Veracruz (Mexico)

    Energy Technology Data Exchange (ETDEWEB)

    Cancino-Solorzano, Yoreley [Departamento de Ing Electrica-Electronica, Instituto Tecnologico de Veracruz, Calzada Miguel A. de Quevedo 2779, 91860 Veracruz (Mexico); Xiberta-Bernat, Jorge [Departamento de Energia, Escuela Tecnica Superior de Ingenieros de Minas, Universidad de Oviedo, C/Independencia, 13, 2a Planta, 33004 Oviedo (Spain)

    2009-06-15

    The capacity of the Mexican electricity sector faces the challenge of satisfying the 80 GW demand forecast for 2016. This value implies a steady yearly average increase of some 4.9%. The capacity additions for the next eight years will mainly consist of combined cycle power plants, which could be a threat to the energy supply of the country, given that the country is not self-sufficient in natural gas. As an alternative, the wind energy resource could be a more suitable option than combined cycle power plants. This option is backed by market trends indicating that wind technology costs will continue to decrease in the near future, as has happened in recent years. The wind potential in different areas of the country must be evaluated in order to make the best possible use of this option. This paper gives a statistical analysis of the wind characteristics in the region of Veracruz. The daily, monthly and annual wind speed values have been studied together with the prevailing directions. The data analyzed correspond to five meteorological stations and two anemometric stations located in the aforementioned area. (author)
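    Wind-speed records of this kind are conventionally summarized with a two-parameter Weibull distribution; a sketch using synthetic data in place of the station records (the shape 2.0 and scale 7.0 m/s are arbitrary illustration values, not Veracruz results):

    ```python
    import math
    import numpy as np
    from scipy.stats import weibull_min

    # hypothetical hourly wind-speed sample (m/s) standing in for station data
    speeds = weibull_min.rvs(2.0, loc=0.0, scale=7.0, size=5000, random_state=42)

    # fit a two-parameter Weibull (location fixed at zero), the usual wind-resource model
    shape, loc, scale = weibull_min.fit(speeds, floc=0)

    # analytic Weibull mean: scale * Gamma(1 + 1/shape)
    mean_speed = scale * math.gamma(1.0 + 1.0 / shape)
    ```

    The fitted shape parameter describes the variability of the wind regime (values near 2 are typical of many sites), and the scale parameter tracks the mean speed, which feeds directly into energy-yield estimates.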

  18. Transient stability risk assessment of power systems incorporating wind farms

    DEFF Research Database (Denmark)

    Miao, Lu; Fang, Jiakun; Wen, Jinyu;

    2013-01-01

    Large-scale wind farm integration has brought several aspects of challenges to the transient stability of power systems. This paper focuses on the research of the transient stability of power systems incorporating with wind farms by utilizing risk assessment methods. The detailed model of double ...

  19. Assessing the relative effectiveness of statistical downscaling and distribution mapping in reproducing rainfall statistics based on climate model results

    Science.gov (United States)

    Langousis, Andreas; Mamalakis, Antonios; Deidda, Roberto; Marrocu, Marino

    2016-01-01

    To improve the skill of climate models (CMs) in reproducing the statistics of daily rainfall at a basin level, two types of statistical approaches have been suggested. One is statistical correction of CM rainfall outputs based on historical series of precipitation. The other, usually referred to as statistical rainfall downscaling, is the use of stochastic models to conditionally simulate rainfall series based on large-scale atmospheric forcing from CMs. While promising, the latter approach has attracted less attention in recent years, since the downscaling schemes developed involved complex weather identification procedures while demonstrating limited success in reproducing several statistical features of rainfall. In a recent effort, Langousis and Kaleris () developed a statistical framework for simulation of daily rainfall intensities conditional on upper-air variables, which is simpler to implement and more accurately reproduces several statistical properties of actual rainfall records. Here we study the relative performance of: (a) direct statistical correction of CM rainfall outputs using nonparametric distribution mapping, and (b) the statistical downscaling scheme of Langousis and Kaleris (), in reproducing the historical rainfall statistics, including rainfall extremes, at a regional level. This is done for an intermediate-sized catchment in Italy, the Flumendosa catchment, using rainfall and atmospheric data from four CMs of the ENSEMBLES project. The obtained results are promising, since the proposed downscaling scheme is more accurate and robust in reproducing a number of historical rainfall statistics, independent of the CM used and the characteristics of the calibration period. This is particularly the case for yearly rainfall maxima.
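Nonparametric distribution mapping, approach (a) above, can be sketched as an empirical quantile-mapping correction: each model value is sent to its quantile in the model reference sample, and that quantile of the observed sample is returned. The reference samples below are toy values, not the Flumendosa records.

```python
def quantile_map(cm_values, cm_ref, obs_ref):
    """Map each climate-model value to the observed distribution:
    find its empirical quantile in the model reference sample, then
    take the same quantile of the observed reference sample."""
    cm_sorted = sorted(cm_ref)
    obs_sorted = sorted(obs_ref)
    n = len(cm_sorted)

    def ecdf(x):
        # fraction of reference model values <= x
        return sum(1 for v in cm_sorted if v <= x) / n

    def obs_quantile(p):
        # empirical quantile of the observed sample (nearest rank)
        i = min(len(obs_sorted) - 1, max(0, round(p * len(obs_sorted)) - 1))
        return obs_sorted[i]

    return [obs_quantile(ecdf(x)) for x in cm_values]

# Toy example: the model is systematically too dry by a factor ~2.
obs_ref = [0.0, 1.0, 2.0, 4.0, 8.0, 16.0]
cm_ref = [0.0, 0.5, 1.0, 2.0, 4.0, 8.0]
corrected = quantile_map([0.5, 4.0], cm_ref, obs_ref)
print(corrected)
```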

  20. Reliability and statistical power analysis of cortical and subcortical FreeSurfer metrics in a large sample of healthy elderly.

    Science.gov (United States)

    Liem, Franziskus; Mérillat, Susan; Bezzola, Ladina; Hirsiger, Sarah; Philipp, Michel; Madhyastha, Tara; Jäncke, Lutz

    2015-03-01

    FreeSurfer is a tool to quantify cortical and subcortical brain anatomy automatically and noninvasively. Previous studies have reported reliability and statistical power analyses in relatively small samples or only selected one aspect of brain anatomy. Here, we investigated reliability and statistical power of cortical thickness, surface area, volume, and the volume of subcortical structures in a large sample (N=189) of healthy elderly subjects (64+ years). Reliability (intraclass correlation coefficient) of cortical and subcortical parameters is generally high (cortical: ICCs>0.87, subcortical: ICCs>0.95). Surface-based smoothing increases reliability of cortical thickness maps, while it decreases reliability of cortical surface area and volume. Nevertheless, statistical power of all measures benefits from smoothing. When aiming to detect a 10% difference between groups, the number of subjects required to test effects with sufficient power over the entire cortex varies between cortical measures (cortical thickness: N=39, surface area: N=21, volume: N=81; 10mm smoothing, power=0.8, α=0.05). For subcortical regions this number is between 16 and 76 subjects, depending on the region. We also demonstrate the advantage of within-subject designs over between-subject designs. Furthermore, we publicly provide a tool that allows researchers to perform a priori power analysis and sensitivity analysis to help evaluate previously published studies and to design future studies with sufficient statistical power.
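A back-of-envelope version of such an a priori power analysis, using the normal approximation for a two-group comparison; the cortical-thickness mean and SD below are assumed placeholder values, not the study's estimates.

```python
import math
from statistics import NormalDist

def n_per_group(delta, sd, power=0.8, alpha=0.05):
    """Approximate subjects per group for a two-sample, two-sided z test
    to detect a mean difference `delta` with within-group SD `sd`."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # critical value
    z_b = NormalDist().inv_cdf(power)          # power quantile
    return math.ceil(2 * ((z_a + z_b) * sd / delta) ** 2)

# Assumed placeholder values: mean cortical thickness 2.5 mm, so a 10%
# group difference is delta = 0.25 mm; between-subject SD 0.15 mm.
n = n_per_group(delta=0.25, sd=0.15)
print(n)  # subjects per group
```

Doubling the SD roughly quadruples the required sample, which is why the per-measure numbers in the abstract differ so much.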

  1. Evaluation of a regional monitoring program's statistical power to detect temporal trends in forest health indicators

    Science.gov (United States)

    Perles, Stephanie J.; Wagner, Tyler; Irwin, Brian J.; Manning, Douglas R.; Callahan, Kristina K.; Marshall, Matthew R.

    2014-01-01

    Forests are socioeconomically and ecologically important ecosystems that are exposed to a variety of natural and anthropogenic stressors. As such, monitoring forest condition and detecting temporal changes therein remain critical to sound public and private forestland management. The National Park Service's Vital Signs monitoring program collects information on many forest health indicators, including species richness, cover by exotics, browse pressure, and forest regeneration. We applied a mixed-model approach to partition variability in data for 30 forest health indicators collected from several national parks in the eastern United States. We then used the estimated variance components in a simulation model to evaluate trend detection capabilities for each indicator. We investigated the extent to which the following factors affected the ability to detect trends: (a) sample design: using a simple panel versus a connected panel design, (b) effect size: increasing trend magnitude, (c) sample size: varying the number of plots sampled each year, and (d) stratified sampling: post-stratifying plots into vegetation domains. Statistical power varied among indicators; however, indicators that measured the proportion of a total yielded higher power when compared to indicators that measured absolute or average values. In addition, the total variability for an indicator appeared to influence power to detect temporal trends more than how total variance was partitioned among spatial and temporal sources. Based on these analyses and the monitoring objectives of the Vital Signs program, the current sampling design is likely overly intensive for detecting a 5% trend per year for all indicators and is appropriate for detecting a 1% trend per year in most indicators.
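The simulation-based power evaluation described above can be sketched as follows: simulate annual indicator means with a known trend, fit an ordinary least-squares slope, and count how often it is declared significant. This uses a z approximation, and the trend and noise values are invented, not the parks' estimated variance components.

```python
import math
import random
from statistics import NormalDist

def trend_power(n_years, n_sims, trend, sd, alpha=0.05, seed=1):
    """Monte Carlo power: simulate annual means with a linear trend plus
    Gaussian noise, fit an OLS slope, and count significant results
    (normal approximation to the slope test)."""
    rng = random.Random(seed)
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)
    years = list(range(n_years))
    xbar = sum(years) / n_years
    sxx = sum((t - xbar) ** 2 for t in years)
    hits = 0
    for _ in range(n_sims):
        y = [trend * t + rng.gauss(0.0, sd) for t in years]
        ybar = sum(y) / n_years
        slope = sum((t - xbar) * (v - ybar) for t, v in zip(years, y)) / sxx
        resid = [v - ybar - slope * (t - xbar) for t, v in zip(years, y)]
        se = math.sqrt(sum(r * r for r in resid) / (n_years - 2) / sxx)
        if abs(slope) / se > z_crit:
            hits += 1
    return hits / n_sims

# Invented scale: indicator baseline 1.0, year-to-year noise SD 0.10.
strong = trend_power(n_years=15, n_sims=400, trend=0.05, sd=0.10)
weak = trend_power(n_years=15, n_sims=400, trend=0.01, sd=0.10)
print(strong, weak)
```

The same skeleton extends to panel designs by drawing plot- and year-level variance components instead of a single noise term.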

  2. The Statistical Analysis and Assessment of the Solvency of Forest Enterprises

    Directory of Open Access Journals (Sweden)

    Vyniatynska Liudmila V.

    2016-05-01

    Full Text Available The aim of the article is to conduct a statistical analysis of the solvency of forest enterprises through a system of statistical indicators using the sampling method (the sampling is based on the forest cover percentage of the regions of Ukraine). Financial statements of forest enterprises, which form a system of information and analytical support for the statistical analysis, were used to analyze and evaluate the level of solvency of forestry in Ukraine for 2009-2015. With the help of the developed recommended values, the results of the statistical analysis of the forest enterprises' solvency under conditions of self-financing and commercial accounting have been summarized and systematized. Using a methodology of statistical analysis built on a relevant conceptual framework that meets current needs, a system of statistical indicators has been calculated that makes it possible to assess the level of solvency of forest enterprises and to identify the reasons for its low level.

  3. Data base of accident and agricultural statistics for transportation risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Saricks, C.L.; Williams, R.G.; Hopf, M.R.

    1989-11-01

    A state-level data base of accident and agricultural statistics has been developed to support risk assessment for transportation of spent nuclear fuels and high-level radioactive wastes. This data base will enhance the modeling capabilities for more route-specific analyses of potential risks associated with transportation of these wastes to a disposal site. The data base and methodology used to develop state-specific accident and agricultural data bases are described, and summaries of accident and agricultural statistics are provided. 27 refs., 9 tabs.

  4. Data base of accident and agricultural statistics for transportation risk assessment

    International Nuclear Information System (INIS)

    A state-level data base of accident and agricultural statistics has been developed to support risk assessment for transportation of spent nuclear fuels and high-level radioactive wastes. This data base will enhance the modeling capabilities for more route-specific analyses of potential risks associated with transportation of these wastes to a disposal site. The data base and methodology used to develop state-specific accident and agricultural data bases are described, and summaries of accident and agricultural statistics are provided. 27 refs., 9 tabs

  5. Probabilistic safety assessment for optimum nuclear power plant life management (PLiM) theory and application of reliability analysis methods for major power plant components

    CERN Document Server

    Arkadov, G V; Rodionov, A N

    2012-01-01

    Probabilistic safety assessment methods are used to calculate nuclear power plant durability and resource lifetime. Successful calculation of the reliability and ageing of components is critical for forecasting safety and directing preventative maintenance, and Probabilistic safety assessment for optimum nuclear power plant life management provides a comprehensive review of the theory and application of these methods. Part one reviews probabilistic methods for predicting the reliability of equipment. Following an introduction to key terminology, concepts and definitions, formal-statistical and various physico-statistical approaches are discussed. Approaches based on the use of defect-free models are considered, along with those using binomial distribution and models bas...

  6. Evaluation and assessment of nuclear power plant seismic methodology

    Energy Technology Data Exchange (ETDEWEB)

    Bernreuter, D.; Tokarz, F.; Wight, L.; Smith, P.; Wells, J.; Barlow, R.

    1977-03-01

    The major emphasis of this study is to develop a methodology that can be used to assess the current methods used for assuring the seismic safety of nuclear power plants. The proposed methodology makes use of system-analysis techniques and Monte Carlo schemes. Also, in this study, we evaluate previous assessments of the current seismic-design methodology.

  7. Hybrid algorithm for rotor angle security assessment in power systems

    Directory of Open Access Journals (Sweden)

    D. Prasad Wadduwage

    2015-08-01

    Full Text Available Transient rotor angle stability assessment and oscillatory rotor angle stability assessment subsequent to a contingency are integral components of dynamic security assessment (DSA) in power systems. This study proposes a hybrid algorithm to determine whether the post-fault power system is secure with respect to both transient rotor angle stability and oscillatory rotor angle stability subsequent to a set of known contingencies. The hybrid algorithm first uses a new security measure, developed based on the concept of Lyapunov exponents (LEs), to determine the transient security of the post-fault power system. The transiently secure power swing curves are then analysed using an improved Prony algorithm, which extracts the dominant oscillatory modes and estimates their damping ratios. The damping ratio is a security measure of the oscillatory security of the post-fault power system subsequent to the contingency. The suitability of the proposed hybrid algorithm for DSA in power systems is illustrated using different contingencies of a 16-generator 68-bus test system and a 50-generator 470-bus test system. The accuracy of the stability conclusions and the acceptable computational burden indicate that the proposed hybrid algorithm is suitable for real-time security assessment with respect to both transient rotor angle stability and oscillatory rotor angle stability under multiple contingencies of the power system.
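As a minimal illustration of the oscillatory-security criterion the abstract describes, the sketch below computes the damping ratio of a single mode from an assumed eigenvalue sigma + j*omega; the mode values are invented, not outputs of the paper's Prony analysis.

```python
import math

def damping_ratio(sigma, omega):
    """Damping ratio of an oscillatory mode s = sigma + j*omega, as a
    Prony-type analysis would estimate it; positive means a decaying
    (secure) oscillation, negative a growing (insecure) one."""
    return -sigma / math.sqrt(sigma ** 2 + omega ** 2)

# Invented example: a 0.5 Hz inter-area mode decaying at sigma = -0.15 1/s.
zeta = damping_ratio(-0.15, 2 * math.pi * 0.5)
print(f"{zeta:.3f}")
```

A security rule would then compare zeta against a minimum acceptable damping threshold (often a few percent) for each dominant mode.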

  8. Assessment of Electric Power Quality in Ships'Modern Systems

    Institute of Scientific and Technical Information of China (English)

    Janusz Mindykowski; XU Xiao-yan

    2004-01-01

    The paper deals with selected problems of electric power quality in ships' modern systems. The introduction briefly presents the fundamentals of electric power quality assessment: first, the relations and consequences among power quality phenomena and indices; second, the methods and tools as well as the appropriate instrumentation. Afterwards, the basic characteristics of power systems on modern ships are given. The main focus of the paper is on the assessment of electric power quality in ships' systems fitted with converter subsystems. The state of the art and current tendencies in the discussed matter are shown. Some chosen experimental results, based on research carried out under the supervision of the author, are also presented. Finally, some concluding issues are briefly commented on.

  9. Probabilistic life assessment of steel components used in power plants

    Energy Technology Data Exchange (ETDEWEB)

    Holicky, M.; Markova, J. (Czech Technical Univ. in Prague, Prague (Czech Republic))

    2010-05-15

    The presented life assessment of steel components used in power plants is based on probabilistic methods of the theory of reliability accepted in the international standards ISO 9324, 13833 and ISO 13823. An example of quick-closing valves in a selected hydroelectric power plant illustrates the general principles of the reliability and life assessment of steel components for a given model of corrosion. It appears that the probabilistic methods provide rational background information for assessing the remaining working life of the components and planning their regular maintenance. The required target reliability level is a key question that remains open. (orig.)

  10. Statistics of 150-km echoes over Jicamarca based on low-power VHF observations

    Directory of Open Access Journals (Sweden)

    J. L. Chau

    2006-07-01

    Full Text Available In this work we summarize the statistics of the so-called 150-km echoes obtained with a low-power VHF radar operating at the Jicamarca Radio Observatory (11.97° S, 76.87° W; 1.3° dip angle at 150-km altitude) in Peru. Our results are based on almost four years of observations between August 2001 and July 2005 (approximately 150 days per year). The majority of the observations were conducted between 08:00 and 17:00 LT. We present the statistics of occurrence of the echoes for each of the four seasons as a function of time of day and altitude. The occurrence frequency of the echoes is ~75% around noon; it starts decreasing after 15:00 LT, and the echoes disappear after 17:00 LT in all seasons. As shown in previous campaign observations, the 150-km echoes appear at a higher altitude (>150 km) in narrow layers in the morning, reach lower altitudes (~135 km) around noon, and disappear at higher altitudes (>150 km) after 17:00 LT. We show that although 150-km echoes are observed all year long, they exhibit a clear seasonal variability in altitudinal coverage and in the percentage of occurrence around noon and early in the morning. We also show that there is a strong day-to-day variability, and no correlation with magnetic activity. Although our results do not solve the 150-km riddle, they should be taken into account when a reasonable theory is proposed.

  11. Assessment of a satellite power system and six alternative technologies

    Energy Technology Data Exchange (ETDEWEB)

    Wolsko, T.; Whitfield, R.; Samsa, M.; Habegger, L.S.; Levine, E.; Tanzman, E.

    1981-04-01

    The satellite power system is assessed in comparison to six alternative technologies. The alternatives are: central-station terrestrial photovoltaic systems, conventional coal-fired power plants, coal-gasification/combined-cycle power plants, light water reactor power plants, liquid-metal fast-breeder reactors, and fusion. The comparison is made regarding issues of cost and performance, health and safety, environmental effects, resources, socio-economic factors, and institutional issues. The criteria for selecting the issues and the alternative technologies are given, and the methodology of the comparison is discussed. Brief descriptions of each of the technologies considered are included. (LEW)

  12. Risk assessment of power systems models, methods, and applications

    CERN Document Server

    Li, Wenyuan

    2014-01-01

    Risk Assessment of Power Systems addresses the regulations and functions of risk assessment with regard to its relevance in system planning, maintenance, and asset management. Brimming with practical examples, this edition introduces the latest risk information on renewable resources, the smart grid, voltage stability assessment, and fuzzy risk evaluation. It is a comprehensive reference of a highly pertinent topic for engineers, managers, and upper-level students who seek examples of risk theory applications in the workplace.

  13. Probabilistic safety assessment in nuclear power plant management

    International Nuclear Information System (INIS)

    Probabilistic Safety Assessment (PSA) techniques have been widely used over the past few years to assist in understanding how engineered systems respond to abnormal conditions, particularly during a severe accident. The use of PSAs in the design and operation of such systems thus contributes to the safety of nuclear power plants. Probabilistic safety assessments can be maintained to provide a continuous up-to-date assessment (Living PSA), supporting the management of plant operations and modifications

  14. National-Scale Wind Resource Assessment for Power Generation (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Baring-Gould, E. I.

    2013-08-01

    This presentation describes the current standards for conducting a national-scale wind resource assessment for power generation, along with the risk/benefit considerations to be weighed when beginning such an assessment. It shows how more modern turbine technology and taller towers change the set of viable wind deployments, and how the Philippines national wind resource assessment evolved over time to reflect these updated technologies.

  15. Transient Stability Assessment of Power System with Large Amount of Wind Power Penetration

    DEFF Research Database (Denmark)

    Liu, Leo; Chen, Zhe; Bak, Claus Leth;

    2012-01-01

    Recently, the security and stability of power systems with a large amount of wind power are concerning issues, especially transient stability. In Denmark, the onshore and offshore wind farms are connected to the distribution system and transmission system, respectively. The control and protection methodologies of onshore and offshore wind farms definitely affect the transient stability of the power system. In this paper, the onshore and offshore wind farms are modeled in detail in order to assess the transient stability of the western Danish power system. Further, the computation of critical clearing time (CCT) in different scenarios is proposed to evaluate the vulnerable areas in the western Danish power system. The result of CCTs in different scenarios can evaluate the impact of wind power on power system transient stability. Besides, some other influencing factors such as the load level of generators in central power...

  16. Blind image quality assessment using statistical independence in the divisive normalization transform domain

    Science.gov (United States)

    Chu, Ying; Mou, Xuanqin; Fu, Hong; Ji, Zhen

    2015-11-01

    We present a general purpose blind image quality assessment (IQA) method using the statistical independence hidden in the joint distributions of divisive normalization transform (DNT) representations for natural images. The DNT simulates the redundancy reduction process of the human visual system and has good statistical independence for natural undistorted images; meanwhile, this statistical independence changes as the images suffer from distortion. Inspired by this, we investigate the changes in statistical independence between neighboring DNT outputs across the space and scale for distorted images and propose an independence uncertainty index as a blind IQA (BIQA) feature to measure the image changes. The extracted features are then fed into a regression model to predict the image quality. The proposed BIQA metric is called statistical independence (STAIND). We evaluated STAIND on five public databases: LIVE, CSIQ, TID2013, IRCCyN/IVC Art IQA, and intentionally blurred background images. The performances are relatively high for both single- and cross-database experiments. When compared with the state-of-the-art BIQA algorithms, as well as representative full-reference IQA metrics, such as SSIM, STAIND shows fairly good performance in terms of quality prediction accuracy, stability, robustness, and computational costs.

  17. A statistical assessment of differences and equivalences between genetically modified and reference plant varieties

    Directory of Open Access Journals (Sweden)

    Amzal Billy

    2011-02-01

    Full Text Available Background: Safety assessment of genetically modified organisms is currently often performed by comparative evaluation. However, natural variation of plant characteristics between commercial varieties is usually not considered explicitly in the statistical computations underlying the assessment. Results: Statistical methods are described for the assessment of the difference between a genetically modified (GM) plant variety and a conventional non-GM counterpart, and for the assessment of the equivalence between the GM variety and a group of reference plant varieties which have a history of safe use. It is proposed to present the results of both difference and equivalence testing for all relevant plant characteristics simultaneously in one or a few graphs, as an aid for further interpretation in safety assessment. A procedure is suggested to derive equivalence limits from the observed results for the reference plant varieties using a specific implementation of the linear mixed model. Three different equivalence tests are defined to classify any result in one of four equivalence classes. The performance of the proposed methods is investigated by a simulation study, and the methods are illustrated on compositional data from a field study on maize grain. Conclusions: A clear distinction of practical relevance is shown between difference and equivalence testing. The proposed tests are shown to have appropriate performance characteristics by simulation, and the proposed simultaneous graphical representation of results was found to be helpful for the interpretation of results from a practical field trial data set.
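The distinction between difference and equivalence testing can be sketched with the two one-sided tests (TOST) idea: a difference test rejects "no difference", while an equivalence test rejects "the difference exceeds the limits". The means, standard error and equivalence limit below are invented, and a z approximation stands in for the paper's linear mixed-model machinery.

```python
from statistics import NormalDist

def difference_and_equivalence(mean_gm, mean_ref, se, eq_limit, alpha=0.05):
    """Difference test (two-sided z) and equivalence test (two one-sided
    z tests, TOST) for a GM-vs-reference mean difference with standard
    error `se` and equivalence limits +/- eq_limit."""
    z = NormalDist()
    diff = mean_gm - mean_ref
    p_diff = 2 * (1 - z.cdf(abs(diff) / se))   # H0: no difference
    p_low = 1 - z.cdf((diff + eq_limit) / se)  # H0: diff <= -eq_limit
    p_high = 1 - z.cdf((eq_limit - diff) / se) # H0: diff >= +eq_limit
    return {
        "different": p_diff < alpha,
        "equivalent": max(p_low, p_high) < alpha,
    }

# Toy compositional trait: a small, precisely estimated difference can be
# statistically significant yet still within the equivalence limits.
out = difference_and_equivalence(mean_gm=10.3, mean_ref=10.0, se=0.1, eq_limit=1.0)
print(out)
```

This is exactly why the paper reports both tests: "different" and "equivalent" are not mutually exclusive outcomes.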

  18. Wind Power Assessment Based on a WRF Wind Simulation with Developed Power Curve Modeling Methods

    Directory of Open Access Journals (Sweden)

    Zhenhai Guo

    2014-01-01

    Full Text Available The accurate assessment of wind power potential requires not only detailed knowledge of the local wind resource but also a power curve that adequately represents the local wind farm. Although probability distribution functions (pdfs) of the wind speed are commonly used, their seemingly good performance for distribution may not always translate into an accurate assessment of power generation. This paper contributes to the development of wind power assessment based on wind speed simulation with the Weather Research and Forecasting (WRF) model and two improved power curve modeling methods. These approaches improve on a power curve model originally fitted by a single-layer feed-forward neural network (SLFN); in addition, a data quality check, an outlier detection technique and a directional curve modeling method are adopted to effectively enhance the original model performance. The two proposed methods, named WRF-SLFN-OD and WRF-SLFN-WD, are able to avoid interference from abnormal output and the directional effect of local wind speed during the power curve modeling process. The data examined are from three stations in northern China; the simulation indicates that the two developed methods provide a more accurate assessment of the wind power potential than the original methods.

  19. Probabilistic performance assessment of a coal-fired power plant

    International Nuclear Information System (INIS)

    Highlights:
    • Power plant equipment is usually oversized to account for input uncertainties.
    • Oversized equipment degrades its rated efficiency and increases capital cost.
    • A stochastic methodology to assess probabilities of equipment failure was proposed.
    • The methodology was proven applicable for design and analysis of the power plants.
    • Estimated high reliability indices allow reducing power plant equipment oversizing.
    Abstract: Despite the low-carbon environmental policies, coal is expected to remain a main source of energy in the coming decades. Therefore, efficient and environmentally friendly power systems are required. A design process based on deterministic models and the application of safety factors leads to equipment oversizing, hence a fall in efficiency and an increase in capital and operating costs. In this work, the applicability of a non-intrusive stochastic methodology to determine the probability of power plant equipment failure was investigated. This alternative approach to power plant performance assessment employs approximation methods for the deterministic prediction of the key performance indicators, which are used to estimate reliability indices based on the uncertainty of the input to a process model of the coal-fired power plant. This study revealed that the high reliability indices obtained in the analysis would lead to reduced application of conservative safety factors on the plant equipment, which should result in lower capital and operating cost through a more reliable assessment of its performance state over its service time, and lead to the optimisation of its inspection and maintenance interventions.

  20. Statistical connection of peak counts to power spectrum and moments in weak lensing field

    CERN Document Server

    Shirasaki, Masato

    2016-01-01

    The number density of local maxima of weak lensing field, referred to as weak-lensing peak counts, can be used as a cosmological probe. However, its relevant cosmological information is still unclear. We study the relationship between the peak counts and other statistics in weak lensing field by using 1000 ray-tracing simulations. We construct a local transformation of lensing field $\\cal K$ to a new Gaussian field $y$, named local-Gaussianized transformation. We calibrate the transformation with numerical simulations so that the one-point distribution and the power spectrum of $\\cal K$ can be reproduced from a single Gaussian field $y$ and monotonic relation between $y$ and $\\cal K$. Therefore, the correct information of two-point clustering and any order of moments in weak lensing field should be preserved under local-Gaussianized transformation. We then examine if local-Gaussianized transformation can predict weak-lensing peak counts in simulations. The local-Gaussianized transformation is insufficient to ...
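A rank-based variant of such a monotonic Gaussianizing transformation can be sketched as follows. The paper calibrates its transformation against ray-tracing simulations; this empirical-quantile version, with invented field values, is only an illustration of the monotonic mapping idea.

```python
from statistics import NormalDist

def gaussianize(field):
    """Monotonic rank-based transform mapping a 1-D field to values with
    a standard-normal one-point distribution (ties broken by order)."""
    n = len(field)
    order = sorted(range(n), key=lambda i: field[i])
    z = NormalDist()
    y = [0.0] * n
    for rank, i in enumerate(order):
        y[i] = z.inv_cdf((rank + 0.5) / n)  # plotting-position quantile
    return y

# Strongly skewed toy "convergence" values standing in for kappa.
kappa = [0.01, 0.05, 0.002, 0.3, 0.08, 0.9, 0.04, 0.15]
y = gaussianize(kappa)
print([round(v, 2) for v in y])
```

Because the map is monotonic, the ordering (and hence the locations of local maxima) of the original field is preserved, which is the property the peak-count argument relies on.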

  1. Contribution to the power distribution methodology uncertainties assessment

    International Nuclear Information System (INIS)

    The present methodology of safety margins in NPP Dukovany design power distribution calculations is based on the philosophy of engineering factors with errors defined on the basis of a statistical approach using standard (95%) confidence intervals. At the level of FA power distribution, the normality (normal density distribution) of this approach is tested, and a comparison is provided with errors defined at a 95-percent probability with a 95-percent confidence level (in statistical shorthand, 95%/95%). Practical applications are presented for several NPP Dukovany fuel cycles. The paper also briefly addresses the difference between confidence intervals and tolerance intervals, the density distribution of mechanical engineering factor variables, and the treatment of axial and radial error distributions as a bivariate problem. (Author)
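One concrete point of contact between confidence and tolerance intervals is the classical nonparametric (Wilks) sample-size rule for a one-sided 95%/95% tolerance limit, sketched below; this is a textbook result, not a computation from the paper.

```python
import math

def min_n_9595(p=0.95, conf=0.95):
    """Smallest sample size n such that the sample maximum is a
    one-sided nonparametric tolerance limit covering a fraction `p`
    of the population with confidence `conf`: require 1 - p**n >= conf."""
    return math.ceil(math.log(1 - conf) / math.log(p))

print(min_n_9595())  # classical 95%/95% answer
```

The rule explains why "95%/95%" statements require markedly larger samples than a plain 95% confidence interval on the mean.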

  2. Sea cliff instability susceptibility at regional scale: a statistically based assessment in the southern Algarve, Portugal

    Science.gov (United States)

    Marques, F. M. S. F.; Matildes, R.; Redweik, P.

    2013-12-01

    Sea cliff evolution is dominated by the occurrence of slope mass movements of different types and sizes, which are a considerable source of natural hazard, making their assessment a relevant issue in terms of human loss prevention and land use regulations. To address the assessment of the spatial component of sea cliff hazards, i.e. the susceptibility, a statistically based study was made to assess the capacity of a set of conditioning factors to express the occurrence of sea cliff failures affecting areas located along their top. The study was based on the application of the bivariate information value and multivariate logistic regression statistical methods, using a set of predisposing factors for cliff failures, mainly related to geology (lithology, bedding dip, faults) and geomorphology (maximum and mean slope, height, aspect, plan curvature, toe protection), which were correlated with a photogrammetry-based inventory of cliff failures that occurred in a 60 yr period (1947-2007). The susceptibility models were validated against the inventory data using standard success rate and ROC curves, and provided encouraging results, indicating that the proposed approaches are effective for susceptibility assessment. The results obtained also stress the need for improvement of the predisposing factors to be used in this type of study and the need for detailed and systematic cliff failure inventories.
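The bivariate information value method mentioned above can be sketched for one predisposing factor: each factor class gets a weight of evidence comparing its share of failure cells to its share of stable cells, and the information value sums these weighted by the share difference. The class counts below are invented, not the Algarve inventory data.

```python
import math

def information_value(factor_classes):
    """Information value (IV) of a predisposing factor.
    `factor_classes` maps class name -> (failure cells, stable cells).
    WoE_i = ln(share of failures / share of stable cells); the IV sums
    WoE_i weighted by the difference in shares."""
    tot_fail = sum(f for f, s in factor_classes.values())
    tot_stab = sum(s for f, s in factor_classes.values())
    iv = 0.0
    for f, s in factor_classes.values():
        pf, ps = f / tot_fail, s / tot_stab
        if pf > 0 and ps > 0:
            iv += (pf - ps) * math.log(pf / ps)
    return iv

# Hypothetical lithology classes: (cells with failures, cells without).
lithology = {"soft marl": (30, 100), "limestone": (10, 300), "sandstone": (10, 100)}
iv = information_value(lithology)
print(round(iv, 3))
```

Factors with larger IV discriminate failure-prone cells better and would rank higher as susceptibility predictors.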

  3. Developing a PQ monitoring system for assessing power quality and critical areas detection

    Directory of Open Access Journals (Sweden)

    Miguel Romero

    2011-10-01

    Full Text Available This paper outlines the development of a power quality monitoring system. The system is aimed at assessing power quality and detecting critical areas throughout a distribution system. It integrates a hardware system and a software processing tool developed in four main stages. Power quality disturbances are registered by PQ meters and the data are transmitted through a 3G wireless network. The data are processed and filtered in an open-source database. Some interesting statistical indices related to voltage sags, swells, flicker and voltage unbalance are obtained. The last stage displays the indices geo-referenced on power quality maps, allowing the identification of critical areas according to different criteria. The results can be analyzed using clustering tools to identify differentiated quality groups in a city. The proposed system is an open-source tool useful to electricity utilities for analyzing and managing large amounts of data.

  4. Mathematical Safety Assessment Approaches for Thermal Power Plants

    OpenAIRE

    Zong-Xiao Yang; Lei Song; Chun-Yang Zhang; Chong Li; Xiao-Bo Yuan

    2014-01-01

    How to use system analysis methods to identify hazards in the industrial process, working environment, and production management of complex industrial processes, such as thermal power plants, is one of the challenges in systems engineering. This paper proposes a mathematical system safety assessment model for thermal power plants that integrates the fuzzy analytical hierarchy process, set pair analysis, and system functionality analysis. On the basis of these, the key factors in...

  5. Statistics of the Chi-Square Type, with Application to the Analysis of Multiple Time-Series Power Spectra

    CERN Document Server

    Sturrock, P A

    2003-01-01

    It is often necessary to compare the power spectra of two or more time series: one may, for instance, wish to estimate what the power spectrum of the combined data sets might have been, or one may wish to estimate the significance of a particular peak that shows up in two or more power spectra. Also, one may occasionally need to search for a complex of peaks in a single power spectrum, such as a fundamental and one or more harmonics, or a fundamental plus sidebands, etc. Visual inspection can be revealing, but it can also be misleading. This leads one to look for one or more ways of forming statistics, which readily lend themselves to significance estimation, from two or more power spectra. The familiar chi-square statistic provides a convenient mechanism for combining variables drawn from normal distributions, and one may generalize the chi-square statistic to be any function of any number of variables with arbitrary distributions. In dealing with power spectra, we are interested mainly in exponential distri...
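    The chi-square combination the abstract describes can be sketched as follows. This is a minimal illustration, assuming each spectrum's power at the frequency of interest has been normalized so that, under the null hypothesis, it follows a unit-mean exponential distribution (the usual normalization for periodogram powers); twice the sum of N such powers then follows a chi-square distribution with 2N degrees of freedom.

```python
from scipy.stats import chi2

def combined_power_significance(powers):
    """Combine normalized spectral powers at one frequency across spectra.

    Under the null, each unit-mean exponential power P satisfies
    2*P ~ chi-square(2), so 2*sum(P_i) ~ chi-square(2N).
    Returns the false-alarm probability of the combined statistic.
    """
    n = len(powers)
    statistic = 2.0 * sum(powers)
    return chi2.sf(statistic, df=2 * n)

# Example: powers 2.0, 3.0 and 1.0 at the same frequency in three
# independent spectra (values invented for illustration)
p_value = combined_power_significance([2.0, 3.0, 1.0])
```

A small combined p-value would indicate that the peak is unlikely to arise by chance in all spectra simultaneously.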

  6. The Development of Statistics Textbook Supported with ICT and Portfolio-Based Assessment

    Science.gov (United States)

    Hendikawati, Putriaji; Yuni Arini, Florentina

    2016-02-01

    This research was development research aimed at producing a Statistics textbook model supported with information and communication technology (ICT) and portfolio-based assessment. The book was designed for mathematics students at the college level to improve their ability in mathematical connection and communication. There were three stages in this research: define, design, and develop. The textbook consists of 10 chapters, each containing an introduction, core material, and worked examples and exercises. The development phase began with an initial draft of the book (draft 1), which was then validated by experts. Revision of draft 1 produced draft 2, which underwent a limited readability test. Revision of draft 2 in turn produced draft 3, which was trialled on a small sample to produce a valid model textbook. The data were analysed with descriptive statistics. The analysis showed that the Statistics textbook model supported with ICT and portfolio-based assessment is valid and meets the criteria of practicality.

  7. New statistical potential for quality assessment of protein models and a survey of energy functions

    Directory of Open Access Journals (Sweden)

    Rykunov Dmitry

    2010-03-01

    Full Text Available Abstract Background: Scoring functions, such as molecular mechanics force fields and statistical potentials, are fundamentally important tools in protein structure modeling and quality assessment. Results: The performances of a number of publicly available scoring functions are compared with statistical rigor, with an emphasis on knowledge-based potentials. We explored the effect on accuracy of alternative choices for representing interaction center types and other features of scoring functions, such as using information on solvent accessibility and torsion angles, and accounting for secondary structure preferences and side chain orientation. Partly based on these observations, we present a novel residue-based statistical potential, which employs a shuffled reference state definition and takes into account the mutual orientation of residue side chains. Atom- and residue-level statistical potentials and Linux executables to calculate the energy of a given protein proposed in this work can be downloaded from http://www.fiserlab.org/potentials. Conclusions: Among the most influential terms we observed a critical role of a proper reference state definition and the benefits of including information about the microenvironment of interaction centers. Molecular mechanics potentials were also tested and found to be over-sensitive to small local imperfections in a structure, requiring unfeasibly long energy relaxation before energy scores started to correlate with model quality.

  8. The significance of structural power in Strategic Environmental Assessment

    DEFF Research Database (Denmark)

    Hansen, Anne Merrild; Kørnøv, Lone; Cashmore, Matthew Asa;

    2013-01-01

    This article presents a study of how power dynamics enable and constrain the influence of actors upon decision-making and Strategic Environmental Assessment (SEA). Based on Anthony Giddens' structuration theory (ST), a model for studying power dynamics in strategic decision-making processes...... to the outcome of the decision-making process. The article is meant as a supplement to the understanding of the influence of power dynamics in IA processes, emphasising the capacity of agents to mobilise and create change. Despite epistemological challenges of using ST as an approach to power analysis, this meta...... is developed and used to explore how reflexive agents bring about change. The model is used to map and analyse key decision arenas in the decision process of aluminium production in Greenland. The analysis shows that communication lines are an important resource through which actors exercise power...

  9. QQ-plots for assessing distributions of biomarker measurements and generating defensible summary statistics.

    Science.gov (United States)

    Pleil, Joachim D

    2016-01-01

    One of the main uses of biomarker measurements is to compare different populations to each other and to assess risk in comparison to established parameters. This is most often done using summary statistics such as central tendency, variance components, confidence intervals, exceedance levels and percentiles. Such comparisons are only valid if the underlying distributional assumptions are correct. This article discusses methodology for interpreting and evaluating data distributions using quantile-quantile plots (QQ-plots), making decisions on how to treat outliers, interpreting the effects of mixed distributions, and identifying left-censored data. The QQ-plot is shown to be a simple and elegant tool for visual inspection of complex data and for deciding whether summary statistics should be computed after log-transformation. PMID:27491525
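    The log-transform decision described here can be sketched with scipy's `probplot`, which returns the QQ-plot points together with a least-squares fit whose correlation coefficient r measures agreement with the normal model. This is a generic illustration on synthetic right-skewed "biomarker" data, not the article's own procedure or data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
biomarker = rng.lognormal(mean=1.0, sigma=0.8, size=500)  # right-skewed values

# probplot returns ((theoretical quantiles, ordered data), (slope, intercept, r));
# r close to 1 indicates the data follow the assumed normal distribution.
(_, _), (_, _, r_raw) = stats.probplot(biomarker, dist="norm")
(_, _), (_, _, r_log) = stats.probplot(np.log(biomarker), dist="norm")

# A markedly higher r after log-transformation suggests computing summary
# statistics (means, confidence intervals, percentiles) on the log scale.
```

In practice one would also inspect the plots visually for outliers, mixed distributions and left-censoring, as the article discusses.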

  10. Statistical properties of radiation power levels from a high-gain free-electron laser at and beyond saturation

    International Nuclear Information System (INIS)

    We investigate the statistical properties (e.g., shot-to-shot power fluctuations) of the radiation from a high-gain free-electron laser (FEL) operating in the nonlinear regime. We consider the case of an FEL amplifier reaching saturation whose shot-to-shot fluctuations in input radiation power follow a gamma distribution. We analyze the corresponding output power fluctuations at and beyond first saturation, including beam energy spread effects, and find that there are well-characterized values of undulator length for which the fluctuation level reaches a minimum

  11. Statistical Approaches Used to Assess the Equity of Access to Food Outlets: A Systematic Review

    Directory of Open Access Journals (Sweden)

    Karen E. Lamb

    2015-07-01

    Full Text Available Background: Inequalities in eating behaviours are often linked to the types of food retailers accessible in neighbourhood environments. Numerous studies have aimed to identify whether access to healthy and unhealthy food retailers is socioeconomically patterned across neighbourhoods, and thus a potential risk factor for dietary inequalities. Existing reviews have examined differences between methodologies, particularly focussing on neighbourhood and food outlet access measure definitions. However, no review has informatively discussed the suitability of the statistical methodologies employed; a key issue determining the validity of study findings. Our aim was to examine the suitability of the statistical approaches adopted in these analyses. Methods: Searches were conducted for articles published from 2000 to 2014. Eligible studies included objective measures of the neighbourhood food environment and neighbourhood-level socio-economic status, with a statistical analysis of the association between food outlet access and socio-economic status. Results: Fifty-four papers were included. Outlet accessibility was typically defined as the distance to the nearest outlet from the neighbourhood centroid, or as the number of food outlets within a neighbourhood (or buffer). To assess whether these measures were linked to neighbourhood disadvantage, common statistical methods included ANOVA, correlation, and Poisson or negative binomial regression. Although all studies involved spatial data, few considered spatial analysis techniques or spatial autocorrelation. Conclusions: With advances in GIS software, sophisticated measures of neighbourhood outlet accessibility can be considered. However, the approaches to statistical analysis often appear less sophisticated. Care should be taken to consider the assumptions underlying the analysis and the possibility of spatially correlated residuals, which could affect the results.
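    As a minimal sketch of the count-regression approach mentioned in the review (not the method of any one reviewed study), a Poisson regression of outlet counts on a neighbourhood deprivation score can be fitted by maximum likelihood. The data, variable names and coefficients below are invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Hypothetical data: one deprivation score and one outlet count per neighbourhood
n = 400
deprivation = rng.normal(size=n)
true_beta = np.array([0.5, 0.3])               # intercept, deprivation effect
X = np.column_stack([np.ones(n), deprivation])
counts = rng.poisson(np.exp(X @ true_beta))    # Poisson counts with log link

def neg_log_lik(beta):
    # Poisson negative log-likelihood (dropping the constant log(y!) term)
    eta = X @ beta
    return -(counts @ eta - np.exp(eta).sum())

fit = minimize(neg_log_lik, x0=np.zeros(2), method="BFGS")
intercept_hat, slope_hat = fit.x
```

A positive fitted slope would indicate more outlets in more deprived neighbourhoods; as the review notes, spatially correlated residuals would still need to be checked before trusting the inference.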

  12. Mission Applicability Assessment of Integrated Power Components and Systems

    Science.gov (United States)

    Raffaelle, R. P.; Hepp, A. F.; Landis, G. A.; Hoffman, D. J.

    2002-01-01

    The need for smaller lightweight autonomous power systems has recently increased with the increasing focus on micro- and nanosatellites. Small area high-efficiency thin film batteries and solar cells are an attractive choice for such applications. The NASA Glenn Research Center, Johns Hopkins Applied Physics Laboratory, Lithium Power Technologies, MicroSat Systems, and others, have been working on the development of autonomous monolithic packages combining these elements or what are called integrated power supplies (IPS). These supplies can be combined with individual satellite components and are capable of providing continuous power even under intermittent illumination associated with a spinning or Earth orbiting satellite. This paper discusses the space mission applicability, benefits, and current development efforts associated with integrated power supply components and systems. The characteristics and several mission concepts for an IPS that combines thin-film photovoltaic power generation with thin-film lithium ion energy storage are described. Based on this preliminary assessment, it is concluded that the most likely and beneficial application of an IPS will be for small "nanosatellites" or in specialized applications serving as a decentralized or as a distributed power source or uninterruptible power supply.

  13. Wind power planning: assessing long-term costs and benefits

    International Nuclear Information System (INIS)

    In the following paper, a new and straightforward technique for estimating the social benefit of large-scale wind power production is presented. The social benefit is based upon wind power's energy and capacity services and the avoidance of environmental damages. The approach uses probabilistic load duration curves to account for the stochastic interaction between wind power availability, electricity demand, and conventional generator dispatch. The model is applied to potential offshore wind power development to the south of Long Island, NY. If natural gas combined cycle and integrated gasifier combined cycle (IGCC) are the alternative generation sources, wind power exhibits a negative social benefit due to its high capacity cost and the relatively low emissions of these advanced fossil-fuel technologies. Environmental benefits increase significantly if charges for CO2 emissions are included. Results also reveal a diminishing social benefit as wind power penetration increases. The dependence of wind power benefits on CO2 charges, and capital costs for wind turbines and IGCC plant is also discussed. The methodology is intended for use by energy planners in assessing the social benefit of future investments in wind power

  14. Independent assessment to continue improvement: Implementing statistical process control at the Hanford Site

    International Nuclear Information System (INIS)

    A Quality Assurance independent assessment has brought about continued improvement in the PUREX Plant surveillance program at the Department of Energy's Hanford Site. After the independent assessment, Quality Assurance personnel were closely involved in improving the surveillance program, specifically regarding storage tank monitoring. The independent assessment activities included reviewing procedures, analyzing surveillance data, conducting personnel interviews, and communicating with management. Process improvement efforts included: (1) designing data collection methods; (2) gaining concurrence between engineering and management; (3) revising procedures; and (4) interfacing with shift surveillance crews. Through this process, Statistical Process Control (SPC) was successfully implemented and surveillance management was improved. The independent assessment identified several deficiencies within the surveillance system, which can be grouped into two areas: (1) data recording and analysis and (2) handling of off-normal conditions. By using several independent assessment techniques, Quality Assurance was able to point out program weaknesses to senior management and present suggestions for improvement. SPC charting, as implemented by Quality Assurance, is an excellent tool for diagnosing the process, improving communication between team members, and providing a scientific database for management decisions. In addition, the surveillance procedure was substantially revised. The goals of this revision were to (1) strengthen the roles of surveillance management, engineering and operators and (2) emphasize the importance of teamwork for each individual who performs a task. In this instance, we believe that the value independent assessment adds to the system lies in the continuous improvement activities that follow it. Excellence in teamwork between the independent assessment organization and the auditee is the key to continuing improvement
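    The SPC charting mentioned above can be illustrated with a Shewhart individuals chart, the usual choice for one-at-a-time surveillance readings. This is a generic sketch with invented readings, not the PUREX surveillance data; sigma is estimated from the mean moving range divided by the standard unbiasing constant d2 = 1.128 for subgroups of size two.

```python
import numpy as np

def individuals_control_limits(baseline):
    """Shewhart individuals-chart limits from an in-control baseline period."""
    baseline = np.asarray(baseline, dtype=float)
    center = baseline.mean()
    # Estimate sigma from the mean moving range (MR-bar / d2, d2 = 1.128)
    sigma = np.abs(np.diff(baseline)).mean() / 1.128
    return center - 3 * sigma, center, center + 3 * sigma

def out_of_control(values, lcl, ucl):
    """Indices of readings falling outside the control limits."""
    values = np.asarray(values, dtype=float)
    return np.flatnonzero((values < lcl) | (values > ucl))

# Invented tank-monitoring readings: a stable baseline, then one anomaly
baseline = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0, 10.1, 9.9, 10.0, 10.1]
lcl, center, ucl = individuals_control_limits(baseline)
flags = out_of_control([10.0, 9.9, 12.5, 10.1], lcl, ucl)
```

Flagged indices would trigger the off-normal-condition handling the assessment found lacking, rather than relying on ad hoc judgment.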

  15. Preliminary regulatory assessment of nuclear power plants vulnerabilities

    International Nuclear Information System (INIS)

    Preliminary attempts to develop models for regulatory vulnerability assessment of nuclear power plants are presented. The development of this philosophy and of supporting computer tools could provide new and important insight for the management of nuclear operators and nuclear regulatory bodies, who face difficult questions about how to assess the vulnerability of nuclear power plants and other nuclear facilities to external and internal threats. In a situation where different and hidden threat sources are dispersed throughout the world, the assessment of the security and safe operation of nuclear power plants is very important. The capability to evaluate plant vulnerability to different kinds of threats, such as human and natural occurrences and terrorist attacks, together with the preparation of emergency response plans and cost estimates, is of vital importance for assuring national security. On the basis of such insights, nuclear operators and nuclear regulatory bodies could plan and optimise changes in oversight procedures, organisations, equipment, hardware and software to reduce risk, taking into account the security and safety of nuclear power plant operation, budget, manpower, and other limitations. Initial qualitative estimations of adapted assessments for nuclear applications are briefly presented. (author)

  16. Dynamic security risk assessment and optimization of power transmission system

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The paper presents a practical dynamic security region (PDSR) based dynamic security risk assessment and optimization model for power transmission systems. The cost of comprehensive security control and the influence of uncertainties of power injections are considered in the model of dynamic security risk assessment. The transient stability constraints and uncertainties of power injections can easily be considered by the PDSR in the form of a hyper-box. A method to define and classify the contingency set is presented, and a risk control optimization model is given which takes total dynamic insecurity risk as the objective function for a dominant contingency set. An optimal solution of dynamic insecurity risk is obtained by optimizing preventive and emergency control cost and contingency set decomposition. The effectiveness of this model has been proved by test results on the New England 10-generator 39-bus system.

  17. A study on the reliability and risk assessment of nuclear power plants

    International Nuclear Information System (INIS)

    The final objective of the present study is to establish the foundations for both performing statistical analyses of various failures and potential accidents in nuclear power plants and assessing probabilistic safety. In order to achieve this objective, we have chosen as this year's study areas a review of the state of the art in the related methodologies and the establishment of a reliability analysis method. The work performed is summarized here. First, brief reviews of the present status of Probabilistic Risk Assessment and of the development of quantitative safety goals in the United States were completed. It has been identified that Probabilistic Risk Assessment techniques will play an important role in nuclear safety assessment as a supporting tool in the coming years. In order to establish a reliability analysis methodology, a computer code for updating plant-specific reliability data has been developed as part of this project. A reliability analysis system has been established and used to analyze the auxiliary feedwater system. (Author)

  18. Assessment of synthetic winds through spectral modelling, rainflow count analysis and statistics of increments

    Science.gov (United States)

    Beyer, Hans Georg; Chougule, Abhijit

    2016-04-01

    While the wind energy industry is growing rapidly and the siting of wind turbines onshore as well as offshore is increasing, many wind engineering model tools have been developed for the assessment of loads on wind turbines due to varying wind speeds. In order to carry out proper wind turbine design and performance analysis, it is important to have an accurate representation of the incoming wind field. To ease the analysis, tools for the generation of synthetic wind fields have been developed, e.g. the widely used TurbSim procedure. We analyse the respective synthetic data sets, on the one hand, in view of the similarity of the spectral characteristics of measured and synthetic sets. In addition, second order characteristics with direct relevance to load assessment, as given by the statistics of increments and rainflow count results, are inspected.
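    The statistics-of-increments check mentioned above can be sketched as follows: compute the excess kurtosis of wind-speed increments u(t+lag) - u(t) over several lags and compare measured against synthetic series. This generic example uses a Gaussian random-walk surrogate rather than actual TurbSim output; real atmospheric wind typically shows positive increment kurtosis (intermittency) that Gaussian synthetic fields may underestimate.

```python
import numpy as np
from scipy.stats import kurtosis

def increment_statistics(series, lags=(1, 2, 4, 8)):
    """Excess kurtosis of increments x(t+lag) - x(t) for several lags.

    Gaussian increments give values near zero; strongly positive values
    indicate intermittent, heavy-tailed gust statistics.
    """
    x = np.asarray(series, dtype=float)
    return {lag: float(kurtosis(x[lag:] - x[:-lag])) for lag in lags}

# A Gaussian random-walk surrogate should show near-zero excess kurtosis
rng = np.random.default_rng(1)
gaussian_series = rng.normal(size=20000).cumsum()
stats_by_lag = increment_statistics(gaussian_series)
```

A systematic gap between the measured and synthetic values across lags would flag the synthetic field as unsuitable for extreme-load assessment even if its spectrum matches.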

  19. Using statistical analysis and artificial intelligence tools for automatic assessment of video sequences

    Science.gov (United States)

    Ekobo Akoa, Brice; Simeu, Emmanuel; Lebowsky, Fritz

    2014-01-01

    This paper proposes two novel approaches to Video Quality Assessment (VQA). Both approaches attempt to develop video evaluation techniques capable of replacing human judgment when rating video quality in subjective experiments. The underlying study consists of selecting fundamental quality metrics based on Human Visual System (HVS) models and using artificial intelligence solutions as well as advanced statistical analysis. This new combination enables suitable video quality ratings while taking multiple quality metrics as input. The first method uses a neural network based machine learning process. The second method evaluates video quality using a non-linear regression model. The efficiency of the proposed methods is demonstrated by comparing their results with those of existing work done on synthetic video artifacts. The results obtained by each method are compared with scores from a database resulting from subjective experiments.

  20. Windfarm Generation Assessment for Reliability Analysis of Power Systems

    DEFF Research Database (Denmark)

    Negra, Nicola Barberis; Holmstrøm, Ole; Bak-Jensen, Birgitte;

    2007-01-01

    Due to the fast development of wind generation in the past ten years, increasing interest has been paid to techniques for assessing different aspects of power systems with a large amount of installed wind generation. One of these aspects concerns power system reliability. Windfarm modelling plays...... a significant role in this assessment and different models have been created for it, but a representation which includes all of them has not been developed yet. This paper deals with this issue. First, a list of nine influencing Factors is presented and discussed. Secondly, these Factors are included...... in a reliability model and the generation of a windfarm is evaluated by means of sequential Monte Carlo simulation. Results are used to analyse how each mentioned Factor influences the assessment, and why and when they should be included in the model....

  2. Complementary assessment of the safety of French nuclear power plants

    International Nuclear Information System (INIS)

    As an immediate consequence of the Fukushima accident, the French nuclear safety authority (ASN) asked EDF to perform a complementary safety assessment of each nuclear power plant dealing with 3 points: 1) the consequences of exceptional natural disasters, 2) the consequences of a total loss of electrical power, and 3) the management of emergency situations. The safety margin has to be assessed considering 3 main points: first, a review of conformity to the initial safety requirements; secondly, the resistance to events exceeding what the facility was designed to withstand; and thirdly, the feasibility of any modification likely to improve the safety of the facility. This article details the specifications of such an assessment, the methodology followed by EDF, the task organization and the time schedule. (A.C.)

  3. The Novel Quantitative Technique for Assessment of Gait Symmetry Using Advanced Statistical Learning Algorithm

    Directory of Open Access Journals (Sweden)

    Jianning Wu

    2015-01-01

    Full Text Available The accurate identification of gait asymmetry is very beneficial to the assessment of at-risk gait in clinical applications. This paper investigated the application of a classification method based on a statistical learning algorithm to quantify gait symmetry, based on the assumption that the degree of intrinsic change in the dynamical system of gait is associated with different statistical distributions of the gait variables from the left and right lower limbs; that is, discriminating the small difference in similarity between the lower limbs is treated as recognising their different probability distributions. The kinetic gait data of 60 participants were recorded using a strain gauge force platform during normal walking. The classification method is designed on the basis of an advanced statistical learning algorithm, the support vector machine for binary classification, and is adopted to quantitatively evaluate gait symmetry. The experimental results showed that the proposed method could capture more of the intrinsic dynamic information hidden in gait variables and recognize right-left gait patterns with superior generalization performance. Moreover, the proposed technique could identify small but significant differences between the lower limbs compared with the traditional symmetry index method for gait. The proposed algorithm could become an effective tool for the early identification of elderly gait asymmetry in clinical diagnosis.
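    For context, the traditional symmetry index that the authors compare against is commonly computed as the left-right difference normalized by the limbs' mean value, in percent. The force values below are invented for illustration.

```python
import numpy as np

def symmetry_index(left, right):
    """Commonly used gait symmetry index, in percent.

    0 indicates perfect left-right symmetry of a gait variable such as
    peak vertical ground reaction force; inputs are per-stride values
    for each limb.
    """
    xl, xr = np.mean(left), np.mean(right)
    return 100.0 * 2.0 * (xl - xr) / (xl + xr)

# Hypothetical peak-force values (newtons) over three strides per limb
si = symmetry_index(left=[720, 735, 728], right=[705, 710, 702])
```

Because this index collapses each limb to a single mean, it discards the distributional information that the paper's SVM-based approach exploits.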

  4. Steady state security assessment in deregulated power systems

    Science.gov (United States)

    Manjure, Durgesh Padmakar

    Power system operations are undergoing changes, brought about primarily due to deregulation and subsequent restructuring of the power industry. The primary intention of the introduction of deregulation in power systems was to bring about competition and improved customer focus. The underlying motive was increased economic benefit. Present day power system analysis is much different than what it was earlier, essentially due to the transformation of the power industry from being cost-based to one that is price-based and due to open access of transmission networks to the various market participants. Power is now treated as a commodity and is traded in an open market. The resultant interdependence of the technical criteria and the economic considerations has only accentuated the need for accurate analysis in power systems. The main impetus in security analysis studies is on efficient assessment of the post-contingency status of the system, accuracy being of secondary consideration. In most cases, given the time frame involved, it is not feasible to run a complete AC load flow for determining the post-contingency state of the system. Quite often, it is not warranted as well, as an indication of the state of the system is desired rather than the exact quantification of the various state variables. With the inception of deregulation, transmission networks are subjected to a host of multilateral transactions, which would influence physical system quantities like real power flows, security margins and voltage levels. For efficient asset utilization and maximization of the revenue, more often than not, transmission networks are operated under stressed conditions, close to security limits. Therefore, a quantitative assessment of the extent to which each transaction adversely affects the transmission network is required. 
This needs to be done accurately as the feasibility of the power transactions and subsequent decisions (execution, curtailment, pricing) would depend upon the

  5. Global uncertainty assessment in hydrological forecasting by means of statistical analysis of forecast errors

    Science.gov (United States)

    Montanari, A.; Grossi, G.

    2007-12-01

    It is well known that uncertainty assessment in hydrological forecasting is a topical issue. Already in 1905 W.E. Cooke, who was issuing daily weather forecasts in Australia, stated: "It seems to me that the condition of confidence or otherwise form a very important part of the prediction, and ought to find expression". Uncertainty assessment in hydrology involves the analysis of multiple sources of error. The contribution of these latter to the formation of the global uncertainty cannot be quantified independently, unless (a) one is willing to introduce subjective assumptions about the nature of the individual error components or (b) independent observations are available for estimating input error, model error, parameter error and state error. An alternative approach, that is applied in this study and still requires the introduction of some assumptions, is to quantify the global hydrological uncertainty in an integrated way, without attempting to quantify each independent contribution. This methodology can be applied in situations characterized by limited data availability and therefore is gaining increasing attention by end users. This work aims to propose a statistically based approach for assessing the global uncertainty in hydrological forecasting, by building a statistical model for the forecast error xt,d, where t is the forecast time and d is the lead time. Accordingly, the probability distribution of xt,d is inferred through a non-linear multiple regression, depending on an arbitrary number of selected conditioning variables. These include the current forecast issued by the hydrological model, the past forecast error and internal state variables of the model. The final goal is to indirectly relate the forecast error to the sources of uncertainty, through a probabilistic link with the conditioning variables. Any statistical model is based on assumptions whose fulfilment is to be checked in order to assure the validity of the underlying theory. 
Statistical

  6. Performance evaluation of hydrological models: Statistical significance for reducing subjectivity in goodness-of-fit assessments

    Science.gov (United States)

    Ritter, Axel; Muñoz-Carpena, Rafael

    2013-02-01

    similar goodness-of-fit indicators but distinct statistical interpretation, and others to analyze the effects of outliers, model bias and repeated data. This work does not intend to dictate rules on model goodness-of-fit assessment. It aims to provide modelers with improved, less subjective and practical model evaluation guidance and tools.
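    As an illustration of the kind of goodness-of-fit indicators under discussion, the widely used Nash-Sutcliffe efficiency and RMSE can be computed as follows; the observed and simulated values here are invented for illustration.

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means the model is
    no better than predicting the observed mean, and negative is worse."""
    obs = np.asarray(observed, dtype=float)
    sim = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def rmse(observed, simulated):
    """Root mean square error in the units of the observations."""
    obs = np.asarray(observed, dtype=float)
    sim = np.asarray(simulated, dtype=float)
    return float(np.sqrt(np.mean((obs - sim) ** 2)))

obs = [2.0, 3.5, 5.0, 4.0, 3.0]   # e.g. observed streamflow
sim = [2.2, 3.3, 4.6, 4.3, 2.9]   # e.g. simulated streamflow
nse_value = nash_sutcliffe(obs, sim)
```

The point of the article is that a single such number is not enough: its statistical significance, sensitivity to outliers, bias and repeated data should also be assessed before declaring a fit acceptable.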

  7. Assessing Fire Weather Index using statistical downscaling and spatial interpolation techniques in Greece

    Science.gov (United States)

    Karali, Anna; Giannakopoulos, Christos; Frias, Maria Dolores; Hatzaki, Maria; Roussos, Anargyros; Casanueva, Ana

    2013-04-01

    Forest fires have always been present in Mediterranean ecosystems, and thus constitute a major ecological and socio-economic issue. Over the last few decades, however, the number of forest fires has significantly increased, as well as their severity and impact on the environment. Local fire danger projections are often required when dealing with wildfire research. In the present study, statistical downscaling and spatial interpolation methods were applied to the Canadian Fire Weather Index (FWI) in order to assess forest fire risk in Greece. The FWI is used worldwide (including the Mediterranean basin) to estimate fire danger in a generalized fuel type, based solely on weather observations. The meteorological inputs to the FWI System are noon values of dry-bulb temperature, air relative humidity, 10m wind speed and precipitation during the previous 24 hours. The statistical downscaling methods are based on a statistical model that takes into account empirical relationships between large scale variables (used as predictors) and local scale variables. In the framework of the current study, the statistical downscaling portal developed by the Santander Meteorology Group (https://www.meteo.unican.es/downscaling) in the framework of the EU project CLIMRUN (www.climrun.eu) was used to downscale non-standard parameters related to forest fire risk. Two different approaches were adopted. Firstly, the analogue downscaling technique was applied directly to the FWI values, and secondly the same downscaling technique was applied indirectly through the meteorological inputs of the index. In both cases, the statistical downscaling portal was used considering the ERA-Interim reanalysis as predictands due to the lack of observations at noon. Additionally, a three-dimensional (3D) interpolation method of position and elevation, based on Thin Plate Splines (TPS), was used to interpolate the ERA-Interim data used to calculate the index

  8. Safety assessment of emergency electric power systems for nuclear power plants

    International Nuclear Information System (INIS)

    This paper is intended to assist the safety assessor within a regulatory body, or one working as a consultant, in assessing a given design of the Emergency Electrical Power System. Those non-electric power systems which may be used in a plant design to serve as emergency energy sources are addressed only in their general safety aspects. The paper thus relates closely to Safety Series 50-SG-D7, ''Emergency Power Systems at Nuclear Power Plants'' (1982), insofar as that document addresses emergency electric power systems. Several aspects are dealt with: the information the assessor may expect from the applicant in order to fulfill the task of safety review; the main questions the reviewer has to answer in order to determine compliance with the requirements of the NUSS documents; the national and international standards which give further guidance on a certain system or piece of equipment; and comments and suggestions which may help in judging a variety of possible solutions.

  9. Self-assessment of operational safety for nuclear power plants

    International Nuclear Information System (INIS)

    Self-assessment processes have been continuously developed by nuclear organizations, including nuclear power plants. Currently, the nuclear industry and governmental organizations are showing an increasing interest in the implementation of this process as an effective way of improving safety performance. Self-assessment involves the use of different types of tools and mechanisms to assist organizations in assessing their own safety performance against given standards. This helps to enhance the understanding of the need for improvements, the feeling of ownership in achieving them, and the safety culture as a whole. Although the primary beneficiaries of the self-assessment process are the plant and the operating organization, the results of self-assessments are also used, for example, to increase the confidence of the regulator in the safe operation of an installation, and could be used to assist in meeting obligations under the Convention on Nuclear Safety. Such considerations influence the form of assessment, as well as the type and detail of the results. The concepts developed in this report present the basic approach to self-assessment, taking into consideration experience gained during Operational Safety Review Team (OSART) missions, from organizations and utilities which have successfully implemented parts of a self-assessment programme, and from meetings organized to discuss the subject. This report will be used in IAEA-sponsored workshops and seminars on operational safety that include the topic of self-assessment.

  10. Assessment of trace elements levels in patients with Type 2 diabetes using multivariate statistical analysis.

    Science.gov (United States)

    Badran, M; Morsy, R; Soliman, H; Elnimr, T

    2016-01-01

    The metabolism of trace elements has been reported to play specific roles in the pathogenesis and progression of diabetes mellitus. Given the continuous increase in the number of patients with Type 2 diabetes (T2D), this study aims to assess the levels and inter-relationships of fasting blood glucose (FBG) and serum trace elements in Type 2 diabetic patients. The study was conducted on 40 Egyptian Type 2 diabetic patients and 36 healthy volunteers (Hospital of Tanta University, Tanta, Egypt). The blood serum was digested and then used to determine the levels of 24 trace elements by inductively coupled plasma mass spectrometry (ICP-MS). Multivariate statistical methods based on correlation coefficients, cluster analysis (CA), and principal component analysis (PCA) were used to analyse the data. The results showed significant changes in FBG and in eight trace elements (Zn, Cu, Se, Fe, Mn, Cr, Mg, and As) in the blood serum of Type 2 diabetic patients relative to healthy controls. The multivariate techniques were effective in reducing the experimental variables, and grouped the trace elements in patients into three clusters. The application of PCA revealed a distinct difference in the associations of trace elements and their clustering patterns between the control and patient groups, in particular for Mg, Fe, Cu, and Zn, which appeared to be the factors most closely related to Type 2 diabetes. On the basis of this study, the trace element content in Type 2 diabetic patients can therefore be determined and characterized through correlation relationships and multivariate statistical analysis, which confirms that the alteration of some essential trace metals may play a role in the development of diabetes mellitus. PMID:26653752
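    The PCA-plus-clustering workflow described above can be sketched as follows; the serum concentrations are simulated stand-ins, not the study's ICP-MS measurements.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
elements = ["Zn", "Cu", "Se", "Fe", "Mn", "Cr", "Mg", "As"]
# Synthetic serum concentrations: 40 patients (rows) x 8 elements (columns)
X = rng.lognormal(mean=0.0, sigma=0.4, size=(40, len(elements)))

Xs = StandardScaler().fit_transform(X)           # z-score each element
pca = PCA(n_components=3).fit(Xs)                # principal component analysis
print(pca.explained_variance_ratio_)

# Hierarchical cluster analysis of the elements (cluster the columns)
Z = linkage(Xs.T, method="ward")
labels = fcluster(Z, t=3, criterion="maxclust")  # cut the tree into 3 clusters
print(dict(zip(elements, labels)))
```

    With real data, the PCA loadings would indicate which elements (e.g. Mg, Fe, Cu, Zn) dominate the separation between patients and controls.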

  11. A statistical assessment of population trends for data deficient Mexican amphibians

    Directory of Open Access Journals (Sweden)

    Esther Quintero

    2014-12-01

    Full Text Available Background. Mexico has the world's fifth largest amphibian fauna and the second highest number of threatened amphibian species. About 10% of Mexican amphibians lack sufficient data to be assigned to a risk category by the IUCN, so in this paper we test a statistical tool that, in the absence of specific demographic data, can assess a species' risk of extinction and population trend, and help identify which variables increase its vulnerability. Recent studies have demonstrated that the risk of species decline depends on both extrinsic and intrinsic traits, so including both should yield more accurate threat assessments. Methods. We harvested data from the Encyclopedia of Life (EOL) and the published literature for Mexican amphibians, and used these data to assess the population trend of some of the Mexican species assigned to the Data Deficient category of the IUCN using Random Forests, a machine learning method that predicts complex processes and identifies the most important variables accounting for the predictions. Results. Our results show that most of the data deficient Mexican amphibians we examined have decreasing population trends. We found that Random Forests is a solid way to identify species with decreasing population trends when no demographic data are available. Moreover, we point to the most important variables that make species more vulnerable to extinction. This exercise is a valuable first step in assigning conservation priorities for poorly known species.
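    A Random Forests analysis of this kind can be sketched with scikit-learn; the species traits and trend labels below are simulated stand-ins for the harvested EOL data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)
# Synthetic species traits, e.g. body size, range size, elevation, human footprint
X_train = rng.normal(size=(200, 4))
# A decreasing trend (1) is made more likely by small range (col 1)
# and high human footprint (col 3) in this toy model
y_train = ((X_train[:, 3] - X_train[:, 1] + rng.normal(0, 0.5, 200)) > 0).astype(int)

rf = RandomForestClassifier(n_estimators=300, oob_score=True, random_state=0)
rf.fit(X_train, y_train)
print(rf.oob_score_)              # out-of-bag accuracy
print(rf.feature_importances_)    # which traits drive the predictions

X_dd = rng.normal(size=(5, 4))    # "data deficient" species with traits only
print(rf.predict_proba(X_dd)[:, 1])  # probability of a decreasing trend
```

    The out-of-bag score gives an internal validation of the classifier, and the feature importances point to the traits that make species vulnerable.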

  12. Mathematical Safety Assessment Approaches for Thermal Power Plants

    Directory of Open Access Journals (Sweden)

    Zong-Xiao Yang

    2014-01-01

    Full Text Available How to use system analysis methods to identify the hazards in the industrial process, working environment, and production management of complex industrial processes, such as thermal power plants, is one of the challenges in systems engineering. In this paper, a mathematical system safety assessment model is proposed for thermal power plants by integrating the fuzzy analytical hierarchy process, set pair analysis, and system functionality analysis. On this basis, the key factors influencing thermal power plant safety are analyzed. The influence factors are determined by the fuzzy analytical hierarchy process. The connection degree among the factors is obtained by set pair analysis. The system safety preponderant function is constructed through system functionality analysis for inherent properties and nonlinear influence. The decision analysis system is developed using active server page technology, web resource integration, and cross-platform capabilities for application to the industrial process. The validity of the proposed safety assessment approach is verified on an actual thermal power plant, where it has improved the enforceability and predictability of enterprise safety assessment.

  13. Efforts to utilize risk assessment at nuclear power plants

    International Nuclear Information System (INIS)

    Risk assessment means the use of the outputs obtained through risk identification and risk analysis (risk information), followed by determination of the response policy through comparison of these outputs with risk judgement standards. This paper discusses the multifaceted use of risk information, its significance, and the challenges to its further penetration. As lessons learnt from past accidents, the paper takes up the severe accidents at the Three Mile Island, Chernobyl, and Fukushima Daiichi power stations and discusses their causes and the factors behind their escalation. In particular, at the Fukushima Daiichi Nuclear Power Station, the important lessons were the shortage of measures against the superimposition of an earthquake and a tsunami, and the insufficient use of risk assessment. The paper classifies risk assessment from the viewpoint of risk information, and presents the contents and indices for risk reduction trends, risk increase trends, and measures according to risk importance. As the benefits of risk assessment activities, the paper refers to application cases of the probabilistic risk assessment (PRA) of the IAEA, and summarizes the application activities for ten risk indices, classifying them into safety benefits and operational benefits. For example, for the item of flexible Allowed Outage Time (AOT), the avoidance of plant shutdown and the improved flexibility of plant maintenance scheduling correspond to these two benefits, respectively. (A.O.)

  14. A power comparison of generalized additive models and the spatial scan statistic in a case-control setting

    Directory of Open Access Journals (Sweden)

    Ozonoff Al

    2010-07-01

    Full Text Available Abstract Background A common, important problem in spatial epidemiology is measuring and identifying variation in disease risk across a study region. In the application of statistical methods, the problem has two parts. First, spatial variation in risk must be detected across the study region and, second, areas of increased or decreased risk must be correctly identified. The location of such areas may give clues to environmental sources of exposure and disease etiology. One statistical method applicable in spatial epidemiologic settings is a generalized additive model (GAM), which can be applied with a bivariate LOESS smoother to account for geographic location as a possible predictor of disease status. A natural hypothesis when applying this method is whether the residential location of subjects is associated with the outcome, i.e. is the smoothing term necessary? Permutation tests are a reasonable hypothesis testing method and provide adequate power under a simple alternative hypothesis. These tests have yet to be compared to other spatial statistics. Results This research uses simulated point data generated under three alternative hypotheses to evaluate the properties of the permutation methods and compare them to the popular spatial scan statistic in a case-control setting. Case 1 was a single circular cluster centered in a circular study region. The spatial scan statistic had the highest power, though the GAM method estimates did not fall far behind. Case 2 was a single point source located at the center of a circular cluster and Case 3 was a line source at the center of the horizontal axis of a square study region. Each had linearly decreasing log-odds with distance from the point. The GAM methods outperformed the scan statistic in Cases 2 and 3. Comparing sensitivity, measured as the proportion of the exposure source correctly identified as high or low risk, the GAM methods outperformed the scan statistic in all three Cases. Conclusions The GAM
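    The permutation-test logic can be sketched as follows, using a k-nearest-neighbour smoother as a lightweight stand-in for the bivariate LOESS term of the GAM (Case 1 geometry, synthetic data).

```python
import numpy as np

rng = np.random.default_rng(3)
n = 300
xy = rng.uniform(-1.0, 1.0, size=(n, 2))                 # residential locations
# Circular high-risk cluster: risk 0.75 inside radius 0.4, 0.15 outside
risk = np.where(np.hypot(xy[:, 0], xy[:, 1]) < 0.4, 0.75, 0.15)
case = rng.random(n) < risk

# k-nearest-neighbour smoother (stand-in for the bivariate LOESS term)
d = np.hypot(xy[:, None, 0] - xy[None, :, 0], xy[:, None, 1] - xy[None, :, 1])
nbrs = np.argsort(d, axis=1)[:, :30]

def smooth_range(labels):
    """Test statistic: range of locally smoothed case proportions."""
    local = labels[nbrs].mean(axis=1)
    return local.max() - local.min()

# Permutation test: shuffle case labels over fixed locations
obs = smooth_range(case)
perm = np.array([smooth_range(rng.permutation(case)) for _ in range(199)])
p_value = (1 + (perm >= obs).sum()) / (1 + perm.size)
print(p_value)
```

    A small p-value indicates that residential location is associated with case status, i.e. the smoothing term is necessary.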

  15. Statistical annual report 2008 - Furnas Electric Power Plants Inc. - Calendar year 2007

    International Nuclear Information System (INIS)

    This 30th edition of the Furnas statistical annual report covers the performance of the company in 2007 and recent years, providing an overview of: the Furnas system; production and supply; economic and financial data; personnel; and indicators

  16. Accelerating pairwise statistical significance estimation for local alignment by harvesting GPU's power

    OpenAIRE

    Zhang, Yuhong; Misra, Sanchit; Agrawal, Ankit; Patwary, Md Mostofa Ali; Liao, Wei-keng; Qin, Zhiguang; Choudhary, Alok

    2012-01-01

    Background Pairwise statistical significance has been recognized to be able to accurately identify related sequences, which is a very important cornerstone procedure in numerous bioinformatics applications. However, it is both computationally and data intensive, which poses a big challenge in terms of performance and scalability. Results We present a GPU implementation to accelerate pairwise statistical significance estimation of local sequence alignment using standard substitution matrices. ...

  17. A systems assessment of the five Starlite tokamak power plants

    Energy Technology Data Exchange (ETDEWEB)

    Bathke, C.G.

    1996-07-01

    The ARIES team has assessed the power-plant attractiveness of the following five tokamak physics regimes: (1) steady state, first stability regime; (2) pulsed, first stability regime; (3) steady state, second stability regime; (4) steady state, reversed shear; and (5) steady state, low aspect ratio. Cost-based systems analysis of these five tokamak physics regimes suggests that an electric power plant based upon a reversed-shear tokamak is significantly more economical than one based on any of the other four physics regimes. Details of this comparative systems analysis are described herein.

  18. Real-time dynamic security assessment of power grids

    Science.gov (United States)

    Kerin, Uros; Heyde, Chris; Krebs, Rainer; Lerch, Edwin

    2014-10-01

    This paper presents a dynamic security assessment solution which can be used in the power system control room to improve system stability. It is based on a set of security indices that establish contingency severity levels as measures of different aspects of power system security. A fuzzy logic system is used to combine the indices into a single composite index. The composite index can alert the control operator to network conditions that represent a significant risk to system security, based on overall system performance.
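    A minimal sketch of combining per-contingency security indices through fuzzy rules, assuming triangular membership functions and centroid defuzzification; the rule base and index values are illustrative, not those of the deployed system.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

def composite_index(indices):
    """Combine security indices (0 = secure, 1 = insecure) with simple
    Mamdani-style rules and centroid defuzzification."""
    x = np.asarray(indices, dtype=float)
    low    = tri(x, -0.4, 0.0, 0.5).min()   # all indices low  -> secure
    medium = tri(x,  0.1, 0.5, 0.9).max()   # any index medium -> alert
    high   = tri(x,  0.5, 1.0, 1.4).max()   # any index high   -> insecure
    grid = np.linspace(0.0, 1.0, 101)
    agg = np.maximum.reduce([np.minimum(low,    tri(grid, -0.4, 0.0, 0.5)),
                             np.minimum(medium, tri(grid,  0.1, 0.5, 0.9)),
                             np.minimum(high,   tri(grid,  0.5, 1.0, 1.4))])
    return float((grid * agg).sum() / agg.sum())

print(composite_index([0.1, 0.2, 0.15]))  # mostly secure conditions
print(composite_index([0.1, 0.9, 0.2]))   # one severe contingency dominates
```

    The single severe contingency pulls the composite index up, which is the alerting behaviour the paper describes.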

  19. Probabilistic assessment of power system transient stability incorporating SMES

    Science.gov (United States)

    Fang, Jiakun; Yao, Wei; Wen, Jinyu; Cheng, Shijie; Tang, Yuejin; Cheng, Zhuo

    2013-01-01

    This paper presents a stochastic approach to evaluating the probabilistic transient stability index of a power system incorporating a wind farm and SMES. Uncertain factors include both the sequence of disturbances in the power grid and the stochastic generation of the wind farm. The spectrum of grid disturbances (the fault type, the fault location, the fault clearing time, and the automatic reclosing process) with their probabilities of occurrence is used to calculate the probability indices, while the wind speed statistics and the parameters of the wind generator are used in a Monte Carlo simulation to generate samples for the studies. With the proposed method, system stability is "measured". A quantitative relationship between penetration level, SMES coil size, and system stability is established. Considering the stability versus coil size curve to be the production curve, together with the cost function, the coil size is optimized economically.
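    The Monte Carlo estimation of a probabilistic stability index can be sketched as follows; the fault spectrum, the Weibull wind statistics, and the stability criterion below are toy stand-ins for the paper's time-domain simulations.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 10_000

# Fault spectrum: clearing times (s) with probabilities of occurrence (illustrative)
clearing_time = rng.choice([0.10, 0.15, 0.25], size=N, p=[0.6, 0.3, 0.1])
# Weibull wind-speed statistics (shape k=2, scale 8 m/s)
wind = 8.0 * rng.weibull(2.0, size=N)

def stable(ct, v, smes_mj):
    """Toy criterion: the critical clearing time shrinks with wind in-feed
    and grows with the SMES energy available to damp the first swing."""
    cct = 0.15 + 0.02 * smes_mj - 0.004 * np.clip(v, 0.0, 12.0)
    return ct < cct

# Probabilistic stability index versus SMES coil size
for smes_mj in (0.0, 2.0, 4.0):
    p_stab = stable(clearing_time, wind, smes_mj).mean()
    print(f"SMES {smes_mj:.0f} MJ -> P(stable) = {p_stab:.3f}")
```

    The resulting P(stable) versus coil size curve plays the role of the "production curve" that is then traded off against the SMES cost function.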

  20. Study of creep cavity growth for power plant lifetime assessment

    Energy Technology Data Exchange (ETDEWEB)

    Wu Rui; Sandstroem, Rolf

    2001-01-01

    This report belongs to the sub-project on lifetime assessment by creep (livslaengdspredikteringar vid kryp, i.e. lifetime predictions under creep), which is part of the project package strength in high temperature power plant, KME 708. Physical creep damage mainly comprises cavities and their development. Wu and Sandstroem observed that cavity size increases linearly with creep strain in a 12%Cr steel. Sandstroem showed that, based on the relations of creep cavity nucleation and growth to creep strain, physical creep damage can be modelled as a function of creep strain. In the present paper, the growth of the creep cavity radius R in relation to time t and strain {epsilon} has been studied in low alloy and 12%Cr steels as well as in a Type 347 steel. The results show that power-law relations of cavity radius to creep time (R-t) and to creep strain (R-{epsilon}) hold for these materials under various testing conditions. The power-law R-t and R-{epsilon} relations are in most cases dependent on and independent of testing conditions, respectively. The empirical power-law R-{epsilon} relations give a description of cavity evolution which can be used for lifetime assessment. Experimental data have also been compared with estimates from the classical models for cavity growth, including the power-law growth due to Hancock, the diffusion growth due to Speight and Harris, the constrained diffusion growths due to Dyson and due to Rice, and the enhanced diffusion growth due to Beere. It appears that the constrained diffusion growth models give a reasonable estimate of the R-{epsilon} relation in many cases. The diffusion growth model is only applicable in the limited cases where the exponent of t in the R-t relation is about 1/3. The power-law and enhanced diffusion models are found in most cases to overestimate the cavity growth.
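    An empirical power-law R-ε relation of the kind used here can be recovered from measurements by a straight-line fit in log-log space; the cavity data below are synthetic, generated from an assumed power law.

```python
import numpy as np

rng = np.random.default_rng(5)
# Synthetic cavity-growth data following R = A * eps**p (illustrative values)
A_true, p_true = 2.5, 0.6
eps = np.linspace(0.005, 0.05, 12)                            # creep strain
R = A_true * eps**p_true * rng.lognormal(0, 0.03, eps.size)   # cavity radius, um

# Power-law parameters from a straight line in log-log space:
# log R = log A + p * log eps
p_fit, logA_fit = np.polyfit(np.log(eps), np.log(R), 1)
print(p_fit, np.exp(logA_fit))
```

    The fitted exponent p and prefactor A then parameterize the empirical R-ε relation used for lifetime assessment.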

  1. Increasing Confidence in a Statistics Course: Assessing Students' Beliefs and Using the Data to Design Curriculum with Students

    Science.gov (United States)

    Huchting, Karen

    2013-01-01

    Students were involved in the curriculum design of a statistics course. They completed a pre-assessment of their confidence and skills using quantitative methods and statistics. Scores were aggregated, and anonymous data were shown on the first night of class. Using these data, the course was designed, demonstrating evidence-based instructional…

  2. Nuclear power plants: 2006 atw compact statistics; atw Schnellstatistik Kernkraftwerke 2006

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    2007-01-15

    At the turn of 2006/2007, nuclear power plants were available for energy supply, or under construction, in 32 countries of the world. A total of 437 nuclear power plants, which is 7 plants less than at the 2005/2006 turn, were in operation in 31 countries with an aggregate gross power of approx. 388 GWe and an aggregate net power, respectively, of 369 GWe. The available gross power of nuclear power plants dropped by approx. 1.6 GWe, the available net power, by approx. 1.2 GWe. The Tarapur 3 nuclear generating unit was commissioned in India, a D{sub 2}O PWR of 540 MWe gross power. Power operation was discontinued for good in 2006 only in nuclear power plants in Europe: Bohunice 1 (Slovak Republic, 440/408 MWe, VVER PWR); Kozloduy 3 and Kozloduy 4 (Bulgaria, 440/408 MWe each, VVER PWR); Dungeness A1 and Dungeness A2 (United Kingdom, 245/219 MWe each, Magnox GGR); Sizewell A1 and Sizewell A2 (United Kingdom, 236/210 MWe each, Magnox GGR), and Jose Cabrera 1 (Zorita) (Spain, 160/153 MWe, PWR). 29 nuclear generating units, i.e. 8 plants more than at the end of 2005, with an aggregate gross power of approx. 28 GWe, were under construction in 10 countries end of 2006. In China, construction of the Qinshan II-3, Qinshan II-4 nuclear generating units was started. In the Republic of Korea, construction work began on 4 new projects: Shin Kori 1, Shin Kori 2, and Shin Wolsong 1, Shin Wolsong 2. In Russia, work was resumed on the BN-800 sodium-cooled fast breeder reactor project at Beloyarsk and the RBMK Kursk 5. Some 40 new nuclear power plants are in the concrete project design, planning and licensing phases worldwide; on some of them, contracts have already been awarded. Another approximately seventy units are in their preliminary project phases. (orig.)

  3. World power production by geography and by product in the nineteenth and twentieth centuries. A statistical survey

    Energy Technology Data Exchange (ETDEWEB)

    Etemad, B. (Geneva Univ. (Switzerland))

    1992-10-01

    This paper presents a general view of trends in worldwide commercial primary power production since 1800. Certain series have been extended up to 1990 by a special updating effort. To grasp the main trends of this production by product group (coal, petroleum, natural gas, electricity) and by geographic area, six tables have been drawn up. To make these easier to read, they are accompanied by succinct notes for use. As this is a statistical survey, only structural changes and watershed events in the history of world power production are commented on and analyzed. 4 refs., 6 tabs.

  4. Probabilistic assessment of power system transient stability incorporating SMES

    Energy Technology Data Exchange (ETDEWEB)

    Fang, Jiakun, E-mail: Jiakun.Fang@gmail.com [State Key Lab of Advanced Electromagnetic Engineering and Technology, Huazhong University of Science and Technology, No. 1037, Luoyu Road, Wuhan 430074 (China); Yao, Wei [State Key Lab of Advanced Electromagnetic Engineering and Technology, Huazhong University of Science and Technology, No. 1037, Luoyu Road, Wuhan 430074 (China); Wen, Jinyu, E-mail: jinyu.wen@hust.edu.cn [State Key Lab of Advanced Electromagnetic Engineering and Technology, Huazhong University of Science and Technology, No. 1037, Luoyu Road, Wuhan 430074 (China); Cheng, Shijie; Tang, Yuejin; Cheng, Zhuo [State Key Lab of Advanced Electromagnetic Engineering and Technology, Huazhong University of Science and Technology, No. 1037, Luoyu Road, Wuhan 430074 (China)

    2013-01-15

    Highlights: ► Probabilistic study of power system with wind farm and SMES is proposed. ► Quantitative relationship between system stability and SMES capacity is given. ► System stability increases with the capacity of the SMES. ► System stability decreases with the penetration of wind power. ► Together with the cost function, the coil size is optimized. -- Abstract: This paper presents a stochastic approach to evaluating the probabilistic transient stability index of a power system incorporating a wind farm and SMES. Uncertain factors include both the sequence of disturbances in the power grid and the stochastic generation of the wind farm. The spectrum of grid disturbances (the fault type, the fault location, the fault clearing time, and the automatic reclosing process) with their probabilities of occurrence is used to calculate the probability indices, while the wind speed statistics and the parameters of the wind generator are used in a Monte Carlo simulation to generate samples for the studies. With the proposed method, system stability is "measured". A quantitative relationship between penetration level, SMES coil size, and system stability is established. Considering the stability versus coil size curve to be the production curve, together with the cost function, the coil size is optimized economically.

  5. Probabilistic assessment of power system transient stability incorporating SMES

    International Nuclear Information System (INIS)

    Highlights: ► Probabilistic study of power system with wind farm and SMES is proposed. ► Quantitative relationship between system stability and SMES capacity is given. ► System stability increases with the capacity of the SMES. ► System stability decreases with the penetration of wind power. ► Together with the cost function, the coil size is optimized. -- Abstract: This paper presents a stochastic approach to evaluating the probabilistic transient stability index of a power system incorporating a wind farm and SMES. Uncertain factors include both the sequence of disturbances in the power grid and the stochastic generation of the wind farm. The spectrum of grid disturbances (the fault type, the fault location, the fault clearing time, and the automatic reclosing process) with their probabilities of occurrence is used to calculate the probability indices, while the wind speed statistics and the parameters of the wind generator are used in a Monte Carlo simulation to generate samples for the studies. With the proposed method, system stability is "measured". A quantitative relationship between penetration level, SMES coil size, and system stability is established. Considering the stability versus coil size curve to be the production curve, together with the cost function, the coil size is optimized economically

  6. Assessing incentive policies for integrating centralized solar power generation in the Brazilian electric power system

    International Nuclear Information System (INIS)

    This study assesses the impacts of promoting, through auctions, centralized solar power generation (concentrated solar power – CSP, and photovoltaic solar panels – PV) on the Brazilian power system. Four types of CSP plants with parabolic troughs were simulated at two sites, Bom Jesus da Lapa and Campo Grande, and PV plants were simulated at two other sites, Recife and Rio de Janeiro. The main parameters obtained for each plant were expanded to other suitable sites in the country (totaling 17.2 GW in 2040) as inputs to an optimization model for evaluating the impacts of the introduction of centralized solar power on the expansion of the electricity grid up to 2040. This scenario would be about USD 185 billion more expensive than a business-as-usual scenario, where expansion relies solely on least-cost options. Hence, for the country to incentivize the expansion of centralized solar power, specific auctions for solar energy should be adopted, as well as complementary policies to promote investments in R&D and the use of hybrid systems based on solar and fuels in CSP plants. - Highlights: • We assess the impacts of promoting centralized CSP and PV by auctions in Brazil. • We simulate energy scenarios with and without solar power. • Our solar scenario leads to 17 GW of solar capacity installed between 2020 and 2040. • This solar scenario is some USD 185 billion more expensive than the base case

  7. Understanding Statistical Power in Cluster Randomized Trials: Challenges Posed by Differences in Notation and Terminology

    Science.gov (United States)

    Spybrook, Jessaca; Hedges, Larry; Borenstein, Michael

    2014-01-01

    Research designs in which clusters are the unit of randomization are quite common in the social sciences. Given the multilevel nature of these studies, the power analyses for these studies are more complex than in a simple individually randomized trial. Tools are now available to help researchers conduct power analyses for cluster randomized…
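    The cluster-level power calculation such tools perform can be sketched with the standard design-effect approximation (normal approximation, equal cluster sizes assumed; the effect size and ICC values are illustrative).

```python
import math
from scipy.stats import norm

def crt_power(delta, icc, clusters_per_arm, cluster_size, alpha=0.05):
    """Approximate power for a two-arm cluster randomized trial.

    delta: standardized effect size; icc: intraclass correlation.
    Inflates the individually randomized variance by the design
    effect 1 + (m - 1) * icc, then applies a normal approximation."""
    m = cluster_size
    n_per_arm = clusters_per_arm * m
    deff = 1.0 + (m - 1.0) * icc
    se = math.sqrt(2.0 * deff / n_per_arm)
    z = abs(delta) / se - norm.ppf(1.0 - alpha / 2.0)
    return norm.cdf(z)

# Same total sample size; power falls sharply as the ICC grows
for icc in (0.0, 0.05, 0.15):
    print(f"icc={icc:.2f}: power={crt_power(0.25, icc, 20, 25):.2f}")
```

    This makes concrete why multilevel notation matters: ignoring the ICC (icc=0) badly overstates the power of a cluster randomized design.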

  8. Influence of motor unit firing statistics on the median frequency of the EMG power spectrum

    NARCIS (Netherlands)

    van Boxtel, Anton; Schomaker, L R

    1984-01-01

    Changes in the EMG power spectrum during static fatiguing contractions are often attributed to changes in muscle fibre action potential conduction velocity. Mathematical models of the EMG power spectrum, which have been empirically confirmed, predict that under certain conditions a distinct maximum
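    The median frequency of an EMG power spectrum can be computed from a Welch estimate as follows; the surrogate signal below is band-limited noise, not recorded EMG.

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(6)
fs = 1000.0                                  # sampling rate, Hz
t = np.arange(0, 4.0, 1.0 / fs)
# Surrogate surface EMG: white noise band-limited to roughly 20-150 Hz
white = rng.normal(size=t.size)
spectrum = np.fft.rfft(white)
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
spectrum[(freqs < 20.0) | (freqs > 150.0)] = 0.0
emg = np.fft.irfft(spectrum, n=t.size)

# Median frequency: the frequency that splits the total power in half
f, pxx = welch(emg, fs=fs, nperseg=1024)
cum = np.cumsum(pxx)
median_freq = f[np.searchsorted(cum, cum[-1] / 2.0)]
print(median_freq)
```

    During fatiguing contractions, a fall in conduction velocity compresses the spectrum and shifts this median frequency downward.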

  9. The Power of Student's t and Wilcoxon W Statistics: A Comparison.

    Science.gov (United States)

    Rasmussen, Jeffrey Lee

    1985-01-01

    A recent study (Blair and Higgins, 1980) indicated a power advantage for the Wilcoxon W Test over student's t-test when calculated from a common mixed-normal sample. Results of the present study indicate that the t-test corrected for outliers shows a superior power curve to the Wilcoxon W.
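    A power comparison of this kind is easy to reproduce by Monte Carlo simulation; the sketch below contrasts the uncorrected t-test with the Wilcoxon (Mann-Whitney) test under a contaminated-normal model, with parameters that are illustrative rather than those of the cited studies.

```python
import numpy as np
from scipy.stats import ttest_ind, mannwhitneyu

rng = np.random.default_rng(7)

def mixed_normal(n, shift=0.0):
    """Contaminated normal: 90% N(shift, 1), 10% N(shift, 5**2) outliers."""
    outlier = rng.random(n) < 0.10
    return shift + rng.normal(0.0, np.where(outlier, 5.0, 1.0), n)

reps, n, shift = 500, 30, 1.0
rej_t = rej_w = 0
for _ in range(reps):
    a, b = mixed_normal(n, shift), mixed_normal(n)
    rej_t += ttest_ind(a, b).pvalue < 0.05
    rej_w += mannwhitneyu(a, b, alternative="two-sided").pvalue < 0.05
print(f"t-test power ~ {rej_t / reps:.2f}, Wilcoxon power ~ {rej_w / reps:.2f}")
```

    Under this mixture, the rank test beats the uncorrected t-test; the study's point is that trimming the outliers restores the advantage of the t-test.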

  10. A Powerful Test of the Autoregressive Unit Root Hypothesis Based on a Tuning Parameter Free Statistic

    DEFF Research Database (Denmark)

    Nielsen, Morten Ørregaard

    bandwidth, lag length, etc., but have none of these three properties. It is shown that members of the family with d < 1 have higher asymptotic local power than the Breitung (2002) test, and when d is small the asymptotic local power of the proposed nonparametric test is relatively close to the parametric...

  11. The assessment of tornado missile hazard to nuclear power plants

    International Nuclear Information System (INIS)

    Numerical methods and computer codes for assessing tornado missile hazards to nuclear power plants are developed. The method of calculation is based on the theoretical model developed earlier by the authors. Historical data for tornado characteristics are taken from the computerized files of the National Severe Storms Forecast Center, and the characteristics of potential missiles are adopted from an EPRI report. Due to the uncertainty and randomness of the characteristics of tornadoes and tornado-generated missiles, the damage probability of targets has a highly dispersed distribution. The proposed method is very useful for assessing the risk of not providing protection to some nonsafety-related targets whose failure can create a hazard to the safe operation of nuclear power plants

  12. Suppressing the non-Gaussian statistics of Renewable Power from Wind and Solar

    CERN Document Server

    Anvari, M; Tabar, M Reza Rahimi; Wächter, M; Milan, P; Heinemann, D; Peinke, Joachim; Lorenz, E

    2015-01-01

    The power from wind and solar exhibits a nonlinear flickering variability, which typically occurs at time scales of a few seconds. We show that high-frequency monitoring of such renewable power enables us to detect a transition, controlled by the field size, where the output power qualitatively changes its behaviour from a flickering type to diffusive stochastic behaviour. We find that the intermittency and strong non-Gaussian behaviour in the cumulative power of the total field survives for both renewable sources, even for a country-wide installation. To overcome the short-time intermittency, we introduce a time-delayed feedback method for the power output of wind farms and solar fields that can further modify the underlying stochastic process and suppress its strong non-Gaussian fluctuations.
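    The aggregation effect can be illustrated with surrogate signals: heavy-tailed "flicker" in a single unit, against increments that become nearly Gaussian once many units are summed. The lognormal fluctuation model is an assumption for illustration, not the paper's measured feed-in data.

```python
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(9)
T = 200_000
# Surrogate short-time power signals (synthetic):
single = rng.lognormal(mean=0.0, sigma=0.5, size=T)                   # one unit
field = rng.lognormal(mean=0.0, sigma=0.5, size=(50, T)).sum(axis=0)  # 50 units

k_single = kurtosis(np.diff(single))  # excess kurtosis of power increments
k_field = kurtosis(np.diff(field))    # aggregation drives increments to Gaussian
print(k_single, k_field)
```

    Excess kurtosis well above zero flags the intermittent, non-Gaussian increments of the single unit; the aggregated field is far closer to Gaussian, mirroring the field-size-controlled transition described above.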

  13. Statistical analysis of data from limiting dilution cloning to assess monoclonality in generating manufacturing cell lines.

    Science.gov (United States)

    Quiroz, Jorge; Tsao, Yung-Shyeng

    2016-07-01

    Assurance of the monoclonality of recombinant cell lines is a critical issue in gaining regulatory approval in a biological license application (BLA). Among the requirements of regulatory agencies are proper documentation and appropriate statistical analysis to demonstrate monoclonality. In some cases, one round of cloning may be sufficient to demonstrate monoclonality. In this article, we propose the use of confidence intervals for assessing monoclonality for limiting dilution cloning in the generation of recombinant manufacturing cell lines based on a single round. The use of confidence intervals instead of point estimates allows practitioners to account for the uncertainty present in the data when assessing whether an estimated level of monoclonality is consistent with regulatory requirements. In other cases, one round may not be sufficient and two consecutive rounds are required to assess monoclonality. When two consecutive subclonings are required, we improve on the present methodology by reducing the infinite series proposed by Coller and Coller (Hybridoma 1983;2:91-96) to a simpler series. The proposed simpler series provides more accurate and reliable results, reduces the level of computation, and can be easily implemented in any spreadsheet program such as Microsoft Excel. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:1061-1068, 2016.
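    For a single round, a monoclonality probability and a confidence interval can be sketched from the standard Poisson seeding argument; this is a generic sketch with illustrative plate counts, not the article's exact interval construction.

```python
import numpy as np
from scipy.stats import beta

def p_monoclonal(lam):
    """P(well was seeded by exactly one cell | well shows growth),
    assuming Poisson-distributed seeding with mean lam cells/well."""
    return lam * np.exp(-lam) / (1.0 - np.exp(-lam))

# Illustrative plate counts: 960 wells seeded, 720 show no growth
wells, empty = 960, 720
lam_hat = -np.log(empty / wells)        # from P(0 cells) = exp(-lam)
print(p_monoclonal(lam_hat))

# Clopper-Pearson 95% interval for P(0), propagated to monoclonality
p0_lo = beta.ppf(0.025, empty, wells - empty + 1)
p0_hi = beta.ppf(0.975, empty + 1, wells - empty)
ci = (p_monoclonal(-np.log(p0_lo)), p_monoclonal(-np.log(p0_hi)))
print(ci)
```

    Reporting the interval rather than the point estimate is exactly the practice the article advocates: the lower bound, not the estimate, is compared against the regulatory threshold.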

  14. Multivariate statistical assessment of predictors of firefighters' muscular and aerobic work capacity.

    Directory of Open Access Journals (Sweden)

    Ann-Sofie Lindberg

    Full Text Available Physical capacity has previously been deemed important for firefighters' physical work capacity, and aerobic fitness, muscular strength, and muscular endurance are the most frequently investigated parameters of importance. Traditionally, bivariate and multivariate linear regression statistics have been used to study relationships between physical capacities and work capacities among firefighters. An alternative way to handle datasets consisting of numerous correlated variables is to use multivariate projection analyses, such as Orthogonal Projection to Latent Structures. The first aim of the present study was to evaluate the prediction and predictive power of field and laboratory tests, respectively, on firefighters' physical work capacity on selected work tasks, and to study whether valid predictions could be achieved without anthropometric data. The second aim was to externally validate selected models. The third aim was to validate selected models on firefighters and on civilians. A total of 38 (26 men and 12 women) and 90 (38 men and 52 women) subjects were included in the models and the external validation, respectively. The best prediction (R2) and predictive power (Q2) of Stairs, Pulling, Demolition, Terrain, and Rescue work capacities included field tests (R2 = 0.73 to 0.84, Q2 = 0.68 to 0.82). The best external validation was for Stairs work capacity (R2 = 0.80) and the worst for Demolition work capacity (R2 = 0.40). In conclusion, field and laboratory tests could equally well predict physical work capacities for firefighting work tasks, and models excluding anthropometric data were valid. The predictive power was satisfactory for all included work tasks except Demolition.

  15. Sea cliff instability susceptibility at regional scale: A statistically based assessment in southern Algarve, Portugal.

    Science.gov (United States)

    Marques, F.; Matildes, R.; Redweik, P.

    2012-04-01

    Mass movements are the dominant process of sea cliff evolution, being a considerable source of natural hazard and a significant constraint on human activities in coastal areas. Related hazards include cliff-top retreat, with implications for planning and land management, and unstable soil or rock movements at the cliff face and toe, with implications mainly for beach users and support structures. To assess the spatial component of sea cliff hazard with implications for planning, i.e. the susceptibility of a given cliff section to be affected by instabilities causing retreat of the cliff top, a statistically based study was carried out along the top of the sea cliffs of the Burgau-Lagos coastal section (Southwest Algarve, Portugal). The study was based on bivariate and multivariate statistics applied to a set of predisposing factors, mainly related to geology and geomorphology, which were correlated with an inventory of past cliff failures. The multi-temporal inventory of past cliff failures was produced using aerial digital photogrammetric methods, which included special procedures to enable the extraction of accurate data from old aerial photos, and was validated by systematic stereo photo interpretation, helped by oblique aerial photos and field surveys. This study identified 137 cliff failures that occurred between 1947 and 2007 along the 13 km of cliffs, causing the loss of 10,234 m2 of horizontal area at the cliff top. The cliff failures correspond to planar slides (58%), mainly in Cretaceous alternating limestones and marls; toppling failures (17%), mainly in Miocene calcarenites; slumps (15%) in Plio-Pleistocene silty sands that infill the karst in the Miocene rocks; the remaining 10% correspond to complex movements, rockfalls and undetermined cases. 
The spatial distribution of cliff failures is quite irregular but enables the objective separation of subsections with homogeneous retreat behaviour, for which mean retreat rates were computed between 5x10-3 m

  16. Statistical Analysis of Meteorological Data to Assess Evapotranspiration and Infiltration at the Rifle Site, CO, USA

    Science.gov (United States)

    Faybishenko, B.; Long, P. E.; Tokunaga, T. K.; Christensen, J. N.

    2015-12-01

    Net infiltration to the vadose zone, especially in arid or semi-arid climates, is an important control on microbial activity and solute and greenhouse gas fluxes. To assess net infiltration, we performed a statistical analysis of meteorological data as the basis for hydrological and climatic investigations and predictions for the Rifle site, Colorado, USA, located within a floodplain in a mountainous region along the Colorado River, with a semi-arid climate. We carried out a statistical analysis of meteorological 30-year time series data (1985-2015), including: (1) precipitation data, taking into account the evaluation of the snowmelt, (2) evaluation of the evapotranspiration (reference and actual), (3) estimation of the multi-time-scalar Standardized Precipitation-Evapotranspiration Index (SPEI), (4) evaluation of the net infiltration rate, and (5) corroborative analysis of calculated net infiltration rate and groundwater recharge from radioisotopic measurements from samples collected in 2013. We determined that annual net infiltration as a percentage of precipitation varies from 4.7% to ~18%, with a mean of ~10%, and concluded that calculations of net infiltration based on long-term meteorological data are comparable with those from strontium isotopic investigations. The evaluation of the SPEI showed the intermittent pattern of droughts and wet periods over the past 30 years, with a detectable decrease in the duration of droughts with time. Local measurements within the floodplain indicate a recharge gradient with increased recharge closer to the Colorado River.

  17. A Statistical Method for Assessing Peptide Identification Confidence in Accurate Mass and Time Tag Proteomics

    Energy Technology Data Exchange (ETDEWEB)

    Stanley, Jeffrey R.; Adkins, Joshua N.; Slysz, Gordon W.; Monroe, Matthew E.; Purvine, Samuel O.; Karpievitch, Yuliya V.; Anderson, Gordon A.; Smith, Richard D.; Dabney, Alan R.

    2011-07-15

    High-throughput proteomics is rapidly evolving to require high mass measurement accuracy for a variety of different applications. Increased mass measurement accuracy in bottom-up proteomics specifically allows for an improved ability to distinguish and characterize detected MS features, which may in turn be identified by, e.g., matching to entries in a database for both precursor and fragmentation mass identification methods. Many tools exist with which to score the identification of peptides from LC-MS/MS measurements or to assess matches to an accurate mass and time (AMT) tag database, but these two calculations remain distinctly unrelated. Here we present a statistical method, Statistical Tools for AMT tag Confidence (STAC), which extends our previous work incorporating prior probabilities of correct sequence identification from LC-MS/MS, as well as the quality with which LC-MS features match AMT tags, to evaluate peptide identification confidence. Compared to existing tools, we are able to obtain significantly more high-confidence peptide identifications at a given false discovery rate and additionally assign confidence estimates to individual peptide identifications. Freely available software implementations of STAC are available in both command line and as a Windows graphical application.

  18. Proper assessment of the JFK assassination bullet lead evidence from metallurgical and statistical perspectives.

    Science.gov (United States)

    Randich, Erik; Grant, Patrick M

    2006-07-01

    The bullet evidence in the JFK assassination investigation was reexamined from metallurgical and statistical standpoints. The questioned specimens are comprised of soft lead, possibly from full-metal-jacketed Mannlicher-Carcano (MC), 6.5-mm ammunition. During lead refining, contaminant elements are removed to specified levels for a desired alloy or composition. Microsegregation of trace and minor elements during lead casting and processing can account for the experimental variabilities measured in various evidentiary and comparison samples by laboratory analysts. Thus, elevated concentrations of antimony and copper at crystallographic grain boundaries, the widely varying sizes of grains in MC bullet lead, and the 5-60 mg bullet samples analyzed for assassination intelligence effectively resulted in operational sampling error for the analyses. This deficiency was not considered in the original data interpretation and resulted in an invalid conclusion in favor of the single-bullet theory of the assassination. Alternate statistical calculations, based on the historic analytical data, incorporating weighted averaging and propagation of experimental uncertainties also considerably weaken support for the single-bullet theory. In effect, this assessment of the material composition of the lead specimens from the assassination concludes that the extant evidence is consistent with any number between two and five rounds fired in Dealey Plaza during the shooting.
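The weighted averaging with propagation of experimental uncertainties mentioned here is, in its simplest form, the standard inverse-variance scheme; a minimal sketch, with hypothetical antimony concentrations standing in for the historic analytical data:

```python
def weighted_mean(values, sigmas):
    """Inverse-variance weighted mean with its propagated 1-sigma uncertainty."""
    weights = [1.0 / s**2 for s in sigmas]
    wsum = sum(weights)
    mean = sum(w * v for w, v in zip(weights, values)) / wsum
    return mean, (1.0 / wsum) ** 0.5  # propagated uncertainty of the mean

# Hypothetical Sb concentrations (ppm) from replicate bullet-lead analyses,
# each with its own measurement uncertainty
m, s = weighted_mean([833, 815, 790], [20, 25, 40])
print(round(m, 1), round(s, 1))
```

The propagated uncertainty of the weighted mean is larger than a naive standard error would suggest when the replicates scatter, which is the abstract's point about microsegregation inflating the true variability between samples.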

  19. Proper Assessment of the JFK Assassination Bullet Lead Evidence from Metallurgical and Statistical Perspectives

    Energy Technology Data Exchange (ETDEWEB)

    Randich, E; Grant, P M

    2006-08-29

    The bullet evidence in the JFK assassination investigation was reexamined from metallurgical and statistical standpoints. The questioned specimens are comprised of soft lead, possibly from full-metal-jacketed Mannlicher-Carcano, 6.5-mm ammunition. During lead refining, contaminant elements are removed to specified levels for a desired alloy or composition. Microsegregation of trace and minor elements during lead casting and processing can account for the experimental variabilities measured in various evidentiary and comparison samples by laboratory analysts. Thus, elevated concentrations of antimony and copper at crystallographic grain boundaries, the widely varying sizes of grains in Mannlicher-Carcano bullet lead, and the 5-60 mg bullet samples analyzed for assassination intelligence effectively resulted in operational sampling error for the analyses. This deficiency was not considered in the original data interpretation and resulted in an invalid conclusion in favor of the single-bullet theory of the assassination. Alternate statistical calculations, based on the historic analytical data, incorporating weighted averaging and propagation of experimental uncertainties also considerably weaken support for the single-bullet theory. In effect, this assessment of the material composition of the lead specimens from the assassination concludes that the extant evidence is consistent with any number between two and five rounds fired in Dealey Plaza during the shooting.

  20. Robust statistical approaches to assess the degree of agreement of clinical data

    Science.gov (United States)

    Grilo, Luís M.; Grilo, Helena L.

    2016-06-01

    To analyze the blood of patients who took vitamin B12 for a period of time, two different measurement methods were used (one is the established method, with more human intervention; the other relies essentially on machines). Given the non-normality of the differences between the two measurement methods, the limits of agreement are also estimated using a non-parametric approach to assess the degree of agreement of the clinical data. The bootstrap resampling method is applied in order to obtain robust confidence intervals for the mean and median of the differences. The approaches used are easy to apply with user-friendly software, and their outputs are also easy to interpret. In this case study the results obtained with the parametric and non-parametric approaches lead to different statistical conclusions, but the decision whether agreement is acceptable or not is always a clinical judgment.
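A percentile bootstrap of the paired differences, as described, might look like the following sketch (the skewed difference sample here is simulated, not the clinical data):

```python
import numpy as np

rng = np.random.default_rng(42)

def bootstrap_ci(diffs, stat=np.mean, n_boot=5000, alpha=0.05):
    """Percentile-bootstrap CI for a statistic of the paired differences
    between two measurement methods; makes no normality assumption."""
    diffs = np.asarray(diffs)
    idx = rng.integers(0, diffs.size, size=(n_boot, diffs.size))
    reps = stat(diffs[idx], axis=1)          # statistic on each resample
    lo, hi = np.percentile(reps, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return lo, hi

# Hypothetical paired differences (method A minus method B): skewed, non-normal
diffs = rng.normal(1.0, 4.0, 60) + rng.exponential(2.0, 60)
print(bootstrap_ci(diffs, np.mean))
print(bootstrap_ci(diffs, np.median))
```

Resampling entire difference vectors with replacement preserves the skew of the data, which is why the mean and median intervals can disagree on whether the bias between methods is clinically acceptable.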

  1. Sea cliff instability susceptibility at regional scale: a statistically based assessment in southern Algarve, Portugal

    Directory of Open Access Journals (Sweden)

    F. M. S. F. Marques

    2013-05-01

    along their top. The study was based on the application of the bivariate Information Value and multivariate Logistic Regression statistical methods, using a set of predisposing factors for cliff failures, mainly related to geology (lithology, bedding dip, faults) and geomorphology (maximum and mean slope, height, aspect, plan curvature, toe protection), which were correlated with a photogrammetry-based inventory of cliff failures that occurred in a 60 yr period (1947–2007). The susceptibility models were validated against the inventory data using standard success rate and ROC curves, and provided encouraging results, indicating that the proposed approaches are effective for susceptibility assessment. The results obtained also stress the need for improvement of the predisposing factors used in this type of study and the need for detailed and systematic cliff failure inventories.

  2. Water Quality Assessment of Gufu River in Three Gorges Reservoir (China Using Multivariable Statistical Methods

    Directory of Open Access Journals (Sweden)

    Jiwen Ge

    2013-07-01

    Full Text Available To provide a reasonable basis for scientific management of water resources and certain directive significance for sustaining the health of the Gufu River and maintaining the stability of the water ecosystem of the Three Gorges Reservoir of the Yangtze River, central China, multiple statistical methods including Cluster Analysis (CA), Discriminant Analysis (DA) and Principal Component Analysis (PCA) were performed to assess the spatial-temporal variations and interpret water quality data. The data were obtained during one year (2010~2011) of monitoring of 13 parameters at 21 different sites (3003 observations). Hierarchical CA classified 11 months into 2 periods (the first and second periods) and 21 sampling sites into 2 clusters, namely the upper reaches with little anthropogenic interference (UR) and the lower reaches running through farming areas and towns that are subjected to some human interference (LR), based on similarities in the water quality characteristics. Eight significant parameters (total phosphorus, total nitrogen, temperature, nitrate nitrogen, total organic carbon, total hardness, total alkalinity and silicon dioxide) were identified by DA, affording 100% correct assignations for temporal variation analysis, and five significant parameters (total phosphorus, total nitrogen, ammonia nitrogen, electrical conductivity and total organic carbon) were confirmed with 88% correct assignations for spatial variation analysis. PCA (with varimax rotation) was applied to identify potential pollution sources based on the two clustered regions. Four Principal Components (PCs) with 91.19 and 80.57% total variances were obtained for the Upper Reaches (UR) and Lower Reaches (LR) regions, respectively. For the UR region, rainfall runoff, soil erosion, scouring weathering of crustal materials and forest areas are the main sources of pollution. The pollution sources for the LR region are anthropogenic sources (domestic and agricultural runoff

  3. Fighting bias with statistics: Detecting gender differences in responses to items on a preschool science assessment

    Science.gov (United States)

    Greenberg, Ariela Caren

    Differential item functioning (DIF) and differential distractor functioning (DDF) are methods used to screen for item bias (Camilli & Shepard, 1994; Penfield, 2008). Using an applied empirical example, this mixed-methods study examined the congruency and relationship of DIF and DDF methods in screening multiple-choice items. Data for Study I were drawn from item responses of 271 female and 236 male low-income children on a preschool science assessment. Item analyses employed a common statistical approach, the Mantel-Haenszel log-odds ratio (MH-LOR), to detect DIF in dichotomously scored items (Holland & Thayer, 1988), and extended the approach to identify DDF (Penfield, 2008). Findings demonstrated that using MH-LOR to detect DIF and DDF supported the theoretical relationship that the magnitude and form of DIF are dependent on the DDF effects, and demonstrated the advantages of studying DIF and DDF in multiple-choice items. A total of 4 items with DIF and DDF and 5 items with only DDF were detected. Study II incorporated an item content review, an important but often overlooked and under-published step of DIF and DDF studies (Camilli & Shepard, 1994). Interviews with 25 female and 22 male low-income preschool children and an expert review helped to interpret the DIF and DDF results and their comparison, and determined that a content review process of studied items can reveal reasons for potential item bias that are often congruent with the statistical results. Patterns emerged and are discussed in detail. The quantitative and qualitative analyses were conducted in an applied framework of examining the validity of the preschool science assessment scores for evaluating science programs serving low-income children; however, the techniques can be generalized for use with measures across various disciplines of research.
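The Mantel-Haenszel log-odds ratio underlying the DIF screen pools 2x2 tables (group by correct/incorrect) across ability strata; a minimal sketch with hypothetical counts:

```python
import math

def mh_log_odds(tables):
    """Mantel-Haenszel common log-odds ratio across ability strata.
    Each table is (a, b, c, d): reference correct/incorrect, focal
    correct/incorrect.  Positive values indicate the item favours the
    reference group (Holland & Thayer, 1988)."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return math.log(num / den)

# Hypothetical counts in three total-score strata (reference vs. focal group)
strata = [(30, 10, 25, 15), (40, 8, 35, 13), (45, 5, 42, 8)]
print(round(mh_log_odds(strata), 3))
```

Stratifying on total score before pooling is what separates true item bias from mere group differences in overall ability; DDF analyses apply the same machinery to each distractor instead of the keyed response.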

  4. An investigation of the statistical power of neutrality tests based on comparative and population genetic data

    DEFF Research Database (Denmark)

    Zhai, Weiwei; Nielsen, Rasmus; Slatkin, Montgomery

    2009-01-01

    We show that in the presence of repeated selective sweeps on a relatively neutral background, tests based on the d(N)/d(S) ratios in comparative data almost always have more power to detect selection than tests based on population genetic data, even if the overall level of divergence is low. Tests based solely on the distribution of allele frequencies or the site frequency spectrum, such as the Ewens-Watterson test or Tajima's D, have less power in detecting both positive and negative selection because of the transient nature of positive selection and the weak signal left by negative selection. The Hudson-Kreitman-Aguadé test is the most powerful test for detecting positive selection among the population genetic tests investigated, whereas the McDonald-Kreitman test typically has more power to detect negative selection. We discuss our findings in the light of the discordant results obtained…

  5. Statistical Power in Evaluations That Investigate Effects on Multiple Outcomes: A Guide for Researchers

    Science.gov (United States)

    Porter, Kristin E.

    2016-01-01

    In education research and in many other fields, researchers are often interested in testing the effectiveness of an intervention on multiple outcomes, for multiple subgroups, at multiple points in time, or across multiple treatment groups. The resulting multiplicity of statistical hypothesis tests can lead to spurious findings of effects. Multiple…

  6. The Surprising Power of Statistical Learning: When Fragment Knowledge Leads to False Memories of Unheard Words

    Science.gov (United States)

    Endress, Ansgar D.; Mehler, Jacques

    2009-01-01

    Word-segmentation, that is, the extraction of words from fluent speech, is one of the first problems language learners have to master. It is generally believed that statistical processes, in particular those tracking "transitional probabilities" (TPs), are important to word-segmentation. However, there is evidence that word forms are stored in…

  7. Characterizing Key Developmental Understandings and Pedagogically Powerful Ideas within a Statistical Knowledge for Teaching Framework

    Science.gov (United States)

    Groth, Randall E.

    2013-01-01

    A hypothetical framework to characterize statistical knowledge for teaching (SKT) is described. Empirical grounding for the framework is provided by artifacts from an undergraduate course for prospective teachers that concentrated on the development of SKT. The theoretical notion of "key developmental understanding" (KDU) is used to identify…

  8. Reliability assessment of distribution power systems including distributed generations

    International Nuclear Information System (INIS)

    Nowadays, power systems have reached a good level of reliability. Nevertheless, considering the modifications induced by the connection of small independent producers to distribution networks, there is a need to assess the reliability of these new systems. Distribution networks present several functional characteristics, highlighted by a qualitative study of failures: loads dispersed at several places, variable topology, and electrotechnical phenomena that must be taken into account to model the events that can occur. The adopted reliability calculation method is Monte Carlo simulation, the most powerful and flexible probabilistic method for modelling the complex operation of a distribution system. The first part is devoted to the case of a 20 kV feeder to which a cogeneration unit is connected; here the method was implemented in stochastic Petri net simulation software. The second part relates to the study of a low-voltage power system supplied by dispersed generation. Here, the complexity of the events required coding the method in a programming environment allowing the use of power system calculations (load flow, short-circuit, load shedding, management of unit powers) in order to analyse the system state after each new event. (author)

  9. Quantitative assessment of aquatic impacts of power plants

    Energy Technology Data Exchange (ETDEWEB)

    McKenzie, D.H.; Arnold, E.M.; Skalski, J.R.; Fickeisen, D.H.; Baker, K.S.

    1979-08-01

    Progress is reported in a continuing study of the design and analysis of aquatic environmental monitoring programs for assessing the impacts of nuclear power plants. Analysis of data from Calvert Cliffs, Pilgrim, and San Onofre nuclear power plants confirmed the generic applicability of the control-treatment pairing design suggested by McKenzie et al. (1977). Substantial progress was made on the simulation model evaluation task. A process notebook was compiled in which each model equation was translated into a standardized notation. Individual model testing and evaluating was started. The Aquatic Generalized Environmental Impact Simulator (AGEIS) was developed and will be tested using data from Lake Keowee, South Carolina. Further work is required to test the various models and perfect AGEIS for impact analyses at actual power plant sites. Efforts on the hydrologic modeling task resulted in a compendium of models commonly applied to nuclear power plants and the application of two well-received hydrodynamic models to data from the Surry Nuclear Power Plant in Virginia. Conclusions from the study of these models indicate that slight inaccuracies of boundary data have little influence on mass conservation and accurate bathymetry data are necessary for conservation of mass through the model calculations. The hydrologic modeling task provides valuable reference information for model users and monitoring program designers.

  10. Quantitative assessment of aquatic impacts of power plants

    International Nuclear Information System (INIS)

    Progress is reported in a continuing study of the design and analysis of aquatic environmental monitoring programs for assessing the impacts of nuclear power plants. Analysis of data from Calvert Cliffs, Pilgrim, and San Onofre nuclear power plants confirmed the generic applicability of the control-treatment pairing design suggested by McKenzie et al. (1977). Substantial progress was made on the simulation model evaluation task. A process notebook was compiled in which each model equation was translated into a standardized notation. Individual model testing and evaluating was started. The Aquatic Generalized Environmental Impact Simulator (AGEIS) was developed and will be tested using data from Lake Keowee, South Carolina. Further work is required to test the various models and perfect AGEIS for impact analyses at actual power plant sites. Efforts on the hydrologic modeling task resulted in a compendium of models commonly applied to nuclear power plants and the application of two well-received hydrodynamic models to data from the Surry Nuclear Power Plant in Virginia. Conclusions from the study of these models indicate that slight inaccuracies of boundary data have little influence on mass conservation and accurate bathymetry data are necessary for conservation of mass through the model calculations. The hydrologic modeling task provides valuable reference information for model users and monitoring program designers

  11. Statistical power to detect change in a mangrove shoreline fish community adjacent to a nuclear power plant.

    Science.gov (United States)

    Dolan, T E; Lynch, P D; Karazsia, J L; Serafy, J E

    2016-03-01

    An expansion is underway of a nuclear power plant on the shoreline of Biscayne Bay, Florida, USA. While the precise effects of its construction and operation are unknown, impacts on surrounding marine habitats and biota are considered by experts to be likely. The objective of the present study was to determine the adequacy of an ongoing monitoring survey of fish communities associated with mangrove habitats directly adjacent to the power plant to detect fish community changes, should they occur, at three spatial scales. Using seasonally resolved data recorded during 532 fish surveys over an 8-year period, power analyses were performed for four mangrove fish metrics: fish diversity, fish density, and the occurrence of two ecologically important fish species, gray snapper (Lutjanus griseus) and goldspotted killifish (Floridichthys carpio). Results indicated that the monitoring program at current sampling intensity allows for detection of <33% changes in fish density and diversity metrics in both the wet and the dry season in the two larger study areas. Sampling effort was found to be insufficient in either season to detect changes at this level (<33%) in species-specific occurrence metrics for the two fish species examined. The option of supplementing ongoing, biological monitoring programs for improved, focused change detection deserves consideration from both ecological and cost-benefit perspectives.
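The kind of detectable-change threshold reported here can be approximated with a two-sample z-test power formula. The sketch below is illustrative only (the study's actual analysis presumably accounted for survey design and seasonal structure); the coefficient of variation and survey counts are assumptions:

```python
import math

def power_two_sample(n_per_group, pct_change, cv, alpha=0.05):
    """Approximate power of a two-sided, two-sample z-test to detect a given
    fractional change in a mean, for data with coefficient of variation cv."""
    effect = pct_change / cv                 # standardized effect size
    se = math.sqrt(2.0 / n_per_group)        # SE of the standardized difference
    z_crit = 1.959963984540054               # two-sided alpha = 0.05
    z = effect / se
    phi = lambda x: 0.5 * (1 + math.erf(x / math.sqrt(2)))  # normal CDF
    return phi(z - z_crit) + phi(-z - z_crit)

# e.g. a 33% change, CV of 1.0 (plausible for count data), 60 surveys per period
print(round(power_two_sample(60, 0.33, 1.0), 2))
```

Such a calculation makes explicit why rarer, higher-variance metrics like species occurrence need far more surveys than density or diversity to reach the same detectable-change threshold.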

  12. Impact assessment of tornado against nuclear power plant

    International Nuclear Information System (INIS)

    The impact assessment of tornadoes against nuclear power plants conforms to the 'Assessment guide for tornado effect on nuclear power plants' stipulated by the Nuclear Regulation Authority. For this assessment, the important items are the setting of the maximum wind speed considered in design and the setting of a flying-object evaluation model, on the basis of observation results. The Japan Society of Maintenology summarized the verification results concerning the setting of the design tornado and the flying-object evaluation model, the contents of which are explained here. The following are explained: (1) validity of the setting of the design tornado in the Assessment Guide, (2) analysis of the synoptic field, (3) study of the regional characteristics of the tornado-occurrence environmental field by means of analysis of the synoptic field and gust-associated indices, and (4) setting of the design tornado based on the above (1)-(3). Next, for the flying-object evaluation model, the authors took up the Rankine vortex model and the Fujita model, and verified the reproducibility of the models using the features of each and the actual state of tornado damage. (A.O.)

  13. Statistical and regulatory considerations in assessments of interchangeability of biological drug products.

    Science.gov (United States)

    Tóthfalusi, Lászlo; Endrényi, László; Chow, Shein-Chung

    2014-05-01

    When the patent of a brand-name, marketed drug expires, new, generic products are usually offered. Small-molecule generic and originator drug products are expected to be chemically identical. Their pharmaceutical similarity can typically be assessed by simple regulatory criteria such as the expectation that the 90% confidence interval for the ratio of geometric means of some pharmacokinetic parameters be between 0.80 and 1.25. When such criteria are satisfied, the drug products are generally considered to exhibit therapeutic equivalence. They are then usually interchanged freely within individual patients. Biological drugs are complex proteins, owing, for instance, to their large size, intricate structure, sensitivity to environmental conditions, difficult manufacturing procedures, and the possibility of immunogenicity. Generic and brand-name biologic products can be expected to show only similarity, not identity, in their various features and clinical effects. Consequently, the determination of biosimilarity is also a complicated process which involves assessment of the totality of the evidence for the close similarity of the two products. Moreover, even when biosimilarity has been established, it may not be assumed that the two biosimilar products can be automatically substituted by pharmacists. This generally requires additional, careful considerations. Without declaring interchangeability, a new product could be prescribed, i.e. it is prescribable. However, two products can be automatically substituted only if they are interchangeable. Interchangeability is a statistical term and it means that products can be used in any order in the same patient without considering the treatment history. The concepts of interchangeability and prescribability have been widely discussed in the past but only in relation to small molecule generics. In this paper we apply these concepts to biosimilars and we discuss: definitions of prescribability and interchangeability and
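The small-molecule criterion cited here, a 90% CI for the ratio of geometric means falling within 0.80–1.25, is computed on the log scale; a sketch with hypothetical AUC values (the critical t value must match the design's degrees of freedom, and a real crossover analysis would use a within-subject variance instead of this simple two-group interval):

```python
import math

def geomean_ratio_ci(test, ref, t_crit):
    """CI for the ratio of geometric means of two pharmacokinetic samples:
    log-transform, build a t interval on the difference of log means,
    then exponentiate back to the ratio scale."""
    lt = [math.log(x) for x in test]
    lr = [math.log(x) for x in ref]
    mean = lambda v: sum(v) / len(v)
    var = lambda v: sum((x - mean(v)) ** 2 for x in v) / (len(v) - 1)
    diff = mean(lt) - mean(lr)
    se = math.sqrt(var(lt) / len(lt) + var(lr) / len(lr))
    return math.exp(diff - t_crit * se), math.exp(diff + t_crit * se)

# Hypothetical AUC values; t_crit ~1.734 for a 90% CI with df ~ 18
test = [95, 102, 88, 110, 99, 105, 92, 101, 97, 108]
ref  = [100, 98, 91, 115, 103, 99, 95, 104, 100, 110]
lo, hi = geomean_ratio_ci(test, ref, 1.734)
print(round(lo, 3), round(hi, 3))  # bioequivalent if 0.80 <= lo and hi <= 1.25
```

Working on the log scale is what turns the 0.80–1.25 bounds into a symmetric interval around zero, which is why the criterion is stated as a ratio rather than a difference.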

  14. Enhanced statistical tests for GWAS in admixed populations: assessment using African Americans from CARe and a Breast Cancer Consortium.

    Directory of Open Access Journals (Sweden)

    Bogdan Pasaniuc

    2011-04-01

    Full Text Available While genome-wide association studies (GWAS) have primarily examined populations of European ancestry, more recent studies often involve additional populations, including admixed populations such as African Americans and Latinos. In admixed populations, linkage disequilibrium (LD) exists both at a fine scale in ancestral populations and at a coarse scale (admixture-LD) due to chromosomal segments of distinct ancestry. Disease association statistics in admixed populations have previously considered SNP association (LD mapping) or admixture association (mapping by admixture-LD), but not both. Here, we introduce a new statistical framework for combining SNP and admixture association in case-control studies, as well as methods for local ancestry-aware imputation. We illustrate the gain in statistical power achieved by these methods by analyzing data of 6,209 unrelated African Americans from the CARe project genotyped on the Affymetrix 6.0 chip, in conjunction with both simulated and real phenotypes, as well as by analyzing the FGFR2 locus using breast cancer GWAS data from 5,761 African-American women. We show that, at typed SNPs, our method yields an 8% increase in statistical power for finding disease risk loci compared to the power achieved by standard methods in case-control studies. At imputed SNPs, we observe an 11% increase in statistical power for mapping disease loci when our local ancestry-aware imputation framework and the new scoring statistic are jointly employed. Finally, we show that our method increases statistical power in regions harboring the causal SNP in the case when the causal SNP is untyped and cannot be imputed. Our methods and our publicly available software are broadly applicable to GWAS in admixed populations.

  15. Preliminary environmental assessment for the satellite power system (SPS)

    Energy Technology Data Exchange (ETDEWEB)

    1978-10-01

    A preliminary assessment of the impact of the Satellite Power System (SPS) on the environment is presented. Information that has appeared in documents referenced herein is integrated and assimilated. The state-of-knowledge as perceived from recently completed DOE-sponsored studies is disclosed, and prospective research and study programs that can advance the state-of-knowledge and provide an expanded data base for use in an assessment planned for 1980 are defined. Alternatives for research that may be implemented in order to achieve this advancement are also discussed in order that a plan can be selected which will be consistent with the fiscal and time constraints on the SPS Environmental Assessment Program. Health and ecological effects of microwave radiation, nonmicrowave effects on health and the environment (terrestrial operations and space operations), effects on the atmosphere, and effects on communications systems are examined in detail. (WHK)

  16. A follow-up power analysis of the statistical tests used in the Journal of Research in Science Teaching

    Science.gov (United States)

    Woolley, Thomas W.; Dawson, George O.

It has been two decades since the first power analysis of a psychological journal and 10 years since the Journal of Research in Science Teaching made its contribution to this debate. One purpose of this article is to investigate what power-related changes, if any, have occurred in science education research over the past decade as a result of the earlier survey. In addition, previous recommendations are expanded and expounded upon within the context of more recent work in this area. The absence of any consistent mode of presenting statistical results, as well as little change with regard to power-related issues, is reported. Guidelines for reporting the minimal amount of information demanded for clear and independent evaluation of research results by readers are also proposed.

  17. Using Saliency-Weighted Disparity Statistics for Objective Visual Comfort Assessment of Stereoscopic Images

    Science.gov (United States)

    Zhang, Wenlan; Luo, Ting; Jiang, Gangyi; Jiang, Qiuping; Ying, Hongwei; Lu, Jing

    2016-06-01

Visual comfort assessment (VCA) for stereoscopic images is a particularly significant yet challenging task in the 3D quality of experience research field. Although the subjective assessment given by human observers is known as the most reliable way to evaluate the experienced visual discomfort, it is time-consuming and non-systematic. Therefore, it is of great importance to develop objective VCA approaches that can faithfully predict the degree of visual discomfort as human beings do. In this paper, a novel two-stage objective VCA framework is proposed. The main contribution of this study is that the important visual attention mechanism of the human visual system is incorporated for visual comfort-aware feature extraction. Specifically, in the first stage, we first construct an adaptive 3D visual saliency detection model to derive the saliency map of a stereoscopic image, and then a set of saliency-weighted disparity statistics are computed and combined to form a single feature vector to represent a stereoscopic image in terms of visual comfort. In the second stage, this high-dimensional feature vector is fused into a single visual comfort score by applying a random forest algorithm. Experimental results on two benchmark databases confirm the superior performance of the proposed approach.
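The first-stage feature extraction can be sketched as follows; this is a minimal illustration of saliency-weighted disparity statistics (weighted mean and standard deviation only), not the authors' full feature set or saliency model:

```python
def weighted_disparity_stats(disparity, saliency):
    """Saliency-weighted mean and standard deviation of a disparity map.

    `disparity` and `saliency` are flat lists of per-pixel values; the
    saliency map acts as a weight so visually attended regions dominate.
    """
    w_sum = sum(saliency)
    mean = sum(d * s for d, s in zip(disparity, saliency)) / w_sum
    var = sum(s * (d - mean) ** 2 for d, s in zip(disparity, saliency)) / w_sum
    return mean, var ** 0.5

# Hypothetical 2x2 disparity map: the salient pixel (weight 0.7) has a
# large disparity, pulling the weighted statistics toward it.
disp = [1.0, 1.0, 1.0, 9.0]
sal = [0.1, 0.1, 0.1, 0.7]
mean, std = weighted_disparity_stats(disp, sal)
print(round(mean, 2), round(std, 2))
```

Unweighted statistics would treat all pixels equally; the saliency weighting is what ties the comfort features to where observers actually look.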

  18. No-Reference Image Quality Assessment for ZY3 Imagery in Urban Areas Using Statistical Model

    Science.gov (United States)

    Zhang, Y.; Cui, W. H.; Yang, F.; Wu, Z. C.

    2016-06-01

More and more high-spatial-resolution satellite images are produced with the improvement of satellite technology. However, the quality of the images is not always satisfactory for application. Due to complicated atmospheric conditions and the complex radiation transmission process involved in imaging, the images often suffer deterioration. In order to assess the quality of remote sensing images over urban areas, we propose a general-purpose image quality assessment method based on feature extraction and machine learning. We use two types of features at multiple scales. One is derived from the shape of the histogram; the other from natural scene statistics based on the Generalized Gaussian Distribution (GGD). A 20-D feature vector for each scale is extracted and is assumed to capture the RS image quality degradation characteristics. We use an SVM to learn to predict image quality scores from these features. For evaluation, we constructed a medium-scale dataset for training and testing, with human subjects taking part to give their opinions of the degraded images. We use ZY3 satellite images over the Wuhan area (a city in China) to conduct experiments. Experimental results show that the predicted scores correlate with the subjective perceptions.
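A common way to obtain GGD-based natural scene statistics features, which feature vectors like the 20-D one above typically build on, is moment-matching estimation of the GGD shape parameter. The sketch below uses the standard ratio r(β) = Γ(1/β)Γ(3/β)/Γ(2/β)², inverted by grid search; it is an illustration of the technique, not the authors' exact feature code:

```python
import math
import random

def ggd_ratio(beta):
    """Theoretical E[x^2] / E[|x|]^2 for a zero-mean GGD with shape beta."""
    return math.gamma(1.0 / beta) * math.gamma(3.0 / beta) / math.gamma(2.0 / beta) ** 2

def estimate_ggd_shape(samples):
    """Moment-matching estimate of the GGD shape parameter (grid inversion)."""
    m_abs = sum(abs(x) for x in samples) / len(samples)
    m_sq = sum(x * x for x in samples) / len(samples)
    rho = m_sq / m_abs ** 2
    # r(beta) is monotone in beta; pick the grid point whose ratio is closest.
    betas = [0.2 + 0.001 * i for i in range(4801)]  # 0.2 .. 5.0
    return min(betas, key=lambda b: abs(ggd_ratio(b) - rho))

random.seed(7)
gaussian = [random.gauss(0.0, 1.0) for _ in range(20000)]
beta_hat = estimate_ggd_shape(gaussian)
print(round(beta_hat, 2))  # close to 2 for Gaussian data
```

The fitted shape (and scale) per subband or scale then become entries of the feature vector fed to the regressor.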

  19. Statistical Characterization of Solar Photovoltaic Power Variability at Small Timescales: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Shedd, S.; Hodge, B.-M.; Florita, A.; Orwig, K.

    2012-08-01

Integrating large amounts of variable and uncertain solar photovoltaic power into the electricity grid is a growing concern for power system operators in a number of different regions. Power system operators typically accommodate variability, whether from load, wind, or solar, by carrying reserves that can quickly change their output to match the changes in the solar resource. At timescales in the seconds-to-minutes range, this is known as regulation reserve. Previous studies have shown that increasing the geographic diversity of solar resources can reduce the short-term variability of the power output. As the price of solar has decreased, the emergence of very large PV plants (greater than 10 MW) has become more common. These plants present an interesting case because they are large enough to exhibit some spatial smoothing by themselves. This work examines the variability of solar PV output among different arrays in a large (~50 MW) PV plant in the western United States, including the correlation in power output changes between different arrays, as well as the aggregated plant output, at timescales ranging from one second to five minutes.
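The variability and correlation analysis described above can be sketched by computing step changes (ramps) at different timescales and a Pearson correlation between arrays; the series below are hypothetical one-second samples, not data from the ~50 MW plant:

```python
def ramps(series, step=1):
    """Power changes (ramps) over a given timescale of `step` samples."""
    return [series[i + step] - series[i] for i in range(len(series) - step)]

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

# Hypothetical 1-second power output (MW) of two arrays in one plant:
# array B sees the same cloud passage as A, slightly damped.
a = [10, 11, 13, 12, 9, 7, 8, 10, 12, 11]
b = [10, 10.5, 12, 11.5, 9.5, 8, 8.5, 10, 11.5, 11]
r1 = pearson(ramps(a, 1), ramps(b, 1))  # one-second ramp correlation
r5 = pearson(ramps(a, 5), ramps(b, 5))  # five-second ramp correlation
print(r1 > 0, r5 > 0)
```

Low ramp correlation between arrays at short timescales is what produces the spatial smoothing of the aggregate plant output.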

  20. Of Disasters and Dragon Kings: A Statistical Analysis of Nuclear Power Incidents & Accidents

    OpenAIRE

    Wheatley, Spencer; Sovacool, Benjamin; Sornette, Didier

    2015-01-01

    We provide, and perform a risk theoretic statistical analysis of, a dataset that is 75 percent larger than the previous best dataset on nuclear incidents and accidents, comparing three measures of severity: INES (International Nuclear Event Scale), radiation released, and damage dollar losses. The annual rate of nuclear accidents, with size above 20 Million US$, per plant, decreased from the 1950s until dropping significantly after Chernobyl (April, 1986). The rate is now roughly stable at 0....

  1. Air-chemistry "turbulence": power-law scaling and statistical regularity

    OpenAIRE

    Hsu, H.-m.; Lin, C.-Y.; Guenther, A.; J. J. Tribbia; Liu, S. C.

    2011-01-01

    With the intent to gain further knowledge on the spectral structures and statistical regularities of surface atmospheric chemistry, the chemical gases (NO, NO2, NOx, CO, SO2, and O3) and aerosol (PM10) measured at 74 air quality monitoring stations over the island of Taiwan are analyzed for the year of 2004 at hourly resolution. They represent a range of surface air quality with ...

  2. Assessing Regional Scale Variability in Extreme Value Statistics Under Altered Climate Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Brunsell, Nathaniel [University of Kansas; Mechem, David [University of Kansas; Ma, Chunsheng [Wichita State University

    2015-02-20

Recent studies have suggested that low-frequency modes of climate variability can significantly influence regional climate. The climatology associated with extreme events has been shown to be particularly sensitive. This has profound implications for droughts, heat waves, and food production. We propose to examine regional climate simulations conducted over the continental United States by applying a recently developed technique which combines wavelet multi-resolution analysis with information theory metrics. This research is motivated by two fundamental questions concerning the spatial and temporal structure of extreme events. These questions are 1) what temporal scales of the extreme value distributions are most sensitive to alteration by low-frequency climate forcings and 2) what is the nature of the spatial structure of variation in these timescales? The primary objective is to assess to what extent information theory metrics can be useful in characterizing the nature of extreme weather phenomena. Specifically, we hypothesize that (1) changes in the nature of extreme events will impact the temporal probability density functions and that information theory metrics will be sensitive to these changes and (2) via a wavelet multi-resolution analysis, we will be able to characterize the relative contribution of different timescales to the stochastic nature of extreme events. In order to address these hypotheses, we propose a unique combination of an established regional climate modeling approach and advanced statistical techniques to assess the effects of low-frequency modes on climate extremes over North America. The behavior of climate extremes in RCM simulations for the 20th century will be compared with statistics calculated from the United States Historical Climatology Network (USHCN) and simulations from the North American Regional Climate Change Assessment Program (NARCCAP). This effort will serve to establish the baseline behavior of climate extremes, the
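A minimal sketch of combining wavelet multi-resolution analysis with an information theory metric, assuming a Haar wavelet and Shannon entropy of per-scale detail energies (the study's actual wavelet basis and metrics may differ):

```python
import math

def haar_step(signal):
    """One level of the Haar transform: approximation and detail coefficients."""
    half = len(signal) // 2
    approx = [(signal[2 * i] + signal[2 * i + 1]) / math.sqrt(2) for i in range(half)]
    detail = [(signal[2 * i] - signal[2 * i + 1]) / math.sqrt(2) for i in range(half)]
    return approx, detail

def scale_entropies(signal, levels):
    """Shannon entropy of the normalized detail-energy distribution per scale."""
    out = []
    for _ in range(levels):
        signal, detail = haar_step(signal)
        energy = [d * d for d in detail]
        total = sum(energy) or 1.0
        probs = [e / total for e in energy if e > 0]
        out.append(sum(-p * math.log2(p) for p in probs))
    return out

# Hypothetical daily temperature anomaly series with one abrupt extreme event.
series = [0.0] * 28 + [3.0, 4.0, 3.5, 0.5]
ents = scale_entropies(series, 3)
print([round(e, 2) for e in ents])
```

Comparing how these per-scale entropies shift between climate states is one way to quantify which timescales of the extreme-value structure respond to low-frequency forcing.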

  3. The power of 41%: A glimpse into the life of a statistic.

    Science.gov (United States)

    Tanis, Justin

    2016-01-01

    "Forty-one percent?" the man said with anguish on his face as he addressed the author, clutching my handout. "We're talking about my granddaughter here." He was referring to the finding from the National Transgender Discrimination Survey (NTDS) that 41% of 6,450 respondents said they had attempted suicide at some point in their lives. The author had passed out the executive summary of the survey's findings during a panel discussion at a family conference to illustrate the critical importance of acceptance of transgender people. During the question and answer period, this gentleman rose to talk about his beloved 8-year-old granddaughter who was in the process of transitioning socially from male to female in her elementary school. The statistics that the author was citing were not just numbers to him; and he wanted strategies-effective ones-to keep his granddaughter alive and thriving. The author has observed that the statistic about suicide attempts has, in essence, developed a life of its own. It has had several key audiences-academics and researchers, public policymakers, and members of the community, particularly transgender people and our families. This article explores some of the key takeaways from the survey and the ways in which the 41% statistic has affected conversations about the injustices transgender people face and the importance of family and societal acceptance. (PsycINFO Database Record PMID:27380151

4. A PowerPoint®-based guide to assist in choosing the suitable statistical test

    Directory of Open Access Journals (Sweden)

    David Normando

    2010-02-01

Full Text Available Selecting appropriate methods for statistical analysis may seem complex, especially for graduate students and researchers in the early phases of the scientific career. On the other hand, a PowerPoint presentation is a familiar tool to students and researchers, so a Biostatistics tutorial developed as a PowerPoint presentation could narrow the gap between orthodontists and Biostatistics. This guide provides objective and useful information about several statistical methods, using examples related to dentistry and, more specifically, to orthodontics. The tutorial should be used mainly to help the user find answers to common questions, such as the most appropriate test to compare groups, to examine correlations and regressions, or to analyze the method error. Help is also available for checking the distribution of the data (normal or non-normal) and for choosing the most suitable chart for presenting the results. The guide may also be quite useful for journal reviewers to quickly assess the adequacy of the statistical method presented in a submitted manuscript.
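The kind of decision logic such a guide encodes can be sketched as a small lookup function; the table below is a simplified, generic version of common test-selection rules, not a reproduction of the tutorial's slides:

```python
def suggest_test(outcome, groups, paired=False, normal=True):
    """Very simplified decision table for comparing groups; real choices
    also depend on sample size, variances, and study design."""
    if outcome == "categorical":
        return "chi-square (or Fisher's exact for small counts)"
    if groups == 2:
        if paired:
            return "paired t-test" if normal else "Wilcoxon signed-rank"
        return "unpaired t-test" if normal else "Mann-Whitney U"
    if paired:
        return "repeated-measures ANOVA" if normal else "Friedman"
    return "one-way ANOVA" if normal else "Kruskal-Wallis"

print(suggest_test("continuous", 2, paired=False, normal=False))  # Mann-Whitney U
print(suggest_test("continuous", 3))                              # one-way ANOVA
```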

  5. Online Sensor Calibration Assessment in Nuclear Power Systems

    International Nuclear Information System (INIS)

Safe, efficient, and economic operation of nuclear systems (nuclear power plants, fuel fabrication and storage, used fuel processing, etc.) relies on transmission of accurate and reliable measurements. During operation, sensors degrade due to age, environmental exposure, and maintenance interventions. Sensor degradation can affect the measured and transmitted signals in various ways, including sensor failure, signal drift, and changes in sensor response time. Currently, periodic sensor recalibration is performed to avoid these problems. Sensor recalibration activities include both calibration assessment and adjustment (if necessary). In nuclear power plants, periodic recalibration of safety-related sensors is required by the plant technical specifications. Recalibration typically occurs during refueling outages (about every 18 to 24 months). Non-safety-related sensors also undergo recalibration, though not as frequently. However, this approach to maintaining sensor calibration and performance is time-consuming and expensive, leading to unnecessary maintenance, increased radiation exposure to maintenance personnel, and potential damage to sensors. Online monitoring (OLM) of sensor performance is a non-invasive approach to assess instrument calibration. OLM can mitigate many of the limitations of the current periodic recalibration practice by providing more frequent assessment of calibration and identifying those sensors that are operating outside of calibration tolerance limits without removing sensors or interrupting operation. This can support extended operating intervals for unfaulted sensors and target recalibration efforts to only degraded sensors

  6. Selection, competency development and assessment of nuclear power plant managers

    International Nuclear Information System (INIS)

This publication provides information on proven methods and good practices with respect to the selection, development and assessment of nuclear power plant (NPP) managers. The report is organized into four sections, a glossary, two appendices, and several annexes. The Introduction (Section 1) provides the framework for the report. Section 2 describes how appropriate management competencies can be used for the selection, development and assessment of NPP managers, including: -Selection which includes recruitment, promotion and succession management. -Management development programmes including formal training, job rotation, on the job training, mentoring, and outside assignments. -Assessment of individual performance. Section 3 describes a systematic process for identifying the competencies needed by NPP managers. This section culminates in a set of suggested core competencies for NPP managers which are further expanded in Appendix A. The annexes included provide specific examples of competency-based management selection, development, and assessment programmes in several Member States. -Annex A is one method to organize and display competencies. -Annex B is an example of using competencies for selection of first line managers. -Annex C is an example of using management competencies for succession management. -Annexes D-H are examples of management development programmes. -Annexes I and J are examples of management assessment programmes. A glossary of terms is provided at the end of the report to explain the use of some key terms

  7. Market assessment of photovoltaic power systems for agricultural applications worldwide

    Science.gov (United States)

    Cabraal, A.; Delasanta, D.; Rosen, J.; Nolfi, J.; Ulmer, R.

    1981-11-01

Agricultural sector PV market assessments conducted in the Philippines, Nigeria, Mexico, Morocco, and Colombia are extrapolated worldwide. The types of applications evaluated are those requiring less than 15 kW of power and operating in a stand-alone mode. The major conclusions were as follows: PV will be competitive in applications requiring 2 to 3 kW of power prior to 1983; by 1986 PV system competitiveness will extend to applications requiring 4 to 6 kW of power; due to capital constraints, the private sector market may be restricted to applications requiring less than about 2 kW of power; the ultimate purchasers of larger systems will be governments, either through direct purchase or loans from development banks. Though fragmented, a significant agriculture sector market for PV exists; however, the market for PV in telecommunications, signalling, rural services, and TV will be larger. Major market-related factors influencing the potential for U.S. PV sales are: lack of awareness; high first costs; shortage of long-term capital; competition from German, French and Japanese companies who have government support; and low fuel prices in capital-surplus countries. Strategies that may aid in overcoming some of these problems are: setting up a trade association aimed at overcoming problems due to lack of awareness, innovative financing schemes such as lease arrangements, and designing products to match current user needs as opposed to attempting to change consumer behavior.

  8. Assessment of environmental external effects in power generation

    International Nuclear Information System (INIS)

    This report summarises some of the results achieved in a project carried out in Denmark in 1994 concerning externalities. The main objective was to identify, quantify and - if possible - monetize the external effects in the production of energy, especially in relation to renewable technologies. The report compares environmental externalities in the production of energy using renewable and non-renewable energy sources, respectively. The comparison is demonstrated on two specific case studies. The first case is the production of electricity based on wind power plants compared to the production of electricity based on a coal-fired conventional plant. In the second case heat/power generation by means of a combined heat and power plant based on biomass-generated gas is compared to that of a combined heat and power plant fuelled by natural gas. In the report the individual externalities from the different ways of producing energy are identified, the stress caused by the effect is assessed, and finally the monetary value of the damage is estimated. The method is applied to the local as well as the regional and global externalities. (au) 8 tabs., 7 ills., 4 refs

  9. Quadrennial Technology Review 2015: Technology Assessments--Wind Power

    Energy Technology Data Exchange (ETDEWEB)

    none,

    2015-10-07

Wind power has become a mainstream power source in the U.S. electricity portfolio, supplying 4.9% of the nation’s electricity demand in 2014. With more than 65 GW installed across 39 states at the end of 2014, utility-scale wind power is a cost-effective source of low-emissions power generation throughout much of the nation. The United States has significant sustainable land-based and offshore wind resource potential, greater than 10 times current total U.S. electricity consumption. A technical wind resource assessment conducted by the Department of Energy (DOE) in 2009 estimated that the land-based wind energy potential for the contiguous United States is equivalent to 10,500 GW of capacity at an 80-meter (m) hub height and 12,000 GW of capacity at a 100 m hub height, assuming a capacity factor of at least 30%. A subsequent 2010 DOE report estimated the technical offshore wind energy potential to be 4,150 GW. The estimate was calculated from the total offshore area within 50 nautical miles of shore in areas where average annual wind speeds are at least 7 m per second at a hub height of 90 m.
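A rough arithmetic check of the resource figures (assuming the stated 30% capacity factor and a round figure of about 4,000 TWh/yr for U.S. electricity consumption, which is an assumption here): the 80 m land-based estimate alone corresponds to roughly 7 times consumption, and adding the 100 m and offshore estimates pushes the combined resource past 10 times.

```python
# Back-of-envelope: annual energy from 10,500 GW of land-based capacity
# at a 30% capacity factor, compared with ~4,000 TWh/yr of consumption.
capacity_gw = 10_500
capacity_factor = 0.30
hours_per_year = 8_760

annual_twh = capacity_gw * capacity_factor * hours_per_year / 1_000  # GWh -> TWh
ratio = annual_twh / 4_000
print(round(annual_twh), round(ratio, 1))
```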

  10. OVERVIEW OF ENVIRONMENTAL ASSESSMENT FOR CHINA NUCLEAR POWER INDUSTRY AND COAL-FIRED POWER INDUSTRY

    Institute of Scientific and Technical Information of China (English)

张少华; 潘自强; et al.

    1994-01-01

A quantitative environmental assessment method and the corresponding computer code are introduced in this paper. By the consideration of all fuel cycle steps, it gives that the public health risk of the China nuclear power industry is 5.2 × 10⁻¹ man/(GW·a), the occupational health risk is 2.5 man/(GW·a), and the total health risk is 3.0 man/(GW·a). After the health risk calculation for coal mining, transport, burning up and ash disposal, it gives that the public health risk of the China coal-fired power industry is 3.6 man/(GW·a), the occupational health risk is 50 man/(GW·a), and the total is 54 man/(GW·a). Accordingly, the conclusion that the China nuclear power industry is an industry with high safety and cleanness is derived at the end.

  11. Overview of environmental assessment for China nuclear power industry and coal-fired power industry

    International Nuclear Information System (INIS)

A quantitative environmental assessment method and the corresponding computer code are introduced. By the consideration of all fuel cycle steps, it gives that the public health risk of the China nuclear power industry is 5.2 × 10⁻¹ man/(GW·a), the occupational health risk is 2.5 man/(GW·a), and the total health risk is 3.0 man/(GW·a). After the health risk calculation for coal mining, transport, burning up and ash disposal, it gives that the public health risk of the China coal-fired power industry is 3.6 man/(GW·a), the occupational health risk is 50 man/(GW·a), and the total is 54 man/(GW·a). Accordingly, the conclusion that the China nuclear power industry is one with high safety and cleanness is derived at the end

  12. Planck 2013 results. XXI. All-sky Compton parameter power spectrum and high-order statistics

    CERN Document Server

    Ade, P.A.R.; Armitage-Caplan, C.; Arnaud, M.; Ashdown, M.; Atrio-Barandela, F.; Aumont, J.; Baccigalupi, C.; Banday, A.J.; Barreiro, R.B.; Bartlett, J.G.; Battaner, E.; Benabed, K.; Benoit, A.; Benoit-Levy, A.; Bernard, J.P.; Bersanelli, M.; Bielewicz, P.; Bobin, J.; Bock, J.J.; Bonaldi, A.; Bond, J.R.; Borrill, J.; Bouchet, F.R.; Bridges, M.; Bucher, M.; Burigana, C.; Butler, R.C.; Cardoso, J.F.; Carvalho, P.; Catalano, A.; Challinor, A.; Chamballu, A.; Chiang, L.Y.; Chiang, H.C.; Christensen, P.R.; Church, S.; Clements, D.L.; Colombi, S.; Colombo, L.P.L.; Comis, B.; Couchot, F.; Coulais, A.; Crill, B.P.; Curto, A.; Cuttaia, F.; Da Silva, A.; Danese, L.; Davies, R.D.; Davis, R.J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Delouis, J.M.; Desert, F.X.; Dickinson, C.; Diego, J.M.; Dolag, K.; Dole, H.; Donzelli, S.; Dore, O.; Douspis, M.; Dupac, X.; Efstathiou, G.; Ensslin, T.A.; Eriksen, H.K.; Finelli, F.; Flores-Cacho, I.; Forni, O.; Frailis, M.; Franceschi, E.; Galeotta, S.; Ganga, K.; Genova-Santos, R.T.; Giard, M.; Giardino, G.; Giraud-Heraud, Y.; Gonzalez-Nuevo, J.; Gorski, K.M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Hansen, F.K.; Hanson, D.; Harrison, D.; Henrot-Versille, S.; Hernandez-Monteagudo, C.; Herranz, D.; Hildebrandt, S.R.; Hivon, E.; Hobson, M.; Holmes, W.A.; Hornstrup, A.; Hovest, W.; Huffenberger, K.M.; Hurier, G.; Jaffe, T.R.; Jaffe, A.H.; Jones, W.C.; Juvela, M.; Keihanen, E.; Keskitalo, R.; Kisner, T.S.; Kneissl, R.; Knoche, J.; Knox, L.; Kunz, M.; Kurki-Suonio, H.; Lacasa, F.; Lagache, G.; Lahteenmaki, A.; Lamarre, J.M.; Lasenby, A.; Laureijs, R.J.; Lawrence, C.R.; Leahy, J.P.; Leonardi, R.; Leon-Tavares, J.; Lesgourgues, J.; Liguori, M.; Lilje, P.B.; Linden-Vornle, M.; Lopez-Caniego, M.; Lubin, P.M.; Macias-Perez, J.F.; Maffei, B.; Maino, D.; Mandolesi, N.; Marcos-Caballero, A.; Maris, M.; Marshall, D.J.; Martin, P.G.; Martinez-Gonzalez, E.; Masi, S.; Matarrese, S.; Matthai, F.; Mazzotta, P.; Melchiorri, A.; Melin, 
J.B.; Mendes, L.; Mennella, A.; Migliaccio, M.; Mitra, S.; Miville-Deschenes, M.A.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C.B.; Norgaard-Nielsen, H.U.; Noviello, F.; Novikov, D.; Novikov, I.; Osborne, S.; Oxborrow, C.A.; Paci, F.; Pagano, L.; Pajot, F.; Paoletti, D.; Partridge, B.; Pasian, F.; Patanchon, G.; Perdereau, O.; Perotto, L.; Perrotta, F.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Popa, L.; Poutanen, T.; Pratt, G.W.; Prezeau, G.; Prunet, S.; Puget, J.L.; Rachen, J.P.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Ricciardi, S.; Riller, T.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Rossetti, M.; Roudier, G.; Rubino-Martin, J.A.; Rusholme, B.; Sandri, M.; Santos, D.; Savini, G.; Scott, D.; Seiffert, M.D.; Shellard, E.P.S.; Spencer, L.D.; Starck, J.L.; Stolyarov, V.; Stompor, R.; Sudiwala, R.; Sunyaev, R.; Sureau, F.; Sutton, D.; Suur-Uski, A.S.; Sygnet, J.F.; Tauber, J.A.; Tavagnacco, D.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Tuovinen, J.; Umana, G.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Varis, J.; Vielva, P.; Villa, F.; Vittorio, N.; Wade, L.A.; Wandelt, B.D.; White, S.D.M.; Yvon, D.; Zacchei, A.; Zonca, A.

    2014-01-01

We have constructed the first all-sky map of the thermal Sunyaev-Zeldovich (tSZ) effect by applying specifically tailored component separation algorithms to the 100 to 857 GHz frequency channel maps from the Planck survey. These maps show an obvious galaxy cluster tSZ signal that is well matched with blindly detected clusters in the Planck SZ catalogue. To characterize the signal in the tSZ map we have computed its angular power spectrum. At large angular scales ($\ell < 60$) the map is dominated by diffuse Galactic thermal dust emission, while at small angular scales ($\ell > 500$) the clustered Cosmic Infrared Background (CIB) and residual point sources are the major contaminants. These foregrounds are carefully modelled and subtracted. We measure the tSZ power spectrum in angular scales, $0.17^{\circ} \lesssim \theta \lesssim 3.0^{\circ}$, that were previously unexplored. The measured tSZ power spectrum is consistent with that expected from the Planck catalogue of SZ sources, with additional clear evidence of signal from unresolved clusters and, potentially, diffuse warm baryons. We use the tSZ power spectrum to ...

  13. Statistical Analysis of Power Production from OWC Type Wave Energy Converters

    DEFF Research Database (Denmark)

    Martinelli, L.; Zanuttigh, B.; Kofoed, Jens Peter

    2009-01-01

    , into a unidirectional flow, making the use of more efficient air turbines possible. Hereby, a more steady flow is also obtained. The general objective of this note is to examine, the power take off (PTO) efficiency under irregular wave conditions, for WECs with flow redirection. Final practical aim is to identify...

  14. Statistical Assessment of Proton Treatment Plans Under Setup and Range Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Park, Peter C.; Cheung, Joey P.; Zhu, X. Ronald [Department of Radiation Physics, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Lee, Andrew K. [Department of Radiation Oncology, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Sahoo, Narayan [Department of Radiation Physics, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Tucker, Susan L. [Department of Bioinformatics and Computational Biology, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Liu, Wei; Li, Heng; Mohan, Radhe; Court, Laurence E. [Department of Radiation Physics, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Dong, Lei, E-mail: dong.lei@scrippshealth.org [Scripps Proton Therapy Center, San Diego, California (United States)

    2013-08-01

Purpose: To evaluate a method for quantifying the effect of setup errors and range uncertainties on dose distribution and dose–volume histogram using statistical parameters, and to assess existing planning practice in selected treatment sites under setup and range uncertainties. Methods and Materials: Twenty passively scattered proton lung cancer plans, 10 prostate plans, and 1 brain cancer scanning-beam proton plan were analyzed. To account for the dose under uncertainties, we performed a comprehensive simulation in which the dose was recalculated 600 times per given plan under the influence of random and systematic setup errors and proton range errors. On the basis of the simulation results, we determined the probability of dose variations and calculated the expected values and standard deviations of dose–volume histograms. The uncertainties in dose were spatially visualized on the planning CT as a probability map of failure to achieve target coverage or of overdose of critical structures. Results: The expected value of target coverage under the uncertainties was consistently lower than the nominal value determined from the clinical target volume coverage without setup error or range uncertainty, with a mean difference of −1.1% (−0.9% for breath-hold), −0.3%, and −2.2% for the lung, prostate, and brain cases, respectively. The organs whose dose was most sensitive to uncertainties were the esophagus and spinal cord for lung, the rectum for prostate, and the brain stem for brain cancer. Conclusions: A clinically feasible robustness plan analysis tool based on direct dose calculation and statistical simulation has been developed. Both the expected value and standard deviation are useful to evaluate the impact of uncertainties. The existing proton beam planning method used in this institution seems to be adequate in terms of target coverage. However, structures that are small in volume or located near the target area showed greater sensitivity to uncertainties.
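The simulation approach (recalculating the dose many times under sampled errors, then taking the expectation and standard deviation of a plan metric) can be sketched in one dimension. The dose profile, target geometry, and 0.5 cm setup error below are hypothetical, and a real analysis would recompute full 3-D dose per sample:

```python
import random

def target_coverage(shift, target_points, field_half_width=5.0):
    """Percent of target points inside a rigidly shifted 100% dose region."""
    covered = sum(1 for x in target_points if abs(x - shift) <= field_half_width)
    return 100.0 * covered / len(target_points)

random.seed(1)
target = [i * 0.1 for i in range(-45, 46)]  # toy 1-D CTV, -4.5 .. 4.5 cm
nominal = target_coverage(0.0, target)      # coverage with no setup error

# 600 recalculations per plan, as in the paper, here with a hypothetical
# 0.5 cm (1 SD) random setup error and no range error.
sims = [target_coverage(random.gauss(0.0, 0.5), target) for _ in range(600)]
expected = sum(sims) / len(sims)
sd = (sum((s - expected) ** 2 for s in sims) / len(sims)) ** 0.5
print(nominal, round(expected, 1), round(sd, 1))
```

The gap between the nominal and expected coverage, together with the standard deviation, is exactly the kind of robustness summary the paper reports per structure.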

  15. ASSESSMENT OF OIL PALM PLANTATION AND TROPICAL PEAT SWAMP FOREST WATER QUALITY BY MULTIVARIATE STATISTICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Seca Gandaseca

    2014-01-01

Full Text Available This study reports the spatio-temporal changes in river and canal water quality of peat swamp forest and oil palm plantation sites of Sarawak, Malaysia. To investigate temporal changes, 192 water samples were collected at four stations of BatangIgan, an oil palm plantation site of Sarawak, during July-November in 2009 and April-July in 2010. Nine water quality parameters, including Electrical Conductivity (EC), pH, Turbidity (TER), Dissolved Oxygen (DO), Temperature (TEMP), Chemical Oxygen Demand (COD), five-day Biochemical Oxygen Demand (BOD5), ammonia-Nitrogen (NH3-N) and Total Suspended Solids (TSS), were analysed. To investigate spatial changes, 432 water samples were collected from six different sites including BatangIgan during June-August 2010. Six water quality parameters, including pH, DO, COD, BOD5, NH3-N and TSS, were analysed to see the spatial variations. The most significant parameters contributing to the spatio-temporal variations were assessed by statistical techniques such as Hierarchical Agglomerative Cluster Analysis (HACA), Factor Analysis/Principal Components Analysis (FA/PCA) and Discriminant Function Analysis (DFA). HACA identified three different classes of sites, Relatively Unimpaired, Impaired and Less Impaired Regions, on the basis of similarity among different physicochemical characteristics and pollutant levels between the sampling sites. DFA produced the best results for identification of the main variables for temporal analysis, separating the parameters (EC, TER, COD), and identified three parameters for spatial analysis (pH, NH3-N and BOD5). The results signify that the parameters identified by statistical analyses were responsible for the water quality change and suggest that agricultural and oil palm plantation activities are a possible source of pollutants. The results suggest a dire need for proper watershed management measures to restore the water quality of this tributary for a
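The clustering step that groups sampling sites by the similarity of their physicochemical profiles can be illustrated in miniature: standardize each parameter across sites, then rank site pairs by Euclidean distance. This is a toy version of the distance computation underlying HACA, with hypothetical values, not the study's data:

```python
def zscore(columns):
    """Standardize each water-quality parameter across sites."""
    out = []
    for col in columns:
        n = len(col)
        m = sum(col) / n
        sd = (sum((v - m) ** 2 for v in col) / n) ** 0.5 or 1.0
        out.append([(v - m) / sd for v in col])
    return out

def distance(sites, i, j):
    """Euclidean distance between two standardized site profiles."""
    return sum((a - b) ** 2 for a, b in zip(sites[i], sites[j])) ** 0.5

# Hypothetical per-site means of (pH, DO mg/L, BOD5 mg/L) for four sites:
# two forested (0, 1) and two downstream of plantations (2, 3).
raw = [[6.8, 7.2, 1.5], [6.9, 7.0, 1.8], [5.9, 3.1, 6.5], [6.0, 2.8, 7.0]]
cols = zscore([[row[k] for row in raw] for k in range(3)])
sites = [[cols[k][i] for k in range(3)] for i in range(4)]
pairs = sorted((distance(sites, i, j), i, j)
               for i in range(4) for j in range(i + 1, 4))
print(pairs[0][1:], pairs[1][1:])  # the two most similar site pairs
```

Agglomerative clustering repeatedly merges the closest pair, which is how HACA arrives at groupings such as the study's three impairment classes.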

  16. Security assessment for intentional island operation in modern power system

    DEFF Research Database (Denmark)

    Chen, Yu; Xu, Zhao; Østergaard, Jacob

    2011-01-01

    There has been a high penetration level of Distributed Generations (DGs) in distribution systems in Denmark. Even more DGs are expected to be installed in the coming years. With that, to utilize them in maintaining the security of power supply is of great concern for Danish utilities. During the emergency in the power system, some distribution networks may be intentionally separated from the main grid to avoid complete system collapse. If DGs in those networks could continuously run instead of immediately being shut down, the blackout could be avoided and the reliability of supply could... The operator can clearly know if it is suitable to conduct island operation at one specific moment. Besides, in order to improve the computation efficiency, the Artificial Neural Network (ANN) is applied for fast ISR formation. Thus, online application of ISR based islanding security assessment could...

  17. Wide Area Measurement Based Security Assessment & Monitoring of Modern Power System: A Danish Power System Case Study

    DEFF Research Database (Denmark)

    Rather, Zakir Hussain; Chen, Zhe; Thøgersen, Paul

    2013-01-01

    Power System security has become a major concern across the global power system community. This paper presents wide area measurement system (WAMS) based security assessment and monitoring of modern power system. A new three dimensional security index (TDSI) has been proposed for online security monitoring of modern power system with large scale renewable energy penetration. Phasor measurement unit (PMU) based WAMS has been implemented in western Danish Power System to realize online security monitoring and assessment in power system control center. The proposed security monitoring system has been...

  18. POWER LOSSES ASSESSMENT IN TRANSFORMERS AFTER THE NORMATIVE OPERATING PERIOD

    Directory of Open Access Journals (Sweden)

    M. I. Fursanov

    2015-01-01

    Full Text Available The load and no-load power losses are the key parameters characterizing the operating effectiveness of distribution-network customers' transformers. Precise determination of these values facilitates a substantiated choice of optimizing procedures. The relevance of this topic increases owing to the fact that the modern electric grid utilizes many oil transformers whose time in commission considerably exceeds the statutory 25 years. Under conditions of continued operation, measuring the power losses according to the functioning guidelines is not always possible. The authors present an improved power-losses assessment technique based on the currently accepted thermal model of the oil transformer. They indicate the deficiencies of the existing technique and substantiate some changes in the practical application of the mathematical model. The article emphasizes the peculiarities of temperature changes in the oil transformer and offers a prototype device of open architecture for realizing the improved power-losses measurement technique. The paper describes the device's design features and functionality and presents its schematic diagram. The authors note the potential of, in addition to assessing the power losses, transmitting the obtained information to the dispatcher via GSM connection to simplify transformer status monitoring, as well as the capability of integrating the device into the transformer's thermal protection system. The practical merit and application scope of the obtained results lie in the development and choice of optimizing measures in distribution electrical grids, e.g. transformer replacement.

  19. Assessment of control rooms of nuclear power plants

    International Nuclear Information System (INIS)

    To identify and correct the lacks in control rooms of operating power plants and plants under construction an extensive program has been started in the USA. In Finland as in other countries using nuclear power, the development in the USA particularly with regard to the requirements imposed on nuclear power plants is carefully followed. The changes in these requirements are sooner or later also reflected in the guidelines given by the Finnish authorities. It is therefore important to be able to form a notion of how the new requirements apply to Finnish conditions. Especially it is important to review the latest assessment guidelines for control room implementation (NUREG-0700). Thus we can avoid possible over hasty conclusions. The aim of the analysis of the method and experiments presented in NUREG 0700 report was to create a basis for assessment of the suitability of the method for Finnish control room implementation. The task group has made a general methodical analysis of the method, and partly tried it in assessment of the TVO2 control room. It is obvious that direct conclusions from the American situation are misleading. It can be considered unfeasible to follow the American requirements as such, because they can lead to unwanted results. If the review is limited to control room details, the NRC program (checklist) can be considered successful. It can also be used during planning to observation of small discrepancies. However, we can question the applicability of some requirements. It is, though, more essential that the control room entity has neither in this nor in several other programs been reached or standardized. In spite of the difficulties we should try to reach this most important goal. (author)

  20. On-line Dynamic Security Assessment in Power Systems

    DEFF Research Database (Denmark)

    Weckesser, Johannes Tilman Gabriel

    to identify the critical group of generators. In order to determine a system to be transient secure, it is not sufficient to solely assess if all synchronous generators remain in synchronism, it is also required that the bus voltages remain within acceptable limits. A transient disturbance and the following......-depth study of the mechanism causing the voltage sags. The first sensitivity type is called load voltage sensitivity and allows identifying which bus voltages are affected by a change in rotor angle of a particular generator. The second proposed type is called generator power sensitivity, which provides...

  1. Assessment of Problem-Based Learning in the Undergraduate Statistics Course

    Science.gov (United States)

    Karpiak, Christie P.

    2011-01-01

    Undergraduate psychology majors (N = 51) at a mid-sized private university took a statistics examination on the first day of the research methods course, a course for which a grade of "C" or higher in statistics is a prerequisite. Students who had taken a problem-based learning (PBL) section of the statistics course (n = 15) were compared to those…

  2. Power plant system assessment. Final report. SP-100 Program

    International Nuclear Information System (INIS)

    The purpose of this assessment was to provide system-level insights into 100-kWe-class space reactor electric systems. Using these insights, Rockwell was to select and perform conceptual design studies on a ''most attractive'' system that met the preliminary design goals and requirements of the SP-100 Program. About 4 of the 6 months were used in the selection process. The remaining 2 months were used for the system conceptual design studies. Rockwell completed these studies at the end of FY 1983. This report summarizes the results of the power plant system assessment and describes our choice for the most attractive system - the Rockwell SR-100G System (Space Reactor, 100 kWe, Growth) - a lithium-cooled UN-fueled fast reactor/Brayton turboelectric converter system

  3. A subsampling approach to estimating the distribution of diverging statistics with application to assessing financial market risks

    OpenAIRE

    Bertail, Patrice; Haefke, Christian; Politis, Dimitris N.; White, Halbert

    2001-01-01

    In this paper we propose a subsampling estimator for the distribution of statistics diverging at either known or unknown rates when the underlying time series is strictly stationary and strong mixing. Based on our results we provide a detailed discussion of how to estimate extreme order statistics with dependent data and present two applications to assessing financial market risk. Our method performs well in estimating Value at Risk and provides a superior alternative to Hill's estimator ...

  4. Assessing the Kansas water-level monitoring program: An example of the application of classical statistics to a geological problem

    Science.gov (United States)

    Davis, J.C.

    2000-01-01

    Geologists may feel that geological data are not amenable to statistical analysis, or at best require specialized approaches such as nonparametric statistics and geostatistics. However, there are many circumstances, particularly in systematic studies conducted for environmental or regulatory purposes, where traditional parametric statistical procedures can be beneficial. An example is the application of analysis of variance to data collected in an annual program of measuring groundwater levels in Kansas. Influences such as well conditions, operator effects, and use of the water can be assessed and wells that yield less reliable measurements can be identified. Such statistical studies have resulted in yearly improvements in the quality and reliability of the collected hydrologic data. Similar benefits may be achieved in other geological studies by the appropriate use of classical statistical tools.
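The analysis-of-variance step described above can be sketched with a one-way ANOVA: test whether the measurement "operator" has a significant effect on water-level readings. The data are fabricated for illustration (three hypothetical operators, one deliberately biased), not the Kansas monitoring data.

```python
# One-way ANOVA sketch: does operator identity affect water-level readings?
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# depth-to-water measurements (ft) by three operators at comparable wells
operator_a = rng.normal(120.0, 1.0, 20)
operator_b = rng.normal(120.2, 1.0, 20)
operator_c = rng.normal(124.0, 1.0, 20)   # a systematically biased operator

f_stat, p_value = stats.f_oneway(operator_a, operator_b, operator_c)
print(f"F = {f_stat:.1f}, p = {p_value:.2g}")
```

A small p-value flags an operator effect, exactly the kind of finding that lets less reliable measurements be identified and the collection procedure improved.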

  5. Measuring aerobic cycling power as an assessment of childhood fitness.

    Science.gov (United States)

    Carrel, Aaron L; Sledge, Jeffrey S; Ventura, Steve J; Clark, R Randall; Peterson, Susan E; Eickhoff, Jens C; Allen, David B

    2008-01-01

    The emergence of obesity, insulin resistance, and type 2 diabetes in children requires a rational, effective public health response. Physical activity remains an important component of prevention and treatment for obesity, type 2 diabetes, and insulin resistance. Studies in adults show cardiovascular fitness to be more important than obesity in predicting insulin resistance. We recently demonstrated that a school-based fitness intervention in children who are overweight could improve cardiovascular fitness, body composition, and insulin sensitivity, but it remains unclear whether accurate assessment of fitness could be performed at the school or outside of an exercise laboratory. To determine whether new methodology using measurement of cycling power could estimate cardiovascular aerobic fitness (as defined by VO2max) in middle school children who were overweight. Thirty-five middle school children (mean age 12 +/- 0.4 years) who were overweight underwent testing on a power sensor-equipped Cycle Ops indoor cycle (Saris Cycling Group, Fitchburg, WI) as well as body composition by dual x-ray absorptiometry and VO2max by treadmill determination. Insulin sensitivity was also estimated by fasting glucose and insulin. Maximal heart rate (MHR) was determined during VO2max testing, and power produced at 80%MHR was recorded. Spearman's rank correlation was performed to evaluate associations. Mean power determined on the indoor cycle at 80% of MHR was 129 +/- 77 watts, and average power at 80% MHR divided by total body weight was 1.5 +/- 0.5. A significant correlation between watts and total body weight was seen for VO2max (P = 0.03), and significant negative correlation was seen between watts/total body weight and fasting insulin (P effort than laboratory-based measurements. PMID:18296974

  6. A Framework for Assessing the Commercialization of Photovoltaic Power Generation

    Science.gov (United States)

    Yaqub, Mahdi

    An effective framework does not currently exist with which to assess the viability of commercializing photovoltaic (PV) power generation in the US energy market. Adopting a new technology, such as utility-scale PV power generation, requires a commercialization assessment framework. The framework developed here assesses the economic viability of a set of alternatives of identified factors. Economic viability focuses on simulating the levelized cost of electricity (LCOE) as a key performance measure to realize `grid parity', or the equivalence between the PV electricity prices and grid electricity prices for established energy technologies. Simulation results confirm that `grid parity' could be achieved without the current federal 30% investment tax credit (ITC) via a combination of three strategies: 1) using economies of scale to reduce the LCOE by 30% from its current value of 3.6 cents/kWh to 2.5 cents/kWh, 2) employing a longer power purchase agreement (PPA) over 30 years at a 4% interest rate, and 3) improving by 15% the "capacity factor", which is the ratio of the total annual generated energy to the full potential annual generation when the utility is continuously operating at its rated output. The lower than commercial-market interest rate of 4% that is needed to realize `grid parity' is intended to replace the current federal 30% ITC subsidy, which does not have a cash inflow to offset the outflow of subsidy payments. The 4% interest rate can be realized through two proposed finance plans: The first plan involves the implementation of carbon fees on polluting power plants to produce the capital needed to lower the utility PPA loan term interest rate from its current 7% to the necessary 4% rate. The second plan entails a proposed public debt finance plan. 
Under this plan, the US Government leverages its guarantee power to issue bonds and uses the proceeds to finance the construction and operation of PV power plants with PPA loan with a 4% interest rate for a

  7. Selection for Environmental Variation: a Statistical Analysis and Power Calculations to Detect Response

    DEFF Research Database (Denmark)

    Ibáñez-Escriche, Noelia; Sorensen, Daniel; Waagepetersen, Rasmus;

    2008-01-01

    (HET model) was compared. Various methods to assess the quality of fit favour the HET model. The posterior mean (95% posterior interval)of the additive genetic variance affecting the environmental variance was 0.16(0.10;0.25) and the corresponding figure for the coefficient of correlation between genes...

  8. Climatic change of summer temperature and precipitation in the Alpine region - a statistical-dynamical assessment

    Energy Technology Data Exchange (ETDEWEB)

    Heimann, D.; Sept, V.

    1998-12-01

    Climatic changes in the Alpine region due to increasing greenhouse gas concentrations are assessed by using statistical-dynamical downscaling. The downscaling procedure is applied to two 30-year periods (1971-2000 and 2071-2100, summer months only) of the output of a transient coupled ocean/atmosphere climate scenario simulation. The downscaling results for the present-day climate are in sufficient agreement with observations. The estimated regional climate change during the next 100 years shows a general warming. The mean summer temperatures increase by about 3 to more than 5 Kelvin. The most intense climatic warming is predicted in the western parts of the Alps. The amount of summer precipitation decreases in most parts of central Europe by more than 20 percent; an increase in precipitation is simulated only over the Adriatic area and parts of eastern central Europe. The results are compared with observed trends and results of regional climate change simulations by other authors. The observed trends and the majority of the simulated trends agree with our results. However, there are also climate change estimates that completely contradict ours. (orig.) 29 refs.

  9. Assessing pneumococcal meningitis association with viral respiratory infections and antibiotics: insights from statistical and mathematical models.

    Science.gov (United States)

    Opatowski, Lulla; Varon, Emmanuelle; Dupont, Claire; Temime, Laura; van der Werf, Sylvie; Gutmann, Laurent; Boëlle, Pierre-Yves; Watier, Laurence; Guillemot, Didier

    2013-08-01

    Pneumococcus is an important human pathogen, highly antibiotic resistant and a major cause of bacterial meningitis worldwide. Better prevention requires understanding the drivers of pneumococcal infection incidence and antibiotic susceptibility. Although respiratory viruses (including influenza) have been suggested to influence pneumococcal infections, the underlying mechanisms are still unknown, and viruses are rarely considered when studying pneumococcus epidemiology. Here, we propose a novel mathematical model to examine hypothetical relationships between Streptococcus pneumoniae meningitis incidence (SPMI), acute viral respiratory infections (AVRIs) and antibiotic exposure. French time series of SPMI, AVRI and penicillin consumption over 2001-2004 are analysed and used to assess four distinct virus-bacteria interaction submodels, ascribing the interaction on pneumococcus transmissibility and/or pathogenicity. The statistical analysis reveals strong associations between time series: SPMI increases shortly after AVRI incidence and decreases overall as the antibiotic-prescription rate rises. Model simulations require a combined impact of AVRI on both pneumococcal transmissibility (up to 1.3-fold increase at the population level) and pathogenicity (up to threefold increase) to reproduce the data accurately, along with diminished epidemic fitness of resistant pneumococcal strains causing meningitis (0.97 (0.96-0.97)). Overall, our findings suggest that AVRI and antibiotics strongly influence SPMI trends. Consequently, vaccination protecting against respiratory virus could have unexpected benefits to limit invasive pneumococcal infections.

  10. Blind image quality assessment: a natural scene statistics approach in the DCT domain.

    Science.gov (United States)

    Saad, Michele A; Bovik, Alan C; Charrier, Christophe

    2012-08-01

    We develop an efficient, general-purpose, blind/no-reference image quality assessment (NR-IQA) algorithm using a natural scene statistics (NSS) model of discrete cosine transform (DCT) coefficients. The algorithm is computationally appealing, given the availability of platforms optimized for DCT computation. The approach relies on a simple Bayesian inference model to predict image quality scores given certain extracted features. The features are based on an NSS model of the image DCT coefficients. The estimated parameters of the model are utilized to form features that are indicative of perceptual quality. These features are used in a simple Bayesian inference approach to predict quality scores. The resulting algorithm, which we name BLIINDS-II, requires minimal training and adopts a simple probabilistic model for score prediction. Given the extracted features from a test image, the quality score that maximizes the probability of the empirically determined inference model is chosen as the predicted quality score of that image. When tested on the LIVE IQA database, BLIINDS-II is shown to correlate highly with human judgments of quality, at a level that is competitive with the popular SSIM index.
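The NSS idea underlying BLIINDS-II is that block DCT coefficients of natural images are heavy-tailed, and distortion changes their distributional shape. The toy sketch below pools non-DC coefficients of 8x8 blocks and measures their sample kurtosis as a crude stand-in for the generalized Gaussian shape features the real algorithm fits; the images are synthetic and the feature is a simplification, not the published feature set.

```python
# Toy DCT-domain scene-statistics feature: kurtosis of pooled non-DC
# coefficients of 8x8 block DCTs. Structured (edge-bearing) content yields
# heavy tails; white noise yields near-Gaussian coefficients.
import numpy as np
from scipy.fftpack import dct

def block_dct_kurtosis(img, b=8):
    coeffs = []
    h, w = img.shape
    for i in range(0, h - h % b, b):
        for j in range(0, w - w % b, b):
            block = img[i:i + b, j:j + b].astype(float)
            c = dct(dct(block.T, norm="ortho").T, norm="ortho")  # 2-D DCT
            coeffs.append(c.ravel()[1:])                         # drop DC term
    x = np.concatenate(coeffs)
    x = x - x.mean()
    return float(np.mean(x**4) / np.mean(x**2) ** 2)

rng = np.random.default_rng(1)
structured = np.zeros((64, 64))
structured[20:44, 20:44] = 100.0            # a bright square: sharp edges
noise = rng.normal(0.0, 10.0, (64, 64))     # white noise image

k_struct = block_dct_kurtosis(structured)
k_noise = block_dct_kurtosis(noise)
print(k_struct, k_noise)
```

The structured image's sparse, peaky coefficient distribution (large kurtosis) versus the noise image's near-Gaussian one (kurtosis near 3) is the statistical regularity that NSS-based quality features exploit.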

  11. Multivariate statistical approach for the assessment of groundwater quality in Ujjain City, India.

    Science.gov (United States)

    Vishwakarma, Vikas; Thakur, Lokendra Singh

    2012-10-01

    Groundwater quality assessment is an essential study which plays an important role in the rational development and utilization of groundwater. Groundwater quality greatly influences the health of local people. The variations of water quality are essentially the combination of both anthropogenic and natural contributions. In order to understand the underlying physical and chemical processes, this study analyzes 8 chemical and physical-chemical water quality parameters, viz. pH, turbidity, electrical conductivity, total dissolved solids, total alkalinity, total hardness, chloride and fluoride, recorded at the 54 sampling stations during the summer season of 2011, by using multivariate statistical techniques. Hierarchical clustering analysis (CA) is first applied to distinguish groundwater quality patterns among the stations, followed by the use of principal component analysis (PCA) and factor analysis (FA) to extract and recognize the major underlying factors contributing to the variations among the water quality measures. The first three components were chosen for interpretation of the data, which account for 72.502% of the total variance in the data set. The maximum number of variables, i.e. turbidity, EC, TDS and chloride, were characterized by the first component, while the second and third were characterized by total alkalinity, total hardness, fluoride and pH respectively. This shows that the hydrochemical constituents of the groundwater are mainly controlled by EC, TDS, and fluoride. The findings of the cluster analysis are presented in the form of dendrograms of the sampling stations (cases) as well as hydrochemical variables, which produced four major groupings, suggesting that groundwater monitoring can be consolidated.
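The PCA step described above can be sketched as follows: standardize the measured parameters, extract components, and inspect how much variance the leading components explain. The data are synthetic (a correlated TDS/EC/chloride trio plus an independent pH), chosen only to illustrate the mechanics, not the Ujjain measurements.

```python
# PCA sketch for water quality data: standardize, fit, read variance ratios.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
n = 54                                    # sampling stations
tds = rng.normal(500, 120, n)             # total dissolved solids (mg/L)
ec = 1.6 * tds + rng.normal(0, 30, n)     # EC tracks TDS closely
chloride = 0.2 * tds + rng.normal(0, 15, n)
ph = rng.normal(7.5, 0.3, n)              # roughly independent of the rest
X = np.column_stack([tds, ec, chloride, ph])

pca = PCA(n_components=3).fit(StandardScaler().fit_transform(X))
print(pca.explained_variance_ratio_)
```

The first component absorbs the correlated mineralization variables, mirroring how the study's first component captured turbidity, EC, TDS and chloride while later components picked up the remaining parameters.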

  12. Assessment of Reservoir Water Quality Using Multivariate Statistical Techniques: A Case Study of Qiandao Lake, China

    Directory of Open Access Journals (Sweden)

    Qing Gu

    2016-03-01

    Full Text Available Qiandao Lake (Xin’an Jiang reservoir plays a significant role in drinking water supply for eastern China, and it is an attractive tourist destination. Three multivariate statistical methods were comprehensively applied to assess the spatial and temporal variations in water quality as well as potential pollution sources in Qiandao Lake. Data sets of nine parameters from 12 monitoring sites during 2010–2013 were obtained for analysis. Cluster analysis (CA was applied to classify the 12 sampling sites into three groups (Groups A, B and C and the 12 monitoring months into two clusters (April-July, and the remaining months. Discriminant analysis (DA identified Secchi disc depth, dissolved oxygen, permanganate index and total phosphorus as the significant variables for distinguishing variations of different years, with 79.9% correct assignments. Dissolved oxygen, pH and chlorophyll-a were determined to discriminate between the two sampling periods classified by CA, with 87.8% correct assignments. For spatial variation, DA identified Secchi disc depth and ammonia nitrogen as the significant discriminating parameters, with 81.6% correct assignments. Principal component analysis (PCA identified organic pollution, nutrient pollution, domestic sewage, and agricultural and surface runoff as the primary pollution sources, explaining 84.58%, 81.61% and 78.68% of the total variance in Groups A, B and C, respectively. These results demonstrate the effectiveness of integrated use of CA, DA and PCA for reservoir water quality evaluation and could assist managers in improving water resources management.
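The discriminant-analysis step can be sketched with a linear discriminant classifier: fit on parameter vectors labelled by sampling period and report the fraction of correct assignments (the study reports 79.9-87.8%). The two periods, the two discriminating parameters, and all class means below are fabricated for illustration, not the Qiandao Lake data.

```python
# DA sketch: classify sampling period from water quality parameters and
# report the rate of correct assignments.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(7)
# columns: dissolved oxygen (mg/L), chlorophyll-a (ug/L) -- fabricated
period1 = np.column_stack([rng.normal(8.0, 0.8, 60), rng.normal(4.0, 1.0, 60)])
period2 = np.column_stack([rng.normal(6.5, 0.8, 60), rng.normal(7.0, 1.0, 60)])
X = np.vstack([period1, period2])
y = np.array([0] * 60 + [1] * 60)

lda = LinearDiscriminantAnalysis().fit(X, y)
correct = lda.score(X, y)
print(f"{correct:.1%} correct assignments")
```

The "correct assignments" percentage quoted in such studies is exactly this kind of classification rate for the fitted discriminant functions.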

  13. Does bisphenol A induce superfeminization in Marisa cornuarietis? Part II: toxicity test results and requirements for statistical power analyses.

    Science.gov (United States)

    Forbes, Valery E; Aufderheide, John; Warbritton, Ryan; van der Hoeven, Nelly; Caspers, Norbert

    2007-03-01

    This study presents results of the effects of bisphenol A (BPA) on adult egg production, egg hatchability, egg development rates and juvenile growth rates in the freshwater gastropod, Marisa cornuarietis. We observed no adult mortality, substantial inter-snail variability in reproductive output, and no effects of BPA on reproduction during 12 weeks of exposure to 0, 0.1, 1.0, 16, 160 or 640 microg/L BPA. We observed no effects of BPA on egg hatchability or timing of egg hatching. Juveniles showed good growth in the control and all treatments, and there were no significant effects of BPA on this endpoint. Our results do not support previous claims of enhanced reproduction in Marisa cornuarietis in response to exposure to BPA. Statistical power analysis indicated high levels of inter-snail variability in the measured endpoints and highlighted the need for sufficient replication when testing treatment effects on reproduction in M. cornuarietis with adequate power.
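The replication requirement the authors highlight can be made concrete with a simulation-based power analysis: estimate how often a two-sample t-test detects a given treatment effect on reproductive output, as a function of replication, when inter-individual variability is high. The effect size, variability and group sizes below are illustrative assumptions, not the study's values.

```python
# Simulated power of a t-test for a treatment effect on egg production,
# under high inter-snail variability, at two levels of replication.
import numpy as np
from scipy import stats

def power_by_simulation(n_per_group, effect, sd, n_sim=2000, alpha=0.05, seed=0):
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sim):
        control = rng.normal(100.0, sd, n_per_group)            # eggs per snail
        treated = rng.normal(100.0 - effect, sd, n_per_group)   # reduced output
        _, p = stats.ttest_ind(control, treated)
        hits += p < alpha
    return hits / n_sim

low_rep = power_by_simulation(n_per_group=6, effect=20, sd=30)
high_rep = power_by_simulation(n_per_group=40, effect=20, sd=30)
print(low_rep, high_rep)
```

With noisy endpoints, a handful of replicates leaves most real effects undetected, which is the point of the authors' call for adequate replication.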

  14. Use of Non-Parametric Statistical Method in Identifying Repetitive High Dose Jobs in a Nuclear Power Plant

    International Nuclear Information System (INIS)

    The cost-effective reduction of occupational radiation dose (ORD) at a nuclear power plant could not be achieved without an extensive analysis of the accumulated ORD data of existing plants. Through the data analysis, it is required to identify the jobs with repetitive high ORD at the nuclear power plant. In this study, the Percentile Rank Sum Method (PRSM), based on non-parametric statistical theory, is proposed to identify repetitive high-ORD jobs. As a case study, the method is applied to ORD data of maintenance and repair jobs at Kori units 3 and 4, which are pressurized water reactors with 950 MWe capacity that have been operated since 1986 and 1987, respectively, in Korea. The results were verified and validated, and PRSM has been demonstrated to be an efficient method of analyzing the data.
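The percentile-rank-sum idea can be sketched as follows: convert each year's job doses to percentile ranks within that year, then sum the ranks across years so that jobs which are high-dose repeatedly, rather than once, stand out. The procedural details of the paper's PRSM may differ, and the dose table below is fabricated.

```python
# Percentile Rank Sum sketch: flag jobs that are repeatedly high-dose.
import numpy as np
from scipy.stats import rankdata

# rows = maintenance jobs, columns = annual collective dose (person-mSv)
doses = np.array([
    [50, 55, 60],   # high every year -> repetitive high-ORD job
    [10, 12,  9],
    [30,  5,  8],   # high once only, not repetitive
    [20, 22, 18],
])

# percentile rank of each job within each year (column-wise)
pct = np.apply_along_axis(lambda c: rankdata(c) / len(c) * 100, 0, doses)
score = pct.sum(axis=1)
print(score)                   # job 0 accumulates the largest rank sum
print(int(np.argmax(score)))   # -> 0
```

A job that is merely high in one outage (row 2) scores no higher than a consistently moderate one, which is why a rank sum, not a raw dose sum, isolates the *repetitive* high-dose jobs.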

  15. Assessment of electrical equipment aging for nuclear power plant

    International Nuclear Information System (INIS)

    The electrical and instrumentation equipments, especially whose parts are made of polymer material, are gradually degraded by thermal and radiation environment in the normal operation, and the degradation is thought to progress rapidly when they are exposed to the environment of the design basis event (DBE). The integrity of the equipments is evaluated by the environmental qualification (EQ) test simulating the environment of the normal operation and the DBE. The project of 'Assessment of Cable Aging for Nuclear Power Plants' (ACA, 2002-2008) indicated the importance of applying simultaneous thermal and radiation aging for simulating the aging in normal operation. The project of 'Assessment of Electrical Equipment Aging for Nuclear Power Plants' (AEA) was initiated in FY2008 to apply the outcome of ACA to the other electrical and instrumentation equipment and to establish an advanced EQ test method that can appropriately simulate the environment in actual plants. In FY2012, aging characteristics of thermal aging and simultaneous aging were obtained for the epoxy resin of electrical penetrations and the O-ring of connectors. Physical property measurement was carried out for epoxy resin of electrical penetration subject to the type testing in FY2010. (author)

  16. From probabilistic forecasts to statistical scenarios of short-term wind power production

    DEFF Research Database (Denmark)

    Pinson, Pierre; Papaefthymiou, George; Klockl, Bernd;

    2009-01-01

    Short-term (up to 2-3 days ahead) probabilistic forecasts of wind power provide forecast users with highly valuable information on the uncertainty of expected wind generation. Whatever the type of these probabilistic forecasts, they are produced on a per horizon basis, and hence do not inform on the development of the forecast uncertainty through forecast series. However, this additional information may be paramount for a large class of time-dependent and multistage decision-making problems, e.g. optimal operation of combined wind-storage systems or multiple-market trading with different gate...

  17. Safety Assessment of Nuclear Power Plants for Liquefaction Consequences

    Directory of Open Access Journals (Sweden)

    Tamás János Katona

    2015-01-01

    Full Text Available In the case of some nuclear power plants constructed at soft-soil sites, liquefaction should be analysed as a beyond-design-basis hazard. The aim of the analysis is to define the post-event condition of the plant, identify plant vulnerabilities, and identify the necessary measures for accident management. In the paper, the methodology of the analysis of liquefaction effects for nuclear power plants is outlined. The procedure includes identification of the scope of the safety analysis and the acceptable limit cases for plant structures having different roles from an accident management point of view. Considerations are made for identification of the dominating effects of liquefaction. The possibility of decoupling the analysis of liquefaction effects from the analysis of vibratory ground motion is discussed. It is shown in the paper that the practicable empirical methods for definition of liquefaction susceptibility provide rather controversial results. Selection of a method for assessment of soil behaviour that affects the integrity of structures requires specific considerations. The case of the nuclear power plant at Paks, Hungary, is used as an example to demonstrate the practical importance of the presented results and considerations.

  18. Aging assessment of surge protective devices in nuclear power plants

    International Nuclear Information System (INIS)

    An assessment was performed to determine the effects of aging on the performance and availability of surge protective devices (SPDs), used in electrical power and control systems in nuclear power plants. Although SPDs have not been classified as safety-related, they are risk-important because they can minimize the initiating event frequencies associated with loss of offsite power and reactor trips. Conversely, their failure due to age might cause some of those initiating events, e.g., through short circuit failure modes, or by allowing deterioration of the safety-related component(s) they are protecting from overvoltages, perhaps preventing a reactor trip, from an open circuit failure mode. From the data evaluated during 1980--1994, it was found that failures of surge arresters and suppressers by short circuits were neither a significant risk nor safety concern, and there were no failures of surge suppressers preventing a reactor trip. Simulations, using the ElectroMagnetic Transients Program (EMTP) were performed to determine the adequacy of high voltage surge arresters

  19. Aging assessment of surge protective devices in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Davis, J.F.; Subudhi, M. [Brookhaven National Lab., Upton, NY (United States); Carroll, D.P. [Florida Univ., Gainesville, FL (United States)

    1996-01-01

    An assessment was performed to determine the effects of aging on the performance and availability of surge protective devices (SPDs), used in electrical power and control systems in nuclear power plants. Although SPDs have not been classified as safety-related, they are risk-important because they can minimize the initiating event frequencies associated with loss of offsite power and reactor trips. Conversely, their failure due to age might cause some of those initiating events, e.g., through short circuit failure modes, or by allowing deterioration of the safety-related component(s) they are protecting from overvoltages, perhaps preventing a reactor trip, from an open circuit failure mode. From the data evaluated during 1980--1994, it was found that failures of surge arresters and suppressers by short circuits were neither a significant risk nor safety concern, and there were no failures of surge suppressers preventing a reactor trip. Simulations, using the ElectroMagnetic Transients Program (EMTP) were performed to determine the adequacy of high voltage surge arresters.

  20. Modular power system topology assessment using Gaussian potential functions

    Energy Technology Data Exchange (ETDEWEB)

    Garcia-Lagos, F.; Joya, G.; Sandoval, F. [Universidad de Malaga, ETSI Telecomunicacion (Spain). Dpto. Tecnologia Electronica; Marin, F.J. [Universidad de Malaga, ETSI Informatica (Spain). Dpto. Electronica

    2003-09-01

    A topology assessment system for power systems using active power measurements as input data is presented. The method is designed to be incorporated into a state estimator working with a bus-branch oriented network model. The system architecture contains two stages: (i) the preprocessing stage; and (ii) the classification stage. The preprocessing stage transforms each current measurement set to produce a vector in the [0,1]^n space, and produces clusters of very similar output vectors for each grid topology. The classification stage consists of a layer of Gaussian potential units with Mahalanobis distance, and classifies the preprocessing output vectors to identify the actual topology. The main features of this method are: (i) local topology identification; (ii) linear growth of complexity with power system size; (iii) correction of multiple errors; and (iv) insensitivity to bad data. Tests have been carried out using the IEEE 14, 30, 57, 118 and 300 standard networks and different topological and measurement configurations. These tests have demonstrated the successful application of the technique. (Author)
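    As an illustration of the classification stage, a minimal sketch is given below: each known topology is represented by a Gaussian potential unit, and a preprocessed measurement vector is assigned to the topology with the largest potential under a diagonal-covariance Mahalanobis distance. The cluster statistics, labels and three-dimensional vectors are invented for the example; the paper's preprocessing is not reproduced.

```python
import math

def gaussian_potential(x, mean, var):
    """Gaussian potential of preprocessed vector x for one topology class,
    using a diagonal-covariance Mahalanobis distance."""
    d2 = sum((xi - mi) ** 2 / vi for xi, mi, vi in zip(x, mean, var))
    return math.exp(-0.5 * d2)

def classify_topology(x, classes):
    """Return the topology label with the largest Gaussian potential."""
    return max(classes, key=lambda label: gaussian_potential(x, *classes[label]))

# Hypothetical per-topology cluster statistics (mean, variance) in [0,1]^3
classes = {
    "topology_A": ([0.1, 0.2, 0.9], [0.01, 0.01, 0.01]),
    "topology_B": ([0.8, 0.7, 0.1], [0.01, 0.01, 0.01]),
}
print(classify_topology([0.15, 0.25, 0.85], classes))  # topology_A
```

    A real system would hold one unit per observed topology and per-dimension variances estimated from the preprocessing clusters.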

  1. Preliminary environmental assessment for the Satellite Power System (SPS). Revision 1. Volume 2. Detailed assessment

    Energy Technology Data Exchange (ETDEWEB)

    1980-01-01

    The Department of Energy (DOE) is considering several options for generating electrical power to meet future energy needs. The satellite power system (SPS), one of these options, would collect solar energy through a system of satellites in space and transfer this energy to earth. A reference system has been described that would convert the energy to microwaves and transmit the microwave energy via directive antennas to large receiving/rectifying antennas (rectennas) located on the earth. At the rectennas, the microwave energy would be converted into electricity. The potential environmental impacts of constructing and operating the satellite power system are being assessed as a part of the Department of Energy's SPS Concept Development and Evaluation Program. This report is Revision 1 of the Preliminary Environmental Assessment for the Satellite Power System published in October 1978. It refines and extends the 1978 assessment and provides a basis for a 1980 revision that will guide and support DOE recommendations regarding future SPS development. This is Volume 2 of two volumes. It contains the technical detail suitable for peer review and integrates information appearing in documents referenced herein. The key environmental issues associated with the SPS concern human health and safety, ecosystems, climate, and electromagnetic systems interactions. In order to address these issues in an organized manner, five tasks are reported: (I) microwave-radiation health and ecological effects; (II) nonmicrowave health and ecological effects; (III) atmospheric effects; (IV) effects on communication systems due to ionospheric disturbance; and (V) electromagnetic compatibility. (WHK)

  2. Statistical modelling and power analysis for detecting trends in total suspended sediment loads

    Science.gov (United States)

    Wang, You-Gan; Wang, Shen S. J.; Dunlop, Jason

    2015-01-01

    The export of sediments from coastal catchments can have detrimental impacts on estuaries and near-shore reef ecosystems such as the Great Barrier Reef. Catchment management approaches aimed at reducing sediment loads require monitoring to evaluate their effectiveness in reducing loads over time. However, load estimation is not a trivial task due to the complex behaviour of constituents in natural streams, the variability of water flows and often a limited amount of data. Regression is commonly used for load estimation and provides a fundamental tool for trend estimation by standardising for other time-specific covariates such as flow. This study investigates whether load estimates and the resultant power to detect trends can be enhanced by (i) modelling the error structure so that temporal correlation can be better quantified, (ii) making use of predictive variables, and (iii) identifying an efficient and feasible sampling strategy that may be used to reduce sampling error. To achieve this, we propose a new regression model that includes an innovative compounding-errors model structure and uses two additional predictive variables (average discounted flow and turbidity). By combining this modelling approach with a new, regularly optimised sampling strategy, which adds uniformity to the event sampling strategy, the power to detect trends was increased to 90%. Using the enhanced regression model proposed here, it was possible to detect a trend of 20% over 20 years. This result is in stark contrast to previous conclusions presented in the literature.
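    The power calculation reported above can be approximated by simulation: generate monthly series containing a 20% decline over 20 years plus autocorrelated sampling error, fit an ordinary least squares trend, and count how often the decline is detected. This is a hedged sketch, not the paper's compounding-errors model; the noise level, AR(1) coefficient and detection rule are illustrative assumptions.

```python
import math
import random

def ols_slope(y):
    """OLS slope and its standard error for y regressed on time 0..n-1."""
    n = len(y)
    xbar = (n - 1) / 2.0
    ybar = sum(y) / n
    sxx = sum((i - xbar) ** 2 for i in range(n))
    b = sum((i - xbar) * (y[i] - ybar) for i in range(n)) / sxx
    a = ybar - b * xbar
    s2 = sum((y[i] - (a + b * i)) ** 2 for i in range(n)) / (n - 2)
    return b, math.sqrt(s2 / sxx)

def trend_power(n_years=20, decline=0.20, sigma=0.15, phi=0.3, reps=300, seed=1):
    """Empirical power: share of simulated monthly log-load series in which
    a `decline` over n_years is flagged as a significant downward trend."""
    rng = random.Random(seed)
    n = 12 * n_years
    slope_true = math.log(1.0 - decline) / n if decline else 0.0
    hits = 0
    for _ in range(reps):
        e, y = 0.0, []
        for i in range(n):
            e = phi * e + rng.gauss(0.0, sigma)   # AR(1) sampling error
            y.append(slope_true * i + e)
        b, se = ols_slope(y)
        if b / se < -2.0:                          # one-sided detection rule
            hits += 1
    return hits / reps

print(trend_power())
```

    With these assumed parameters the detection rate is high; shrinking the record length or inflating the noise shows how quickly power collapses.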

  3. Fast fMRI provides high statistical power in the analysis of epileptic networks.

    Science.gov (United States)

    Jacobs, Julia; Stich, Julia; Zahneisen, Benjamin; Assländer, Jakob; Ramantani, Georgia; Schulze-Bonhage, Andreas; Korinthenberg, Rudolph; Hennig, Jürgen; LeVan, Pierre

    2014-03-01

    EEG-fMRI is a unique method to combine the high temporal resolution of EEG with the high spatial resolution of MRI to study generators of intrinsic brain signals such as sleep grapho-elements or epileptic spikes. While the standard EPI sequence in fMRI experiments has a temporal resolution of around 2.5-3s, a newly established fast fMRI sequence called MREG (Magnetic-Resonance-Encephalography) provides a temporal resolution of around 100ms. This technical novelty promises to improve statistics, facilitate correction of physiological artifacts and improve the understanding of epileptic networks in fMRI. The present study compares simultaneous EEG-EPI and EEG-MREG analyzing epileptic spikes to determine the yield of fast MRI in the analysis of intrinsic brain signals. Patients with frequent interictal spikes (>3/20min) underwent EEG-MREG and EEG-EPI (3T, 20min each, voxel size 3×3×3mm, EPI TR=2.61s, MREG TR=0.1s). Timings of the spikes were used in an event-related analysis to generate activation maps of t-statistics (FMRISTAT, |t|>3.5, cluster size: 7 voxels, p<0.05 corrected). For both sequences, the amplitude and location of significant BOLD activations were compared with the spike topography. 13 patients were recorded and 33 different spike types could be analyzed. Peak T-values were significantly higher in MREG than in EPI (p<0.0001). Positive BOLD effects correlating with the spike topography were found in 8/29 spike types using the EPI and in 22/33 spike types using the MREG sequence. Negative BOLD responses in the default mode network could be observed in 3/29 spike types with the EPI and in 19/33 with the MREG sequence. With the latter method, BOLD changes were observed even when few spikes occurred during the investigation. Simultaneous EEG-MREG thus is possible with good EEG quality and shows higher sensitivity in regard to the localization of spike-related BOLD responses than EEG-EPI. The development of new methods of analysis for this sequence such as

  4. Techno-economic assessment of thorium power in Canada

    International Nuclear Information System (INIS)

    Highlights: • Costs of replacing uranium in Canada’s nuclear reactors with thorium evaluated. • Results show a thorium plant to be more financially lucrative than a uranium plant. • Results were most sensitive to electricity price, then capital and decommissioning cost. • Abatement cost analysis showed nuclear power offers cost savings over fossil fuels. - Abstract: Thorium fission is a large yet relatively unexplored renewable energy source and could help meet increasing energy demand. An analysis was performed on the feasibility of replacing the uranium in Canada’s nuclear reactors with thorium. Thorium only exists as a fertile isotope, so an external fissile source such as 235U, 233U, or 239Pu is required to initiate the fission process. A uranium plant and a similar thorium-fuelled plant were compared over a 40-year operational life based on a comprehensive economic analysis. The results from the economic analysis were used to estimate the greenhouse gas (GHG) abatement cost compared to coal- and natural gas-based power. The economic analysis determined that a thorium plant is more financially lucrative in Canada than a uranium plant. An abatement cost assessment in relation to gas-fired and coal-fired power plants demonstrated that nuclear power offers a cost saving per tonne of CO2-equivalent greenhouse gas (GHG) when compared to both fossil fuel alternatives. From the values determined for a plant potentially fuelled on thorium, the abatement cost when compared to the coal-fired and gas-fired plants is −$10.4/tonne-CO2eq and −$15.7/tonne-CO2eq, respectively.
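    The abatement cost quoted above reduces to a simple ratio: the extra levelized generation cost of the alternative plant divided by the emissions it avoids relative to the fossil baseline, so a cheaper low-carbon plant yields a negative (cost-saving) value. A minimal sketch with invented inputs (the paper's actual cost and emission figures are not used):

```python
def abatement_cost(lcoe_alt, lcoe_base, emis_base, emis_alt):
    """Cost per tonne of CO2-eq. avoided when the alternative plant replaces
    the baseline: extra generation cost divided by emissions avoided.
    lcoe_* in $/MWh, emis_* in tonne CO2-eq./MWh; negative = net saving."""
    return (lcoe_alt - lcoe_base) / (emis_base - emis_alt)

# Illustrative inputs only (not the paper's): nuclear vs. a coal baseline
print(abatement_cost(lcoe_alt=70.0, lcoe_base=80.0, emis_base=0.90, emis_alt=0.02))
```

    A negative result, as in the abstract, means the low-emission plant both abates GHG and generates more cheaply than the baseline.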

  5. Use assessment of electronic power sources for SMAW

    Directory of Open Access Journals (Sweden)

    Scotti, A.

    1999-04-01

    The aim of the present work was to assess the efficacy of the use of modern technologies for power supplies in Shielded Metal Arc Welding (SMAW). Coupon tests were welded by using a series of five different classes of commercial electrodes, covering their current ranges. Both a conventional electromagnetic and an electronic (inverter) power source were employed. Fusion rate, deposition efficiency, bead finish and weld geometry were measured in each experiment. Current and voltage signals were acquired at a high rate to evaluate the dynamic behavior of the power sources. The static performances of both power sources were also determined. The results showed that, despite the remarkable differences between the power supplies based on static and dynamic characterizations, no significant difference was noticed in the operational behavior of the electrodes under the given conditions, apart from a better anti-stick performance obtained with the electronic power source.

  6. The Power (Law) of Indian Markets: Analysing NSE and BSE trading statistics

    CERN Document Server

    Sinha, S; Sinha, Sitabhra; Pan, Raj Kumar

    2006-01-01

    The nature of fluctuations in the Indian financial market is analyzed in this paper. We have looked at the price returns of individual stocks, with tick-by-tick data from the National Stock Exchange (NSE) and daily closing price data from both NSE and the Bombay Stock Exchange (BSE), the two largest exchanges in India. We find that the price returns in Indian markets follow a fat-tailed cumulative distribution, consistent with a power law having exponent $\alpha \sim 3$, similar to that observed in developed markets. However, the distributions of trading volume and the number of trades have a different nature from that seen in the New York Stock Exchange (NYSE). Further, the price movements of different stocks are highly correlated in Indian markets.

  7. Intrinsic Variability and Field Statistics for the Vela Pulsar: 3. Two-Component Fits and Detailed Assessment of Stochastic Growth Theory

    OpenAIRE

    Cairns, Iver H.; Das, P; P A Robinson; Johnston, S

    2003-01-01

    The variability of the Vela pulsar (PSR B0833-45) corresponds to well-defined field statistics that vary with pulsar phase, ranging from Gaussian intensity statistics off-pulse to approximately power-law statistics in a transition region and then lognormal statistics on-pulse, excluding giant micropulses. These data are analyzed here in terms of two superposed wave populations, using a new calculation for the amplitude statistics of two vectorially-combined transverse fields. Detailed analyse...

  8. Of Disasters and Dragon Kings: A Statistical Analysis of Nuclear Power Incidents & Accidents

    CERN Document Server

    Wheatley, Spencer; Sornette, Didier

    2015-01-01

    We provide, and perform a risk theoretic statistical analysis of, a dataset that is 75 percent larger than the previous best dataset on nuclear incidents and accidents, comparing three measures of severity: INES (International Nuclear Event Scale), radiation released, and damage dollar losses. The annual rate of nuclear accidents, with size above 20 Million US$, per plant, decreased from the 1950s until dropping significantly after Chernobyl (April, 1986). The rate is now roughly stable at 0.002 to 0.003, i.e., around 1 event per year across the current fleet. The distribution of damage values changed after Three Mile Island (TMI; March, 1979), where moderate damages were suppressed but the tail became very heavy, being described by a Pareto distribution with tail index 0.55. Further, there is a runaway disaster regime, associated with the "dragon-king" phenomenon, amplifying the risk of extreme damage. In fact, the damage of the largest event (Fukushima; March, 2011) is equal to 60 percent of the total damag...
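    A tail index such as the 0.55 reported here is conventionally obtained by a maximum-likelihood (Hill-type) fit to the damages exceeding a threshold. The sketch below applies that estimator to synthetic Pareto-distributed damages with a known index; the threshold and sample size are arbitrary choices for the demonstration, not the paper's dataset.

```python
import math
import random

def pareto_tail_index(damages, threshold):
    """Maximum-likelihood (Hill-type) estimate of the Pareto tail index
    for damage values exceeding a threshold."""
    tail = [d for d in damages if d > threshold]
    return len(tail) / sum(math.log(d / threshold) for d in tail)

# Synthetic damages (million US$) drawn with a true tail index of 0.55
rng = random.Random(42)
sample = [20.0 * rng.paretovariate(0.55) for _ in range(5000)]
print(pareto_tail_index(sample, threshold=20.0))  # close to 0.55
```

    A tail index below 1 implies an infinite-mean regime, which is what makes single "dragon-king" events able to dominate the cumulative damage.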

  9. Wind power prognosis statistical system; Sistema estadistico de pronostico de la energia eoloelectrica

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez Garcia, Alfredo; De la Torre Vega, Eli [Instituto de Investigaciones Electricas, Cuernavaca, Morelos (Mexico)

    2009-07-01

    The integration of the first large-scale wind farm (La Venta II) into the National Interconnected System requires taking into account the random and intermittent nature of wind energy. An important tool for this task is a system for short-term forecasting of wind energy. For this reason, the Instituto de Investigaciones Electricas (IIE) developed a statistical model to produce this forecast. The prediction is made through an adaptive linear combination of alternative competing models, where the weight given to each model is based on its most recent forecast quality. The results of applying the forecast system are also presented and analyzed.
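    The adaptive combination can be sketched as inverse-error weighting: each competing model receives a weight proportional to the inverse of its recent mean squared forecast error, so recently accurate models dominate. The abstract does not specify the exact weighting rule, so the scheme and numbers below are illustrative assumptions.

```python
def combine_forecasts(forecasts, recent_errors, eps=1e-9):
    """Adaptive linear combination: each competing model is weighted by the
    inverse of its recent mean squared error, so the recently better model
    dominates the combined forecast."""
    inv = [1.0 / (e + eps) for e in recent_errors]
    total = sum(inv)
    weights = [v / total for v in inv]
    combined = sum(wt * f for wt, f in zip(weights, forecasts))
    return combined, weights

# Two hypothetical wind-power models (MW); model 1 was recently more accurate
combined, weights = combine_forecasts([100.0, 140.0], [4.0, 16.0])
print(combined, weights)  # ~108.0 with weights ~[0.8, 0.2]
```

    Recomputing the weights at each forecast step is what makes the combination adaptive to changing wind regimes.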

  10. Communications and control for electric power systems: Power flow classification for static security assessment

    Science.gov (United States)

    Niebur, D.; Germond, A.

    1993-01-01

    This report investigates the classification of power system states using an artificial neural network model, Kohonen's self-organizing feature map. The ultimate goal of this classification is to assess power system static security in real-time. Kohonen's self-organizing feature map is an unsupervised neural network which maps N-dimensional input vectors to an array of M neurons. After learning, the synaptic weight vectors exhibit a topological organization which represents the relationship between the vectors of the training set. This learning is unsupervised, which means that the number and size of the classes are not specified beforehand. In the application developed in this report, the input vectors used as the training set are generated by off-line load-flow simulations. The learning algorithm and the results of the organization are discussed.
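    A minimal sketch of the training loop described above, assuming a one-dimensional map, a Gaussian neighbourhood function and linearly decaying learning rate and radius (common textbook choices; the report's exact schedule is not given). The "load-flow state" vectors are invented stand-ins for the off-line load-flow simulations.

```python
import math
import random

def train_som(data, m=4, epochs=50, seed=0):
    """Kohonen self-organizing feature map: a 1-D array of m neurons.
    For each input, the winning (closest) neuron and its neighbours are
    moved toward the input; learning rate and radius decay over epochs."""
    rng = random.Random(seed)
    dim = len(data[0])
    w = [[rng.random() for _ in range(dim)] for _ in range(m)]
    for t in range(epochs):
        lr = 0.5 * (1.0 - t / epochs)
        radius = max(0.5, (m / 2.0) * (1.0 - t / epochs))
        for x in data:
            win = min(range(m), key=lambda j: sum((a - b) ** 2 for a, b in zip(x, w[j])))
            for j in range(m):
                h = math.exp(-((j - win) ** 2) / (2.0 * radius ** 2))
                w[j] = [wj + lr * h * (xi - wj) for wj, xi in zip(w[j], x)]
    return w

def winner(x, w):
    """Index of the neuron closest to input x."""
    return min(range(len(w)), key=lambda j: sum((a - b) ** 2 for a, b in zip(x, w[j])))

# Invented "secure" vs. "insecure" load-flow state vectors (two clusters)
states = [[0.10, 0.10], [0.12, 0.08], [0.90, 0.95], [0.88, 0.92]]
som = train_som(states)
print(winner([0.11, 0.09], som), winner([0.89, 0.93], som))
```

    The classes are not specified beforehand: distinct regions of the map come to win for distinct operating regimes, which is the property the report exploits for security classification.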

  11. Network Theory Integrated Life Cycle Assessment for an Electric Power System

    Directory of Open Access Journals (Sweden)

    Heetae Kim

    2015-08-01

    In this study, we allocate greenhouse gas (GHG) emissions of electricity transmission to consumers. As the allocation basis, we introduce energy distance, which takes the transmission load on the electricity system into account in addition to the amount of electricity consumed. As a case study, we estimate regional GHG emissions of electricity transmission loss in Chile. Life cycle assessment (LCA) is used to estimate the total GHG emissions of the Chilean electric power system, and the regional GHG emissions of transmission loss are calculated from this total. We construct a network model of the Chilean electric power grid as an undirected network with 466 nodes and 543 edges, holding the topology of the power grid based on the statistical record. We estimate the total annual GHG emissions of the Chilean electricity system at 23.07 Mt CO2-eq., of which 1.61 Mt CO2-eq. corresponds to transmission loss. The total energy distance for electricity transmission accounts for 12,842.10 TWh km based on network analysis. We argue that when the GHG emissions of electricity transmission loss are estimated, the transmission load should be considered separately. We propose network theory as a useful complement to LCA for this complex allocation. Energy distance is especially useful on very large-scale electric power grids, such as an intercontinental transmission network.
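    Under this definition, a consumer's energy distance is its consumption multiplied by its transmission distance, and loss emissions are allocated proportionally. A minimal sketch with invented regional figures; only the 1.61 Mt CO2-eq. loss total is taken from the abstract:

```python
def allocate_by_energy_distance(consumers, loss_ghg_total):
    """Allocate transmission-loss GHG emissions to consumers in proportion
    to their energy distance = consumption x transmission distance."""
    ed = {name: energy * km for name, (energy, km) in consumers.items()}
    total = sum(ed.values())
    return {name: loss_ghg_total * d / total for name, d in ed.items()}

# Invented regions: (consumption in TWh, mean transmission distance in km)
regions = {"north": (10.0, 400.0), "centre": (30.0, 50.0), "south": (5.0, 300.0)}
shares = allocate_by_energy_distance(regions, loss_ghg_total=1.61)  # Mt CO2-eq.
print(shares)
```

    Note how the remote, lightly consuming "north" can carry a larger share than the heavily consuming "centre": distance, not consumption alone, drives the allocation.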

  12. Investigation and assessment of tritium concentration of aquatic environment surrounding haiyang nuclear power plant

    International Nuclear Information System (INIS)

    Objective: To investigate tritium concentrations in the aquatic environment surrounding the Haiyang nuclear power plant, analyze the factors influencing the tritium concentration, and assess the accumulated effective dose to residents surrounding the plant. Methods: We collected samples at 16 points, including surface water, groundwater, drinking water and sea water within 30 km of the Haiyang nuclear power plant, in both the wet and dry periods. Sample pretreatment and preparation followed the methods recommended in national standard GB 12375-90. A low-background liquid scintillation spectrometer was used to measure the tritium concentration. Results: The average tritium concentration of the water samples was (0.62 ± 0.163) Bq·L-1, with a range from 0.27 Bq·L-1 to 0.93 Bq·L-1. The difference in tritium concentrations between the two periods, analyzed by the paired t test, was statistically significant (P-1, 0.008 μSv·a-1, 0.007 μSv·a-1, respectively. Conclusion: The activity concentration of tritium in the aquatic environment surrounding the Haiyang nuclear power plant was at a lower level than elsewhere; according to the limit regulated by the Basic Standards for Protection against Ionizing Radiation and for the Safety of Radiation Sources (GB 18871-2002) (2 mSv), the accumulated effective dose received by residents was at the background radiation level. (authors)

  13. Generation adequacy assessment for power systems with wind turbine and energy storage

    OpenAIRE

    Zhong, J.(Department of Physics, Oxford University, Oxford, United Kingdom); R Zheng

    2010-01-01

    Wind power has been considered an environmentally friendly electrical generation resource; however, high wind power penetration can lead to high risk levels in power system reliability. An energy storage system (ESS) is a promising means to smooth variations of wind power and improve system reliability. Simulation models for assessing the generation adequacy of power systems with a wind power generation system (WPGS) and ESS are presented in this paper. The impacts of different wind power p...

  14. Structural Performance Assessment Based on Statistical and Wavelet Analysis of Acceleration Measurements of a Building during an Earthquake

    OpenAIRE

    Mosbeh R. Kaloop; Jong Wan Hu; Mohamed A. Sayed; Jiyoung Seong

    2016-01-01

    This study introduces the analysis of a structural health monitoring (SHM) system based on acceleration measurements during an earthquake. The SHM system is applied to assess the performance of the administration building at the Seoul National University of Education, South Korea. Statistical and wavelet analysis methods are applied to investigate and assess the performance of the building during an earthquake shaking which took place on March 31, 2014. The results indicate that (...

  15. A statistical simulation model for field testing of non-target organisms in environmental risk assessment of genetically modified plants

    OpenAIRE

    Goedhart, P.W.; Voet, van der, E.; Baldacchino, F.; Arpaia, S.

    2014-01-01

    Genetic modification of plants may result in unintended effects that are potentially adverse to the environment. A comparative safety assessment is therefore required by authorities, such as the European Food Safety Authority, in which the genetically modified plant is compared with its conventional counterpart. Part of the environmental risk assessment is a comparative field experiment in which the effect on non-target organisms is compared. Statistical analysis of such trials come in...

  16. Comparison of Asian Aquaculture Products by Use of Statistically Supported Life Cycle Assessment.

    Science.gov (United States)

    Henriksson, Patrik J G; Rico, Andreu; Zhang, Wenbo; Ahmad-Al-Nahid, Sk; Newton, Richard; Phan, Lam T; Zhang, Zongfeng; Jaithiang, Jintana; Dao, Hai M; Phu, Tran M; Little, David C; Murray, Francis J; Satapornvanit, Kriengkrai; Liu, Liping; Liu, Qigen; Haque, M Mahfujul; Kruijssen, Froukje; de Snoo, Geert R; Heijungs, Reinout; van Bodegom, Peter M; Guinée, Jeroen B

    2015-12-15

    We investigated aquaculture production of Asian tiger shrimp, whiteleg shrimp, giant river prawn, tilapia, and pangasius catfish in Bangladesh, China, Thailand, and Vietnam by using life cycle assessments (LCAs), with the purpose of evaluating the comparative eco-efficiency of producing different aquatic food products. Our starting hypothesis was that different production systems are associated with significantly different environmental impacts, as the production of these aquatic species differs in intensity and management practices. In order to test this hypothesis, we estimated each system's global warming, eutrophication, and freshwater ecotoxicity impacts. The contribution to these impacts and the overall dispersions relative to results were propagated by Monte Carlo simulations and dependent sampling. Paired testing showed significant (p < 0.05) differences between the median impacts of most production systems in the intraspecies comparisons, even after a Bonferroni correction. For the full distributions instead of only the median, only for Asian tiger shrimp did more than 95% of the propagated Monte Carlo results favor certain farming systems. The major environmental hot-spots driving the differences in environmental performance among systems were fishmeal from mixed fisheries for global warming, pond runoff and sediment discards for eutrophication, and agricultural pesticides, metals, benzalkonium chloride, and other chlorine-releasing compounds for freshwater ecotoxicity. The Asian aquaculture industry should therefore strive toward farming systems relying upon pelleted species-specific feeds, where the fishmeal inclusion is limited and sourced sustainably. Also, excessive nutrients should be recycled in integrated organic agriculture together with efficient aeration solutions powered by renewable energy sources. PMID:26512735
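    The dependent-sampling idea used here can be sketched as paired Monte Carlo draws: both production systems see the same background draw in each iteration, so shared upstream uncertainty cancels out of the comparison, and the share of draws favoring one system can be read off directly. The distributions and median impacts below are invented for illustration, not the study's inventory data.

```python
import random

def share_favoring_a(impact_a, impact_b, n=20000, seed=7):
    """Dependent (paired) Monte Carlo sampling: both systems share the same
    background draw each iteration, so common uncertainty cancels in the
    comparison. Returns the share of draws in which system A is lower."""
    rng = random.Random(seed)
    favour = 0
    for _ in range(n):
        background = rng.gauss(1.0, 0.1)            # shared upstream factor
        a = impact_a * background * rng.lognormvariate(0.0, 0.3)
        b = impact_b * background * rng.lognormvariate(0.0, 0.3)
        if a < b:
            favour += 1
    return favour / n

# Invented median impacts for two farming systems (kg CO2-eq. per tonne)
print(share_favoring_a(1000.0, 1500.0))
```

    A share above 95% would correspond to the paper's criterion for one farming system being favored over the full propagated distributions.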

  18. Toward a No-Reference Image Quality Assessment Using Statistics of Perceptual Color Descriptors.

    Science.gov (United States)

    Lee, Dohyoung; Plataniotis, Konstantinos N

    2016-08-01

    Analysis of the statistical properties of natural images has played a vital role in the design of no-reference (NR) image quality assessment (IQA) techniques. In this paper, we propose parametric models describing the general characteristics of chromatic data in natural images. They provide informative cues for quantifying visual discomfort caused by the presence of chromatic image distortions. The established models capture the correlation of chromatic data between spatially adjacent pixels by means of color invariance descriptors. The use of color invariance descriptors is inspired by their relevance to visual perception, since they provide less sensitive descriptions of image scenes against viewing geometry and illumination variations than luminances. In order to approximate the visual quality perception of chromatic distortions, we devise four parametric models derived from invariance descriptors representing independent aspects of color perception: 1) hue; 2) saturation; 3) opponent angle; and 4) spherical angle. The practical utility of the proposed models is examined by deploying them in our new general-purpose NR IQA metric. The metric initially estimates the parameters of the proposed chromatic models from an input image to constitute a collection of quality-aware features (QAF). Thereafter, a machine learning technique is applied to predict visual quality given a set of extracted QAFs. Experimentation performed on large-scale image databases demonstrates that the proposed metric correlates well with the provided subjective ratings of image quality over commonly encountered achromatic and chromatic distortions, indicating that it can be deployed on a wide variety of color image processing problems as a generalized IQA solution. PMID:27305678

  19. Quantitative hazard assessment at Vulcano (Aeolian islands): integration of geology, event statistics and physical modelling

    Science.gov (United States)

    Dellino, Pierfrancesco; de Astis, Gianfilippo; La Volpe, Luigi; Mele, Daniela; Sulpizio, Roberto

    2010-05-01

    The analysis of stratigraphy and of particle features of the pyroclastic deposits allowed the reconstruction of the volcanic history of La Fossa di Vulcano. An eruptive scenario driven by superficial phreatomagmatic explosions emerged. A statistical analysis of the pyroclastic successions led to the definition of a repetitive sequence of dilute pyroclastic density currents as the most probable events at short term, followed by fallout of dense ballistic blocks. The scale of such events is related to the amount of magma involved in each explosion. Events involving a million cubic meters of magma are probable in view of what happened in the most recent eruptions. They led to the formation of hundreds-of-meters-thick dilute pyroclastic density currents, moving down the volcano slope at velocities exceeding 50 m/sec. The dispersion of density currents affected the whole Vulcano Porto area and the Vulcanello area, and also overrode the Fossa Caldera's rim, spreading over the Piano area. Similarly, older pyroclastic deposits erupted at different times (Piano Grotte dei Rossi formation, ~20-7.7 ka) from vents within La Fossa Caldera and before La Fossa Cone formation. They also were phreatomagmatic in origin and fed dilute pyroclastic density currents (PDC). They represent the eruptions with the highest magnitude on the island. Therefore, for hazard assessment, these deposits from La Fossa Cone and La Fossa Caldera were used to depict eruptive scenarios at short term and at long term. On the basis of physical models that make use of pyroclastic deposit particle features, the impact parameters for each scenario have been calculated: the dynamic pressure and particle volumetric concentration of density currents, and the impact energy of ballistic blocks. On this basis, a quantitative hazard map is presented, which could be of direct use for territory planning and for the calculation of the expected damage.

  20. Using Innovative Statistical Analyses to Assess Soil Degradation due to Land Use Change

    Science.gov (United States)

    Khaledian, Yones; Kiani, Farshad; Ebrahimi, Soheila; Brevik, Eric C.; Aitkenhead-Peterson, Jacqueline

    2016-04-01

    Soil erosion and overall loss of soil fertility is a serious issue for the loess soils of Golestan province, northern Iran. The assessment of soil degradation at large watershed scales is urgently required. This research investigated the role of land use change and its effect on soil degradation in cultivated, pasture and urban lands, when compared to native forest, in terms of declines in soil fertility. Several novel statistical methods, including partial least squares (PLS), principal component regression (PCR), and ordinary least squares regression (OLS), were used to predict soil cation-exchange capacity (CEC) from soil characteristics. PCA identified five primary components of soil quality. The PLS model was used to predict soil CEC from soil characteristics including bulk density (BD), electrical conductivity (EC), pH, calcium carbonate equivalent (CCE), soil particle density (DS), mean weight diameter (MWD), soil porosity (F), organic carbon (OC), labile carbon (LC), mineral carbon, saturation percentage (SP), soil particle size (clay, silt and sand), exchangeable cations (Ca2+, Mg2+, K+, Na+), and soil microbial respiration (SMR) collected in the Ziarat watershed. In order to evaluate the best fit, two other methods, PCR and OLS, were also examined. An exponential semivariogram using PLS predictions revealed stronger spatial dependence for CEC [r2 = 0.80, RMSE = 1.99] than the other methods, PCR [r2 = 0.84, RMSE = 2.45] and OLS [r2 = 0.84, RMSE = 2.45]. Therefore, the PLS method provided the best model for the data. In stepwise regression analysis, MWD and LC were selected as influential variables in all soils, whereas the other influential parameters differed between land uses. This study quantified reductions in numerous soil quality parameters resulting from extensive land-use changes and urbanization in the Ziarat watershed in northern Iran.

  1. Statistical decision theory and its application to PRA result evaluation for nuclear power plant designing process

    International Nuclear Information System (INIS)

    Decision theory is applied to derive the "α-th" percentile and the mean value decision rules, which have often been referenced in Probabilistic Risk Assessment (PRA) results. It is shown that the decision problem, with certain kinds of utility functions, yields the above decision rules as well as their criteria. Decision lines are developed as a function of the median and the uncertainty factor for an a-priori log-normal distribution, and are shown to be useful for a decision maker's immediate judgement based on the α-th percentile and mean value decision rules. Finally, the PWR and BWR release categories of WASH-1400 are evaluated by the developed decision lines with criteria assuming, as an example, 10^-4/reactor-year and 10^-3/reactor-year for the 95th percentile decision rule, and 10^-5/reactor-year and 10^-4/reactor-year, respectively.
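
    The decision quantities above can be sketched for a log-normal distribution described by its median and 95th-percentile uncertainty factor; the numbers below are illustrative, not WASH-1400 values:

```python
# Sketch: mean and 95th percentile of a log-normal release frequency
# given its median m and an uncertainty (error) factor EF = p95/median.
import math

def lognormal_summary(median, ef, z95=1.6449):
    sigma = math.log(ef) / z95            # log-scale standard deviation
    mean = median * math.exp(sigma**2 / 2)
    p95 = median * ef
    return mean, p95

# Hypothetical release category: median 1e-5/reactor-year, EF = 10
mean, p95 = lognormal_summary(median=1e-5, ef=10.0)
print(f"mean = {mean:.2e}/reactor-year, 95th percentile = {p95:.2e}")
```

    Note that for a wide uncertainty factor the mean sits well above the median, which is why the mean value and percentile decision rules can disagree on the same category.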

  2. Enhancing an Undergraduate Business Statistics Course: Linking Teaching and Learning with Assessment Issues

    Science.gov (United States)

    Fairfield-Sonn, James W.; Kolluri, Bharat; Rogers, Annette; Singamsetti, Rao

    2009-01-01

    This paper examines several ways in which teaching effectiveness and student learning in an undergraduate Business Statistics course can be enhanced. First, we review some key concepts in Business Statistics that are often challenging to teach and show how using real data sets assists students in developing a deeper understanding of the concepts.…

  3. Inferential, non-parametric statistics to assess the quality of probabilistic forecast systems

    NARCIS (Netherlands)

    Maia, A.H.N.; Meinke, H.B.; Lennox, S.; Stone, R.C.

    2007-01-01

    Many statistical forecast systems are available to interested users. To be useful for decision making, these systems must be based on evidence of underlying mechanisms. Once causal connections between the mechanism and its statistical manifestation have been firmly established, the forecasts must al

  4. Statistical Power Law due to Reservoir Fluctuations and the Universal Thermostat Independence Principle

    Directory of Open Access Journals (Sweden)

    Tamás Sándor Biró

    2014-12-01

    Full Text Available Certain fluctuations in particle number, n, at fixed total energy, E, lead exactly to a cut-power law distribution in the one-particle energy, ω, via the induced fluctuations in the phase-space volume ratio, Ω_n(E−ω)/Ω_n(E) = (1−ω/E)^n. The only parameters are 1/T = ⟨β⟩ = ⟨n⟩/E and q = 1 − 1/⟨n⟩ + Δn²/⟨n⟩². For the binomial distribution of n one obtains q = 1 − 1/k, for the negative binomial q = 1 + 1/(k+1). These results also represent an approximation for general particle number distributions in the reservoir up to second order in the canonical expansion ω ≪ E. For general systems the average phase-space volume ratio ⟨e^{S(E−ω)}/e^{S(E)}⟩ to second order delivers q = 1 − 1/C + Δβ²/⟨β⟩² with β = S′(E) and C = dE/dT the heat capacity. However, q ...
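
    A quick Monte Carlo check (not part of the paper) of the binomial case: q = 1 − 1/⟨n⟩ + Δn²/⟨n⟩² should reduce to 1 − 1/k regardless of the success probability p:

```python
# Numeric sketch: for reservoir particle number n ~ Binomial(k, p),
# q = 1 - 1/<n> + Var(n)/<n>^2 collapses to 1 - 1/k for any p.
import numpy as np

rng = np.random.default_rng(1)
k, p = 20, 0.6
n = rng.binomial(k, p, size=1_000_000)
q = 1 - 1 / n.mean() + n.var() / n.mean() ** 2
print(q, 1 - 1 / k)
```

    Algebraically: Var(n)/⟨n⟩² = (1−p)/(kp) and 1/⟨n⟩ = 1/(kp), so their difference is exactly −1/k.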

  5. Assessing Colour-dependent Occupation Statistics Inferred from Galaxy Group Catalogues

    CERN Document Server

    Campbell, Duncan; Hearin, Andrew; Padmanabhan, Nikhil; Berlind, Andreas; Mo, H J; Tinker, Jeremy; Yang, Xiaohu

    2015-01-01

    We investigate the ability of current implementations of galaxy group finders to recover colour-dependent halo occupation statistics. To test the fidelity of group catalogue inferred statistics, we run three different group finders used in the literature over a mock that includes galaxy colours in a realistic manner. Overall, the resulting mock group catalogues are remarkably similar, and most colour-dependent statistics are recovered with reasonable accuracy. However, it is also clear that certain systematic errors arise as a consequence of correlated errors in group membership determination, central/satellite designation, and halo mass assignment. We introduce a new statistic, the halo transition probability (HTP), which captures the combined impact of all these errors. As a rule of thumb, errors tend to equalize the properties of distinct galaxy populations (i.e. red vs. blue galaxies or centrals vs. satellites), and to result in inferred occupation statistics that are more accurate for red galaxies than f...

  6. Insulation Diagnosis of Service Aged XLPE Power Cables Using Statistical Analysis and Fuzzy Inference

    Institute of Scientific and Technical Information of China (English)

    LIU Fei; JIANG Pingkai; LEI Qingquan; ZHANG Li; SU Wenqun

    2013-01-01

    Cables that have been in service for over 20 years in Shanghai, a city with abundant surface water, have failed more frequently and induced various cable accidents. This necessitates research on the insulation aging state of cables working in special circumstances. We performed multi-parameter tests on samples from about 300 cable lines in Shanghai. The tests included water tree investigation, tensile testing, dielectric spectroscopy, thermogravimetric analysis (TGA), Fourier transform infrared spectroscopy (FTIR), and electrical aging tests. Then, we carried out regression analysis between every two test parameters. Moreover, through two-sample t-tests and analysis of variance (ANOVA) of each test parameter, we analyzed the influences of the cable-laying method and the sampling section on the degradation of cable insulation, respectively. Furthermore, the test parameters that showed strong correlation in the regression analysis or significant differences in the t-test or ANOVA analysis were determined to be the ones identifying the XLPE cable insulation aging state. Thresholds for distinguishing insulation aging states were also obtained with the aid of statistical analysis and fuzzy clustering. Based on fuzzy inference, we established a cable insulation aging diagnosis model using the intensity transfer method. The results of the regression analysis indicate that the degradation of cable insulation accelerates as the degree of in-service aging increases. This validates the rule that the increase of microscopic imperfections in solid material enhances the dielectric breakdown strength. The results of the two-sample t-test and the ANOVA indicate that direct-buried cables are more sensitive to insulation degradation than duct cables. This confirms that tensile strength and breakdown strength are reliable functional parameters in cable insulation evaluation. A case study further indicates that the proposed diagnosis model based on fuzzy inference can reflect the comprehensive

  7. ASSESSMENT OF THE DRUM REMAINING LIFETIME IN THERMAL POWER PLANT

    Directory of Open Access Journals (Sweden)

    Miroslav M Živković

    2010-01-01

    Full Text Available In this paper, an analysis of the stress and thermal-elastic-plastic strain of the drum is performed. The influence of modified thickness, yield stress and the finite element model of the welded joint between pipe and drum on the assessment of the remaining lifetime of the drum in the thermal power plant is analyzed. Two analyses are compared. In the first, the drum is modeled by shell and by 3D finite elements with the projected geometrical and material data of the drum. In the second, the drum is modeled by shell and by 3D finite elements with modified thickness and yield stress. The analyses show that detailed modeling of stress concentration zones is necessary. Adequate modeling gives lower maximal effective plastic strain and an increased number of cycles; in that case, 3D finite elements are better than shell finite elements.

  8. Assessment on thermoelectric power factor in silicon nanowire networks

    Energy Technology Data Exchange (ETDEWEB)

    Lohn, Andrew J.; Kobayashi, Nobuhiko P. [Baskin School of Engineering, University of California Santa Cruz, CA (United States); Nanostructured Energy Conversion Technology and Research (NECTAR), Advanced Studies Laboratories, University of California Santa Cruz, NASA Ames Research Center, Moffett Field, CA (United States); Coleman, Elane; Tompa, Gary S. [Structured Materials Industries, Inc., Piscataway, NJ (United States)

    2012-01-15

    Thermoelectric devices based on three-dimensional networks of highly interconnected silicon nanowires were fabricated, and the parameters that contribute to the power factor, namely the Seebeck coefficient and the electrical conductivity, were assessed. The large-area (2 cm x 2 cm) devices were fabricated at low cost utilizing a highly scalable process involving silicon nanowires grown on steel substrates. The temperature dependence of the Seebeck coefficient was found to be weak over the range of 20-80 °C, at approximately -400 µV/K for unintentionally doped devices and ±50 µV/K for p-type and n-type devices, respectively. (Copyright © 2012 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)
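
    For orientation, the power factor of the title combines the two assessed quantities as PF = S²σ; the conductivity value below is purely illustrative, since the abstract reports only Seebeck coefficients:

```python
# Sketch: thermoelectric power factor PF = S^2 * sigma.
# The conductivity is a hypothetical placeholder, not a measured value.
S = -400e-6      # Seebeck coefficient, V/K (from the abstract)
sigma = 1e4      # electrical conductivity, S/m (illustrative assumption)
pf = S ** 2 * sigma
print(f"power factor = {pf * 1e3:.2f} mW/(m*K^2)")
```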

  9. ASTRID power conversion system: Assessment on steam and gas options

    International Nuclear Information System (INIS)

    Conclusion: ◆ Two power conversion systems have been investigated for the ASTRID prototype. ◆ Steam PCS: • Most mature system based on a well-developed turbomachinery technology. • High plant efficiency. • Studies on steam generator designs and leak detection systems in progress with the aim of reducing the risk of large SWRs and of limiting their consequences. • Design and licensing safety assessment of an SFR must deal with the Sodium Water Air Reaction (SWAR). ◆ Gas PCS: • Strong advantage as it inherently eliminates the SWR and SWAR risks. • Very innovative option: major breakthroughs, but feasibility and viability not yet demonstrated. • Remaining technological challenges but no showstopper identified. • General architecture: investigations in progress to improve performance, operability and maintainability

  10. Assessment of statistical procedures used in papers in the Australian Veterinary Journal.

    Science.gov (United States)

    McCance, I

    1995-09-01

    One hundred and thirty-three papers (80 Original Articles and 53 Short Contributions) of 279 papers in 23 consecutive issues of the Australian Veterinary Journal were examined for their statistical content. Only 38 (29%) would have been acceptable to a statistical referee without revision; revision would have been indicated in 88 (66%), and the remaining 7 (5%) had major flaws. Weaknesses in design were found in 40 (30%), chiefly with respect to randomisation and to the size of the experiment. Deficiencies in analysis in 60 (45%) lay in methods, application and calculation, and in the failure to use appropriate methods for multiple comparisons and repeated measures. Problems were detected in the presentation of 44 (33%) of papers, the main ones being insufficient information about the data or their statistical analysis, and statistics either missing where appropriate or shown where inappropriate. Conclusions were considered to be inconsistent with the analysis in 35 (26%) of papers, due mainly to their interpretation of the results of significance testing. It is suggested that statistical refereeing, the publication of statistical guidelines for authors and statistical advice to Animal Experimentation Ethics Committees could all play a part in achieving improvement. PMID:8585846

  11. Sensitivity and uncertainty analyses in external cost assessments of fusion power

    Energy Technology Data Exchange (ETDEWEB)

    Aquilonius, K. E-mail: karin.aquilonius@studsvik.se; Hallberg, B.; Hofman, D.; Bergstroem, U.; Lechon, Y.; Cabal, H.; Saez, R.M.; Schneider, T.; Lepicard, S.; Ward, D.; Hamacher, T.; Korhonen, R

    2001-11-01

    Analyses of the sensitivity and uncertainty of assessment models for the external costs (the monetarization of environmental impacts) of a commercial fusion plant were performed. The assessments covered the plant's entire life cycle and adopted the ExternE methodology, which had been used to calculate external costs from other energy sources. Based on the SEAFP study, three different power plant designs were considered. The method developed in ExternE to estimate uncertainty gave very large ranges, so a statistical error propagation method was employed for this study. Rather than as single values, model input parameters were given as distributions, from which random input sets of data were constructed. The models were then run with these sets, and the ensemble of output results was analysed statistically, yielding estimates of the uncertainty due to variation of the model parameters. More information on parameter variation is needed for a more realistic estimation of model uncertainty, though. Sensitivity analyses were performed by varying all input parameters in a similar fashion: all model parameters were assumed to have a Gaussian distribution with a standard deviation of 10% of the mean value. The results pointed out the most essential parameters of the models. The sensitivity analyses are also useful for estimating the most effective ways to reduce the computed external costs.
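
    The error-propagation scheme described can be sketched as follows; the three-parameter "model" is a hypothetical stand-in for the ExternE cost chain, not the study's actual model:

```python
# Sketch: Monte Carlo error propagation. Inputs are drawn from Gaussian
# distributions with 10% relative standard deviation, the model is run
# on each draw, and the spread of the output is summarized.
import numpy as np

def external_cost(emission, dose_factor, cost_per_dose):
    # Toy stand-in model: cost proportional to emission * dose * unit cost.
    return emission * dose_factor * cost_per_dose

rng = np.random.default_rng(0)
nominal = {"emission": 100.0, "dose_factor": 0.02, "cost_per_dose": 50.0}
draws = {k: rng.normal(v, 0.10 * v, 10_000) for k, v in nominal.items()}
costs = external_cost(**draws)

print(f"mean = {costs.mean():.1f}, rel. std = {costs.std() / costs.mean():.2%}")
```

    For a product of three independent 10% inputs the output spread is roughly sqrt(3) × 10% ≈ 17%, which the simulation reproduces.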

  12. THE APPLICATION OF STATISTICAL PARAMETERS OF PHASE RESOLVED PD DISTRIBUTION TO AGING EXTENT ASSESSMENT OF LARGE GENERATOR INSULATION

    Institute of Scientific and Technical Information of China (English)

    谢恒堃; 乐波; 孙翔; 宋建成

    2003-01-01

    Objective To investigate the characteristic parameters employed to describe the aging extent of the stator insulation of large generators and to study the aging laws. Methods Multi-stress aging tests of model generator stator bar specimens were performed, and PD measurements were conducted at different aging stages using a digital PD detector with a frequency range from 40 kHz to 400 kHz. Results From the test results of the model specimens it was found that the skewness of the phase resolved PD distribution may be taken as a characterization parameter for aging extent assessment of generator insulation. Furthermore, the measurement results of actual generator stator bars showed that the method based on statistical parameters of PD distributions is promising for aging extent assessment and residual lifetime estimation of large generator insulation. Conclusion Statistical parameters of the phase resolved PD distribution were proposed for aging extent assessment of large generator insulation.
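
    A minimal sketch of the proposed indicator, the skewness of a phase-resolved PD distribution, on synthetic pulse-phase data (not the paper's measurements):

```python
# Sketch: sample skewness as an aging indicator. A symmetric phase
# distribution has skewness near 0; a right-skewed one (standing in for
# an aged insulation pattern) has clearly positive skewness.
import numpy as np

def skewness(x):
    x = np.asarray(x, dtype=float)
    m, s = x.mean(), x.std()
    return ((x - m) ** 3).mean() / s ** 3

rng = np.random.default_rng(0)
sym = rng.normal(90, 15, 5000)             # symmetric phase pattern (deg)
aged = rng.gamma(2.0, 20.0, 5000) + 40     # right-skewed pattern (deg)
print(skewness(sym), skewness(aged))
```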

  13. Regional wind energy assessment program progress report, October 1980-September 1981. Appendix. Wind statistics summaries

    Energy Technology Data Exchange (ETDEWEB)

    Baker, R W; Wade, J E; Persson, P O.G.; Armstrong, B

    1981-12-01

    The wind statistics summarized include monthly wind speed and spectrum analyzer summaries, diurnal wind speed tables, high wind summaries (greater than or equal to 50 mph), wind rose tables, and wind speed and direction frequency distributions. (LEW)

  14. Scan statistic tail probability assessment based on process covariance and window size

    OpenAIRE

    Reiner-Benaim, Anat

    2013-01-01

    A scan statistic is examined for the purpose of testing the existence of a global peak in a random process with dependent variables of any distribution. The scan statistic tail probability is obtained based on the covariance of the moving sums process, thereby accounting for the spatial nature of the data as well as the size of the searching window. Exact formulas linking this covariance to the window size and the correlation coefficient are developed under general, common and auto covariance...
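
    The moving-sums construction can be sketched as follows; here the tail probability is approximated by simulation rather than by the paper's covariance-based formula, and all data are synthetic:

```python
# Sketch: a scan statistic as the maximum moving sum over a process,
# with a simulated null tail probability for an injected peak.
import numpy as np

def scan_stat(x, w):
    # Maximum sum over all windows of length w, via cumulative sums.
    c = np.concatenate(([0.0], np.cumsum(x)))
    return (c[w:] - c[:-w]).max()

rng = np.random.default_rng(0)
n, w = 500, 10
obs = rng.normal(0, 1, n)
obs[200:210] += 2.0                      # injected local peak
s_obs = scan_stat(obs, w)

null = [scan_stat(rng.normal(0, 1, n), w) for _ in range(2000)]
p = np.mean([s >= s_obs for s in null])
print(f"scan statistic = {s_obs:.2f}, simulated p = {p:.3f}")
```

    The paper's contribution is to replace this brute-force null simulation with an analytic tail bound built from the covariance of the moving-sums process and the window size.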

  15. OPR-PPR, a Computer Program for Assessing Data Importance to Model Predictions Using Linear Statistics

    Science.gov (United States)

    Tonkin, Matthew J.; Tiedeman, Claire R.; Ely, D. Matthew; Hill, Mary C.

    2007-01-01

    The OPR-PPR program calculates the Observation-Prediction (OPR) and Parameter-Prediction (PPR) statistics that can be used to evaluate the relative importance of various kinds of data to simulated predictions. The data considered fall into three categories: (1) existing observations, (2) potential observations, and (3) potential information about parameters. The first two are addressed by the OPR statistic; the third is addressed by the PPR statistic. The statistics are based on linear theory and measure the leverage of the data, which depends on the location, the type, and possibly the time of the data being considered. For example, in a ground-water system the type of data might be a head measurement at a particular location and time. As a measure of leverage, the statistics do not take into account the value of the measurement. As linear measures, the OPR and PPR statistics require minimal computational effort once sensitivities have been calculated. Sensitivities need to be calculated for only one set of parameter values; commonly these are the values estimated through model calibration. OPR-PPR can calculate the OPR and PPR statistics for any mathematical model that produces the necessary OPR-PPR input files. In this report, OPR-PPR capabilities are presented in the context of using the ground-water model MODFLOW-2000 and the universal inverse program UCODE_2005. The method used to calculate the OPR and PPR statistics is based on the linear equation for prediction standard deviation. 
Using sensitivities and other information, OPR-PPR calculates (a) the percent increase in the prediction standard deviation that results when one or more existing observations are omitted from the calibration data set; (b) the percent decrease in the prediction standard deviation that results when one or more potential observations are added to the calibration data set; or (c) the percent decrease in the prediction standard deviation that results when potential information on one

  16. Assessment of rockfall susceptibility by integrating statistical and physically-based approaches

    Science.gov (United States)

    Frattini, Paolo; Crosta, Giovanni; Carrara, Alberto; Agliardi, Federico

    In Val di Fassa (Dolomites, Eastern Italian Alps) rockfalls constitute the most significant gravity-induced natural disaster that threatens both the inhabitants of the valley, who are few, and the thousands of tourists who populate the area in summer and winter. To assess rockfall susceptibility, we developed an integrated statistical and physically-based approach that aimed to predict both the susceptibility to onset and the probability that rockfalls will attain specific reaches. Through field checks and multi-temporal aerial photo-interpretation, we prepared a detailed inventory of both rockfall source areas and associated scree-slope deposits. Using an innovative technique based on GIS tools and a 3D rockfall simulation code, grid cells pertaining to the rockfall source-area polygons were classified as active or inactive, based on the state of activity of the associated scree-slope deposits. The simulation code allows one to link each source grid cell with scree deposit polygons by calculating the trajectory of each simulated launch of blocks. By means of discriminant analysis, we then identified the mix of environmental variables that best identifies grid cells with low or high susceptibility to rockfalls. Among these variables, structural setting, land use, and morphology were the most important factors that led to the initiation of rockfalls. We developed 3D simulation models of the runout distance, intensity and frequency of rockfalls, whose source grid cells corresponded either to the geomorphologically-defined source polygons (geomorphological scenario) or to study area grid cells with slope angle greater than an empirically-defined value of 37° (empirical scenario). For each scenario, we assigned to the source grid cells either a fixed or a variable onset susceptibility; the latter was derived from the discriminant model group (active/inactive) membership probabilities. Comparison of these four models indicates that the geomorphological scenario with

  17. A Fractional Lower Order Statistics-Based MIMO Detection Method in Impulse Noise for Power Line Channel

    Directory of Open Access Journals (Sweden)

    CHEN, Z.

    2014-11-01

    Full Text Available Impulse noise in power line communication (PLC channel seriously degrades the performance of Multiple-Input Multiple-Output (MIMO system. To remedy this problem, a MIMO detection method based on fractional lower order statistics (FLOS for PLC channel with impulse noise is proposed in this paper. The alpha stable distribution is used to model impulse noise, and FLOS is applied to construct the criteria of MIMO detection. Then the optimal detection solution is obtained by recursive least squares algorithm. Finally, the transmitted signals in PLC MIMO system are restored with the obtained detection matrix. The proposed method does not require channel estimation and has low computational complexity. The simulation results show that the proposed method has a better PLC MIMO detection performance than the existing ones under impulsive noise environment.

  18. Assessing segmentation processes by click detection: online measure of statistical learning, or simple interference?

    Science.gov (United States)

    Franco, Ana; Gaillard, Vinciane; Cleeremans, Axel; Destrebecqz, Arnaud

    2015-12-01

    Statistical learning can be used to extract the words from continuous speech. Gómez, Bion, and Mehler (Language and Cognitive Processes, 26, 212-223, 2011) proposed an online measure of statistical learning: They superimposed auditory clicks on a continuous artificial speech stream made up of a random succession of trisyllabic nonwords. Participants were instructed to detect these clicks, which could be located either within or between words. The results showed that, over the length of exposure, reaction times (RTs) increased more for within-word than for between-word clicks. This result has been accounted for by means of statistical learning of the between-word boundaries. However, even though statistical learning occurs without an intention to learn, it nevertheless requires attentional resources. Therefore, this process could be affected by a concurrent task such as click detection. In the present study, we evaluated the extent to which the click detection task indeed reflects successful statistical learning. Our results suggest that the emergence of RT differences between within- and between-word click detection is neither systematic nor related to the successful segmentation of the artificial language. Therefore, instead of being an online measure of learning, the click detection task seems to interfere with the extraction of statistical regularities.

  19. A Meta-Meta-Analysis: Empirical Review of Statistical Power, Type I Error Rates, Effect Sizes, and Model Selection of Meta-Analyses Published in Psychology

    Science.gov (United States)

    Cafri, Guy; Kromrey, Jeffrey D.; Brannick, Michael T.

    2010-01-01

    This article uses meta-analyses published in "Psychological Bulletin" from 1995 to 2005 to describe meta-analyses in psychology, including examination of statistical power, Type I errors resulting from multiple comparisons, and model choice. Retrospective power estimates indicated that univariate categorical and continuous moderators, individual…

  20. Combining Statistical Tools and Ecological Assessments in the Study of Biodeterioration Patterns of Stone Temples in Angkor (Cambodia)

    Science.gov (United States)

    Caneva, G.; Bartoli, F.; Savo, V.; Futagami, Y.; Strona, G.

    2016-01-01

    Biodeterioration is a major problem for the conservation of cultural heritage materials. We provide a new and original approach to analyzing changes in patterns of colonization (Biodeterioration patterns, BPs) by biological agents responsible for the deterioration of outdoor stone materials. Here we analyzed BPs of four Khmer temples in Angkor (Cambodia) exposed to variable environmental conditions, using qualitative ecological assessments and statistical approaches. The statistical analyses supported the findings obtained with the qualitative approach. Both approaches provided additional information not otherwise available using one single method. Our results indicate that studies on biodeterioration can benefit from integrating diverse methods so that conservation efforts might become more precise and effective. PMID:27597658

  2. Animal-powered tillage erosion assessment in the southern Andes region of Ecuador

    Science.gov (United States)

    Dercon, G.; Govers, G.; Poesen, J.; Sánchez, H.; Rombaut, K.; Vandenbroeck, E.; Loaiza, G.; Deckers, J.

    2007-06-01

    While water erosion has been the focus of past research in the Andes, earlier studies show that soil erosion can also be related to the methods used in cultivating the fields. The main objective of the present study was to assess (i) tillage erosion caused by the traditional animal-powered "yunta" or ard plough in the Andes and the factors controlling the process, and (ii) the implications for soil conservation. Erosion rates were experimentally measured at 27 sites, with slopes from ca. 0% to 60% and soils ranging from Andosols to Cambisols, in the Andes region of Ecuador (Gima, Azuay). Different tillage methods were assessed: (i) tillage parallel to the contour lines ('Paralelo') and (ii) tillage at an angle to the contour lines. Statistical analysis shows that erosion caused by animal-powered tillage is gravity-driven. A strong correlation exists between slope and downslope displacement; furthermore, tillage depth and initial soil condition are important. For the 'Paralelo' tillage method the tillage transportation coefficient (k) is below 100 kg m^-1 per tillage pass; for the combined 'Arado'-'Cruzado' tillage method k may exceed 300 kg m^-1. Tillage erosion is responsible for the reduction of the slope between the contour strips over a relatively short time period of 20 years, resulting in the formation of terraces and therefore a reduction of the water erosion risk. However, at the same time it may negatively affect soil quality.
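
    Assuming the usual linear flux model Q = k·S (soil flux proportional to slope gradient), k can be estimated by least squares from per-site measurements; the data below are hypothetical, chosen only to land near the reported order of magnitude:

```python
# Sketch: estimating a tillage transportation coefficient k from
# measured per-pass soil flux vs. slope gradient (data hypothetical).
import numpy as np

slope = np.array([0.05, 0.10, 0.20, 0.30, 0.45, 0.60])   # slope tangent
flux = np.array([6.0, 11.0, 19.0, 31.0, 44.0, 63.0])     # kg/m per pass

# Least squares through the origin: Q = k * S
k, *_ = np.linalg.lstsq(slope[:, None], flux, rcond=None)
print(f"k ≈ {k[0]:.0f} kg m^-1 per tillage pass")
```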

  3. A statistical assessment of pesticide pollution in surface waters using environmental monitoring data: Chlorpyrifos in Central Valley, California.

    Science.gov (United States)

    Wang, Dan; Singhasemanon, Nan; Goh, Kean S

    2016-11-15

    Pesticides are routinely monitored in surface waters, and the resultant data are analyzed to assess whether their uses will damage aquatic ecosystems. However, the utility of the monitoring data is limited by insufficient temporal and spatial sampling coverage and by the inability to detect and quantify trace concentrations. This study developed a novel assessment procedure that addresses those limitations by combining 1) statistical methods capable of extracting information from concentrations below changing detection limits, 2) statistical resampling techniques that account for uncertainties rooted in the non-detects and insufficient/irregular sampling coverage, and 3) multiple lines of evidence that improve confidence in the final conclusion. This procedure was demonstrated by an assessment of chlorpyrifos monitoring data in surface waters of California's Central Valley (2005-2013). We detected a significant downward trend in the concentrations, which cannot be observed by commonly used statistical approaches. We assessed the aquatic risk as low using a probabilistic method that works with non-detects and can differentiate indicator groups with varying sensitivity. In addition, we showed that the frequency of exceedance of ambient aquatic life water quality criteria was affected by pesticide use, precipitation and irrigation demand in certain periods antecedent to the water sampling events.
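
    One standard way to extract information from concentrations below a detection limit, in the spirit of the methods described, is a censored maximum-likelihood fit of a lognormal distribution; the sketch below uses simulated data, not the Central Valley record:

```python
# Sketch: lognormal MLE with left-censored observations (non-detects).
# Detected values contribute a density term; non-detects contribute the
# probability mass below the detection limit. Data are simulated.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
true_mu, true_sigma = np.log(0.02), 0.8        # log-scale parameters
x = rng.lognormal(true_mu, true_sigma, 400)    # "true" concentrations, ug/L
dl = 0.015                                     # detection limit, ug/L
detected = x[x >= dl]
n_nd = int((x < dl).sum())                     # number of non-detects

def nll(theta):
    mu, log_s = theta
    s = np.exp(log_s)                          # keep sigma positive
    ll = norm.logpdf(np.log(detected), mu, s).sum()
    ll += n_nd * norm.logcdf((np.log(dl) - mu) / s)
    return -ll

res = minimize(nll, x0=[np.log(np.median(detected)), 0.0])
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(mu_hat, sigma_hat)
```

    Unlike substituting DL/2 for non-detects, this uses the censoring information itself and remains valid when detection limits change over time.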

  4. Discrimination power of short-term heart rate variability measures for CHF assessment.

    Science.gov (United States)

    Pecchia, Leandro; Melillo, Paolo; Sansone, Mario; Bracale, Marcello

    2011-01-01

    In this study, we investigated the discrimination power of short-term heart rate variability (HRV) for discriminating normal subjects versus chronic heart failure (CHF) patients. We analyzed 1914.40 h of ECG of 83 patients of which 54 are normal and 29 are suffering from CHF with New York Heart Association (NYHA) classification I, II, and III, extracted by public databases. Following guidelines, we performed time and frequency analysis in order to measure HRV features. To assess the discrimination power of HRV features, we designed a classifier based on the classification and regression tree (CART) method, which is a nonparametric statistical technique, strongly effective on nonnormal medical data mining. The best subset of features for subject classification includes square root of the mean of the sum of the squares of differences between adjacent NN intervals (RMSSD), total power, high-frequencies power, and the ratio between low- and high-frequencies power (LF/HF). The classifier we developed achieved sensitivity and specificity values of 79.3 % and 100 %, respectively. Moreover, we demonstrated that it is possible to achieve sensitivity and specificity of 89.7 % and 100 %, respectively, by introducing two nonstandard features ΔAVNN and ΔLF/HF, which account, respectively, for variation over the 24 h of the average of consecutive normal intervals (AVNN) and LF/HF. Our results are comparable with other similar studies, but the method we used is particularly valuable because it allows a fully human-understandable description of classification procedures, in terms of intelligible "if … then …" rules. PMID:21075731
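
    Two of the HRV features named, RMSSD and LF/HF, can be computed from an RR-interval series as follows (synthetic tachogram and a simplified spectral estimate, not the study's pipeline):

```python
# Sketch: RMSSD and LF/HF from RR intervals. The tachogram is synthetic:
# a 0.8 s base interval with 0.25 Hz (respiratory-band) modulation.
import numpy as np

def rmssd(rr):
    d = np.diff(rr)
    return np.sqrt(np.mean(d ** 2))

def lf_hf_ratio(rr, fs=4.0):
    # Simplified spectral estimate: evenly re-interpolate the tachogram,
    # remove the mean, and sum periodogram power in the LF and HF bands.
    t = np.cumsum(rr)
    ti = np.arange(t[0], t[-1], 1 / fs)
    x = np.interp(ti, t, rr)
    x = x - x.mean()
    f = np.fft.rfftfreq(len(x), 1 / fs)
    p = np.abs(np.fft.rfft(x)) ** 2
    lf = p[(f >= 0.04) & (f < 0.15)].sum()
    hf = p[(f >= 0.15) & (f < 0.40)].sum()
    return lf / hf

rng = np.random.default_rng(0)
beats = np.cumsum(np.full(600, 0.8))
rr = (0.8 + 0.05 * np.sin(2 * np.pi * 0.25 * beats)
      + rng.normal(0, 0.01, 600))
print(rmssd(rr), lf_hf_ratio(rr))
```

    Because the injected modulation sits in the HF band, LF/HF comes out well below 1 here; depressed HF power in CHF patients pushes this ratio the other way.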

  5. Direct integration of intensity-level data from Affymetrix and Illumina microarrays improves statistical power for robust reanalysis

    Directory of Open Access Journals (Sweden)

    Turnbull Arran K

    2012-08-01

    Full Text Available Abstract Background Affymetrix GeneChips and Illumina BeadArrays are the most widely used commercial single-channel gene expression microarrays. Public data repositories are an extremely valuable resource, providing array-derived gene expression measurements from many thousands of experiments. Unfortunately, many of these studies are underpowered, and it is desirable to improve power by combining data from more than one study; we sought to determine whether platform-specific bias precludes direct integration of probe intensity signals for combined reanalysis. Results Using Affymetrix and Illumina data from the microarray quality control project, from our own clinical samples, and from additional publicly available datasets, we evaluated several approaches to directly integrate intensity-level expression data from the two platforms. After mapping probe sequences to Ensembl genes, we demonstrate that ComBat and cross-platform normalisation (XPN) significantly outperform mean-centering and distance-weighted discrimination (DWD) in terms of minimising inter-platform variance. In particular, we observed that DWD, a popular method used in a number of previous studies, removed systematic bias at the expense of genuine biological variability, potentially reducing legitimate biological differences from integrated datasets. Conclusion Normalised and batch-corrected intensity-level data from Affymetrix and Illumina microarrays can be directly combined to generate biologically meaningful results with improved statistical power for robust, integrated reanalysis.
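
    The simplest of the compared approaches, per-gene mean-centering within each platform, can be sketched on toy matrices; ComBat and XPN, which the paper found superior, additionally model batch variances and are not reproduced here:

```python
# Sketch: per-gene mean-centering within each platform before merging
# two expression matrices (genes x samples). Data are synthetic toys.
import numpy as np

rng = np.random.default_rng(0)
genes, n1, n2 = 100, 20, 30
bio = rng.normal(0, 1, (genes, n1 + n2))                 # shared biology
affy = bio[:, :n1] + rng.normal(2.0, 0.1, (genes, 1))    # platform offset
illu = bio[:, n1:] - rng.normal(1.0, 0.1, (genes, 1))    # different offset

center = lambda m: m - m.mean(axis=1, keepdims=True)
combined = np.hstack([center(affy), center(illu)])

# After centering, the per-gene platform offsets are gone:
gap = abs(combined[:, :n1].mean() - combined[:, n1:].mean())
print(f"between-platform mean gap = {gap:.3e}")
```

    Mean-centering removes additive platform offsets exactly but leaves scale differences untouched, which is one reason variance-modelling methods fared better in the paper's comparison.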

  6. Statistical Analysis of Wind Power Density Based on the Weibull and Rayleigh Models of Selected Site in Malaysia

    Directory of Open Access Journals (Sweden)

    Aliashim Albani

    2014-02-01

    Full Text Available The demand for electricity in Malaysia is growing in tandem with its Gross Domestic Product (GDP) growth. Malaysia is going to need even more energy as it strives to grow towards a high-income economy. Malaysia has taken steps towards exploring renewable energy (RE), including wind energy, as an alternative source for generating electricity. In the present study, the wind energy potential of selected sites is statistically analyzed based on one year of measured hourly time-series wind speed data. Wind data were obtained from the Malaysian Meteorological Department (MMD) weather stations at nine selected sites in Malaysia. The data were processed in MATLAB to determine and generate the Weibull and Rayleigh distribution functions. Both the Weibull and Rayleigh models were fitted and compared to the probability distributions of the field data for the year 2011. The analysis showed that the Weibull distribution fits the field data better than the Rayleigh distribution for the whole of 2011. The wind power density of every site was then studied based on the Weibull and Rayleigh functions. The Weibull distribution shows a good approximation for the estimation of wind power density in Malaysia.
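    The Weibull-based wind power density calculation described above can be sketched with a method-of-moments fit. The shape and scale values used to simulate the wind record are invented for illustration; the study fitted one year of MMD station data in MATLAB.

```python
import numpy as np
from math import gamma

rng = np.random.default_rng(2)
rho = 1.225  # air density, kg/m^3

# Simulated hourly wind speeds (m/s) for one year; k_true and c_true are
# illustrative assumptions, not values from the study.
k_true, c_true = 2.1, 6.0
v = c_true * rng.weibull(k_true, size=8760)

# Method-of-moments Weibull fit (Justus approximation): shape k from the
# coefficient of variation, scale c from the mean.
cv = v.std() / v.mean()
k = cv ** -1.086
c = v.mean() / gamma(1 + 1 / k)

# Mean wind power density, W/m^2: 0.5 * rho * E[v^3], where
# E[v^3] = c^3 * Gamma(1 + 3/k) for a Weibull distribution.
wpd = 0.5 * rho * c ** 3 * gamma(1 + 3 / k)
print(f"k = {k:.2f}, c = {c:.2f} m/s, power density = {wpd:.0f} W/m^2")
```

Setting k = 2 in the same formulas gives the Rayleigh special case, which is how the two models compared in the record are related.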

  7. Application of statistical methods (SPC) for an optimized control of the irradiation process of high-power semiconductors

    Science.gov (United States)

    Mittendorfer, J.; Zwanziger, P.

    2000-03-01

    High-power bipolar semiconductor devices (thyristors and diodes) in a disc-type shape are key components (semiconductor switches) for high-power electronic systems. These systems are important for the economic design of energy transmission systems, i.e. high-power drive systems, static compensation and high-voltage DC transmission lines. In their factory located in Pretzfeld, Germany, the company eupec GmbH+Co.KG (eupec) produces disc-type devices with ceramic encapsulation in the high-end range for the world market. These elements have to fulfil special customer requirements and therefore deliver tailor-made trade-offs between their on-state voltage and dynamic switching behaviour. This can be achieved by applying a dedicated electron irradiation to the semiconductor pellets, which tunes the trade-off. In this paper, the requirements placed on the irradiation company, Mediscan GmbH, are described from the point of view of the semiconductor manufacturer. The current strategy for controlling the irradiation results to fulfil these requirements is presented, together with the choice of relevant parameters from the viewpoint of the irradiation company. The set of process parameters monitored using statistical process control (SPC) techniques includes beam current and energy, conveyor speed and irradiation geometry. The results are highlighted and show the successful co-operation in this business. Viewing the process from the opposite direction, an idea is also presented and discussed for developing a highly sensitive dose-detection device from modified diodes, which could serve as accurate yet cheap and easy-to-use routine dosimeters for irradiation institutes.
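    The SPC monitoring described above can be illustrated with a Shewhart individuals chart, the most basic control-chart form. The readings and limits below are hypothetical; they are not eupec or Mediscan process data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical per-batch process readings (e.g. a beam-current check per
# irradiation batch), with one deliberately out-of-control batch.
readings = rng.normal(10.0, 0.2, 30)
readings[24] = 11.5

# Shewhart individuals chart: centre line +/- 3 sigma control limits
# estimated from the data.
mean, sigma = readings.mean(), readings.std(ddof=1)
ucl, lcl = mean + 3 * sigma, mean - 3 * sigma
out = np.where((readings > ucl) | (readings < lcl))[0]
print("out-of-control batches:", out.tolist())
```

In production use, the limits would be fixed from an in-control reference period (with sigma often estimated from moving ranges) rather than from the monitored data themselves.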

  8. 75 FR 2164 - Entergy Nuclear Operations, Inc.; Pilgrim Nuclear Power Station; Environmental Assessment and...

    Science.gov (United States)

    2010-01-14

    ... From the Federal Register Online via the Government Publishing Office NUCLEAR REGULATORY COMMISSION Entergy Nuclear Operations, Inc.; Pilgrim Nuclear Power Station; Environmental Assessment and...), for operation of Pilgrim Nuclear Power Station (Pilgrim), located in Plymouth County, MA....

  9. The assessment of the environmental external costs of power plants for both coal-fired plant and nuclear power plant

    International Nuclear Information System (INIS)

    Efforts were made to assess the environmental external costs of power plants for both the Samchonpo coal-fired plant and the Younggwang nuclear power plant, using a computer program developed by the IAEA. In the case where emission control devices such as a precipitator for particulate reduction, a wet scrubber for SO2, and a low-NOx burner for NOx were installed in the coal-fired power plant, the total environmental external cost was estimated as 33.97 Won/kWh, much higher than the 0.76 Won/kWh of the Younggwang nuclear power plant. This study also assessed and compared the environmental external costs if the Younggwang nuclear power plant were replaced by a coal-fired power plant at the same site and with the same capacity. According to the results, the total environmental external cost of the coal-fired power plant, with the emission control devices installed, was estimated as 792 million US$, about 50 times higher than the 15 million US$ of the Younggwang nuclear power plant. Although the results of this study have some limits due to the use of a simplified model, it remains true that nuclear power is a much more environmentally friendly power source than coal-fired power.

  10. Assessing the decennial, reassessing the global: Understanding European Union normative power in global politics

    OpenAIRE

    Manners, Ian James

    2013-01-01

    This concluding article assesses the past decade of international scholarship on the European Union (EU) and normative power as represented by the contributions to the special issue. It argues that the normative power approach (NPA) makes it possible to explain, understand and judge the EU in global politics by rethinking the nature of power and actorness in a globalizing, multilateralizing and multipolarizing era. To do this, the article assesses the past decade in terms of normative power e...

  11. Wind power in Eritrea, Africa: A preliminary resource assessment

    Energy Technology Data Exchange (ETDEWEB)

    Garbesi, K.; Rosen, K. [San Jose State Univ., CA (United States); Van Buskirk, R. [Dept. of Energy, Eritrea (Ethiopia)

    1997-12-31

    The authors' preliminary assessment of Eritrean wind energy potential identified two promising regions: (1) the southeastern Red Sea coast and (2) the mountain passes that channel winds between the coastal lowlands and the interior highlands. The coastal site, near the port city of Aseb, has an exceptionally good resource, with estimated average annual wind speeds at 10-m height above 9 m/s at the airport and 7 m/s in the port. Furthermore, the southern 200 km of coastline has offshore average annual wind speeds (WS_aa) > 6 m/s. This area has strong potential for development, having a local 20 MW grid and unmet demand from the fishing industry and development. Although the highland sites contain only marginal wind resources (~5 m/s), they warrant further investigation because of their proximity to the capital city, Asmera, which has the largest unmet demand and a larger power grid (40 MW, with an additional 80 MW planned) to absorb an intermittent source without storage.

  12. Steam generator assessment for sustainable power plant operation

    International Nuclear Information System (INIS)

    Water and steam serve in the water-steam cycle as the energy transport and work media. These fluids shall not disturb plant operation through corrosion processes on the construction materials and their consequences. The main objectives of the steam-water cycle chemistry consequently are: - The metal release rates of the structural materials shall be minimal. - The probability of selective/localized forms of corrosion shall be minimal. - The deposition of corrosion products on heat transfer surfaces shall be minimized. - The formation of aggressive media, particularly local aggressive environments under deposits, shall be avoided. These objectives are especially important for the steam generators (SGs), because their condition is a key factor for plant performance, high plant availability and lifetime extension, and is important to NPP safety. The major obstacles are corrosion and fouling of the heating tubes. Effective ways of counteracting all degradation problems, and thus of improving SG performance, are to keep SGs in clean condition or, if necessary, to plan cleaning measures such as mechanical tube-sheet lancing or chemical cleaning. Based on more than 40 years of experience in steam-water cycle chemistry treatment, AREVA developed an overall methodology for assessing the steam generator cleanliness condition by evaluating all available operational and inspection data together. In order to gain a complete picture, all relevant water chemistry data (e.g. corrosion product mass balances, impurity ingress), inspection data (e.g. visual inspections and tube-sheet lancing results) and thermal performance data (e.g. heat transfer calculations) are evaluated, structured and indexed using the AREVA Fouling Index Tool Box. This Fouling Index Tool Box is more than a database or statistical approach for the assessment of plant chemistry data. Furthermore, AREVA's approach combines the manufacturer's experience with plant data and operates with an

  13. Transient Stability Assessment of Smart Power System using Complex Networks Framework

    CERN Document Server

    Nasiruzzaman, A B M

    2011-01-01

    In this paper, a new methodology for stability assessment of a smart power system is proposed. The key to this assessment is an index called the betweenness index, which is based on ideas from complex network theory. The proposed betweenness index improves on previous work since it considers the actual real power flow through the transmission lines of the network. Furthermore, this work opens a new area of complex-systems research for assessing the stability of the power system.
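    The record's flow-weighted betweenness index itself is not reproduced here, but the underlying idea, ranking lines by how many shortest paths traverse them, can be sketched on a toy network. The grid topology and weights below are invented; classical shortest-path counting stands in for the paper's real-power weighting.

```python
import heapq
from itertools import combinations
from collections import defaultdict

# Toy transmission network; edge weight plays the role of an electrical
# distance (hypothetical values).
edges = {("gen1", "bus1"): 1.0, ("gen2", "bus2"): 1.0,
         ("bus1", "bus2"): 2.0, ("bus1", "load1"): 1.0,
         ("bus2", "load2"): 1.0, ("bus1", "bus3"): 1.0,
         ("bus3", "load2"): 1.0}
adj = defaultdict(list)
for (u, v), w in edges.items():
    adj[u].append((v, w))
    adj[v].append((u, w))

def shortest_path(src, dst):
    # Dijkstra with path reconstruction.
    heap, seen = [(0.0, src, [src])], set()
    while heap:
        d, node, path = heapq.heappop(heap)
        if node == dst:
            return path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in adj[node]:
            if nxt not in seen:
                heapq.heappush(heap, (d + w, nxt, path + [nxt]))
    return []

# Betweenness-style tally: how many node-pair shortest paths use each line.
usage = defaultdict(int)
for a, b in combinations(sorted(adj), 2):
    path = shortest_path(a, b)
    for u, v in zip(path, path[1:]):
        usage[frozenset((u, v))] += 1

critical = max(usage, key=usage.get)
print("most used line:", sorted(critical))
```

Lines with a high tally are candidate stability-critical elements; the paper's contribution is to weight this counting by the actual real power flows.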

  14. Rainfall Downscaling Conditional on Upper-air Atmospheric Predictors: Improved Assessment of Rainfall Statistics in a Changing Climate

    Science.gov (United States)

    Langousis, Andreas; Mamalakis, Antonis; Deidda, Roberto; Marrocu, Marino

    2015-04-01

    regional level. This is done for an intermediate-sized catchment in Italy, i.e. the Flumendosa catchment, using climate model rainfall and atmospheric data from the ENSEMBLES project (http://ensembleseu.metoffice.com). In doing so, we split the historical rainfall record of mean areal precipitation (MAP) into 15-year calibration and 45-year validation periods, and compare the historical rainfall statistics to those obtained from: a) Q-Q corrected climate model rainfall products, and b) synthetic rainfall series generated by the suggested downscaling scheme. To our knowledge, this is the first time that climate model rainfall and statistically downscaled precipitation have been compared to catchment-averaged MAP at a daily resolution. The obtained results are promising, since the proposed downscaling scheme is more accurate and robust in reproducing a number of historical rainfall statistics, independent of the climate model used and the length of the calibration period. This is particularly the case for the yearly rainfall maxima, where direct statistical correction of climate model rainfall outputs shows increased sensitivity to the length of the calibration period and the climate model used. The robustness of the suggested downscaling scheme in modeling rainfall extremes at a daily resolution is a notable feature that can effectively be used to assess hydrologic risk at a regional level under changing climatic conditions. Acknowledgments: The research project is implemented within the framework of the Action «Supporting Postdoctoral Researchers» of the Operational Program "Education and Lifelong Learning" (Action's Beneficiary: General Secretariat for Research and Technology), and is co-financed by the European Social Fund (ESF) and the Greek State. CRS4 highly acknowledges the contribution of the Sardinian regional authorities.

  15. Assessing the Disconnect between Grade Expectation and Achievement in a Business Statistics Course

    Science.gov (United States)

    Berenson, Mark L.; Ramnarayanan, Renu; Oppenheim, Alan

    2015-01-01

    In an institutional review board--approved study aimed at evaluating differences in learning between a large-sized introductory business statistics course section using courseware assisted examinations compared with small-sized sections using traditional paper-and-pencil examinations, there appeared to be a severe disconnect between the final…

  16. Probabilistic risk assessment course documentation. Volume 2. Probability and statistics for PRA applications

    International Nuclear Information System (INIS)

    This course is intended to provide the necessary probabilistic and statistical skills to perform a PRA. Fundamental background information is reviewed, but the principal purpose is to address specific techniques used in PRAs and to illustrate them with applications. Specific examples and problems are presented for most of the topics.

  17. Inter-speaker speech variability assessment using statistical deformable models from 3.0 Tesla magnetic resonance images

    OpenAIRE

    Maria JM Vasconcelos; Sandra MR Ventura; Diamantino RS Freitas; João Manuel RS Tavares

    2012-01-01

    The morphological and dynamic characterization of the vocal tract during speech production has been gaining greater attention, motivated by the latest improvements in Magnetic Resonance (MR) imaging, namely the use of higher magnetic fields such as 3.0 Tesla. In this work, the automatic study of the vocal tract from 3.0 Tesla MR images was assessed through the application of statistical deformable models. Therefore, the primary goal focused on the analysis of the shape of th...

  18. Peer Assessment Enhances Student Learning: The Results of a Matched Randomized Crossover Experiment in a College Statistics Class.

    Science.gov (United States)

    Sun, Dennis L; Harris, Naftali; Walther, Guenther; Baiocchi, Michael

    2015-01-01

    Feedback has a powerful influence on learning, but it is also expensive to provide. In large classes it may even be impossible for instructors to provide individualized feedback. Peer assessment is one way to provide personalized feedback that scales to large classes. Besides these obvious logistical benefits, it has been conjectured that students also learn from the practice of peer assessment. However, this has never been conclusively demonstrated. Using an online educational platform that we developed, we conducted an in-class matched-set, randomized crossover experiment with high power to detect small effects. We establish that peer assessment causes a small but significant gain in student achievement. Our study also demonstrates the potential of web-based platforms to facilitate the design of high-quality experiments to identify small effects that were previously not detectable.

  19. Peer Assessment Enhances Student Learning: The Results of a Matched Randomized Crossover Experiment in a College Statistics Class.

    Directory of Open Access Journals (Sweden)

    Dennis L Sun

    Full Text Available Feedback has a powerful influence on learning, but it is also expensive to provide. In large classes it may even be impossible for instructors to provide individualized feedback. Peer assessment is one way to provide personalized feedback that scales to large classes. Besides these obvious logistical benefits, it has been conjectured that students also learn from the practice of peer assessment. However, this has never been conclusively demonstrated. Using an online educational platform that we developed, we conducted an in-class matched-set, randomized crossover experiment with high power to detect small effects. We establish that peer assessment causes a small but significant gain in student achievement. Our study also demonstrates the potential of web-based platforms to facilitate the design of high-quality experiments to identify small effects that were previously not detectable.

  20. Hanford groundwater modeling: statistical methods for evaluating uncertainty and assessing sampling effectiveness

    International Nuclear Information System (INIS)

    This report is the first in a series of three documents which address the role of uncertainty in the Rockwell Hanford Operations groundwater model development and application program at Hanford Site. Groundwater data collection activities at Hanford are reviewed as they relate to Rockwell groundwater modeling. Methods of applying statistical and probability theory in quantifying the propagation of uncertainty from field measurements to model predictions are discussed. It is shown that measures of model accuracy or uncertainty provided by a statistical analysis can be useful in guiding model development and sampling network design. Recommendations are presented in the areas of model input data needs, parameter estimation data needs, and model verification and variance estimation data needs. 8 figures
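    The propagation of measurement uncertainty into model predictions discussed above is commonly done by Monte Carlo sampling. The sketch below uses a hypothetical one-dimensional travel-time model with invented parameter distributions; it is not the Hanford model.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical advective travel-time model t = L * n / (K * i), with
# uncertain hydraulic conductivity K and porosity n (illustrative values).
L, i = 1000.0, 0.002                           # path length (m), gradient (-)
K = rng.lognormal(np.log(30.0), 0.5, 10_000)   # m/day
n = rng.normal(0.25, 0.03, 10_000)             # porosity (-)

# Each sampled (K, n) pair yields one prediction; the spread of the
# resulting distribution quantifies prediction uncertainty.
t_years = (L * n / (K * i)) / 365.0
lo, hi = np.percentile(t_years, [5, 95])
print(f"median {np.median(t_years):.0f} yr, 90% interval [{lo:.0f}, {hi:.0f}] yr")
```

Narrowing the input distributions (i.e. better field data) narrows the prediction interval, which is exactly the link between sampling-network design and model uncertainty that the report discusses.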

  1. Hanford groundwater modeling: statistical methods for evaluating uncertainty and assessing sampling effectiveness

    Energy Technology Data Exchange (ETDEWEB)

    McLaughlin, D.B.

    1979-01-01

    This report is the first in a series of three documents which address the role of uncertainty in the Rockwell Hanford Operations groundwater model development and application program at Hanford Site. Groundwater data collection activities at Hanford are reviewed as they relate to Rockwell groundwater modeling. Methods of applying statistical and probability theory in quantifying the propagation of uncertainty from field measurements to model predictions are discussed. It is shown that measures of model accuracy or uncertainty provided by a statistical analysis can be useful in guiding model development and sampling network design. Recommendations are presented in the areas of model input data needs, parameter estimation data needs, and model verification and variance estimation data needs. 8 figures.

  2. A statistical toolbox for metagenomics: assessing functional diversity in microbial communities

    OpenAIRE

    Handelsman Jo; Schloss Patrick D

    2008-01-01

    Abstract Background The 99% of bacteria in the environment that are recalcitrant to culturing have spurred the development of metagenomics, a culture-independent approach to sample and characterize microbial genomes. Massive datasets of metagenomic sequences have been accumulated, but analysis of these sequences has focused primarily on the descriptive comparison of the relative abundance of proteins that belong to specific functional categories. More robust statistical methods are needed to ...

  3. First Aspect of Conventional Power System Assessment for High Wind Power Plants Penetration

    Directory of Open Access Journals (Sweden)

    A Merzic

    2012-11-01

    Full Text Available Most power systems in underdeveloped and developing countries are based on conventional power plants, mainly "slow-response" thermal power plants and a certain number of hydro power plants; they are characterized by inflexible generating portfolios and were traditionally designed to meet a country's own electricity needs. Given the operational capabilities of conventional power systems, their development planning will face problems with the integration of notable amounts of installed capacity in wind power plants (WPP). This motivates the present work, in which possible variations of simulated WPP output power over 10-minute and hourly intervals, which need to be balanced, are investigated, presented and discussed. Comparative calculations are given for the amount of installed WPP capacity that can be integrated into a certain power system, according to the available amount of secondary balancing power, for concentrated and for dispersed future WPP. This has been done using a part of the power system of Bosnia and Herzegovina. In the considered example, with planned, geographically distributed WPP construction, up to ca. 74% more installed WPP capacity can be integrated into the power system than with geographically concentrated WPP construction, for the same available amount of secondary balancing power. These calculations show a significant benefit of planned, geographically distributed WPP construction, an important recommendation for the development planning of conventional power systems with limited balancing options. Keywords: balancing reserves, geographical dispersion, output power variations
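    The smoothing benefit of geographic dispersion reported above can be illustrated with a toy simulation: concentrated sites share one weather signal, while dispersed sites see partially independent signals, so the aggregate output that reserves must balance fluctuates less. All numbers are invented; this is not the Bosnia and Herzegovina case study.

```python
import numpy as np

rng = np.random.default_rng(5)
hours = 8760

# Ten-site fleets with comparable per-site variability but different
# correlation: concentrated sites track one common signal closely,
# dispersed sites only partially.
common = rng.normal(0, 1, hours)
conc = np.array([common + 0.2 * rng.normal(0, 1, hours) for _ in range(10)])
disp = np.array([0.5 * common + 0.9 * rng.normal(0, 1, hours)
                 for _ in range(10)])

def step_std(fleet):
    # Standard deviation of hour-to-hour changes in total fleet output,
    # a proxy for the balancing power the system must provide.
    return np.diff(fleet.sum(axis=0)).std()

print(f"concentrated: {step_std(conc):.1f}, dispersed: {step_std(disp):.1f}")
```
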

  4. A Participatory Approach to Develop the Power Mobility Screening Tool and the Power Mobility Clinical Driving Assessment Tool

    Directory of Open Access Journals (Sweden)

    Deepan C. Kamaraj

    2014-01-01

    Full Text Available The electric powered wheelchair (EPW is an indispensable assistive device that increases participation among individuals with disabilities. However, due to lack of standardized assessment tools, developing evidence based training protocols for EPW users to improve driving skills has been a challenge. In this study, we adopt the principles of participatory research and employ qualitative methods to develop the Power Mobility Screening Tool (PMST and Power Mobility Clinical Driving Assessment (PMCDA. Qualitative data from professional experts and expert EPW users who participated in a focus group and a discussion forum were used to establish content validity of the PMCDA and the PMST. These tools collectively could assess a user’s current level of bodily function and their current EPW driving capacity. Further multicenter studies are necessary to evaluate the psychometric properties of these tests and develop EPW driving training protocols based on these assessment tools.

  5. Safety assessment for the passive system of the nuclear power plants (NPPs) using safety margin estimation

    Energy Technology Data Exchange (ETDEWEB)

    Woo, Tae-Ho; Lee, Un-Chul [Department of Nuclear Engineering, Seoul National University, Gwanak 599, Gwanak-ro, Gwanak-gu, Seoul 151-742 (Korea)

    2010-04-15

    The probabilistic safety assessment (PSA) of gas-cooled nuclear power plants has been investigated for the situation where operational data are deficient, because no commercial gas-cooled nuclear power plant is in operation. It is therefore necessary to use statistical data to construct the basic events. Several estimations of the safety margin are introduced for quantifying the failure frequency of the basic events, built on the concepts of impact and affordability. Trend of probability of failure (TPF) and a fuzzy converter (FC) are introduced using the safety margin, which yields simplified and easily interpreted configurations of the event characteristics. The mass flow rate in natural circulation is studied for the modeling. The potential energy in gravity, the temperature and pressure in heat conduction, and the heat transfer rate in the internal stored energy are also investigated. The values in the probability set are compared with those of the fuzzy set modeling. The non-linearity of the safety margin is expressed by the fuzziness of the membership function. This artificial-intelligence analysis based on fuzzy sets could enhance the reliability of the system compared to the probabilistic analysis. (author)
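    The fuzzy-set treatment of the safety margin described above rests on membership functions. The triangular shapes and breakpoints below are assumptions chosen for illustration; the study's actual membership functions are not reproduced.

```python
def triangular(x, a, b, c):
    """Triangular membership function with support (a, c) and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Hypothetical fuzzy sets over a normalized safety margin in [0, 1];
# breakpoints are invented for illustration.
def classify(margin):
    grades = {
        "low": triangular(margin, -0.01, 0.0, 0.4),
        "medium": triangular(margin, 0.2, 0.5, 0.8),
        "high": triangular(margin, 0.6, 1.0, 1.01),
    }
    return max(grades, key=grades.get)

print(classify(0.15), classify(0.5), classify(0.9))
```

The overlap of the sets (e.g. a margin of 0.3 has a nonzero grade in both "low" and "medium") is what lets a fuzzy model express the non-linearity of the safety margin that the abstract mentions.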

  6. Statistical analysis about corrosion in nuclear power plants; Analisis estadistico de la corrosion en centrales nucleares de potencia

    Energy Technology Data Exchange (ETDEWEB)

    Naquid G, C.; Medina F, A.; Zamora R, L. [Instituto Nacional de Investigaciones Nucleares, Gerencia de Ciencia de Materiales, A.P. 18-1027, 11801 Mexico D.F. (Mexico)

    2000-07-01

    Investigations have been carried out on the degradation mechanisms of structures, systems and components in nuclear power plants, since many of the processes involved are responsible for their reliability, the integrity of their components, safety aspects and more. This work presents statistics from studies of materials corrosion, in its wide variety of specific mechanisms, as reported at the world level for PWR, BWR and WWER reactors, analysing the AIRS (Advanced Incident Reporting System) for the period 1993-1998 for the first two reactor types and for the period 1982-1995 for the WWER. The identification of factors allows cases to be characterized as those which apply, i.e. events that occurred through the presence of some corrosion mechanism, and those which do not apply, i.e. incidental events due to natural factors, mechanical failures and human errors. Finally, the total number of cases analysed corresponds to the sum of the cases which apply and those which do not. (Author)

  7. Life cycle assessment of a pumped storage power plant

    OpenAIRE

    Torres, Octavio

    2011-01-01

    Wind and solar power plants are gaining increasing attention due to the low greenhouse gas emissions associated with their electricity generation. The installed capacity of these resources is rapidly growing, while it is argued that the stability of the grid is threatened, since these resources depend on actual weather conditions and their output cannot easily be adjusted to follow instantaneous electricity demand. Another reliable low-carbon power supply such as nuclear power plants cannot help in stab...

  8. Integration of Remote Sensing Techniques With Statistical Methods For Landslide Monitoring and Risk Assessment

    Science.gov (United States)

    van Westen, Cees; Wunderle, Stefan; Pasquali, Paolo

    In the frame of the Data User Programme 2 (DUP) of the European Space Agency (ESA), a new method to derive landslide hazard is presented, developed in close co-operation with end users in Honduras and Switzerland, respectively. The objective of this project is to define a sustainable service using a novel approach based on the fusion of two independent methods, namely combining differential SAR interferometry techniques (DInSAR) with a statistical approach. The bivariate statistical analysis is based on parameter maps (slope, geomorphology, land use) derived from remote sensing data and field checks, as well as on historical aerial photos. The hybrid method is based on SAR data of recent years and new ENVISAT-ASAR data, as well as historical data (i.e. former landslides detected in aerial photos). The historical occurrence of landslides is combined with current landsliding and creeping obtained from DInSAR. The resulting high-quality landslide occurrence map forms the input for the statistical landslide hazard analysis. The method intends to derive information on landslide hazard, preferably in the form of probabilities, which is combined with information on building stock, infrastructure and population density. The vulnerability of population and infrastructure is taken into account through a weighting factor. The resulting risk maps will be of great value for local authorities, the Comisión Permanente de Contingencias (COPECO) of Honduras, local GIS specialists, policy makers and reinsurance companies. We show the results of the Service Definition Project with some examples of the new method, especially for Tegucigalpa, the capital of Honduras, with approximately 1 million inhabitants.
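    A common form of the bivariate statistical analysis mentioned above is the weights-of-evidence method, sketched here for a single binary factor map. The cell counts are hypothetical, not values from the Honduras or Switzerland studies.

```python
import math

# Hypothetical cell tallies for one factor class (e.g. "steep slope"):
# landslide cells and stable cells, inside and outside the class.
n_slide_in, n_slide_out = 80, 20
n_stable_in, n_stable_out = 300, 600

# W+ / W-: log-ratios of the conditional probabilities of the factor
# class given landslide presence versus absence.
w_plus = math.log((n_slide_in / (n_slide_in + n_slide_out)) /
                  (n_stable_in / (n_stable_in + n_stable_out)))
w_minus = math.log((n_slide_out / (n_slide_in + n_slide_out)) /
                   (n_stable_out / (n_stable_in + n_stable_out)))

# The contrast C = W+ - W- summarizes the class's predictive power;
# summing such weights over all factor maps ranks terrain units by hazard.
contrast = w_plus - w_minus
print(f"W+ = {w_plus:.2f}, W- = {w_minus:.2f}, C = {contrast:.2f}")
```
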

  9. Archival Legacy Investigations of Circumstellar Environments (ALICE): Statistical assessment of point source detections

    Science.gov (United States)

    Choquet, Élodie; Pueyo, Laurent; Soummer, Rémi; Perrin, Marshall D.; Hagan, J. Brendan; Gofas-Salas, Elena; Rajan, Abhijith; Aguilar, Jonathan

    2015-09-01

    The ALICE program, for Archival Legacy Investigation of Circumstellar Environment, is currently conducting a virtual survey of about 400 stars by re-analyzing the HST-NICMOS coronagraphic archive with advanced post-processing techniques. We present here the strategy that we adopted to identify detections and potential candidates for follow-up observations, and we give a preliminary overview of our detections. We present a statistical analysis conducted to evaluate the confidence level of these detections and the completeness of our candidate search.

  10. Archival Legacy Investigations of Circumstellar Environments (ALICE): Statistical assessment of point source detections

    CERN Document Server

    Choquet, É; Soummer, R; Perrin, M D; Hagan, J B; Gofas-Salas, E; Rajan, A; Aguilar, J

    2015-01-01

    The ALICE program, for Archival Legacy Investigation of Circumstellar Environment, is currently conducting a virtual survey of about 400 stars by re-analyzing the HST-NICMOS coronagraphic archive with advanced post-processing techniques. We present here the strategy that we adopted to identify detections and potential candidates for follow-up observations, and we give a preliminary overview of our detections. We present a statistical analysis conducted to evaluate the confidence level of these detections and the completeness of our candidate search.

  11. Development on Vulnerability Assessment Methods of PPS of Nuclear Power Plants

    Institute of Scientific and Technical Information of China (English)

    MIAO; Qiang; ZHANG; Wen-liang; ZONG; Bo; BU; Li-xin; YIN; Hong-he; FANG; Xin

    2012-01-01

    We present a set of vulnerability assessment methods for the physical protection system (PPS) of nuclear power plants, developed after investigating and collecting assessment experience in China. The methods are of great significance for strengthening and upgrading the security of nuclear power plants, and also to

  12. Satellite Power Systems (SPS): Concept development and evaluation program: Preliminary assessment

    Science.gov (United States)

    1979-01-01

    A preliminary assessment of a potential Satellite Power System (SPS) is provided. The assessment includes discussion of technical and economic feasibility; the effects of microwave power transmission beams on biological, ecological, and electromagnetic systems; the impact of SPS construction, deployment, and operations on the biosphere and on society; and the merits of SPS compared to other future energy alternatives.

  13. A statistical toolbox for metagenomics: assessing functional diversity in microbial communities

    Directory of Open Access Journals (Sweden)

    Handelsman Jo

    2008-01-01

    Full Text Available Abstract Background The 99% of bacteria in the environment that are recalcitrant to culturing have spurred the development of metagenomics, a culture-independent approach to sample and characterize microbial genomes. Massive datasets of metagenomic sequences have been accumulated, but analysis of these sequences has focused primarily on the descriptive comparison of the relative abundance of proteins that belong to specific functional categories. More robust statistical methods are needed to make inferences from metagenomic data. In this study, we developed and applied a suite of tools to describe and compare the richness, membership, and structure of microbial communities using peptide fragment sequences extracted from metagenomic sequence data. Results Application of these tools to acid mine drainage, soil, and whale fall metagenomic sequence collections revealed groups of peptide fragments with a relatively high abundance and no known function. When combined with analysis of 16S rRNA gene fragments from the same communities these tools enabled us to demonstrate that although there was no overlap in the types of 16S rRNA gene sequence observed, there was a core collection of operational protein families that was shared among the three environments. Conclusion The results of comparisons between the three habitats were surprising considering the relatively low overlap of membership and the distinctively different characteristics of the three habitats. These tools will facilitate the use of metagenomics to pursue statistically sound genome-based ecological analyses.
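    Richness comparisons of the kind described above typically rely on nonparametric estimators such as Chao1, which corrects the observed count for families likely missed by sampling. The family names and abundances below are hypothetical.

```python
from collections import Counter

# Hypothetical peptide-fragment assignments to operational protein families.
observations = (["famA"] * 10 + ["famB"] * 4 + ["famC"] * 2 +
                ["famD", "famE", "famF"])  # three singleton families

counts = Counter(observations)
s_obs = len(counts)
f1 = sum(1 for c in counts.values() if c == 1)  # singleton families
f2 = sum(1 for c in counts.values() if c == 2)  # doubleton families

# Chao1 estimator: observed richness plus a correction term driven by
# the ratio of singletons to doubletons (bias-corrected form when f2 = 0).
chao1 = s_obs + f1 ** 2 / (2 * f2) if f2 else s_obs + f1 * (f1 - 1) / 2
print(f"observed families: {s_obs}, Chao1 estimate: {chao1:.1f}")
```

A large gap between the observed and estimated richness signals undersampling, which is one of the inferences such a toolbox is designed to make rigorous.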

  14. Uranium resource assessment through statistical analysis of exploration geochemical and other data. Final report

    International Nuclear Information System (INIS)

We have developed a procedure that can help quadrangle evaluators to systematically summarize and use hydrogeochemical and stream sediment reconnaissance (HSSR) and occurrence data. Although we have not provided an independent estimate of uranium endowment, we have devised a methodology that will provide this independent estimate when additional calibration is done by enlarging the study area. Our statistical model for evaluation (system EVAL) ranks uranium endowment for each quadrangle. Because using this model requires experience in geology, statistics, and data analysis, we have also devised a simplified model, presented in the package SURE, a System for Uranium Resource Evaluation. We have developed and tested these models for the four quadrangles in southern Colorado that comprise the study area; to investigate their generality, the models should be applied to other quadrangles. Once they are calibrated with accepted uranium endowments for several well-known quadrangles, the models can be used to give independent estimates for less-known quadrangles. The point-oriented models structure the objective comparison of the quadrangles on the basis of: (1) anomalies (a) derived from stream sediments, (b) derived from waters (stream, well, pond, etc.); (2) geology (a) source rocks, as defined by the evaluator, (b) host rocks, as defined by the evaluator; and (3) aerial radiometric anomalies
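The point-oriented ranking idea can be caricatured as a linear scoring model over the three evidence bases. The weights, feature names, and values below are invented for illustration; they are not taken from EVAL or SURE.

```python
# Hypothetical linear scoring model ranking quadrangles on the evidence bases
# named in the abstract (weights are illustrative assumptions)
WEIGHTS = {"sediment_anomaly": 0.3, "water_anomaly": 0.2,
           "source_rock": 0.2, "host_rock": 0.2, "aerial_radiometric": 0.1}

def endowment_score(features):
    """Weighted sum of normalized (0-1) evidence scores for one quadrangle."""
    return sum(WEIGHTS[k] * features[k] for k in WEIGHTS)

quads = {
    "Q1": {"sediment_anomaly": 0.8, "water_anomaly": 0.6, "source_rock": 0.7,
           "host_rock": 0.5, "aerial_radiometric": 0.9},
    "Q2": {"sediment_anomaly": 0.2, "water_anomaly": 0.3, "source_rock": 0.4,
           "host_rock": 0.6, "aerial_radiometric": 0.1},
}
ranking = sorted(quads, key=lambda q: endowment_score(quads[q]), reverse=True)
```

Calibration against quadrangles with accepted endowments would amount to fitting the weights, which is the step the abstract says remains to be done.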

  15. Employment of kernel methods on wind turbine power performance assessment

    DEFF Research Database (Denmark)

    Skrimpas, Georgios Alexandros; Sweeney, Christian Walsted; Marhadi, Kun S.;

    2015-01-01

    and proper detection of power production changes is demonstrated in cases of icing, power derating, operation under noise reduction mode, and incorrect controller input signal. Finally, overviews are illustrated for parks subjected to icing and operating under limited rotational speed. The comparison between...

  16. Safety review, assessment and inspection for nuclear power plants

    International Nuclear Information System (INIS)

Qinshan Nuclear Power Plant began its first shutdown for refuelling and overhaul in October 1994. The two units of the Guangdong Nuclear Power Station also began their first refuelling and overhaul shutdowns, in December 1994 and April 1995, respectively. Hence, besides conducting routine operational inspections, the NNSA laid stress on the safety supervision of the first refuelling at the two nuclear power plants, especially the handling of the event in which the control-rod drop time exceeded criteria at Unit 1 of GNPS. In supervising the refuelling of the nuclear power plants, the NNSA drew on the experience of foreign nuclear safety authorities with supervision during the commissioning stage, and inspection programs were prepared for each outage. The NNSA worked closely with its regional offices to combine routine inspections with special-item inspections, to ensure the effective implementation of inspections

  17. 75 FR 12311 - Entergy Nuclear Operations, Inc; Vermont Yankee Nuclear Power Station Environmental Assessment...

    Science.gov (United States)

    2010-03-15

    ... COMMISSION Entergy Nuclear Operations, Inc; Vermont Yankee Nuclear Power Station Environmental Assessment and... Nuclear Operations, Inc. (Entergy or the licensee), for operation of Vermont Yankee Nuclear Power Station... no significant impact [part 73, Power Reactor Security Requirements, 74 FR 13926, 13967 (March...

  18. 75 FR 11575 - James A. Fitzpatrick Nuclear Power Plant Environmental Assessment and Finding of No Significant...

    Science.gov (United States)

    2010-03-11

    ... COMMISSION James A. Fitzpatrick Nuclear Power Plant Environmental Assessment and Finding of No Significant... Program for Nuclear Power Facilities Operating Prior to January 1, 1979,'' issued to Entergy Nuclear Operations, Inc. (the licensee), for the operation of the James A. FitzPatrick Nuclear Power Plant...

  19. 76 FR 58050 - Tennessee Valley Authority, Bellefonte Nuclear Power Plant, Unit 1; Environmental Assessment and...

    Science.gov (United States)

    2011-09-19

    ... COMMISSION Tennessee Valley Authority, Bellefonte Nuclear Power Plant, Unit 1; Environmental Assessment and... Impacts Nuclear power plants use waste treatment systems designed to collect, process, and dispose of.... As previously discussed, disposal of hazardous chemicals used at nuclear power plants are...

  20. Condition assessment of power cables using partial discharge diagnosis at damped AC voltages

    NARCIS (Netherlands)

    Wester, F.J.

    2004-01-01

The thesis focuses on the condition assessment of distribution power cables, which play a critical part in the distribution of electrical power over regional distances. The majority of the outages in the power system are related to the distribution cables, of which for more than 60% to inter

  1. New statistical methodology, mathematical models, and data bases relevant to the assessment of health impacts of energy technologies

    International Nuclear Information System (INIS)

The present research develops new statistical methodology, mathematical models, and data bases relevant to the assessment of health impacts of energy technologies, and uses these to identify, quantify, and predict adverse health effects of energy-related pollutants. Efforts are in five related areas: (1) evaluation and development of statistical procedures for the analysis of death rate data, disease incidence data, and large-scale data sets; (2) development of dose-response and demographic models useful in the prediction of the health effects of energy technologies; (3) application of our methods and models to analyses of the health risks of energy production; (4) a reanalysis of the Tri-State leukemia survey data, focusing on the relationship between myelogenous leukemia risk and diagnostic x-ray exposure; and (5) investigation of human birth weights as a possible early warning system for the effects of environmental pollution

  2. Assessment of statistical uncertainty in the quantitative analysis of solid samples in motion using laser-induced breakdown spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Cabalin, L.M.; Gonzalez, A. [Department of Analytical Chemistry, University of Malaga, E-29071 Malaga (Spain); Ruiz, J. [Department of Applied Physics I, University of Malaga, E-29071 Malaga (Spain); Laserna, J.J., E-mail: laserna@uma.e [Department of Analytical Chemistry, University of Malaga, E-29071 Malaga (Spain)

    2010-08-15

    Statistical uncertainty in the quantitative analysis of solid samples in motion by laser-induced breakdown spectroscopy (LIBS) has been assessed. For this purpose, a LIBS demonstrator was designed and constructed in our laboratory. The LIBS system consisted of a laboratory-scale conveyor belt, a compact optical module and a Nd:YAG laser operating at 532 nm. The speed of the conveyor belt was variable and could be adjusted up to a maximum speed of 2 m s{sup -1}. Statistical uncertainty in the analytical measurements was estimated in terms of precision (reproducibility and repeatability) and accuracy. The results obtained by LIBS on shredded scrap samples under real conditions have demonstrated that the analytical precision and accuracy of LIBS is dependent on the sample geometry, position on the conveyor belt and surface cleanliness. Flat, relatively clean scrap samples exhibited acceptable reproducibility and repeatability; by contrast, samples with an irregular shape or a dirty surface exhibited a poor relative standard deviation.
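Precision in the sense used here (repeatability and reproducibility) is conventionally reported as relative standard deviation. A minimal sketch, with made-up replicate line intensities for a flat clean sample versus an irregular dirty one:

```python
import statistics

def rsd(values):
    """Relative standard deviation in percent, the usual LIBS precision figure."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical replicate emission-line intensities (values invented for
# illustration, not measured data from the paper)
flat_sample = [1020.0, 1005.0, 998.0, 1012.0, 1009.0]
irregular_sample = [850.0, 1200.0, 640.0, 1010.0, 770.0]

precision_flat = rsd(flat_sample)
precision_irregular = rsd(irregular_sample)
```

The abstract's finding corresponds to `precision_flat` being much smaller than `precision_irregular` when geometry and surface cleanliness degrade.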

  3. ASSESSMENT OF COMBINED HEAT AND POWER SYSTEM "PREMIUM POWER" APPLICATIONS IN CALIFORNIA

    OpenAIRE

    Norwood, Zack

    2010-01-01

    The effectiveness of combined heat and power (CHP) systems for power interruption intolerant, "premium power," facilities is the focus of this study. Through three real-world case studies and economic cost minimization modeling, the economic and environmental performance of "premium power" CHP is analyzed. The results of the analysis for a brewery, data center, and hospital lead to some interesting conclusions about CHP limited to the specific CHP technologies installed at those sites. Firs...

  4. Remaining lifetime assessment of power plant steam boilers

    Energy Technology Data Exchange (ETDEWEB)

    Liska, V. (Skoda Research Ltd, Plzen (Czech Republic)); Mentl, V. (Univ. of West Bohemia, Dept. Material Science and Technology, Plzen (Czech Republic))

    2010-05-15

Energy-producing power plants are designed for an operational period of 20 to 30 years. During this period, inspections are carried out to investigate the operational capability of the respective components and of the plant as a whole, and as the design life approaches its limit, crucial questions arise with respect to further operation, its safety, and the risks stemming from the continuous degradation of material properties during long-term service conditions, e.g. high temperatures, fatigue loading etc. In contrast to non-destructive techniques, accelerated creep-to-rupture tests of high-temperature boiler components, e.g. high-temperature headers, can give quantitative results as far as the remaining lifetime of the component is concerned. Several steam boilers were inspected at the customer's request to evaluate the remaining lifetime of boilers that had been operated for more than 160 000 and 200 000 hours, respectively. The evaluation was based on an extensive NDT inspection and the measurement of mechanical properties (including creep test data) of high-temperature components. Use of the Larson-Miller parameter in comparison with replica testing made it possible to quantify lifetime exhaustion, to assess the remaining lifetime, and to recommend future inspection intervals for the boilers. On the basis of accelerated creep test data obtained on the degraded materials, the remaining lifetime hours were calculated for three 'safety' situations: (1) 'ZERO SAFETY' (neither the recommended k=1.5 safety coefficient for working stress nor a +70degC increase of working temperature was taken into consideration). (2) 'STRESS SAFETY' (the 1.5 safety coefficient for working stress and the real working temperature were taken into consideration). (3) 'FULL SAFETY' (both 1
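The Larson-Miller extrapolation behind this kind of assessment can be sketched as follows. The constant C = 20 is the conventional textbook value, and the temperatures and times are illustrative assumptions, not figures from the report.

```python
from math import log10

C = 20.0  # conventional Larson-Miller constant (assumed, material-dependent)

def lmp(temp_k, hours):
    """Larson-Miller parameter: LMP = T * (C + log10(t)), T in kelvin, t in hours."""
    return temp_k * (C + log10(hours))

def rupture_hours(lmp_value, temp_k):
    """Invert the LMP to the predicted time-to-rupture at a service temperature."""
    return 10.0 ** (lmp_value / temp_k - C)

# Hypothetical accelerated test: rupture at 923 K after 1 000 h at service stress
p = lmp(923.0, 1000.0)
# Predicted life at an assumed real service temperature of 813 K
predicted_life = rupture_hours(p, 813.0)
```

This is the core trade: a short test at elevated temperature is mapped, via constant LMP at fixed stress, to a much longer life at the lower operating temperature.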

  5. Comparison of Asian Aquaculture Products by Use of Statistically Supported Life Cycle Assessment

    NARCIS (Netherlands)

    Henriksson, P.J.G.; Rico Artero, A.; Zhang, W.; Nahid, S.S.A.; Newton, R.; Phan, L.T.; Zhang, Z.

    2015-01-01

    We investigated aquaculture production of Asian tiger shrimp, whiteleg shrimp, giant river prawn, tilapia, and pangasius catfish in Bangladesh, China, Thailand, and Vietnam by using life cycle assessments (LCAs), with the purpose of evaluating the comparative eco-efficiency of producing different aq

  6. Developing Statistical Models to Assess Transplant Outcomes Using National Registries: The Process in the United States.

    Science.gov (United States)

    Snyder, Jon J; Salkowski, Nicholas; Kim, S Joseph; Zaun, David; Xiong, Hui; Israni, Ajay K; Kasiske, Bertram L

    2016-02-01

    Created by the US National Organ Transplant Act in 1984, the Scientific Registry of Transplant Recipients (SRTR) is obligated to publicly report data on transplant program and organ procurement organization performance in the United States. These reports include risk-adjusted assessments of graft and patient survival, and programs performing worse or better than expected are identified. The SRTR currently maintains 43 risk adjustment models for assessing posttransplant patient and graft survival and, in collaboration with the SRTR Technical Advisory Committee, has developed and implemented a new systematic process for model evaluation and revision. Patient cohorts for the risk adjustment models are identified, and single-organ and multiorgan transplants are defined, then each risk adjustment model is developed following a prespecified set of steps. Model performance is assessed, the model is refit to a more recent cohort before each evaluation cycle, and then it is applied to the evaluation cohort. The field of solid organ transplantation is unique in the breadth of the standardized data that are collected. These data allow for quality assessment across all transplant providers in the United States. A standardized process of risk model development using data from national registries may enhance the field. PMID:26814440
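The idea of flagging programs performing worse or better than expected can be caricatured with a simple observed-versus-expected event ratio. The function and cutoffs below are illustrative assumptions; SRTR's actual criteria rest on risk-adjusted survival models, not a raw ratio.

```python
def performance_flag(observed_events, expected_events, low=0.8, high=1.2):
    """Crude O/E ratio flag for a transplant program (hypothetical cutoffs).
    'Events' here means graft failures or deaths, so a high ratio is bad."""
    ratio = observed_events / expected_events
    if ratio > high:
        return "worse than expected"
    if ratio < low:
        return "better than expected"
    return "as expected"
```

In the real process, `expected_events` would come from applying the risk adjustment model, refit to a recent cohort, to the program's own case mix.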

  7. Statistical Classification for Cognitive Diagnostic Assessment: An Artificial Neural Network Approach

    Science.gov (United States)

    Cui, Ying; Gierl, Mark; Guo, Qi

    2016-01-01

    The purpose of the current investigation was to describe how the artificial neural networks (ANNs) can be used to interpret student performance on cognitive diagnostic assessments (CDAs) and evaluate the performances of ANNs using simulation results. CDAs are designed to measure student performance on problem-solving tasks and provide useful…

  8. Environmental Impact Assessment for Olkiluoto 4 Nuclear Power Plant Unit in Finland

    Energy Technology Data Exchange (ETDEWEB)

    Dersten, Riitta; Gahmberg, Sini; Takala, Jenni [Teollisuuden Voima Oyj, Olkiluoto, FI-27160 Eurajoki (Finland)

    2008-07-01

In order to improve its readiness for constructing additional production capacity, Teollisuuden Voima Oyj (TVO) initiated in spring 2007 the environmental impact assessment procedure (EIA procedure) concerning a new nuclear power plant unit that would possibly be located at Olkiluoto. When assessing the environmental impacts of the Olkiluoto nuclear power plant extension project, the present state of the environment was first examined; after that, the changes caused by the project as well as their significance were assessed, taking into account the combined impacts of the operations at Olkiluoto. The environmental impact assessment for the planned nuclear power plant unit covers the entire life cycle of the plant unit. (authors)

  9. Multivariate Statistical Analysis: a tool for groundwater quality assessment in the hidrogeologic region of the Ring of Cenotes, Yucatan, Mexico.

    Science.gov (United States)

    Ye, M.; Pacheco Castro, R. B.; Pacheco Avila, J.; Cabrera Sansores, A.

    2014-12-01

The karstic aquifer of Yucatan is a vulnerable and complex system. The first fifteen meters of this aquifer have been polluted, so protection of this resource is important: it is the only source of potable water for the entire State. Through the assessment of groundwater quality we can gain knowledge about the main processes governing water chemistry as well as spatial patterns, which are important for establishing protection zones. In this work multivariate statistical techniques are used to assess the groundwater quality of the supply wells (30 to 40 meters deep) in the hydrogeologic region of the Ring of Cenotes, located in Yucatan, Mexico. Cluster analysis and principal component analysis are applied to groundwater chemistry data of the study area. Results of principal component analysis show that the main sources of variation in the data are seawater intrusion, the interaction of the water with the carbonate rocks of the system, and some pollution processes. The cluster analysis shows that the data can be divided into four clusters. The spatial distribution of the clusters seems random but is consistent with seawater intrusion and pollution with nitrates. The overall results show that multivariate statistical analysis can be successfully applied in the groundwater quality assessment of this karstic aquifer.

  10. Hydrogeochemical assessment of groundwater quality in a river delta using multivariate statistical techniques

    Science.gov (United States)

    Matiatos, Ioannis; Paraskevopoulou, Vasiliki; Botsou, Fotini; Dassenakis, Manolis; Lazogiannis, Konstantinos; Ghionis, George; Poulos, Serafim

    2016-04-01

    The knowledge of the factors controlling the regional groundwater quality regime is important for planning and management of the groundwater resources. This work applies conventional hydrogeochemical and multivariate statistical techniques to identify the main factors and mechanisms controlling the hydrogeochemistry of groundwater in the deltaic environment of River Pinios (Thessaly) as well as possible areas of interactions between groundwater and surface water bodies. Hierarchical Cluster Analysis (HCA) and Principal Components Analysis (PCA) are performed using a data set of physical-chemical parameters from surface water and groundwater sites. Through HCA the paper's objective is to group together surface water and groundwater monitoring sites based on similarities in hydrochemistry in order to indicate areas of groundwater-surface water interaction. On the other hand, PCA aims at indicating factors responsible for the hydrogeochemical characteristics of the water bodies in the river delta (e.g., water-rock interaction, seawater intrusion, anthropogenic activities).

  11. Assessment of Surface Water Quality Using Multivariate Statistical Techniques in the Terengganu River Basin

    International Nuclear Information System (INIS)

Multivariate statistical techniques, including cluster analysis, discriminant analysis, and principal component analysis/factor analysis, were applied to investigate the spatial variation and pollution sources in the Terengganu river basin during 5 years of monitoring 13 water quality parameters at thirteen stations. Cluster analysis (CA) classified the 13 stations into 2 clusters, low pollution (LP) and moderate pollution (MP), based on similar water quality characteristics. Discriminant analysis (DA) rendered significant data reduction with 4 parameters (pH, NH3-NL, PO4 and EC) and a correct assignation of 95.80%. PCA/FA applied to the data sets yielded five latent factors accounting for 72.42% of the total variance in the water quality data. The obtained varifactors indicate that the parameters responsible for water quality variations are mainly related to domestic waste, industry, runoff and agriculture (anthropogenic activities). Therefore, multivariate techniques are important in environmental management. (author)
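The PCA step used in studies like this can be sketched with plain SVD on a standardized data matrix. The matrix below is synthetic; its 13 columns merely mimic the 13 monitored parameters, and the injected correlated pair stands in for co-varying pollution indicators.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic monitoring matrix: 60 samples x 13 water-quality parameters
X = rng.normal(size=(60, 13))
X[:, 1] = X[:, 0] + 0.1 * rng.normal(size=60)  # two co-varying parameters

# Standardize (zero mean, unit variance), as is usual for mixed-unit data
Xc = X - X.mean(axis=0)
Xs = Xc / Xc.std(axis=0, ddof=1)

# Singular values give the variance fraction explained by each component
_, s, _ = np.linalg.svd(Xs, full_matrices=False)
explained = s**2 / np.sum(s**2)
```

Retaining the components that together account for a target share of variance (72.42% in the abstract's case) is then a matter of cumulatively summing `explained`.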

  12. The Good, the Bad and the Ugly: Statistical quality assessment of SZ detections

    CERN Document Server

    Aghanim, N; Diego, J -M; Douspis, M; Macias-Perez, J; Pointecouteau, E; Comis, B; Arnaud, M; Montier, L

    2014-01-01

We examine three approaches to the problem of source classification in catalogues. Our goal is to determine the confidence with which the elements in these catalogues can be distinguished into populations on the basis of their spectral energy distribution (SED). Our analysis is based on the projection of the measurements onto a comprehensive SED model of the main signals in the considered range of frequencies. We first consider a likelihood analysis, which is halfway between supervised and unsupervised methods. Next, we investigate an unsupervised clustering technique. Finally, we consider a supervised classifier based on Artificial Neural Networks. We illustrate the approach and results using catalogues from various surveys, i.e., X-ray (MCXC), optical (SDSS) and millimetric (Planck Sunyaev-Zeldovich (SZ)). We show that the results from the statistical classifications of the three methods are in very good agreement with each other, although the supervised neural network-based classification shows better pe...

  13. A statistical method for assessing peptide identification confidence in accurate mass and time tag proteomics.

    Science.gov (United States)

    Stanley, Jeffrey R; Adkins, Joshua N; Slysz, Gordon W; Monroe, Matthew E; Purvine, Samuel O; Karpievitch, Yuliya V; Anderson, Gordon A; Smith, Richard D; Dabney, Alan R

    2011-08-15

    Current algorithms for quantifying peptide identification confidence in the accurate mass and time (AMT) tag approach assume that the AMT tags themselves have been correctly identified. However, there is uncertainty in the identification of AMT tags, because this is based on matching LC-MS/MS fragmentation spectra to peptide sequences. In this paper, we incorporate confidence measures for the AMT tag identifications into the calculation of probabilities for correct matches to an AMT tag database, resulting in a more accurate overall measure of identification confidence for the AMT tag approach. The method is referenced as Statistical Tools for AMT Tag Confidence (STAC). STAC additionally provides a uniqueness probability (UP) to help distinguish between multiple matches to an AMT tag and a method to calculate an overall false discovery rate (FDR). STAC is freely available for download, as both a command line and a Windows graphical application.
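A common way to turn per-match probabilities into an overall false discovery rate, as STAC does, is to average the error probabilities of the accepted matches. This sketch assumes that generic estimator; STAC's exact formula and its uniqueness probability are more involved.

```python
def estimated_fdr(probabilities, threshold):
    """Estimated FDR among matches accepted at a probability cutoff:
    the mean of (1 - P(correct)) over accepted matches (generic estimator,
    assumed here as a stand-in for STAC's internal calculation)."""
    accepted = [p for p in probabilities if p >= threshold]
    if not accepted:
        return 0.0
    return sum(1.0 - p for p in accepted) / len(accepted)

# Hypothetical match probabilities from an AMT tag database search
probs = [0.99, 0.95, 0.90, 0.60, 0.30]
fdr_at_90 = estimated_fdr(probs, 0.90)
```

Raising the threshold trades identifications for a lower estimated FDR, which is the control knob such tools expose.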

  14. Multivariate statistical assessment of heavy metal pollution sources of groundwater around a lead and zinc plant.

    Science.gov (United States)

    Zamani, Abbas Ali; Yaftian, Mohammad Reza; Parizanganeh, Abdolhossein

    2012-01-01

    The contamination of groundwater by heavy metal ions around a lead and zinc plant has been studied. As a case study groundwater contamination in Bonab Industrial Estate (Zanjan-Iran) for iron, cobalt, nickel, copper, zinc, cadmium and lead content was investigated using differential pulse polarography (DPP). Although, cobalt, copper and zinc were found correspondingly in 47.8%, 100.0%, and 100.0% of the samples, they did not contain these metals above their maximum contaminant levels (MCLs). Cadmium was detected in 65.2% of the samples and 17.4% of them were polluted by this metal. All samples contained detectable levels of lead and iron with 8.7% and 13.0% of the samples higher than their MCLs. Nickel was also found in 78.3% of the samples, out of which 8.7% were polluted. In general, the results revealed the contamination of groundwater sources in the studied zone. The higher health risks are related to lead, nickel, and cadmium ions. Multivariate statistical techniques were applied for interpreting the experimental data and giving a description for the sources. The data analysis showed correlations and similarities between investigated heavy metals and helps to classify these ion groups. Cluster analysis identified five clusters among the studied heavy metals. Cluster 1 consisted of Pb, Cu, and cluster 3 included Cd, Fe; also each of the elements Zn, Co and Ni was located in groups with single member. The same results were obtained by factor analysis. Statistical investigations revealed that anthropogenic factors and notably lead and zinc plant and pedo-geochemical pollution sources are influencing water quality in the studied area. PMID:23369182

  15. Multivariate statistical assessment of heavy metal pollution sources of groundwater around a lead and zinc plant.

    Science.gov (United States)

    Zamani, Abbas Ali; Yaftian, Mohammad Reza; Parizanganeh, Abdolhossein

    2012-12-17

    The contamination of groundwater by heavy metal ions around a lead and zinc plant has been studied. As a case study groundwater contamination in Bonab Industrial Estate (Zanjan-Iran) for iron, cobalt, nickel, copper, zinc, cadmium and lead content was investigated using differential pulse polarography (DPP). Although, cobalt, copper and zinc were found correspondingly in 47.8%, 100.0%, and 100.0% of the samples, they did not contain these metals above their maximum contaminant levels (MCLs). Cadmium was detected in 65.2% of the samples and 17.4% of them were polluted by this metal. All samples contained detectable levels of lead and iron with 8.7% and 13.0% of the samples higher than their MCLs. Nickel was also found in 78.3% of the samples, out of which 8.7% were polluted. In general, the results revealed the contamination of groundwater sources in the studied zone. The higher health risks are related to lead, nickel, and cadmium ions. Multivariate statistical techniques were applied for interpreting the experimental data and giving a description for the sources. The data analysis showed correlations and similarities between investigated heavy metals and helps to classify these ion groups. Cluster analysis identified five clusters among the studied heavy metals. Cluster 1 consisted of Pb, Cu, and cluster 3 included Cd, Fe; also each of the elements Zn, Co and Ni was located in groups with single member. The same results were obtained by factor analysis. Statistical investigations revealed that anthropogenic factors and notably lead and zinc plant and pedo-geochemical pollution sources are influencing water quality in the studied area.

  16. Multivariate statistical assessment of heavy metal pollution sources of groundwater around a lead and zinc plant

    Directory of Open Access Journals (Sweden)

    Zamani Abbas Ali

    2012-12-01

    Full Text Available Abstract The contamination of groundwater by heavy metal ions around a lead and zinc plant has been studied. As a case study groundwater contamination in Bonab Industrial Estate (Zanjan-Iran for iron, cobalt, nickel, copper, zinc, cadmium and lead content was investigated using differential pulse polarography (DPP. Although, cobalt, copper and zinc were found correspondingly in 47.8%, 100.0%, and 100.0% of the samples, they did not contain these metals above their maximum contaminant levels (MCLs. Cadmium was detected in 65.2% of the samples and 17.4% of them were polluted by this metal. All samples contained detectable levels of lead and iron with 8.7% and 13.0% of the samples higher than their MCLs. Nickel was also found in 78.3% of the samples, out of which 8.7% were polluted. In general, the results revealed the contamination of groundwater sources in the studied zone. The higher health risks are related to lead, nickel, and cadmium ions. Multivariate statistical techniques were applied for interpreting the experimental data and giving a description for the sources. The data analysis showed correlations and similarities between investigated heavy metals and helps to classify these ion groups. Cluster analysis identified five clusters among the studied heavy metals. Cluster 1 consisted of Pb, Cu, and cluster 3 included Cd, Fe; also each of the elements Zn, Co and Ni was located in groups with single member. The same results were obtained by factor analysis. Statistical investigations revealed that anthropogenic factors and notably lead and zinc plant and pedo-geochemical pollution sources are influencing water quality in the studied area.

  17. Undersampling power-law size distributions: effect on the assessment of extreme natural hazards

    Science.gov (United States)

    Geist, Eric L.; Parsons, Thomas E.

    2014-01-01

The effect of undersampling on estimating the size of extreme natural hazards from historical data is examined. Tests using synthetic catalogs indicate that the tail of an empirical size distribution sampled from a pure Pareto probability distribution can range from having one to several unusually large events to appearing depleted, relative to the parent distribution. Both of these effects are artifacts caused by limited catalog length. It is more difficult to diagnose the artificially depleted empirical distributions, since one expects that a pure Pareto distribution is physically limited in some way. Using maximum likelihood methods and the method of moments, we estimate the power-law exponent and the corner size parameter of tapered Pareto distributions for several natural hazard examples: tsunamis, floods, and earthquakes. Each of these examples has varying catalog lengths and measurement thresholds, relative to the largest event sizes. In many cases where there are only several orders of magnitude between the measurement threshold and the largest events, joint two-parameter estimation techniques are necessary to account for estimation dependence between the power-law scaling exponent and the corner size parameter. Results indicate that whereas the corner size parameter of a tapered Pareto distribution can be estimated, its upper confidence bound cannot be determined and the estimate itself is often unstable with time. Correspondingly, one cannot statistically reject a pure Pareto null hypothesis using natural hazard catalog data. Although physical limits on the hazard source size and attenuation mechanisms from source to site constrain the maximum hazard size, historical data alone often cannot reliably determine the corner size parameter. Probabilistic assessments incorporating theoretical constraints on source size and propagation effects are preferred over deterministic assessments of extreme natural hazards based on historical data.
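For a pure Pareto tail, the maximum-likelihood estimate of the power-law exponent reduces to the Hill estimator. This sketch checks it on a synthetic catalog with a known exponent; the tapered Pareto and the joint two-parameter estimation used in the paper are beyond this minimal example.

```python
import random
from math import log

def hill_mle(sizes, xmin):
    """Maximum-likelihood (Hill) estimate of the Pareto density exponent alpha
    for the tail x >= xmin: alpha_hat = 1 + n / sum(ln(x_i / xmin))."""
    tail = [x for x in sizes if x >= xmin]
    return 1.0 + len(tail) / sum(log(x / xmin) for x in tail)

# Synthetic catalog from a pure Pareto with known alpha = 2.5, via inverse CDF:
# for density f(x) ~ x^(-alpha), x = xmin * U^(-1/(alpha - 1))
random.seed(42)
alpha_true, xmin = 2.5, 1.0
catalog = [xmin * random.random() ** (-1.0 / (alpha_true - 1.0))
           for _ in range(5000)]
alpha_hat = hill_mle(catalog, xmin)
```

With a short catalog the same estimator scatters widely, which is one face of the undersampling problem the abstract describes.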

  18. Green Power Marketing in Retail Competition: An Early Assessment

    International Nuclear Information System (INIS)

    Green power marketing-the business of selling electricity products or services based in part on their environmental values-is still in an early stage of development. This Topical Issues Brief presents a summary of early results with green power marketing under retail competition, covering both fully competitive markets and relevant direct access pilot programs. The brief provides an overview of green products that are or were offered, and discusses consumers' interest in these products. Critical issues that will impact the availability and success of green power products under retail competition are highlighted

  19. Considerations on Cyber Security Assessments of Korean Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung-Woon; Song, Jae-Gu; Han, Kyung-Soo; Lee, Cheol Kwon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Kang, Mingyun [E-Gonggam Co. Ltd., Daejeon (Korea, Republic of)

    2015-10-15

    Korea Institute of Nuclear Nonproliferation and Control (KINAC) has prepared the regulatory standard RS-015 based on RG 5.71. RS-015 defines the elements of a cyber security program to be established in nuclear facilities and describes the security control items and relevant requirements. Cyber security assessments are important initial activities in a cyber security program for NPPs. Cyber security assessments can be performed in the following key steps: 1) Formation of a cyber security assessment team (CSAT); 2) Identification of critical systems and critical digital assets (CDAs); 3) Plant compliance checks with the security control requirements in RS-015. Through the assessments, the current status of security controls applied to NPPs can be found out. The assessments provide baseline data for remedial activities. Additional analyses with the results from the assessments should be performed before the implementation of remedial security controls. The cyber security team at the Korea Atomic Energy Research Institute (KAERI) has studied how to perform cyber security assessments for NPPs based on the regulatory requirements. Recently, KAERI's cyber security team has performed pilot cyber security assessments of a Korean NPP. Based on this assessment experience, considerations and checkpoints which would be helpful for full-scale cyber security assessments of Korean NPPs and the implementation of remedial security controls are discussed in this paper. Cyber security assessment is one of important and immediate activities for NPP cyber security. The quality of the first assessment will be a barometer for NPP cyber security. Hence cyber security assessments of Korean NPPs should be performed elaborately.

  20. Life cycle assessment for coordination development of nuclear power and electric vehicle

    International Nuclear Information System (INIS)

    Energy, the environment and climate change have become central political topics. In this paper, a life cycle assessment of the coordinated development of nuclear power and electric vehicles is presented from the viewpoint of energy efficiency and pollutant emissions. The assessment results show that the pathway of nuclear power coupled with electric vehicles outperforms both coal-fired power coupled with electric vehicles and conventional gasoline used in internal-combustion-engine vehicles in terms of environmental and energy characteristics. Using electric vehicle charging, rather than pumped-storage hydropower stations, to absorb off-peak output can also safeguard the stable operation of nuclear power stations. The results could serve as a reference for the coordinated development of nuclear power, electric vehicles and the smart grid. (authors)

  1. Surface water quality assessment by the use of combination of multivariate statistical classification and expert information.

    Science.gov (United States)

    Tobiszewski, M; Tsakovski, S; Simeonov, V; Namieśnik, J

    2010-08-01

    The present study deals with the assessment of surface water quality from an industrial-urban region located in northern Poland near the city of Gdansk. Concentrations of thirteen chemicals, including total polycyclic aromatic hydrocarbons (PAHs), halogenated volatile organic compounds (HVOCs) and major ions, in samples collected at five sampling points during six campaigns were used as variables throughout the study. The original feature of the monitoring data treatment and interpretation was the combination of a traditional classification approach (Kohonen self-organizing maps) with expert knowledge of PAH diagnostic ratios to achieve reliable pollution source identification. Thus, sampling points affected by pollution from traffic (petroleum combustion products), from crude oil processing (petroleum-release-related compounds), and from a phosphogypsum disposal site were properly discriminated. Additionally, it is shown that this original assessment approach can be useful in finding specific pollution source tracers.

  2. Local homogeneity combined with DCT statistics to blind noisy image quality assessment

    Science.gov (United States)

    Yang, Lingxian; Chen, Li; Chen, Heping

    2015-03-01

    In this paper a novel method for blind noisy image quality assessment is proposed. First, since the human visual system (HVS) is more sensitive to locally smooth areas in a noisy image, an adaptive local homogeneous block selection algorithm is proposed to construct a new homogeneous image, named homogeneity blocks (HB), based on per-pixel characteristics. Second, the discrete cosine transform (DCT) is applied to each HB and the high-frequency components are used to evaluate the image noise level. Finally, a modified peak signal-to-noise ratio (MPSNR) image quality assessment approach is proposed, based on analysis of changes in the DCT kurtosis distributions together with the noise level estimated above. Simulations show that the quality scores produced by the proposed algorithm correlate well with human perception of quality and are also stable.
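The two measurement steps above (homogeneous-block selection, then high-frequency DCT statistics) can be sketched in a few lines. This is a minimal illustration under assumed choices (8x8 blocks, variance-based homogeneity, a MAD-style robust estimator), not the authors' algorithm:

```python
import numpy as np
from scipy.fftpack import dct

def dct2(block):
    # 2-D type-II DCT with orthonormal scaling
    return dct(dct(block, axis=0, norm='ortho'), axis=1, norm='ortho')

def estimate_noise_sigma(image, block=8):
    """Rough noise-level estimate: keep the most homogeneous (lowest-variance)
    blocks, then read sigma off their high-frequency DCT coefficients."""
    h, w = image.shape
    blocks = [image[i:i + block, j:j + block].astype(float)
              for i in range(0, h - block + 1, block)
              for j in range(0, w - block + 1, block)]
    blocks.sort(key=np.var)                      # most homogeneous first
    blocks = blocks[:max(1, len(blocks) // 2)]   # keep the smoother half
    idx = np.add.outer(np.arange(block), np.arange(block)) >= block  # high freqs
    coeffs = np.concatenate([np.abs(dct2(b)[idx]) for b in blocks])
    return np.median(coeffs) / 0.6745            # MAD-style robust sigma
```

On a synthetic flat image with Gaussian noise of known standard deviation, the estimate lands close to the true sigma.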

  3. Assessment of Environmental External Effects in Power Generation

    DEFF Research Database (Denmark)

    Meyer, Henrik Jacob; Morthorst, Poul Erik; Ibsen, Liselotte Schleisner;

    1996-01-01

    This report summarises some of the results achieved in a project carried out in Denmark in 1994 concerning externalities. The main objective was to identify, quantify and - if possible - monetise the external effects in the production of energy, especially in relation to renewable energy technologies. The report compares environmental externalities in the production of energy using renewable and non-renewable energy sources, respectively. The comparison is demonstrated on two specific case studies. The first case is the production of electricity based on wind power plants compared to the production of electricity based on a coal-fired conventional plant. In the second case, heat/power generation by means of a combined heat and power plant based on biomass-generated gas is compared to that of a combined heat and power plant fuelled by natural gas. In the report the individual externalities from...

  4. Statistical Inference under Latent Class Models, with Application to Risk Assessment in Cancer Survivorship Studies

    OpenAIRE

    Wang, Huijing

    2015-01-01

    Motivated by a cancer survivorship program, this PhD thesis aims to develop methodology for risk assessment, classification, and prediction. We formulate the primary data collected from a cohort with two underlying categories, the at-risk and not-at-risk classes, using latent class models, and we conduct both cross-sectional and longitudinal analyses. We begin with a maximum pseudo-likelihood estimator (pseudo-MLE) as an alternative to the maximum likelihood estimator (MLE) under a mixture Po...

  5. Fuel consumption and fire emissions estimates using Fire Radiative Power, burned area and statistical modelling on the fire event scale

    Science.gov (United States)

    Ruecker, Gernot; Leimbach, David; Guenther, Felix; Barradas, Carol; Hoffmann, Anja

    2016-04-01

    Fire Radiative Power (FRP), retrieved by infrared sensors such as those flown on several polar-orbiting and geostationary satellites, has been shown to be proportional to fuel consumption rates in vegetation fires; hence the total radiative energy released by a fire (Fire Radiative Energy, FRE) is proportional to the total amount of biomass burned. However, due to the sparse temporal coverage of polar-orbiting sensors and the coarse spatial resolution of geostationary sensors, it is difficult to estimate fuel consumption for single fire events. Here we explore an approach for estimating FRE through temporal integration of MODIS FRP retrievals over MODIS-derived burned areas. Temporal integration is aided by statistical modelling to estimate missing observations using a generalized additive model (GAM), taking advantage of additional information such as land cover and a global dataset of the Canadian Fire Weather Index (FWI), as well as diurnal and annual FRP fluctuation patterns. Based on results from study areas located in savannah regions of Southern and Eastern Africa and Brazil, we compare this method to estimates based on simple temporal integration of FRP retrievals over the fire lifetime, and estimate the potential variability of FRP integration results across a range of fire sizes. We compare FRE-based fuel consumption against a database of field experiments in similar landscapes. Results show that for larger fires this method yields realistic estimates and is more robust than simple temporal integration when only a small number of observations is available. Finally, we offer an outlook on the integration of data from other satellites, specifically FireBird, S-NPP VIIRS and Sentinel-3, as well as on using higher-resolution burned area data sets derived from Landsat and similar sensors.
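The core bookkeeping — FRE as the time integral of FRP, and fuel mass via a radiative-energy-to-biomass factor — can be illustrated with hypothetical numbers. The overpass times, FRP values and the ~0.368 kg/MJ factor (commonly attributed to Wooster et al., 2005) are assumptions for this sketch, not values from the study:

```python
import numpy as np

# Hypothetical FRP retrievals (MW) for one fire at irregular satellite overpasses
t_s = np.array([0.0, 3.0, 9.5, 14.0, 22.0]) * 3600.0   # seconds since first detection
frp = np.array([120.0, 450.0, 300.0, 180.0, 20.0])     # MW

# FRE = time integral of FRP (trapezoidal rule); MW * s = MJ
fre_mj = float(np.sum(0.5 * (frp[1:] + frp[:-1]) * np.diff(t_s)))

# Biomass consumed via the assumed ~0.368 kg/MJ conversion factor
fuel_tonnes = 0.368 * fre_mj / 1000.0
```

Gap-filling with a GAM, as in the study, would refine the FRP curve between overpasses before this integration step.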

  6. Combining the Power of Statistical Analyses and Community Interviews to Identify Adoption Barriers for Stormwater Best-Management Practices

    Science.gov (United States)

    Hoover, F. A.; Bowling, L. C.; Prokopy, L. S.

    2015-12-01

    Urban stormwater is an ongoing management concern in municipalities of all sizes. In both combined and separated sewer systems, pollutants from stormwater runoff enter the natural waterway system during heavy rain events. Urban flooding during frequent and more intense storms is also a growing concern. Therefore, stormwater best-management practices (BMPs) are being implemented in efforts to reduce and manage stormwater pollution and overflow. The majority of BMP water quality studies focus on the small-scale, individual effects of the BMP and the change in water quality directly from the runoff of these installations. At the watershed scale, it is difficult to establish statistically whether or not these BMPs are making a difference in water quality, given that watershed-scale monitoring is often costly and time-consuming, relying on significant sources of funds which a city may not have. Hence, there is a need to quantify the level of sampling needed to detect the water quality impact of BMPs at the watershed scale. In this study, a power analysis was performed on data from an urban watershed in Lafayette, Indiana, to determine the frequency of sampling required to detect a significant change in water quality measurements. Using the R platform, results indicate that detecting a significant change in watershed-level water quality would require hundreds of weekly measurements, even when improvement is present. The second part of this study investigates whether the difficulty in demonstrating water quality change represents a barrier to adoption of stormwater BMPs. Semi-structured interviews of community residents and organizations in Chicago, IL are being used to investigate residents' understanding of water quality and best-management practices and to identify their attitudes and perceptions towards stormwater BMPs. Second-round interviews will examine how information on uncertainty in water quality improvements influences their BMP attitudes and perceptions.
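A back-of-the-envelope power calculation shows why small water-quality improvements demand hundreds of samples. This sketch uses a normal approximation for a two-sample comparison; it is not the study's R analysis, and the effect sizes are illustrative:

```python
import math
from scipy.stats import norm

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Normal-approximation sample size per group to detect a standardized
    mean difference (Cohen's d) in a two-sample comparison."""
    z_alpha = norm.ppf(1.0 - alpha / 2.0)
    z_power = norm.ppf(power)
    return math.ceil(2.0 * ((z_alpha + z_power) / effect_size) ** 2)

small_effect = n_per_group(0.2)   # subtle water-quality shift
large_effect = n_per_group(0.8)   # pronounced shift
```

For a subtle shift (d = 0.2) the requirement runs to roughly 400 samples per period, while a pronounced shift (d = 0.8) needs only a few dozen.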

  7. Satellite power system: Concept development and evaluation program. Volume 3: Power transmission and reception. Technical summary and assessment

    Science.gov (United States)

    Dietz, R. H.; Arndt, G. D.; Seyl, J. W.; Leopold, L.; Kelley, J. S.

    1981-01-01

    Efforts in the DOE/NASA concept development and evaluation program are discussed for the solar power satellite power transmission and reception system. A technical summary is provided together with a summary of system assessment activities. System options and system definition drivers are described. Major system assessment activities were in support of the reference system definition, solid state system studies, critical technology supporting investigations, and various system and subsystem tradeoffs. These activities are described together with reference system updates and alternative concepts for each of the subsystem areas. Conclusions reached as a result of the numerous analytical and experimental evaluations are presented. Remaining issues for a possible follow-on program are identified.

  8. Life cycle assessment of fossil and biomass power generation chains. An analysis carried out for ALSTOM Power Services

    Energy Technology Data Exchange (ETDEWEB)

    Bauer, Ch.

    2008-12-15

    This final report issued by the Technology Assessment Department of the Paul Scherrer Institute (PSI) reports on the results of an analysis carried out on behalf of the Alstom Power Services company. Fossil and biomass chains as well as co-combustion power plants are assessed. The general objective of this analysis is an evaluation of specific as well as overall environmental burdens resulting from these different options for electricity production. The results obtained for fuel chains including hard coal, lignite, wood, natural gas and synthetic natural gas are discussed. An overall comparison is made and the conclusions drawn from the results of the analysis are presented.

  9. Using integrated multivariate statistics to assess the hydrochemistry of surface water quality, Lake Taihu basin, China

    Directory of Open Access Journals (Sweden)

    Xiangyu Mu

    2014-09-01

    Full Text Available Natural factors and anthropogenic activities both contribute dissolved chemical loads to lakes and streams. Mineral solubility, the geomorphology of the drainage basin, source strengths and climate all contribute to concentrations and their variability. Urbanization and agricultural waste-water in particular lead to aquatic environmental degradation. Major contaminant sources and controls on water quality can be assessed by analyzing the variability in proportions of major and minor solutes in water, coupled with multivariate statistical methods. The demand for freshwater needed for increasing crop production, population growth and industrialization occurs almost everywhere in China, and these conflicting needs have led to widespread water contamination. Because of heavy nutrient loadings from all of these sources, Lake Taihu (eastern China) notably suffers periodic hyper-eutrophication and drinking water deterioration, which has led to shortages of freshwater for the city of Wuxi and other nearby cities. This lake, the third largest freshwater body in China, has historically been considered a cultural treasure of China and has supported long-term fisheries. There is increasing pressure to remediate the present contamination, which compromises both aquaculture and the prior economic base centered on tourism. However, remediation cannot be effectively done without first characterizing the broad nature of the non-point source pollution. To this end, we investigated the hydrochemical setting of Lake Taihu to determine how different land use types influence the variability of surface water chemistry in different water sources to the lake. We found that waters broadly show wide variability, ranging from a calcium-magnesium-bicarbonate hydrochemical facies type to a mixed sodium-sulfate-chloride type. Principal components analysis produced three principal components that explained 78% of the variance in the water quality and reflect three major types of water
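The variance-explained figure quoted above comes from a standard principal components decomposition of the solute matrix. A minimal sketch on synthetic data (the eight-ion layout and three-source structure are assumptions for illustration, not the Taihu data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a samples-by-solutes matrix (e.g. Ca, Mg, Na, K,
# HCO3, SO4, Cl, NO3): three latent "sources" drive eight measured ions
latent = rng.normal(size=(200, 3))
X = latent @ rng.normal(size=(3, 8)) + 0.3 * rng.normal(size=(200, 8))

# PCA via eigendecomposition of the correlation matrix
corr = np.corrcoef((X - X.mean(0)) / X.std(0), rowvar=False)
eigvals = np.linalg.eigvalsh(corr)[::-1]     # descending order
explained = eigvals / eigvals.sum()
share_top3 = explained[:3].sum()             # variance captured by 3 PCs
```

With three underlying sources and modest noise, the first three components capture most of the variance, mirroring the 78% reported for the real data.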

  10. Statistical downscaling of the French Mediterranean climate: assessment for present and projection in an anthropogenic scenario

    Directory of Open Access Journals (Sweden)

    C. Lavaysse

    2012-03-01

    Full Text Available The Mediterranean basin is a particularly vulnerable region to climate change, featuring a sharply contrasted climate between the North and South and governed by a semi-enclosed sea with pronounced surrounding topography covering parts of Europe, Africa and Asia. These physiographic specificities help produce mesoscale atmospheric features that can evolve into high-impact weather systems such as heavy precipitation, wind storms, heat waves and droughts. The evolution of these meteorological extremes in the context of global warming is still an open question, partly because of the large uncertainty associated with existing estimates produced by global climate models (GCMs) with coarse horizontal resolution (~200 km). Downscaling climatic information to a local scale is thus needed to improve the prediction of climate extremes and to provide relevant information for vulnerability and adaptation studies. In this study, we investigate wind, temperature and precipitation distributions for the recent past climate and future scenarios at eight meteorological stations in the French Mediterranean region using one statistical downscaling model, referred to as the "Cumulative Distribution Function transform" (CDF-t) approach. A thorough analysis of the uncertainty associated with statistical downscaling and bi-linear interpolation of large-scale wind speed, temperature and rainfall from reanalyses (ERA-40) and three GCM historical simulations has been conducted and quantified in terms of Kolmogorov-Smirnov scores. CDF-t produces more accurate and reliable local wind speed, temperature and rainfall. Generally, the wind speed, temperature and rainfall CDFs obtained with CDF-t are significantly similar to the observed CDFs, even though CDF-t performance may vary from one station to another due to the sensitivity of the driving large-scale fields or to local effects.
CDF-t has then been applied to climate simulations of the 21st century under B1 and A2 scenarios
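CDF-t is, at heart, a quantile-mapping idea: model values are carried through a cumulative distribution function onto the observed distribution. A minimal empirical quantile-mapping sketch (a simplified relative of CDF-t, not the authors' exact transform):

```python
import numpy as np

def quantile_map(model_values, model_ref, obs_ref):
    """Empirical quantile mapping: each model value is replaced by the
    observed value sitting at the same empirical quantile."""
    q = np.searchsorted(np.sort(model_ref), model_values) / len(model_ref)
    return np.quantile(obs_ref, np.clip(q, 0.0, 1.0))
```

A model that, say, systematically doubles wind speeds relative to observations would have its outputs mapped back onto the observed scale.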

  11. Statistical assessment of DNA extraction reagent lot variability in real-time quantitative PCR

    Science.gov (United States)

    Bushon, R.N.; Kephart, C.M.; Koltun, G.F.; Francy, D.S.; Schaefer, F. W.; Lindquist, H.D. Alan

    2010-01-01

    Aims: The aim of this study was to evaluate the variability in lots of a DNA extraction kit using real-time PCR assays for Bacillus anthracis, Francisella tularensis and Vibrio cholerae. Methods and Results: Replicate aliquots of three bacteria were processed in duplicate with three different lots of a commercial DNA extraction kit. This experiment was repeated in triplicate. Results showed that cycle threshold values were statistically different among the different lots. Conclusions: Differences in DNA extraction reagent lots were found to be a significant source of variability for qPCR results. Steps should be taken to ensure the quality and consistency of reagents. Minimally, we propose that standard curves should be constructed for each new lot of extraction reagents, so that lot-to-lot variation is accounted for in data interpretation. Significance and Impact of the Study: This study highlights the importance of evaluating variability in DNA extraction procedures, especially when different reagent lots are used. Consideration of this variability in data interpretation should be an integral part of studies investigating environmental samples with unknown concentrations of organisms. © 2010 The Society for Applied Microbiology.
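The recommendation to build a standard curve for each reagent lot amounts to a per-lot linear fit of Ct against log10 concentration. A sketch with hypothetical dilution-series numbers (not data from this study):

```python
import numpy as np

# Hypothetical dilution series processed with one extraction-reagent lot
conc = np.array([1e6, 1e5, 1e4, 1e3, 1e2])       # template copies per reaction
ct = np.array([17.1, 20.5, 23.9, 27.3, 30.7])    # observed cycle thresholds

# Standard curve: Ct vs log10(concentration)
slope, intercept = np.polyfit(np.log10(conc), ct, 1)

# Amplification efficiency; 1.0 means perfect per-cycle doubling
efficiency = 10.0 ** (-1.0 / slope) - 1.0
```

A slope near -3.32 corresponds to 100% efficiency; lot-to-lot shifts in slope or intercept flag exactly the reagent variability the authors describe.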

  12. Statistical Assessment of Shapes and Magnetic Field Orientations in Molecular Clouds through Polarization Observations

    CERN Document Server

    Tassis, K; Hildebrand, R H; Kirby, L; Vaillancourt, J E

    2009-01-01

    We present a novel statistical analysis aimed at deriving the intrinsic shapes and magnetic field orientations of molecular clouds using dust emission and polarization observations by the Hertz polarimeter. Our observables are the aspect ratio of the projected plane-of-the-sky cloud image, and the angle between the mean direction of the plane-of-the-sky component of the magnetic field and the short axis of the cloud image. To overcome projection effects due to the unknown orientation of the line-of-sight, we combine observations from 24 clouds, assuming that line-of-sight orientations are random and all are equally probable. Through a weighted least-squares analysis, we find that the best-fit intrinsic cloud shape describing our sample is an oblate disk with only small degrees of triaxiality. The best-fit intrinsic magnetic field orientation is close to the direction of the shortest cloud axis, with small (~24 deg) deviations toward the long/middle cloud axes. However, due to the small number of observed clou...

  13. Calibration and assessment of Swarm ion drift measurements using a comparison with a statistical convection model

    Science.gov (United States)

    Fiori, R. A. D.; Koustov, A. V.; Boteler, D. H.; Knudsen, D. J.; Burchill, J. K.

    2016-06-01

    The electric field instruments onboard the Swarm satellites make high-resolution measurements of the F-region ion drift. This paper presents an initial investigation of preliminary ion drift data made available by the European Space Agency. Based on data taken during polar cap crossings, we identify large offsets in both the along-track and cross-track components of the measured ion drift. These offsets are removed by zeroing drift values at the low-latitude boundary of the high-latitude convection pattern. This correction is shown to significantly improve agreement between the Swarm ion drift measurements and velocity inferred from a radar-based statistical convection model for periods of quasi-stability in the solar wind and interplanetary magnetic field. Agreement is most pronounced in the cross-track direction (R = 0.60); it improves slightly (R = 0.63) if data are limited to periods with IMF Bz < 0. The corrected Swarm data were shown to properly identify the convection reversal boundary for periods of IMF Bz < 0, in full agreement with previous radar and satellite measurements, making Swarm ion drift measurements a valuable input for ionospheric modeling.

  14. Energy Storage for Power Systems Applications: A Regional Assessment for the Northwest Power Pool (NWPP)

    Energy Technology Data Exchange (ETDEWEB)

    Kintner-Meyer, Michael CW; Balducci, Patrick J.; Jin, Chunlian; Nguyen, Tony B.; Elizondo, Marcelo A.; Viswanathan, Vilayanur V.; Guo, Xinxin; Tuffner, Francis K.

    2010-04-01

    Wind production, which has expanded rapidly in recent years, could be an important element in the future efficient management of the electric power system; however, wind energy generation is uncontrollable and intermittent in nature. Thus, while wind power represents a significant opportunity to the Bonneville Power Administration (BPA), integrating high levels of wind resources into the power system will bring great challenges to generation scheduling and in the provision of ancillary services. This report addresses several key questions in the broader discussion on the integration of renewable energy resources in the Pacific Northwest power grid. More specifically, it addresses the following questions: a) how much total reserve or balancing requirements are necessary to accommodate the simulated expansion of intermittent renewable energy resources during the 2019 time horizon, and b) what are the most cost effective technological solutions for meeting load balancing requirements in the Northwest Power Pool (NWPP).

  15. Assessment of uncertainty in full core reactor physics calculations using statistical methods

    Energy Technology Data Exchange (ETDEWEB)

    McEwan, C., E-mail: mcewac2@mcmaster.ca [McMaster Univ., Hamilton, Ontario (Canada)

    2012-07-01

    The best estimate method of safety analysis involves choosing a realistic set of input parameters for a proposed safety case and evaluating the uncertainty in the results. Determining the uncertainty in code outputs remains a challenge and is the subject of a benchmarking exercise proposed by the Organization for Economic Cooperation and Development. The work proposed in this paper will contribute to this benchmark by assessing the uncertainty in a depletion calculation of the final nuclide concentrations for an experiment performed in the Fukushima-2 reactor. This will be done using lattice transport code DRAGON and a tool known as DINOSAUR. (author)

  16. Hydrochemical assessment of Semarang area using multivariate statistics: A sample based dataset

    OpenAIRE

    Irawan, Dasapta Erwin; Putranto, Thomas Triadi

    2016-01-01

    The following paper describes in brief the data set related to our project "Hydrochemical assessment of Semarang Groundwater Quality". All 58 samples were taken in 1992, 1993, 2003, 2006, and 2007 using well point data from several reports from the Ministry of Energy and Mineral Resources and independent consultants. We provided 20 parameters for each sample (sample id, coord X, coord Y, well depth, water level, water elevation, TDS, pH, EC, K, Ca, Na, Mg, Cl, SO4, HCO3, ye...

  17. Safety assessment and verification for nuclear power plants. Safety guide

    International Nuclear Information System (INIS)

    This Safety Guide was prepared under the IAEA programme for safety standards for nuclear power plants. The present publication is a revision of the IAEA Safety Guide on Management of Nuclear Power Plants for Safe Operation issued in 1984. It supplements Section 2 of the Safety Requirements publication on Safety of Nuclear Power Plants: Operation. Nuclear power technology is different from the customary technology of power generation from fossil fuel and by hydroelectric means. One major difference between the management of nuclear power plants and that of conventional generating plants is the emphasis that should be placed on nuclear safety, quality assurance, the management of radioactive waste and radiological protection, and the accompanying national regulatory requirements. This Safety Guide highlights the important elements of effective management in relation to these aspects of safety. The attention to be paid to safety requires that the management recognize that personnel involved in the nuclear power programme should understand, respond effectively to, and continuously search for ways to enhance safety in the light of any additional requirements socially and legally demanded of nuclear energy. This will help to ensure that safety policies that result in the safe operation of nuclear power plants are implemented and that margins of safety are always maintained. The structure of the organization, management standards and administrative controls should be such that there is a high degree of assurance that safety policies and decisions are implemented, safety is continuously enhanced and a strong safety culture is promoted and supported. The objective of this publication is to guide Member States in setting up an operating organization which facilitates the safe operation of nuclear power plants to a high level internationally. The second objective is to provide guidance on the most important organizational elements in order to contribute to a strong safety

  18. Determining the Suitability of Two Different Statistical Techniques in Shallow Landslide (Debris Flow) Initiation Susceptibility Assessment in the Western Ghats

    Directory of Open Access Journals (Sweden)

    M. V. Ninu Krishnan

    2015-01-01

    Full Text Available In the present study, the Information Value (InfoVal) and Multiple Logistic Regression (MLR) methods, based on bivariate and multivariate statistical analysis respectively, have been applied to shallow landslide initiation susceptibility assessment in a selected subwatershed in the Western Ghats, Kerala, India, to determine the suitability of geographical information system (GIS)-assisted statistical landslide susceptibility assessment methods in data-constrained regions. The landslide-conditioning terrain variables considered in the analysis are geomorphology, land use/land cover, soil thickness, slope, aspect, relative relief, plan curvature, profile curvature, drainage density, distance from drainages, lineament density and distance from lineaments. Landslide Susceptibility Index (LSI) maps were produced by integrating the weighted themes and divided into five landslide susceptibility zones (LSZ) by correlating the LSI with general terrain conditions. The predictive performances of the models were evaluated through success and prediction rate curves. The areas under the success rate curves (AUC) for the InfoVal- and MLR-generated susceptibility maps are 84.11% and 68.65%, respectively. The prediction rate curves show good to moderate correlation between the distribution of the validation group of landslides and the LSZ maps, with AUC values of 0.648 and 0.826 for the MLR- and InfoVal-produced LSZ maps respectively. Judged by quantitative prediction accuracy, the LSZ map produced by the InfoVal technique shows higher accuracy (82.60%) than the MLR model, is more realistic when checked in the field, and is therefore considered the model best suited to assessing landslide susceptibility in areas similar to the study area. The LSZ map produced for the area can be utilised for regional planning and assessment, by incorporating the generalised rainfall conditions in the area.
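The InfoVal weight of a terrain-factor class is the log ratio of the landslide density inside the class to the overall density; summing the class weights across all factors at each cell yields the LSI. A minimal sketch (variable names are illustrative):

```python
import math

def info_value(class_cells, slide_cells_in_class, total_cells, total_slide_cells):
    """Information Value of one class of a conditioning factor:
    log of landslide density in the class relative to the overall density.
    Positive values mark classes more failure-prone than average."""
    class_density = slide_cells_in_class / class_cells
    overall_density = total_slide_cells / total_cells
    return math.log(class_density / overall_density)
```

For example, a slope class holding twice the basin-wide landslide density gets a weight of ln 2, while a class at exactly the average density gets zero.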

  19. Assessment of wind power potential at Hawksbay, Karachi Sindh, Pakistan

    Directory of Open Access Journals (Sweden)

    Shahnawaz Farhan Khahro

    2013-07-01

    Full Text Available Pakistan is currently facing a serious energy crisis. The government aims to utilize the immense potential of renewable energy sources such as solar and wind, in addition to intensifying conventional energy sources, to overcome the acute shortage of energy. Wind energy is the fastest-developing energy source worldwide. The aim of this paper is to explore and estimate the wind power potential of Hawksbay, Karachi, one of the locations in the southern part of Pakistan. Wind speed data (in metres per second) from April 2009 to April 2011 at four different heights were measured. Wind power densities, the frequency distribution and the Weibull distribution of wind speed are calculated in this study. This study also presents the analysis and comparison of five numerical methods for determining the Weibull scale and shape parameters from the available wind data. The wind power expected to be generated by a commercial wind turbine is also estimated. The yearly mean wind speed at Hawksbay, Karachi is 5.9 m/s, with a power density of 197 W/m2 at 80 m height and high power density from April to August. The estimated cost per kWh is US$0.0345. The site may therefore be considered suitable for wind turbine applications.
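Given fitted Weibull shape and scale parameters, the mean wind speed and wind power density follow from gamma-function moments. A sketch with an assumed Rayleigh-like shape (k = 2) scaled to roughly the site's 5.9 m/s mean; these parameter values are illustrative, not the fitted Hawksbay values:

```python
import math

def weibull_wind_stats(k, c, rho=1.225):
    """Mean wind speed (m/s) and wind power density (W/m^2) implied by
    Weibull shape k and scale c (m/s) at air density rho (kg/m^3)."""
    mean_speed = c * math.gamma(1.0 + 1.0 / k)
    power_density = 0.5 * rho * c ** 3 * math.gamma(1.0 + 3.0 / k)
    return mean_speed, power_density

# Illustrative Rayleigh-like site (k = 2) scaled to a ~5.9 m/s mean
mean_v, wpd = weibull_wind_stats(2.0, 6.66)
```

Note that the power density depends on the third moment of the distribution, so two sites with the same mean speed but different shape parameters can have quite different power densities.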

  20. Development and Evaluation of the Diagnostic Power for a Computer-Based Two-Tier Assessment

    Science.gov (United States)

    Lin, Jing-Wen

    2016-01-01

    This study adopted a quasi-experimental design with follow-up interview to develop a computer-based two-tier assessment (CBA) regarding the science topic of electric circuits and to evaluate the diagnostic power of the assessment. Three assessment formats (i.e., paper-and-pencil, static computer-based, and dynamic computer-based tests) using…

  1. Perinatal Health Statistics as the Basis for Perinatal Quality Assessment in Croatia

    Directory of Open Access Journals (Sweden)

    Urelija Rodin

    2015-01-01

    Full Text Available Context. Perinatal mortality indicators are considered the most important measures of perinatal outcome. The reliability of these indicators depends on the reporting and recording of births and deaths. Many publications focus on the underreporting and misclassification of perinatal deaths, which prevent proper international comparisons. Objective. Description of the key indicators for perinatal health care quality assessment in Croatia. Methods. Retrospective review of reports from all maternities from 2001 to 2014. Results. According to the reporting criterion of birth weight ≥500 g, perinatal mortality (PNM) was reduced by 31%, fetal mortality (FM) by 32%, and early neonatal mortality (ENM) by 29%. According to the reporting criterion of ≥1000 g, PNM was reduced by 43%, FM by 36%, and ENM by 54%. PNM at ≥22 weeks' (wks) gestational age (GA) was reduced by 28%, FM by 30%, and ENM by 26%. The proportion of FM at 32–36 wks GA and at term was the highest among all GA subgroups, as opposed to ENM, whose proportion was highest at 22–27 wks GA. Over the period, the maternal mortality ratio varied from 2.4 to 14.3 per 100,000 live births. The number of process indicators has increased by more than half since 2001, and caesarean deliveries rose from 11.9% in 2001 to 19.6% in 2014. Conclusions. Comprehensive perinatal health monitoring represents the basis for perinatal quality assessment.
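The indicators above follow standard definitions; for example, the perinatal mortality rate is usually expressed per 1,000 total births. A sketch with hypothetical counts (the reporting criterion in force, ≥500 g or ≥1000 g, determines which deaths enter the counts):

```python
def perinatal_mortality_rate(fetal_deaths, early_neonatal_deaths, live_births):
    """Perinatal deaths per 1,000 total births, where total births are
    live births plus fetal deaths meeting the reporting criterion."""
    total_births = live_births + fetal_deaths
    return 1000.0 * (fetal_deaths + early_neonatal_deaths) / total_births
```

Because the denominator as well as the numerator changes with the reporting criterion, the same birth cohort yields different rates under the ≥500 g and ≥1000 g definitions, which is why the abstract reports each criterion separately.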

  2. Radiological Assessment for the Removal of Legacy BPA Power Lines that Cross the Hanford Site

    Energy Technology Data Exchange (ETDEWEB)

    Millsap, William J.; Brush, Daniel J.

    2013-11-13

    This paper discusses some radiological field monitoring and assessment methods used to assess the components of an old electrical power transmission line that ran across the Hanford Site between the production reactors area (100 Area) and the chemical processing area (200 Area). This task was complicated by the presence of radon daughters -- both beta and alpha emitters -- residing on the surfaces, particularly the surfaces of weathered metals and metals that had been electrically charged. In many cases, these activities were high compared to the DOE Surface Contamination Guidelines, which were used as guides for the assessment. The methods included the use of the Toulmin model of argument, represented using Toulmin diagrams, to represent the combined force of several strands of evidence, rather than a single measurement of activity, in demonstrating beyond a reasonable doubt that no or very little Hanford activity was present and mixed with the natural activity. A number of forms of evidence were used: the overall chance of Hanford contamination; measurements of removable activity, beta and alpha; 1-minute scaler counts of total surface activity, beta and alpha, using "background markers"; beta-to-alpha activity ratios; measured contamination on nearby components; NaI gamma spectral measurements to compare uncontaminated and potentially contaminated spectra, as well as measurements for the sentinel radionuclides Am-241 and Cs-137 on conducting wire; comparative statistical analyses; and in-situ measurements of alpha spectra on conducting wire showing that the alpha activity was natural Po-210 and comparing uncontaminated and potentially contaminated spectra.

  3. Radiological Assessment for the Removal of Legacy BPA Power Lines that Cross the Hanford Site

    International Nuclear Information System (INIS)

    This paper discusses some radiological field monitoring and assessment methods used to assess the components of an old electrical power transmission line that ran across the Hanford Site between the production reactors area (100 Area) and the chemical processing area (200 Area). This task was complicated by the presence of radon daughters -- both beta and alpha emitters -- residing on the surfaces, particularly the surfaces of weathered metals and metals that had been electrically charged. In many cases, these activities were high compared to the DOE Surface Contamination Guidelines, which were used as guides for the assessment. The methods included the use of the Toulmin model of argument, represented using Toulmin diagrams, to capture the combined force of several strands of evidence, rather than a single measurement of activity, to demonstrate beyond a reasonable doubt that no or very little Hanford activity was present and mixed with the natural activity. A number of forms of evidence were used: the overall chance of Hanford contamination; measurements of removable activity, beta and alpha; 1-minute scaler counts of total surface activity, beta and alpha, using 'background markers'; the beta activity to alpha activity ratios; measured contamination on nearby components; NaI gamma spectral measurements to compare uncontaminated and potentially-contaminated spectra, as well as measurements for the sentinel radionuclides Am-241 and Cs-137 on conducting wire; comparative statistical analyses; and in-situ measurements of alpha spectra on conducting wire showing that the alpha activity was natural Po-210, as well as to compare uncontaminated and potentially-contaminated spectra.

  4. Statistical Bias Correction scheme for climate change impact assessment at a basin scale

    Science.gov (United States)

    Nyunt, C. T.

    2013-12-01

    Global climate models (GCMs) are the primary tool for understanding how the global climate may change in the future. Compared to ground data, GCM precipitation is characterized by underestimation of heavy precipitation, frequency errors with too many low-intensity drizzle days, and failure to capture inter-seasonal changes. This study focuses on basin-scale climate change impact assessment, and we propose a multi-model (GCM) selection method together with a statistical bias correction method that addresses the major GCM deficiencies relevant to impact studies at the basin level. The proposed method was tested for applicability in river basins under different climates: a semiarid region in Tunisia, a tropical monsoon climate in the Philippines, and a temperate humid region in Japan. It performed well enough for basin-scale climate change impact studies and reproduces point-scale and basin-scale precipitation climatology very well in the historical simulation. We found that, compared to in-situ station data in Japan, the GCM simulations dissipate baiu activity earlier than observed; in that case, the proposed bias correction was performed for each season to reduce the GCM bias. The proposed bias correction method is being tested in further river basins around the world to check its applicability, and a web interface is under development as a handy and efficient tool for end users in different parts of the world.
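
    The core of such a statistical bias correction can be illustrated with empirical quantile mapping, one common technique for removing distributional GCM biases (a minimal sketch on synthetic data; the scheme described in the abstract also treats drizzle-day frequency and seasonality separately):

```python
import numpy as np

def quantile_map(gcm_hist, obs, gcm_values):
    """Map each GCM value through its rank in the historical GCM run
    onto the same quantile of the observed distribution."""
    ranks = np.searchsorted(np.sort(gcm_hist), gcm_values) / len(gcm_hist)
    return np.quantile(obs, np.clip(ranks, 0.0, 1.0))

rng = np.random.default_rng(0)
obs = rng.gamma(shape=2.0, scale=5.0, size=5000)   # "observed" daily rainfall
gcm_hist = 0.6 * obs + 1.0                         # synthetically biased model run
corrected = quantile_map(gcm_hist, obs, gcm_hist)
# Correcting the historical run should recover the observed statistics
print(round(float(abs(corrected.mean() - obs.mean())), 2))
```

    Future-period GCM values would be passed as `gcm_values` while the mapping itself stays calibrated on the historical pair.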

  5. Statistical assessment of fire safety in multi-residential buildings in Slovenia

    Directory of Open Access Journals (Sweden)

    Domen Kušar

    2009-01-01

    Full Text Available Nearly a third of residential units in Slovenia are located in multi-residential buildings. The majority of such buildings were built after WW2, when the need for suitable accommodation was at its peak, using the construction possibilities and requirements of the time. Every year there are over 200 fires in these buildings, resulting in fatalities and vast material damage. Thanks to great efforts over past centuries, aimed mainly at replacing combustible construction materials with non-combustible ones, and to advancements in fire service equipment and techniques, the number of fires and their scope have decreased significantly, though fires have not been eliminated entirely. New and greater advances in the fire safety of multi-residential buildings became apparent within the last few years, when stricter regulations on the construction of such buildings came into force. Developments in science and within the industry itself brought about several new solutions for improving the situation in this field, as confirmed by experiences from abroad. Unfortunately, in Slovenia the implementation of safety principles still depends mainly on occupants’ perception and financial means, and certain implementation procedures are much more complicated due to new property ownership. With the aid of statistical results from the 2002 Census and contemporary fire safety requirements, this article attempts to show the present-day situation at both the state and municipality level, and proposes solutions to improve it. The authors established that not a single older multi-residential building complies with modern requirements. Fortunately, the situation is improved by the fact that most buildings in Slovenia are built from non-combustible materials (concrete, brick), which limit the spread of fire.

  6. Economic assessment of polymer concrete usage in geothermal power plants

    Energy Technology Data Exchange (ETDEWEB)

    1977-11-01

    Results are reported of a study established to review the conceptual designs of the Heber and Niland, California 50 MWe geothermal power plants and to identify areas where non-metallic materials, such as polymer concrete, can be technically and economically employed. Emphasis was directed toward determining potential economic advantages and resulting improvements in plant availability. It is estimated that use of polymer concrete in the Heber plant will effect a savings of 6.18 mills per kWh in the cost of power delivered to the network, a savings of 9.7%. A similar savings should be effected in the Niland plant.

  7. Hazard Identification, Risk Assessment and Risk Control (HIRARC) Accidents at Power Plant

    Directory of Open Access Journals (Sweden)

    Ahmad Asmalia Che

    2016-01-01

    Full Text Available Power plants have a reputation of being among the most hazardous workplace environments. Workers in a power plant face many safety risks due to the nature of the job. Although power plants are safer nowadays, since the industry has urged employers to improve employee safety, employees still encounter many hazards and thus accidents at the workplace. The aim of the present study is to investigate work-related accidents at power plants based on the HIRARC (Hazard Identification, Risk Assessment and Risk Control) process. The data were collected at two coal-fired power plants located in Malaysia. The study identified hazards and assessed risks related to accidents that occurred at the power plants, and suggests possible control measures and corrective actions that power plants can use to reduce or eliminate those risks and prevent accidents.
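
    The risk assessment step of HIRARC is commonly a likelihood-severity product banded into action levels. A minimal sketch with hypothetical 1-5 scales and banding thresholds (not the values used in the paper):

```python
# Hypothetical 1-5 scales and banding thresholds for illustration
def risk_rating(likelihood: int, severity: int) -> tuple:
    score = likelihood * severity
    if score >= 15:
        band = "high: stop work, apply immediate control measures"
    elif score >= 5:
        band = "medium: plan corrective action"
    else:
        band = "low: monitor"
    return score, band

hazards = [("unguarded conveyor", 4, 5),
           ("hot surface contact", 3, 3),
           ("minor slip on walkway", 2, 2)]
for name, likelihood, severity in hazards:
    score, band = risk_rating(likelihood, severity)
    print(f"{name}: {score} -> {band}")
```

    The banding drives the "risk control" step: high-band hazards get engineering controls first, low-band hazards are tracked.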

  8. Revised Environmental Assessment for the Sacramento Area Office Western Area Power Administration, 1994 Power Marketing Plan

    International Nuclear Information System (INIS)

    This document presents information on power marketing; expiring contracts; economic methods and assumptions; detailed power supply cost data; guidelines and acceptance criteria for conservation and renewable energy projects; hourly flow impact graphs; differences in hydro dispatch; generation data; flow data; fishery resources of the Sacramento River; and water quality.

  9. Assessment of landslide risk using gis and statistical methods in kysuce region

    Directory of Open Access Journals (Sweden)

    Barančoková Mária

    2014-03-01

    Full Text Available Landslide susceptibility was assessed based on multivariate analysis. The input parameters were lithology, land use, slope inclination and average annual precipitation. These parameters were evaluated as independent variables, and the existing landslides as dependent variables. The individual input parameters were reclassified and spatially adjusted. Spatial analysis resulted in 15,988 combinations of input parameters, each representing a homogeneous condition unit (HCU). Based on the landslide density within individual units, the HCU polygons were classified according to landslide risk as stable, conditionally stable or unstable (the last subdivided into low, medium and high landslide risk). A total of 2,002 HCUs were affected by landslides, and the remaining 13,986 were not. The total HCU area affected by landslides is about 156.92 km2 (20.1%). Stable areas covered 623.01 km2 (79.8%), of which conditionally stable areas accounted for 228.77 km2 (29.33% of the total area). Unstable areas were divided into three levels of landslide risk: low risk on 111.19 km2 (14.3%), medium risk on 29.7 km2 (3.8%) and high risk on 16.01 km2 (2%). Since the Zlín Formation lithological unit covers approximately one-third of the study area, it also influences the overall landslide risk assessment; this formation covers the largest area within all landslide risk classes as well as within the conditionally stable areas. The most frequent slope class was in the range of 14-19. The higher susceptibility of the Zlín Formation to landslides is caused mainly by the different geomorphological values of its claystone and sandstone sequences; a higher share of claystone results in higher susceptibility to exogenous degradation processes.
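
    The classification of homogeneous condition units by landslide density can be sketched as follows (the density thresholds are illustrative; the abstract does not give the exact cut-offs used):

```python
# Illustrative density thresholds; the abstract does not give exact cut-offs
def classify_hcu(landslide_area_km2: float, hcu_area_km2: float) -> str:
    """Class of a homogeneous condition unit from its landslide density."""
    density = landslide_area_km2 / hcu_area_km2
    if density == 0.0:
        return "stable"
    if density < 0.05:
        return "unstable: low risk"
    if density < 0.20:
        return "unstable: medium risk"
    return "unstable: high risk"

units = [("HCU-001", 0.00, 2.5), ("HCU-002", 0.08, 3.1), ("HCU-003", 1.10, 4.0)]
for name, slide_km2, area_km2 in units:
    print(name, classify_hcu(slide_km2, area_km2))
```

    In the actual workflow each HCU polygon comes from overlaying the reclassified lithology, land use, slope and precipitation layers in a GIS; the density rule above is only the final labelling step.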

  10. Quantitative assessments of burn degree by high-frequency ultrasonic backscattering and statistical model

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Yi-Hsun; Wang, Shyh-Hau [Department of Computer Science and Information Engineering, and Institute of Medical Informatics, National Cheng Kung University, No 1, University Road, Tainan City 70101, Taiwan (China); Huang, Chih-Chung, E-mail: shyhhau@mail.ncku.edu.tw [Department of Electrical Engineering, Fu Jen Catholic University, 510, Chung Cheng Rd, Hsin Chuang, Taipei County 24205, Taiwan (China)

    2011-02-07

    An accurate and quantitative modality to assess burn degree is crucial for determining further treatments to be properly applied to burn injury patients. Ultrasound at frequencies higher than 20 MHz has been applied to dermatological diagnosis due to its high resolution and noninvasive capability. Yet a substantial means of sensitively and quantitatively correlating burn degree with ultrasonic measurements is still lacking. Thus, a 50 MHz ultrasound system was developed and implemented to measure ultrasonic signals backscattered from burned skin tissues. Various burn degrees were achieved by placing a 100 °C brass plate onto the dorsal skin of anesthetized rats for durations ranging from 5 to 20 s. The burn degrees were correlated with ultrasonic parameters, including integrated backscatter (IB) and the Nakagami parameter (m) calculated from ultrasonic signals acquired from the burned tissues over a 5 × 1.4 mm (width × depth) area. Results demonstrated that both IB and m decreased exponentially with increasing burn degree. Specifically, an IB of -79.0 ± 2.4 (mean ± standard deviation) dB for normal skin tissues tended to decrease to -94.0 ± 1.3 dB for those burned for 20 s, while the corresponding Nakagami parameters tended to decrease from 0.76 ± 0.08 to 0.45 ± 0.04. The variation of both IB and m was partially associated with changes in the properties of collagen fibers in the burned tissues, verified by tissue histological sections. In particular, the m parameter may be more sensitive for differentiating burned skin because it has a greater rate of change with respect to different burn durations. These ultrasonic parameters, in conjunction with high-frequency B-mode and Nakagami images, could have the potential to assess burn degree quantitatively.
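
    The Nakagami parameter m used here is conventionally estimated from the backscattered envelope by the moment (inverse normalized variance) method, m = E[R²]² / Var(R²). A minimal sketch of that estimator on simulated envelope data (the simulation parameters are illustrative, chosen to match the normal-skin value reported above):

```python
import numpy as np

def nakagami_m(envelope):
    """Moment estimator: m = E[R^2]^2 / Var(R^2) for envelope samples R."""
    r2 = np.asarray(envelope, dtype=float) ** 2
    return r2.mean() ** 2 / r2.var()

# If R^2 ~ Gamma(shape=m, scale=Omega/m), then R is Nakagami(m, Omega)
rng = np.random.default_rng(1)
m_true, omega = 0.76, 1.0                  # m as reported for normal skin
samples = np.sqrt(rng.gamma(shape=m_true, scale=omega / m_true, size=200_000))
print(round(float(nakagami_m(samples)), 2))
```

    Values of m below 1 indicate pre-Rayleigh (more disordered) scattering, which is why a drop from 0.76 toward 0.45 tracks the collagen changes in burned tissue.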

  11. Assessing attitudes towards statistics among medical students: psychometric properties of the Serbian version of the Survey of Attitudes Towards Statistics (SATS).

    Directory of Open Access Journals (Sweden)

    Dejana Stanisavljevic

    Full Text Available BACKGROUND: Medical statistics has become important and relevant for future doctors, enabling them to practice evidence based medicine. Recent studies report that students' attitudes towards statistics play an important role in their statistics achievements. The aim of the study was to test the psychometric properties of the Serbian version of the Survey of Attitudes Towards Statistics (SATS) in order to acquire a valid instrument to measure attitudes inside the Serbian educational context. METHODS: The validation study was performed on a cohort of 417 medical students who were enrolled in an obligatory introductory statistics course. The SATS adaptation was based on an internationally accepted methodology for translation and cultural adaptation. Psychometric properties of the Serbian version of the SATS were analyzed through examination of the factorial structure and internal consistency. RESULTS: Most medical students held positive attitudes towards statistics. The average total SATS score was above neutral (4.3±0.8), and varied from 1.9 to 6.2. Confirmatory factor analysis validated the six-factor structure of the questionnaire (Affect, Cognitive Competence, Value, Difficulty, Interest and Effort). Values for the fit indices TLI (0.940) and CFI (0.961) were above the cut-off of ≥0.90. The RMSEA value of 0.064 (0.051-0.078) was below the suggested value of ≤0.08. Cronbach's alpha of the entire scale was 0.90, indicating scale reliability. In a multivariate regression model, self-rating of ability in mathematics and current grade point average were significantly associated with the total SATS score after adjusting for age and gender. CONCLUSION: The present study provided evidence for the appropriate metric properties of the Serbian version of the SATS. Confirmatory factor analysis validated the six-factor structure of the scale. The SATS might be a reliable and valid instrument for identifying medical students' attitudes towards statistics in the Serbian educational context.

  12. The european flood alert system EFAS – Part 2: Statistical skill assessment of probabilistic and deterministic operational forecasts

    Directory of Open Access Journals (Sweden)

    J. C. Bartholmes

    2009-02-01

    Full Text Available Since 2005 the European Flood Alert System (EFAS) has been producing probabilistic hydrological forecasts in pre-operational mode at the Joint Research Centre (JRC) of the European Commission. EFAS aims at increasing preparedness for floods in trans-national European river basins by providing medium-range deterministic and probabilistic flood forecasting information, from 3 to 10 days in advance, to national hydro-meteorological services.

    This paper is Part 2 of a study presenting the development and skill assessment of EFAS. In Part 1, the scientific approach adopted in the development of the system has been presented, as well as its basic principles and forecast products. In the present article, two years of existing operational EFAS forecasts are statistically assessed and the skill of EFAS forecasts is analysed with several skill scores. The analysis is based on the comparison of threshold exceedances between proxy-observed and forecasted discharges. Skill is assessed both with and without taking into account the persistence of the forecasted signal during consecutive forecasts.

    Skill assessment approaches are mostly adopted from meteorology and the analysis also compares probabilistic and deterministic aspects of EFAS. Furthermore, the utility of different skill scores is discussed and their strengths and shortcomings illustrated. The analysis shows the benefit of incorporating past forecasts in the probability analysis, for medium-range forecasts, which effectively increases the skill of the forecasts.
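
    A typical skill score used in such threshold-exceedance verification is the Brier score, with skill expressed relative to a reference forecast. A minimal sketch on made-up exceedance events (not EFAS data), comparing an ensemble's exceedance probabilities against a deterministic forecast recast as 0/1 probabilities:

```python
import numpy as np

def brier_score(prob_forecasts, outcomes):
    """Mean squared error between forecast probabilities and 0/1 outcomes."""
    p, o = np.asarray(prob_forecasts, float), np.asarray(outcomes, float)
    return float(np.mean((p - o) ** 2))

# Made-up threshold-exceedance events (1 = discharge exceeded the threshold)
obs   = [1, 0, 0, 1, 1, 0, 0, 0, 1, 0]
p_ens = [0.8, 0.2, 0.1, 0.6, 0.9, 0.3, 0.0, 0.1, 0.7, 0.4]   # ensemble fractions
p_det = [1, 0, 0, 0, 1, 1, 0, 0, 1, 0]                       # deterministic as 0/1
bs_ens, bs_det = brier_score(p_ens, obs), brier_score(p_det, obs)
skill = 1.0 - bs_ens / bs_det    # Brier skill score vs the deterministic reference
print(round(bs_ens, 3), round(bs_det, 3), round(skill, 3))
```

    A positive skill value means the probabilistic forecast beats the deterministic reference on these events; incorporating persistence, as the paper does, amounts to conditioning the probabilities on consecutive forecasts before scoring.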

  13. Area Based Approach for Three Phase Power Quality Assessment in Clarke Plane

    Directory of Open Access Journals (Sweden)

    S. CHATTOPADHYAY

    2008-03-01

    Full Text Available This paper presents an area-based approach to electric power quality analysis. Specific reference signals have been defined, and the areas formed by real power system data with the reference signals have been calculated, from which the contributions of the fundamental waveform and of harmonic components have been assessed separately. Active power, reactive power and total harmonic distortion factors have been measured. The Clarke transformation technique has been used for analysis in the three-phase system, which greatly reduces the computational effort. Distortion factors of each individual phase of the three-phase system have also been assessed.
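
    The Clarke transformation that underlies the reduced computation projects the three phase quantities onto the orthogonal α-β(-0) plane, so a balanced system collapses to a single rotating vector. A minimal sketch using the power-invariant form (the abstract does not specify which scaling the authors use):

```python
import numpy as np

# Power-invariant Clarke matrix: maps (a, b, c) to (alpha, beta, zero)
K = np.sqrt(2.0 / 3.0) * np.array([
    [1.0, -0.5, -0.5],
    [0.0, np.sqrt(3) / 2, -np.sqrt(3) / 2],
    [1 / np.sqrt(2), 1 / np.sqrt(2), 1 / np.sqrt(2)],
])

t = np.linspace(0.0, 0.04, 400, endpoint=False)        # two cycles at 50 Hz
theta = 2 * np.pi * 50 * t
abc = np.vstack([np.cos(theta),
                 np.cos(theta - 2 * np.pi / 3),
                 np.cos(theta + 2 * np.pi / 3)])        # balanced three-phase set
alpha, beta, zero = K @ abc
# Balanced system: zero-sequence vanishes, (alpha, beta) traces a circle
print(round(float(np.max(np.abs(zero))), 12))
```

    Harmonics and unbalance distort the α-β trajectory away from a circle, which is what the area-based comparison against reference signals exploits.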

  14. Self-Assessment at Krsko Nuclear Power Plant

    International Nuclear Information System (INIS)

    The self-assessment program at NPP Krsko is based on the plant's effort to identify areas for improvement, as well as strengths, in various processes. The highest-level tool of that program is the inter-disciplinary self-assessment. Extensive experience with the methodology from many peer reviews worldwide, in which NPP Krsko personnel were involved, was an essential contributor to the successful development and implementation of inter-disciplinary self-assessment. Every inter-disciplinary self-assessment, performed by experienced NEK people, results in a highly efficient and constructive action plan. This is achieved through the professional approach and positive attitude of the team leader and members. A typical team includes members from different NEK departments, including their managers. They are experienced in the area being assessed, as well as in cause analysis techniques. People involved in previous internal or inter-disciplinary self-assessments and international peer reviews are an indispensable part of the team, and usually the team leader is one of them. Inter-disciplinary self-assessments are planned well in advance and are approved by the NEK management board. NEK directors are also involved through sponsorship; often, they are counterparts in the interview sessions of the assessment. The methodology for carrying out self-assessment was developed using WANO peer review experience and techniques. Areas for assessment are mostly identified through the corrective action or trending processes, internal self-assessments or performance indicators. Field observations and interviews with workers in the field and their superiors give rise to frequent team meetings. That process is often iterative and results in clear and precise observation reports, which are separately analyzed and finally confirmed by the owner of the process. Based on the analysis described in the observation reports, the team defines areas where generic problems are found. Team members are dedicated to particular areas, usually where they are more educated and

  15. An application and verification of ensemble forecasting on wind power to assess operational risk indicators in power grids

    Energy Technology Data Exchange (ETDEWEB)

    Alessandrini, S.; Ciapessoni, E.; Cirio, D.; Pitto, A.; Sperati, S. [Ricerca sul Sistema Energetico RSE S.p.A., Milan (Italy). Power System Development Dept. and Environment and Sustainable Development Dept.; Pinson, P. [Technical University of Denmark, Lyngby (Denmark). DTU Informatics

    2012-07-01

    Wind energy is one of the so-called non-schedulable renewable sources, i.e. it must be exploited when it is available, otherwise it is lost. In European regulation it has priority of dispatch over conventional generation, to maximize green energy production. However, being variable and uncertain, wind (and solar) generation raises several issues for the security of power grid operation. In particular, Transmission System Operators (TSOs) need forecasts that are as accurate as possible. Nowadays a deterministic approach to wind power forecasting (WPF) can easily be considered insufficient to face the uncertainty associated with wind energy. In order to obtain information about the accuracy of a forecast and a reliable estimation of its uncertainty, probabilistic forecasting is becoming increasingly widespread. In this paper we investigate the performance of the COnsortium for Small-scale MOdelling Limited-area Ensemble Prediction System (COSMO-LEPS). First, the ensemble's properties (i.e. consistency, reliability) are assessed using different verification indices and diagrams calculated on wind power. Then we provide examples of how EPS-based wind power forecasts can be used in power system security analyses. Quantifying the forecast uncertainty allows regulation reserve requirements to be determined more accurately, hence improving security of operation and reducing system costs. In particular, the paper also presents a probabilistic power flow (PPF) technique developed at RSE, aimed at evaluating the impact of wind power forecast accuracy on the probability of security violations in power systems. (orig.)
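
    Consistency of an ensemble such as COSMO-LEPS is often checked with a rank (Talagrand) histogram: if the observation is statistically indistinguishable from the ensemble members, every rank is equally likely and the histogram is flat. A minimal sketch on synthetic data (not COSMO-LEPS output):

```python
import numpy as np

def rank_histogram(ensemble, observations, bins):
    """Count where each observation ranks within its sorted ensemble;
    a flat histogram indicates a statistically consistent ensemble."""
    ranks = [np.searchsorted(np.sort(ens), obs)
             for ens, obs in zip(ensemble, observations)]
    return np.bincount(ranks, minlength=bins)

rng = np.random.default_rng(2)
n_members, n_cases = 15, 20_000
ens = rng.normal(size=(n_cases, n_members))
obs = rng.normal(size=n_cases)     # same distribution as members: consistent
counts = rank_histogram(ens, obs, bins=n_members + 1)
# Each of the 16 rank bins should hold roughly 1/16 of the cases
print(counts.min(), counts.max())
```

    A U-shaped histogram would instead signal an under-dispersive ensemble, the typical failure mode that inflates reserve requirements when forecast uncertainty is taken at face value.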

  16. Assessment of Statistical Methods Used in the Iranian Original Articles on Mammography

    Directory of Open Access Journals (Sweden)

    "A. Azizian

    2005-08-01

    Full Text Available Introduction & Background: One of the most important objectives in studies of diagnostic tests is to determine their validity. Two basic criteria of validity are “sensitivity” and “specificity”, but they depend on the cut points used to determine “positive” and “negative” test results. They are also used to measure the accuracy of a test; however, accuracy depends on the prevalence of disease in the population. The receiver operating characteristic (ROC) curve is a plot of sensitivity versus false positive rate for all possible cut points. Because sensitivity and specificity are independent of disease prevalence, the ROC curve is independent of it too. The area under the curve (AUC) is a good summary measure of a test’s overall accuracy, because it does not depend on the prevalence of disease or on the cut points used to form the curve. ROC analysis is the most accurate method used in mammographic studies. Our experience showed that this method is not commonly chosen in our local journals. This article evaluates the proper use of this statistical method in our local published articles related to mammography. Patients & Methods: We searched the “mammography” keyword in the “IranMedex” database, which contains eighty-one Iranian journals. We found 26 related articles; three of them were case reports and were excluded. The remaining articles were divided into two groups: A, articles which required ROC analysis; B, articles which did not require ROC analysis. We placed 10 articles in group A and evaluated their methods. Results: In group A, 7 articles (70%) had not used ROC analysis and applied only “sensitivity” and “specificity” or “accuracy” for one cut point in their methods. Conclusions: As the results show, our local published articles related to mammography are not frequently using ROC analysis in their methods. As previously mentioned, “sensitivity” and “specificity” are not adequate measures in some studies and must
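
    The ROC analysis the article advocates sweeps all cut points and summarizes accuracy as the area under the resulting curve. A minimal sketch with toy suspicion scores (hypothetical data, not from the reviewed articles):

```python
import numpy as np

def roc_auc(scores, labels):
    """Build the ROC curve over all possible cut points and integrate it."""
    scores, labels = np.asarray(scores, float), np.asarray(labels, int)
    thresholds = np.unique(scores)[::-1]            # high-to-low cut points
    tpr = [0.0] + [np.mean(scores[labels == 1] >= t) for t in thresholds] + [1.0]
    fpr = [0.0] + [np.mean(scores[labels == 0] >= t) for t in thresholds] + [1.0]
    # Trapezoid rule over the (fpr, tpr) points
    return sum(0.5 * (tpr[i] + tpr[i - 1]) * (fpr[i] - fpr[i - 1])
               for i in range(1, len(fpr)))

# Toy data: higher score = more suspicious finding, label 1 = disease present
labels = [1, 1, 1, 1, 0, 0, 0, 0]
scores = [0.9, 0.8, 0.6, 0.4, 0.7, 0.3, 0.2, 0.1]
print(roc_auc(scores, labels))
```

    Unlike a single sensitivity/specificity pair, the AUC is invariant to the chosen cut point and to disease prevalence, which is exactly the property the article argues for.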

  17. Development of a New Safety Culture Assessment Method for Nuclear Power Plants (NPPs) (A study to suggest a new safety culture assessment method in nuclear power plants)

    Energy Technology Data Exchange (ETDEWEB)

    Han, Sang Min; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of)

    2014-08-15

    This study suggests a new safety culture assessment method for nuclear power plants. Criteria from various existing safety culture analysis methods are unified, and reliability analysis methods are applied: the concepts of the most representative methods, Fault Tree Analysis (FTA) and Failure Mode and Effect Analysis (FMEA), are adopted to assess safety culture. Through this application, it is expected that the suggested method will yield convenient and objective results.
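
    Applying the FMEA concept to safety culture factors typically means scoring each failure mode and ranking by Risk Priority Number (RPN = severity × occurrence × detection). A minimal sketch with hypothetical culture-related failure modes and 1-10 scales (not the paper's criteria):

```python
# Hypothetical safety culture failure modes on 1-10 scales
# (severity, occurrence, detection); RPN = S x O x D
failure_modes = [
    ("procedure not followed", 8, 4, 3),
    ("training record missing", 5, 3, 2),
    ("shift handover incomplete", 7, 5, 4),
]
ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for name, s, o, d in ranked:
    print(f"{name}: RPN = {s * o * d}")
```

    The FTA side of the method would instead combine such factors through AND/OR gates to estimate the probability of a top-level safety culture failure.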

  18. Two statistical models for long term seismic hazard assessment in Vrancea, Romania

    International Nuclear Information System (INIS)

    Intermediate-depth earthquakes have occurred frequently in Vrancea, Romania and have caused severe damage. To understand the regularity of earthquake occurrence and to predict future earthquakes, we analyzed M ≥ 7.0 earthquakes during the period 1500-2000 using the ROMPLUS earthquake catalogue. Firstly, we attempted to assess the long-term seismic hazards in Vrancea using a stress-release (SR) model, which casts the elastic rebound theory as a stochastic process. Renewal models were also applied to the same data set, but these did not perform as well as the SR model. The SR model indicates that the probability of an M ≥ 7.0 earthquake occurring in Vrancea in a 5-year period exceeds 40% by the end of this decade. Secondly, we proposed a periodic upward-migration model, in which 1) the first M7 earthquake occurs in a deeper segment of the seismic region at the beginning of each century, 2) the second occurs in a middle segment in the middle of each century, 3) the third occurs in a shallower segment at the end of each century, and 4) this pattern repeats every century. Using AIC, we could demonstrate that this model is better than a uniform Poisson model in time and space. (authors)
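
    The uniform Poisson model used as the benchmark gives the occurrence probability in closed form: with rate λ estimated from the catalogue, P(at least one M ≥ 7.0 event in T years) = 1 − exp(−λT). A minimal sketch with an assumed illustrative event count (not the actual ROMPLUS figures):

```python
import math

# Hypothetical catalogue count for illustration: 14 events of
# M >= 7.0 in 500 years of the Vrancea record
events, years, horizon = 14, 500.0, 5.0
rate = events / years                       # lambda, events per year
p_5yr = 1.0 - math.exp(-rate * horizon)     # P(at least one event in 5 years)
print(round(p_5yr, 3))
```

    Under these assumed counts the time-independent probability is roughly 13%, well below the >40% the SR model yields, which illustrates why the elastic-rebound conditioning matters for hazard statements.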

  19. Integration of HIV in the Human Genome: Which Sites Are Preferential? A Genetic and Statistical Assessment

    Science.gov (United States)

    Gonçalves, Juliana; Moreira, Elsa; Sequeira, Inês J.; Rodrigues, António S.; Rueff, José; Brás, Aldina

    2016-01-01

    Chromosomal fragile sites (FSs) are loci where gaps and breaks may occur and are preferential integration targets for some viruses, for example, Hepatitis B, Epstein-Barr virus, HPV16, HPV18, and MLV vectors. However, the integration of the human immunodeficiency virus (HIV) in Giemsa bands and in FSs is not yet completely clear. This study aimed to assess the integration preferences of HIV in FSs and in Giemsa bands using an in silico study. HIV integration positions from Jurkat cells were used and two nonparametric tests were applied to compare HIV integration in dark versus light bands and in FS versus non-FS (NFSs). The results show that light bands are preferential targets for integration of HIV-1 in Jurkat cells and also that it integrates with equal intensity in FSs and in NFSs. The data indicates that HIV displays different preferences for FSs compared to other viruses. The aim was to develop and apply an approach to predict the conditions and constraints of HIV insertion in the human genome which seems to adequately complement empirical data. PMID:27294106

  20. Damage assessment for wind turbine blades based on a multivariate statistical approach

    Science.gov (United States)

    García, David; Tcherniak, Dmitri; Trendafilova, Irina

    2015-07-01

    This paper presents a vibration-based structural health monitoring methodology for damage assessment of wind turbine blades made of composite laminates. Wind turbine blades are normally manufactured as two composite-laminate half-shells which are glued together. This connection must be carefully controlled due to its high probability of disbonding, which might result in collapse of the whole structure. The delamination between the two parts must be monitored not only for detection but also for localisation and severity determination. This investigation consists of a real-time monitoring methodology based on singular spectrum analysis (SSA) for damage and delamination detection. SSA decomposes the vibratory response into a number of components based on their covariance distribution. These components, known as Principal Components (PCs), contain information about the oscillatory patterns of the vibratory response. The PCs are used to create a new space onto which the data can be projected for better visualization and interpretation. The method is applied here to a wind turbine blade whose free-vibration responses were recorded and processed by the methodology. Damage scenarios of different sizes and locations were introduced on the blade. The results demonstrate clear damage detection and localization for all damage scenarios and sizes.
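
    The SSA decomposition described (embedding, covariance-based decomposition, reconstruction of component series) can be sketched via the trajectory matrix and its SVD; this is a generic SSA implementation on a synthetic signal, not the authors' code:

```python
import numpy as np

def ssa_components(x, window, n_components):
    """Basic SSA: embed x into a trajectory matrix, SVD it, and rebuild
    one time series per leading singular component (diagonal averaging)."""
    n, k = len(x), len(x) - window + 1
    traj = np.column_stack([x[i:i + window] for i in range(k)])
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    series = []
    for j in range(n_components):
        elem = s[j] * np.outer(u[:, j], vt[j])              # rank-1 piece
        # Hankelization: average each anti-diagonal back to one sample
        series.append(np.array([elem[::-1].diagonal(i - window + 1).mean()
                                for i in range(n)]))
    return series

t = np.linspace(0.0, 10.0, 500)
clean = np.sin(2 * np.pi * t)
noisy = clean + 0.1 * np.random.default_rng(3).normal(size=500)
pcs = ssa_components(noisy, window=50, n_components=2)
recon = pcs[0] + pcs[1]                 # leading oscillatory pair
print(round(float(np.corrcoef(recon, clean)[0, 1]), 2))
```

    For monitoring, the statistic of interest is how the distribution of the leading PCs shifts between the healthy baseline and the current response; a localized disbond changes the oscillatory patterns and hence the PC subspace.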

  1. Impacts of distorted fishery statistical data on assessments of three surplus production models

    Institute of Scientific and Technical Information of China (English)

    WANG Yingbin; ZHENG Ji; WANG Zheng

    2011-01-01

    We evaluated the effect of various error sources in fishery harvest/effort data on the maximum sustainable yield (MSY) and corresponding fishing effort (EMSY) using Monte Carlo simulation analyses. A high coefficient of variation (CV) of the catch and effort values biased the estimates of MSY and EMSY; thus, the state of the fisheries resource and its exploitation was overestimated. We compared the effect using three surplus production models: the Hilborn-Waters (H-W), Schnute and Prager models. The estimates generated using the H-W model were significantly affected by the CV, while the Schnute model was least affected by errors in the underlying data. The CV of the catch data had a greater impact on the assessment than the CV of the fishing effort. Similarly, changes in CV had a greater impact on the estimated MSY than on the corresponding estimate of fishing effort (EMSY). We discuss the likely effect of these biases on management efforts and provide suggestions for the improvement of fishery evaluations.
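
    For the logistic (Schaefer-type) surplus production model that underlies such assessments, the reference points being estimated follow directly from the model parameters; a minimal sketch with illustrative parameter values (not estimates from the paper):

```python
# Schaefer model: surplus production P = r*B*(1 - B/K); equilibrium yield
# versus effort Y(E) = q*E*K*(1 - q*E/r) peaks at the reference points below.
def schaefer_reference_points(r, K, q):
    msy = r * K / 4.0          # maximum sustainable yield
    e_msy = r / (2.0 * q)      # fishing effort producing MSY
    return msy, e_msy

# Illustrative parameters: intrinsic growth r, carrying capacity K (tonnes),
# catchability q
msy, e_msy = schaefer_reference_points(r=0.5, K=10_000.0, q=0.001)
print(msy, e_msy)
```

    A Monte Carlo study like the one described perturbs the catch and effort series with a chosen CV, refits r, K and q each time, and inspects the spread and bias of the resulting MSY and EMSY estimates.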

  2. Statistical assessment of seafront and beach water quality of Mumbai, India.

    Science.gov (United States)

    Vijay, Ritesh; Kamble, Swapnil R; Dhage, S S; Sohony, R A; Wate, S R

    2011-01-01

    The water quality of the seafronts and beaches of Mumbai is under pressure and deteriorating due to the discharge of partially treated sewage and wastewater through point and nonpoint sources. The objective of the study was to assess the water quality and to correlate physico-chemical and bacteriological parameters to establish their relationship, association and dependence on each other. The water quality parameters were selected as per the SW II standards specified by the Central Pollution Control Board, India, with nutrient parameters as strong indicators of sewage pollution. Box-and-whisker plots were generated to evaluate the spatio-temporal variation of water quality; they suggest the influence of organic pollution, mostly at Mahim and Dadar, in the form of outliers and extremes. Pearson's correlations were estimated between parameters, and significant correlations were found, indicating the influence of sewage on water quality. The water quality of the beaches and seafronts was found unsafe for recreational purposes. The study suggests that the designated water quality can be achieved by restricting nonpoint sources through improvements in wastewater collection systems, appropriate levels of treatment and proper disposal.

  3. Validating Student Score Inferences with Person-Fit Statistic and Verbal Reports: A Person-Fit Study for Cognitive Diagnostic Assessment

    Science.gov (United States)

    Cui, Ying; Roberts, Mary Roduta

    2013-01-01

    The goal of this study was to investigate the usefulness of person-fit analysis in validating student score inferences in a cognitive diagnostic assessment. In this study, a two-stage procedure was used to evaluate person fit for a diagnostic test in the domain of statistical hypothesis testing. In the first stage, the person-fit statistic, the…

  4. Development and Application of On-line Wind Power Risk Assessment System

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Because of the large-scale integration of wind power, the dynamic characteristics of the power system are subject to many uncertain effects. Traditional on-line security assessment systems, based on deterministic analysis methods, cannot quantitatively estimate the actual operating conditions of the power system because they consider only the most serious credible accidents. Therefore, risk theory is introduced into on-line security assessment, and an on-line risk-based dynamic security assessment system for wind power is designed and implemented based on the integration of multiple data sources. By combining the wind power disturbance probability with the security assessment of the power grid, security indices can be obtained in different aspects. The operating risk index is an expectation of severity, computed by summing, over all outcomes, the product of the outcome probability and its severity. Analysis results are reported to the dispatchers in the on-line environment, while the comprehensive weak links are automatically provided to the power dispatching center. The risk assessment system in operation verifies the reasonableness of the approach.
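    The operating risk index described above, an expectation of severity, reduces to a short computation. The contingency probabilities and severity scores below are hypothetical placeholders, not values from the system described.

```python
def operating_risk(contingencies):
    """Operating risk index: expectation of severity, i.e. the sum over
    all contingency outcomes of P(outcome) * severity(outcome)."""
    return sum(p * severity for p, severity in contingencies)

# Hypothetical contingency set: (probability, severity score).
# A rare, severe outcome can contribute as much risk as a common, mild one.
risk = operating_risk([(0.10, 5.0), (0.02, 50.0), (0.001, 300.0)])
```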

  5. Intraclass reliability for assessing how well Taiwan constrained hospital-provided medical services using statistical process control chart techniques

    Directory of Open Access Journals (Sweden)

    Chien Tsair-Wei

    2012-05-01

    Abstract Background Few studies discuss the indicators used to assess the effect of cost containment in healthcare across hospitals in a single-payer national healthcare system with constrained medical resources. We present the intraclass correlation coefficient (ICC) to assess how well Taiwan constrained hospital-provided medical services in such a system. Methods A custom Excel-VBA routine recording the distances, in standard deviations (SDs), from the central line (the mean over the previous 12 months) of a control chart was used to construct and scale annual medical expenditures sequentially from 2000 to 2009 for 421 hospitals in Taiwan and to generate the ICC. The ICC was then used to evaluate Taiwan's year-based convergent power to keep hospital-provided constrained medical services unchanged. A bubble chart of SDs for a specific month was generated to present the effects of using control charts in a national healthcare system. Results ICCs were generated for Taiwan's year-based convergent power to constrain its medical services from 2000 to 2009. All hospital groups showed a gradually well-controlled supply of services, with ICCs decreasing from 0.772 to 0.415. The bubble chart identified outlier hospitals requiring investigation of possibly excessive reimbursements in a specific time period. Conclusion We recommend using the ICC to annually assess a nation's year-based convergent power to constrain medical services across hospitals. Using sequential control charts to regularly monitor hospital reimbursements is required to achieve financial control in a single-payer nationwide healthcare system.
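    One common form of the ICC, and a plausible reading of the statistic used above, is the one-way random-effects ICC(1,1) computed from between- and within-subject mean squares. The sketch below assumes that reading; the hospital-by-year matrix is hypothetical.

```python
import numpy as np

def icc_oneway(data):
    """One-way random-effects ICC(1,1): rows are subjects (e.g. hospitals),
    columns are repeated measures (e.g. yearly control-chart SD distances)."""
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    grand = data.mean()
    # Between-subject and within-subject mean squares from one-way ANOVA
    msb = k * ((data.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    msw = ((data - data.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Perfectly stable subjects give ICC = 1; noisier rows push it toward 0.
icc = icc_oneway([[1.0, 1.0], [2.0, 2.0], [3.0, 3.0]])
```

Under this reading, the decline from 0.772 to 0.415 reported above means hospitals' yearly values became less distinguishable from one another relative to their own year-to-year variability.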

  6. Tract-based spatial statistics to assess the neuroprotective effect of early erythropoietin on white matter development in preterm infants.

    Science.gov (United States)

    O'Gorman, Ruth L; Bucher, Hans U; Held, Ulrike; Koller, Brigitte M; Hüppi, Petra S; Hagmann, Cornelia F

    2015-02-01

    Despite improved survival, many preterm infants undergo subsequent neurodevelopmental impairment. To date, no neuroprotective therapies have been implemented into clinical practice. Erythropoietin, a haematopoietic cytokine used for treatment of anaemia of prematurity, has been shown to have neuroprotective and neuroregenerative effects on the brain in many experimental studies. The aim of the study was to assess the effect of recombinant human erythropoietin on the microstructural development of the cerebral white matter using tract-based spatial statistics performed at term equivalent age. A randomized, double-blind, placebo-controlled, prospective multicentre study applying recombinant human erythropoietin in the first 42 h after preterm birth, entitled 'Does erythropoietin improve outcome in preterm infant', was conducted in Switzerland (NCT00413946). Preterm infants were given recombinant human erythropoietin (3000 IU) or an equivalent volume of placebo (NaCl 0.9%) intravenously before 3 h of age, at 12-18 h and at 36-42 h after birth. High resolution diffusion tensor imaging was obtained at 3 T in 58 preterm infants with a mean (standard deviation) gestational age at birth of 29.75 (1.44) weeks and at scanning of 41.1 (2.09) weeks. Imaging was performed at a single centre. Voxel-wise statistical analysis of the fractional anisotropy data was carried out using tract-based spatial statistics to test for differences in fractional anisotropy between infants treated with recombinant human erythropoietin and placebo using a general linear model, covarying for the gestational age at birth and the corrected gestational age at the time of the scan. Preterm infants treated with recombinant human erythropoietin demonstrated increased fractional anisotropy in the genu and splenium of the corpus callosum, the anterior and posterior limbs of the internal capsule, and the corticospinal tract bilaterally. Mean fractional anisotropy was significantly higher in preterm

  7. Urban heat island by means of city clusters: a statistical assessment of size influence and seasonality

    Science.gov (United States)

    Zhou, Bin; Rybski, Diego; Kropp, Jürgen P.

    2014-05-01

    In the last decades, influence factors of the Urban Heat Island (UHI) effect have been intensively investigated and further broadened through a variety of studies around the world. Briefly, compared to non-built surroundings, built-up areas of cities differ considerably in albedo, thermal capacity, roughness, etc., which can significantly modify the surface energy budget and make downtown areas of cities hotter than their vicinities. Most previous studies were built upon a limited number of cities and suffered from inconsistency and instability with regard to the urban-rural definition, which hinders inter-comparison between results. To overcome this limitation in the number of considered cities, we perform a systematic study of all cities in Europe to assess the Surface Urban Heat Island (SUHI) intensity by means of land surface temperature data from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor. After defining cities as spatial clusters of urban land cover based on CORINE data, we determine a boundary around each urban cluster of approximately equal area to the cluster area. SUHI intensity is thus defined as the difference between the mean temperature in the cluster and that of the surroundings. We investigate the relationships of the SUHI intensity, respectively, with the cluster size and with the temperature of the surroundings. Our results show that in Europe, the SUHI intensity in summer has a strong correlation with the cluster size, which can be well fitted by an empirical sigmoid model. Furthermore, we find a pronounced seasonality of the SUHI intensity for individual clusters in the form of hysteresis-like curves. Characterizing this shape by means of a Fourier series approximation and subsequent clustering, we identify apparent regional patterns which suggest a climatological basis for the heterogeneity of UHI.
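    The SUHI definition above, the mean land-surface temperature inside the urban cluster minus that of an equal-area surrounding boundary, maps directly onto masked-array arithmetic. The sketch below assumes the cluster and boundary are given as boolean masks over a temperature grid; the toy data are illustrative only.

```python
import numpy as np

def suhi_intensity(lst, cluster_mask, boundary_mask):
    """SUHI intensity: mean land-surface temperature of the urban cluster
    minus the mean temperature of its equal-area surrounding boundary."""
    return lst[cluster_mask].mean() - lst[boundary_mask].mean()

# Toy example: 30 deg C inside the cluster, 27 deg C in the boundary.
lst = np.array([30.0, 30.0, 27.0, 27.0])
cluster = np.array([True, True, False, False])
delta_t = suhi_intensity(lst, cluster, ~cluster)   # 3.0 K
```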

  8. Green power marketing in retail competition: an early assessment

    International Nuclear Information System (INIS)

    With retail competition being introduced throughout the United States, green power marketing offers the promise of customer-driven markets for renewable energy. This paper summarizes early experience with green marketing under full retail competition. We conclude that (1) niche markets exist today among residential and non-residential consumers for green power; (2) green demand may ultimately offer an important strategic market for renewable technologies, but the market is currently rather small and the long-term prospects remain uncertain; (3) the success of green markets will depend critically on the regulatory rules established at the onset of restructuring; and (4) the biomass industry will be forced to better communicate the environmental benefits of its technology in order to play a strong role within the green market. This paper is based on a more detailed NREL Topical Issues Brief, which is available on the Internet. (author)

  9. Assessment of seismic damages in nuclear power plant buildings

    International Nuclear Information System (INIS)

    Performance of nuclear power plant sites, buildings and components is in today's practice continuously evaluated by means of monitoring systems composed of a variety of instruments, allowing records of the most significant behavioral parameters to be gathered by electronic data acquisition equipment. A great emphasis has been devoted in recent years to the development of ''intelligent'' monitoring systems able to perform interpretation of the response of structures and components automatically, only requiring human intervention and sophisticated data processing techniques when degradation of the safety margins is likely to have occurred. Such computerized procedures can be formulated through logic or algorithmic processes and normally are consistently based upon simplified, heuristic behavioral models and probabilistic reasoning schemes. This paper discusses the development of an algorithmic procedure intended for automatic, real-time interpretation of the recorded response of nuclear power plant buildings and foundations during seismic events.

  10. Assessment of Lower Limb Muscle Strength and Power Using Hand-Held and Fixed Dynamometry: A Reliability and Validity Study.

    Directory of Open Access Journals (Sweden)

    Benjamin F Mentiplay

    Hand-held dynamometry (HHD) has never previously been used to examine isometric muscle power. Rate of force development (RFD) is often used for muscle power assessment, however no consensus currently exists on the most appropriate method of calculation. The aim of this study was to examine the reliability of different algorithms for RFD calculation and to examine the intra-rater, inter-rater, and inter-device reliability of HHD, as well as the concurrent validity of HHD for the assessment of isometric lower limb muscle strength and power. 30 healthy young adults (age: 23±5 yrs; male: 15) were assessed in two sessions. Isometric muscle strength and power were measured as peak force and RFD respectively, using two HHDs (Lafayette Model-01165 and Hoggan microFET2) and a criterion-referenced KinCom dynamometer. Statistical analysis of reliability and validity comprised intraclass correlation coefficients (ICC), Pearson correlations, concordance correlations, standard error of measurement, and minimal detectable change. Comparison of RFD methods revealed that a peak 200 ms moving window algorithm provided optimal reliability results. Intra-rater, inter-rater, and inter-device reliability analysis of peak force and RFD revealed mostly good to excellent reliability (coefficients ≥ 0.70) for all muscle groups. Concurrent validity analysis showed moderate to excellent relationships between HHD and fixed dynamometry for the hip and knee (ICCs ≥ 0.70) for both peak force and RFD, with mostly poor to good results shown for the ankle muscles (ICCs = 0.31-0.79). Hand-held dynamometry has good to excellent reliability and validity for most measures of isometric lower limb strength and power in a healthy population, particularly for proximal muscle groups. To aid implementation we have created freely available software to extract these variables from data stored on the Lafayette device. Future research should examine the reliability and validity of these variables in
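    A peak moving-window RFD, the algorithm the study found most reliable, takes the steepest average slope of the force-time curve over any window of the given width. The sketch below is a minimal interpretation of that idea, assuming a uniformly sampled force signal; the sampling rate and ramp data are hypothetical.

```python
import numpy as np

def peak_rfd(force, fs, window_ms=200):
    """Peak rate of force development: the steepest average slope of the
    force-time curve over any sliding window of the given width (N/s)."""
    w = int(round(window_ms / 1000.0 * fs))
    # Force change across each window, divided by the window duration
    slopes = (force[w:] - force[:-w]) / (w / fs)
    return slopes.max()

# A linear 100 N/s ramp sampled at 1 kHz should yield a peak RFD of 100 N/s.
ramp = np.arange(1000) * 0.1
rfd = peak_rfd(ramp, fs=1000)
```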

  11. Assessment of the thorium fuel cycle in power reactors

    Energy Technology Data Exchange (ETDEWEB)

    Kasten, P.R.; Homan, F.J.; Allen, E.J.

    1977-01-01

    A study was conducted at Oak Ridge National Laboratory to evaluate the role of thorium fuel cycles in power reactors. Three thermal reactor systems were considered: Light Water Reactors (LWRs); High-Temperature Gas-Cooled Reactors (HTGRs); and Heavy Water Reactors (HWRs) of the Canadian Deuterium Uranium Reactor (CANDU) type; most of the effort was on these systems. A summary comparing thorium and uranium fuel cycles in Fast Breeder Reactors (FBRs) was also compiled.

  12. Assessment of Microbial Fuel Cell Configurations and Power Densities

    KAUST Repository

    Logan, Bruce E.

    2015-07-30

    Different microbial electrochemical technologies are being developed for many diverse applications, including wastewater treatment, biofuel production, water desalination, remote power sources, and biosensors. Current and energy densities will always be limited relative to batteries and chemical fuel cells, but these technologies have other advantages based on the self-sustaining nature of the microorganisms that can donate or accept electrons from an electrode, the range of fuels that can be used, and versatility in the chemicals that can be produced. The high cost of membranes will likely limit applications of microbial electrochemical technologies that might require a membrane. For microbial fuel cells, which do not need a membrane, questions remain on whether larger-scale systems can produce power densities similar to those obtained in laboratory-scale systems. It is shown here that configuration and fuel (pure chemicals in laboratory media versus actual wastewaters) remain the key factors in power production, rather than the scale of the application. Systems must be scaled up through careful consideration of electrode spacing and packing per unit volume of reactor.

  13. Assessment of occupational radiation protection conditions during power enhancement of the Angra-2 nuclear power plant

    International Nuclear Information System (INIS)

    This paper intends to analyse the occupational radiation protection conditions of the Angra-2 nuclear power plant, from startup until reaching 100% of nominal power. To perform this work, a set of dose-rate measurements was made during the whole process, including beta/gamma and neutron radiation, particulate and iodine monitoring, and surface contamination. These measurements were made inside the three main buildings: the reactor buildings (UJA - reactor core and UJB) and the Reactor Auxiliary Building (UKA). (author)

  14. Dynamic Security Assessment of Danish Power System Based on Decision Trees: Today and Tomorrow

    DEFF Research Database (Denmark)

    Rather, Zakir Hussain; Liu, Leo; Chen, Zhe;

    2013-01-01

    The research work presented in this paper analyzes the impact of wind energy, phasing out of central power plants and cross border power exchange on dynamic security of Danish Power System. Contingency based decision tree (DT) approach is used to assess the dynamic security of present and future...... in DIgSILENT PowerFactory environment and applied to western Danish Power System which is passing through a phase of major transformation. The results have shown that phasing out of central power plants coupled with large scale wind energy integration and more dependence on international ties can have...... Danish Power System. Results from offline time domain simulation for large number of possible operating conditions (OC) and critical contingencies are organized to build up the database, which is then used to predict the security of present and future power system. The mentioned approach is implemented...

  15. Insights into shutdown probabilistic risk assessment for WWER-1000 nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Sabet, Hamid; Hadavi, M.H. [Nuclear Reactor Engineering, Shiraz U., Shiraz (Iran, Islamic Republic of)

    2006-07-01

    Full text of publication follows: Applying probabilistic risk assessment as a powerful tool to nuclear power plants has become an inseparable part of plant safety/risk analysis. Although previous regulations and safety assessments of nuclear power plants traditionally concentrated on full-power operation, the events that can potentially occur during low-power and shutdown modes have made these operational modes increasingly important. In this research, we concentrate on initiating events that may occur during the low-power and shutdown modes of the WWER-1000 and discuss the accident scenarios and safety functions along with the front-line systems. Eventually, event trees for the most important initiating-event groups, together with fault trees, are developed. (authors)

  16. Implications of sustainability assessment for electricity system design: The case of the Ontario Power Authority's integrated power system plan

    International Nuclear Information System (INIS)

    This paper explores the results and implications of an illustrative application of a sustainability assessment framework in the design and evaluation of a major integrated power system plan. The paper examines the integrated power system plan developed by the Ontario Power Authority in 2007. The basic framework rests on a generic set of evaluation criteria reflecting basic requirements for progress towards sustainability that was adopted, reinterpreted and applied by the Authority in support of its proposed plan. In response to evident deficiencies in the Authority's work, the authors and colleagues undertook a re-examination using a more fully elaborated sustainability assessment framework, specified for application to power system planning. The results point to a plan and plan components substantially different from those proposed by the Authority. More generally, the results highlight three advantages of applying such a sustainability assessment framework: comprehensive coverage of key requirements for progress towards sustainability, while ensuring careful attention to the context and concerns of the sector; emphasis on identifying plan options that avoid major trade-offs among the sustainability criteria; and recognition of interactions among the social, ecological, economic and technological realms, favouring options that offer multiple, mutually reinforcing and lasting benefits.

  17. Self-reported gait unsteadiness in mildly impaired neurological patients: an objective assessment through statistical gait analysis

    Directory of Open Access Journals (Sweden)

    Benedetti Maria

    2012-08-01

    Abstract Background Self-reported gait unsteadiness is often a problem in neurological patients without any clinical evidence of ataxia, because it leads to reduced activity and limitations in function. However, in the literature there are only a few papers that address this disorder. The aim of this study is to objectively identify subclinical abnormal gait strategies in these patients. Methods Eleven patients affected by self-reported unsteadiness during gait (4 TBI and 7 MS) and ten healthy subjects underwent gait analysis while walking back and forth along a 15-m corridor. Time-distance parameters, ankle sagittal motion, and muscular activity during gait were acquired by a wearable gait analysis system (Step32, DemItalia, Italy) over a high number of successive strides in the same walk and statistically processed. Both self-selected gait speed and high speed were tested under relatively unconstrained conditions. Non-parametric statistical analysis (Mann-Whitney, Wilcoxon tests) was carried out on the means of the data of the two examined groups. Results The main findings, with data adjusted for velocity of progression, show that increased double support and reduced velocity of progression are the main parameters discriminating patients with self-reported unsteadiness from healthy controls. Muscular intervals of activation showed a significant increase in the activity duration of the Rectus Femoris and Tibialis Anterior in patients with respect to the control group at high speed. Conclusions Patients with a subjective sensation of instability, not clinically documented, walk with altered strategies, especially at high gait speed. This is thought to depend on the mechanisms of postural control and coordination. The gait anomalies detected might explain the symptoms reported by the patients and allow for a more focused treatment design. The wearable gait analysis system used for long distance statistical walking assessment was able to detect
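    The Mann-Whitney test used above compares two independent groups through ranks rather than raw values. The sketch below computes just the U statistic for tie-free data (no tie correction, no p-value); the sample values are hypothetical, not the study's measurements.

```python
import numpy as np

def mann_whitney_u(x, y):
    """Mann-Whitney U statistic for two independent samples.
    Minimal version: assumes no tied values, so no tie correction."""
    combined = np.concatenate([x, y])
    ranks = combined.argsort().argsort() + 1   # ranks 1..n, valid without ties
    r1 = ranks[: len(x)].sum()                 # rank sum of the first sample
    return r1 - len(x) * (len(x) + 1) / 2

# Complete separation of the groups gives the extreme values 0 or n1*n2.
u = mann_whitney_u(np.array([1.2, 1.5, 1.9]), np.array([2.4, 2.8, 3.1]))
```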

  18. Assessing the Discriminating Power of Item and Test Scores in the Linear Factor-Analysis Model

    Science.gov (United States)

    Ferrando, Pere J.

    2012-01-01

    Model-based attempts to rigorously study the broad and imprecise concept of "discriminating power" are scarce, and generally limited to nonlinear models for binary responses. This paper proposes a comprehensive framework for assessing the discriminating power of item and test scores which are analyzed or obtained using Spearman's factor-analytic…

  19. 75 FR 8153 - Nebraska Public Power District; Cooper Nuclear Station Environmental Assessment and Finding of No...

    Science.gov (United States)

    2010-02-23

    ... March 27, 2009 (74 FR 13926). There will be no change to radioactive effluents that affect radiation... impact [Part 73, Power Reactor Security Requirements, 74 FR 13926 (March 27, 2009)]. The NRC staff's... COMMISSION Nebraska Public Power District; Cooper Nuclear Station Environmental Assessment and Finding of...

  20. 75 FR 42790 - Exelon Generation Company, LLC; Clinton Power Station; Environmental Assessment and Finding of No...

    Science.gov (United States)

    2010-07-22

    ... COMMISSION Exelon Generation Company, LLC; Clinton Power Station; Environmental Assessment and Finding of No...-62, issued to Exelon Generation Company, LLC (the licensee), for operation of the Clinton Power... have access to ADAMS or who encounter problems in accessing the documents located in ADAMS...