WorldWideScience

Sample records for assessment statistical power

  1. Assessment and statistics of Brazilian hydroelectric power plants: Dam areas versus installed and firm power

    International Nuclear Information System (INIS)

    The Brazilian relief, predominantly composed of small mountains and plateaus, contributed to the formation of rivers with a large number of falls. With the exception of North-eastern Brazil, the climate of the country is rainy, which helps keep water flows high. These elements are essential to a high hydroelectric potential, contributing to the choice of hydroelectric power plants as the main technology for electricity generation in Brazil. Though this is a renewable source whose resource is free, dams must be established, which generates a high environmental and social impact. The objective of this study is to evaluate the impact caused by these dams through the use of environmental indexes. These indexes are the ratio of the installed power of a hydro power plant to its dam area, and the ratio of its firm power to the same dam area. In this study, the greatest mean values were found in the South, Southeast, and Northeast regions, respectively, and the smallest mean values were found in the North and Mid-West regions, respectively. The greatest mean indexes were also found in dams established in the 1950s. Over the last six decades, the smallest indexes were registered by dams established in the 1980s. These indexes could be used as important instruments for environmental impact assessments, and could enable dams to be established that deplete an ecosystem as little as possible. (author)

  2. Which soil carbon characteristic is the best for assessing management differences? View from statistical power perspective

    Science.gov (United States)

    Ladoni, Moslem; Kravchenko, Sasha

    2014-05-01

    Conservation agricultural management practices have the potential to increase soil organic carbon sequestration. However, due to the typically slow response of soil organic C to management and due to its large spatial variability, many researchers fail to detect statistically significant management effects on soil organic carbon in their studies. One solution that has been commonly applied is to use active fractions of soil organic C for treatment comparisons. Active pools of soil organic C have been shown to respond to management changes faster than total C; however, it is possible that the larger variability associated with these pools makes their use for treatment comparisons more difficult. The objectives of this study are to assess the variability of total C and active C pools and then to use power analysis to investigate the probability of detecting significant differences among the treatments for total C and for different active pools of C. We also explored the benefit of applying additional soil and landscape data as covariates to explain some of the variability and to enhance the statistical power for different pools of C. We collected 66 soil cores from 10 agricultural fields under three different management treatments, namely corn-soybean-wheat rotation systems with 1) conventional chemical inputs, 2) low chemical inputs with cover crops and 3) organic management with cover crops. The cores were analyzed for total organic carbon (TOC) and for two active C pool characteristics, namely particulate organic carbon (POC) and short-term mineralizable carbon (SMC). In addition, for each core we determined the values of potential covariates including soil particle size distribution, bulk density and topographical terrain attributes. Power analysis was conducted using the estimates of variances from the obtained data and a series of hypothesized management effects. The range of considered hypothesized effects consisted of 10-100% increases under low-input, 10
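
    The abstract above describes power analysis based on estimated variances, hypothesized relative management effects, and covariate adjustment. A minimal sketch of that logic is given below, assuming a two-sample t-test comparison, illustrative coefficients of variation for TOC, POC and SMC, and an assumed fraction of variance removed by covariates; none of these numbers come from the study itself.

```python
# Hedged sketch: power to detect a relative treatment increase in a soil-C measure,
# optionally after covariate adjustment. All numeric inputs below are assumptions.
from statsmodels.stats.power import TTestIndPower

def power_for_increase(pct_increase, cv, n_per_treatment, r2_covariate=0.0, alpha=0.05):
    effective_cv = cv * (1.0 - r2_covariate) ** 0.5   # covariates shrink the residual SD
    d = pct_increase / effective_cv                    # Cohen's d implied by a relative increase
    return TTestIndPower().power(effect_size=d, nobs1=n_per_treatment, alpha=alpha)

# Assumed CVs: active pools (POC, SMC) are more variable than total C (TOC).
for measure, cv in [("TOC", 0.15), ("POC", 0.45), ("SMC", 0.60)]:
    no_cov = power_for_increase(0.20, cv, n_per_treatment=22)
    with_cov = power_for_increase(0.20, cv, n_per_treatment=22, r2_covariate=0.4)
    print(f"{measure}: power without covariates {no_cov:.2f}, with covariates {with_cov:.2f}")
```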

  3. Distance matters. Assessing socioeconomic impacts of the Dukovany nuclear power plant in the Czech Republic: Local perceptions and statistical evidence

    Directory of Open Access Journals (Sweden)

    Frantál Bohumil

    2016-03-01

    The effect of geographical distance on the extent of socioeconomic impacts of the Dukovany nuclear power plant in the Czech Republic is assessed by combining two different research approaches. First, we survey how people living in municipalities in the vicinity of the power plant perceive impacts on their personal quality of life. Second, we explore the effects of the power plant on regional development by analysing long-term statistical data about the unemployment rate, the share of workers in the energy sector and overall job opportunities in the respective municipalities. The results indicate that the power plant has had significant positive impacts on surrounding communities both as perceived by residents and as evidenced by the statistical data. The level of impacts is, however, significantly influenced by the spatial and social distances of communities and individuals from the power plant. The perception of positive impacts correlates with geographical proximity to the power plant, while the hypothetical distance where positive effects on the quality of life are no longer perceived was estimated at about 15 km. Positive effects are also more likely to be reported by highly educated, young and middle-aged and economically active persons, whose work is connected to the power plant.

  4. Power generation statistics

    International Nuclear Information System (INIS)

    The frost in February increased the power demand in Finland significantly. The total power consumption in Finland during January-February 2001 was about 4% higher than a year before. In January 2001 the average temperature in Finland was only about -4 deg C, which is nearly 2 degrees higher than in 2000 and about 6 degrees higher than the long-term average. Power demand in January was slightly less than 7.9 TWh, being about 0.5% less than in 2000. The power consumption in Finland during the past 12 months exceeded 79.3 TWh, which is less than 2% higher than during the previous 12 months. In February 2001 the average temperature was -10 deg C, which was about 5 degrees lower than in February 2000. Because of this, the power consumption in February 2001 increased by 5%. Power consumption in February was 7.5 TWh. The maximum hourly output of power plants in Finland was 13310 MW. Power consumption of Finnish households in February 2001 was about 10% higher than in February 2000, and in industry the increase was nearly zero. The utilization rate in the forest industry in February 2001 decreased from the value of February 2000 by 5%, being only about 89%. The power consumption of the past 12 months (Feb. 2000 - Feb. 2001) was 79.6 TWh. Generation of hydroelectric power in Finland during January - February 2001 was 10% higher than a year before. The generation of hydroelectric power in Jan. - Feb. 2001 was nearly 2.7 TWh, corresponding to 17% of the power demand in Finland. The output of hydroelectric power in Finland during the past 12 months was 14.7 TWh. The increase from the previous 12 months was 17%, corresponding to over 18% of the power demand in Finland. Wind power generation in Jan. - Feb. 2001 slightly exceeded 10 GWh, while in 2000 the corresponding output was 20 GWh. The degree of utilization of Finnish nuclear power plants in Jan. - Feb. 2001 was high. The output of these plants was 3.8 TWh, being about 1% less than in Jan. - Feb. 2000. The main cause for the

  5. DISTRIBUTED GRID-CONNECTED PHOTOVOLTAIC POWER SYSTEM EMISSION OFFSET ASSESSMENT: STATISTICAL TEST OF SIMULATED- AND MEASURED-BASED DATA

    Science.gov (United States)

    This study assessed the pollutant emission offset potential of distributed grid-connected photovoltaic (PV) power systems. Computer-simulated performance results were utilized for 211 PV systems located across the U.S. The PV systems' monthly electrical energy outputs were based ...

  6. NONSTRUCTURAL AND STATISTICAL NONPARAMETRIC MARKET POWER TESTS: AN EMPIRICAL INVESTIGATION

    OpenAIRE

    Noelke, Corinna M.; Raper, Kellie Curry

    1999-01-01

    We use Monte Carlo experiments to assess the accuracy of two nonstructural and two statistical nonparametric market power tests. We implement these monopoly and monopsony market power tests using data from ten known market structures. The objective is to determine which test is most able to distinguish between market structures. The statistical nonparametric market power tests appear to be promising.

  7. Assessing statistical significance of periodogram peaks

    OpenAIRE

    Baluev, Roman V.

    2007-01-01

    The least-squares (or Lomb-Scargle) periodogram is a powerful tool which is used routinely in many branches of astronomy to search for periodicities in observational data. The problem of assessing statistical significance of candidate periodicities for different periodograms is considered. Based on results in extreme value theory, improved analytic estimations of false alarm probabilities are given. They include an upper limit to the false alarm probability (or a lower limit to the significance)...

  8. The power of statistical tests using field trial count data of non-target organisms in environmental risk assessment of genetically modified plants

    NARCIS (Netherlands)

    Voet, van der H.; Goedhart, P.W.

    2015-01-01

    Publications on power analyses for field trial count data comparing transgenic and conventional crops have reported widely varying requirements for the replication needed to obtain statistical tests with adequate power. These studies are critically reviewed and complemented with a new simulation study.

  9. Assessing statistical significance of periodogram peaks

    CERN Document Server

    Baluev, Roman V

    2007-01-01

    The least-squares (or Lomb-Scargle) periodogram is a powerful tool which is used routinely in many branches of astronomy to search for periodicities in observational data. The problem of assessing statistical significance of candidate periodicities for different periodograms is considered. Based on results in extreme value theory, improved analytic estimations of false alarm probabilities are given. They include an upper limit to the false alarm probability (or a lower limit to the significance). These estimations are tested numerically in order to establish regions of their practical applicability.

  10. Assessing the statistical significance of periodogram peaks

    Science.gov (United States)

    Baluev, R. V.

    2008-04-01

    The least-squares (or Lomb-Scargle) periodogram is a powerful tool that is routinely used in many branches of astronomy to search for periodicities in observational data. The problem of assessing the statistical significance of candidate periodicities for a number of periodograms is considered. Based on results in extreme value theory, improved analytic estimations of false alarm probabilities are given. These include an upper limit to the false alarm probability (or a lower limit to the significance). The estimations are tested numerically in order to establish regions of their practical applicability.
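
    As a hedged, library-based illustration of the idea summarized above (not the paper's own code), astropy's Lomb-Scargle implementation exposes an analytic false alarm probability computed with the Baluev method; the irregularly sampled sinusoid-plus-noise data below are simulated purely for demonstration.

```python
# Sketch: significance of a Lomb-Scargle periodogram peak via the Baluev
# false-alarm-probability estimate implemented in astropy (assumed available).
import numpy as np
from astropy.timeseries import LombScargle

rng = np.random.default_rng(42)
t = np.sort(rng.uniform(0, 100, 300))                               # irregular sampling times
y = 0.3 * np.sin(2 * np.pi * t / 7.5) + rng.normal(0, 1.0, t.size)  # weak periodic signal + noise

ls = LombScargle(t, y)
frequency, power = ls.autopower()
peak = power.max()
print("peak power:", round(float(peak), 3))
print("false alarm probability (Baluev):", ls.false_alarm_probability(peak, method="baluev"))
```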

  11. Calculating statistical power in Mendelian randomization studies.

    Science.gov (United States)

    Brion, Marie-Jo A; Shakhbazov, Konstantin; Visscher, Peter M

    2013-10-01

    In Mendelian randomization (MR) studies, where genetic variants are used as proxy measures for an exposure trait of interest, obtaining adequate statistical power is frequently a concern due to the small amount of variation in a phenotypic trait that is typically explained by genetic variants. A range of power estimates based on simulations and specific parameters for two-stage least squares (2SLS) MR analyses based on continuous variables has previously been published. However there are presently no specific equations or software tools one can implement for calculating power of a given MR study. Using asymptotic theory, we show that in the case of continuous variables and a single instrument, for example a single-nucleotide polymorphism (SNP) or multiple SNP predictor, statistical power for a fixed sample size is a function of two parameters: the proportion of variation in the exposure variable explained by the genetic predictor and the true causal association between the exposure and outcome variable. We demonstrate that power for 2SLS MR can be derived using the non-centrality parameter (NCP) of the statistical test that is employed to test whether the 2SLS regression coefficient is zero. We show that the previously published power estimates from simulations can be represented theoretically using this NCP-based approach, with similar estimates observed when the simulation-based estimates are compared with our NCP-based approach. General equations for calculating statistical power for 2SLS MR using the NCP are provided in this note, and we implement the calculations in a web-based application. PMID:24159078
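
    A minimal sketch of the NCP-based calculation described above follows. It assumes a single instrument with continuous exposure and outcome both standardized to unit variance, so that the non-centrality parameter reduces to N·R²·β²; this scaling convention is an assumption for illustration, not the authors' exact parameterization or their web tool.

```python
# Hedged sketch: approximate power of the 2SLS MR test of a zero causal effect,
# computed from a noncentral chi-square distribution (assumptions noted above).
from scipy import stats

def mr_2sls_power(n, r2_gx, beta_xy, alpha=0.05):
    """n: sample size; r2_gx: variance in the exposure explained by the instrument;
    beta_xy: true causal effect on the standardized scale."""
    ncp = n * r2_gx * beta_xy ** 2            # non-centrality parameter of the test
    crit = stats.chi2.ppf(1 - alpha, df=1)    # critical value under the null
    return stats.ncx2.sf(crit, df=1, nc=ncp)  # P(reject | true effect)

print(mr_2sls_power(n=5000, r2_gx=0.02, beta_xy=0.2))
```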

  12. Statistics review 11: Assessing risk

    OpenAIRE

    Bewick, Viv; Cheek, Liz; Ball, Jonathan

    2004-01-01

    Relative risk and odds ratio have been introduced in earlier reviews (see Statistics reviews 3, 6 and 8). This review describes the calculation and interpretation of their confidence intervals. The different circumstances in which the use of either the relative risk or odds ratio is appropriate and their relative merits are discussed. A method of measuring the impact of exposure to a risk factor is introduced. Measures of the success of a treatment using data from clinical trials are also con...

  13. Power performance assessment. Final report

    International Nuclear Information System (INIS)

    In the increasingly commercialised wind power marketplace, the lack of precise assessment methods for the output of an investment is becoming a barrier for wider penetration of wind power. Thus, addressing this problem, the overall objectives of the project are to reduce the financial risk in investment in wind power projects by significantly improving the power performance assessment methods. Ultimately, if this objective is successfully met, the project may also result in improved tuning of the individual wind turbines and in optimisation methods for wind farm operation. The immediate, measurable objectives of the project are: To prepare a review of existing contractual aspects of power performance verification procedures of wind farms; to provide information on production sensitivity to specific terrain characteristics and wind turbine parameters by analyses of a larger number of wind farm power performance data available to the proposers; to improve the understanding of the physical parameters connected to power performance in complex environment by comparing real-life wind farm power performance data with 3D computational flow models and 3D-turbulence wind turbine models; to develop the statistical framework including uncertainty analysis for power performance assessment in complex environments; and to propose one or more procedures for power performance evaluation of wind power plants in complex environments to be applied in contractual agreements between purchasers and manufacturers on production warranties. Although the focus in this project is on power performance assessment the possible results will also be of benefit to energy yield forecasting, since the two tasks are strongly related. (au) JOULE III. 66 refs.; In Co-operation Renewable Energy System Ltd. (GB); Centre for Renewable Energy (GR); Aeronautic Research Centre (SE); National Engineering Lab. (GB); Public Power Cooperation (GR)

  14. Availability statistics for thermal power plants

    International Nuclear Information System (INIS)

    Denmark, Finland and Sweden have adopted almost the same methods of recording and calculation of availability data. For a number of years comparable availability and outage data for thermal power have been summarized and published in one report. The purpose of the report now presented for 1989 containing general statistical data is to produce basic information on existing kinds of thermal power in the countries concerned. With this information as a basis additional and more detailed information can be exchanged in direct contacts between bodies in the above mentioned countries according to forms established for that purpose. The report includes fossil steam power, nuclear power and gas turbines. The information is presented in separate diagrams for each country, but for plants burning fossil fuel also in a joint NORDEL statistics with data grouped according to type of fuel used. The grouping of units into classes of capacity has been made in accordance with the classification adopted by UNIPEDE/WEC. Values based on energy have been adopted as basic availability data. The same applies to the preference made in the definitions outlined by UNIPEDE and UNIPEDE/WEC. Some data based on time have been included to make possible comparisons with certain international values and for further illustration of the performance. For values given in the report, the definitions in the NORDEL document ''Concepts of Availability for Thermal Power, September 1977'', have been applied. (author)

  15. Availability statistics for thermal power plants

    International Nuclear Information System (INIS)

    Denmark, Finland and Sweden have adopted almost the same methods of recording and calculation of availability data. For a number of years comparable availability and outage data for thermal power have been summarized and published in one report. The purpose of the report now presented for 1991 containing general statistical data is to produce basic information on existing kinds of thermal power in the countries concerned. With this information as a basis additional and more detailed information can be exchanged in direct contacts between bodies in the above mentioned countries according to forms established for that purpose. The report includes fossil steam power, nuclear power and gas turbines. The information is presented in separate diagrams for each country, but for plants burning fossil fuel also in a joint NORDEL statistics with data grouped according to type of fuel used. The grouping of units into classes of capacity has been made in accordance with the classification adopted by UNIPEDE/WEC. Values based on energy have been adopted as basic availability data. The same applies to the preference made in the definitions outlined by UNIPEDE and UNIPEDE/WEC. Some data based on time have been included to make possible comparisons with certain international values and for further illustration of the performance. For values given in the report, the definitions in the NORDEL document ''Concepts of Availability for Thermal Power'', September 1977, have been applied. (au)

  16. Availability statistics for thermal power plants

    International Nuclear Information System (INIS)

    Denmark, Finland and Sweden have adopted almost the same methods of recording and calculation of availability data. For a number of years comparable availability and outage data for thermal power have been summarized and published in one report. The purpose of the report now presented for 1988 containing general statistical data is to produce basic information on existing kinds of thermal power in the countries concerned. With this information as a basis additional and more detailed information can be exchanged in direct contacts between bodies in the above mentioned countries according to forms established for that purpose. The report includes fossil steam power, nuclear power and gas turbines. The information is presented in separate diagrams for each country, but for plants burning fossil fuel also in a joint NORDEL statistics with data grouped according to type of fuel used. The grouping of units into classes of capacity has been made in accordance with the classification adopted by UNIPEDE/WEC. Values based on energy have been adopted as basic availability data. The same applies to the preference made in the definitions outlined by UNIPEDE and UNIPEDE/WEC. Some data based on time have been included to make possible comparisons with certain international values and for further illustration of the performance. For values given in the report, the definitions in the NORDEL document ''Concepts of Availability for Thermal Power, September 1977'', have been applied. (author)

  17. Power quality assessment

    International Nuclear Information System (INIS)

    Electrical power systems are exposed to different types of power quality disturbance problems. Assessment of power quality is necessary for maintaining accurate operation of sensitive equipment, especially in nuclear installations; it also ensures that unnecessary energy losses in a power system are kept to a minimum, which leads to more profit. With advances in technology, industrial and commercial facilities are growing in many regions. Power quality problems have been a major concern among engineers, particularly in industrial environments with many types of large-scale equipment. Thus, it is useful to investigate and mitigate power quality problems. Assessment of power quality requires the identification of any anomalous behavior on a power system which adversely affects the normal operation of electrical or electronic equipment. The choice of monitoring equipment in a survey is also important to ascertain a solution to these power quality problems. A power quality assessment involves gathering data resources and analyzing the data (with reference to power quality standards); then, if problems exist, mitigation techniques must be recommended. The main objective of the present work is to investigate and mitigate power quality problems in nuclear installations. Normally, electrical power is supplied to the installations via two sources to maintain good reliability. Each source is designed to carry the full load. The assessment of power quality was performed at the nuclear installations for both sources at different operating conditions. The thesis begins with a discussion of power quality definitions and the results of previous studies in power quality monitoring. The assessment determined that one source of electricity had relatively good power quality, although several disturbances exceeded the thresholds. Among them are the fifth harmonic, voltage swell, overvoltage and flicker. While the second

  18. Availability statistics for thermal power plants

    International Nuclear Information System (INIS)

    Denmark, Finland and Sweden have adopted almost the same methods of recording and calculation of availability data. For a number of years comparable availability and outage data for thermal power have been summarized and published in one report. The purpose of the report now presented for 1990 containing general statistical data is to produce basic information on existing kinds of thermal power in the countries concerned. With this information as a basis additional and more detailed information can be exchanged in direct contacts between bodies in the above mentioned countries according to forms established for that purpose. The report includes fossil steam power, nuclear power and gas turbines. The information is presented in separate diagrams for each country, but for plants burning fossil fuel also in a joint NORDEL statistics with data grouped according to type of fuel used. The grouping of units into classes of capacity has been made in accordance with the classification adopted by UNIPEDE/WEC. Values based on energy have been adopted as basic availability data. The same applies to the preference made in the definitions outlined by UNIPEDE and UNIPEDE/WEC. Some data based on time have been included to make possible comparisons with certain international values and for further illustration of the performance. (au)

  19. Power and environmental assessment

    DEFF Research Database (Denmark)

    Cashmore, Matthew Asa; Richardson, Tim

    2013-01-01

    The significance of politics and power dynamics has long been recognised in environmental assessment (EA) research, but there has not been sustained attention to power, either theoretically or empirically. The aim of this special issue is to encourage the EA community to engage more consistently...

  20. Object discrimination reversal as a method to assess cognitive impairment in nonhuman primate enhanced pre- and postnatal developmental (ePPND) studies: statistical power analysis.

    Science.gov (United States)

    Cappon, Gregg D; Bowman, Christopher J; Hurtt, Mark E; Grantham, Lonnie E

    2012-10-01

    An important aspect of the enhanced pre- and postnatal developmental (ePPND) toxicity study in nonhuman primates (NHP) is that it combines in utero and postnatal assessments in a single study. However, it is unclear if NHP ePPND studies are suitable for performing all of the evaluations incorporated into rodent PPND studies. To understand the value of including cognitive assessment in an NHP ePPND toxicity study, we performed a power analysis of object discrimination reversal task data using a modified Wisconsin General Testing Apparatus (ODR-WGTA) from two NHP ePPND studies. ODR-WGTA endpoints evaluated were days to learning and to first reversal, and number of reversals. With α = 0.05 and a one-sided t-test, a sample of seven provided 80% power to detect a 100% increase in all three of the ODR-WGTA endpoints; a sample of 25 provided 80% power to detect a 50% increase. Similar power analyses were performed with data from the Cincinnati Water Maze (CWM) and passive avoidance tests from three rat PPND toxicity studies. Groups of 5 and 15 in the CWM and passive avoidance test, respectively, provided 80% power to detect a 100% change. While the power of the CWM is not far superior to the NHP ODR-WGTA, a clear advantage is the routine use of larger sample sizes: with a group of 20 rats, the CWM provides ~90% power to detect a 50% change. Due to the limitations on the number of animals, the ODR-WGTA may not be suitable for assessing cognitive impairment in NHP ePPND studies. PMID:22930561
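
    The kind of calculation summarized above can be sketched as follows, assuming a one-sided two-sample t-test and an assumed coefficient of variation to translate a percentage increase into a standardized effect size; the actual variance estimates came from the ODR-WGTA data and are not reproduced here.

```python
# Hedged sketch: power to detect a relative increase in an endpoint with a one-sided
# two-sample t-test. The coefficient of variation (cv) is an assumed input.
from statsmodels.stats.power import TTestIndPower

def detection_power(pct_increase, cv, n_per_group, alpha=0.05):
    d = pct_increase / cv   # Cohen's d implied by a relative increase, given the CV
    return TTestIndPower().power(effect_size=d, nobs1=n_per_group,
                                 alpha=alpha, alternative='larger')

print(detection_power(pct_increase=1.0, cv=0.9, n_per_group=7))    # 100% increase, n = 7 per group
print(detection_power(pct_increase=0.5, cv=0.9, n_per_group=25))   # 50% increase, n = 25 per group
```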

  1. Availability statistics for thermal power plants 1992

    International Nuclear Information System (INIS)

    Denmark, Finland and Sweden have adopted almost the same methods of recording and calculation of availability data. For a number of years comparable availability and outage data for thermal power have been summarized and published in one report. The purpose of the report is to produce basic information on existing kinds of thermal power in the countries concerned. With this information as a basis additional and more detailed information can be exchanged in direct contacts between bodies in the above mentioned countries according to forms established for that purpose. The report includes fossil steam power, nuclear power and gas turbines. The information is presented in separate diagrams for each country, but for plants burning fossil fuel also in a joint NORDEL statistics with data grouped according to type of fuel used. The grouping of units into classes of capacity has been made in accordance with the classification adopted by UNIPEDE/WEC. Values based on energy have been adopted as basic availability data. The same applies to the preference made in the definitions outlined by UNIPEDE and UNIPEDE/WEC. Some data based on time have been included to make possible comparisons with certain international values and for further illustration of the performance. For values given in the report, the definitions in the NORDEL document ''Concepts of Availability for Thermal Power'', September 1977, have been applied. (au)

  2. Statistical Performances of Resistive Active Power Splitter

    Science.gov (United States)

    Lalléchère, Sébastien; Ravelo, Blaise; Thakur, Atul

    2016-03-01

    In this paper, the synthesis and sensitivity analysis of an active power splitter (PWS) is proposed. It is based on an active cell composed of a Field Effect Transistor in cascade with a shunted resistor at the input and the output (resistive amplifier topology). The PWS uncertainty versus resistance tolerances is assessed using a stochastic method. Furthermore, with the proposed topology, we can easily control the device gain by varying a resistance. This provides a useful tool to analyse the statistical sensitivity of the system in an uncertain environment.

  3. Evaluating and Reporting Statistical Power in Counseling Research

    Science.gov (United States)

    Balkin, Richard S.; Sheperis, Carl J.

    2011-01-01

    Despite recommendations from the "Publication Manual of the American Psychological Association" (6th ed.) to include information on statistical power when publishing quantitative results, authors seldom include analysis or discussion of statistical power. The rationale for discussing statistical power is addressed, approaches to using "G*Power" to…

  4. Practical Uses of Statistical Power in Business Research Studies.

    Science.gov (United States)

    Markowski, Edward P.; Markowski, Carol A.

    1999-01-01

    Proposes the use of statistical power subsequent to the results of hypothesis testing in business research. Describes how posttest use of power might be integrated into business statistics courses. (SK)

  5. Statistical aspects of fish stock assessment

    DEFF Research Database (Denmark)

    Berg, Casper Willestofte

    Fish stock assessments are conducted for two main purposes: 1) to estimate past and present fish abundances and their commercial exploitation rates, and 2) to predict the consequences of different management strategies in order to ensure a sustainable fishery in the future. This thesis concerns statistical aspects of fish stock assessment, which includes topics such as time series analysis, generalized additive models (GAMs), and non-linear state-space/mixed models capable of handling missing data and a high number of latent states and parameters. The aim is to improve the existing methods for stock assessment by application of state-of-the-art statistical methodology. The main contributions are presented in the form of six research papers. The major part of the thesis deals with age-structured assessment models, which is the most common approach. Conversion from length to age distributions...

  6. PRIS-STATISTICS: Power Reactor Information System Statistical Reports. User's Manual

    International Nuclear Information System (INIS)

    The IAEA developed the Power Reactor Information System (PRIS)-Statistics application to assist PRIS end users with generating statistical reports from PRIS data. Statistical reports provide an overview of the status, specification and performance results of every nuclear power reactor in the world. This user's manual was prepared to facilitate the use of the PRIS-Statistics application and to provide guidelines and detailed information for each report in the application. Statistical reports support analyses of nuclear power development and strategies, and the evaluation of nuclear power plant performance. The PRIS database can be used for comprehensive trend analyses and benchmarking against best performers and industrial standards.

  7. When Mathematics and Statistics Collide in Assessment Tasks

    Science.gov (United States)

    Bargagliotti, Anna; Groth, Randall

    2016-01-01

    Because the disciplines of mathematics and statistics are naturally intertwined, designing assessment questions that disentangle mathematical and statistical reasoning can be challenging. We explore the writing of statistics assessment tasks that take into consideration the potential mathematical reasoning they may inadvertently activate.

  8. Statistical methods for assessment of blend homogeneity

    DEFF Research Database (Denmark)

    Madsen, Camilla

    2002-01-01

    In this thesis the use of various statistical methods to address some of the problems related to assessment of the homogeneity of powder blends in tablet production is discussed. It is not straightforward to assess the homogeneity of a powder blend. The reason is partly that in bulk materials as... shown how to set up parametric acceptance criteria for the batch that give high confidence that future samples will, with a probability larger than a specified value, pass the USP three-class criteria. Properties and robustness of proposed changes to the USP test for content uniformity are investigated...

  9. The Role of Atmospheric Measurements in Wind Power Statistical Models

    Science.gov (United States)

    Wharton, S.; Bulaevskaya, V.; Irons, Z.; Newman, J. F.; Clifton, A.

    2015-12-01

    The simplest wind power generation curves model power only as a function of the wind speed at turbine hub-height. While the latter is an essential predictor of power output, it is widely accepted that wind speed information in other parts of the vertical profile, as well as additional atmospheric variables including atmospheric stability, wind veer, and hub-height turbulence are also important factors. The goal of this work is to determine the gain in predictive ability afforded by adding additional atmospheric measurements to the power prediction model. In particular, we are interested in quantifying any gain in predictive ability afforded by measurements taken from a laser detection and ranging (lidar) instrument, as lidar provides high spatial and temporal resolution measurements of wind speed and direction at 10 or more levels throughout the rotor-disk and at heights well above. Co-located lidar and meteorological tower data as well as SCADA power data from a wind farm in Northern Oklahoma will be used to train a set of statistical models. In practice, most wind farms continue to rely on atmospheric measurements taken from less expensive, in situ instruments mounted on meteorological towers to assess turbine power response to a changing atmospheric environment. Here, we compare a large suite of atmospheric variables derived from tower measurements to those taken from lidar to determine if remote sensing devices add any competitive advantage over tower measurements alone to predict turbine power response.

  10. Asking Sensitive Questions: A Statistical Power Analysis of Randomized Response Models

    Science.gov (United States)

    Ulrich, Rolf; Schroter, Hannes; Striegel, Heiko; Simon, Perikles

    2012-01-01

    This article derives the power curves for a Wald test that can be applied to randomized response models when small prevalence rates must be assessed (e.g., detecting doping behavior among elite athletes). These curves enable the assessment of the statistical power that is associated with each model (e.g., Warner's model, crosswise model, unrelated…
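
    For context, a sketch of a Wald-test power calculation for Warner's randomized response model is given below. The one-sided test against a fixed null prevalence and the normal approximation are simplifications of the kind of derivation described in the abstract, not the authors' exact formulas.

```python
# Hedged sketch: power of a one-sided Wald test under Warner's randomized response design.
# pi0/pi1 are the prevalences under the null and the alternative; p is the design
# probability of receiving the sensitive statement (p != 0.5). All values are illustrative.
from math import sqrt
from scipy.stats import norm

def warner_power(n, p, pi0, pi1, alpha=0.05):
    lam0 = p * pi0 + (1 - p) * (1 - pi0)                  # P("yes") under H0
    lam1 = p * pi1 + (1 - p) * (1 - pi1)                  # P("yes") under H1
    se0 = sqrt(lam0 * (1 - lam0) / n) / abs(2 * p - 1)    # SE of the prevalence estimate under H0
    se1 = sqrt(lam1 * (1 - lam1) / n) / abs(2 * p - 1)    # SE under H1
    z_crit = norm.ppf(1 - alpha)
    return norm.sf((pi0 + z_crit * se0 - pi1) / se1)      # P(reject H0 | H1 true)

print(warner_power(n=2000, p=0.7, pi0=0.0, pi1=0.05))
```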

  11. The Power and Robustness of Maximum LOD Score Statistics

    OpenAIRE

    YOO, Y. J.; MENDELL, N.R.

    2008-01-01

    The maximum LOD score statistic is extremely powerful for gene mapping when calculated using the correct genetic parameter value. When the mode of genetic transmission is unknown, the maximum of the LOD scores obtained using several genetic parameter values is reported. This latter statistic requires a higher critical value than the maximum LOD score statistic calculated from a single genetic parameter value.

  12. Assessment Methods in Statistical Education An International Perspective

    CERN Document Server

    Bidgood, Penelope; Jolliffe, Flavia

    2010-01-01

    This book is a collaboration from leading figures in statistical education and is designed primarily for academic audiences involved in teaching statistics and mathematics. The book is divided into four sections: (1) assessment using real-world problems, (2) assessing statistical thinking, (3) individual assessment, and (4) successful assessment strategies.

  13. Editor’s note: The uncorrupted statistical power

    Directory of Open Access Journals (Sweden)

    Jean Descôteaux

    2007-09-01

    In 1999, Wilkinson and the Task Force on Statistical Inference published a number of recommendations concerning testing-related issues including, most importantly, statistical power. These recommendations are discussed prior to the presentation of the structure and the various articles of this special issue on statistical power. The contents of these articles will most certainly prove quite useful to those wishing to follow the Task Force’s recommendations.

  14. Data management and statistical analysis for environmental assessment

    International Nuclear Information System (INIS)

    Data management and statistical analysis for environmental assessment are important issues at the interface of computer science and statistics. Data collection for environmental decision making can generate large quantities of various types of data. A database/GIS system that was developed is described; it provides efficient data storage as well as visualization tools which may be integrated into the data analysis process. FIMAD is a living database and GIS system. The system has changed and developed over time to meet the needs of the Los Alamos National Laboratory Restoration Program. The system provides a repository for data which may be accessed by different individuals for different purposes. The database structure is driven by the large amount and varied types of data required for environmental assessment. The integration of the database with the GIS system provides the foundation for powerful visualization and analysis capabilities

  15. Statistical tests for power-law cross-correlated processes.

    Science.gov (United States)

    Podobnik, Boris; Jiang, Zhi-Qiang; Zhou, Wei-Xing; Stanley, H Eugene

    2011-12-01

    For stationary time series, the cross-covariance and the cross-correlation as functions of time lag n serve to quantify the similarity of two time series. The latter measure is also used to assess whether the cross-correlations are statistically significant. For nonstationary time series, the analogous measures are detrended cross-correlations analysis (DCCA) and the recently proposed detrended cross-correlation coefficient, ρ(DCCA)(T,n), where T is the total length of the time series and n the window size. For ρ(DCCA)(T,n), we numerically calculated the Cauchy inequality -1 ≤ ρ(DCCA)(T,n) ≤ 1. Here we derive -1 ≤ ρ(DCCA)(T,n) ≤ 1 for a standard variance-covariance approach and for a detrending approach. For overlapping windows, we find the range of ρ(DCCA) within which the cross-correlations become statistically significant. For overlapping windows we numerically determine (and for nonoverlapping windows we derive) that the standard deviation of ρ(DCCA)(T,n) tends with increasing T to 1/T. Using ρ(DCCA)(T,n) we show that the Chinese financial market's tendency to follow the U.S. market is extremely weak. We also propose an additional statistical test that can be used to quantify the existence of cross-correlations between two power-law correlated time series. PMID:22304166

  16. Statistical tests for power-law cross-correlated processes

    Science.gov (United States)

    Podobnik, Boris; Jiang, Zhi-Qiang; Zhou, Wei-Xing; Stanley, H. Eugene

    2011-12-01

    For stationary time series, the cross-covariance and the cross-correlation as functions of time lag n serve to quantify the similarity of two time series. The latter measure is also used to assess whether the cross-correlations are statistically significant. For nonstationary time series, the analogous measures are detrended cross-correlations analysis (DCCA) and the recently proposed detrended cross-correlation coefficient, ρDCCA(T,n), where T is the total length of the time series and n the window size. For ρDCCA(T,n), we numerically calculated the Cauchy inequality -1≤ρDCCA(T,n)≤1. Here we derive -1≤ρDCCA(T,n)≤1 for a standard variance-covariance approach and for a detrending approach. For overlapping windows, we find the range of ρDCCA within which the cross-correlations become statistically significant. For overlapping windows we numerically determine—and for nonoverlapping windows we derive—that the standard deviation of ρDCCA(T,n) tends with increasing T to 1/T. Using ρDCCA(T,n) we show that the Chinese financial market's tendency to follow the U.S. market is extremely weak. We also propose an additional statistical test that can be used to quantify the existence of cross-correlations between two power-law correlated time series.
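
    A minimal, from-the-definition sketch of the detrended cross-correlation coefficient discussed above is shown below; non-overlapping windows and first-order (linear) detrending are assumptions made for brevity and do not reproduce the authors' exact estimator or significance test.

```python
# Hedged sketch of rho_DCCA(T, n): detrended cross-correlation coefficient of two series
# for window size n, using non-overlapping windows and local linear detrending.
import numpy as np

def rho_dcca(x, y, n):
    x, y = np.asarray(x, float), np.asarray(y, float)
    X = np.cumsum(x - x.mean())                 # integrated profiles
    Y = np.cumsum(y - y.mean())
    t = np.arange(n)
    f_xy, f_xx, f_yy = [], [], []
    for start in range(0, len(X) - n + 1, n):   # non-overlapping windows
        xw, yw = X[start:start + n], Y[start:start + n]
        rx = xw - np.polyval(np.polyfit(t, xw, 1), t)   # remove local linear trend
        ry = yw - np.polyval(np.polyfit(t, yw, 1), t)
        f_xy.append(np.mean(rx * ry))
        f_xx.append(np.mean(rx * rx))
        f_yy.append(np.mean(ry * ry))
    return np.mean(f_xy) / np.sqrt(np.mean(f_xx) * np.mean(f_yy))

rng = np.random.default_rng(0)
a = rng.standard_normal(4096)
b = 0.5 * a + rng.standard_normal(4096)         # two correlated test series
print(rho_dcca(a, b, n=32))
```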

  17. Statistical aspects of environmental risk assessment of GM plants for effects on non-target organisms

    Science.gov (United States)

    Previous European guidance for environmental risk assessment of genetically-modified plants emphasized the concepts of statistical power but provided no explicit requirements for the provision of statistical power analyses. Similarly, whilst the need for good experimental designs was stressed, no m...

  18. IAEA releases nuclear power statistics for 2002

    International Nuclear Information System (INIS)

    Full text: A total of 441 nuclear power plants were operating around the world at the end of 2002, according to data reported to the IAEA's Power Reactor Information System (PRIS). World nuclear electricity generation was about 2574 TWh. Also during 2002, six nuclear power plants representing 5013 MW(e) were connected to the grid, four in China, one in the Czech Republic and one in the Republic of Korea. In addition, construction of seven new nuclear reactors commenced in 2002 - six in India and one in the Democratic People's Republic of Korea, bringing the total number of nuclear reactors reported as being under construction to 32. Four nuclear reactors were shut down in 2002, two in Bulgaria and two in the United Kingdom. The ten countries with the highest reliance on nuclear power in 2002 were: Lithuania, 80.1 per cent; France, 78 per cent; Slovakia, 65.4 per cent; Belgium 57.3 per cent; Bulgaria, 47.3 per cent; Ukraine, 45.7 per cent; Sweden, 45.7 per cent; Slovenia, 40.7 per cent; Armenia, 40.5 per cent; Switzerland 39.5 per cent. During 2002, six new nuclear power plants were connected to the electricity grid: Qinshan 2-1, a 610 MW(e) pressurized water reactor (PWR) in China; Qinshan 3-1, a 655 MW(e) pressurized heavy water reactor (PHWR) in China; Lingao 1, a 938 MW(e) PWR in China; Lingao 2, a 938 MW(e) PWR in China; Temelin 2, a 912 MW(e) water-cooled and water- moderated reactor (WWER) in Czech Republic; Yonggwang 6, a 950 MW(e) PWR in Republic of Korea. Also, in 2002 construction started on seven plants: Kaiga 3, a 202 MW(e) PHWR in India; Kaiga 4, a 202 MW(e) PHWR in India; Rajasthan 5, a 202 MW(e) PHWR in India; Rajasthan 6, a 202 MW(e) PHWR in India Kudankulam 1, a 905 MW(e) WWER in India; Kudankulam 2, a 905 MW(e) WWER in India; LWR - Project Unit 1, a 1040 MW(e) PWR in Dem. P. R. Korea. A table showing nuclear power reactors in operation and under construction at 31 Dec. 2002 is available. (IAEA)

  19. A Technology-Based Statistical Reasoning Assessment Tool in Descriptive Statistics for Secondary School Students

    Science.gov (United States)

    Chan, Shiau Wei; Ismail, Zaleha

    2014-01-01

    The focus of assessment in statistics has gradually shifted from traditional assessment towards alternative assessment, where more attention has been paid to core statistical concepts such as center, variability, and distribution. In spite of this, there are comparatively few assessments that combine the three significant types of statistical…

  20. Statistical modelling of mitochondrial power supply.

    Science.gov (United States)

    James, A T; Wiskich, J T; Conyers, R A

    1989-01-01

    By experiment and theory, formulae are derived to calculate the response of mitochondrial power supply, in flux and potential, to an ATP consuming enzyme load, incorporating effects of varying amounts of (i) enzyme, (ii) total circulating adenylate, and (iii) inhibition of the ATP/ADP translocase. The formulae, which apply between about 20% and 80% of maximum respiration, are the same as for the current and voltage of an electrical circuit in which a battery with potential, linear in the logarithm of the total adenylate, charges another battery whose opposing potential is also linear in the same logarithm, through three resistances. These resistances produce loss of potential due to dis-equilibrium of (i) intramitochondrial oxidative phosphorylation, (ii) the ATP/ADP translocase, and (iii) the ATP-consuming enzyme load. The model is represented geometrically by the following configuration: when potential is plotted against flux, the points lie on two pencils of lines each concurrent at zero respiration, the two pencils describing the respective characteristics of the mitochondrion and enzyme. Control coefficients and elasticities are calculated from the formulae. PMID:2708917

  1. Using Tree Diagrams as an Assessment Tool in Statistics Education

    Science.gov (United States)

    Yin, Yue

    2012-01-01

    This study examines the potential of the tree diagram, a type of graphic organizer, as an assessment tool to measure students' knowledge structures in statistics education. Students' knowledge structures in statistics have not been sufficiently assessed, despite their importance. This article first presents the rationale and method…

  2. WHAT IS THE MAJOR POWER LINKING STATISTICS & DATA MINING ?

    Directory of Open Access Journals (Sweden)

    M.E. Abd El-Monsef

    2013-11-01

    In recent years, numerous scientific research studies addressing the intersecting disciplines of statistics and data mining (DM) have appeared [17, 18, 19, 24, 27, 30, 35]. This paper is devoted to answering the question posed in the title, based on five reply trends. The 1st trend is based on an updated historical view of both statistics and DM. The 2nd trend is concerned with the modern theoretical relationship between statistics and DM. The major power linking statistics and DM is established in the 3rd trend. The 4th trend presents a significant comparison between statistics and DM. A conceptual classification of the Statistical Data Mining (SDM) process in Egypt is presented in the 5th reply trend. Finally, the conclusion and future work are presented.

  3. Replication unreliability in psychology: elusive phenomena or elusive statistical power?

    Directory of Open Access Journals (Sweden)

    Patrizio E Tressoldi

    2012-07-01

    The focus of this paper is to analyse whether the unreliability of results related to certain controversial psychological phenomena may be a consequence of their low statistical power. Under Null Hypothesis Statistical Testing (NHST), still the most widely used statistical approach, unreliability derives from the failure to refute the null hypothesis, in particular when exact or quasi-exact replications of experiments are carried out. Taking as examples the results of meta-analyses related to four different controversial phenomena (subliminal semantic priming, the incubation effect for problem solving, unconscious thought theory, and non-local perception), it was found that, except for semantic priming on categorization, the statistical power to detect the expected effect size of the typical study is low or very low. The low power in most studies undermines the use of NHST to study phenomena with moderate or low effect sizes. We conclude by providing some suggestions on how to increase the statistical power or use different statistical approaches to help discriminate whether the obtained results may or may not be used to support or refute the reality of a phenomenon with a small effect size.
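
    To make the point about low power concrete, the short sketch below computes the power of a standard two-sample t-test over a grid of small-to-moderate effect sizes and per-group sample sizes; the specific values are illustrative assumptions, not figures from the cited meta-analyses.

```python
# Hedged illustration: power of a two-sided two-sample t-test for small effects,
# plus the per-group sample size needed for 80% power at d = 0.2.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
for d in (0.1, 0.2, 0.5):                      # small to moderate Cohen's d
    for n in (20, 50, 200):                    # participants per group
        pw = analysis.power(effect_size=d, nobs1=n, alpha=0.05)
        print(f"d={d:.1f}, n/group={n:4d} -> power={pw:.2f}")

print(analysis.solve_power(effect_size=0.2, power=0.8, alpha=0.05))
```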

  4. Statistical method for scientific projects risk assessment

    OpenAIRE

    Бедрій, Дмитро Іванович

    2013-01-01

    This article discusses the use of statistical methods for evaluating the risks of the activities of scientific institutions in the public sector of the Ukrainian economy during the planning and execution of scientific projects; some of the results of our research in this area are presented. The main objective of the study is to determine the possibility of using the statistical method in the process of evaluating the risks of research projects. The use of risk evaluation methods allows the manag...

  5. New Dynamical-Statistical Techniques for Wind Power Prediction

    Science.gov (United States)

    Stathopoulos, C.; Kaperoni, A.; Galanis, G.; Kallos, G.

    2012-04-01

    The increased use of renewable energy sources, and especially of wind power, has revealed the significance of accurate environmental and wind power predictions over wind farms, which critically affect the integration of the produced power into the general grid. This issue is studied in the present paper by means of high resolution physical and statistical models. Two numerical weather prediction (NWP) systems, namely SKIRON and RAMS, are used to simulate the flow characteristics in selected wind farms in Greece. The NWP model output is post-processed by utilizing Kalman and Kolmogorov statistics in order to remove systematic errors. Modeled wind predictions in combination with available on-site observations are used to estimate the wind power potential by utilizing a variety of statistical power prediction models based on non-linear and hyperbolic functions. The obtained results reveal the strong dependence of the forecast uncertainty on the wind variation, the limited influence of previously recorded power values, and the advantages that nonlinear, non-polynomial functions could have in the successful control of power curve characteristics. This methodology is developed in the framework of the FP7 projects WAUDIT and MARINA PLATFORM.

  6. Statistical analysis of power ramp PCI test data

    International Nuclear Information System (INIS)

    Data from power ramp tests of reference standard fuel rods and PCI resistant fuel designs were analyzed statistically using the STATPAC computer program. Effects of design variations in the reference fuel are described. The significantly improved performance of zirconium liner fuel over copper barrier fuel and reference fuel is also shown. (author)

  7. Multivariate statistical assessment of coal properties

    Czech Academy of Sciences Publication Activity Database

    Klika, Z.; Serenčíšová, J.; Kožušníková, Alena; Kolomazník, I.; Študentová, S.; Vontorová, J.

    2014-01-01

    Roč. 128, č. 128 (2014), s. 119-127. ISSN 0378-3820 R&D Projects: GA MŠk ED2.1.00/03.0082 Institutional support: RVO:68145535 Keywords : coal properties * structural,chemical and petrographical properties * multivariate statistics Subject RIV: DH - Mining, incl. Coal Mining Impact factor: 3.352, year: 2014 http://dx.doi.org/10.1016/j.fuproc.2014.06.029

  8. Power Curve Modeling in Complex Terrain Using Statistical Models

    Science.gov (United States)

    Bulaevskaya, V.; Wharton, S.; Clifton, A.; Qualley, G.; Miller, W.

    2014-12-01

    Traditional power output curves typically model power only as a function of the wind speed at the turbine hub height. While the latter is an essential predictor of power output, wind speed information in other parts of the vertical profile, as well as additional atmospheric variables, are also important determinants of power. The goal of this work was to determine the gain in predictive ability afforded by adding wind speed information at other heights, as well as other atmospheric variables, to the power prediction model. Using data from a wind farm with a moderately complex terrain in the Altamont Pass region in California, we trained three statistical models, a neural network, a random forest and a Gaussian process model, to predict power output from various sets of aforementioned predictors. The comparison of these predictions to the observed power data revealed that considerable improvements in prediction accuracy can be achieved both through the addition of predictors other than the hub-height wind speed and the use of statistical models. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under contract DE-AC52-07NA27344 and was funded by Wind Uncertainty Quantification Laboratory Directed Research and Development Project at LLNL under project tracking code 12-ERD-069.
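
    A hedged sketch of one of the statistical models mentioned above (a random forest power-curve model) follows. The variable names and the synthetic data generator are stand-ins chosen for illustration; the study itself used SCADA and met-tower data from the Altamont Pass site, which are not reproduced here.

```python
# Hedged sketch: compare a hub-height-speed-only power model with one that also uses
# additional atmospheric variables, using random forests on simulated stand-in data.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "ws_hub": rng.uniform(3, 15, n),                      # hub-height wind speed (m/s)
    "shear": rng.normal(0.2, 0.1, n),                     # shear exponent
    "turbulence_intensity": rng.uniform(0.05, 0.25, n),
    "stability": rng.normal(0.0, 1.0, n),                 # e.g. a bulk stability proxy
})
# Toy power response: cubic in wind speed, modulated by the other variables.
df["power_kw"] = (np.clip(df["ws_hub"], 0, 12) ** 3
                  * (1 - 0.5 * df["turbulence_intensity"])
                  * (1 + 0.3 * df["shear"])
                  + rng.normal(0, 30, n))

features = ["ws_hub", "shear", "turbulence_intensity", "stability"]
X_train, X_test, y_train, y_test = train_test_split(df[features], df["power_kw"],
                                                    test_size=0.3, random_state=0)

speed_only = RandomForestRegressor(n_estimators=200, random_state=0)
speed_only.fit(X_train[["ws_hub"]], y_train)
full_model = RandomForestRegressor(n_estimators=200, random_state=0)
full_model.fit(X_train, y_train)

print("MAE, hub-height speed only:", mean_absolute_error(y_test, speed_only.predict(X_test[["ws_hub"]])))
print("MAE, all atmospheric vars :", mean_absolute_error(y_test, full_model.predict(X_test)))
```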

  9. Robust Statistical Detection of Power-Law Cross-Correlation

    Science.gov (United States)

    Blythe, Duncan A. J.; Nikulin, Vadim V.; Müller, Klaus-Robert

    2016-06-01

    We show that widely used approaches in statistical physics incorrectly indicate the existence of power-law cross-correlations between financial stock market fluctuations measured over several years and the neuronal activity of the human brain lasting for only a few minutes. While such cross-correlations are nonsensical, no current methodology allows them to be reliably discarded, leaving researchers at greater risk when the spurious nature of cross-correlations is not clear from the unrelated origin of the time series and rather requires careful statistical estimation. Here we propose a theory and method (PLCC-test) which allows us to rigorously and robustly test for power-law cross-correlations, correctly detecting genuine and discarding spurious cross-correlations, thus establishing meaningful relationships between processes in complex physical systems. Our method reveals for the first time the presence of power-law cross-correlations between amplitudes of the alpha and beta frequency ranges of the human electroencephalogram.

  10. Power laws statistics of cliff failures, scaling and percolation

    CERN Document Server

    Baldassarri, Andrea

    2014-01-01

    The size of large cliff failures may be described in several ways, for instance considering the horizontal eroded area at the cliff top and the maximum local retreat of the coastline. Field studies suggest that, for large failures, the frequencies of these two quantities decrease as power laws of the respective magnitudes, defining two different decay exponents. Moreover, the horizontal area increases as a power law of the maximum local retreat, identifying a third exponent. Such observation suggests that the geometry of cliff failures is statistically similar for different magnitudes. Power laws are familiar in the physics of critical systems. The corresponding exponents satisfy precise relations and are proven to be universal features, common to very different systems. Following the approach typical of statistical physics, we propose a "scaling hypothesis" resulting in a relation between the three above exponents: there is a precise, mathematical relation between the distributions of magnitudes of erosion ...

  11. Assessing photographer competence using face statistics

    Science.gov (United States)

    Greig, Darryl; Gao, Yuli

    2010-02-01

    The rapid growth of photo sharing websites has resulted in some new problems around the management of a large (and quickly increasing) number of photographers with different needs and usage characteristics. Despite significant advances in the field of computer vision, little has been done to leverage these technologies for photographer understanding and management, partly due to the high computational cost of extracting application-specific image features. Recently robust multi-view face detection technologies have been widely adopted by many photo sharing sites. This affords a limited but "standard" pre-computed set of face features to tackle these administrative problems in large scale settings. In this paper we present a principled statistical model to alleviate one such administrative task - the automatic analysis of photographer competency given only face detection results on a set of their photos. The model uses summary statistics to estimate the probability a given individual belongs to a population of high competence photographers over against a second population of lower competence photographers. Using this model, we have achieved high classification accuracy (respectively 84.3% and 90.9%) on two large image datasets. We discuss an application of this approach to assist in managing a photo-sharing website.

  12. Effect size, confidence intervals and statistical power in psychological research.

    Directory of Open Access Journals (Sweden)

    Téllez A.

    2015-07-01

    Quantitative psychological research is focused on detecting the occurrence of certain population phenomena by analyzing data from a sample, and statistics is a particularly helpful mathematical tool that is used by researchers to evaluate hypotheses and make decisions to accept or reject such hypotheses. In this paper, the various statistical tools in psychological research are reviewed. The limitations of null hypothesis significance testing (NHST) and the advantages of using effect size and its respective confidence intervals are explained, as the latter two measurements can provide important information about the results of a study. These measurements also can facilitate data interpretation and easily detect trivial effects, enabling researchers to make decisions in a more clinically relevant fashion. Moreover, it is recommended to establish an appropriate sample size by calculating the optimum statistical power at the moment that the research is designed. Psychological journal editors are encouraged to follow APA recommendations strictly and ask authors of original research studies to report the effect size, its confidence intervals, statistical power and, when required, any measure of clinical significance. Additionally, we must account for the teaching of statistics at the graduate level. At that level, students do not receive sufficient information concerning the importance of using different types of effect sizes and their confidence intervals according to the different types of research designs; instead, most of the information is focused on the various tools of NHST.
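
    As a small illustration of the reporting practice recommended above, the sketch below computes Cohen's d for two independent groups together with an approximate confidence interval based on the large-sample standard error of d; the data are simulated and the approximation is an assumption, not the authors' procedure.

```python
# Hedged sketch: Cohen's d with an approximate (large-sample) confidence interval.
import numpy as np
from scipy import stats

def cohens_d_with_ci(a, b, conf=0.95):
    a, b = np.asarray(a, float), np.asarray(b, float)
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    d = (a.mean() - b.mean()) / np.sqrt(pooled_var)
    # Large-sample approximation to the standard error of d
    se = np.sqrt((na + nb) / (na * nb) + d ** 2 / (2 * (na + nb)))
    z = stats.norm.ppf(0.5 + conf / 2)
    return d, (d - z * se, d + z * se)

rng = np.random.default_rng(1)
group1 = rng.normal(10.5, 2.0, 40)   # simulated data
group2 = rng.normal(10.0, 2.0, 40)
print(cohens_d_with_ci(group1, group2))
```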

  13. "Using Power Tables to Compute Statistical Power in Multilevel Experimental Designs"

    Science.gov (United States)

    Konstantopoulos, Spyros

    2009-01-01

    Power computations for one-level experimental designs that assume simple random samples are greatly facilitated by power tables such as those presented in Cohen's book about statistical power analysis. However, in education and the social sciences, experimental designs have naturally nested structures, and multilevel models are needed to compute the…
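    The record is truncated, but the core idea of power computations for nested designs can be sketched with the usual design-effect adjustment: the effective sample size is deflated by 1 + (m - 1)·ICC for clusters of size m. The function below is a normal-approximation sketch under assumed inputs, not the tabulated values of the article.

```python
import numpy as np
from scipy.stats import norm

def cluster_power(delta, n_clusters_per_arm, cluster_size, icc, alpha=0.05):
    """Approximate power for a two-arm, two-level cluster-randomized design.
    delta is the standardized effect size; the variance of the difference in
    means is inflated by the design effect 1 + (m - 1) * ICC."""
    deff = 1.0 + (cluster_size - 1) * icc
    n_eff = n_clusters_per_arm * cluster_size / deff      # effective n per arm
    se = np.sqrt(2.0 / n_eff)
    z_crit = norm.ppf(1 - alpha / 2)
    return norm.cdf(delta / se - z_crit) + norm.cdf(-delta / se - z_crit)

# e.g., 20 schools per arm, 25 students per school, ICC = 0.10, effect size 0.25
print(round(cluster_power(0.25, 20, 25, 0.10), 3))
```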

  14. Statistical reliability assessment of software-based systems

    International Nuclear Information System (INIS)

    Plant vendors nowadays propose software-based systems even for the most critical safety functions. The reliability estimation of safety-critical software-based systems is difficult since the conventional modeling techniques do not necessarily apply to the analysis of these systems, and quantification may seem impossible. Due to the lack of operational experience and the nature of software faults, the conventional reliability estimation methods cannot be applied. New methods are therefore needed for the safety assessment of software-based systems. In the research project Programmable automation systems in nuclear power plants (OHA), financed jointly by the Finnish Centre for Radiation and Nuclear Safety (STUK), the Ministry of Trade and Industry and the Technical Research Centre of Finland (VTT), various safety assessment methods and tools for software-based systems are developed and evaluated. This volume in the OHA report series deals with the statistical reliability assessment of software-based systems on the basis of dynamic test results and qualitative evidence from the system design process. Other reports to be published later in the OHA report series will address the diversity requirements in safety-critical software-based systems, generation of test data from operational profiles and handling of programmable automation in plant PSA studies. (orig.) (25 refs.)
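    One commonly used statistical result for this kind of assessment (not necessarily the specific method of the OHA report) relates the number of failure-free, operational-profile test cases to an upper confidence bound on the probability of failure per demand; a small sketch:

```python
import math

def tests_required(p_target, confidence=0.95):
    """Number of independent, failure-free test cases (drawn from the operational
    profile) needed to claim, with the given confidence, that the probability of
    failure per demand is below p_target."""
    return math.ceil(math.log(1 - confidence) / math.log(1 - p_target))

def upper_bound(n_tests, confidence=0.95):
    """Upper confidence bound on failure probability after n failure-free tests."""
    return 1 - (1 - confidence) ** (1.0 / n_tests)

print(tests_required(1e-3))     # ~2995 failure-free demands for 95% confidence
print(upper_bound(3000))        # ~1.0e-3
```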

  15. Statistical quality assessment of a fingerprint

    Science.gov (United States)

    Hwang, Kyungtae

    2004-08-01

    The quality of a fingerprint is essential to the performance of AFIS (Automatic Fingerprint Identification System). Such quality may be classified by the clarity and regularity of ridge-valley structures.1,2 One may calculate the thickness of ridges and valleys to measure clarity and regularity. However, calculating a thickness is not feasible in a poor-quality image, especially in severely damaged images that contain broken ridges (or valleys). In order to overcome this difficulty, the proposed approach employs the statistical properties of a local block, namely the mean and spread of the thickness of both ridge and valley. The mean value is used for determining whether a fingerprint is wet or dry. For example, black pixels are dominant if a fingerprint is wet, so the average thickness of the ridges is larger than that of the valleys, and vice versa for a dry fingerprint. In addition, the standard deviation is used for determining the severity of damage. In this study, the quality is divided into three categories based on the two statistical properties mentioned above: wet, good, and dry. The number of low-quality blocks is used to measure the global quality of a fingerprint. In addition, the distribution of poor blocks is also measured using Euclidean distances between groups of poor blocks. With this scheme, locally condensed poor blocks decrease the overall quality of an image. Experimental results on fingerprint images captured by optical devices as well as by a rolling method show that the wet and dry parts of the images were successfully identified. Enhancing an image by employing morphology techniques that modify the detected poor-quality blocks is illustrated in Section 3. However, more work needs to be done on designing a scheme that incorporates the number of poor blocks and their distribution into a global quality measure.
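    A minimal sketch of the block-level statistics described above is given below: ridge/valley run lengths are collected per block, and their mean ratio and spread drive a wet/good/dry label and a damage flag. The thresholds and the damage criterion are illustrative assumptions, not the values used in the paper.

```python
import numpy as np

def block_quality(block, wet_thresh=1.5, dry_thresh=0.67, damage_thresh=2.0):
    """Classify a binarized fingerprint block (1 = ridge/black, 0 = valley/white)
    as 'wet', 'dry' or 'good' from run-length statistics, in the spirit of the
    paper; thresholds here are illustrative, not taken from the source."""
    ridge_runs, valley_runs = [], []
    for row in block:
        changes = np.flatnonzero(np.diff(row)) + 1      # run-length encode the row
        for r in np.split(row, changes):
            (ridge_runs if r[0] == 1 else valley_runs).append(len(r))
    mean_ridge, mean_valley = np.mean(ridge_runs), np.mean(valley_runs)
    damaged = np.std(ridge_runs) > damage_thresh        # large spread -> damage
    ratio = mean_ridge / mean_valley
    if ratio > wet_thresh:
        label = "wet"        # black pixels dominate
    elif ratio < dry_thresh:
        label = "dry"        # white pixels dominate
    else:
        label = "good"
    return label, damaged

block = np.random.default_rng(0).integers(0, 2, (16, 16))
print(block_quality(block))
```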

  16. Statistical analyses support power law distributions found in neuronal avalanches.

    Directory of Open Access Journals (Sweden)

    Andreas Klaus

    Full Text Available The size distribution of neuronal avalanches in cortical networks has been reported to follow a power law distribution with exponent close to -1.5, which is a reflection of long-range spatial correlations in spontaneous neuronal activity. However, identifying power law scaling in empirical data can be difficult and sometimes controversial. In the present study, we tested the power law hypothesis for neuronal avalanches by using more stringent statistical analyses. In particular, we performed the following steps: (i) analysis of finite-size scaling to identify scale-free dynamics in neuronal avalanches, (ii) model parameter estimation to determine the specific exponent of the power law, and (iii) comparison of the power law to alternative model distributions. Consistent with critical state dynamics, avalanche size distributions exhibited robust scaling behavior in which the maximum avalanche size was limited only by the spatial extent of sampling ("finite size" effect). This scale-free dynamics suggests the power law as a model for the distribution of avalanche sizes. Using both the Kolmogorov-Smirnov statistic and a maximum likelihood approach, we found the slope to be close to -1.5, which is in line with previous reports. Finally, the power law model for neuronal avalanches was compared to the exponential and to various heavy-tail distributions based on the Kolmogorov-Smirnov distance and by using a log-likelihood ratio test. Both the power law distribution without and with exponential cut-off provided significantly better fits to the cluster size distributions in neuronal avalanches than the exponential, the lognormal and the gamma distribution. In summary, our findings strongly support the power law scaling in neuronal avalanches, providing further evidence for critical state dynamics in superficial layers of cortex.
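    The general recipe referred to above (maximum-likelihood exponent estimation plus a Kolmogorov-Smirnov distance) can be sketched as follows; this is a generic continuous-data illustration with synthetic data, not the authors' avalanche pipeline.

```python
import numpy as np

def fit_power_law(x, xmin):
    """Continuous maximum-likelihood estimate of the power-law exponent for
    x >= xmin, plus the Kolmogorov-Smirnov distance of the fit (Clauset-style).
    A sketch of the general recipe, not the authors' exact analysis."""
    tail = np.asarray(x, dtype=float)
    tail = np.sort(tail[tail >= xmin])
    n = tail.size
    alpha = 1.0 + n / np.sum(np.log(tail / xmin))          # MLE exponent
    emp_cdf = np.arange(1, n + 1) / n                      # empirical CDF on the tail
    model_cdf = 1.0 - (tail / xmin) ** (1.0 - alpha)       # fitted power-law CDF
    ks = np.max(np.abs(emp_cdf - model_cdf))
    return alpha, ks

# toy data drawn from a power law with exponent 1.5 (inverse-CDF sampling)
rng = np.random.default_rng(0)
x = (1.0 - rng.random(5000)) ** (-1.0 / 0.5)               # P(X > x) = x^-0.5
print(fit_power_law(x, xmin=1.0))
```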

  17. Assessment of alternatives to correct inventory difference statistical treatment deficiencies

    International Nuclear Information System (INIS)

    This document presents an analysis of alternatives to correct deficiencies in the statistical treatment of inventory differences in the NRC guidance documents and licensee practice. Pacific Northwest Laboratory's objective for this study was to assess alternatives developed by the NRC and a panel of safeguards statistical experts. Criteria were developed for the evaluation, and the assessment was made against those criteria. The results of this assessment are PNL recommendations, which are intended to provide NRC decision makers with a logical and statistically sound basis for correcting the deficiencies.

  18. Cross-Cultural Instrument Translation: Assessment, Translation, and Statistical Applications

    Science.gov (United States)

    Mason, Teresa Crowe

    2005-01-01

    This article has four major sections: (a) general issues of assessment; (b) assessment of ethnic-group members, including those who are deaf; (c) translation of assessment tools, emphasizing translation into American Sign Language (ASL); and (d) statistical applications for translated instruments. The purpose of the article is to provide insight…

  19. Self-assessed performance improves statistical fusion of image labels

    Energy Technology Data Exchange (ETDEWEB)

    Bryan, Frederick W., E-mail: frederick.w.bryan@vanderbilt.edu; Xu, Zhoubing; Asman, Andrew J.; Allen, Wade M. [Electrical Engineering, Vanderbilt University, Nashville, Tennessee 37235 (United States); Reich, Daniel S. [Translational Neuroradiology Unit, National Institute of Neurological Disorders and Stroke, National Institutes of Health, Bethesda, Maryland 20892 (United States); Landman, Bennett A. [Electrical Engineering, Vanderbilt University, Nashville, Tennessee 37235 (United States); Biomedical Engineering, Vanderbilt University, Nashville, Tennessee 37235 (United States); and Radiology and Radiological Sciences, Vanderbilt University, Nashville, Tennessee 37235 (United States)

    2014-03-15

    Purpose: Expert manual labeling is the gold standard for image segmentation, but this process is difficult, time-consuming, and prone to inter-individual differences. While fully automated methods have successfully targeted many anatomies, automated methods have not yet been developed for numerous essential structures (e.g., the internal structure of the spinal cord as seen on magnetic resonance imaging). Collaborative labeling is a new paradigm that offers a robust alternative that may realize both the throughput of automation and the guidance of experts. Yet, distributing manual labeling expertise across individuals and sites introduces potential human factors concerns (e.g., training, software usability) and statistical considerations (e.g., fusion of information, assessment of confidence, bias) that must be further explored. During the labeling process, it is simple to ask raters to self-assess the confidence of their labels, but this is rarely done and has not been previously quantitatively studied. Herein, the authors explore the utility of self-assessment in relation to automated assessment of rater performance in the context of statistical fusion. Methods: The authors conducted a study of 66 volumes manually labeled by 75 minimally trained human raters recruited from the university undergraduate population. Raters were given 15 min of training during which they were shown examples of correct segmentation, and the online segmentation tool was demonstrated. The volumes were labeled 2D slice-wise, and the slices were unordered. A self-assessed quality metric was produced by raters for each slice by marking a confidence bar superimposed on the slice. Volumes produced by both voting and statistical fusion algorithms were compared against a set of expert segmentations of the same volumes. Results: Labels for 8825 distinct slices were obtained. Simple majority voting resulted in statistically poorer performance than voting weighted by self-assessed performance
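    A minimal sketch of voting weighted by self-assessed confidence, the simplest of the fusion strategies compared above, is shown below; it is not the full statistical fusion model evaluated in the paper, and the label data are made up.

```python
import numpy as np

def weighted_vote(labels, confidences):
    """Fuse per-rater label maps by voting weighted with self-assessed confidence.
    labels: (n_raters, n_voxels) integer array; confidences: (n_raters,) weights
    in [0, 1]. A minimal sketch of confidence-weighted voting, not the full
    statistical fusion algorithm studied in the paper."""
    labels = np.asarray(labels)
    weights = np.asarray(confidences, dtype=float)[:, None]
    classes = np.unique(labels)
    scores = np.zeros((classes.size, labels.shape[1]))
    for k, c in enumerate(classes):
        scores[k] = np.sum((labels == c) * weights, axis=0)
    return classes[np.argmax(scores, axis=0)]

# three raters, five voxels, with the middle rater least confident
votes = np.array([[1, 1, 0, 0, 1],
                  [0, 1, 1, 0, 0],
                  [1, 0, 0, 0, 1]])
print(weighted_vote(votes, confidences=[0.9, 0.4, 0.8]))
```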

  20. Self-assessed performance improves statistical fusion of image labels

    International Nuclear Information System (INIS)

    Purpose: Expert manual labeling is the gold standard for image segmentation, but this process is difficult, time-consuming, and prone to inter-individual differences. While fully automated methods have successfully targeted many anatomies, automated methods have not yet been developed for numerous essential structures (e.g., the internal structure of the spinal cord as seen on magnetic resonance imaging). Collaborative labeling is a new paradigm that offers a robust alternative that may realize both the throughput of automation and the guidance of experts. Yet, distributing manual labeling expertise across individuals and sites introduces potential human factors concerns (e.g., training, software usability) and statistical considerations (e.g., fusion of information, assessment of confidence, bias) that must be further explored. During the labeling process, it is simple to ask raters to self-assess the confidence of their labels, but this is rarely done and has not been previously quantitatively studied. Herein, the authors explore the utility of self-assessment in relation to automated assessment of rater performance in the context of statistical fusion. Methods: The authors conducted a study of 66 volumes manually labeled by 75 minimally trained human raters recruited from the university undergraduate population. Raters were given 15 min of training during which they were shown examples of correct segmentation, and the online segmentation tool was demonstrated. The volumes were labeled 2D slice-wise, and the slices were unordered. A self-assessed quality metric was produced by raters for each slice by marking a confidence bar superimposed on the slice. Volumes produced by both voting and statistical fusion algorithms were compared against a set of expert segmentations of the same volumes. Results: Labels for 8825 distinct slices were obtained. Simple majority voting resulted in statistically poorer performance than voting weighted by self-assessed performance

  1. Enrichment of statistical power for genome-wide association studies

    OpenAIRE

    Li, Meng; Liu, Xiaolei; Bradbury, Peter; Yu, Jianming; Zhang, Yuan-Ming; Todhunter, Rory J.; Buckler, Edward S; Zhang, Zhiwu

    2014-01-01

    Background The inheritance of most human diseases and agriculturally important traits is controlled by many genes with small effects. Identifying these genes, while simultaneously controlling false positives, is challenging. Among available statistical methods, the mixed linear model (MLM) has been the most flexible and powerful for controlling population structure and individual unequal relatedness (kinship), the two common causes of spurious associations. The introduction of the compressed ...

  2. Quantitative statistical methods for image quality assessment.

    Science.gov (United States)

    Dutta, Joyita; Ahn, Sangtae; Li, Quanzheng

    2013-01-01

    Quantitative measures of image quality and reliability are critical for both qualitative interpretation and quantitative analysis of medical images. While, in theory, it is possible to analyze reconstructed images by means of Monte Carlo simulations using a large number of noise realizations, the associated computational burden makes this approach impractical. Additionally, this approach is less meaningful in clinical scenarios, where multiple noise realizations are generally unavailable. The practical alternative is to compute closed-form analytical expressions for image quality measures. The objective of this paper is to review statistical analysis techniques that enable us to compute two key metrics: resolution (determined from the local impulse response) and covariance. The underlying methods include fixed-point approaches, which compute these metrics at a fixed point (the unique and stable solution) independent of the iterative algorithm employed, and iteration-based approaches, which yield results that are dependent on the algorithm, initialization, and number of iterations. We also explore extensions of some of these methods to a range of special contexts, including dynamic and motion-compensated image reconstruction. While most of the discussed techniques were developed for emission tomography, the general methods are extensible to other imaging modalities as well. In addition to enabling image characterization, these analysis techniques allow us to control and enhance imaging system performance. We review practical applications where performance improvement is achieved by applying these ideas to the contexts of both hardware (optimizing scanner design) and image reconstruction (designing regularization functions that produce uniform resolution or maximize task-specific figures of merit). PMID:24312148

  3. Quality Assessment and Improvement Methods in Statistics – what Works?

    Directory of Open Access Journals (Sweden)

    Hans Viggo Sæbø

    2014-12-01

    Full Text Available Several methods for quality assessment and assurance in statistics have been developed in a European context. Data Quality Assessment Methods (DatQAM) were considered in a Eurostat handbook in 2007. These methods comprise quality reports and indicators, measurement of process variables, user surveys, self-assessments, audits, labelling and certification. The entry point for the paper is the development of systematic quality work in European statistics with regard to good practices such as those described in the DatQAM handbook. Assessment is one issue, following up recommendations and implementation of improvement actions another. This leads to a discussion on the effect of approaches and tools: Which work well, which have turned out to be more of a challenge, and why? Examples are mainly from Statistics Norway, but these are believed to be representative of several statistical institutes.

  4. Development and testing of improved statistical wind power forecasting methods.

    Energy Technology Data Exchange (ETDEWEB)

    Mendes, J.; Bessa, R.J.; Keko, H.; Sumaili, J.; Miranda, V.; Ferreira, C.; Gama, J.; Botterud, A.; Zhou, Z.; Wang, J. (Decision and Information Sciences); (INESC Porto)

    2011-12-06

    Wind power forecasting (WPF) provides important inputs to power system operators and electricity market participants. It is therefore not surprising that WPF has attracted increasing interest within the electric power industry. In this report, we document our research on improving statistical WPF algorithms for point, uncertainty, and ramp forecasting. Below, we provide a brief introduction to the research presented in the following chapters. For a detailed overview of the state-of-the-art in wind power forecasting, we refer to [1]. Our related work on the application of WPF in operational decisions is documented in [2]. Point forecasts of wind power are highly dependent on the training criteria used in the statistical algorithms that are used to convert weather forecasts and observational data to a power forecast. In Chapter 2, we explore the application of information theoretic learning (ITL) as opposed to the classical minimum square error (MSE) criterion for point forecasting. In contrast to the MSE criterion, ITL criteria do not assume a Gaussian distribution of the forecasting errors. We investigate to what extent ITL criteria yield better results. In addition, we analyze time-adaptive training algorithms and how they enable WPF algorithms to cope with non-stationary data and, thus, to adapt to new situations without requiring additional offline training of the model. We test the new point forecasting algorithms on two wind farms located in the U.S. Midwest. Although there have been advancements in deterministic WPF, a single-valued forecast cannot provide information on the dispersion of observations around the predicted value. We argue that it is essential to generate, together with (or as an alternative to) point forecasts, a representation of the wind power uncertainty. Wind power uncertainty representation can take the form of probabilistic forecasts (e.g., probability density function, quantiles), risk indices (e.g., prediction risk index) or scenarios

  5. Visual and Statistical Analysis of Digital Elevation Models Generated Using Idw Interpolator with Varying Powers

    Science.gov (United States)

    Asal, F. F.

    2012-07-01

    Digital elevation data obtained from different engineering surveying techniques are utilized in generating a Digital Elevation Model (DEM), which is employed in many engineering and environmental applications. These data are usually in discrete point format, making it necessary to utilize an interpolation approach for the creation of the DEM. Quality assessment of the DEM is a vital issue controlling its use in different applications; however, this assessment relies heavily on statistical methods while neglecting visual methods. This research applies visual analysis to DEMs generated using the IDW interpolator with varying powers, in order to examine its potential for assessing the effects of the variation of the IDW power on DEM quality. Real elevation data were collected in the field using a total station instrument over corrugated terrain. DEMs were generated from the data at a unified cell size using the IDW interpolator with power values ranging from one to ten. Visual analysis was undertaken using 2D and 3D views of the DEM; in addition, statistical analysis was performed to assess the validity of the visual techniques. The visual analysis showed that smoothing of the DEM decreases with increasing power up to a power of four; however, increasing the power beyond four does not leave noticeable changes in the 2D and 3D views of the DEM. The statistical analysis supported these results, in that the Standard Deviation (SD) of the DEM increased with increasing power. More specifically, changing the power from one to two produced 36% of the total increase in SD (the increase due to changing the power from one to ten), while changing to powers of three and four gave 60% and 75%, respectively. This reflects the decrease in DEM smoothing as the IDW power increases. The study also has shown that applying visual methods supported by statistical
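    The interpolator studied above can be sketched in a few lines: the estimate at a query point is a distance-weighted average with weights 1/d^p, so larger powers localize the estimate and reduce smoothing. The points and grid below are illustrative, not the surveyed data.

```python
import numpy as np

def idw(xy_known, z_known, xy_query, power=2.0, eps=1e-12):
    """Inverse Distance Weighting: z(q) = sum(w_i * z_i) / sum(w_i), w_i = 1/d_i^p.
    Larger powers localize the estimate (less smoothing), as examined in the paper;
    grid/cell handling is simplified for illustration."""
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    w = 1.0 / (d + eps) ** power
    return (w @ z_known) / w.sum(axis=1)

pts = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
z = np.array([5.0, 7.0, 6.0, 9.0])
grid = np.array([[2.0, 3.0], [5.0, 5.0], [8.0, 8.0]])
for p in (1, 2, 4, 10):
    print(p, np.round(idw(pts, z, grid, power=p), 2))
```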

  6. APPLICATION OF THE UNIFIED STATISTICAL MATERIAL DATABASE FOR DESIGN AND LIFE/RISK ASSESSMENT OF HIGH TEMPERATURE COMPONENTS

    Institute of Scientific and Technical Information of China (English)

    K.Fujiyama; T.Fujiwara; Y.Nakatani; K.Saito; A.Sakuma; Y.Akikuni; S.Hayashi; S.Matsumoto

    2004-01-01

    Statistical manipulation of material data was conducted for probabilistic life assessment and risk-based design and maintenance of high-temperature components of power plants. To obtain the statistical distribution of material properties, dominant parameters affecting material properties are introduced to normalize the statistical variables. Those parameters are hardness, chemical composition, characteristic microstructural features and so on. Creep and fatigue properties are expressed by normalized parameters and the unified statistical distributions are obtained. These probability distribution functions agree well statistically with the field database of steam turbine components. It was concluded that the unified statistical baseline approach is useful for the risk management of components in power plants.

  7. GNSS Spoofing Detection Based on Signal Power Measurements: Statistical Analysis

    Directory of Open Access Journals (Sweden)

    V. Dehghanian

    2012-01-01

    Full Text Available A threat to GNSS receivers is posed by a spoofing transmitter that emulates authentic signals but with randomized code phase and Doppler values over a small range. Such spoofing signals can result in large navigational solution errors that are passed on to the unsuspecting user with potentially dire consequences. An effective spoofing detection technique is developed in this paper, based on signal power measurements, that can be readily applied to present consumer-grade GNSS receivers with minimal firmware changes. An extensive statistical analysis is carried out based on formulating a multihypothesis detection problem. Expressions are developed to devise a set of thresholds required for signal detection and identification. The detection processing methods developed are further extended to exploit incidental antenna motion arising from user interaction with a GNSS handheld receiver to further enhance the detection performance of the proposed algorithm. The statistical analysis supports the effectiveness of the proposed spoofing detection technique under various multipath conditions.

  8. Robustness of Spacing-based Power Divergence Statistics

    Czech Academy of Sciences Publication Activity Database

    Boček, Pavel

    Praha : ÚTIA AVČR, v.v.i, 2011 - (Janžura, M.; Ivánek, J.). s. 23-23 [7th International Workshop on Data - Algorithms - Decision Making. 27.11.2011-29.11.2011, Mariánská] R&D Projects: GA MŠk 1M0572; GA ČR GAP202/10/0618 Institutional research plan: CEZ:AV0Z10750506 Keywords : alpha-divergence * goodness-of-fit Subject RIV: BD - Theory of Information http://library.utia.cas.cz/separaty/2011/SI/bocek- robustness of spacing-based power divergence statistics.pdf

  9. Demographic statistics pertaining to nuclear power reactor sites

    International Nuclear Information System (INIS)

    Population statistics are presented for 145 nuclear power plant sites. Summary tables and figures are included that were developed to aid in the evaluation of trends and general patterns associated with the various parameters of interest, such as the proximity of nuclear plant sites to centers of population. The primary reason for publishing this information at this time is to provide a factual basis for use in discussions on the subject of reactor siting policy. The report is a revised and updated version of a draft report published in December 1977. Errors in the population data base have been corrected and new data tabulations added

  10. HVDC power transmission technology assessment

    Energy Technology Data Exchange (ETDEWEB)

    Hauth, R.L.; Tatro, P.J.; Railing, B.D. [New England Power Service Co., Westborough, MA (United States); Johnson, B.K.; Stewart, J.R. [Power Technologies, Inc., Schenectady, NY (United States); Fink, J.L.

    1997-04-01

    The purpose of this study was to develop an assessment of the national utility system's needs for electric transmission during the period 1995-2020 that could be met by future reduced-cost HVDC systems. The assessment was to include an economic evaluation of HVDC as a means for meeting those needs as well as a comparison with competing technologies such as ac transmission with and without Flexible AC Transmission System (FACTS) controllers. The role of force commutated dc converters was to be assumed where appropriate. The assessment begins by identifying the general needs for transmission in the U.S. in the context of a future deregulated power industry. The possible roles for direct current transmission are then postulated in terms of representative scenarios. A few of the scenarios are illustrated with the help of actual U.S. system examples. Non-traditional applications as well as traditional applications such as long lines and asynchronous interconnections are discussed. The classical "break-even distance" concept for comparing HVDC and ac lines is used to assess the selected scenarios. The impact of reduced-cost converters is reflected in terms of the break-even distance. This report presents a comprehensive review of the functional benefits of HVDC transmission and updated cost data for both ac and dc system components. It also provides some provocative thoughts on how direct current transmission might be applied to better utilize and expand our nation's increasingly stressed transmission assets.

  11. Statistical analysis applied to safety culture self-assessment

    International Nuclear Information System (INIS)

    Interviews and opinion surveys are instruments used to assess the safety culture in an organization as part of the Safety Culture Enhancement Programme. Specific statistical tools are used to analyse the survey results. This paper presents an example of an opinion survey with the corresponding application of the statistical analysis and the conclusions obtained. Survey validation, frequency statistics, the Kolmogorov-Smirnov non-parametric test, Student's t-test and ANOVA means comparison tests, and the LSD post-hoc multiple comparison test are discussed. (author)

  12. Model output statistics applied to wind power prediction

    Energy Technology Data Exchange (ETDEWEB)

    Joensen, A.; Giebel, G.; Landberg, L. [Risoe National Lab., Roskilde (Denmark); Madsen, H.; Nielsen, H.A. [The Technical Univ. of Denmark, Dept. of Mathematical Modelling, Lyngby (Denmark)

    1999-03-01

    Being able to predict the output of a wind farm online for a day or two in advance has significant advantages for utilities, such as a better ability to schedule fossil fuelled power plants and a better position on electricity spot markets. In this paper prediction methods based on Numerical Weather Prediction (NWP) models are considered. The spatial resolution used in NWP models implies that these predictions are not valid locally at a specific wind farm. Furthermore, due to the non-stationary nature and complexity of the processes in the atmosphere, and occasional changes of NWP models, the deviation between the predicted and the measured wind will be time dependent. If observational data are available, and if the deviation between the predictions and the observations exhibits systematic behavior, this should be corrected for; if statistical methods are used, this approach is usually referred to as MOS (Model Output Statistics). The influence of atmospheric turbulence intensity, topography, prediction horizon length and auto-correlation of wind speed and power is considered, and to take the time variations into account, adaptive estimation methods are applied. Three estimation techniques are considered and compared: extended Kalman filtering, recursive least squares, and a new modified recursive least squares algorithm. (au) EU-JOULE-3. 11 refs.
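    Of the adaptive estimators mentioned above, recursive least squares with a forgetting factor is the easiest to sketch; the toy below corrects an NWP wind-speed input toward synthetic observations. It is a generic textbook RLS, not the modified algorithm developed in the paper.

```python
import numpy as np

class RecursiveLS:
    """Recursive least squares with a forgetting factor, the kind of adaptive
    estimator used for Model Output Statistics corrections; a generic sketch."""
    def __init__(self, n_params, forgetting=0.995):
        self.theta = np.zeros(n_params)
        self.P = np.eye(n_params) * 1e3
        self.lam = forgetting

    def update(self, x, y):
        x = np.asarray(x, dtype=float)
        k = self.P @ x / (self.lam + x @ self.P @ x)     # gain vector
        self.theta += k * (y - x @ self.theta)           # correct with the residual
        self.P = (self.P - np.outer(k, x @ self.P)) / self.lam
        return self.theta

# correct an NWP wind speed toward the measured value: measured ≈ a * nwp + b
rls = RecursiveLS(2)
rng = np.random.default_rng(3)
for _ in range(500):
    nwp = rng.uniform(3, 15)
    measured = 0.9 * nwp + 0.8 + rng.normal(0, 0.5)      # synthetic "truth"
    rls.update([nwp, 1.0], measured)
print(np.round(rls.theta, 2))                            # ≈ [0.9, 0.8]
```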

  13. Statistical problems in the assessment of nuclear risks

    International Nuclear Information System (INIS)

    Information on nuclear power plant risk assessment is presented concerning attitudinal problems and methodological problems involving expert opinions, human error probabilities, nonindependent events, uncertainty analysis, and acceptable risk criteria.

  14. Toward improved statistical treatments of wind power forecast errors

    Science.gov (United States)

    Hart, E.; Jacobson, M. Z.

    2011-12-01

    The ability of renewable resources to reliably supply electric power demand is of considerable interest in the context of growing renewable portfolio standards and the potential for future carbon markets. Toward this end, a number of probabilistic models have been applied to the problem of grid integration of intermittent renewables, such as wind power. Most of these models rely on simple Markov or autoregressive models of wind forecast errors. While these models generally capture the bulk statistics of wind forecast errors, they often fail to reproduce accurate ramp rate distributions and do not accurately describe extreme forecast error events, both of which are of considerable interest to those seeking to comment on system reliability. The problem often lies in characterizing and reproducing not only the magnitude of wind forecast errors, but also the timing or phase errors (i.e., when a front passes over a wind farm). Here we compare time series wind power data produced using different forecast error models to determine the best approach for capturing errors in both magnitude and phase. Additionally, new metrics are presented to characterize forecast quality with respect to both considerations.

  15. Alternative Assessment in Higher Education: An Experience in Descriptive Statistics

    Science.gov (United States)

    Libman, Zipora

    2010-01-01

    Assessment-led reform is now one of the most widely favored strategies to promote higher standards of teaching, more powerful learning and more credible forms of public accountability. Within this context of change, higher education in many countries is increasingly subjected to demands to implement alternative assessment strategies that provide…

  16. Improved power performance assessment methods

    Energy Technology Data Exchange (ETDEWEB)

    Frandsen, S.; Antoniou, I.; Dahlberg, J.A. [and others]

    1999-03-01

    The uncertainty of presently-used methods for retrospective assessment of the productive capacity of wind farms is unacceptably large. The possibilities of improving the accuracy have been investigated and are reported. A method is presented that includes an extended power curve and site calibration. In addition, blockage effects with respect to reference wind speed measurements are analysed. It is found that significant accuracy improvements are possible by the introduction of more input variables such as turbulence and wind shear, in addition to mean wind speed and air density. Also, the testing of several or all machines in the wind farm - instead of only one or two - may provide a better estimate of the average performance. (au)

  17. Prediction of lacking control power in power plants using statistical models

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Mataji, B.; Stoustrup, Jakob

    2007-01-01

    Prediction of the performance of plants like power plants is of interest, since the plant operator can use these predictions to optimize the plant production. In this paper the focus is on a special case where a combination of high coal moisture content and a high load limits the possible plant load, meaning that the requested plant load cannot be met. The available models are in this case uncertain. Instead, statistical methods are used to predict upper and lower uncertainty bounds on the prediction. Two different methods are used. The first relies on statistics of recent prediction errors; the second uses operating-point-dependent statistics of prediction errors. Using these methods on the previously mentioned case, it can be concluded that the second method can be used to predict the power plant performance, while the first method has problems predicting the uncertain performance of...

  18. Statistical Analysis of the Impact of Wind Power on Market Quantities and Power Flows

    DEFF Research Database (Denmark)

    Pinson, Pierre; Jónsson, Tryggvi; Zugno, Marco;

    2012-01-01

    In view of the increasing penetration of wind power in a number of power systems and markets worldwide, we discuss some of the impacts that wind energy may have on market quantities and cross-border power flows. These impacts are uncovered through statistical analyses of actual market and flow data in Europe. Due to the dimensionality and nonlinearity of these effects, the necessary concepts of dimension reduction using Principal Component Analysis (PCA), as well as nonlinear regression, are described. Example application results are given for European cross-border flows, as well as for the...

  19. Statistical analysis of occupational exposure in nuclear power plants

    International Nuclear Information System (INIS)

    Occupational doses vary widely, from zero to high values on the logarithmic scale, according to the workers' jobs. However, as radiation control programmes constrain higher exposures more strongly, the variation of higher doses changes from a logarithmic to a linear scale, while the structure of lower doses remains. In the paper we analyse the annual effective doses of workers in 3 nuclear power plants of Jaslovske Bohunice using various distribution models. The hybrid-lognormal description of the annual dose distribution makes it possible also to assess the annual collective doses below the adopted recording level. Two methods of analysing the 'lost' occupational collective doses are presented.

  20. Earthquake accelerogram simulation with statistical law of evolutionary power spectrum

    Institute of Scientific and Technical Information of China (English)

    ZHANG Cui-ran; CHEN Hou-qun; LI Min

    2007-01-01

    By using the technique for the evolutionary power spectrum proposed by Nakayama and with reference to the Kameda formula, an evolutionary spectrum prediction model for a given earthquake magnitude and distance is established based on 80 near-source, large-magnitude acceleration records at rock surfaces from the ground motion database of the western U.S. Then a new iteration method is developed for the generation of random accelerograms, non-stationary both in amplitude and frequency, that are compatible with a target evolutionary spectrum. The phase spectra of those simulated accelerograms are also non-stationary in the time and frequency domains, since the interaction between amplitude and phase angle has been considered during the generation. Furthermore, the sign of the phase spectrum increment is identified to accelerate the iteration. With the proposed statistical model for predicting evolutionary power spectra and the new method for generating compatible time histories, artificial random earthquake accelerograms, non-stationary both in amplitude and frequency, for a given magnitude and distance can be provided.

  1. Statistics

    International Nuclear Information System (INIS)

    For the year 2002, part of the figures shown in the tables of the Energy Review are preliminary. The annual statistics of the Energy Review also include historical time-series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2-emissions, Supply and total consumption of electricity GWh, Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Excise taxes, precautionary stock fees and oil pollution fees on energy products.

  2. Statistics

    International Nuclear Information System (INIS)

    For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time-series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2-emissions, Supplies and total consumption of electricity GWh, Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Excise taxes, precautionary stock fees and oil pollution fees.

  3. Efficient computation and statistical assessment of transfer entropy

    Directory of Open Access Journals (Sweden)

    Patrick eBoba

    2015-03-01

    Full Text Available The analysis of complex systems frequently poses the challenge of distinguishing correlation from causation. Statistical physics has inspired very promising approaches to search for correlations in time series; the transfer entropy in particular (Hlavackova-Schindler et al., 2007). Now, methods from computational statistics can quantitatively assign significance to such correlation measures. In this study, we propose and apply a procedure to statistically assess transfer entropies by one-sided tests. We introduce two null models of vanishing correlations for time series with memory. We implemented them in an OpenMP-based, parallelized C++ package for multi-core CPUs. Using template meta-programming, we enable a compromise between memory and run-time efficiency.
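    For orientation, a plug-in transfer-entropy estimate for two discretized time series with order-1 histories can be written as below; this is a minimal sketch without the bias corrections, surrogate null models or one-sided tests that the paper and its C++ package provide.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, bins=4):
    """Plug-in estimate of transfer entropy T(X -> Y) using order-1 histories
    and equal-width binning; a minimal illustration, no significance testing."""
    xb = np.digitize(x, np.histogram_bin_edges(x, bins)[1:-1])
    yb = np.digitize(y, np.histogram_bin_edges(y, bins)[1:-1])
    triples = Counter(zip(yb[1:], yb[:-1], xb[:-1]))       # (y_t+1, y_t, x_t)
    pairs_y = Counter(zip(yb[1:], yb[:-1]))                # (y_t+1, y_t)
    pairs_cond = Counter(zip(yb[:-1], xb[:-1]))            # (y_t, x_t)
    marg_y = Counter(yb[:-1])                              # y_t
    n = sum(triples.values())
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_y1_given_y0x0 = c / pairs_cond[(y0, x0)]
        p_y1_given_y0 = pairs_y[(y1, y0)] / marg_y[y0]
        te += p_joint * np.log2(p_y1_given_y0x0 / p_y1_given_y0)
    return te

rng = np.random.default_rng(0)
x = rng.normal(size=2001)
y = np.roll(x, 1) + 0.5 * rng.normal(size=2001)            # y driven by lagged x
print(round(transfer_entropy(x[1:], y[1:]), 3),            # should be the larger
      round(transfer_entropy(y[1:], x[1:]), 3))
```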

  4. Caveats for using statistical significance tests in research assessments

    DEFF Research Database (Denmark)

    Schneider, Jesper Wiborg

    2013-01-01

    controversial and numerous criticisms have been leveled against their use. Based on examples from articles by proponents of the use of statistical significance tests in research assessments, we address some of the numerous problems with such tests. The issues specifically discussed are the ritual practice of such tests, their dichotomous application in decision making, the difference between statistical and substantive significance, the implausibility of most null hypotheses, the crucial assumption of randomness, as well as the utility of standard errors and confidence intervals for inferential purposes. We argue that applying statistical significance tests and mechanically adhering to their results are highly problematic and detrimental to critical thinking. We claim that the use of such tests does not provide any advantages in relation to deciding whether differences between citation indicators are...

  5. Causality in Statistical Power: Isomorphic Properties of Measurement, Research Design, Effect Size, and Sample Size

    OpenAIRE

    R. Eric Heidel

    2016-01-01

    Statistical power is the ability to detect a significant effect, given that the effect actually exists in a population. Like most statistical concepts, statistical power tends to induce cognitive dissonance in hepatology researchers. However, planning for statistical power by an a priori sample size calculation is of paramount importance when designing a research study. There are five specific empirical components that make up an a priori sample size calculation: the scale of measurement of t...
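    The truncated abstract refers to an a priori sample size calculation; as a hedged illustration, the snippet below solves for the per-group n of an independent-samples t-test from an assumed effect size, alpha and target power using statsmodels.

```python
from statsmodels.stats.power import TTestIndPower

# A priori sample size for an independent-samples t-test: the empirical components
# named in the abstract reduce here to effect size (Cohen's d), alpha, desired
# power and the test implied by the design. Input values are illustrative only.
analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.80,
                                   ratio=1.0, alternative='two-sided')
print(round(n_per_group))   # ≈ 64 participants per group for a medium effect
```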

  6. A New Statistic Approach towards Landslide Hazard Risk Assessment

    OpenAIRE

    George Gaprindashvili; Jianping Guo; Panisara Daorueang; Tian Xin; Pooyan Rahimy

    2014-01-01

    To quantitatively assess the landslide hazard in Khelvachauri, Georgia, the statistical hazard-index method was applied. A spatial database was constructed in a Geographic Information System (GIS), including topographic data, geologic maps, land-use, and active landslide events (extracted from the landslide inventory). After that, causal factors of landslides (such as slope, aspect, lithology, geomorphology, land-use and soil depth) were produced to calculate the correspo...

  7. Assessing Budget Support with Statistical Impact Evaluation: a Methodological Proposal

    OpenAIRE

    Elbers, Chris; Gunning, Jan Willem; de Hoop, Kobus

    2007-01-01

    Donor agencies and recipient governments want to assess the effectiveness of aid-supported sector policies. Unfortunately, existing methods for impact evaluation are designed for the evaluation of homogeneous interventions (‘projects’) where those with and without ‘treatment’ can be compared. The lack of a methodology for evaluations of sector-wide programs is a serious constraint in the debate on aid effectiveness. We propose a method of statistical impact evaluation in situations with heter...

  8. Reliability assessment for safety critical systems by statistical random testing

    International Nuclear Information System (INIS)

    In this report we present an overview of reliability assessment for software and focus on some basic aspects of assessing reliability for safety-critical systems by statistical random testing. We also discuss possible deviations from some essential assumptions on which the general methodology is based. These deviations appear quite likely in practical applications. We present and discuss possible remedies and adjustments and then apply this methodology to a portion of the SDS1 software. We also indicate shortcomings of the methodology and possible avenues to follow to address these problems. (author). 128 refs., 11 tabs., 31 figs

  9. A statistical model of uplink inter-cell interference with slow and fast power control mechanisms

    KAUST Repository

    Tabassum, Hina

    2013-09-01

    Uplink power control is in essence an interference mitigation technique that aims at minimizing the inter-cell interference (ICI) in cellular networks by reducing the transmit power levels of the mobile users while maintaining their target received signal quality levels at base stations. Power control mechanisms directly impact the interference dynamics and, thus, affect the overall achievable capacity and consumed power in cellular networks. Due to the stochastic nature of wireless channels and mobile users' locations, it is important to derive theoretical models for ICI that can capture the impact of design alternatives related to power control mechanisms. To this end, we derive and verify a novel statistical model for uplink ICI in Generalized-K composite fading environments as a function of various slow and fast power control mechanisms. The derived expressions are then utilized to quantify numerically key network performance metrics that include average resource fairness, average reduction in power consumption, and ergodic capacity. The accuracy of the derived expressions is validated via Monte-Carlo simulations. Results are generated for multiple network scenarios, and insights are extracted to assess various power control mechanisms as a function of system parameters. © 1972-2012 IEEE.

  10. Statistical analysis about corrosion in nuclear power plants

    International Nuclear Information System (INIS)

    Investigations have been carried out into the degradation mechanisms of structures, systems and components in nuclear power plants, since many of the processes involved determine their reliability, the integrity of their components, and their safety. This work presents statistics from studies of materials corrosion, in its wide variety of specific mechanisms, reported worldwide for PWR, BWR and WWER reactors, analysing the AIRS (Advanced Incident Reporting System) for the period 1993-1998 for the first two reactor types and for the period 1982-1995 for the WWER. Identification of the contributing factors allows the events to be characterized as those which apply, i.e. those that occurred in the presence of some corrosion mechanism, and those which do not apply, i.e. those due to natural factors, mechanical failures or human errors. Finally, the total number of cases analysed corresponds to the sum of the cases which apply and those which do not. (Author)

  11. The Statistics Concept Inventory: Development and analysis of a cognitive assessment instrument in statistics

    Science.gov (United States)

    Allen, Kirk

    The Statistics Concept Inventory (SCI) is a multiple choice test designed to assess students' conceptual understanding of topics typically encountered in an introductory statistics course. This dissertation documents the development of the SCI from Fall 2002 up to Spring 2006. The first phase of the project essentially sought to answer the question: "Can you write a test to assess topics typically encountered in introductory statistics?" Book One presents the results utilized in answering this question in the affirmative. The bulk of the results present the development and evolution of the items, primarily relying on objective metrics to gauge effectiveness but also incorporating student feedback. The second phase boils down to: "Now that you have the test, what else can you do with it?" This includes an exploration of Cronbach's alpha, the most commonly-used measure of test reliability in the literature. An online version of the SCI was designed, and its equivalency to the paper version is assessed. Adding an extra wrinkle to the online SCI, subjects rated their answer confidence. These results show a general positive trend between confidence and correct responses. However, some items buck this trend, revealing potential sources of misunderstandings, with comparisons offered to the extant statistics and probability educational research. The third phase is a re-assessment of the SCI: "Are you sure?" A factor analytic study favored a uni-dimensional structure for the SCI, although maintaining the likelihood of a deeper structure if more items can be written to tap similar topics. A shortened version of the instrument is proposed, demonstrated to be able to maintain a reliability nearly identical to that of the full instrument. Incorporating student feedback and a faculty topics survey, improvements to the items and recommendations for further research are proposed. The state of the concept inventory movement is assessed, to offer a comparison to the work presented
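    Since the dissertation leans on Cronbach's alpha as its reliability measure, a standard textbook computation of alpha is sketched below on made-up 0/1 response data; it illustrates the statistic only, not the SCI's actual item data.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_examinees, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score).
    A standard textbook computation, shown only to illustrate the measure."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# toy 0/1 responses of 6 examinees to 4 items
x = np.array([[1, 1, 1, 1],
              [1, 1, 1, 0],
              [1, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 0],
              [0, 0, 0, 0]])
print(round(cronbach_alpha(x), 3))
```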

  12. The power and statistical behaviour of allele-sharing statistics when applied to models with two disease loci

    Indian Academy of Sciences (India)

    Yin Y. Shugart; Bing-Jian Feng; Andrew Collins

    2002-11-01

    We have evaluated the power for detecting a common trait determined by two loci, using seven statistics, of which five are implemented in the computer program SimWalk2, and two are implemented in GENEHUNTER. Unlike most previous reports which involve evaluations of the power of allele-sharing statistics for a single disease locus, we have used a simulated data set of general pedigrees in which a two-locus disease is segregating and evaluated several non-parametric linkage statistics implemented in the two programs. We found that the power for detecting linkage using the $S_{\text{all}}$ statistic in GENEHUNTER (GH, version 2.1), implemented as statistic in SimWalk2 (version 2.82), is different in the two. The values associated with statistic output by SimWalk2 are consistently more conservative than those from GENEHUNTER except when the underlying model includes heterogeneity at a level of 50% where the values output are very comparable. On the other hand, when the thresholds are determined empirically under the null hypothesis, $S_{\text{all}}$ in GENEHUNTER and statistic have similar power.

  13. A statistical proposal for environmental impact assessment of development projects

    International Nuclear Information System (INIS)

    Environmental impact assessment of development projects is a fundamental process whose main goal is to prevent their construction and operation from having serious negative consequences for the environment. Some of the most important limitations of the models employed to assess environmental impacts are the subjectivity of their parameters and weights, and the multicollinearity among variables, which carry large amounts of similar information. This work presents a multivariate statistical method that tries to reduce these limitations. For this purpose, the environmental impact is evaluated through different environmental impact attributes and environmental elements, synthesized into an environmental quality index (ICA, in Spanish). The ICA can be applied at different levels, such as the whole project, or partially to one or a few environmental components.

  14. Environmental Assessment for power marketing policy for Southwestern Power Administration

    Energy Technology Data Exchange (ETDEWEB)

    1993-12-01

    Southwestern Power Administration (Southwestern) needs to renew expiring power sales contracts with new term (10 year) sales contracts. The existing contracts have been in place for several years and many will expire over the next ten years. Southwestern completed an Environmental Assessment on the existing power allocation in June, 1979 (a copy of the EA is attached), and there are no proposed additions of any major new generation resources, service to discrete major new loads, or major changes in operating parameters, beyond those included in the existing power allocation. Impacts from a no action plan, proposed alternative, and market power for less than 10 years are described.

  15. Environmental Assessment for power marketing policy for Southwestern Power Administration

    International Nuclear Information System (INIS)

    Southwestern Power Administration (Southwestern) needs to renew expiring power sales contracts with new term (10 year) sales contracts. The existing contracts have been in place for several years and many will expire over the next ten years. Southwestern completed an Environmental Assessment on the existing power allocation in June, 1979 (a copy of the EA is attached), and there are no proposed additions of any major new generation resources, service to discrete major new loads, or major changes in operating parameters, beyond those included in the existing power allocation. Impacts from a no action plan, proposed alternative, and market power for less than 10 years are described

  16. Statistical analysis of fire events at US nuclear power plants

    International Nuclear Information System (INIS)

    Concern about fires as a potential agent of common-cause failure in NPPs has greatly increased since the Browns Ferry NPP fire, and several regulatory actions were initiated following this incident. In investigating the chances of a fire incident leading to core melt, the unconditional frequency is found to be about 1x10 incidents per reactor-year. Detailed reviews of fire events at nuclear plants are used to quantify the fire occurrence frequency required for fire risk assessment. In this work the results of a statistical analysis of 354 fire incidents at US NPPs in the period from January 1965 to June 1985 are presented to quantify fire occurrence frequency. The distribution of fire incidents among the different types of NPPs (PWR, BWR or HTGR), the mode of plant operation, the probable cause of fire, the type of detector that detected the incident, who extinguished the fire, the suppression equipment and agent, the initiating combustible, and the component or components affected by fire are all analysed for the 354 fire incidents studied. More than 50% of the incidents occurred during the construction phase; many of them posed neither a nuclear nor a safety problem, yet they delayed the startup of the units by up to 2 years, as happened at Indian Point unit 2 (1971). There were four major fire incidents at US NPPs in the first period of the study (1965-1978) and none in the last seven years (1979-1985), which reflects the development of fire protection measures and technology. The fire events at US NPPs can be summarized as about 354 incidents at 33 locations attributed to 38 causes of fire, with 0.17 fire events/plant/year.

  17. The number of Guttman errors as a simple and powerful person-fit statistic

    OpenAIRE

    Meijer, Rob R.

    1994-01-01

    A number of studies have examined the power of several statistics that can be used to detect examinees with unexpected (nonfitting) item score patterns, or to determine person fit. This study compared the power of the U3 statistic with the power of one of the simplest person-fit statistics, the sum of the number of Guttman errors. In most cases studied, (a weighted version of) the latter statistic performed as well as the U3 statistic. Counting the number of Guttman errors seems to be a usefu...
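    A plain (unweighted) count of Guttman errors, the statistic discussed above, can be sketched as follows; the item difficulties and response patterns are made up for illustration.

```python
import numpy as np

def guttman_errors(item_scores, p_correct):
    """Count Guttman errors in one examinee's 0/1 item-score vector: with items
    ordered from easiest to hardest (by population proportion correct), an error
    is any pair where a harder item is answered correctly while an easier one is
    missed. A plain count, not the weighted version mentioned in the abstract."""
    order = np.argsort(-np.asarray(p_correct))      # easiest (highest p) first
    x = np.asarray(item_scores)[order]
    errors = 0
    for j in range(len(x)):
        if x[j] == 1:
            errors += int(np.sum(x[:j] == 0))       # easier items answered wrong
    return errors

p = [0.9, 0.8, 0.6, 0.4, 0.2]                       # item difficulties
print(guttman_errors([1, 0, 1, 0, 1], p))           # unexpected pattern -> 3 errors
print(guttman_errors([1, 1, 1, 0, 0], p))           # Guttman-consistent -> 0
```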

  18. Statistical methods for assessing agreement between continuous measurements

    DEFF Research Database (Denmark)

    Sokolowski, Ineta; Hansen, Rikke Pilegaard; Vedsted, Peter

    Background: Clinical research often involves the study of agreement amongst observers. Agreement can be measured in different ways, and one can obtain quite different values depending on which method one uses. Objective: We review the approaches that have been discussed to assess agreement between continuous measures and discuss their strengths and weaknesses. Different methods are illustrated using actual data from the 'Delay in diagnosis of cancer in general practice' project in Aarhus, Denmark. Subjects and Methods: We use the weighted kappa statistic, intraclass correlation coefficient (ICC), concordance coefficient, Bland-Altman limits of agreement and percentage of agreement to assess the agreement between patient-reported delay and doctor-reported delay in diagnosis of cancer in general practice. Key messages: The correct statistical approach is not obvious. Many studies give the product-moment correlation coefficient (r) between the results of the two measurement methods as an indicator of agreement, which is wrong. Several alternative methods have been proposed, which we describe together with preconditions for their use.
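    Of the methods listed above, the Bland-Altman limits of agreement are the quickest to sketch; the delays below are made-up values, not the Aarhus project data.

```python
import numpy as np

def bland_altman_limits(a, b):
    """Bland-Altman 95% limits of agreement between two continuous measurements
    of the same quantity (e.g., patient- and doctor-reported delay); a minimal
    sketch of one of the methods reviewed."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

patient = [30, 14, 60, 21, 45, 10, 90]     # made-up delays in days
doctor = [28, 20, 55, 14, 50, 12, 70]
bias, (lo, hi) = bland_altman_limits(patient, doctor)
print(f"bias = {bias:.1f} days, limits of agreement = ({lo:.1f}, {hi:.1f})")
```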

  19. Nuclear power plant insurance - experience and loss statistics

    International Nuclear Information System (INIS)

    Nuclear power plants are treated separately when concluding insurance contracts. National insurance pools have been established in industrial countries, co-operating on an international basis, for insuring a nuclear power plant. In combined property insurance, the nuclear risk is combined with the fire risk. In addition, there are the engineering insurances. Of these, the one of significance for nuclear power plants is the machinery insurance, which can be covered on the free insurance market. Nuclear power plants have had fewer instances of damage than other, conventional installations. (orig.)

  20. Fundamentals of modern statistical methods substantially improving power and accuracy

    CERN Document Server

    Wilcox, Rand R

    2001-01-01

    Conventional statistical methods have a very serious flaw: they routinely miss differences among groups or associations among variables that are detected by more modern techniques - even under very small departures from normality. Hundreds of journal articles have described the reasons standard techniques can be unsatisfactory, but simple, intuitive explanations are generally unavailable. Improved methods have been derived, but they are far from obvious or intuitive based on the training most researchers receive. Situations arise where even highly nonsignificant results become significant when analyzed with more modern methods. Without assuming any prior training in statistics, Part I of this book describes basic statistical principles from a point of view that makes their shortcomings intuitive and easy to understand. The emphasis is on verbal and graphical descriptions of concepts. Part II describes modern methods that address the problems covered in Part I. Using data from actual studies, many examples are include...
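
    Trimmed means are one robust estimator in this spirit; the following sketch (hypothetical data, SciPy) contrasts ordinary and 20% trimmed means when a few gross outliers are present, and is not an example taken from the book.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Two groups that differ in location, with a few gross outliers in each.
    a = np.concatenate([rng.normal(0.0, 1.0, 40), [8.0, 9.0, 10.0]])
    b = np.concatenate([rng.normal(0.8, 1.0, 40), [-7.0, 12.0, 11.0]])

    # Ordinary means are dragged around by the outliers...
    print("means:        ", a.mean(), b.mean())
    # ...while 20% trimmed means are far less affected.
    print("trimmed means:", stats.trim_mean(a, 0.2), stats.trim_mean(b, 0.2))

    # A standard test on the raw data can easily miss the shift in the bulk of
    # the distributions; Welch's t test is shown here only for contrast.
    print(stats.ttest_ind(a, b, equal_var=False))
    ```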

  1. Comparative environmental assessment of unconventional power installations

    Science.gov (United States)

    Sosnina, E. N.; Masleeva, O. V.; Kryukov, E. V.

    2015-08-01

    A procedure for the strategic environmental assessment of power installations operating on the basis of renewable energy sources (RES) was developed and described. This procedure takes into account not only the operational process of the power installation but also the whole life cycle: from the production and distribution of power resources for manufacturing of the power installations to the process of their recovery. Such an approach gives an opportunity to make a more comprehensive assessment of the influence of the power installations on the environment and may be used during adaptation of the current regulations and development of new regulations for application of different types of unconventional power installations with due account of the ecological factor. Application of the procedure of the integrated environmental assessment to mini-HPP (hydro power plant); wind, solar, and biogas power installations; and a traditional power installation operating on natural gas was considered. Comparison of environmental influence revealed advantages of new energy technologies compared to traditional ones. It is shown that solar energy installations hardly pollute the environment during operation, but the negative influence of the mining operations and the manufacturing and utilization of the materials used for solar modules is maximum. Biogas power installations are in second place as concerns the impact on the environment, due to the considerable mass of the biogas installation and gas reciprocating engine. The minimum impact on the environment is exerted by the mini-HPP. Consumption of material and energy resources for the production of a traditional power installation is less compared to power installations on RES; however, this factor increases incomparably when taking into account fuel extraction and transfer. The greatest impact on the environment is exerted by the operational process of the traditional power installations.

  2. Statistical utility theory for comparison of nuclear versus fossil power plant alternatives

    International Nuclear Information System (INIS)

    A statistical formulation of utility theory is developed for decision problems concerned with the choice among alternative strategies in electric energy production. Four alternatives are considered: nuclear power, fossil power, solar energy, and conservation policy. Attention is focused on a public electric utility thought of as a rational decision-maker. A framework for decisions is then suggested where the admissible strategies and their possible consequences represent the information available to the decision-maker. Once the objectives of the decision process are assessed, consequences can be quantified in terms of measures of effectiveness. Maximum expected utility is the criterion of choice among alternatives. Steps toward expected values are the evaluation of the multidimensional utility function and the assessment of subjective probabilities for consequences. In this respect, the multiplicative form of the utility function seems less restrictive than the additive form and almost as manageable to implement. Probabilities are expressed through subjective marginal probability density functions given at a discrete number of points. The final stage of the decision model is to establish the value of each strategy. To this end, expected utilities are computed and scaled. The result is that nuclear power offers the best alternative. 8 figures, 9 tables, 32 references
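
    A minimal sketch of the maximum-expected-utility criterion described above, using hypothetical scaled utilities and subjective probabilities at a discrete number of consequence points; it does not reproduce the study's multiplicative multi-attribute utility function or its data.

    ```python
    # Hypothetical discrete consequence points (utilities scaled to [0, 1]) and
    # subjective probabilities for each generation strategy.
    strategies = {
        "nuclear":      {"utilities": [0.9, 0.7, 0.2], "probs": [0.6, 0.3, 0.1]},
        "fossil":       {"utilities": [0.8, 0.5, 0.3], "probs": [0.5, 0.4, 0.1]},
        "solar":        {"utilities": [0.7, 0.6, 0.4], "probs": [0.4, 0.4, 0.2]},
        "conservation": {"utilities": [0.6, 0.5, 0.4], "probs": [0.5, 0.3, 0.2]},
    }

    def expected_utility(utilities, probs):
        return sum(u * p for u, p in zip(utilities, probs))

    # Rank the strategies by expected utility; the largest value is preferred.
    ranked = sorted(strategies.items(),
                    key=lambda kv: expected_utility(**kv[1]), reverse=True)
    for name, spec in ranked:
        print(f"{name:12s} EU = {expected_utility(**spec):.3f}")
    ```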

  3. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the

  4. Thermohydraulic assessment of the RP-10 reactor core to determine the maximum power

    International Nuclear Information System (INIS)

    Thermohydraulic parameters of the RP-10 reactor core are assessed for the most thermally demanded channel (hot channel). The maximum operating thermal power is determined, taking into account safety margins and a statistical treatment of the uncertainty factors.

  5. Assessing Landslide Risk Areas Using Statistical Models and Land Cover

    Science.gov (United States)

    Kim, H. G.; Lee, D. K.; Park, C.; Ahn, Y.; Sung, S.; Park, J. H.

    2015-12-01

    Recently, damage due to landslides has increased in the Republic of Korea. Extreme weather events such as typhoons and heavy rainfall related to climate change are the main factors behind this damage. In particular, Inje-gun, Gangwon-do suffered severe landslide damage in 2006 and 2007. In Inje-gun, 91% of the area is forest, so many land covers related to human activities are adjacent to forest land, and the establishment of adaptation plans to landslides was urgently needed. Landslide risk assessment can provide useful information to policy makers. The objective of this study was to assess landslide risk areas to support the establishment of adaptation plans that reduce landslide damage. Statistical distribution models (SDMs) were used to evaluate the probability of landslide occurrence, and several SDMs were used to make landslide probability maps in order to account for the uncertainty of the SDMs. The types of land cover were classified into 5 grades according to their vulnerability to landslides. The landslide probability maps were overlaid with the land cover map to calculate landslide risk, and landslide risk areas were derived from the overlay analysis. Agricultural and transportation areas in particular showed high risk over large areas in the risk map. In conclusion, policy makers in Inje-gun should consider the landslide risk map to establish adaptation plans effectively.

  6. Causality in Statistical Power: Isomorphic Properties of Measurement, Research Design, Effect Size, and Sample Size.

    Science.gov (United States)

    Heidel, R Eric

    2016-01-01

    Statistical power is the ability to detect a significant effect, given that the effect actually exists in a population. Like most statistical concepts, statistical power tends to induce cognitive dissonance in hepatology researchers. However, planning for statistical power by an a priori sample size calculation is of paramount importance when designing a research study. There are five specific empirical components that make up an a priori sample size calculation: the scale of measurement of the outcome, the research design, the magnitude of the effect size, the variance of the effect size, and the sample size. A framework grounded in the phenomenon of isomorphism, or interdependencies amongst different constructs with similar forms, will be presented to understand the isomorphic effects of decisions made on each of the five aforementioned components of statistical power. PMID:27073717
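
    For illustration, an a priori sample size calculation of the kind described above, for a two-group comparison of a continuous outcome; the effect size, alpha and power values are conventional defaults rather than figures from the article (assumes statsmodels is installed).

    ```python
    from statsmodels.stats.power import TTestIndPower

    # A priori sample size for a two-group comparison of a continuous outcome:
    # medium effect (Cohen's d = 0.5), alpha = 0.05, desired power = 0.80.
    analysis = TTestIndPower()
    n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.80,
                                       alternative="two-sided")
    print(f"required n per group: {n_per_group:.0f}")  # roughly 64
    ```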

  7. Using the statistical analysis method to assess the landslide susceptibility

    Science.gov (United States)

    Chan, Hsun-Chuan; Chen, Bo-An; Wen, Yo-Ting

    2015-04-01

    This study assessed the landslide susceptibility in the Jing-Shan River upstream watershed, central Taiwan. The landslide inventories during typhoons Toraji in 2001, Mindulle in 2004, Kalmaegi and Sinlaku in 2008, Morakot in 2009, and the 0719 rainfall event in 2011, which were established by the Taiwan Central Geological Survey, were used as landslide data. This study aims to assess the landslide susceptibility using different statistical methods, including logistic regression, the instability index method and support vector machines (SVM). After the evaluations, elevation, slope, slope aspect, lithology, terrain roughness, slope roughness, plan curvature, profile curvature, total curvature and average rainfall were chosen as the landslide factors. The validity of the three established models was further examined by the receiver operating characteristic curve. The results of logistic regression showed that terrain roughness and slope roughness had a stronger impact on the susceptibility value, while the instability index method showed that terrain roughness and lithology had a stronger impact. The use of the instability index method may lead to underestimation near the river side. In addition, the instability index method raises a potential issue about the number of factor classes: increasing the number of classes may cause an excessive coefficient of variation of the factor, while decreasing it may place a large range of nearby cells in the same susceptibility level. Finally, the receiver operating characteristic curve was used to discriminate among the three models. SVM is preferred over the other methods for the assessment of landslide susceptibility; moreover, SVM is suggested to be close to logistic regression in recognizing the medium-high and high susceptibility areas.
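
    A hedged sketch of how two of the named models can be compared by the receiver operating characteristic curve; the landslide factors and labels below are synthetic stand-ins, not the study's inventory data (assumes scikit-learn).

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline
    from sklearn.metrics import roc_auc_score

    # Synthetic stand-in data: rows are terrain cells, columns are landslide
    # factors (elevation, slope, roughness, rainfall, ...), y marks mapped slides.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(2000, 10))
    y = (X[:, 1] + 0.5 * X[:, 4] + rng.normal(scale=1.0, size=2000) > 1).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    models = {
        "logistic regression": make_pipeline(StandardScaler(),
                                             LogisticRegression(max_iter=1000)),
        "SVM": make_pipeline(StandardScaler(), SVC(probability=True)),
    }
    for name, model in models.items():
        model.fit(X_tr, y_tr)
        score = model.predict_proba(X_te)[:, 1]
        print(f"{name:20s} ROC AUC = {roc_auc_score(y_te, score):.3f}")
    ```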

  8. Safety assessment of emergency power systems for nuclear power plants

    International Nuclear Information System (INIS)

    This publication is intended to assist the safety assessor within a regulatory body, or one working as a consultant, in assessing the safety of a given design of the emergency power systems (EPS) for a nuclear power plant. The present publication refers closely to the NUSS Safety Guide 50-SG-D7 (Rev. 1), Emergency Power Systems at Nuclear Power Plants. It covers therefore exactly the same technical subject as that Safety Guide. In view of its objective, however, it attempts to help in the evaluation of possible technical solutions which are intended to fulfill the safety requirements. Section 2 clarifies the scope further by giving an outline of the assessment steps in the licensing process. After a general outline of the assessment process in relation to the licensing of a nuclear power plant, the publication is divided into two parts. First, all safety issues are presented in the form of questions that have to be answered in order for the assessor to be confident of a safe design. The second part presents the same topics in tabulated form, listing the required documentation which the assessor has to consult and those international and national technical standards pertinent to the topics. An extensive reference list provides information on standards. 1 tab

  9. On Improving the Quality and Interpretation of Environmental Assessments using Statistical Analysis and Geographic Information Systems

    Science.gov (United States)

    Karuppiah, R.; Faldi, A.; Laurenzi, I.; Usadi, A.; Venkatesh, A.

    2014-12-01

    An increasing number of studies are focused on assessing the environmental footprint of different products and processes, especially using life cycle assessment (LCA). This work shows how combining statistical methods and Geographic Information Systems (GIS) with environmental analyses can help improve the quality of results and their interpretation. Most environmental assessments in the literature yield single numbers that characterize the environmental impact of a process/product - typically global or country averages, often unchanging in time. In this work, we show how statistical analysis and GIS can help address these limitations. For example, we demonstrate a method to separately quantify uncertainty and variability in the result of LCA models using a power generation case study. This is important for rigorous comparisons between the impacts of different processes. Another challenge is the lack of data, which can affect the rigor of LCAs. We have developed an approach to estimate environmental impacts of incompletely characterized processes using predictive statistical models. This method is applied to estimate unreported coal power plant emissions in several world regions. There is also a general lack of spatio-temporal characterization of the results in environmental analyses. For instance, studies that focus on water usage do not put into context where and when water is withdrawn. Through the use of hydrological modeling combined with GIS, we quantify water stress on a regional and seasonal basis to understand water supply and demand risks for multiple users. Another example where it is important to consider regional dependency of impacts is when characterizing how agricultural land occupation affects biodiversity in a region. We developed a data-driven methodology used in conjunction with GIS to determine if there is a statistically significant difference between the impacts of growing different crops on different species in various biomes of the world.

  10. Near and Far from Equilibrium Power-Law Statistics

    CERN Document Server

    Biro, Tamas S; Biro, Gabor; Shen, Ke Ming

    2016-01-01

    We analyze the connection between $p_T$ and multiplicity distributions in a statistical framework. We connect the Tsallis parameters, $T$ and $q$, to physical properties like average energy per particle and the second scaled factorial moment, $F_2=\\langle n(n-1) \\rangle / {\\langle n \\rangle}^2$, measured in multiplicity distributions. Near and far from equilibrium scenarios with master equations for the probability of having $n$ particles, $P_n$, are reviewed based on hadronization transition rates, $\\mu_n$, from $n$ to $n+1$ particles.
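
    A small numeric check of the second scaled factorial moment defined above, $F_2=\langle n(n-1)\rangle/{\langle n\rangle}^2$, using a Poisson multiplicity distribution as a stand-in (for which $F_2=1$); the distribution is illustrative, not one fitted in the paper.

    ```python
    import numpy as np
    from scipy.stats import poisson

    # Stand-in multiplicity distribution P_n: Poisson with mean 8.
    n = np.arange(0, 60)
    P = poisson.pmf(n, mu=8.0)

    mean_n = np.sum(n * P)
    mean_nn1 = np.sum(n * (n - 1) * P)

    # Second scaled factorial moment F2 = <n(n-1)> / <n>^2 (equals 1 for a Poisson).
    F2 = mean_nn1 / mean_n ** 2
    print(f"<n> = {mean_n:.3f}, F2 = {F2:.4f}")
    ```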

  11. The issue of statistical power for overall model fit in evaluating structural equation models

    Directory of Open Access Journals (Sweden)

    Richard HERMIDA

    2015-06-01

    Full Text Available Statistical power is an important concept for psychological research. However, examining the power of a structural equation model (SEM is rare in practice. This article provides an accessible review of the concept of statistical power for the Root Mean Square Error of Approximation (RMSEA index of overall model fit in structural equation modeling. By way of example, we examine the current state of power in the literature by reviewing studies in top Industrial-Organizational (I/O Psychology journals using SEMs. Results indicate that in many studies, power is very low, which implies acceptance of invalid models. Additionally, we examined methodological situations which may have an influence on statistical power of SEMs. Results showed that power varies significantly as a function of model type and whether or not the model is the main model for the study. Finally, results indicated that power is significantly related to model fit statistics used in evaluating SEMs. The results from this quantitative review imply that researchers should be more vigilant with respect to power in structural equation modeling. We therefore conclude by offering methodological best practices to increase confidence in the interpretation of structural equation modeling results with respect to statistical power issues.
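
    One common way to compute such RMSEA-based power (in the style of MacCallum, Browne and Sugawara) uses the noncentral chi-square distribution; the sketch below uses illustrative values of N, df and the null/alternative RMSEA, which are assumptions rather than figures from this review.

    ```python
    from scipy.stats import ncx2

    # Illustrative design values (assumptions, not from the article).
    N, df, alpha = 200, 40, 0.05
    eps0, eps1 = 0.05, 0.08        # RMSEA under "close fit" H0 and the alternative

    lam0 = (N - 1) * df * eps0 ** 2     # noncentrality under H0
    lam1 = (N - 1) * df * eps1 ** 2     # noncentrality under the alternative

    crit = ncx2.ppf(1 - alpha, df, lam0)   # rejection point for the test of close fit
    power = ncx2.sf(crit, df, lam1)        # probability of rejecting when RMSEA = eps1
    print(f"power = {power:.3f}")
    ```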

  12. Statistical Power of Psychological Research: What Have We Gained in 20 Years?

    Science.gov (United States)

    Rossi, Joseph S.

    1990-01-01

    Calculated power for 6,155 statistical tests in 221 journal articles published in 1982 volumes of "Journal of Abnormal Psychology,""Journal of Consulting and Clinical Psychology," and "Journal of Personality and Social Psychology." Power to detect small, medium, and large effects was .17, .57, and .83, respectively. Concluded that power of…

  13. An assessment of recently published gene expression data analyses: reporting experimental design and statistical factors

    Directory of Open Access Journals (Sweden)

    Azuaje Francisco

    2006-06-01

    Full Text Available Abstract Background The analysis of large-scale gene expression data is a fundamental approach to functional genomics and the identification of potential drug targets. Results derived from such studies cannot be trusted unless they are adequately designed and reported. The purpose of this study is to assess current practices on the reporting of experimental design and statistical analyses in gene expression-based studies. Methods We reviewed hundreds of MEDLINE-indexed papers involving gene expression data analysis, which were published between 2003 and 2005. These papers were examined on the basis of their reporting of several factors, such as sample size, statistical power and software availability. Results Among the examined papers, we concentrated on 293 papers consisting of applications and new methodologies. These papers did not report approaches to sample size and statistical power estimation. Explicit statements on data transformation and descriptions of the normalisation techniques applied prior to data analyses (e.g. classification were not reported in 57 (37.5% and 104 (68.4% of the methodology papers respectively. With regard to papers presenting biomedical-relevant applications, 41(29.1 % of these papers did not report on data normalisation and 83 (58.9% did not describe the normalisation technique applied. Clustering-based analysis, the t-test and ANOVA represent the most widely applied techniques in microarray data analysis. But remarkably, only 5 (3.5% of the application papers included statements or references to assumption about variance homogeneity for the application of the t-test and ANOVA. There is still a need to promote the reporting of software packages applied or their availability. Conclusion Recently-published gene expression data analysis studies may lack key information required for properly assessing their design quality and potential impact. There is a need for more rigorous reporting of important experimental

  14. Power-law distributions in economics: a nonextensive statistical approach

    CERN Document Server

    Queiros, S M D; Tsallis, C; Queiros, Silvio M. Duarte; Anteneodo, Celia; Tsallis, Constantino

    2005-01-01

    The cornerstone of Boltzmann-Gibbs ($BG$) statistical mechanics is the Boltzmann-Gibbs-Jaynes-Shannon entropy $S_{BG} \\equiv -k\\int dx f(x)\\ln f(x)$, where $k$ is a positive constant and $f(x)$ a probability density function. This theory has exhibited, over more than a century, great success in the treatment of systems where short spatio/temporal correlations dominate. There are, however, anomalous natural and artificial systems that violate the basic requirements for its applicability. Different physical entropies, other than the standard one, appear to be necessary in order to satisfactorily deal with such anomalies. One such entropy is $S_q \\equiv k (1-\\int dx [f(x)]^q)/(q-1)$ (with $S_1=S_{BG}$), where the entropic index $q$ is a real parameter. It has been proposed as the basis for a generalization, referred to as {\\it nonextensive statistical mechanics}, of the $BG$ theory. $S_q$ shares with $S_{BG}$ four remarkable properties, namely {\\it concavity} ($\\forall q>0$), {\\it Lesche-stability} ($\\for...

  15. Statistics

    Science.gov (United States)

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  16. Statistical modeling and analysis of the influence of antenna polarization error on received power

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The problem of statistically modeling antenna polarization error is studied and the statistical characteristics of the antenna's received power are analyzed. A novel Stokes-vector-based method is presented to describe the concept of the antenna's polarization purity. A statistical model of the antenna's polarization error in the polarization domain is then built. When an antenna with a uniformly distributed polarization error is illuminated by an arbitrarily polarized incident field, the probability density of the antenna's received power is derived analytically. Finally, a group of curves of the deviation and standard deviation of the received power is plotted numerically.

  17. Estimating statistical power for open-enrollment group treatment trials.

    Science.gov (United States)

    Morgan-Lopez, Antonio A; Saavedra, Lissette M; Hien, Denise A; Fals-Stewart, William

    2011-01-01

    Modeling turnover in group membership has been identified as a key barrier contributing to a disconnect between the manner in which behavioral treatment is conducted (open-enrollment groups) and the designs of substance abuse treatment trials (closed-enrollment groups, individual therapy). Latent class pattern mixture models (LCPMMs) are emerging tools for modeling data from open-enrollment groups with membership turnover in recently proposed treatment trials. The current article illustrates an approach to conducting power analyses for open-enrollment designs based on the Monte Carlo simulation of LCPMM models using parameters derived from published data from a randomized controlled trial comparing Seeking Safety to a Community Care condition for women presenting with comorbid posttraumatic stress disorder and substance use disorders. The example addresses discrepancies between the analysis framework assumed in power analyses of many recently proposed open-enrollment trials and the proposed use of LCPMM for data analysis. PMID:20832971

  18. Ground assessment methods for nuclear power plant

    International Nuclear Information System (INIS)

    It goes without saying that a nuclear power plant must be constructed on the most stable and safe ground, and a reliable assessment method is required for this purpose. The Ground Integrity Sub-committee of the Committee of Civil Engineering of Nuclear Power Plant started five working groups whose purpose is to systematize the assessment procedures, including geological survey, ground examination and construction design. The tasks of the working groups are to establish a method for assessing fault activity, to standardize the rock classification method, to standardize the assessment and presentation of ground properties, to standardize test methods, and to establish the application standard for design and construction. Flow diagrams were established for the procedures of the geological survey and for the investigation of fault activity and ground properties in areas where the nuclear reactor and important outdoor equipment are scheduled to be constructed. Furthermore, flow diagrams for applying the investigation results to the design and construction of the plant, and for determining the liquefaction potential of the ground, were also established. These systematized and standardized investigation methods are expected to yield reliable data for assessing the construction site of a nuclear power plant and to lead to safe construction and operation in the future. In addition, carrying out such systematized and detailed preliminary investigations for determining the construction site of a nuclear power plant will contribute greatly to obtaining nationwide understanding of and confidence in the project. (Ishimitsu, A.)

  19. A Web Site that Provides Resources for Assessing Students' Statistical Literacy, Reasoning and Thinking

    Science.gov (United States)

    Garfield, Joan; delMas, Robert

    2010-01-01

    The Assessment Resource Tools for Improving Statistical Thinking (ARTIST) Web site was developed to provide high-quality assessment resources for faculty who teach statistics at the tertiary level but resources are also useful to statistics teachers at the secondary level. This article describes some of the numerous ARTIST resources and suggests…

  20. Replication Unreliability in Psychology: Elusive Phenomena or “Elusive” Statistical Power?

    OpenAIRE

    Tressoldi, Patrizio E.

    2012-01-01

    The focus of this paper is to analyze whether the unreliability of results related to certain controversial psychological phenomena may be a consequence of their low statistical power. Under Null Hypothesis Statistical Testing (NHST), still the most widely used statistical approach, unreliability derives from the failure to refute the null hypothesis, in particular when exact or quasi-exact replications of experiments are carried out. Taking as example the results of meta-analyses related t...

  1. Statistical study of high energy radiation from rotation-powered pulsars

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Based on our self-consistent outer gap model for high energy emission from the rotation-powered pulsars, we study the statistical properties of X-ray and γ-ray emission from the rotation-powered pulsars, and other statistical properties (e.g. diffuse γ-ray background and unidentified γ-ray point sources) related to γ-ray pulsars in our Galaxy and nearby galaxies are also considered.

  2. Statistics

    International Nuclear Information System (INIS)

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g., Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2-emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  3. Violation of statistical isotropy and homogeneity in the 21-cm power spectrum

    CERN Document Server

    Shiraishi, Maresuke; Kamionkowski, Marc; Raccanelli, Alvise

    2016-01-01

    Most inflationary models predict primordial perturbations to be statistically isotropic and homogeneous. Cosmic-Microwave-Background (CMB) observations, however, indicate a possible departure from statistical isotropy in the form of a dipolar power modulation at large angular scales. Alternative models of inflation, beyond the simplest single-field slow-roll models, can generate a small power asymmetry, consistent with these observations. Observations of clustering of quasars show, however, agreement with statistical isotropy at much smaller angular scales. Here we propose to use off-diagonal components of the angular power spectrum of the 21-cm fluctuations during the dark ages to test this power asymmetry. We forecast results for the planned SKA radio array, a future radio array, and the cosmic-variance-limited case as a theoretical proof of principle. Our results show that the 21-cm-line power spectrum will enable access to information at very small scales and at different redshift slices, thus improving u...

  4. SIESE - trimestrial bulletin - Synthesis 1995. Electric power summary statistics for Brazil

    International Nuclear Information System (INIS)

    This bulletin presents the electric power summary statistics, which cover the performance of the power system for the whole of the utilities in 1995. It offers tables with revised data concerning the last two years based on updated information supplied by both the electric utilities and the SIESE's responsibility centers. 6 figs., 36 tabs

  5. The Statistical Power of the Cluster Randomized Block Design with Matched Pairs--A Simulation Study

    Science.gov (United States)

    Dong, Nianbo; Lipsey, Mark

    2010-01-01

    This study uses simulation techniques to examine the statistical power of the group-randomized design and the matched-pair (MP) randomized block design under various parameter combinations. Both nearest neighbor matching and random matching are used for the MP design. The power of each design for any parameter combination was calculated from…

  6. A critical discussion of null hypothesis significance testing and statistical power analysis within psychological research

    DEFF Research Database (Denmark)

    Jones, Allan; Sommerlund, Bo

    2007-01-01

    The uses of null hypothesis significance testing (NHST) and statistical power analysis within psychological research are critically discussed. The article looks at the problems of relying solely on NHST when dealing with small and large sample sizes. The use of power-analysis in estimating the...

  7. On the power for linkage detection using a test based on scan statistics.

    Science.gov (United States)

    Hernández, Sonia; Siegmund, David O; de Gunst, Mathisca

    2005-04-01

    We analyze some aspects of scan statistics, which have been proposed to help for the detection of weak signals in genetic linkage analysis. We derive approximate expressions for the power of a test based on moving averages of the identity by descent allele sharing proportions for pairs of relatives at several contiguous markers. We confirm these approximate formulae by simulation. The results show that when there is a single trait-locus on a chromosome, the test based on the scan statistic is slightly less powerful than that based on the customary allele sharing statistic. On the other hand, if two genes having a moderate effect on a trait lie close to each other on the same chromosome, scan statistics improve power to detect linkage. PMID:15772104
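
    A minimal sketch of the moving-average scan statistic idea: average allele-sharing proportions over a window of contiguous markers and take the maximum; the sharing values and window length below are hypothetical.

    ```python
    import numpy as np

    def scan_statistic(sharing, window):
        """Max moving average of IBD sharing proportions over `window` contiguous markers."""
        sharing = np.asarray(sharing, dtype=float)
        kernel = np.ones(window) / window
        moving_avg = np.convolve(sharing, kernel, mode="valid")
        return moving_avg.max(), int(moving_avg.argmax())

    # Hypothetical sharing proportions for affected relative pairs at 20 markers,
    # with mildly elevated sharing around markers 8-12.
    sharing = [0.48, 0.52, 0.50, 0.47, 0.51, 0.49, 0.53, 0.58,
               0.62, 0.65, 0.63, 0.60, 0.52, 0.50, 0.49, 0.51,
               0.48, 0.50, 0.47, 0.52]
    stat, start = scan_statistic(sharing, window=5)
    print(f"scan statistic = {stat:.3f}, window starting at marker {start}")
    ```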

  8. Ten-year statistics of the electric power supply. Status and tendencies

    International Nuclear Information System (INIS)

    The ten-year statistics of the electric power supply in Denmark for 1992-2001 presents in tables and figures the trend of the electric power supply sector during the last ten years. The tables and figures present information on total energy consumption, combined heat and power generation, fuel consumption and the environment, the technical systems, economy and pricing, organization of the electricity supply, and information on electricity prices and taxes for households and industry in various countries. (LN)

  9. On detection and assessment of statistical significance of Genomic Islands

    Directory of Open Access Journals (Sweden)

    Chaudhuri Probal

    2008-04-01

    Full Text Available Abstract Background Many of the available methods for detecting Genomic Islands (GIs in prokaryotic genomes use markers such as transposons, proximal tRNAs, flanking repeats etc., or they use other supervised techniques requiring training datasets. Most of these methods are primarily based on the biases in GC content or codon and amino acid usage of the islands. However, these methods either do not use any formal statistical test of significance or use statistical tests for which the critical values and the P-values are not adequately justified. We propose a method, which is unsupervised in nature and uses Monte-Carlo statistical tests based on randomly selected segments of a chromosome. Such tests are supported by precise statistical distribution theory, and consequently, the resulting P-values are quite reliable for making the decision. Results Our algorithm (named Design-Island, an acronym for Detection of Statistically Significant Genomic Island runs in two phases. Some 'putative GIs' are identified in the first phase, and those are refined into smaller segments containing horizontally acquired genes in the refinement phase. This method is applied to Salmonella typhi CT18 genome leading to the discovery of several new pathogenicity, antibiotic resistance and metabolic islands that were missed by earlier methods. Many of these islands contain mobile genetic elements like phage-mediated genes, transposons, integrase and IS elements confirming their horizontal acquirement. Conclusion The proposed method is based on statistical tests supported by precise distribution theory and reliable P-values along with a technique for visualizing statistically significant islands. The performance of our method is better than many other well known methods in terms of their sensitivity and accuracy, and in terms of specificity, it is comparable to other methods.
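
    The flavour of such a Monte Carlo test can be sketched as follows: the GC content of a candidate window is compared against randomly selected same-length segments of the chromosome to obtain a P-value. The sequence, window and test direction below are illustrative and do not reproduce the Design-Island algorithm itself.

    ```python
    import random

    def gc_content(seq):
        return (seq.count("G") + seq.count("C")) / len(seq)

    def monte_carlo_p_value(genome, start, length, n_draws=10000, seed=0):
        """P-value for the GC content of a candidate window versus randomly
        selected same-length segments (this sketch tests for elevated GC only)."""
        rng = random.Random(seed)
        observed = gc_content(genome[start:start + length])
        hits = 0
        for _ in range(n_draws):
            pos = rng.randrange(0, len(genome) - length)
            if gc_content(genome[pos:pos + length]) >= observed:
                hits += 1
        return (hits + 1) / (n_draws + 1)

    # Toy chromosome with a GC-rich segment inserted between 5000 and 7000.
    rng = random.Random(1)
    genome = "".join(rng.choice("ATGC") for _ in range(20000))
    genome = (genome[:5000]
              + "".join(rng.choice("GGCCAT") for _ in range(2000))
              + genome[7000:])
    print(monte_carlo_p_value(genome, start=5000, length=2000))
    ```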

  10. Nuclear power plant security assessment technical manual.

    Energy Technology Data Exchange (ETDEWEB)

    O' Connor, Sharon L.; Whitehead, Donnie Wayne; Potter, Claude S., III

    2007-09-01

    This report (Nuclear Power Plant Security Assessment Technical Manual) is a revision to NUREG/CR-1345 (Nuclear Power Plant Design Concepts for Sabotage Protection) that was published in January 1981. It provides conceptual and specific technical guidance for U.S. Nuclear Regulatory Commission nuclear power plant design certification and combined operating license applicants as they: (1) develop the layout of a facility (i.e., how buildings are arranged on the site property and how they are arranged internally) to enhance protection against sabotage and facilitate the use of physical security features; (2) design the physical protection system to be used at the facility; and (3) analyze the effectiveness of the PPS against the design basis threat. It should be used as a technical manual in conjunction with the 'Nuclear Power Plant Security Assessment Format and Content Guide'. The opportunity to optimize physical protection in the design of a nuclear power plant is obtained when an applicant utilizes both documents when performing a security assessment. This document provides a set of best practices that incorporates knowledge gained from more than 30 years of physical protection system design and evaluation activities at Sandia National Laboratories and insights derived from U.S. Nuclear Regulatory Commission technical staff into a manual that describes a development and analysis process of physical protection systems suitable for future nuclear power plants. In addition, selected security system technologies that may be used in a physical protection system are discussed. The scope of this document is limited to the identification of a set of best practices associated with the design and evaluation of physical security at future nuclear power plants in general. As such, it does not provide specific recommendations for the design and evaluation of physical security for any specific reactor design. These best practices should be applicable to the design and

  11. Nuclear power plant security assessment technical manual

    International Nuclear Information System (INIS)

    This report (Nuclear Power Plant Security Assessment Technical Manual) is a revision to NUREG/CR-1345 (Nuclear Power Plant Design Concepts for Sabotage Protection) that was published in January 1981. It provides conceptual and specific technical guidance for U.S. Nuclear Regulatory Commission nuclear power plant design certification and combined operating license applicants as they: (1) develop the layout of a facility (i.e., how buildings are arranged on the site property and how they are arranged internally) to enhance protection against sabotage and facilitate the use of physical security features; (2) design the physical protection system to be used at the facility; and (3) analyze the effectiveness of the PPS against the design basis threat. It should be used as a technical manual in conjunction with the 'Nuclear Power Plant Security Assessment Format and Content Guide'. The opportunity to optimize physical protection in the design of a nuclear power plant is obtained when an applicant utilizes both documents when performing a security assessment. This document provides a set of best practices that incorporates knowledge gained from more than 30 years of physical protection system design and evaluation activities at Sandia National Laboratories and insights derived from U.S. Nuclear Regulatory Commission technical staff into a manual that describes a development and analysis process of physical protection systems suitable for future nuclear power plants. In addition, selected security system technologies that may be used in a physical protection system are discussed. The scope of this document is limited to the identification of a set of best practices associated with the design and evaluation of physical security at future nuclear power plants in general. As such, it does not provide specific recommendations for the design and evaluation of physical security for any specific reactor design. These best practices should be applicable to the design and

  12. Statistical-Based Joint Power Control for Wireless Ad Hoc CDMA Networks

    Institute of Scientific and Technical Information of China (English)

    ZHANGShu; RONGMongtian; CHENBo

    2005-01-01

    The current power control algorithm for CDMA-based ad hoc networks relies on SIR and interference measurements based on historical information. However, an important statistical property of the traffic in today's and future networks is burstiness. As a consequence, the interference at a given receiving node may fluctuate dramatically, so the convergence of power control is slow and performance degrades. This paper presents a joint power control model: for a given receiving node, all transmitting nodes assigned to the same time slot adjust their transmitter power based on current information, taking into account the power adjustments of the other transmitting nodes. Based on the joint power control model, this paper proposes a statistical-based power control algorithm through which the interference is estimated more accurately. Simulation results indicate that the proposed power control algorithm outperforms the previous algorithm.

  13. Development and Assessment of a Preliminary Randomization-Based Introductory Statistics Curriculum

    Science.gov (United States)

    Tintle, Nathan; VanderStoep, Jill; Holmes, Vicki-Lynn; Quisenberry, Brooke; Swanson, Todd

    2011-01-01

    The algebra-based introductory statistics course is the most popular undergraduate course in statistics. While there is a general consensus for the content of the curriculum, the recent Guidelines for Assessment and Instruction in Statistics Education (GAISE) have challenged the pedagogy of this course. Additionally, some arguments have been made…

  14. Assessing Knowledge Structures in a Constructive Statistical Learning Environment

    NARCIS (Netherlands)

    P.P.J.L. Verkoeijen (Peter); Tj. Imbos; M.W.J. van de Wiel (Margje); M.P.F. Berger; H.G. Schmidt (Henk)

    2002-01-01

    textabstractIn this report, the method of free recall is put forward as a tool to evaluate a prototypical statistical learning environment. A number of students from the faculty of Health Sciences, Maastricht University, the Netherlands, were required to write down whatever they could remember of a

  15. Computer-aided assessment in statistics: the CAMPUS project

    OpenAIRE

    Hunt, Neville

    1998-01-01

    The relentless drive for 'efficiency' in higher education, and the consequent increase in workloads, has given university teachers a compelling incentive to investigate alternative forms of assessment. Some forms of assessment with a clear educational value can no longer be entertained because of the burden placed on the teacher. An added concern is plagiarism, which anecdotal evidence would suggest is on the increase yet which is difficult to detect in large modules with more than one assess...

  16. Accuracy of Estimates and Statistical Power for Testing Meditation in Latent Growth Curve Modeling

    Science.gov (United States)

    Cheong, JeeWon

    2016-01-01

    The latent growth curve modeling (LGCM) approach has been increasingly utilized to investigate longitudinal mediation. However, little is known about the accuracy of the estimates and statistical power when mediation is evaluated in the LGCM framework. A simulation study was conducted to address these issues under various conditions including sample size, effect size of mediated effect, number of measurement occasions, and R2 of measured variables. In general, the results showed that relatively large samples were needed to accurately estimate the mediated effects and to have adequate statistical power, when testing mediation in the LGCM framework. Guidelines for designing studies to examine longitudinal mediation and ways to improve the accuracy of the estimates and statistical power were discussed.

  17. Using DEWIS and R for Multi-Staged Statistics e-Assessments

    Science.gov (United States)

    Gwynllyw, D. Rhys; Weir, Iain S.; Henderson, Karen L.

    2016-01-01

    We demonstrate how the DEWIS e-Assessment system may use embedded R code to facilitate the assessment of students' ability to perform involved statistical analyses. The R code has been written to emulate SPSS output and thus the statistical results for each bespoke data set can be generated efficiently and accurately using standard R routines.…

  18. Geotechnical assessments of upgrading power transmission lines

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Andrew [Coffey Geotechnics Ltd., Harrogate (United Kingdom)

    2012-11-01

    One of the consequences of increasing demand for energy is a corresponding requirement for increased energy distribution. This trend is likely to be magnified by the current tendency to generate power in locations remote from centres of population. New power transmission routes are expensive and awkward to develop, and there are therefore benefits to be gained by upgrading existing routes. However, this in turn raises problems of a different nature. The re-use of any structure must necessarily imply the acceptance of unknowns. The upgrading of transmission lines is no exception to this, particularly when assessing foundations, which in their nature are not visible. A risk-based approach is therefore used. This paper describes some of the geotechnical aspects of the assessment of electric power transmission lines for upgrading. It briefly describes the background, then discusses some of the problems encountered and the methods used to address them. These methods are based mainly on information obtained from desk studies and walkover surveys, with a limited amount of intrusive investigation. (orig.)

  19. Statistical power analysis a simple and general model for traditional and modern hypothesis tests

    CERN Document Server

    Murphy, Kevin R; Wolach, Allen

    2014-01-01

    Noted for its accessible approach, this text applies the latest approaches of power analysis to both null hypothesis and minimum-effect testing using the same basic unified model. Through the use of a few simple procedures and examples, the authors show readers with little expertise in statistical analysis how to obtain the values needed to carry out the power analysis for their research. Illustrations of how these analyses work and how they can be used to choose the appropriate criterion for defining statistically significant outcomes are sprinkled throughout. The book presents a simple and g

  20. Statistical and RBF NN models : providing forecasts and risk assessment

    OpenAIRE

    Marček, Milan

    2009-01-01

    Forecast accuracy of economic and financial processes is a popular measure for quantifying the risk in decision making. In this paper, we develop forecasting models based on statistical (stochastic) methods, sometimes called hard computing, and on a soft method using granular computing. We consider the accuracy of forecasting models as a measure for risk evaluation. It is found that the risk estimation process based on soft methods is simplified and less critical to the question w...

  1. JRC Statistical Assessment of the 2015 ICT Development Index

    OpenAIRE

    SAISANA Michaela; DOMINGUEZ TORREIRO MARCOS

    2015-01-01

    Since 2009, the International Telecommunication Union (ITU) has been publishing its annual ICT Development Index (IDI), which benchmarks countries’ performance with regard to ICT infrastructure, use and skills. The JRC analysis, conducted at ITU’s invitation, suggests that the conceptualized three-level structure of the 2015 IDI is statistically sound in terms of coherence and balance, with the overall index as well as the three sub-indices – on ICT access, use and skills – being driven ...

  2. Assessing the South African Brain Drain, a Statistical Comparison

    OpenAIRE

    Jean-Baptiste Meyer; Mercy Brown; David Kaplan

    2000-01-01

    For several decades the analysis of the so-called brain drain has been hampered by measurement problems. It is now recognised that the official figures significantly underestimate the extent of the brain drain phenomenon and its increase since the political changes in the mid-1990's. This paper, using data from various reliable sources, provides new statistical evidence on the size of the brain drain from South Africa. It compares two methods used to arrive at a more realistic picture of the ...

  3. Climate change assessment for Mediterranean agricultural areas by statistical downscaling

    OpenAIRE

    Palatella, L.; Miglietta, M. M.; Paradisi, P.; Lionello, P.

    2010-01-01

    In this paper we produce projections of seasonal precipitation for four Mediterranean areas: Apulia region (Italy), Ebro river basin (Spain), Po valley (Italy) and Antalya province (Turkey). We performed the statistical downscaling using Canonical Correlation Analysis (CCA) in two versions: in one case Principal Component Analysis (PCA) filter is applied only to predictor and in the other to both predictor and predictand. After performing a validation test, CCA after PCA filter on both ...

  4. Statistical tests, P values, confidence intervals, and power: a guide to misinterpretations.

    Science.gov (United States)

    Greenland, Sander; Senn, Stephen J; Rothman, Kenneth J; Carlin, John B; Poole, Charles; Goodman, Steven N; Altman, Douglas G

    2016-04-01

    Misinterpretation and abuse of statistical tests, confidence intervals, and statistical power have been decried for decades, yet remain rampant. A key problem is that there are no interpretations of these concepts that are at once simple, intuitive, correct, and foolproof. Instead, correct use and interpretation of these statistics requires an attention to detail which seems to tax the patience of working scientists. This high cognitive demand has led to an epidemic of shortcut definitions and interpretations that are simply wrong, sometimes disastrously so - and yet these misinterpretations dominate much of the scientific literature. In light of this problem, we provide definitions and a discussion of basic statistics that are more general and critical than typically found in traditional introductory expositions. Our goal is to provide a resource for instructors, researchers, and consumers of statistics whose knowledge of statistical theory and technique may be limited but who wish to avoid and spot misinterpretations. We emphasize how violation of often unstated analysis protocols (such as selecting analyses for presentation based on the P values they produce) can lead to small P values even if the declared test hypothesis is correct, and can lead to large P values even if that hypothesis is incorrect. We then provide an explanatory list of 25 misinterpretations of P values, confidence intervals, and power. We conclude with guidelines for improving statistical interpretation and reporting. PMID:27209009

  5. Condition assessment of electrical power plant

    International Nuclear Information System (INIS)

    The large investments associated with main equipment items in electric power plants, both in terms of acquisition and conservation and in aspects such as safety, make it increasingly necessary and profitable to implement techniques for the monitoring and predictive assessment of the state of these equipment items. This paper highlights the benefits of applying such a programme to large electric equipment, describing in detail the technologies available for the evaluation and followup of the state of insulation, and the mechanical characteristics of large transformer windings. There is also a description of real cases where these technologies are used, showing the results obtained on equipment items which are in good condition and those which are damaged. The paper finally addresses actions resulting from these evaluation programmes, and applicable conclusions based on the large number of inspection techniques and tools that power plants can use nowadays to ensure continuous, reliable operation with optimised performance and reduced operating costs. (Author)

  6. Probabilistic assessment of fatigue life including statistical uncertainties in the S-N curve

    International Nuclear Information System (INIS)

    A probabilistic framework is set up to assess the fatigue life of components of nuclear power plants. It is intended to incorporate all kinds of uncertainties, such as those appearing in the specimen fatigue life, the design sub-factor, the mechanical model and the applied loading. This paper details the first step, which corresponds to the statistical treatment of the fatigue specimen test data. The specimen fatigue life at stress amplitude S is represented by a lognormal random variable whose mean and standard deviation depend on S. This characterization is then used to compute the random fatigue life of a component subjected to a single kind of cycle. Specifically, the mean and coefficient of variation of this quantity are studied, as well as the reliability associated with the (deterministic) design value. (author)
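
    A minimal sketch of the specimen-life representation described above: a lognormal fatigue life whose parameters depend on the stress amplitude S, and the reliability associated with a deterministic design value; the S-N relationship, scatter and design rule below are assumed for illustration, not the paper's fitted model.

    ```python
    from scipy.stats import lognorm

    # Hypothetical lognormal specimen fatigue life N(S): median follows an assumed
    # S-N power law, with a fixed log-scatter (both values are illustrative).
    def fatigue_life_dist(S):
        median = 1.0e12 * S ** (-3.0)   # hypothetical median S-N relationship
        sigma_log = 0.4                  # hypothetical scatter of log-life
        return lognorm(s=sigma_log, scale=median)

    S = 200.0                            # stress amplitude (illustrative units)
    dist = fatigue_life_dist(S)

    # Hypothetical deterministic design value: a low quantile of the specimen
    # life divided by an assumed design sub-factor.
    design_life = dist.ppf(0.01) / 12.0

    # Reliability associated with the design value: P(actual life > design life).
    print(f"P(N > design life) = {dist.sf(design_life):.4f}")
    ```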

  7. Mixed Effects Models for Resampled Network Statistics Improves Statistical Power to Find Differences in Multi-Subject Functional Connectivity.

    Science.gov (United States)

    Narayan, Manjari; Allen, Genevera I

    2016-01-01

    Many complex brain disorders, such as autism spectrum disorders, exhibit a wide range of symptoms and disability. To understand how brain communication is impaired in such conditions, functional connectivity studies seek to understand individual differences in brain network structure in terms of covariates that measure symptom severity. In practice, however, functional connectivity is not observed but estimated from complex and noisy neural activity measurements. Imperfect subject network estimates can compromise subsequent efforts to detect covariate effects on network structure. We address this problem in the case of Gaussian graphical models of functional connectivity, by proposing novel two-level models that treat both subject level networks and population level covariate effects as unknown parameters. To account for imperfectly estimated subject level networks when fitting these models, we propose two related approaches-R (2) based on resampling and random effects test statistics, and R (3) that additionally employs random adaptive penalization. Simulation studies using realistic graph structures reveal that R (2) and R (3) have superior statistical power to detect covariate effects compared to existing approaches, particularly when the number of within subject observations is comparable to the size of subject networks. Using our novel models and methods to study parts of the ABIDE dataset, we find evidence of hypoconnectivity associated with symptom severity in autism spectrum disorders, in frontoparietal and limbic systems as well as in anterior and posterior cingulate cortices. PMID:27147940

  8. Mixed Effects Models for Resampled Network Statistics Improves Statistical Power to Find Differences in Multi-Subject Functional Connectivity

    Directory of Open Access Journals (Sweden)

    Manjari eNarayan

    2016-04-01

    Full Text Available Many complex brain disorders such as Autism Spectrum Disorders exhibit a wide range of symptoms and disability. To understand how brain communication is impaired in such conditions, functional connectivity studies seek to understand individual differences in brain network structure in terms of covariates that measure symptom severity. In practice, however, functional connectivity is not observed but estimated from complex and noisy neural activity measurements. Imperfect subject network estimates can compromise subsequent efforts to detect covariate effects on network structure. We address this problem in the case of Gaussian graphical models of functional connectivity, by proposing novel two-level models that treat both subject level networks and population level covariate effects as unknown parameters. To account for imperfectly estimated subject level networks when fitting these models, we propose two related approaches --- R^2 based on resampling and random effects test statistics, and R^3 that additionally employs random adaptive penalization. Simulation studies using realistic graph structures reveal that R^2 and R^3 have superior statistical power to detect covariate effects compared to existing approaches, particularly when the number of within subject observations is comparable to the size of subject networks. Using our novel models and methods to study parts of the ABIDE dataset, we find evidence of hypoconnectivity associated with symptom severity in Autism Spectrum Disorders, in frontoparietal and limbic systems as well as in anterior and posterior cingulate cortices.

  9. Mixed Effects Models for Resampled Network Statistics Improves Statistical Power to Find Differences in Multi-Subject Functional Connectivity

    Science.gov (United States)

    Narayan, Manjari; Allen, Genevera I.

    2016-01-01

    Many complex brain disorders, such as autism spectrum disorders, exhibit a wide range of symptoms and disability. To understand how brain communication is impaired in such conditions, functional connectivity studies seek to understand individual differences in brain network structure in terms of covariates that measure symptom severity. In practice, however, functional connectivity is not observed but estimated from complex and noisy neural activity measurements. Imperfect subject network estimates can compromise subsequent efforts to detect covariate effects on network structure. We address this problem in the case of Gaussian graphical models of functional connectivity, by proposing novel two-level models that treat both subject level networks and population level covariate effects as unknown parameters. To account for imperfectly estimated subject level networks when fitting these models, we propose two related approaches—R2 based on resampling and random effects test statistics, and R3 that additionally employs random adaptive penalization. Simulation studies using realistic graph structures reveal that R2 and R3 have superior statistical power to detect covariate effects compared to existing approaches, particularly when the number of within subject observations is comparable to the size of subject networks. Using our novel models and methods to study parts of the ABIDE dataset, we find evidence of hypoconnectivity associated with symptom severity in autism spectrum disorders, in frontoparietal and limbic systems as well as in anterior and posterior cingulate cortices.

  10. The power of alternative assessments (AAs)

    Institute of Scientific and Technical Information of China (English)

    张千茜

    2013-01-01

    This article starts by discussing the potential disadvantages of traditional assessment for young English as a Second Language (ESL) learners within the American public school education system. In response to such disadvantages, researchers' call for the implementation of alternative assessments (AAs) is introduced along with the various benefits of AAs. However, the current mainstream education policy in the US, namely the No Child Left Behind (NCLB) Policy, is still largely based on traditional ways of testing, making policy-oriented implementation of AAs on a large scale remarkably difficult. After careful analysis, the author points out several implications concerning how, under the existing NCLB policy, practitioners can effectively accommodate young ESL learners by applying the power of AAs.

  11. Jacobian integration method increases the statistical power to measure gray matter atrophy in multiple sclerosis

    Directory of Open Access Journals (Sweden)

    Kunio Nakamura

    2014-01-01

    Full Text Available Gray matter atrophy provides important insights into neurodegeneration in multiple sclerosis (MS and can be used as a marker of neuroprotection in clinical trials. Jacobian integration is a method for measuring volume change that uses integration of the local Jacobian determinants of the nonlinear deformation field registering two images, and is a promising tool for measuring gray matter atrophy. Our main objective was to compare the statistical power of the Jacobian integration method to commonly used methods in terms of the sample size required to detect a treatment effect on gray matter atrophy. We used multi-center longitudinal data from relapsing–remitting MS patients and evaluated combinations of cross-sectional and longitudinal pre-processing with SIENAX/FSL, SPM, and FreeSurfer, as well as the Jacobian integration method. The Jacobian integration method outperformed these other commonly used methods, reducing the required sample size by a factor of 4–5. The results demonstrate the advantage of using the Jacobian integration method to assess neuroprotection in MS clinical trials.
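    The sample-size comparison in the record above rests on the standard two-sample power calculation: halving the measurement noise of the atrophy estimate cuts the required number of patients by a factor of four. The sketch below illustrates that relationship only; the atrophy rate, standard deviations and 25% treatment effect are hypothetical placeholders, not values taken from the study.

        from scipy.stats import norm

        def sample_size_per_arm(delta, sigma, alpha=0.05, power=0.80):
            """Two-sample z-approximation: patients per arm needed to detect a
            mean difference `delta` when the outcome has standard deviation `sigma`."""
            z_a = norm.ppf(1 - alpha / 2)   # two-sided significance threshold
            z_b = norm.ppf(power)           # target power
            return 2 * ((z_a + z_b) * sigma / delta) ** 2

        # Hypothetical numbers: a treatment that removes 25% of a -0.8 %/yr mean
        # atrophy rate (delta = 0.2 %/yr), measured with two levels of noise.
        n_noisy   = sample_size_per_arm(delta=0.2, sigma=1.0)
        n_precise = sample_size_per_arm(delta=0.2, sigma=0.5)
        print(f"per-arm n with sigma = 1.0 %/yr: {n_noisy:.0f}")
        print(f"per-arm n with sigma = 0.5 %/yr: {n_precise:.0f} (4x fewer patients)")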

  12. Atomic Bomb Survivors Life-Span Study: Insufficient Statistical Power to Select Radiation Carcinogenesis Model.

    Science.gov (United States)

    Socol, Yehoshua; Dobrzyński, Ludwik

    2015-01-01

    The atomic bomb survivors life-span study (LSS) is often claimed to support the linear no-threshold hypothesis (LNTH) of radiation carcinogenesis. This paper shows that this claim is baseless. The LSS data are equally or better described by an s-shaped dependence on radiation exposure with a threshold of about 0.3 Sievert (Sv) and saturation level at about 1.5 Sv. A Monte-Carlo simulation of possible LSS outcomes demonstrates that, given the weak statistical power, LSS cannot provide support for LNTH. Even if the LNTH is used at low dose and dose rates, its estimation of excess cancer mortality should be communicated as 2.5% per Sv, i.e., an increase of cancer mortality from about 20% spontaneous mortality to about 22.5% per Sv, which is about half of the usually cited value. The impact of the "neutron discrepancy problem" - the apparent difference between the calculated and measured values of neutron flux in Hiroshima - was studied and found to be marginal. Major revision of the radiation risk assessment paradigm is required. PMID:26673526

  13. Statistically Based Approach to Broadband Liner Design and Assessment

    Science.gov (United States)

    Nark, Douglas M. (Inventor); Jones, Michael G. (Inventor)

    2016-01-01

    A broadband liner design optimization includes utilizing in-duct attenuation predictions with a statistical fan source model to obtain optimum impedance spectra over a number of flow conditions for one or more liner locations in a bypass duct. The predicted optimum impedance information is then used with acoustic liner modeling tools to design liners having impedance spectra that most closely match the predicted optimum values. Design selection is based on an acceptance criterion that provides the ability to apply increasing weighting to specific frequencies and/or operating conditions. One or more broadband design approaches are utilized to produce a broadband liner that targets a full range of frequencies and operating conditions.

  14. In vivo Comet assay – statistical analysis and power calculations of mice testicular cells

    DEFF Research Database (Denmark)

    Hansen, Merete Kjær; Sharma, Anoop Kumar; Dybdahl, Marianne;

    2014-01-01

    The in vivo Comet assay is a sensitive method for evaluating DNA damage. A recurrent concern is how to analyze the data appropriately and efficiently. A popular approach is to summarize the raw data into a summary statistic prior to the statistical analysis. However, consensus on which summary...... statistic to use has yet to be reached. Another important consideration concerns the assessment of proper sample sizes in the design of Comet assay studies. This study aims to identify a statistic suitably summarizing the % tail DNA of mice testicular samples in Comet assay studies. A second aim is to......-97-5, CAS no. 85-28-9, CAS no. 13674-87-8, CAS no. 43100-38-5 and CAS no. 60965-26-6. Testicular cells were examined using the alkaline version of the Comet assay and the DNA damage was quantified as % tail DNA using a fully automatic scoring system. From the raw data 23 summary statistics were examined. A...

  15. A Statistical Model for Uplink Intercell Interference with Power Adaptation and Greedy Scheduling

    KAUST Repository

    Tabassum, Hina

    2012-10-03

    This paper deals with the statistical modeling of uplink inter-cell interference (ICI) considering greedy scheduling with power adaptation based on channel conditions. The derived model is implicitly generalized for any kind of shadowing and fading environments. More precisely, we develop a generic model for the distribution of ICI based on the locations of the allocated users and their transmit powers. The derived model is utilized to evaluate important network performance metrics such as ergodic capacity, average fairness and average power preservation numerically. Monte-Carlo simulation details are included to support the analysis and show the accuracy of the derived expressions. In parallel to the literature, we show that greedy scheduling with power adaptation reduces the ICI, average power consumption of users, and enhances the average fairness among users, compared to the case without power adaptation. © 2012 IEEE.

  16. Computer-aided assessment in statistics: the CAMPUS project

    Directory of Open Access Journals (Sweden)

    Neville Hunt

    1998-12-01

    Full Text Available The relentless drive for 'efficiency' in higher education, and the consequent increase in workloads, has given university teachers a compelling incentive to investigate alternative forms of assessment. Some forms of assessment with a clear educational value can no longer be entertained because of the burden placed on the teacher. An added concern is plagiarism, which anecdotal evidence would suggest is on the increase yet which is difficult to detect in large modules with more than one assessor. While computer-aided assessment (CAA has an enthusiastic following, it is not clear to many teachers that it either reduces workloads or reduces the risk of cheating. In an ideal world, most teachers would prefer to give individual attention and personal feedback to each student when marking their work. In this sense CAA must be seen as second best and will therefore be used only if it is seen to offer significant benefits in terms of reduced workloads or increased validity.

  17. Accuracy of Estimates and Statistical Power for Testing Mediation in Latent Growth Curve Modeling

    Science.gov (United States)

    Cheong, JeeWon

    2011-01-01

    The latent growth curve modeling (LGCM) approach has been increasingly utilized to investigate longitudinal mediation. However, little is known about the accuracy of the estimates and statistical power when mediation is evaluated in the LGCM framework. A simulation study was conducted to address these issues under various conditions including…

  18. Using Classroom Assessment Techniques in an Introductory Statistics Class

    Science.gov (United States)

    Goldstein, Gary S.

    2007-01-01

    College instructors often provide students with only summative evaluations of their work, typically in the form of exam scores or paper grades. Formative evaluations, such as classroom assessment techniques (CATs), are rarer in higher education and provide an ongoing evaluation of students' progress. In this article, the author summarizes the use…

  19. A Teacher's Guide to Assessment Concepts and Statistics

    Science.gov (United States)

    Newman, Carole; Newman, Isadore

    2013-01-01

    The concept of teacher accountability assumes teachers will use data-driven decision making to plan and deliver appropriate and effective instruction to their students. In order to do so, teachers must be able to accurately interpret the data that is given to them, and that requires the knowledge of some basic concepts of assessment and…

  20. Environmental assessment of submarine power cables

    International Nuclear Information System (INIS)

    Extensive analyses conducted by the European Community revealed that offshore wind energy has relatively benign effects on the marine environment in comparison to other forms of electric power generation [1]. However, the materials employed in offshore wind power farms undergo major changes when confined to the marine environment under extreme conditions (saline medium, hydrostatic pressure, etc.), which can produce an important corrosion effect. This phenomenon can affect, on the one hand, the material from the structural viewpoint and, on the other hand, the marine environment. In this sense, to better understand the environmental impacts of generating electricity from offshore wind energy, this study evaluated the life cycle assessment for some new designs of submarine power cables developed by General Cable. To achieve this goal, three approaches were carried out: leaching tests, eco-toxicity tests and Life Cycle Assessment (LCA) methodologies. All of them are aimed at obtaining quantitative data for the environmental assessment of the selected submarine cables. LCA is a method used to assess the environmental aspects and potential impacts of a product or activity. LCA does not include financial and social factors, which means that the results of an LCA cannot exclusively form the basis for assessment of a product's sustainability. The leaching test results allowed the conclusion that the pH of seawater was not significantly changed by the presence of submarine three-core cables. Although it was slightly higher in the case of the broken cable, the pH values were nearly equal. Concerning the heavy metals which could migrate to the aquatic medium, there were significant differences between the two scenarios. The leaching of zinc is the major environmental concern during undersea operation of undamaged cables, whereas the fully sectioned three-core cable produced the migration of significant quantities of copper and iron in addition to the zinc migrated from the galvanized steel. Thus, the tar

  1. Environmental assessment of submarine power cables

    Energy Technology Data Exchange (ETDEWEB)

    Isus, Daniel; Martinez, Juan D. [Grupo General Cable Sistemas, S.A., 08560-Manlleu, Barcelona (Spain); Arteche, Amaya; Del Rio, Carmen; Madina, Virginia [Tecnalia Research and Innovation, 20009 San Sebastian (Spain)

    2011-03-15

    Extensive analyses conducted by the European Community revealed that offshore wind energy has relatively benign effects on the marine environment in comparison to other forms of electric power generation [1]. However, the materials employed in offshore wind power farms undergo major changes when confined to the marine environment under extreme conditions (saline medium, hydrostatic pressure, etc.), which can produce an important corrosion effect. This phenomenon can affect, on the one hand, the material from the structural viewpoint and, on the other hand, the marine environment. In this sense, to better understand the environmental impacts of generating electricity from offshore wind energy, this study evaluated the life cycle assessment for some new designs of submarine power cables developed by General Cable. To achieve this goal, three approaches were carried out: leaching tests, eco-toxicity tests and Life Cycle Assessment (LCA) methodologies. All of them are aimed at obtaining quantitative data for the environmental assessment of the selected submarine cables. LCA is a method used to assess the environmental aspects and potential impacts of a product or activity. LCA does not include financial and social factors, which means that the results of an LCA cannot exclusively form the basis for assessment of a product's sustainability. The leaching test results allowed the conclusion that the pH of seawater was not significantly changed by the presence of submarine three-core cables. Although it was slightly higher in the case of the broken cable, the pH values were nearly equal. Concerning the heavy metals which could migrate to the aquatic medium, there were significant differences between the two scenarios. The leaching of zinc is the major environmental concern during undersea operation of undamaged cables, whereas the fully sectioned three-core cable produced the migration of significant quantities of copper and iron in addition to the zinc migrated from the galvanized steel. Thus, the tar

  2. A comprehensive statistical assessment of star-planet interaction

    CERN Document Server

    Miller, Brendan P; Wright, Jason T; Pearson, Elliott G

    2014-01-01

    We investigate whether magnetic interaction between close-in giant planets and their host stars produce observable statistical enhancements in stellar coronal or chromospheric activity. New Chandra observations of 12 nearby (d < 60 pc) planet-hosting solar analogs are combined with archival Chandra, XMM-Newton, and ROSAT coverage of 11 similar stars to construct a sample inoculated against inherent stellar class and planet-detection biases. Survival analysis and Bayesian regression methods (incorporating both measurement errors and X-ray upper limits; 13/23 stars have secure detections) are used to test whether "hot Jupiter" hosts are systematically more X-ray luminous than comparable stars with more distant or smaller planets. No significant correlations are present between common proxies for interaction strength (M_P/a^2 or 1/a) versus coronal activity (L_X or L_X/L_bol). In contrast, a sample of 198 FGK main-sequence stars does show a significant (~99% confidence) increase in X-ray luminosity with M_P/a^2. While selection biases are incontrovertibly present within the main-sequence sample, we demonstrate that the effect is primarily driven by a handful of extreme hot-Jupiter systems with M_P/a^2 > 450 M_Jup/AU^2, which here are all X-ray luminous but to a degree commensurate with their Ca II H and K activity, in contrast to presented magnetic star-planet interaction scenarios that predict enhancements relatively larger in L_X. We discuss these results in the context of cumulative tidal spin-up of stars hosting close-in gas giants (potentially followed by planetary infall and destruction). We also test our main-sequence sample for correlations between planetary properties and UV luminosity or Ca II H and K emission, and find no significant dependence.

  3. Violation of statistical isotropy and homogeneity in the 21-cm power spectrum

    Science.gov (United States)

    Shiraishi, Maresuke; Muñoz, Julian B.; Kamionkowski, Marc; Raccanelli, Alvise

    2016-05-01

    Most inflationary models predict primordial perturbations to be statistically isotropic and homogeneous. Cosmic microwave background (CMB) observations, however, indicate a possible departure from statistical isotropy in the form of a dipolar power modulation at large angular scales. Alternative models of inflation, beyond the simplest single-field slow-roll models, can generate a small power asymmetry, consistent with these observations. Observations of clustering of quasars show, however, agreement with statistical isotropy at much smaller angular scales. Here, we propose to use off-diagonal components of the angular power spectrum of the 21-cm fluctuations during the dark ages to test this power asymmetry. We forecast results for the planned SKA radio array, a future radio array, and the cosmic-variance-limited case as a theoretical proof of principle. Our results show that the 21-cm line power spectrum will enable access to information at very small scales and at different redshift slices, thus improving upon the current CMB constraints by ˜2 orders of magnitude for a dipolar asymmetry and by ˜1 - 3 orders of magnitude for a quadrupolar asymmetry case.

  4. Statistical analysis of human maintenance failures of a nuclear power plant

    International Nuclear Information System (INIS)

    In this paper, a statistical study of faults caused by maintenance activities is presented. The objective of the study was to draw conclusions on the unplanned effects of maintenance on nuclear power plant safety and system availability. More than 4400 maintenance history reports from the years 1992-1994 of the Olkiluoto BWR nuclear power plant (NPP) were analysed together with the maintenance personnel. The faults induced by human action were classified, e.g., according to their multiplicity and effects. This paper presents and discusses the results of a statistical analysis of the data. Instrumentation and electrical components are especially prone to human failures. Many human failures were found in safety related systems. Similarly, several failures remained latent from outages to power operation. The safety significance was generally small. Modifications are an important source of multiple human failures. Plant maintenance data are a good source of human reliability data and should be used more in the future. (orig.)

  5. Statistical aspects of bioequivalence assessment in the pharmaceutical industry.

    OpenAIRE

    Patterson, S. D.

    2003-01-01

    Since the early 1990's, average bioequivalence studies have served as the international standard for demonstrating that two formulations of drug product will provide the same therapeutic benefit and safety profile when used in the marketplace. Population (PBE) and Individual (IBE) bioequivalence have been the subject of intense international debate since methods for their assessment were proposed in the late 1980's. Guidance has been proposed by the Food and Drug Administration...

  6. Testing University Rankings Statistically: Why this Perhaps is not such a Good Idea after All. Some Reflections on Statistical Power, Effect Size, Random Sampling and Imaginary Populations

    DEFF Research Database (Denmark)

    Schneider, Jesper Wiborg

    By use of statistical power analyses and demonstration of effect sizes, we emphasize that the importance of empirical findings lies in “differences that make a difference” and not statistical significance tests per se. Finally we discuss the crucial assumption of randomness and question the presumption...

  7. Racialized customer service in restaurants: a quantitative assessment of the statistical discrimination explanatory framework.

    Science.gov (United States)

    Brewster, Zachary W

    2012-01-01

    Despite popular claims that racism and discrimination are no longer salient issues in contemporary society, racial minorities continue to experience disparate treatment in everyday public interactions. The context of full-service restaurants is one such public setting wherein racial minority patrons, African Americans in particular, encounter racial prejudices and discriminatory treatment. To further understand the causes of such discriminatory treatment within the restaurant context, this article analyzes primary survey data derived from a community sample of servers (N = 200) to assess the explanatory power of one posited explanation: statistical discrimination. Taken as a whole, findings suggest that while a statistical discrimination framework toward understanding variability in servers' discriminatory behaviors should not be disregarded, the framework's explanatory utility is limited. Servers' inferences about the potential profitability of waiting on customers across racial groups explain little of the overall variation in subjects' self-reported discriminatory behaviors, thus suggesting that other factors not explored in this research are clearly operating and should be the focus of future inquiries. PMID:22379609

  8. Climate change assessment for Mediterranean agricultural areas by statistical downscaling

    Science.gov (United States)

    Palatella, L.; Miglietta, M. M.; Paradisi, P.; Lionello, P.

    2010-07-01

    In this paper we produce projections of seasonal precipitation for four Mediterranean areas: Apulia region (Italy), Ebro river basin (Spain), Po valley (Italy) and Antalya province (Turkey). We performed the statistical downscaling using Canonical Correlation Analysis (CCA) in two versions: in one case the Principal Component Analysis (PCA) filter is applied only to the predictor and in the other to both predictor and predictand. After performing a validation test, CCA after PCA filtering of both predictor and predictand has been chosen. Sea level pressure (SLP) is used as predictor. Downscaling has been carried out for the scenarios A2 and B2 on the basis of three GCMs: the CCCma-GCM2, the Csiro-MK2 and HadCM3. Three consecutive 30-year periods have been considered. For Summer precipitation in the Apulia region we also use the 500 hPa temperature (T500) as predictor, obtaining comparable results. Results show different climate change signals in the four areas and confirm the need for an analysis that is capable of resolving internal differences within the Mediterranean region. The most robust signal is the reduction of Summer precipitation in the Ebro river basin. Other significant results are the increase of precipitation over Apulia in Summer, the reduction over the Po valley in Spring and Autumn and the increase over the Antalya province in Summer and Autumn.

  9. Climate change assessment for Mediterranean agricultural areas by statistical downscaling

    Directory of Open Access Journals (Sweden)

    L. Palatella

    2010-07-01

    Full Text Available In this paper we produce projections of seasonal precipitation for four Mediterranean areas: Apulia region (Italy), Ebro river basin (Spain), Po valley (Italy) and Antalya province (Turkey). We performed the statistical downscaling using Canonical Correlation Analysis (CCA) in two versions: in one case the Principal Component Analysis (PCA) filter is applied only to the predictor and in the other to both predictor and predictand. After performing a validation test, CCA after PCA filtering of both predictor and predictand has been chosen. Sea level pressure (SLP) is used as predictor. Downscaling has been carried out for the scenarios A2 and B2 on the basis of three GCMs: the CCCma-GCM2, the Csiro-MK2 and HadCM3. Three consecutive 30-year periods have been considered. For Summer precipitation in the Apulia region we also use the 500 hPa temperature (T500) as predictor, obtaining comparable results. Results show different climate change signals in the four areas and confirm the need for an analysis that is capable of resolving internal differences within the Mediterranean region. The most robust signal is the reduction of Summer precipitation in the Ebro river basin. Other significant results are the increase of precipitation over Apulia in Summer, the reduction over the Po valley in Spring and Autumn and the increase over the Antalya province in Summer and Autumn.
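    The downscaling chain described in the two records above (a PCA filter on predictor and predictand followed by Canonical Correlation Analysis) can be prototyped with standard tools. The sketch below is a minimal illustration using scikit-learn on synthetic data; the field sizes, numbers of components and the final regression step are assumptions made for the example, not the configuration used by the authors.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cross_decomposition import CCA
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(0)
        # Synthetic stand-ins: 50 seasons of a gridded SLP predictor (200 grid points)
        # and seasonal precipitation at 12 stations in one target area.
        slp = rng.normal(size=(50, 200))
        precip = slp[:, :12] * 0.5 + rng.normal(scale=0.8, size=(50, 12))

        # PCA filter on both predictor and predictand (the variant retained after validation).
        pca_x = PCA(n_components=5).fit(slp)
        pca_y = PCA(n_components=3).fit(precip)
        X = pca_x.transform(slp)
        Y = pca_y.transform(precip)

        # Canonical Correlation Analysis between the two sets of principal components.
        cca = CCA(n_components=2).fit(X, Y)
        U, V = cca.transform(X, Y)

        # Simple downscaling step: regress the predictand scores on the canonical
        # variates, then map a new large-scale SLP field to station precipitation.
        reg = LinearRegression().fit(U, Y)
        slp_new = rng.normal(size=(1, 200))               # e.g. a GCM scenario field
        y_scores = reg.predict(cca.transform(pca_x.transform(slp_new)))
        precip_downscaled = pca_y.inverse_transform(y_scores)
        print(precip_downscaled.shape)                    # (1, 12) station values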

  10. A COMPREHENSIVE STATISTICAL ASSESSMENT OF STAR-PLANET INTERACTION

    International Nuclear Information System (INIS)

    We investigate whether magnetic interaction between close-in giant planets and their host stars produce observable statistical enhancements in stellar coronal or chromospheric activity. New Chandra observations of 12 nearby (d < 60 pc) planet-hosting solar analogs are combined with archival Chandra, XMM-Newton, and ROSAT coverage of 11 similar stars to construct a sample inoculated against inherent stellar class and planet-detection biases. Survival analysis and Bayesian regression methods (incorporating both measurement errors and X-ray upper limits; 13/23 stars have secure detections) are used to test whether ''hot Jupiter'' hosts are systematically more X-ray luminous than comparable stars with more distant or smaller planets. No significant correlations are present between common proxies for interaction strength (M_P/a^2 or 1/a) versus coronal activity (L_X or L_X/L_bol). In contrast, a sample of 198 FGK main-sequence stars does show a significant (∼99% confidence) increase in X-ray luminosity with M_P/a^2. While selection biases are incontrovertibly present within the main-sequence sample, we demonstrate that the effect is primarily driven by a handful of extreme hot-Jupiter systems with M_P/a^2 > 450 M_Jup AU^-2, which here are all X-ray luminous but to a degree commensurate with their Ca II H and K activity, in contrast to presented magnetic star-planet interaction scenarios that predict enhancements relatively larger in L_X. We discuss these results in the context of cumulative tidal spin-up of stars hosting close-in gas giants (potentially followed by planetary infall and destruction). We also test our main-sequence sample for correlations between planetary properties and UV luminosity or Ca II H and K emission, and find no significant dependence.

  11. A COMPREHENSIVE STATISTICAL ASSESSMENT OF STAR-PLANET INTERACTION

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Brendan P.; Gallo, Elena; Pearson, Elliott G. [Department of Astronomy, University of Michigan, Ann Arbor, MI 48109 (United States); Wright, Jason T. [Department of Astronomy and Astrophysics, The Pennsylvania State University, University Park, PA 16802 (United States)

    2015-02-01

    We investigate whether magnetic interaction between close-in giant planets and their host stars produce observable statistical enhancements in stellar coronal or chromospheric activity. New Chandra observations of 12 nearby (d < 60 pc) planet-hosting solar analogs are combined with archival Chandra, XMM-Newton, and ROSAT coverage of 11 similar stars to construct a sample inoculated against inherent stellar class and planet-detection biases. Survival analysis and Bayesian regression methods (incorporating both measurement errors and X-ray upper limits; 13/23 stars have secure detections) are used to test whether ''hot Jupiter'' hosts are systematically more X-ray luminous than comparable stars with more distant or smaller planets. No significant correlations are present between common proxies for interaction strength (M_P/a^2 or 1/a) versus coronal activity (L_X or L_X/L_bol). In contrast, a sample of 198 FGK main-sequence stars does show a significant (∼99% confidence) increase in X-ray luminosity with M_P/a^2. While selection biases are incontrovertibly present within the main-sequence sample, we demonstrate that the effect is primarily driven by a handful of extreme hot-Jupiter systems with M_P/a^2 > 450 M_Jup AU^-2, which here are all X-ray luminous but to a degree commensurate with their Ca II H and K activity, in contrast to presented magnetic star-planet interaction scenarios that predict enhancements relatively larger in L_X. We discuss these results in the context of cumulative tidal spin-up of stars hosting close-in gas giants (potentially followed by planetary infall and destruction). We also test our main-sequence sample for correlations between planetary properties and UV luminosity or Ca II H and K emission, and find no significant dependence.

  12. Waste Heat to Power Market Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Elson, Amelia [ICF International, Fairfax, VA (United States); Tidball, Rick [ICF International, Fairfax, VA (United States); Hampson, Anne [ICF International, Fairfax, VA (United States)

    2015-03-01

    Waste heat to power (WHP) is the process of capturing heat discarded by an existing process and using that heat to generate electricity. In the industrial sector, waste heat streams are generated by kilns, furnaces, ovens, turbines, engines, and other equipment. In addition to processes at industrial plants, waste heat streams suitable for WHP are generated at field locations, including landfills, compressor stations, and mining sites. Waste heat streams are also produced in the residential and commercial sectors, but compared to industrial sites these waste heat streams typically have lower temperatures and much lower volumetric flow rates. The economic feasibility for WHP declines as the temperature and flow rate decline, and most WHP technologies are therefore applied in industrial markets where waste heat stream characteristics are more favorable. This report provides an assessment of the potential market for WHP in the industrial sector in the United States.

  13. How Many Words Do You Know? An Integrated Assessment Task for Introductory Statistics Students

    Science.gov (United States)

    Warton, David I.

    2007-01-01

    A novel assignment exercise is described, in which students use a dictionary to estimate the size of their vocabulary. This task was developed for an introductory statistics service course, although it can be modified for use in survey sampling courses. The exercise can be used to simultaneously assess a range of core statistics skills: sample…
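    The assignment described above reduces to a simple survey-sampling estimate: the proportion of randomly sampled dictionary headwords a student knows, scaled by the size of the dictionary, with a binomial confidence interval for the proportion. A minimal sketch follows; the dictionary size and sample counts are made-up numbers for illustration only.

        import math

        def vocab_estimate(n_sampled, n_known, dictionary_words, conf_z=1.96):
            """Estimate vocabulary size from a simple random sample of dictionary
            headwords: proportion known times total headwords, with a normal-
            approximation confidence interval for the proportion."""
            p_hat = n_known / n_sampled
            se = math.sqrt(p_hat * (1 - p_hat) / n_sampled)
            estimate = p_hat * dictionary_words
            lo = (p_hat - conf_z * se) * dictionary_words
            hi = (p_hat + conf_z * se) * dictionary_words
            return estimate, (lo, hi)

        # Hypothetical numbers: a student checks 120 randomly chosen headwords from a
        # 60,000-entry dictionary and recognises 43 of them.
        est, ci = vocab_estimate(n_sampled=120, n_known=43, dictionary_words=60_000)
        print(f"estimated vocabulary: {est:.0f} words, 95% CI {ci[0]:.0f}-{ci[1]:.0f}")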

  14. Assessing the performance of statistical validation tools for megavariate metabolomics data

    NARCIS (Netherlands)

    Rubingh, C.M.; Bijlsma, S.; Derks, E.P.P.A.; Bobeldijk, I.; Verheij, E.R.; Kochhar, S.; Smilde, A.K.

    2006-01-01

    Statistical model validation tools such as cross-validation, jack-knifing model parameters and permutation tests are meant to obtain an objective assessment of the performance and stability of a statistical model. However, little is known about the performance of these tools for megavariate data set

  15. Use of Statistical Information for Damage Assessment of Civil Engineering Structures

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Andersen, P.

    This paper considers the problem of damage assessment of civil engineering structures using statistical information. The aim of the paper is to review how researchers recently have tried to solve the problem. It is pointed out that the problem consists of not only how to use the statistical...

  16. Robust Statistical Tests of Dragon-Kings beyond Power Law Distributions

    OpenAIRE

    Pisarenko, V. F.; Sornette, D.

    2011-01-01

    We ask the question whether it is possible to diagnose the existence of "Dragon-Kings" (DK), namely anomalous observations compared to a power law background distribution of event sizes. We present two new statistical tests, the U-test and the DK-test, aimed at identifying the existence of even a single anomalous event in the tail of the distribution of just a few tens of observations. The DK-test in particular is derived such that the p-value of its statistic is independent of the exponent c...

  17. Statistical power of model selection strategies for genome-wide association studies.

    Directory of Open Access Journals (Sweden)

    Zheyang Wu

    2009-07-01

    Full Text Available Genome-wide association studies (GWAS aim to identify genetic variants related to diseases by examining the associations between phenotypes and hundreds of thousands of genotyped markers. Because many genes are potentially involved in common diseases and a large number of markers are analyzed, it is crucial to devise an effective strategy to identify truly associated variants that have individual and/or interactive effects, while controlling false positives at the desired level. Although a number of model selection methods have been proposed in the literature, including marginal search, exhaustive search, and forward search, their relative performance has only been evaluated through limited simulations due to the lack of an analytical approach to calculating the power of these methods. This article develops a novel statistical approach for power calculation, derives accurate formulas for the power of different model selection strategies, and then uses the formulas to evaluate and compare these strategies in genetic model spaces. In contrast to previous studies, our theoretical framework allows for random genotypes, correlations among test statistics, and a false-positive control based on GWAS practice. After the accuracy of our analytical results is validated through simulations, they are utilized to systematically evaluate and compare the performance of these strategies in a wide class of genetic models. For a specific genetic model, our results clearly reveal how different factors, such as effect size, allele frequency, and interaction, jointly affect the statistical power of each strategy. An example is provided for the application of our approach to empirical research. The statistical approach used in our derivations is general and can be employed to address the model selection problems in other random predictor settings. We have developed an R package markerSearchPower to implement our formulas, which can be downloaded from the
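    For the simplest of the strategies discussed above, a marginal (single-marker) search with a Bonferroni-style genome-wide threshold, power can be approximated directly from a normal test statistic. The sketch below is such an approximation, not the article's derivation; the effect sizes, allele frequency, sample size and marker count are hypothetical.

        from scipy.stats import norm

        def marginal_power(effect, maf, n, n_markers, alpha=0.05):
            """Approximate power of a single-marker association test under an
            additive model, with Bonferroni control over `n_markers` tests.
            `effect` is the per-allele effect in phenotypic SD units."""
            alpha_marker = alpha / n_markers            # genome-wide threshold
            z_crit = norm.ppf(1 - alpha_marker / 2)
            var_geno = 2 * maf * (1 - maf)              # variance of the allele count
            ncp = effect * (n * var_geno) ** 0.5        # mean of the test z-score
            return norm.sf(z_crit - ncp) + norm.cdf(-z_crit - ncp)

        # Hypothetical scenario: 500k markers, allele frequency 0.3, 5,000 subjects.
        for beta in (0.05, 0.08, 0.10):
            print(f"effect {beta:4.2f} SD -> power {marginal_power(beta, 0.3, 5000, 5e5):.2f}")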

  18. Automated Data Collection for Determining Statistical Distributions of Module Power Undergoing Potential-Induced Degradation

    DEFF Research Database (Denmark)

    Hacke, Peter; Spataru, Sergiu

    We propose a method for increasing the frequency of data collection and reducing the time and cost of accelerated lifetime testing of photovoltaic modules undergoing potential-induced degradation (PID). This consists of in-situ measurements of dark current-voltage curves of the modules at elevated...... stress temperature, their use to determine the maximum power at 25°C standard test conditions (STC), and distribution statistics for determining degradation rates as a function of stress level. The semi-continuous data obtained by this method clearly show degradation curves of the maximum power...

  19. Development of nuclear power plant online monitoring system using statistical quality control

    International Nuclear Information System (INIS)

    Statistical Quality Control techniques have been applied to many aspects of industrial engineering. An application to nuclear power plant maintenance and control is also presented that can greatly improve plant safety. As a demonstration of such an approach, a specific system is analyzed: the reactor coolant pumps (RCP) and the fouling resistance of a heat exchanger. This research uses Shewhart X-bar and R charts, Cumulative Sum (CUSUM) charts, and the Sequential Probability Ratio Test (SPRT) to analyze the process for the state of statistical control, and a Control Chart Analyzer (CCA) was developed to support these analyses and to decide when the process is out of control. The analysis shows that statistical process control methods can be applied as an early warning system capable of identifying significant equipment problems well in advance of traditional control room alarm indicators. Such a system would provide operators with enough time to respond to possible emergency situations and thus improve plant safety and reliability.
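    The chart types named in the record above are standard and easy to prototype. The following sketch computes Shewhart X-bar limits and a tabular one-sided CUSUM on a synthetic drifting signal; the fouling-resistance values, subgroup size and drift are invented for illustration and are not plant data or the authors' CCA tool.

        import numpy as np

        rng = np.random.default_rng(1)
        # Hypothetical monitoring signal: heat-exchanger fouling resistance sampled in
        # subgroups of 5; a small upward drift starts half-way through the record.
        subgroups = rng.normal(1.0, 0.05, size=(40, 5))
        subgroups[20:] += np.linspace(0, 0.08, 20)[:, None]

        xbar = subgroups.mean(axis=1)
        rbar = np.ptp(subgroups, axis=1).mean()
        d2, n = 2.326, 5                      # d2 constant for subgroup size 5
        sigma_hat = rbar / d2                 # within-subgroup sigma estimate
        center = xbar[:20].mean()             # baseline from the in-control period

        # Shewhart X-bar chart: flag subgroups outside the 3-sigma limits.
        ucl = center + 3 * sigma_hat / np.sqrt(n)
        lcl = center - 3 * sigma_hat / np.sqrt(n)
        shewhart_alarms = np.where((xbar > ucl) | (xbar < lcl))[0]

        # Tabular one-sided CUSUM: accumulates small shifts that Shewhart misses.
        k = 0.5 * sigma_hat / np.sqrt(n)      # allowance (half the shift of interest)
        h = 5 * sigma_hat / np.sqrt(n)        # decision interval
        cusum, s = [], 0.0
        for x in xbar:
            s = max(0.0, s + (x - center) - k)
            cusum.append(s)
        cusum_alarm = next((i for i, v in enumerate(cusum) if v > h), None)

        print("Shewhart alarms at subgroups:", shewhart_alarms)
        print("first CUSUM alarm at subgroup:", cusum_alarm)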

  20. A Statistical Approach to Planning Reserved Electric Power for Railway Infrastructure Administration

    OpenAIRE

    M. Brabec; Pelikán, E. (Emil); Konár, O. (Ondřej); Kasanický, I.; Juruš, P. (Pavel); Sadil, J.; Blažek, P.

    2013-01-01

    One of the requirements on railway infrastructure administration is to provide electricity for day-to-day operation of railways. We propose a statistically based approach for the estimation of maximum 15-minute power within a calendar month for a given region. This quantity serves as a basis of contracts between railway infrastructure administration and electricity distribution system operator. We show that optimization of the prediction is possible, based on underlying loss function deriv...

  1. Empirical Statistical Power for Testing Multilocus Genotypic Effects under Unbalanced Designs Using a Gibbs Sampler

    OpenAIRE

    Lee, Chaeyoung

    2012-01-01

    Epistasis that may explain a large portion of the phenotypic variation for complex economic traits of animals has been ignored in many genetic association studies. A Bayesian method was introduced to draw inferences about multilocus genotypic effects based on their marginal posterior distributions by a Gibbs sampler. A simulation study was conducted to provide statistical powers under various unbalanced designs by using this method. Data were simulated by combined designs of number of loci, wi...

  2. Assessment of ceramic composites for MMW space nuclear power systems

    International Nuclear Information System (INIS)

    Proposed multimegawatt nuclear power systems which operate at high temperatures, high levels of stress, and in hostile environments, including corrosive working fluids, have created interest in the use of ceramic composites as structural materials. This report assesses the applicability of several ceramic composites in both Brayton and Rankine cycle power systems. This assessment considers an equilibrium thermodynamic analysis and also a nonequilibrium assessment. (FI)

  3. Statistical analysis of high power microwave surface flashover delay times in nitrogen with metallic field enhancements

    International Nuclear Information System (INIS)

    The physical mechanisms that contribute to atmospheric breakdown induced by high power microwaves (HPMs) are of particular interest for the further development of high power microwave systems and related technologies. For a system in which HPM is produced in a vacuum environment for the purpose of radiating into atmosphere, it is necessary to separate the atmospheric environment from the vacuum environment with a dielectric interface. Breakdown across this interface on the atmospheric side and plasma development to densities prohibiting further microwave propagation are of special interest. In this paper, the delay time between microwave application and plasma emergence is investigated. Various external parameters, such as UV illumination or the presence of small metallic points on the surface, provide sources for electron field emission and influence the delay time which yields crucial information on the breakdown mechanisms involved. Due to the inherent statistical appearance of initial electrons and the statistics of the charge carrier amplification mechanisms, the flashover delay times deviate by as much as ±50% from the average, for the investigated case of discharges in N2 at pressures of 60-140 Torr and a microwave frequency of 2.85 GHz with 3 μs pulse duration, 50 ns pulse risetime, and MW/cm2 power densities. The statistical model described in this paper demonstrates how delay times for HPM surface flashover events can be effectively predicted for various conditions given sufficient knowledge about ionization rate coefficients as well as the production rate for breakdown initiating electrons.

  4. Statistical Design Model (SDM) of power supply and communication subsystem's Satellite

    Science.gov (United States)

    Mirshams, Mehran; Zabihian, Ehsan; Zabihian, Ahmadreza

    In this paper, we consider the statistical design of the power supply and communication subsystems of satellites, based on the fact that most approaches and relations used in designing these subsystems are empirical and statistical, and that aerospace science and its links to other engineering fields, such as electrical engineering, are still young, so that analytic or fully proven empirical relations are lacking in many areas. The approach presented in this paper is entirely innovative, and all parts of the power supply and communication subsystems of the satellite are specified. In codifying this approach, data from 602 satellites and software such as SPSS have been used. In this approach, after the design procedure is proposed, the total power needed by the satellite, the mass of the power supply and communication subsystems, the power needed by the communication subsystem, the working band, the type of antenna, the number of transponders, the material of the solar array and finally the placement of these arrays on the satellite are designed. All these parts are designed based on the mission of the satellite and its weight class. This procedure increases the performance rate, avoids wasting energy, and reduces costs. Keywords: database, statistical model, design procedure, power supply subsystem, communication subsystem

  5. Wind Power Assessment Based on a WRF Wind Simulation with Developed Power Curve Modeling Methods

    OpenAIRE

    Zhenhai Guo; Xia Xiao

    2014-01-01

    The accurate assessment of wind power potential requires not only the detailed knowledge of the local wind resource but also an equivalent power curve with good effect for a local wind farm. Although the probability distribution functions (pdfs) of the wind speed are commonly used, their seemingly good performance for distribution may not always translate into an accurate assessment of power generation. This paper contributes to the development of wind power assessment based on the wind speed...

  6. Calibrating the Difficulty of an Assessment Tool: The Blooming of a Statistics Examination

    Science.gov (United States)

    Dunham, Bruce; Yapa, Gaitri; Yu, Eugenia

    2015-01-01

    Bloom's taxonomy is proposed as a tool by which to assess the level of complexity of assessment tasks in statistics. Guidelines are provided for how to locate tasks at each level of the taxonomy, along with descriptions and examples of suggested test questions. Through the "Blooming" of an examination--that is, locating its constituent…

  7. A spatial accuracy assessment of an alternative circular scan method for Kulldorff's spatial scan statistic

    OpenAIRE

    Read, S.; Bath, P.A.; Willett, P.; Maheswaran, R.

    2009-01-01

    This paper concerns the Bernoulli version of Kulldorff’s spatial scan statistic, and how accurately it identifies the exact centre of approximately circular regions of increased spatial density in point data. We present an alternative method of selecting circular regions that appears to give greater accuracy. Performance is tested in an epidemiological context using manifold synthetic case-control datasets. A small, but statistically significant, improvement is reported. The power of the alte...
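    For reference, the Bernoulli scan statistic underlying the comparison above maximizes a likelihood ratio over circular windows centred on the observed points. The sketch below implements that baseline (not the alternative circular scan method proposed in the paper) on synthetic case-control data with a planted cluster; all locations and rates are made up.

        import numpy as np

        def bernoulli_llr(c_in, n_in, c_tot, n_tot):
            """Log likelihood ratio of the Bernoulli scan statistic for a window
            containing c_in cases among n_in points (c_tot, n_tot overall)."""
            def xlogy(x, y):
                return 0.0 if x == 0 else x * np.log(y)
            p, q = c_in / n_in, (c_tot - c_in) / (n_tot - n_in)
            if p <= q:                         # only scan for elevated risk inside
                return 0.0
            return (xlogy(c_in, p) + xlogy(n_in - c_in, 1 - p)
                    + xlogy(c_tot - c_in, q) + xlogy(n_tot - n_in - c_tot + c_in, 1 - q)
                    - xlogy(c_tot, c_tot / n_tot) - xlogy(n_tot - c_tot, 1 - c_tot / n_tot))

        rng = np.random.default_rng(2)
        xy = rng.uniform(0, 10, size=(400, 2))                  # case/control locations
        is_case = rng.random(400) < 0.1
        is_case |= np.linalg.norm(xy - [3, 3], axis=1) < 1.0    # planted circular cluster

        best = (0.0, None, None)
        c_tot, n_tot = is_case.sum(), len(xy)
        for centre in xy:                                       # circles centred on points
            d = np.linalg.norm(xy - centre, axis=1)
            for radius in np.unique(d)[1:200]:                  # grow the window point by point
                inside = d <= radius
                llr = bernoulli_llr(is_case[inside].sum(), inside.sum(), c_tot, n_tot)
                if llr > best[0]:
                    best = (llr, centre, radius)

        print(f"most likely cluster: LLR={best[0]:.1f}, centre={best[1]}, radius={best[2]:.2f}")
        # Significance would normally be judged by Monte Carlo replication of the case labels.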

  8. Efficient Statistical Leakage Power Analysis Method for Function Blocks Considering All Process Variations

    Institute of Scientific and Technical Information of China (English)

    LUO Zuying

    2007-01-01

    With technology scaling into the nanometer regime, rampant process variations have a visible influence on leakage power estimation of very large scale integrated circuits (VLSIs). In order to deal with large inter- and intra-die variations, we introduce a novel theoretical prototype of statistical leakage power analysis (SLPA) for function blocks. Because inter-die variations can be pinned down to a small range while the number of gates in a function block is large (>1000), we further simplify the prototype and derive an efficient SLPA methodology. The method saves considerable running time in low-power design since it has the advantage of local updating. A large number of experimental data show that the method takes feasible running time (0.32 s) to obtain accurate results (3σ-error <0.5% at maximum) when function block circuits simultaneously suffer from 7.5% (3σ/mean) inter-die and 7.5% intra-die length variations, which demonstrates that our method is suitable for statistical leakage power analysis of VLSIs under rampant process variations.
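    The paper's analytical method is not reproduced here, but the problem setting is easy to illustrate by Monte Carlo: per-gate leakage is roughly exponential in channel-length deviation, and the inter-die component is shared by all gates on a die while the intra-die component is not. All constants in the sketch below (nominal leakage, sensitivity, gate and die counts) are hypothetical.

        import numpy as np

        rng = np.random.default_rng(3)
        n_gates, n_dies = 2000, 2000
        i_nom = 10e-9                    # nominal per-gate leakage, amperes (hypothetical)
        k_l = 25.0                       # sensitivity of log-leakage to length deviation (hypothetical)
        sigma_inter, sigma_intra = 0.025, 0.025   # 7.5% (3-sigma) normalized length variation

        # Each die gets one shared inter-die offset; each gate adds its own intra-die term.
        dl_inter = rng.normal(0, sigma_inter, size=(n_dies, 1))
        dl_intra = rng.normal(0, sigma_intra, size=(n_dies, n_gates))
        # Leakage is roughly exponential in channel-length deviation, so per-gate leakage
        # is lognormal and the block total is a sum of correlated lognormals.
        leak = i_nom * np.exp(-k_l * (dl_inter + dl_intra))
        block_leak = leak.sum(axis=1)

        mean, sd = block_leak.mean(), block_leak.std()
        print(f"block leakage: mean {mean*1e6:.2f} uA, 3-sigma spread {3*sd/mean:5.1%}")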

  9. Dose assessments in nuclear power plant siting

    International Nuclear Information System (INIS)

    This document is mainly intended to provide information on dose estimations and assessments for the purpose of nuclear power plant (NPP) siting. It is not aimed at giving radiation protection guidance, criteria or procedures to be applied during the process of NPP siting, nor even at providing recommendations on this subject matter. The document may however be of help for implementing some of the Nuclear Safety Standards (NUSS) documents on siting. The document was prepared before April 26, 1986, when a severe accident occurred at Unit 4 of the Chernobyl NPP in the USSR. It should be emphasized that this document does not bridge the gap which exists in the NUSS programme as far as radiation protection guidance for the specific case of NPP siting is concerned. The Agency will continue to work on this subject with the aim of preparing a safety series document on radiation protection requirements for NPP siting. This document could serve as a working document for this purpose. Refs, figs and tabs

  10. An investigation of the statistical power of neutrality tests based on comparative and population genetic data

    DEFF Research Database (Denmark)

    Zhai, Weiwei; Nielsen, Rasmus; Slatkin, Montgomery

    2009-01-01

    In this report, we investigate the statistical power of several tests of selective neutrality based on patterns of genetic diversity within and between species. The goal is to compare tests based solely on population genetic data with tests using comparative data or a combination of comparative...... and population genetic data. We show that in the presence of repeated selective sweeps on relatively neutral background, tests based on the d(N)/d(S) ratios in comparative data almost always have more power to detect selection than tests based on population genetic data, even if the overall level of divergence...... is low. Tests based solely on the distribution of allele frequencies or the site frequency spectrum, such as the Ewens-Watterson test or Tajima's D, have less power in detecting both positive and negative selection because of the transient nature of positive selection and the weak signal left by negative...

  11. Efficient statistical analysis method of power/ground (P/G) network

    Institute of Scientific and Technical Information of China (English)

    Zuying Luo; Sheldon X.D. Tan

    2008-01-01

    In this paper, we propose an incremental statistical analysis method with complexity reduction as a pre-process for on-chip power/ground (P/G) networks. The new method exploits the locality of P/G network analyses and targets P/G networks with a large number of strongly connected subcircuits (called strong connects) such as trees and chains. The method consists of three steps. First it compresses P/G circuits by removing strong connects; as a result, the current variations (CVs) of nodes in strong connects are transferred to some remaining nodes. Then, based on the locality of power grid voltage responses to its current inputs, it efficiently calculates the correlative resistor (CR) matrix in a local way to directly compute the voltage variations by using small parts of the remaining circuit. Last, it statistically recovers the voltage variations of the suppressed nodes inside strong connects. This new method for statistically compressing and expanding strong connects in terms of current or voltage variations in a closed form is very efficient owing to its incremental nature. Experimental results demonstrate that the method can efficiently compute lower bounds of voltage variations for P/G networks and that it achieves two to three orders of magnitude speedup over the traditional Monte-Carlo-based simulation method, with only 2.0% accuracy loss.

  12. Assessment - A Powerful Lever for Learning

    Directory of Open Access Journals (Sweden)

    Lorna Earl

    2010-05-01

    Full Text Available Classroom assessment practices have been part of schooling for hundreds of years. There are, however, new findings about the nature of learning and about the roles that assessment can play in enhancing learning for all students. This essay provides a brief history of the changing role of assessment in schooling, describes three different purposes for assessment and foreshadows some implications that shifting to a more differentiated view of assessment can have for policy, practice and research.

  13. The probability of identification: applying ideas from forensic statistics to disclosure risk assessment

    OpenAIRE

    Chris J. Skinner

    2007-01-01

    The paper establishes a correspondence between statistical disclosure control and forensic statistics regarding their common use of the concept of ‘probability of identification’. The paper then seeks to investigate what lessons for disclosure control can be learnt from the forensic identification literature. The main lesson that is considered is that disclosure risk assessment cannot, in general, ignore the search method that is employed by an intruder seeking to achieve disclosure. The effe...

  14. New statistical potential for quality assessment of protein models and a survey of energy functions

    OpenAIRE

    Rykunov Dmitry; Fiser Andras

    2010-01-01

    Abstract Background Scoring functions, such as molecular mechanics force fields and statistical potentials, are fundamentally important tools in protein structure modeling and quality assessment. Results The performances of a number of publicly available scoring functions are compared with statistical rigor, with an emphasis on knowledge-based potentials. We explored the effect on accuracy of alternative choices for representing interaction center types and other features of scoring functions,...

  15. Theoretical Foundations and Mathematical Formalism of the Power-Law Tailed Statistical Distributions

    Directory of Open Access Journals (Sweden)

    Giorgio Kaniadakis

    2013-09-01

    Full Text Available We present the main features of the mathematical theory generated by the κ-deformed exponential function exp_κ(x) = (√(1 + κ²x²) + κx)^(1/κ), with 0 ≤ κ < 1, developed in the last twelve years, which turns out to be a continuous one-parameter deformation of the ordinary mathematics generated by the Euler exponential function. The κ-mathematics has its roots in special relativity and furnishes the theoretical foundations of the κ-statistical mechanics predicting power-law tailed statistical distributions, which have been observed experimentally in many physical, natural and artificial systems. After introducing the κ-algebra, we present the associated κ-differential and κ-integral calculus. Then, we obtain the corresponding κ-exponential and κ-logarithm functions and give the κ-version of the main functions of ordinary mathematics.
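    The deformed exponential and its inverse are simple to evaluate numerically. The sketch below implements exp_κ and the κ-logarithm ln_κ(x) = (x^κ - x^(-κ))/(2κ), checks that they are inverses, and shows that exp_κ(-x) decays as a power law for large x, which is the origin of the power-law tails mentioned in the abstract; the numeric values of κ and x are arbitrary examples.

        import numpy as np

        def exp_kappa(x, kappa):
            """Kaniadakis kappa-exponential: (sqrt(1 + k^2 x^2) + k x)^(1/k);
            the ordinary exponential is recovered as kappa -> 0."""
            x = np.asarray(x, dtype=float)
            if kappa == 0:
                return np.exp(x)
            return (np.sqrt(1.0 + kappa**2 * x**2) + kappa * x) ** (1.0 / kappa)

        def log_kappa(x, kappa):
            """Kaniadakis kappa-logarithm, the inverse of exp_kappa:
            (x^k - x^(-k)) / (2 k)."""
            x = np.asarray(x, dtype=float)
            if kappa == 0:
                return np.log(x)
            return (x**kappa - x**(-kappa)) / (2.0 * kappa)

        x = np.linspace(-3, 3, 7)
        k = 0.4
        print(np.allclose(log_kappa(exp_kappa(x, k), k), x))   # True: they are inverses
        # Power-law tail: for large x, exp_kappa(-x) decays like (2 k x)^(-1/k)
        # rather than exponentially.
        big = 1e4
        print(exp_kappa(-big, k), (2 * k * big) ** (-1.0 / k))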

  16. Survey design, statistical analysis, and basis for statistical inferences in coastal habitat injury assessment: Exxon Valdez oil spill

    International Nuclear Information System (INIS)

    The objective of the Coastal Habitat Injury Assessment study was to document and quantify injury to biota of the shallow subtidal, intertidal, and supratidal zones throughout the shoreline affected by oil or cleanup activity associated with the Exxon Valdez oil spill. The results of these studies were to be used to support the Trustee's Type B Natural Resource Damage Assessment under the Comprehensive Environmental Response, Compensation, and Liability Act of 1980 (CERCLA). A probability based stratified random sample of shoreline segments was selected with probability proportional to size from each of 15 strata (5 habitat types crossed with 3 levels of potential oil impact) based on those data available in July, 1989. Three study regions were used: Prince William Sound, Cook Inlet/Kenai Peninsula, and Kodiak/Alaska Peninsula. A Geographic Information System was utilized to combine oiling and habitat data and to select the probability sample of study sites. Quasi-experiments were conducted where randomly selected oiled sites were compared to matched reference sites. Two levels of statistical inferences, philosophical bases, and limitations are discussed and illustrated with example data from the resulting studies. 25 refs., 4 figs., 1 tab

  17. Generation of statistical scenarios of short-term wind power production

    DEFF Research Database (Denmark)

    Pinson, Pierre; Papaefthymiou, George; Klockl, Bernd; Nielsen, Henrik Aalborg

    2007-01-01

    Short-term (up to 2-3 days ahead) probabilistic forecasts of wind power provide forecast users with paramount information on the uncertainty of expected wind generation. Whatever the type of these probabilistic forecasts, they are produced on a per-horizon basis, and hence do not inform on the...... development of the forecast uncertainty through forecast series. This issue is addressed here by describing a method that permits generating statistical scenarios of wind generation that account for the interdependence structure of prediction errors, in addition to respecting the predictive distributions of wind...
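    A common way to obtain such temporally coherent scenarios from per-horizon predictive distributions is a Gaussian copula: draw correlated normal variates across lead times, map them to uniforms, and pass them through each horizon's predictive quantile function. The sketch below illustrates this idea with made-up Beta predictive marginals and an assumed exponential correlation of forecast errors; it is not the authors' fitted model.

        import numpy as np
        from scipy.stats import norm, beta

        rng = np.random.default_rng(4)
        horizons = 24
        # Hypothetical per-horizon predictive marginals for normalized wind power,
        # here Beta distributions whose means follow an arbitrary forecast profile.
        mean_fcst = 0.4 + 0.3 * np.sin(np.linspace(0, np.pi, horizons))
        a = mean_fcst * 10
        b = (1 - mean_fcst) * 10

        # Interdependence of forecast errors across lead times: an exponential
        # correlation structure (errors at nearby horizons are strongly correlated).
        lags = np.abs(np.subtract.outer(np.arange(horizons), np.arange(horizons)))
        corr = np.exp(-lags / 6.0)

        # Gaussian copula: correlated normals -> uniforms -> per-horizon quantiles.
        L = np.linalg.cholesky(corr)
        n_scenarios = 5
        z = rng.normal(size=(n_scenarios, horizons)) @ L.T
        u = norm.cdf(z)
        scenarios = beta.ppf(u, a, b)     # each row is one temporally coherent scenario

        print(scenarios.round(2))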

  18. Application of probabilistic safety assessment for Macedonian electric power system

    International Nuclear Information System (INIS)

    Due to the complex and integrated nature of a power system, failures in any part of the system can cause interruptions which range from inconveniencing a small number of local residents to a major and widespread catastrophic disruption of supply known as a blackout. The objective of the paper is to show that the methods and tools of probabilistic safety assessment are applicable to the assessment and improvement of real power systems. The method used in this paper is developed based on fault tree analysis and is adapted for power system reliability analysis. A particular power system, namely the Macedonian power system, is the object of the analysis. The results show that the method is suitable for application to real systems. The reliability of the Macedonian power system, treated as a static system, is assessed. The components which can significantly impact the power system are identified and analysed in more detail. (author)

  19. Hybrid algorithm for rotor angle security assessment in power systems

    OpenAIRE

    D. Prasad Wadduwage; Udaya D. Annakkage; Christine Qiong Wu

    2015-01-01

    Transient rotor angle stability assessment and oscillatory rotor angle stability assessment subsequent to a contingency are integral components of dynamic security assessment (DSA) in power systems. This study proposes a hybrid algorithm to determine whether the post-fault power system is secure with respect to both transient rotor angle stability and oscillatory rotor angle stability subsequent to a set of known contingencies. The hybrid algorithm first uses a new security measure developed based on ...

  20. Assessing record linkage between health care and Vital Statistics databases using deterministic methods

    OpenAIRE

    Quan Hude; Li Bing; Fong Andrew; Lu Mingshan

    2006-01-01

    Abstract Background We assessed the linkage and correct linkage rate using deterministic record linkage among three commonly used Canadian databases, namely, the population registry, hospital discharge data and Vital Statistics registry. Methods Three combinations of four personal identifiers (surname, first name, sex and date of birth) were used to determine the optimal combination. The correct linkage rate was assessed using a unique personal health number available in all three databases. ...

  1. Application and interpretation of multiple statistical tests to evaluate validity of dietary intake assessment methods

    OpenAIRE

    Lombard, Martani J; Steyn, Nelia P; Charlton, Karen E; Senekal, Marjanne

    2015-01-01

    Background Several statistical tests are currently applied to evaluate the validity of dietary intake assessment methods. However, they provide information on different facets of validity. There is also no consensus on the types and combinations of tests that should be applied to reflect acceptable validity for intakes. We aimed to 1) conduct a review to identify the tests and interpretation criteria used where dietary assessment methods were validated against a reference method and 2) illustrate the ...

  2. Assessing power grid reliability using rare event simulation

    OpenAIRE

    Wadman, Wander

    2015-01-01

    Renewable energy generators such as wind turbines and solar panels supply more and more power in modern electrical grids. Although the transition to a sustainable power supply is desirable, considerable implementation of distributed and intermittent generators may strain the power grid. Since grid operators are responsible for a highly reliable power grid, they want to estimate to what extent violations of grid stability constraints occur. To assess grid reliability over a period of interest,...

  3. Grounding Locations Assessment of Practical Power System

    OpenAIRE

    Kousay Abdul Sattar; Ahmed M.A. Haidar; Nadheer A. Shalash

    2012-01-01

    Grounding Points (GPs) are installed in electrical power system to drive protective devices and accomplish the personnel safety. The general grounding problem is to find the optimal locations of these points so that the security and reliability of power system can be improved. This paper presents a practical approach to find the optimal location of GPs based on the ratios of zero sequence reactance with positive sequence reactance (X0/X1), zero sequence resistance with positive sequence rea...

  4. Assessing Statistical Change Indices in Selected Social Work Intervention Research Studies

    Science.gov (United States)

    Ham, Amanda D.; Huggins-Hoyt, Kimberly Y.; Pettus, Joelle

    2016-01-01

    Objectives: This study examined how evaluation and intervention research (IR) studies assessed statistical change to ascertain effectiveness. Methods: Studies from six core social work journals (2009-2013) were reviewed (N = 1,380). Fifty-two evaluation (n= 27) and intervention (n = 25) studies met the inclusion criteria. These studies were…

  5. Business Statistics and Management Science Online: Teaching Strategies and Assessment of Student Learning

    Science.gov (United States)

    Sebastianelli, Rose; Tamimi, Nabil

    2011-01-01

    Given the expected rise in the number of online business degrees, issues regarding quality and assessment in online courses will become increasingly important. The authors focus on the suitability of online delivery for quantitative business courses, specifically business statistics and management science. They use multiple approaches to assess…

  6. QQ-plots for assessing distributions of biomarker measurements and generating defensible summary statistics

    Science.gov (United States)

    One of the main uses of biomarker measurements is to compare different populations to each other and to assess risk in comparison to established parameters. This is most often done using summary statistics such as central tendency, variance components, confidence intervals, excee...

  7. Air-chemistry "turbulence": power-law scaling and statistical regularity

    Directory of Open Access Journals (Sweden)

    H.-m. Hsu

    2011-08-01

    Full Text Available With the intent to gain further knowledge on the spectral structures and statistical regularities of surface atmospheric chemistry, the chemical gases (NO, NO2, NOx, CO, SO2, and O3) and aerosol (PM10) measured at 74 air quality monitoring stations over the island of Taiwan are analyzed for the year of 2004 at hourly resolution. They represent a range of surface air quality with a mixed combination of geographic settings, and include urban/rural, coastal/inland, plain/hill, and industrial/agricultural locations. In addition to the well-known semi-diurnal and diurnal oscillations, weekly, and intermediate (20 ~ 30 days) peaks are also identified with the continuous wavelet transform (CWT). The spectra indicate power-law scaling regions for the frequencies higher than the diurnal and those lower than the diurnal with the average exponents of −5/3 and −1, respectively. These dual-exponents are corroborated with those with the detrended fluctuation analysis in the corresponding time-lag regions. These exponents are mostly independent of the averages and standard deviations of time series measured at various geographic settings, i.e., the spatial inhomogeneities. In other words, they possess dominant universal structures. After spectral coefficients from the CWT decomposition are grouped according to the spectral bands, and inverted separately, the PDFs of the reconstructed time series for the high-frequency band demonstrate the interesting statistical regularity, −3 power-law scaling for the heavy tails, consistently. Such spectral peaks, dual-exponent structures, and power-law scaling in heavy tails are important structural information, but their relations to turbulence and mesoscale variability require further investigations. This could lead to a better understanding of the processes controlling air quality.
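    A common way to obtain such spectral exponents is a log-log fit to the periodogram over a chosen frequency band. The sketch below shows that step on a synthetic hourly series; the series, band limits and expected slope are illustrative assumptions, not the Taiwan monitoring data or the paper's wavelet/DFA procedure.

```python
# Sketch: estimating a spectral power-law exponent from an hourly time series
# by a log-log fit to the periodogram over a frequency band (synthetic series).
import numpy as np
from scipy.signal import periodogram

rng = np.random.default_rng(11)
n = 24 * 365
x = np.cumsum(rng.standard_normal(n))          # toy red-noise series (expected slope near -2)

freqs, psd = periodogram(x, fs=1.0)            # fs = 1 sample per hour
band = (freqs > 1 / 24) & (freqs < 0.5)        # frequencies higher than the diurnal peak
slope, _ = np.polyfit(np.log(freqs[band]), np.log(psd[band]), 1)
print(f"Spectral exponent in the sub-diurnal band: {slope:.2f}")
```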

  8. Specification of life cycle assessment in nuclear power plants

    International Nuclear Information System (INIS)

    Life Cycle Assessment is an environmental management tool for assessing the environmental impacts of a product or a process. Life cycle assessment involves the evaluation of environmental impacts through all stages of the life cycle of a product or process; in other words, life cycle assessment has a cradle-to-grave approach. Some results of life cycle assessment consist of pollution prevention, energy efficient systems, material conservation, economic systems and sustainable development. All power generation technologies affect the environment in one way or another, and the main environmental impact does not always occur during operation of the power plant. The life cycle assessment of nuclear power has entailed studying the entire fuel cycle from mine to deep repository, as well as the construction, operation and demolition of the power station. Nuclear power plays an important role in electricity production for several countries, even though its use remains controversial, and due to the shortage of fossil fuel energy resources many countries have started to look for alternatives for their energy production. A life cycle assessment can detect all environmental impacts of nuclear power, from extracting resources, building facilities and transporting material through the final conversion to useful energy services.

  9. Knowledge based system for fouling assessment of power plant boiler

    International Nuclear Information System (INIS)

    The paper presents the design of an expert system for fouling assessment in power plant boilers. It is an on-line expert system based on selected criteria for fouling assessment. Using criteria for fouling assessment based on 'clean' and 'not-clean' radiation heat flux measurements, the diagnostic variables are defined for the boiler heat transfer surface. The development of the prototype knowledge-based system for fouling assessment in power plant boilers comprises the integration of elements including the knowledge base, the inference procedure and the prototype configuration. Demonstration of the prototype knowledge-based system for fouling assessment was performed on the Sines power plant, a 300 MW coal-fired power plant. Twelve fields are used, with 3 on each side of the boiler.

  10. Safety Assessment - Swedish Nuclear Power Plants

    International Nuclear Information System (INIS)

    After the reactor accident at Three Mile Island, the Swedish nuclear power plants were equipped with filtered venting of the containment. Several types of accidents can be identified where the filtered venting has no effect on the radioactive release. The probability for such accidents is hopefully very small; it is not possible, however, to estimate the probability accurately. Experiences gained in recent years, which have been documented in official reports from the Nuclear Power Inspectorate, indicate that the probability for core melt accidents in Swedish reactors can be significantly larger than estimated earlier. A probability up to one in a thousand operating years cannot be excluded. There are so far no indications that aging of the plants has contributed to an increased accident risk. Maintaining the safety level with aging nuclear power plants can however be expected to be increasingly difficult. It is concluded that the 12 Swedish plants remain a major threat of severe radioactive pollution of the Swedish environment despite measures taken since 1980 to improve their safety. Closing of the nuclear power plants is the only possibility to eliminate this threat. It is recommended that, until this is done, quantitative safety goals, the same for all Swedish plants, shall be defined and strictly enforced. It is also recommended that utilities distributing misleading information about nuclear power risks shall have their operating license withdrawn. 37 refs

  11. Safety Assessment - Swedish Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Kjellstroem, B. [Luleaa Univ. of Technology (Sweden)

    1996-12-31

    After the reactor accident at Three Mile Island, the Swedish nuclear power plants were equipped with filtered venting of the containment. Several types of accidents can be identified where the filtered venting has no effect on the radioactive release. The probability for such accidents is hopefully very small; it is not possible, however, to estimate the probability accurately. Experiences gained in recent years, which have been documented in official reports from the Nuclear Power Inspectorate, indicate that the probability for core melt accidents in Swedish reactors can be significantly larger than estimated earlier. A probability up to one in a thousand operating years cannot be excluded. There are so far no indications that aging of the plants has contributed to an increased accident risk. Maintaining the safety level with aging nuclear power plants can however be expected to be increasingly difficult. It is concluded that the 12 Swedish plants remain a major threat of severe radioactive pollution of the Swedish environment despite measures taken since 1980 to improve their safety. Closing of the nuclear power plants is the only possibility to eliminate this threat. It is recommended that, until this is done, quantitative safety goals, the same for all Swedish plants, shall be defined and strictly enforced. It is also recommended that utilities distributing misleading information about nuclear power risks shall have their operating license withdrawn. 37 refs.

  12. Methods of assessing nuclear power plant risks

    International Nuclear Information System (INIS)

    The concept of safety evaluation is based on safety criteria - standards or set qualitative values of parameters and indices used in designing nuclear power plants, incorporating demands on the quality of equipment and operation of the plant, its siting and technical means for achieving nuclear safety. The concepts of basic and optimal risk values are presented. Factors indispensable for the evaluation of nuclear power plant risk are summed up, and the present world trend towards probability-based evaluation is discussed. (J.C.)

  13. Observer variability in the assessment of type and dysplasia of colorectal adenomas, analyzed using kappa statistics

    DEFF Research Database (Denmark)

    Jensen, P; Krogsgaard, M R; Christiansen, J;

    1995-01-01

    PURPOSE: The aim of this study was to establish the intraobserver and interobserver variability in the assessment of histologic type (tubular, villous, and tubulovillous) and grade of cytologic dysplasia (mild, moderate, and severe) in colorectal adenomas. METHODS: One hundred eighty-seven slides...... of adenomas were assessed twice by three experienced pathologists, with an interval of two months. Results were analyzed using kappa statistics. RESULTS: For agreement between first and second assessment (both type and grade of dysplasia), kappa values for the three specialists were 0.5345, 0...

  14. Statistical analysis of regional capital and operating costs for electric power generation

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez, L.R.; Myers, M.G.; Herrman, J.A.; Provanizano, A.J.

    1977-10-01

    This report presents the results of a three and one-half-month study conducted for Brookhaven National Lab. to develop capital and operating cost relationships for seven electric power generating technologies: oil-, coal-, gas-, and nuclear-fired steam-electric plants, hydroelectric plants, and gas-turbine plants. The methodology is based primarily on statistical analysis of Federal Power Commission data for plant construction and annual operating costs. The development of cost-output relationships for electric power generation is emphasized, considering the effects of scale, technology, and location on each of the generating processes investigated. The regional effects on cost are measured at the Census Region level to be consistent with the Brookhaven Multi-Regional Energy and Interindustry Regional Model of the United States. Preliminary cost relationships for system-wide costs - transmission, distribution, and general expenses - were also derived. These preliminary results cover the demand for transmission and distribution capacity and operating and maintenance costs in terms of system-service characteristics. 15 references, 6 figures, 23 tables.

  15. Automatic Assessment of Pathological Voice Quality Using Higher-Order Statistics in the LPC Residual Domain

    Directory of Open Access Journals (Sweden)

    JiYeoun Lee

    2009-01-01

    Full Text Available A preprocessing scheme based on linear prediction coefficient (LPC) residual is applied to higher-order statistics (HOSs) for automatic assessment of an overall pathological voice quality. The normalized skewness and kurtosis are estimated from the LPC residual and show statistically meaningful distributions to characterize the pathological voice quality. 83 voice samples of the sustained vowel /a/ phonation are used in this study and are independently assessed by a speech and language therapist (SALT) according to the grade of the severity of dysphonia of GRBAS scale. These are used to train and test classification and regression tree (CART). The best result is obtained using an optimal decision tree implemented by a combination of the normalized skewness and kurtosis, with an accuracy of 92.9%. It is concluded that the method can be used as an assessment tool, providing a valuable aid to the SALT during clinical evaluation of an overall pathological voice quality.
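    The feature-extraction step described above can be approximated by fitting a linear prediction (AR) model, taking its residual, and computing normalized third- and fourth-order statistics. The sketch below does this on a synthetic sustained-vowel-like signal; the LPC order, the toy signal and the final classifier hint are assumptions, not the paper's exact pipeline.

```python
# Sketch: normalized skewness/kurtosis of an LPC residual (illustrative signal,
# simplified frame handling; not the published pipeline).
import numpy as np
from scipy.stats import skew, kurtosis

def lpc_residual(x, order=12):
    """Fit an AR(order) model by least squares and return the prediction residual."""
    X = np.column_stack([x[order - k - 1:len(x) - k - 1] for k in range(order)])
    y = x[order:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)   # LPC coefficients
    return y - X @ a                            # residual (excitation estimate)

rng = np.random.default_rng(0)
t = np.arange(16000) / 16000.0
voice = np.sin(2 * np.pi * 140 * t) + 0.1 * rng.standard_normal(t.size)  # toy /a/ phonation

res = lpc_residual(voice)
features = np.array([skew(res), kurtosis(res)])   # normalized 3rd/4th order statistics
print("HOS features (skewness, excess kurtosis):", features.round(3))
# These two features would then feed a classifier such as a CART
# (e.g. sklearn.tree.DecisionTreeClassifier) trained on GRBAS grades.
```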

  16. Several High Level Issues in Reliability Assessment of Safety-Critical Software in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Man Cheol; Jang, Seung Cheol [KAERI, Daejeon (Korea, Republic of)

    2011-08-15

    For the purpose of developing a consensus method for the reliability assessment of safety-critical digital instrumentation and control systems in nuclear power plants, several high level issues in reliability assessment of the safety-critical software based on Bayesian belief network modeling and statistical testing are discussed. Related to the Bayesian belief network modeling, the relation between the assessment approach and the sources of evidence, the relation between qualitative evidence and quantitative evidence, and how to consider qualitative evidence are discussed. Related to the statistical testing, the need of the consideration of context-specific software failure probabilities and the inability to perform a huge number of tests in the real world are discussed. The discussions in this paper are expected to provide a common basis for future discussions on the reliability assessment of safety-critical software.

  17. Microphone array power ratio for quality assessment of reverberated speech

    Science.gov (United States)

    Berkun, Reuven; Cohen, Israel

    2015-12-01

    Speech signals in enclosed environments are often distorted by reverberation and noise. In speech communication systems with several randomly distributed microphones, involving a dynamic speaker and unknown source location, it is of great interest to monitor the perceived quality at each microphone and select the signal with the best quality. Most existing approaches for quality estimation require prior information or a clean reference signal, which is unfortunately seldom available. In this paper, a practical non-intrusive method for quality assessment of reverberated speech signals is proposed. Using a statistical model of the reverberation process, we examine the energies as measured by unidirectional elements in a microphone array. By measuring the power ratio, we obtain a measure for the amount of reverberation in the received acoustic signals. This measure is then utilized to derive a blind estimate of the direct-to-reverberation energy ratio in the room. The proposed approach attains a simple, reliable, and robust quality measure, shown here through persuasive simulation results.

  18. Data base of accident and agricultural statistics for transportation risk assessment

    International Nuclear Information System (INIS)

    A state-level data base of accident and agricultural statistics has been developed to support risk assessment for transportation of spent nuclear fuels and high-level radioactive wastes. This data base will enhance the modeling capabilities for more route-specific analyses of potential risks associated with transportation of these wastes to a disposal site. The data base and methodology used to develop state-specific accident and agricultural data bases are described, and summaries of accident and agricultural statistics are provided. 27 refs., 9 tabs

  19. Data base of accident and agricultural statistics for transportation risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Saricks, C.L.; Williams, R.G.; Hopf, M.R.

    1989-11-01

    A state-level data base of accident and agricultural statistics has been developed to support risk assessment for transportation of spent nuclear fuels and high-level radioactive wastes. This data base will enhance the modeling capabilities for more route-specific analyses of potential risks associated with transportation of these wastes to a disposal site. The data base and methodology used to develop state-specific accident and agricultural data bases are described, and summaries of accident and agricultural statistics are provided. 27 refs., 9 tabs.

  20. Impact of Statistical Learning Methods on the Predictive Power of Multivariate Normal Tissue Complication Probability Models

    International Nuclear Information System (INIS)

    Purpose: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. Methods and Materials: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. Results: It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. Conclusions: The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended.

  1. Impact of Statistical Learning Methods on the Predictive Power of Multivariate Normal Tissue Complication Probability Models

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Schilstra, Cornelis; Langendijk, Johannes A.; Veld, Aart A. van' t [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands)

    2012-03-15

    Purpose: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. Methods and Materials: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. Results: It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. Conclusions: The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended.
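    A minimal sketch of the LASSO approach compared above: an L1-penalised logistic model evaluated by repeated cross-validation, which both selects a sparse set of predictors and yields an interpretable model. The simulated predictors and outcome below stand in for the dosimetric and clinical variables; they are not the xerostomia cohort.

```python
# Sketch: LASSO (L1-penalised) logistic NTCP-style model with repeated
# cross-validation on simulated data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import RepeatedStratifiedKFold, cross_val_score

rng = np.random.default_rng(1)
n, p = 200, 20
X = rng.standard_normal((n, p))                      # candidate predictors
logit = 1.2 * X[:, 0] - 0.8 * X[:, 1]                # only two truly informative variables
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))        # complication outcome (0/1)

lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
cv = RepeatedStratifiedKFold(n_splits=5, n_repeats=10, random_state=0)
auc = cross_val_score(lasso, X, y, cv=cv, scoring="roc_auc")
print(f"Repeated-CV AUC: {auc.mean():.3f} +/- {auc.std():.3f}")

lasso.fit(X, y)
print("Selected predictors:", np.flatnonzero(lasso.coef_[0] != 0))  # sparse, interpretable model
```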

  2. Assessing the relative effectiveness of statistical downscaling and distribution mapping in reproducing rainfall statistics based on climate model results

    Science.gov (United States)

    Langousis, Andreas; Mamalakis, Antonios; Deidda, Roberto; Marrocu, Marino

    2016-01-01

    To improve the skill of climate models (CMs) in reproducing the statistics of daily rainfall at a basin level, two types of statistical approaches have been suggested. One is statistical correction of CM rainfall outputs based on historical series of precipitation. The other, usually referred to as statistical rainfall downscaling, is the use of stochastic models to conditionally simulate rainfall series, based on large-scale atmospheric forcing from CMs. While promising, the latter approach has attracted less attention in recent years, since the developed downscaling schemes involved complex weather identification procedures, while demonstrating limited success in reproducing several statistical features of rainfall. In a recent effort, Langousis and Kaleris () developed a statistical framework for simulation of daily rainfall intensities conditional on upper-air variables, which is simpler to implement and more accurately reproduces several statistical properties of actual rainfall records. Here we study the relative performance of: (a) direct statistical correction of CM rainfall outputs using nonparametric distribution mapping, and (b) the statistical downscaling scheme of Langousis and Kaleris (), in reproducing the historical rainfall statistics, including rainfall extremes, at a regional level. This is done for an intermediate-sized catchment in Italy, i.e., the Flumendosa catchment, using rainfall and atmospheric data from four CMs of the ENSEMBLES project. The obtained results are promising, since the proposed downscaling scheme is more accurate and robust in reproducing a number of historical rainfall statistics, independent of the CM used and the characteristics of the calibration period. This is particularly the case for yearly rainfall maxima.
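    Approach (a), nonparametric distribution mapping, amounts to mapping each model value through the empirical quantiles of the model calibration series onto the corresponding quantiles of the observations. The sketch below illustrates this with synthetic gamma-distributed daily rainfall; the distributions and the number of quantiles are assumptions, not the Flumendosa data or the exact scheme used in the study.

```python
# Sketch of nonparametric distribution (quantile) mapping of climate-model
# daily rainfall onto observed rainfall (synthetic data).
import numpy as np

rng = np.random.default_rng(2)
obs_calib = rng.gamma(shape=0.6, scale=8.0, size=5000)   # observed rainfall, calibration period
cm_calib  = rng.gamma(shape=0.9, scale=4.0, size=5000)   # model rainfall, same period (biased)
cm_future = rng.gamma(shape=0.9, scale=4.5, size=5000)   # model rainfall to be corrected

def quantile_map(x, model_ref, obs_ref, n_q=99):
    """Map x through the empirical CDF of model_ref onto the quantiles of obs_ref."""
    q = np.linspace(1, n_q, n_q) / (n_q + 1)
    src = np.quantile(model_ref, q)
    dst = np.quantile(obs_ref, q)
    return np.interp(x, src, dst)            # linear interpolation between mapped quantiles

corrected = quantile_map(cm_future, cm_calib, obs_calib)
for stat, f in [("mean", np.mean), ("std", np.std), ("p99", lambda v: np.quantile(v, 0.99))]:
    print(f"{stat}: obs={f(obs_calib):6.2f}  raw CM={f(cm_future):6.2f}  mapped={f(corrected):6.2f}")
```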

  3. Assessment and financing of electric power projects

    International Nuclear Information System (INIS)

    The aim of the appraisal of a project is to examine the economic need which a project is designed to meet, to judge whether the project is likely to meet this need in an efficient way, and to conclude what conditions should be attached to eventual Bank financing. Bank involvement continues throughout the life of the project helping to ensure that each project is carried out at the least possible cost and that it makes the expected contribution to the country's development. This paper gives an idea about the origin, nature and functions of the World Bank Group, describes the criteria used by the Bank in its power project appraisals, discusses the Bank's views on nuclear power, and concludes with a review of past lending and probable future sources of financing of electrical expansion in the less developed countries. (orig./UA)

  4. TIDAL POWER: Economic and Technological Assessment

    OpenAIRE

    Montllonch Araquistain, Tatiana

    2010-01-01

    At present there is concern over global climate change, as well as a growing awareness among the worldwide population about the need to reduce greenhouse gas emissions. This, in fact, has led to an increase in power generation from renewable sources. Tidal energy has the potential to play a valuable role in a sustainable energy future. Its main advantage over other renewable sources is its predictability; tides can be predicted years in advance. The energy extracted from the tides can come fr...

  5. Robust Statistical Tests of Dragon-Kings beyond Power Law Distributions

    CERN Document Server

    Pisarenko, V F

    2011-01-01

    We ask the question whether it is possible to diagnose the existence of "Dragon-Kings" (DK), namely anomalous observations compared to a power law background distribution of event sizes. We present two new statistical tests, the U-test and the DK-test, aimed at identifying the existence of even a single anomalous event in the tail of the distribution of just a few tens of observations. The DK-test in particular is derived such that the p-value of its statistic is independent of the exponent characterizing the null hypothesis. We demonstrate how to apply these two tests on the distributions of cities and of agglomerations in a number of countries. We find the following evidence for Dragon-Kings: London in the distribution of city sizes of Great Britain; Moscow and St-Petersburg in the distribution of city sizes in the Russian Federation; and Paris in the distribution of agglomeration sizes in France. True negatives are also reported, for instance the absence of Dragon-Kings in the distribution of cities in Ger...

  6. Decision tree approach to power systems security assessment

    OpenAIRE

    Wehenkel, Louis; Pavella, Mania

    1993-01-01

    An overview of the general decision tree approach to power system security assessment is presented. The general decision tree methodology is outlined, modifications proposed in the context of transient stability assessment are embedded, and further refinements are considered. The approach is then suitably tailored to handle other specifics of power systems security, relating to both preventive and emergency voltage control, in addition to transient stability. Trees are accordingly built in th...

  7. A statistical model for seismic hazard assessment of hydraulic-fracturing-induced seismicity

    Science.gov (United States)

    Hajati, T.; Langenbruch, C.; Shapiro, S. A.

    2015-12-01

    We analyze the interevent time distribution of hydraulic-fracturing-induced seismicity collected during 18 stages at four different regions. We identify a universal statistical process describing the distribution of hydraulic-fracturing-induced events in time. The distribution of waiting times between subsequently occurring events is given by the exponential probability density function of the homogeneous Poisson process. Our findings suggest that hydraulic-fracturing-induced seismicity is directly triggered by the relaxation of stress and pore pressure perturbation initially created by the injection. Therefore, compared to this relaxation, the stress transfer caused by the occurrence of preceding seismic events is mainly insignificant for the seismogenesis of subsequently occurring events. We develop a statistical model to compute the occurrence probability of hydraulic-fracturing-induced seismicity. This model can be used to assess the seismic hazard associated with hydraulic fracturing operations. No aftershock triggering has to be included in the statistical model.
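    Under the homogeneous Poisson model described above, the waiting times are exponential and the occurrence probability over a horizon follows directly from the fitted rate. The sketch below estimates the rate from synthetic interevent times and computes the probability of at least one event in a future window; the event times and window length are illustrative, not the field data from the 18 stages.

```python
# Sketch: exponential fit of interevent times and Poisson occurrence probability
# (synthetic event times for one hypothetical stimulation stage).
import numpy as np

rng = np.random.default_rng(3)
event_times = np.cumsum(rng.exponential(scale=120.0, size=400))  # seconds

waits = np.diff(event_times)
rate = 1.0 / waits.mean()                     # MLE of the homogeneous Poisson rate (events/s)

T = 600.0                                     # future window of interest, s
p_at_least_one = 1.0 - np.exp(-rate * T)
print(f"Estimated rate: {rate:.4f} events/s")
print(f"P(>=1 event in the next {T:.0f} s) = {p_at_least_one:.3f}")

# Simple check of the exponential assumption: empirical vs. model quantiles.
q = np.linspace(0.05, 0.95, 10)
print(np.column_stack([np.quantile(waits, q), -np.log(1 - q) / rate]).round(1))
```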

  8. Employment of kernel methods on wind turbine power performance assessment

    DEFF Research Database (Denmark)

    Skrimpas, Georgios Alexandros; Sweeney, Christian Walsted; Marhadi, Kun S.; Jensen, Bogi Bech; Mijatovic, Nenad; Holbøll, Joachim

    2015-01-01

    A power performance assessment technique is developed for the detection of power production discrepancies in wind turbines. The method employs a widely used nonparametric pattern recognition technique, the kernel methods. The evaluation is based on the trending of an extracted feature from the ke...

  9. Transient stability risk assessment of power systems incorporating wind farms

    DEFF Research Database (Denmark)

    Miao, Lu; Fang, Jiakun; Wen, Jinyu;

    2013-01-01

    Large-scale wind farm integration has brought several aspects of challenges to the transient stability of power systems. This paper focuses on the research of the transient stability of power systems incorporating with wind farms by utilizing risk assessment methods. The detailed model of double ...

  10. Windfarm Generation Assessment for ReliabilityAnalysis of Power Systems

    DEFF Research Database (Denmark)

    Negra, Nicola Barberis; Holmstrøm, Ole; Bak-Jensen, Birgitte; Sørensen, Poul

    2007-01-01

    Due to the fast development of wind generation in the past ten years, increasing interest has been paid to techniques for assessing different aspects of power systems with a large amount of installed wind generation. One of these aspects concerns power system reliability. Windfarm modelling plays a...

  11. Windfarm Generation Assessment for Reliability Analysis of Power Systems

    DEFF Research Database (Denmark)

    Barberis Negra, Nicola; Bak-Jensen, Birgitte; Holmstrøm, O.; Sørensen, P.

    2007-01-01

    Due to the fast development of wind generation in the past ten years, increasing interest has been paid to techniques for assessing different aspects of power systems with a large amount of installed wind generation. One of these aspects concerns power system reliability. Windfarm modelling plays a...

  12. Windfarm generation assessment for reliability analysis of power systems

    DEFF Research Database (Denmark)

    Negra, N.B.; Holmstrøm, O.; Bak-Jensen, B.; Sørensen, Poul Ejnar

    2007-01-01

    Due to the fast development of wind generation in the past ten years, increasing interest has been paid to techniques for assessing different aspects of power systems with a large amount of installed wind generation. One of these aspects concerns power system reliability. Windfarm modelling plays a...

  13. Statistical analysis of wind power in the region of Veracruz (Mexico)

    Energy Technology Data Exchange (ETDEWEB)

    Cancino-Solorzano, Yoreley [Departamento de Ing Electrica-Electronica, Instituto Tecnologico de Veracruz, Calzada Miguel A. de Quevedo 2779, 91860 Veracruz (Mexico); Xiberta-Bernat, Jorge [Departamento de Energia, Escuela Tecnica Superior de Ingenieros de Minas, Universidad de Oviedo, C/Independencia, 13, 2a Planta, 33004 Oviedo (Spain)

    2009-06-15

    The capacity of the Mexican electricity sector faces the challenge of satisfying the demand of 80 GW forecast by 2016. This value supposes a steady yearly average increase of some 4.9%. The electricity sector additions over the next eight years will be mainly made up of combined cycle power plants, which could be a threat to the energy supply of the country due to the fact that the country is not self-sufficient in natural gas. As an alternative, the wind energy resource could be a more suitable option compared with combined cycle power plants. This option is backed by market trends indicating that wind technology costs will continue to decrease in the near future, as has happened in recent years. Evaluation of the wind potential in different areas of the country must be carried out in order to achieve the best possible use of this option. This paper gives a statistical analysis of the wind characteristics in the region of Veracruz. The daily, monthly and annual wind speed values have been studied together with their prevailing direction. The data analyzed correspond to five meteorological stations and two anemometric stations located in the aforementioned area. (author)
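    Regional wind assessments of this kind commonly summarise the speed records with a Weibull fit and a mean power density estimate. The sketch below shows those two steps on synthetic daily-mean speeds; the Weibull parameters and the series are assumptions, not the Veracruz station records, and the paper itself may use different summaries.

```python
# Sketch: Weibull fit of wind speeds and a simple mean power density estimate
# (synthetic speeds, not the Veracruz measurements).
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
wind = 6.5 * rng.weibull(2.1, 365)                         # synthetic daily-mean speeds, m/s

shape, loc, scale = stats.weibull_min.fit(wind, floc=0)    # fit with location fixed at zero
print(f"Weibull shape k = {shape:.2f}, scale A = {scale:.2f} m/s")

rho = 1.225                                                # air density, kg/m^3
mean_power_density = 0.5 * rho * np.mean(wind ** 3)        # W/m^2, proportional to cube of speed
print(f"Mean wind power density ~ {mean_power_density:.0f} W/m^2")
```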

  14. Automated Data Collection for Determining Statistical Distributions of Module Power Undergoing Potential-Induced Degradation: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Hacke, P.; Spataru, S.

    2014-08-01

    We propose a method for increasing the frequency of data collection and reducing the time and cost of accelerated lifetime testing of photovoltaic modules undergoing potential-induced degradation (PID). This consists of in-situ measurements of dark current-voltage curves of the modules at elevated stress temperature, their use to determine the maximum power at 25 degrees C standard test conditions (STC), and distribution statistics for determining degradation rates as a function of stress level. The semi-continuous data obtained by this method clearly show degradation curves of the maximum power, including an incubation phase, rates and extent of degradation, precise time to failure, and partial recovery. Stress tests were performed on crystalline silicon modules at 85% relative humidity and 60 degrees C, 72 degrees C, and 85 degrees C. Activation energy for the mean time to failure (1% relative) of 0.85 eV was determined and a mean time to failure of 8,000 h at 25 degrees C and 85% relative humidity is predicted. No clear trend in maximum degradation as a function of stress temperature was observed.
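    The Arrhenius step described above can be reproduced by regressing the log of the failure time on 1/(kT) at the stress temperatures and extrapolating to use conditions. The sketch below shows the calculation; the three MTTF values are hypothetical placeholders, so the fitted activation energy will not match the reported 0.85 eV.

```python
# Sketch of the Arrhenius extrapolation step: regress ln(time-to-failure) on
# 1/(kT) at the stress temperatures and extrapolate to 25 C. The MTTF values
# below are illustrative placeholders, not the measured PID data.
import numpy as np

k_b = 8.617e-5                                   # Boltzmann constant, eV/K
temps_c = np.array([60.0, 72.0, 85.0])           # stress temperatures from the test matrix
mttf_h = np.array([900.0, 350.0, 120.0])         # hypothetical mean times to 1% power loss (h)

inv_kT = 1.0 / (k_b * (temps_c + 273.15))
slope, intercept = np.polyfit(inv_kT, np.log(mttf_h), 1)
print(f"Estimated activation energy Ea = {slope:.2f} eV")

use = 1.0 / (k_b * (25.0 + 273.15))
mttf_25c = np.exp(intercept + slope * use)
print(f"Extrapolated MTTF at 25 C, 85% RH: {mttf_25c:,.0f} h")
```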

  15. Probabilistic safety assessment for optimum nuclear power plant life management (PLiM) theory and application of reliability analysis methods for major power plant components

    CERN Document Server

    Arkadov, G V; Rodionov, A N

    2012-01-01

    Probabilistic safety assessment methods are used to calculate nuclear power plant durability and resource lifetime. Successful calculation of the reliability and ageing of components is critical for forecasting safety and directing preventative maintenance, and Probabilistic safety assessment for optimum nuclear power plant life management provides a comprehensive review of the theory and application of these methods. Part one reviews probabilistic methods for predicting the reliability of equipment. Following an introduction to key terminology, concepts and definitions, formal-statistical and various physico-statistical approaches are discussed. Approaches based on the use of defect-free models are considered, along with those using binomial distribution and models bas...

  16. Nuclear power plant performance statistics. Comparison with fossil-fired units

    International Nuclear Information System (INIS)

    The joint UNIPEDE/World Energy Conference Committee on Availability of Thermal Generating Plants has a mandate to study the availability of thermal plants and the different factors that influence it. This has led to the collection and publication at the Congress of the World Energy Conference (WEC) every third year of availability and unavailability factors to be used in systems reliability studies and operations and maintenance planning. For nuclear power plants the joint UNIPEDE/WEC Committee relies on the IAEA to provide availability and unavailability data. The IAEA has published an annual report with operating data from nuclear plants in its Member States since 1971, covering in addition back data from the early 1960s. These reports have developed over the years and in the early 1970s the format was brought into close conformity with that used by UNIPEDE and WEC to report performance of fossil-fired generating plants. Since 1974 an annual analytical summary report has been prepared. In 1981 all information on operating experience with nuclear power plants was placed in a computer file for easier reference. The computerized Power Reactor Information System (PRIS) ensures that data are easily retrievable and at its present level it remains compatible with various national systems. The objectives for the IAEA data collection and evaluation have developed significantly since 1970. At first, the IAEA primarily wanted to enable the individual power plant operator to compare the performance of his own plant with that of others of the same type; when enough data had been collected, they provided the basis for assessment of the fundamental performance parameters used in economic project studies; now, the data base merits being used in setting availability objectives for power plant operations. (author)

  17. Blind image quality assessment using statistical independence in the divisive normalization transform domain

    Science.gov (United States)

    Chu, Ying; Mou, Xuanqin; Fu, Hong; Ji, Zhen

    2015-11-01

    We present a general purpose blind image quality assessment (IQA) method using the statistical independence hidden in the joint distributions of divisive normalization transform (DNT) representations for natural images. The DNT simulates the redundancy reduction process of the human visual system and has good statistical independence for natural undistorted images; meanwhile, this statistical independence changes as the images suffer from distortion. Inspired by this, we investigate the changes in statistical independence between neighboring DNT outputs across the space and scale for distorted images and propose an independence uncertainty index as a blind IQA (BIQA) feature to measure the image changes. The extracted features are then fed into a regression model to predict the image quality. The proposed BIQA metric is called statistical independence (STAIND). We evaluated STAIND on five public databases: LIVE, CSIQ, TID2013, IRCCyN/IVC Art IQA, and intentionally blurred background images. The performances are relatively high for both single- and cross-database experiments. When compared with the state-of-the-art BIQA algorithms, as well as representative full-reference IQA metrics, such as SSIM, STAIND shows fairly good performance in terms of quality prediction accuracy, stability, robustness, and computational costs.

  18. Evaluation of a Regional Monitoring Program's Statistical Power to Detect Temporal Trends in Forest Health Indicators

    Science.gov (United States)

    Perles, Stephanie J.; Wagner, Tyler; Irwin, Brian J.; Manning, Douglas R.; Callahan, Kristina K.; Marshall, Matthew R.

    2014-09-01

    Forests are socioeconomically and ecologically important ecosystems that are exposed to a variety of natural and anthropogenic stressors. As such, monitoring forest condition and detecting temporal changes therein remain critical to sound public and private forestland management. The National Park Service's Vital Signs monitoring program collects information on many forest health indicators, including species richness, cover by exotics, browse pressure, and forest regeneration. We applied a mixed-model approach to partition variability in data for 30 forest health indicators collected from several national parks in the eastern United States. We then used the estimated variance components in a simulation model to evaluate trend detection capabilities for each indicator. We investigated the extent to which the following factors affected the ability to detect trends: (a) sample design: using simple panel versus connected panel design, (b) effect size: increasing trend magnitude, (c) sample size: varying the number of plots sampled each year, and (d) stratified sampling: post-stratifying plots into vegetation domains. Statistical power varied among indicators; however, indicators that measured the proportion of a total yielded higher power when compared to indicators that measured absolute or average values. In addition, the total variability for an indicator appeared to influence power to detect temporal trends more than how total variance was partitioned among spatial and temporal sources. Based on these analyses and the monitoring objectives of the Vital Signs program, the current sampling design is likely overly intensive for detecting a 5% trend·year⁻¹ for all indicators and is appropriate for detecting a 1% trend·year⁻¹ in most indicators.
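    The simulation approach described above can be approximated by generating plot-level data from assumed variance components plus a fixed trend, refitting a trend test many times, and taking the rejection rate as statistical power. The sketch below uses a simple test on annual means rather than the full mixed model, and all variance components, sample sizes and the trend size are hypothetical.

```python
# Sketch: simulation-based power analysis for detecting a temporal trend in a
# monitoring indicator (hypothetical variance components and sample sizes).
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

def simulate_power(n_plots=30, n_years=10, trend=0.03, sd_plot=0.5,
                   sd_year=0.2, sd_resid=0.4, n_sim=1000, alpha=0.05):
    years = np.arange(n_years)
    hits = 0
    for _ in range(n_sim):
        plot_eff = rng.normal(0, sd_plot, n_plots)[:, None]       # plot-to-plot variation
        year_eff = rng.normal(0, sd_year, n_years)[None, :]       # shared year-to-year variation
        noise = rng.normal(0, sd_resid, (n_plots, n_years))
        y = 5.0 + trend * years + plot_eff + year_eff + noise     # indicator values
        annual_mean = y.mean(axis=0)                              # simple summary per year
        res = stats.linregress(years, annual_mean)                # trend test on annual means
        hits += res.pvalue < alpha
    return hits / n_sim

for n_plots in (10, 30, 60):
    print(f"{n_plots} plots per year: power ≈ {simulate_power(n_plots=n_plots):.2f}")
```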

  19. Application of advanced statistical methods in assessment of the late phase of a nuclear accident

    Czech Academy of Sciences Publication Activity Database

    Hofman, Radek

    Praha: ČVUT, 2008, s. 1-4. [Dny radiacni ochrany /30/.. Liptovsky Jan (SK), 10.11.2008-14.11.2008] R&D Projects: GA ČR(CZ) GA102/07/1596 Institutional research plan: CEZ:AV0Z10750506 Keywords : radiation protection Subject RIV: DI - Air Pollution ; Quality http://library.utia.cas.cz/separaty/2008/AS/hofman-application of advanced statistical methods in assessment of the late phase of a nuclear accident.pdf

  20. Multivariable statistical process control to situation assessment of a sequencing batch reactor

    OpenAIRE

    Ruiz Ordóñez, Magda; Colomer, Joan; Colprim, Jesus; Meléndez, Joaquim

    2004-01-01

    In this work, a combination of Multivariate Statistical Process Control (MSPC) and an automatic classification algorithm is developed for application in a Waste Water Treatment Plant. Multiway Principal Component Analysis is used as the MSPC method. The goal is to create a model that describes the batch direction and helps to fix the limits used to determine abnormal situations. Then, an automatic classification algorithm is used for situation assessment of the process.

  1. Statistical issues in the assessment of health outcomes in children : methodological review.

    OpenAIRE

    Lancaster, Gillian

    2009-01-01

    The lack of outcome measures that are validated for use on children limits the effectiveness and generalizability of paediatric health care interventions. Statistical epidemiology is a broad concept encompassing a wide range of useful techniques for use in child health outcome assessment and development. However, the range of techniques that are available is often confusing and prohibits their adoption. In the paper an overview of methodology is provided within the paediatric context. It is d...

  2. A statistical assessment of population trends for data deficient Mexican amphibians

    OpenAIRE

    Esther Quintero; Thessen, Anne E.; Paulina Arias-Caballero; Bárbara Ayala-Orozco

    2014-01-01

    Background. Mexico hosts the world's fifth largest amphibian fauna and has the second highest number of threatened amphibian species. About 10% of Mexican amphibians lack enough data to be assigned to a risk category by the IUCN, so in this paper we test a statistical tool that, in the absence of specific demographic data, can assess a species' risk of extinction and population trend, and can help identify which variables increase their vulnerability. Recent stud...

  3. Water Quality Assessment of Gufu River in Three Gorges Reservoir (China) Using Multivariable Statistical Methods

    OpenAIRE

    Jiwen Ge; Guihua Ran; Wenjie Miao; Huafeng Cao; Shuyuan Wu; Lamei Cheng

    2013-01-01

    To provide the reasonable basis for scientific management of water resources and certain directive significance for sustaining health of Gufu River and even maintaining the stability of water ecosystem of the Three-Gorge Reservoir of Yangtze River, central China, multiple statistical methods including Cluster Analysis (CA), Discriminant Analysis (DA) and Principal Component Analysis (PCA) were performed to assess the spatial-temporal variations and interpret water quality data. The data were ...

  4. Evaluation and assessment of nuclear power plant seismic methodology

    International Nuclear Information System (INIS)

    The major emphasis of this study is to develop a methodology that can be used to assess the current methods used for assuring the seismic safety of nuclear power plants. The proposed methodology makes use of system-analysis techniques and Monte Carlo schemes. Also, in this study, we evaluate previous assessments of the current seismic-design methodology

  5. Power source life cycle assessment by the Bilan Carbone method

    International Nuclear Information System (INIS)

    Bilan Carbone is a method for assessing the energy spent in the form of CO2 emissions and its impact on climate change (the carbon footprint). The method assesses each step in power production and identifies hidden energy flows for modelling future energy scenarios. The principles of the method are outlined and an example of its application is presented. (orig.)

  6. Evaluation and assessment of nuclear power plant seismic methodology

    Energy Technology Data Exchange (ETDEWEB)

    Bernreuter, D.; Tokarz, F.; Wight, L.; Smith, P.; Wells, J.; Barlow, R.

    1977-03-01

    The major emphasis of this study is to develop a methodology that can be used to assess the current methods used for assuring the seismic safety of nuclear power plants. The proposed methodology makes use of system-analysis techniques and Monte Carlo schemes. Also, in this study, we evaluate previous assessments of the current seismic-design methodology.

  7. A statistical assessment of differences and equivalences between genetically modified and reference plant varieties

    Directory of Open Access Journals (Sweden)

    Amzal Billy

    2011-02-01

    Full Text Available Abstract Background Safety assessment of genetically modified organisms is currently often performed by comparative evaluation. However, natural variation of plant characteristics between commercial varieties is usually not considered explicitly in the statistical computations underlying the assessment. Results Statistical methods are described for the assessment of the difference between a genetically modified (GM) plant variety and a conventional non-GM counterpart, and for the assessment of the equivalence between the GM variety and a group of reference plant varieties which have a history of safe use. It is proposed to present the results of both difference and equivalence testing for all relevant plant characteristics simultaneously in one or a few graphs, as an aid for further interpretation in safety assessment. A procedure is suggested to derive equivalence limits from the observed results for the reference plant varieties using a specific implementation of the linear mixed model. Three different equivalence tests are defined to classify any result in one of four equivalence classes. The performance of the proposed methods is investigated by a simulation study, and the methods are illustrated on compositional data from a field study on maize grain. Conclusions A clear distinction of practical relevance is shown between difference and equivalence testing. The proposed tests are shown to have appropriate performance characteristics by simulation, and the proposed simultaneous graphical representation of results was found to be helpful for the interpretation of results from a practical field trial data set.
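    The distinction between difference and equivalence testing can be illustrated for a single analyte with a standard two-sample difference test and a two-one-sided-tests (TOST) equivalence assessment against limits derived from the reference varieties. In the sketch below the data are simulated and the ±20% limits are a crude stand-in for the mixed-model derivation of equivalence limits proposed in the paper.

```python
# Sketch: difference vs. equivalence (TOST) testing for one compositional
# analyte (simulated values, illustrative +/-20% equivalence limits).
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
gm = rng.normal(10.2, 0.8, 20)            # GM variety, analyte level per plot
counterpart = rng.normal(10.0, 0.8, 20)   # conventional non-GM counterpart
reference = rng.normal(10.0, 1.2, 60)     # pooled commercial reference varieties

# 1) Difference test: GM vs. its conventional counterpart.
_, p_diff = stats.ttest_ind(gm, counterpart)
print(f"difference test p-value: {p_diff:.3f}")

# 2) Equivalence limits from the reference varieties, then TOST:
#    both one-sided tests must reject for equivalence to be concluded.
lower, upper = reference.mean() * 0.8, reference.mean() * 1.2
_, p_lower = stats.ttest_1samp(gm, lower, alternative="greater")
_, p_upper = stats.ttest_1samp(gm, upper, alternative="less")
p_equiv = max(p_lower, p_upper)
print(f"equivalence (TOST) p-value vs. [{lower:.1f}, {upper:.1f}]: {p_equiv:.3f}")
```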

  8. Technology assessment Jordan Nuclear Power Plant Project

    International Nuclear Information System (INIS)

    A preliminary regional analysis was carried out to identify potential sites for a nuclear power plant, followed by screening of these sites and selection of candidate sites. Sites near Aqaba are proposed, where sea water can be used for cooling: (i) Site 1, on the coast, where sea water can be used for direct cooling; (ii) Site 2, 10 km east of the Gulf of Aqaba shoreline at the Saudi Arabian border; (iii) Site 3, 4 km east of the Gulf of Aqaba shoreline. Only the granitic basement in the east of the 6 km² site should be considered as a potential site for an NPP. A preliminary probabilistic seismic hazard assessment gives an Operating-Basis Earthquake (OBE, 475-year return period) in the range of 0.163-0.182 g and a Safe Shutdown Earthquake (SSE, 10,000-year return period) in the range of 0.333-0.502 g. The process also includes setting up a nuclear company and other organizational matters. Regulations in development cover: site approval; construction permitting; overall licensing; safety (design, construction, training, operations, QA); emergency planning; decommissioning; and spent fuel and radioactive waste management. JAEC's technology assessment strategy and evaluation methodology are presented.

  9. Hybrid algorithm for rotor angle security assessment in power systems

    Directory of Open Access Journals (Sweden)

    D. Prasad Wadduwage

    2015-08-01

    Full Text Available Transient rotor angle stability assessment and oscillatory rotor angle stability assessment subsequent to a contingency are integral components of dynamic security assessment (DSA) in power systems. This study proposes a hybrid algorithm to determine whether the post-fault power system is secure due to both transient rotor angle stability and oscillatory rotor angle stability subsequent to a set of known contingencies. The hybrid algorithm first uses a new security measure developed based on the concept of Lyapunov exponents (LEs) to determine the transient security of the post-fault power system. Later, the transient secure power swing curves are analysed using an improved Prony algorithm which extracts the dominant oscillatory modes and estimates their damping ratios. The damping ratio is a security measure about the oscillatory security of the post-fault power system subsequent to the contingency. The suitability of the proposed hybrid algorithm for DSA in power systems is illustrated using different contingencies of a 16-generator 68-bus test system and a 50-generator 470-bus test system. The accuracy of the stability conclusions and the acceptable computational burden indicate that the proposed hybrid algorithm is suitable for real-time security assessment with respect to both transient rotor angle stability and oscillatory rotor angle stability under multiple contingencies of the power system.
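    The oscillatory-security step relies on extracting the dominant modes and their damping ratios from a post-fault swing curve. The sketch below applies a textbook Prony-type fit (linear prediction followed by root extraction) to a synthetic ring-down signal; it is not the paper's improved Prony algorithm, and the mode parameters and model order are illustrative assumptions.

```python
# Sketch: basic Prony-type fit of a ring-down signal to extract modal frequency
# and damping ratio (synthetic swing curve standing in for a rotor-angle trace).
import numpy as np

dt = 0.02                                         # sampling interval, s
t = np.arange(0, 10, dt)
zeta, f = 0.05, 0.7                               # synthetic 0.7 Hz mode with 5% damping
omega = 2 * np.pi * f
sig = np.exp(-zeta * omega * t) * np.cos(omega * np.sqrt(1 - zeta**2) * t)
sig += 0.01 * np.random.default_rng(7).standard_normal(t.size)

p = 6                                             # model order (over-specified to absorb noise)
# Linear prediction: sig[n] = sum_k a_k * sig[n-k], solved by least squares.
A = np.column_stack([sig[p - k - 1:len(sig) - k - 1] for k in range(p)])
a, *_ = np.linalg.lstsq(A, sig[p:], rcond=None)

# Discrete poles -> continuous poles -> frequency and damping ratio per mode.
z = np.roots(np.concatenate(([1.0], -a)))
s = np.log(z) / dt
for si in s:
    if si.imag > 0.1:                             # keep one of each conjugate pair
        freq = si.imag / (2 * np.pi)
        damping_ratio = -si.real / abs(si)
        print(f"mode: f = {freq:.2f} Hz, damping ratio = {damping_ratio:.3f}")
```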

  10. National-Scale Wind Resource Assessment for Power Generation (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Baring-Gould, E. I.

    2013-08-01

    This presentation describes the current standards for conducting a national-scale wind resource assessment for power generation, along with the risk/benefit considerations to be considered when beginning a wind resource assessment. The presentation describes changes in turbine technology and viable wind deployment due to more modern turbine technology and taller towers and shows how the Philippines national wind resource assessment evolved over time to reflect changes that arise from updated technologies and taller towers.

  11. Risk assessment of power systems models, methods, and applications

    CERN Document Server

    Li, Wenyuan

    2014-01-01

    Risk Assessment of Power Systems addresses the regulations and functions of risk assessment with regard to its relevance in system planning, maintenance, and asset management. Brimming with practical examples, this edition introduces the latest risk information on renewable resources, the smart grid, voltage stability assessment, and fuzzy risk evaluation. It is a comprehensive reference of a highly pertinent topic for engineers, managers, and upper-level students who seek examples of risk theory applications in the workplace.

  12. Probabilistic safety assessment in nuclear power plant management

    International Nuclear Information System (INIS)

    Probabilistic Safety Assessment (PSA) techniques have been widely used over the past few years to assist in understanding how engineered systems respond to abnormal conditions, particularly during a severe accident. The use of PSAs in the design and operation of such systems thus contributes to the safety of nuclear power plants. Probabilistic safety assessments can be maintained to provide a continuous up-to-date assessment (Living PSA), supporting the management of plant operations and modifications

  13. Method for assessing wind power integration in a hydro based power system

    International Nuclear Information System (INIS)

    The present paper demonstrates a method for assessing how much wind power can be integrated in a system with limited transmission capacity. Based on hydro inflow data and wind measurements (for different locations of planned wind farms in an area) it is possible to assess how much wind power can be fed into a certain point in the transmission network without violating the transmission capacity limits. The proposed method combines the use of market modelling and detailed network analysis in order to assess the probability of network congestions rather than focusing on extreme cases. By computing the probability distribution of power flow on critical corridors in the grid it is possible to assess the likelihood of network congestions and the amount of energy that must be curtailed to fulfil power system security requirements (n-1). This way the assessment is not based only on worst case scenarios, which assume maximal flow from hydro plants and maximal wind power production. As extreme case scenarios are short term and may be solved by market mechanisms or automatic system protection schemes (disconnection of wind power or hydro power), the proposed method may reveal that it would be economic to install more wind power than if the assessment were based only on worst case scenarios. (orig.)
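    The probabilistic part of the method can be illustrated by sampling hydro and wind production, forming the distribution of flow on a constrained corridor, and reading off the congestion probability and the wind energy that would have to be curtailed. In the sketch below every number (capacities, production distributions, the share of surplus routed over the corridor) is a hypothetical assumption, not data from the studied system.

```python
# Sketch: Monte Carlo estimate of corridor congestion probability and wind
# curtailment (all capacities and distributions are hypothetical).
import numpy as np

rng = np.random.default_rng(8)
n_hours = 8760 * 10                                   # ten simulated years, hourly

hydro = rng.uniform(200.0, 600.0, n_hours)            # MW, stand-in for market-driven hydro dispatch
wind_cap = 300.0                                      # MW installed wind behind the corridor
wind = wind_cap * rng.beta(1.2, 3.0, n_hours)         # MW, skewed wind production
local_load = 150.0                                    # MW, constant local demand for simplicity

corridor_limit = 550.0                                # MW (n-1 secure transfer limit)
flow = 0.7 * (hydro + wind - local_load)              # share of the surplus routed over the corridor

congested = flow > corridor_limit
curtailed = np.minimum(np.maximum(flow - corridor_limit, 0.0) / 0.7, wind)  # MW of wind shed

print(f"P(congestion) = {congested.mean():.3%}")
print(f"Curtailed wind energy = {curtailed.sum() / 1e3:.1f} GWh over {n_hours / 8760:.0f} years "
      f"({curtailed.sum() / wind.sum():.2%} of wind production)")
```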

  14. Assessment of a satellite power system and six alternative technologies

    Energy Technology Data Exchange (ETDEWEB)

    Wolsko, T.; Whitfield, R.; Samsa, M.; Habegger, L.S.; Levine, E.; Tanzman, E.

    1981-04-01

    The satellite power system is assessed in comparison to six alternative technologies. The alternatives are: central-station terrestrial photovoltaic systems, conventional coal-fired power plants, coal-gasification/combined-cycle power plants, light water reactor power plants, liquid-metal fast-breeder reactors, and fusion. The comparison is made regarding issues of cost and performance, health and safety, environmental effects, resources, socio-economic factors, and institutional issues. The criteria for selecting the issues and the alternative technologies are given, and the methodology of the comparison is discussed. Brief descriptions of each of the technologies considered are included. (LEW)

  15. Statistics of 150-km echoes over Jicamarca based on low-power VHF observations

    Directory of Open Access Journals (Sweden)

    J. L. Chau

    2006-07-01

    Full Text Available In this work we summarize the statistics of the so-called 150-km echoes obtained with a low-power VHF radar operating at the Jicamarca Radio Observatory (11.97° S, 76.87° W, 1.3° dip angle at 150-km altitude) in Peru. Our results are based on almost four years of observations between August 2001 and July 2005 (approximately 150 days per year). The majority of the observations have been conducted between 08:00 and 17:00 LT. We present the statistics of occurrence of the echoes for each of the four seasons as a function of time of day and altitude. The occurrence frequency of the echoes is ~75% around noon, starts decreasing after 15:00 LT, and the echoes disappear after 17:00 LT in all seasons. As shown in previous campaign observations, the 150-km echoes appear at a higher altitude (>150 km) in narrow layers in the morning, reach lower altitudes (~135 km) around noon, and disappear at higher altitudes (>150 km) after 17:00 LT. We show that although 150-km echoes are observed all year long, they exhibit a clear seasonal variability in altitudinal coverage and in the percentage of occurrence around noon and early in the morning. We also show that there is a strong day-to-day variability, and no correlation with magnetic activity. Although our results do not solve the 150-km riddle, they should be taken into account when a reasonable theory is proposed.

  16. Assessing the impact of vaccination programmes on burden of disease: Underlying complexities and statistical methods.

    Science.gov (United States)

    Mealing, Nicole; Hayen, Andrew; Newall, Anthony T

    2016-06-01

    It is important to assess the impact a vaccination programme has on the burden of disease after it is implemented. For example, this may reveal herd immunity effects or vaccine-induced shifts in the incidence of disease or in circulating strains or serotypes of the pathogen. In this article we summarise the key features of infectious diseases that need to be considered when trying to detect any changes in the burden of diseases at a population level as a result of vaccination efforts. We outline the challenges of using routine surveillance databases to monitor infectious diseases, such as the identification of diseased cases and the availability of vaccination status for cases. We highlight the complexities in modelling the underlying patterns in infectious disease rates (e.g. presence of autocorrelation) and discuss the main statistical methods that can be used to control for periodicity (e.g. seasonality) and autocorrelation when assessing the impact of vaccination programmes on burden of disease (e.g. cosinor terms, generalised additive models, autoregressive processes and moving averages). For some analyses, there may be multiple methods that can be used, but it is important for authors to justify the method chosen and discuss any limitations. We present a case study review of the statistical methods used in the literature to assess the rotavirus vaccination programme impact in Australia. The methods used varied and included generalised linear models and descriptive statistics. Not all studies accounted for autocorrelation and seasonality, which can have a major influence on results. We recommend that future analyses consider the strength and weakness of alternative statistical methods and justify their choice. PMID:27156635
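    One of the methods named above, cosinor terms in a generalised linear model, can be sketched as a Poisson regression with annual sine and cosine terms plus a step for the post-vaccination period. The simulated monthly counts below are illustrative, not the Australian rotavirus data, and the sketch ignores the autocorrelation adjustments discussed in the article.

```python
# Sketch: Poisson GLM with cosinor (seasonality) terms and a programme-period
# step term, fitted to simulated monthly case counts.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(9)
months = np.arange(120)                               # ten years of monthly counts
post = (months >= 60).astype(float)                   # programme introduced at month 60
true_log_rate = np.log(200) + 0.6 * np.cos(2 * np.pi * months / 12) - 0.7 * post
y = rng.poisson(np.exp(true_log_rate))                # simulated case counts

X = np.column_stack([
    np.ones(months.size),                             # intercept
    np.cos(2 * np.pi * months / 12),                  # cosinor terms for seasonality
    np.sin(2 * np.pi * months / 12),
    post,                                             # programme indicator
])
fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
rr = np.exp(fit.params[3])
lo, hi = np.exp(fit.conf_int()[3])
print(f"Post-programme rate ratio: {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```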

  17. Sea cliff instability susceptibility at regional scale: a statistically based assessment in the southern Algarve, Portugal

    Science.gov (United States)

    Marques, F. M. S. F.; Matildes, R.; Redweik, P.

    2013-12-01

    Sea cliff evolution is dominated by the occurrence of slope mass movements of different types and sizes, which are a considerable source of natural hazard, making their assessment a relevant issue in terms of human loss prevention and land use regulations. To address the assessment of the spatial component of sea cliff hazards, i.e. the susceptibility, a statistically based study was made to assess the capacity of a set of conditioning factors to express the occurrence of sea cliff failures affecting areas located along their top. The study was based on the application of the bivariate information value and multivariate logistic regression statistical methods, using a set of predisposing factors for cliff failures, mainly related to geology (lithology, bedding dip, faults) and geomorphology (maximum and mean slope, height, aspect, plan curvature, toe protection), which were correlated with a photogrammetry-based inventory of cliff failures that occurred in a 60 yr period (1947-2007). The susceptibility models were validated against the inventory data using standard success rate and ROC curves, and provided encouraging results, indicating that the proposed approaches are effective for susceptibility assessment. The results obtained also stress the need for improvement of the predisposing factors to be used in this type of study and the need for detailed and systematic cliff failure inventories.
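
    The multivariate part of such an analysis can be sketched with a logistic regression fitted to predisposing factors and validated with the area under the ROC curve. The synthetic terrain units and factor names below are assumptions for illustration, not the inventory used in the study.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(1)
        n = 2000

        # Synthetic cliff sections with stand-ins for slope, height and toe protection.
        slope = rng.uniform(20, 90, n)
        height = rng.uniform(5, 60, n)
        toe_protected = rng.integers(0, 2, n)

        # Assumed failure-generating process, used only to create example data.
        logit = -6 + 0.05 * slope + 0.04 * height - 1.0 * toe_protected
        failed = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

        X = np.column_stack([slope, height, toe_protected]).astype(float)
        X_tr, X_te, y_tr, y_te = train_test_split(X, failed, test_size=0.3, random_state=0)

        model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
        auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
        print(f"area under the ROC curve on the validation set: {auc:.2f}")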

  18. GeneMarker® Genotyping Software: Tools to Increase the Statistical Power of DNA Fragment Analysis

    Science.gov (United States)

    Hulce, D.; Li, X.; Snyder-Leiby, T.; Johathan Liu, C.S.

    2011-01-01

    The discriminatory power of post-genotyping analyses, such as kinship or clustering analysis, is dependent on the amount of genetic information obtained from the DNA fragment/genotyping analysis. The number of microsatellite loci amplified in one multiplex is limited by the number of dyes and overlapping loci boundaries, requiring researchers to amplify replicate samples with 2 or more multiplexes in order to obtain a genotype for 12–15 loci. AFLP is another method that is limited by the number of dyes, often requiring multiple amplifications of replicate samples to obtain more complete results. Traditionally, researchers export the genotyping results into a spreadsheet, manually combine the results for each individual and then import them into a third software package for post-genotyping analysis. GeneMarker is highly accurate, user-friendly genotyping software that allows all of these steps to be done in one software package, avoiding potential errors from data transfer between programs and decreasing the amount of time needed to process the results. The Merge Project tool automatically combines the results from replicate samples processed with different primer sets. Replicate animal (diploid) DNA samples were amplified with three different multiplexes, each multiplex providing information on 4–6 loci. The kinship analysis using the merged results provided a 10^17 increase in statistical power, ranging from 10^8 when 5 loci were used to 10^25 when 15 loci were used to determine potential relationship levels with identity-by-descent calculations. These same sample sets were used in clustering analysis to construct dendrograms. The dendrogram based on a single multiplex resulted in three branches at a given Euclidean distance. In comparison, the dendrogram constructed using the merged results had eight branches at the same Euclidean distance.
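
    The gain in discriminatory power from merging loci rests on the fact that, for independent loci, per-locus match probabilities multiply. The sketch below illustrates this with assumed per-locus random-match probabilities; the values are not GeneMarker output, and the orders of magnitude quoted in the record come from the authors' own data.

        import numpy as np

        rng = np.random.default_rng(2)

        # Assumed per-locus random-match probabilities for 15 microsatellite loci.
        per_locus_rmp = rng.uniform(0.05, 0.2, size=15)

        def discrimination(rmp):
            # Discriminatory power grows as the inverse of the product of the
            # per-locus match probabilities, assuming independent loci.
            return 1.0 / np.prod(rmp)

        print("with 5 loci :", f"{discrimination(per_locus_rmp[:5]):.2e}")
        print("with 15 loci:", f"{discrimination(per_locus_rmp):.2e}")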

  19. Probabilistic performance assessment of a coal-fired power plant

    International Nuclear Information System (INIS)

    Highlights: • Power plant equipment is usually oversized to account for input uncertainties. • Oversized equipment degrades its rated efficiency and increases capital cost. • A stochastic methodology to assess probabilities of equipment failure was proposed. • The methodology was proven applicable for design and analysis of the power plants. • Estimated high reliability indices allow reducing power plant equipment oversizing. - Abstract: Despite the low-carbon environmental policies, coal is expected to remain a main source of energy in the coming decades. Therefore, efficient and environmentally friendly power systems are required. A design process based on deterministic models and the application of safety factors leads to equipment oversizing, and hence to a fall in efficiency and an increase in capital and operating costs. In this work, the applicability of a non-intrusive stochastic methodology to determine the probability of power plant equipment failure was investigated. This alternative approach to power plant performance assessment employs approximation methods for the deterministic prediction of the key performance indicators, which are used to estimate reliability indices based on the uncertainty of the input to a process model of the coal-fired power plant. This study revealed that the high reliability indices obtained in the analysis would lead to reduced application of conservative safety factors on the plant equipment, which should result in lower capital and operating costs through a more reliable assessment of its performance state over its service time, and lead to the optimisation of its inspection and maintenance interventions.

  20. No-reference image quality assessment based on nonsubsample shearlet transform and natural scene statistics

    Science.gov (United States)

    Wang, Guan-jun; Wu, Zhi-yong; Yun, Hai-jiao; Cui, Ming

    2016-03-01

    A novel no-reference (NR) image quality assessment (IQA) method is proposed for assessing image quality across multifarious distortion categories. The new method transforms distorted images into the shearlet domain using a non-subsampled shearlet transform (NSST) and designs an image quality feature vector that describes images using natural scene statistics features: the coefficient distribution, energy distribution and structural correlation (SC) across orientations and scales. The final image quality is obtained from distortion classification and regression models trained by a support vector machine (SVM). The experimental results on the LIVE2 IQA database indicate that the method can assess image quality effectively, and that the extracted features are sensitive to the category and severity of distortion. Furthermore, the proposed method is database independent and has a higher correlation rate and lower root mean squared error (RMSE) with respect to human perception than other high-performance NR IQA methods.

  1. Wind Power Assessment Based on a WRF Wind Simulation with Developed Power Curve Modeling Methods

    Directory of Open Access Journals (Sweden)

    Zhenhai Guo

    2014-01-01

    Full Text Available The accurate assessment of wind power potential requires not only detailed knowledge of the local wind resource but also an accurate equivalent power curve for the local wind farm. Although probability distribution functions (pdfs) of the wind speed are commonly used, their seemingly good performance for the distribution may not always translate into an accurate assessment of power generation. This paper contributes to the development of wind power assessment based on wind speed simulation with the weather research and forecasting (WRF) model and two improved power curve modeling methods. Both approaches improve on a power curve model originally fitted with a single-layer feed-forward neural network (SLFN); in addition, a data quality check and outlier detection technique and a directional curve modeling method are adopted to effectively enhance the original model performance. The two proposed methods, named WRF-SLFN-OD and WRF-SLFN-WD, are able to avoid interference from abnormal output and the directional effect of the local wind speed during the power curve modeling process. The data examined are from three stations in northern China; the simulation indicates that the two developed methods have strong abilities to provide a more accurate assessment of the wind power potential compared with the original methods.
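
    A rough sketch of the power curve modeling step is given below: SCADA-like wind speed/power pairs are cleaned with a crude per-bin outlier filter and a single-hidden-layer feed-forward network is fitted as the power curve. The synthetic data, the filter threshold and the network size are assumptions for illustration, not the procedure used by the authors.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(3)
        n = 3000

        # Synthetic wind speed (m/s) and normalized power, with curtailment-like outliers.
        ws = rng.uniform(0, 25, n)
        power = np.clip((ws - 3) ** 3 / (12 - 3) ** 3, 0, 1) + rng.normal(0, 0.03, n)
        outliers = rng.random(n) < 0.05
        power[outliers] *= rng.uniform(0.0, 0.5, outliers.sum())

        # Crude data quality check: drop points far from the median power in each 1 m/s bin.
        bins = np.digitize(ws, np.arange(0, 26, 1.0))
        keep = np.ones(n, dtype=bool)
        for b in np.unique(bins):
            m = bins == b
            keep[m] = np.abs(power[m] - np.median(power[m])) < 0.15

        # Single-layer feed-forward network (SLFN-like) fitted as the power curve.
        slfn = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
        slfn.fit(ws[keep].reshape(-1, 1), power[keep])
        print("predicted normalized power at 8 m/s:", round(float(slfn.predict([[8.0]])[0]), 3))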

  2. New statistical potential for quality assessment of protein models and a survey of energy functions

    Directory of Open Access Journals (Sweden)

    Rykunov Dmitry

    2010-03-01

    Full Text Available Abstract Background Scoring functions, such as molecular mechanics force fields and statistical potentials, are fundamentally important tools in protein structure modeling and quality assessment. Results The performances of a number of publicly available scoring functions are compared with statistical rigor, with an emphasis on knowledge-based potentials. We explored the effect on accuracy of alternative choices for representing interaction center types and of other features of scoring functions, such as using information on solvent accessibility and torsion angles, and accounting for secondary structure preferences and side chain orientation. Partially based on the observations made, we present a novel residue-based statistical potential, which employs a shuffled reference state definition and takes into account the mutual orientation of residue side chains. The atom- and residue-level statistical potentials proposed in this work, together with Linux executables to calculate the energy of a given protein, can be downloaded from http://www.fiserlab.org/potentials. Conclusions Among the most influential terms we observed a critical role of a proper reference state definition and the benefits of including information about the microenvironment of interaction centers. Molecular mechanics potentials were also tested and found to be over-sensitive to small local imperfections in a structure, requiring unfeasibly long energy relaxation before energy scores started to correlate with model quality.

  3. Assessment of the statistics of the Strehl ratio: predictions of central limit theorem analysis

    Science.gov (United States)

    Tyler, Glenn A.

    2006-11-01

    For a beam propagating through turbulence, the statistics of the Strehl ratio are determined by recognizing that the real and imaginary parts of the on-axis far-field pattern can be represented as the sum of many contributions from the aperture. With this in mind, the central limit theorem (CLT) can be used to develop the statistics of the real and imaginary parts of the optical field, which through the appropriate mathematical manipulations as described here can then be used to develop the probability distribution of the far-field irradiance. The results obtained in this way (which we call the CLT theory or analysis) provide an analytic expression that agrees with the results of detailed wave-optics simulations. This provides an approach by which the statistics of the Strehl ratio can be rapidly determined. A key feature of this work is that the analytic results depend on the values of a few relevant turbulence parameters that include r0, fG, and σl^2. Therefore, a measurement of these parameters at various sites of interest allows us to rapidly assess the detailed nature of the statistical fluctuations of the far-field irradiance that will be experienced at these locations.
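
    The central-limit argument can be checked numerically by summing many unit phasors with random phases and examining the resulting on-axis irradiance; the number of aperture contributions and the residual phase variance below are arbitrary assumptions, not values from the paper.

        import numpy as np

        rng = np.random.default_rng(4)

        n_trials = 20000   # independent turbulence realizations
        n_cells = 400      # aperture sub-regions contributing to the on-axis field
        phase_std = 1.2    # assumed residual phase fluctuation per cell (rad)

        # On-axis far-field amplitude: sum of unit phasors with random phases.
        phases = rng.normal(0.0, phase_std, size=(n_trials, n_cells))
        field = np.exp(1j * phases).sum(axis=1) / n_cells

        # By the CLT the real and imaginary parts are approximately Gaussian, so the
        # irradiance (proportional to the Strehl ratio) follows a Rician-type law.
        irradiance = np.abs(field) ** 2
        print("mean irradiance:", irradiance.mean())
        print("normalized fluctuation (std/mean):", irradiance.std() / irradiance.mean())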

  4. Blind image quality assessment using joint statistics of gradient magnitude and Laplacian features.

    Science.gov (United States)

    Xue, Wufeng; Mou, Xuanqin; Zhang, Lei; Bovik, Alan C; Feng, Xiangchu

    2014-11-01

    Blind image quality assessment (BIQA) aims to evaluate the perceptual quality of a distorted image without information regarding its reference image. Existing BIQA models usually predict the image quality by analyzing the image statistics in some transformed domain, e.g., in the discrete cosine transform domain or wavelet domain. Though great progress has been made in recent years, BIQA is still a very challenging task due to the lack of a reference image. Considering that image local contrast features convey important structural information that is closely related to image perceptual quality, we propose a novel BIQA model that utilizes the joint statistics of two types of commonly used local contrast features: 1) the gradient magnitude (GM) map and 2) the Laplacian of Gaussian (LOG) response. We employ an adaptive procedure to jointly normalize the GM and LOG features, and show that the joint statistics of normalized GM and LOG features have desirable properties for the BIQA task. The proposed model is extensively evaluated on three large-scale benchmark databases, and shown to deliver highly competitive performance with state-of-the-art BIQA models, as well as with some well-known full reference image quality assessment models. PMID:25216482
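
    The two local contrast maps used as features can be computed directly with standard filters; the sketch below uses a random array as a stand-in for a luminance image and a simple joint normalization that is only a rough substitute for the adaptive procedure described in the paper.

        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(5)
        image = rng.random((256, 256))        # stand-in for a grayscale image

        sigma = 0.5                           # assumed filter scale
        gm = ndimage.gaussian_gradient_magnitude(image, sigma=sigma)   # GM map
        log = ndimage.gaussian_laplace(image, sigma=sigma)             # LoG response

        # Simple joint normalization, then marginal histograms as quality features.
        norm = np.sqrt(gm ** 2 + log ** 2).mean() + 1e-8
        gm_hist, _ = np.histogram(gm / norm, bins=20, density=True)
        log_hist, _ = np.histogram(log / norm, bins=20, density=True)
        features = np.concatenate([gm_hist, log_hist])
        print("feature vector length:", features.size)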

  5. The Development of Statistics Textbook Supported with ICT and Portfolio-Based Assessment

    Science.gov (United States)

    Hendikawati, Putriaji; Yuni Arini, Florentina

    2016-02-01

    This was development research aimed at producing a Statistics textbook model supported with information and communication technology (ICT) and portfolio-based assessment. The book was designed for students of mathematics at college level, to improve students' ability in mathematical connection and communication. There were three stages in this research, i.e. define, design, and develop. The textbook consisted of 10 chapters, each containing an introduction, core materials, examples and exercises. The development phase began with an initial design of the book (draft 1), which was then validated by experts. Revision of draft 1 produced draft 2, which was then given a limited readability test. Revision of draft 2 produced draft 3, which was trialled on a small sample to produce a valid model textbook. The data were analysed with descriptive statistics. The analysis showed that the Statistics textbook model supported with ICT and portfolio-based assessment is valid and fulfils the criteria of practicality.

  6. Contribution to the power distribution methodology uncertainties assessment

    International Nuclear Information System (INIS)

    The present methodology for safety margins in NPP Dukovany design power distribution calculations is based on the philosophy of engineering factors, with errors defined using the statistical approach of standard (95%) confidence intervals. At the level of the FA power distribution, the normality (normal density distribution) assumed by this approach is tested, and a comparison is provided with errors defined at the 95-percent probability and 95-percent confidence level (in statistical shorthand, 95%/95%). Practical applications are presented for several NPP Dukovany fuel cycles. The paper also deals briefly with the difference between confidence intervals and tolerance intervals, with the density distribution of the mechanical engineering factor variables, and with the treatment of the axial and radial error distributions as a bivariate problem. (Author)

  7. Batch estimation of statistical errors in the Monte Carlo calculation of local powers

    International Nuclear Information System (INIS)

    Highlights: → Batch methodology performs well on practical grounds. → The sample variance without autocorrelation terms is utterly unacceptable. → Non-overlapping and overlapping batch means perform better than standardized time series. → Overlapping batch means can be improved by autocovariance bias correction without the cost of instability. - Abstract: Batch methodology is among the techniques for computing the standard deviation of a sample mean and is applicable to any output series from stationary iteration cycles. In the present article, three forms of the methodology are investigated: non-overlapping batch means (NBM), overlapping batch means (OBM), and standardized time series (STS). In particular, they are applied to the MC calculation of local powers of a pressurized water reactor. The numerical results reveal that the performance of NBM is equivalent to that of OBM, whereas STS performs poorly for small batch sizes. It is also shown that OBM can be improved by the method of autocovariance bias correction. For a computational condition leading to 0.5-1.5% statistical errors, the improved OBM for a batch size of 10% of the stationary iteration cycle length yields 88-103% of the reference value of the standard deviation at tally cells where the sample standard deviation yields 22-36% of the same reference value.
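
    The non-overlapping and overlapping batch means estimators can be sketched as below for a correlated output series; the AR(1) surrogate series and the batch size are assumptions for illustration, not the reactor tallies analysed in the article.

        import numpy as np

        def nbm_stderr(x, b):
            # Non-overlapping batch means estimate of the standard error of the mean.
            k = len(x) // b
            means = x[:k * b].reshape(k, b).mean(axis=1)
            return means.std(ddof=1) / np.sqrt(k)

        def obm_stderr(x, b):
            # Overlapping batch means estimate of the standard error of the mean.
            n = len(x)
            csum = np.concatenate([[0.0], np.cumsum(x)])
            means = (csum[b:] - csum[:-b]) / b            # n - b + 1 overlapping means
            sigma2 = n * b * np.sum((means - x.mean()) ** 2) / ((n - b + 1) * (n - b))
            return np.sqrt(sigma2 / n)

        # AR(1) series as a stand-in for correlated stationary iteration cycles.
        rng = np.random.default_rng(6)
        phi, n = 0.8, 20000
        x = np.empty(n)
        x[0] = rng.normal()
        for i in range(1, n):
            x[i] = phi * x[i - 1] + rng.normal()

        print("naive i.i.d. std. error:", x.std(ddof=1) / np.sqrt(n))   # ignores autocorrelation
        print("NBM std. error         :", nbm_stderr(x, b=200))
        print("OBM std. error         :", obm_stderr(x, b=200))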

  8. Developing a PQ monitoring system for assessing power quality and critical areas detection

    Directory of Open Access Journals (Sweden)

    Miguel Romero

    2011-10-01

    Full Text Available This paper outlines the development of a power quality monitoring system. The system is aimed at assessing power quality and detecting critical areas throughout a distribution system. The system integrates hardware and a software processing tool developed in four main stages. Power quality disturbances are registered by PQ meters and the data are transmitted through a 3G wireless network. These data are processed and filtered in an open source database. Some interesting statistical indices related to voltage sags, swells, flicker and voltage unbalance are obtained. The last stage displays the indices geo-referenced on power quality maps, allowing the identification of critical areas according to different criteria. The results can be analyzed using clustering tools to identify differentiated quality groups in a city. The proposed system is an open source tool useful to electricity utilities for analyzing and managing large amounts of data.

  9. QQ-plots for assessing distributions of biomarker measurements and generating defensible summary statistics.

    Science.gov (United States)

    Pleil, Joachim D

    2016-01-01

    One of the main uses of biomarker measurements is to compare different populations to each other and to assess risk in comparison to established parameters. This is most often done using summary statistics such as central tendency, variance components, confidence intervals, exceedance levels and percentiles. Such comparisons are only valid if the underlying distributional assumptions are correct. This article discusses methodology for interpreting and evaluating data distributions using quantile-quantile plots (QQ-plots), making decisions as to how to treat outliers, interpreting the effects of mixed distributions, and identifying left-censored data. The QQ-plot is shown to be a simple and elegant tool for visual inspection of complex data and for deciding whether summary statistics should be computed after log-transformation. PMID:27491525
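
    A numerical version of the QQ-plot check can be sketched with the probability-plot correlation coefficient, comparing raw and log-transformed values; the biomarker data below are simulated under an assumed log-normal model and are purely illustrative.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        biomarker = rng.lognormal(mean=1.0, sigma=0.6, size=200)   # assumed log-normal data

        # QQ-plot against a normal reference, on the raw and on the log scale.
        (_, _), (_, _, r_raw) = stats.probplot(biomarker, dist="norm")
        (_, _), (_, _, r_log) = stats.probplot(np.log(biomarker), dist="norm")

        print(f"QQ-plot correlation, raw data: {r_raw:.3f}")
        print(f"QQ-plot correlation, log data: {r_log:.3f}")
        # A clearly higher correlation after log-transformation suggests computing the
        # summary statistics (means, percentiles, exceedance levels) on the log scale.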

  10. Integrating Expert Knowledge with Statistical Analysis for Landslide Susceptibility Assessment at Regional Scale

    Directory of Open Access Journals (Sweden)

    Christos Chalkias

    2016-03-01

    Full Text Available In this paper, an integrated landslide susceptibility model combining expert-based and bivariate statistical analysis (Landslide Susceptibility Index—LSI) approaches is presented. Factors related to the occurrence of landslides—such as elevation, slope angle, slope aspect, lithology, land cover, Mean Annual Precipitation (MAP) and Peak Ground Acceleration (PGA)—were analyzed within a GIS environment. This integrated model produced a landslide susceptibility map which categorized the study area according to the probability level of landslide occurrence. The accuracy of the final map was evaluated by Receiver Operating Characteristics (ROC) analysis based on an independent (validation) dataset of landslide events. The prediction ability was found to be 76%, revealing that the integration of statistical analysis with human expertise can provide an acceptable landslide susceptibility assessment at regional scale.
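
    The bivariate information value step can be sketched for a single conditioning factor as below; the class areas and landslide counts are assumed numbers used only to show the calculation.

        import numpy as np

        # One conditioning factor (e.g. lithology) split into classes, with assumed
        # class areas (km^2) and landslide counts per class.
        class_area = np.array([120.0, 340.0, 80.0, 60.0])
        class_slides = np.array([10, 15, 30, 5])

        prior = class_slides.sum() / class_area.sum()      # regional landslide density
        conditional = class_slides / class_area            # density within each class

        info_value = np.log(conditional / prior)           # weight per factor class
        print(np.round(info_value, 2))
        # Positive weights mark classes more landslide-prone than the regional average;
        # summing the weights of all factors for each cell yields the LSI map.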

  11. The Application of Visual Saliency Models in Objective Image Quality Assessment: A Statistical Evaluation.

    Science.gov (United States)

    Zhang, Wei; Borji, Ali; Wang, Zhou; Le Callet, Patrick; Liu, Hantao

    2016-06-01

    Advances in image quality assessment have shown the potential added value of including visual attention aspects in its objective assessment. Numerous models of visual saliency are implemented and integrated in different image quality metrics (IQMs), but the gain in reliability of the resulting IQMs varies to a large extent. The causes and the trends of this variation would be highly beneficial for further improvement of IQMs, but are not fully understood. In this paper, an exhaustive statistical evaluation is conducted to justify the added value of computational saliency in objective image quality assessment, using 20 state-of-the-art saliency models and 12 best-known IQMs. Quantitative results show that the difference in predicting human fixations between saliency models is sufficient to yield a significant difference in performance gain when adding these saliency models to IQMs. However, surprisingly, the extent to which an IQM can profit from adding a saliency model does not appear to have direct relevance to how well this saliency model can predict human fixations. Our statistical analysis provides useful guidance for applying saliency models in IQMs, in terms of the effect of saliency model dependence, IQM dependence, and image distortion dependence. The testbed and software are made publicly available to the research community. PMID:26277009

  12. Use assessment of electronic power sources for SMAW

    OpenAIRE

    Scotti, A.; Gomes, M.; Pereira, J

    1999-01-01

    The aim of the present work was to assess the efficacy of the use of modern power supply technologies for Shielded Metal Arc Welding (SMAW). Test coupons were welded using a series of five different classes of commercial electrodes, covering their current ranges. Both a conventional electromagnetic power source and an electronic (inverter) power source were employed. Fusion rate, deposition efficiency, bead finish and weld geometry were measured in each experiment. Current and voltage signals were ...

  13. Assessing the Short-term Forecasting Power of Confidence Indices

    OpenAIRE

    Euler Pereira G. de Mello; Francisco Marcos R. Figueiredo

    2014-01-01

    This paper assesses the predictive power of the main confidence indices available in Brazil for forecasting economic activity. More specifically, we consider a set of economic activity variables and, for each of these, compare the predictive power of a univariate autoregressive model to that of a similar model that includes a confidence index. Preliminary results using the Diebold-Mariano test suggest that the Industry Confidence Index (ICI) provides relevant information, for both the present and the n...

  14. Mathematical Safety Assessment Approaches for Thermal Power Plants

    OpenAIRE

    Zong-Xiao Yang; Lei Song; Chun-Yang Zhang; Chong Li; Xiao-Bo Yuan

    2014-01-01

    How to use system analysis methods to identify the hazards in the industrialized process, working environment, and production management for complex industrial processes, such as thermal power plants, is one of the challenges in systems engineering. A mathematical system safety assessment model is proposed for thermal power plants in this paper by integrating the fuzzy analytical hierarchy process, set pair analysis, and system functionality analysis. On the basis of these, the key factors in...

  15. Transient Stability Assessment of Power System with Large Amount of Wind Power Penetration

    DEFF Research Database (Denmark)

    Liu, Leo; Chen, Zhe; Bak, Claus Leth;

    2012-01-01

    Recently, the security and stability of power systems with a large amount of wind power have become issues of concern, especially transient stability. In Denmark, onshore and offshore wind farms are connected to the distribution system and the transmission system, respectively. The control and protection methodologies of onshore and offshore wind farms definitely affect the transient stability of the power system. In this paper, the onshore and offshore wind farms are modeled in detail in order to assess the transient stability of the western Danish power system. Further, the computation of the critical clearing time (CCT) in different scenarios is proposed to evaluate the vulnerable areas in the western Danish power system. The resulting CCTs in different scenarios can evaluate the impact of wind power on power system transient stability. Besides, some other influencing factors such as the load level of generators in...

  16. Statistical Approaches Used to Assess the Equity of Access to Food Outlets: A Systematic Review

    Directory of Open Access Journals (Sweden)

    Karen E. Lamb

    2015-07-01

    Full Text Available Background: Inequalities in eating behaviours are often linked to the types of food retailers accessible in neighbourhood environments. Numerous studies have aimed to identify if access to healthy and unhealthy food retailers is socioeconomically patterned across neighbourhoods, and thus a potential risk factor for dietary inequalities. Existing reviews have examined differences between methodologies, particularly focussing on neighbourhood and food outlet access measure definitions. However, no review has informatively discussed the suitability of the statistical methodologies employed; a key issue determining the validity of study findings. Our aim was to examine the suitability of statistical approaches adopted in these analyses. Methods: Searches were conducted for articles published from 2000-2014. Eligible studies included objective measures of the neighbourhood food environment and neighbourhood-level socio-economic status, with a statistical analysis of the association between food outlet access and socio-economic status. Results: Fifty-four papers were included. Outlet accessibility was typically defined as the distance to the nearest outlet from the neighbourhood centroid, or as the number of food outlets within a neighbourhood (or buffer). To assess if these measures were linked to neighbourhood disadvantage, common statistical methods included ANOVA, correlation, and Poisson or negative binomial regression. Although all studies involved spatial data, few considered spatial analysis techniques or spatial autocorrelation. Conclusions: With advances in GIS software, sophisticated measures of neighbourhood outlet accessibility can be considered. However, approaches to statistical analysis often appear less sophisticated. Care should be taken to consider assumptions underlying the analysis and the possibility of spatially correlated residuals which could affect the results.
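
    One of the regression approaches named in the review can be sketched as a negative binomial model of outlet counts against a neighbourhood disadvantage score, with neighbourhood area as an offset. The data-generating process, the dispersion parameter and the variable names are assumptions for illustration; the note on spatial autocorrelation echoes the review's conclusion.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(8)
        n = 500

        ses = rng.normal(0, 1, n)                 # standardized disadvantage score
        log_area = np.log(rng.uniform(1, 5, n))   # log neighbourhood area (km^2)

        # Assumed process: more outlets in larger and more disadvantaged neighbourhoods.
        mu = np.exp(0.5 + 0.3 * ses + log_area)
        outlets = rng.negative_binomial(2, 2.0 / (2.0 + mu))   # overdispersed counts

        fit = sm.GLM(outlets, sm.add_constant(ses),
                     family=sm.families.NegativeBinomial(alpha=0.5),
                     offset=log_area).fit()
        print(fit.params)
        # The standard errors here assume independent neighbourhoods; spatially
        # correlated residuals (e.g. flagged by Moran's I) would invalidate them.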

  17. A Tsunami Fragility Assessment for Nuclear Power Plants in Korea

    International Nuclear Information System (INIS)

    Although tsunami events were defined as an external event in the 'PRA Procedure Guide (NUREG/CR-2300)' after 1982, tsunamis were not considered in the design and construction of NPPs before the Sumatra earthquake in 2004. But the Madras Atomic Power Station, a commercial nuclear power plant owned and operated by the Nuclear Power Corporation of India Limited (NPCIL) and located near Chennai, India, was affected by the tsunami generated by the 2004 Sumatra earthquake (USNRC 2008). The condenser cooling pumps of Unit 2 of the installation were affected due to flooding of the pump house and subsequent submergence of the seawater pumps by tsunami waves. The turbine was tripped and the reactor shut down. The unit was brought to a cold-shutdown state, and the shutdown-cooling systems were reported as operating safely. After this event, tsunami hazards were considered one of the major natural disasters that can affect the safety of nuclear power plants. The IAEA performed an extrabudgetary project on tsunami hazard assessment, and an International Seismic Safety Centre (ISSC) was finally established in the IAEA for protection from natural disasters such as earthquakes and tsunamis. For this reason, a tsunami hazard assessment method was determined in this study. First, a procedure for the tsunami hazard assessment method was established; second, target equipment and structures for the investigation of tsunami hazard assessment were selected. Finally, a sample fragility calculation was performed for one piece of equipment in a nuclear power plant.

  18. Statistics of the Chi-Square Type, with Application to the Analysis of Multiple Time-Series Power Spectra

    CERN Document Server

    Sturrock, P A

    2003-01-01

    It is often necessary to compare the power spectra of two or more time series: one may, for instance, wish to estimate what the power spectrum of the combined data sets might have been, or one may wish to estimate the significance of a particular peak that shows up in two or more power spectra. Also, one may occasionally need to search for a complex of peaks in a single power spectrum, such as a fundamental and one or more harmonics, or a fundamental plus sidebands, etc. Visual inspection can be revealing, but it can also be misleading. This leads one to look for one or more ways of forming statistics, which readily lend themselves to significance estimation, from two or more power spectra. The familiar chi-square statistic provides a convenient mechanism for combining variables drawn from normal distributions, and one may generalize the chi-square statistic to be any function of any number of variables with arbitrary distributions. In dealing with power spectra, we are interested mainly in exponential distri...
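
    A minimal version of such a combined statistic follows from the fact that, under the null hypothesis, a normalized power-spectrum value is exponentially distributed, so twice the sum of the values at a candidate frequency across independent spectra is chi-square distributed. The synthetic values below are assumptions for illustration only.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(9)

        # Normalized power at one candidate frequency in K independent power spectra;
        # under the null each value is exponentially distributed with unit mean.
        K = 3
        powers = rng.exponential(1.0, size=K)

        G = 2.0 * powers.sum()                   # chi-square with 2K degrees of freedom
        p_value = stats.chi2.sf(G, df=2 * K)
        print(f"combined statistic G = {G:.2f}, p-value = {p_value:.3f}")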

  19. Passive system reliability in the nuclear power plants (NPPs) using statistical modeling

    International Nuclear Information System (INIS)

    The probabilistic safety assessment (PSA) has been studied for the very high temperature reactor (VHTR). It is difficult to quantify the PSA owing to the deficiency of operation and experience data, so it is necessary to use statistical data for the basic events. The physical data of the non-linear fuzzy set algorithm are used to quantify the designed case. The mass flow rate in natural circulation is investigated. In addition, the potential energy in gravity, the temperature and pressure in heat conduction, and the heat transfer rate in the internal stored energy are investigated. The values in the probability set and the fuzzy set are compared to explain the failures. The result shows how to use the newly derived failure probability in the propagations. The failure frequencies, which are generated by the GAMMA (GAs Multi-component Mixture Analysis) code, are compared with four failure frequencies obtained by probabilistic and fuzzy methods. The results show that the artificial-intelligence analysis based on the fuzzy set could improve the reliability method compared with the probabilistic analysis.

  20. Independent assessment to continue improvement: Implementing statistical process control at the Hanford Site

    International Nuclear Information System (INIS)

    A Quality Assurance independent assessment has brought about continued improvement in the PUREX Plant surveillance program at the Department of Energy's Hanford Site. After the independent assessment, Quality Assurance personnel were closely involved in improving the surveillance program, specifically regarding storage tank monitoring. The independent assessment activities included reviewing procedures, analyzing surveillance data, conducting personnel interviews, and communicating with management. Process improvement efforts included: (1) designing data collection methods; (2) gaining concurrence between engineering and management; (3) revising procedures; and (4) interfacing with shift surveillance crews. Through this process, Statistical Process Control (SPC) was successfully implemented and surveillance management was improved. The independent assessment identified several deficiencies within the surveillance system. These deficiencies can be grouped into two areas: (1) data recording and analysis and (2) handling off-normal conditions. By using several independent assessment techniques, Quality Assurance was able to point out program weaknesses to senior management and present suggestions for improvements. SPC charting, as implemented by Quality Assurance, is an excellent tool for diagnosing the process, improving communication between the team members, and providing a scientific database for management decisions. In addition, the surveillance procedure was substantially revised. The goals of this revision were to (1) strengthen the role of surveillance management, engineering and operators and (2) emphasize the importance of teamwork for each individual who performs a task. In this instance we believe that the value independent assessment adds to the system is the continuous improvement activities that follow the independent assessment. Excellence in teamwork between the independent assessment organization and the auditee is the key to continuing improvement

  1. Testbeds for Assessing Critical Scenarios in Power Control Systems

    Science.gov (United States)

    Dondossola, Giovanna; Deconinck, Geert; Garrone, Fabrizio; Beitollahi, Hakem

    The paper presents a set of control system scenarios implemented in two testbeds developed in the context of the European Project CRUTIAL - CRitical UTility InfrastructurAL Resilience. The selected scenarios refer to power control systems encompassing information and communication security of SCADA systems for grid teleoperation, impact of attacks on inter-operator communications in power emergency conditions, impact of intentional faults on the secondary and tertiary control in power grids with distributed generators. Two testbeds have been developed for assessing the effect of the attacks and prototyping resilient architectures.

  2. Economic assessment group on power transmission and distribution networks tariffs

    International Nuclear Information System (INIS)

    Facing the new law on electric power market liberalization, the French government created an expert group to analyze solutions and assessment methods for electrical network costs and tariffs and to control their efficiency. This report presents the analysis and the conclusions of the group. It concerns three main subjects: the regulatory context, the tariffing of electric power transmission and distribution (the cost and efficiency of the various options), and the tariffing of electric power supply to eligible consumers. The authors provide a guideline for a tariffing policy. (A.L.B.)

  3. Seismic design of nuclear power plants. An assessment. Final report

    International Nuclear Information System (INIS)

    A review and evaluation of the analytical methods, design methods, and design standards used in the seismic design of nuclear power plants are presented. Three major areas were investigated: (a) soils, siting, and seismic ground motion specification; (b) soil-structure interaction; (c) the response of major nuclear power plant structures and components. The purpose of this review and evaluation program was to prepare an independent assessment of the state-of-the-art of the seismic design of nuclear power plants and to identify seismic analysis and design research areas meriting support by the various organizations comprising the 'nuclear power industry'. Criteria used for evaluating the relative importance of alternative research areas included the potential research impact on nuclear power plant siting, design, construction, cost, safety, licensing, and regulation

  4. Implementation of a Model Output Statistics based on meteorological variable screening for short‐term wind power forecast

    DEFF Research Database (Denmark)

    Ranaboldo, Matteo; Giebel, Gregor; Codina, Bernat

    2013-01-01

    A combination of physical and statistical treatments to post-process numerical weather prediction (NWP) outputs is needed for successful short-term wind power forecasts. One of the most promising and effective approaches for the statistical treatment is the Model Output Statistics (MOS) technique. In this study, a MOS based on multiple linear regression is proposed: the model screens the most relevant NWP forecast variables and selects the best predictors to fit a regression equation that minimizes the forecast errors, utilizing wind farm power output measurements as input. The performance of the method is evaluated in two wind farms, located in different topographical areas and with different NWP grid spacing. Because of the high seasonal variability of NWP forecasts, it was considered appropriate to implement monthly stratified MOS. In both wind farms, the first predictors were always wind speeds (at...
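
    The screening-plus-regression idea can be sketched as below: candidate NWP variables are ranked by correlation with the measured power and the best ones enter a multiple linear regression. The synthetic predictors, their names and the selection rule are assumptions for illustration, not the operational set-up described in the study.

        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(10)
        n = 2000
        names = ["ws_10m", "ws_100m", "wdir", "temp", "pressure", "stability"]

        nwp = rng.normal(0, 1, size=(n, len(names)))                    # NWP forecast variables
        power_obs = 0.9 * nwp[:, 1] + 0.3 * nwp[:, 0] + rng.normal(0, 0.3, n)

        # Screening: keep the two predictors most correlated with the measured output.
        corr = np.abs([np.corrcoef(nwp[:, j], power_obs)[0, 1] for j in range(len(names))])
        keep = np.argsort(corr)[::-1][:2]
        print("selected predictors:", [names[j] for j in keep])

        # Regression equation fitted on a training period, checked on a test period.
        train, test = slice(0, 1500), slice(1500, None)
        mos = LinearRegression().fit(nwp[train][:, keep], power_obs[train])
        rmse = np.sqrt(np.mean((mos.predict(nwp[test][:, keep]) - power_obs[test]) ** 2))
        print(f"test RMSE: {rmse:.3f}")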

  5. Using statistical analysis and artificial intelligence tools for automatic assessment of video sequences

    Science.gov (United States)

    Ekobo Akoa, Brice; Simeu, Emmanuel; Lebowsky, Fritz

    2014-01-01

    This paper proposes two novel approaches to Video Quality Assessment (VQA). Both approaches attempt to develop video evaluation techniques capable of replacing human judgment when rating video quality in subjective experiments. The underlying study consists of selecting fundamental quality metrics based on Human Visual System (HVS) models and using artificial intelligence solutions as well as advanced statistical analysis. This new combination enables suitable video quality ratings while taking multiple quality metrics as input. The first method uses a neural network based machine learning process. The second method evaluates video quality using a non-linear regression model. The efficiency of the proposed methods is demonstrated by comparing their results with those of existing work done on synthetic video artifacts. The results obtained by each method are compared with scores from a database resulting from subjective experiments.

  6. Statistical properties of radiation power levels from a high-gain free-electron laser at and beyond saturation

    International Nuclear Information System (INIS)

    We investigate the statistical properties (e.g., shot-to-shot power fluctuations) of the radiation from a high-gain free-electron laser (FEL) operating in the nonlinear regime. We consider the case of an FEL amplifier reaching saturation whose shot-to-shot fluctuations in input radiation power follow a gamma distribution. We analyze the corresponding output power fluctuations at and beyond first saturation, including beam energy spread effects, and find that there are well-characterized values of undulator length for which the fluctuation level reaches a minimum

  7. The Novel Quantitative Technique for Assessment of Gait Symmetry Using Advanced Statistical Learning Algorithm

    Directory of Open Access Journals (Sweden)

    Jianning Wu

    2015-01-01

    Full Text Available The accurate identification of gait asymmetry is very beneficial to the assessment of at-risk gait in clinical applications. This paper investigated the application of a classification method based on a statistical learning algorithm to quantify gait symmetry, based on the assumption that the degree of intrinsic change in the dynamical system of gait is associated with different statistical distributions of the gait variables from the left and right lower limbs; that is, discriminating the small difference in similarity between the lower limbs is treated as recognizing their different probability distributions. The kinetic gait data of 60 participants were recorded using a strain gauge force platform during normal walking. The classification method is based on an advanced statistical learning algorithm, the support vector machine for binary classification, and is adopted to quantitatively evaluate gait symmetry. The experimental results showed that the proposed method could capture more of the intrinsic dynamic information hidden in the gait variables and recognize the right-left gait patterns with superior generalization performance. Moreover, the proposed technique could identify a small significant difference between the lower limbs when compared to the traditional symmetry index method for gait. The proposed algorithm would become an effective tool for the early identification of gait asymmetry in the elderly in clinical diagnosis.
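
    The classification idea can be sketched by training a support vector machine to separate left-side from right-side stride features; cross-validated accuracy near chance then indicates symmetry. The stride features, their values and the induced asymmetry below are simulated assumptions, not the recorded force-platform data.

        import numpy as np
        from sklearn.svm import SVC
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(11)
        n = 120   # strides per side

        # Hypothetical kinetic features per stride (e.g. peak force, impulse, stance time),
        # with a small systematic shift on the right side to mimic asymmetry.
        left = rng.normal(loc=[1.00, 0.50, 0.62], scale=0.05, size=(n, 3))
        right = rng.normal(loc=[1.04, 0.52, 0.60], scale=0.05, size=(n, 3))

        X = np.vstack([left, right])
        y = np.array([0] * n + [1] * n)          # 0 = left, 1 = right

        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
        acc = cross_val_score(clf, X, y, cv=5).mean()
        print(f"cross-validated left/right accuracy: {acc:.2f}")
        # Accuracy near 0.5 suggests symmetric gait; values well above 0.5 indicate
        # that the two sides follow distinguishable distributions, i.e. asymmetry.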

  8. A study on the reliability and risk assessment of nuclear power plants

    International Nuclear Information System (INIS)

    The final objective of the present study is to establish the foundations both for performing statistical analyses of various failures and potential accidents in nuclear power plants and for assessing probabilistic safety. In order to achieve this objective, we have chosen the review of the state of the art in the related methodologies and the establishment of a reliability analysis method as the study areas for this year. The works performed are summarized here. First, brief reviews of the present status of Probabilistic Risk Assessment and of the development of quantitative safety goals in the United States were completed. It has been identified that Probabilistic Risk Assessment techniques will take an important role in nuclear safety assessment as a supporting tool in the coming years. In order to establish the reliability analysis methodology, a computer code for updating plant-specific reliability data has been developed as a part of this project. A reliability analysis system has been established and used to analyze the auxiliary feedwater system. (Author)

  9. Evaluating the statistical power of detecting changes in the abundance of seabirds at sea

    Energy Technology Data Exchange (ETDEWEB)

    Burton, Niall; Maclean, Ilya; Rehfisch, Mark; Skov, Henrik; Thaxter, Chris

    2011-07-01

    Full text: Offshore wind farms may potentially affect bird populations through the displacement of birds due to the disturbance associated with developments, the barrier they present to migrating birds and to birds commuting between breeding and feeding areas, habitat change/loss and collision mortality. In current impact assessments it is often assumed that all birds that use the area of a proposed offshore wind farm would be displaced following construction, with some birds also displaced from a surrounding buffer zone. However, the extent to which current monitoring schemes are capable of detecting changes in abundance, and options for improving survey protocols, have received little attention. We investigated the likelihood of detecting changes in seabird numbers in UK offshore waters. Using aerial survey data, we simulated 50%, 25% and 10% declines and conducted power analyses to determine the probability that such changes could be detected. Additionally, increases in the duration and frequency of surveying were simulated, and the influence of spatial scale and variability in bird numbers was also investigated. Current monitoring schemes do not provide adequate means of detecting changes in numbers even when declines are in excess of 50% and assumptions regarding certainty are relaxed to less than 80%. Extending the duration and frequency of surveys would increase the probability of detecting changes, but not to a desirable level. The primary reason why there is a low probability of being able to detect consistent changes is that seabirds are inherently prone to fluctuations in numbers. Explaining some of the variability in bird numbers using environmental and hydro-dynamic covariates would increase the power to detect changes. (Author)
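
    The kind of power analysis described here can be sketched with a Monte Carlo simulation in which overdispersed survey counts are drawn before and after an assumed decline and the fraction of significant tests is recorded. The count model, its parameters and the test are assumptions chosen for illustration, not the aerial survey data.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(12)

        def detection_power(decline, n_surveys=10, mean_count=200, dispersion=2.0,
                            alpha=0.05, n_sim=2000):
            # Monte Carlo power to detect a decline between two sets of surveys,
            # assuming overdispersed (negative binomial) counts per survey.
            p_before = dispersion / (dispersion + mean_count)
            p_after = dispersion / (dispersion + mean_count * (1.0 - decline))
            hits = 0
            for _ in range(n_sim):
                before = rng.negative_binomial(dispersion, p_before, n_surveys)
                after = rng.negative_binomial(dispersion, p_after, n_surveys)
                hits += stats.mannwhitneyu(before, after, alternative="greater")[1] < alpha
            return hits / n_sim

        for decline in (0.10, 0.25, 0.50):
            print(f"{decline:.0%} decline -> power {detection_power(decline):.2f}")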

  10. Probabilistic safety assessments of nuclear power plants for low power and shutdown modes

    International Nuclear Information System (INIS)

    Within the past several years the results of nuclear power plant operating experience and performance of probabilistic safety assessments (PSAs) for low power and shutdown operating modes have revealed that the risk from operating modes other than full power may contribute significantly to the overall risk from plant operations. These early results have led to an increased focus on safety during low power and shutdown operating modes and to an increased interest of many plant operators in performing shutdown and low power PSAs. This publication was developed to provide guidance and insights on the performance of PSA for shutdown and low power operating modes. The preparation of this publication was initiated in 1994. Two technical consultants meetings were conducted in 1994 and one in February 1999 in support of the development of this report

  11. Global uncertainty assessment in hydrological forecasting by means of statistical analysis of forecast errors

    Science.gov (United States)

    Montanari, A.; Grossi, G.

    2007-12-01

    It is well known that uncertainty assessment in hydrological forecasting is a topical issue. Already in 1905 W.E. Cooke, who was issuing daily weather forecasts in Australia, stated: "It seems to me that the condition of confidence or otherwise form a very important part of the prediction, and ought to find expression". Uncertainty assessment in hydrology involves the analysis of multiple sources of error. The contribution of these latter to the formation of the global uncertainty cannot be quantified independently, unless (a) one is willing to introduce subjective assumptions about the nature of the individual error components or (b) independent observations are available for estimating input error, model error, parameter error and state error. An alternative approach, that is applied in this study and still requires the introduction of some assumptions, is to quantify the global hydrological uncertainty in an integrated way, without attempting to quantify each independent contribution. This methodology can be applied in situations characterized by limited data availability and therefore is gaining increasing attention by end users. This work aims to propose a statistically based approach for assessing the global uncertainty in hydrological forecasting, by building a statistical model for the forecast error xt,d, where t is the forecast time and d is the lead time. Accordingly, the probability distribution of xt,d is inferred through a non-linear multiple regression, depending on an arbitrary number of selected conditioning variables. These include the current forecast issued by the hydrological model, the past forecast error and internal state variables of the model. The final goal is to indirectly relate the forecast error to the sources of uncertainty, through a probabilistic link with the conditioning variables. Any statistical model is based on assumptions whose fulfilment is to be checked in order to assure the validity of the underlying theory. Statistical

  12. Radiation fields and dose assessments in Korean nuclear power plants.

    Science.gov (United States)

    Kim, Hee Geun; Kong, Tae Young; Jeong, Woo Tae; Kim, Seok Tae

    2011-07-01

    In the primary systems of nuclear power plants (NPPs), various radionuclides including fission products and corrosion products are generated due to the complex water chemistry conditions. In particular, (3)H, (14)C, (58)Co, (60)Co, (137)Cs, and (131)I are important or potential radionuclides with respect to dose assessment for workers and the management of radioactive effluents or dose assessment for the public. In this paper, the dominant contributors to the dose for workers and the public were reviewed and the process of dose assessment attributable to those contributors was investigated. Furthermore, an analysis was carried out on some examples of dose to workers during NPP operation. PMID:21498858

  13. Preliminary regulatory assessment of nuclear power plants vulnerabilities

    International Nuclear Information System (INIS)

    Preliminary attempts to develop models for nuclear regulatory vulnerability assessment of nuclear power plants are presented. The development of the philosophy and computer tools could provide new and important insights for the management of nuclear operators and nuclear regulatory bodies, who face difficult questions about how to assess the vulnerability of nuclear power plants and other nuclear facilities to external and internal threats. In a situation where different and hidden threat sources are dispersed throughout the world, the assessment of the security and safe operation of nuclear power plants is very important. The capability to evaluate plant vulnerability to different kinds of threats, such as human and natural occurrences and terrorist attacks, and the preparation of emergency response plans and estimation of costs are of vital importance for the assurance of national security. On the basis of such vital insights, nuclear operators and nuclear regulatory bodies could plan and optimise changes in oversight procedures, organisations, equipment, hardware and software to reduce risks, taking into account the security and safety of nuclear power plant operation, budget, manpower, and other limitations. Initial qualitative estimations of adapted assessments for nuclear applications are briefly presented. (author)

  14. Performance evaluation of hydrological models: Statistical significance for reducing subjectivity in goodness-of-fit assessments

    Science.gov (United States)

    Ritter, Axel; Muñoz-Carpena, Rafael

    2013-02-01

    similar goodness-of-fit indicators but distinct statistical interpretation, and others to analyze the effects of outliers, model bias and repeated data. This work does not intend to dictate rules on model goodness-of-fit assessment. It aims to provide modelers with improved, less subjective and practical model evaluation guidance and tools.

  15. Assessing Fire Weather Index using statistical downscaling and spatial interpolation techniques in Greece

    Science.gov (United States)

    Karali, Anna; Giannakopoulos, Christos; Frias, Maria Dolores; Hatzaki, Maria; Roussos, Anargyros; Casanueva, Ana

    2013-04-01

    Forest fires have always been present in the Mediterranean ecosystems, thus they constitute a major ecological and socio-economic issue. Over the last few decades, though, the number of forest fires has significantly increased, as has their severity and impact on the environment. Local fire danger projections are often required when dealing with wildfire research. In the present study the application of statistical downscaling and spatial interpolation methods was performed to the Canadian Fire Weather Index (FWI), in order to assess forest fire risk in Greece. The FWI is used worldwide (including the Mediterranean basin) to estimate the fire danger in a generalized fuel type, based solely on weather observations. The meteorological inputs to the FWI System are noon values of dry-bulb temperature, air relative humidity, 10 m wind speed and precipitation during the previous 24 hours. The statistical downscaling methods are based on a statistical model that takes into account empirical relationships between large scale variables (used as predictors) and local scale variables. In the framework of the current study the statistical downscaling portal developed by the Santander Meteorology Group (https://www.meteo.unican.es/downscaling) in the framework of the EU project CLIMRUN (www.climrun.eu) was used to downscale non-standard parameters related to forest fire risk. In this study, two different approaches were adopted. Firstly, the analogue downscaling technique was directly performed on the FWI index values, and secondly the same downscaling technique was performed indirectly through the meteorological inputs of the index. In both cases, the statistical downscaling portal was used considering the ERA-Interim reanalysis as predictands due to the lack of observations at noon. Additionally, a three-dimensional (3D) interpolation method of position and elevation, based on Thin Plate Splines (TPS), was used to interpolate the ERA-Interim data used to calculate the index

  16. Wind power planning: assessing long-term costs and benefits

    International Nuclear Information System (INIS)

    In the following paper, a new and straightforward technique for estimating the social benefit of large-scale wind power production is presented. The social benefit is based upon wind power's energy and capacity services and the avoidance of environmental damages. The approach uses probabilistic load duration curves to account for the stochastic interaction between wind power availability, electricity demand, and conventional generator dispatch. The model is applied to potential offshore wind power development to the south of Long Island, NY. If natural gas combined cycle and integrated gasifier combined cycle (IGCC) are the alternative generation sources, wind power exhibits a negative social benefit due to its high capacity cost and the relatively low emissions of these advanced fossil-fuel technologies. Environmental benefits increase significantly if charges for CO2 emissions are included. Results also reveal a diminishing social benefit as wind power penetration increases. The dependence of wind power benefits on CO2 charges, and capital costs for wind turbines and IGCC plant is also discussed. The methodology is intended for use by energy planners in assessing the social benefit of future investments in wind power

  17. Dynamic security risk assessment and optimization of power transmission system

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The paper presents a practical dynamic security region (PDSR) based dynamic security risk assessment and optimization model for power transmission systems. The cost of comprehensive security control and the influence of uncertainties of power injections are considered in the model of dynamic security risk assessment. The transient stability constraints and uncertainties of power injections can easily be considered by the PDSR in the form of a hyper-box. A method to define and classify the contingency set is presented, and a risk control optimization model is given which takes the total dynamic insecurity risk as the objective function for a dominant contingency set. An optimal solution of dynamic insecurity risk is obtained by optimizing preventive and emergency control cost and contingency set decomposition. The effectiveness of this model has been proved by test results on the New England 10-generator 39-bus system.

  18. Social assessment of wind power. Part 2: Environmental evaluation of wind power

    International Nuclear Information System (INIS)

    This report focuses on the environmental aspects of wind power. The main aim of this report is to look at some of the most important environmental effects, to evaluate (monetize) these effects - if possible - and add these figures to the production cost estimates. It is not the intention to include all environmental aspects in the environmental assessment, but to analyse some of the most important, that is: the reduction of CO2, SO2 and NOx emitted to the atmosphere, e.g. from central power plants, and the noise and visual effects from wind mills. Further, this report also assesses the influence of wind power on sustainability, i.e. that wind power, compared to power based on coal and natural gas, is a renewable energy resource. The valuation of the environmental effects is limited to the national effects. Since the valuation of the CO2 emissions is crucial to the main conclusion, some of the fundamental assumptions are: the Danish CO2-reduction goal is reached by using national abatement means, and therefore international agreements on distributing CO2 standards (joint implementation) are neglected; and the power plants only produce electricity - not combined heat and power. If it were possible to utilize the production of heat, the environmental benefits from wind power would be reduced. It is uncertain whether the Danish empirical investigations included in this report encompass the optimal abatement means and use them to the right extent. (EG) 55 refs

  19. A Statistical Approach to Planning Reserved Electric Power for Railway Infrastructure Administration

    Czech Academy of Sciences Publication Activity Database

    Brabec, Marek; Pelikán, Emil; Konár, Ondřej; Kasanický, Ivan; Juruš, Pavel; Sadil, J.; Blažek, P.

    Prague : CTU - Faculty of Transportation Sciences, 2013 - (Votruba, Z.; Jeřábek, M.), s. 311-318 ISBN 978-80-01-05320-1 Institutional support: RVO:67985807 Keywords: reserved capacity planning * railway infrastructure * statistical modeling * extremal distribution * customized loss function Subject RIV: BB - Applied Statistics, Operational Research

  20. Complementary assessment of the safety of French nuclear power plants

    International Nuclear Information System (INIS)

    As an immediate consequence of the Fukushima accident, the French nuclear safety authority (ASN) asked EDF to perform a complementary safety assessment for each nuclear power plant dealing with 3 points: 1) the consequences of exceptional natural disasters, 2) the consequences of total loss of electrical power, and 3) the management of emergency situations. The safety margin has to be assessed considering 3 main points: first, a review of the conformity to the initial safety requirements; secondly, the resistance to events exceeding those the facility was designed to withstand; and thirdly, the feasibility of any modification likely to improve the safety of the facility. This article details the specifications of such an assessment, the methodology followed by EDF, the task organization and the time schedule. (A.C.)

  1. Shielding assessment of the ETRR-1 Reactor Under power upgrading

    International Nuclear Information System (INIS)

    The assessment of the existing shielding of the ETRR-1 reactor in case of power upgrading is presented and discussed. It was carried out using both the present EK-10 type fuel elements and some other types of fuel elements with different enrichments. The shielding requirements for the ETRR-1 when power is upgraded are also discussed. The optimization curves between the upgraded reactor power and the shield thickness are presented. The calculations have been made using the ANISN code with the DLC-75 data library. The results showed that the present shield requires an additional layer of steel with a thickness of 10, 20 and 25 cm when its power is upgraded to 3, 6 and 10 MWt, respectively, in order to cut off all neutron energy groups and remain adequately safe under normal operating conditions. 4 figs

  2. Life Cycle Assessment of Coal-fired Power Production; TOPICAL

    International Nuclear Information System (INIS)

    Coal has the largest share of utility power generation in the US, accounting for approximately 56% of all utility-produced electricity (US DOE, 1998). Therefore, understanding the environmental implications of producing electricity from coal is an important component of any plan to reduce total emissions and resource consumption. A life cycle assessment (LCA) on the production of electricity from coal was performed in order to examine the environmental aspects of current and future pulverized coal boiler systems. Three systems were examined: (1) a plant that represents the average emissions and efficiency of currently operating coal-fired power plants in the US (this tells us about the status quo), (2) a new coal-fired power plant that meets the New Source Performance Standards (NSPS), and (3) a highly advanced coal-fired power plant utilizing a low emission boiler system (LEBS)

  3. Determining the Suitability of Two Different Statistical Techniques in Shallow Landslide (Debris Flow) Initiation Susceptibility Assessment in the Western Ghats

    OpenAIRE

    M. V. Ninu Krishnan; P. Pratheesh; Rejith, P. G.; H. Vijith

    2015-01-01

    In the present study, the Information Value (InfoVal) and the Multiple Logistic Regression (MLR) methods based on bivariate and multivariate statistical analysis have been applied for shallow landslide initiation susceptibility assessment in a selected subwatershed in the Western Ghats, Kerala, India, to determine the suitability of geographical information systems (GIS) assisted statistical landslide susceptibility assessment methods in the data constrained regions. The different landslide c...

  4. Steady state security assessment in deregulated power systems

    Science.gov (United States)

    Manjure, Durgesh Padmakar

    Power system operations are undergoing changes, brought about primarily due to deregulation and subsequent restructuring of the power industry. The primary intention of the introduction of deregulation in power systems was to bring about competition and improved customer focus. The underlying motive was increased economic benefit. Present day power system analysis is much different than what it was earlier, essentially due to the transformation of the power industry from being cost-based to one that is price-based and due to open access of transmission networks to the various market participants. Power is now treated as a commodity and is traded in an open market. The resultant interdependence of the technical criteria and the economic considerations has only accentuated the need for accurate analysis in power systems. The main impetus in security analysis studies is on efficient assessment of the post-contingency status of the system, accuracy being of secondary consideration. In most cases, given the time frame involved, it is not feasible to run a complete AC load flow for determining the post-contingency state of the system. Quite often, it is not warranted as well, as an indication of the state of the system is desired rather than the exact quantification of the various state variables. With the inception of deregulation, transmission networks are subjected to a host of multilateral transactions, which would influence physical system quantities like real power flows, security margins and voltage levels. For efficient asset utilization and maximization of the revenue, more often than not, transmission networks are operated under stressed conditions, close to security limits. Therefore, a quantitative assessment of the extent to which each transaction adversely affects the transmission network is required. This needs to be done accurately as the feasibility of the power transactions and subsequent decisions (execution, curtailment, pricing) would depend upon the
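
    The abstract notes that full AC load flows are often too slow for post-contingency screening, so approximate indications of the post-contingency state are used instead. The sketch below shows one common approximation of that kind, a lossless DC power flow on a small invented 4-bus network, re-solved with one branch removed; it is a generic illustration, not the dissertation's method, and all bus data are hypothetical.

```python
import numpy as np

def dc_flows(lines, injections, n_bus, slack=0):
    """Approximate line flows with a lossless DC power flow (all data illustrative)."""
    B = np.zeros((n_bus, n_bus))
    for i, j, x in lines:                      # line from bus i to bus j with reactance x (p.u.)
        B[i, i] += 1 / x; B[j, j] += 1 / x
        B[i, j] -= 1 / x; B[j, i] -= 1 / x
    keep = [b for b in range(n_bus) if b != slack]
    theta = np.zeros(n_bus)
    theta[keep] = np.linalg.solve(B[np.ix_(keep, keep)], injections[keep])
    return {(i, j): (theta[i] - theta[j]) / x for i, j, x in lines}

# 4-bus test case: generation at buses 0 and 1, load at buses 2 and 3 (p.u.).
inj = np.array([1.0, 0.5, -0.9, -0.6])
base = [(0, 1, 0.1), (0, 2, 0.2), (1, 3, 0.2), (2, 3, 0.1)]

print("Base-case flows:", dc_flows(base, inj, 4))
# Screen the loss of line (0, 2): re-solve with that branch removed.
outage = [l for l in base if l[:2] != (0, 2)]
print("Post-contingency flows:", dc_flows(outage, inj, 4))
```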

  5. Assessment of metals bioavailability to vegetables under field conditions using DGT, single extractions and multivariate statistics

    Directory of Open Access Journals (Sweden)

    Senila Marin

    2012-10-01

    Full Text Available Abstract Background The bioavailability of metals in soils is commonly assessed by chemical extractions; however, a generally accepted method is not yet established. In this study, the effectiveness of the Diffusive Gradients in Thin-films (DGT) technique and single extractions in the assessment of metals bioaccumulation in vegetables, and the influence of soil parameters on phytoavailability, were evaluated using multivariate statistics. Soil and plants grown in vegetable gardens from mining-affected rural areas, NW Romania, were collected and analysed. Results Pseudo-total metal contents of Cu, Zn and Cd in soil ranged between 17.3-146 mg kg-1, 141–833 mg kg-1 and 0.15-2.05 mg kg-1, respectively, showing enriched contents of these elements. High degrees of metal extractability in 1M HCl and even in 1M NH4Cl were observed. Despite the relatively high total metal concentrations in soil, those found in vegetables were comparable to values typically reported for agricultural crops, probably due to the low concentrations of metals in soil solution (Csoln) and low effective concentrations (CE) assessed by the DGT technique. Among the analysed vegetables, the highest metal concentrations were found in carrot roots. By applying multivariate statistics, it was found that CE, Csoln and extraction in 1M NH4Cl were better predictors of metal bioavailability than the acid extractions applied in this study. Copper transfer to vegetables was strongly influenced by soil organic carbon (OC) and cation exchange capacity (CEC), while pH had a higher influence on Cd transfer from soil to plants. Conclusions The results showed that DGT can be used for a general evaluation of the risks associated with soil contamination with Cu, Zn and Cd in field conditions, although quantitative information on metal transfer from soil to vegetables was not obtained.

  6. Self-assessment of operational safety for nuclear power plants

    International Nuclear Information System (INIS)

    Self-assessment processes have been continuously developed by nuclear organizations, including nuclear power plants. Currently, the nuclear industry and governmental organizations are showing an increasing interest in the implementation of this process as an effective way for improving safety performance. Self-assessment involves the use of different types of tools and mechanisms to assist the organizations in assessing their own safety performance against given standards. This helps to enhance the understanding of the need for improvements, the feeling of ownership in achieving them and the safety culture as a whole. Although the primary beneficiaries of the self-assessment process are the plant and operating organization, the results of the self-assessments are also used, for example, to increase the confidence of the regulator in the safe operation of an installation, and could be used to assist in meeting obligations under the Convention on Nuclear Safety. Such considerations influence the form of assessment, as well as the type and detail of the results. The concepts developed in this report present the basic approach to self-assessment, taking into consideration experience gained during Operational Safety Review Team (OSART) missions, from organizations and utilities which have successfully implemented parts of a self-assessment programme and from meetings organized to discuss the subject. This report will be used in IAEA sponsored workshops and seminars on operational safety that include the topic of self-assessment

  7. SIESE - trimestrial bulletin - Synthesis 1994. Electric power summary statistics for Brazil

    International Nuclear Information System (INIS)

    The performance of the power system of all the Brazilian electrical utilities is presented. The data is given for each region in the country and includes, among other things, the electric power consumption and generation; the number of consumers and the electric power rates. 10 figs., 42 tabs

  8. Efforts to utilize risk assessment at nuclear power plants

    International Nuclear Information System (INIS)

    Risk assessment means using the outputs obtained through risk identification and risk analysis (risk information) and determining the response policy by comparing these outputs with risk judgement criteria. This paper discusses the multifaceted use of risk information and its significance, and the challenges to its further penetration. As lessons learnt from past accidents and their risk assessments, this paper takes up the severe accidents at the Three Mile Island, Chernobyl, and Fukushima Daiichi power stations, and discusses their causes and aggravating factors. In particular, at the Fukushima Daiichi Nuclear Power Station, the important lessons were the shortage of measures against the combination of earthquake and tsunami and the insufficient use of risk assessment. The paper classifies risk assessment from the viewpoint of risk information, and presents the contents and indexes for risk reduction trends, risk increase trends, and measures graded according to the importance of risk. As the benefits of risk assessment activities, the paper refers to application cases of probabilistic risk assessment (PRA) reported by the IAEA, and summarizes the application of 10 risk indexes, classifying them into safety benefits and operational benefits. For example, for flexible Allowed Outage Time (AOT), the avoidance of plant shutdown and the improved flexibility of maintenance scheduling at a plant correspond to these two benefits, respectively. (A.O.)

  9. Assessment of climate change statistical downscaling methods: Application and comparison of two statistical methods to a single site in Lisbon

    OpenAIRE

    Lopes, Pedro Miguel de Almeida Garrett Graça

    2008-01-01

    Climate change impacts are very dependent on regional geographical features, local climate variability, and socio-economic conditions. Impact assessment studies on climate change should therefore be performed at the local or at most at the regional level for the evaluation of possible consequences. However, climate scenarios are produced by Global Circulation Models for the entire Globe with spatial resolutions of several hundred kilometres. For this reason, downscaling methods are need...

  10. Safety assessment of emergency electric power systems for nuclear power plants

    International Nuclear Information System (INIS)

    This paper is intended to assist the safety assessor within a regulatory body, or one working as a consultant, in assessing a given design of the Emergency Electrical Power System. Those non-electric power systems which may be used in a plant design to serve as emergency energy sources are addressed only in their general safety aspects. The paper thus relates closely to Safety Series 50-SG-D7 ''Emergency Power Systems at Nuclear Power Plants'' (1982), as far as it addresses emergency electric power systems. Several aspects are dealt with: the information the assessor may expect from the applicant to fulfill his task of safety review; the main questions the reviewer has to answer in order to determine the compliance with requirements of the NUSS documents; the national or international standards which give further guidance on a certain system or piece of equipment; comments and suggestions which may help to judge a variety of possible solutions

  11. A power comparison of generalized additive models and the spatial scan statistic in a case-control setting

    Directory of Open Access Journals (Sweden)

    Ozonoff Al

    2010-07-01

    Full Text Available Abstract Background A common, important problem in spatial epidemiology is measuring and identifying variation in disease risk across a study region. In application of statistical methods, the problem has two parts. First, spatial variation in risk must be detected across the study region and, second, areas of increased or decreased risk must be correctly identified. The location of such areas may give clues to environmental sources of exposure and disease etiology. One statistical method applicable in spatial epidemiologic settings is a generalized additive model (GAM) which can be applied with a bivariate LOESS smoother to account for geographic location as a possible predictor of disease status. A natural hypothesis when applying this method is whether residential location of subjects is associated with the outcome, i.e. is the smoothing term necessary? Permutation tests are a reasonable hypothesis testing method and provide adequate power under a simple alternative hypothesis. These tests have yet to be compared to other spatial statistics. Results This research uses simulated point data generated under three alternative hypotheses to evaluate the properties of the permutation methods and compare them to the popular spatial scan statistic in a case-control setting. Case 1 was a single circular cluster centered in a circular study region. The spatial scan statistic had the highest power though the GAM method estimates did not fall far behind. Case 2 was a single point source located at the center of a circular cluster and Case 3 was a line source at the center of the horizontal axis of a square study region. Each had linearly decreasing log odds with distance from the point. The GAM methods outperformed the scan statistic in Cases 2 and 3. Comparing sensitivity, measured as the proportion of the exposure source correctly identified as high or low risk, the GAM methods outperformed the scan statistic in all three Cases. Conclusions The GAM
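
    To make the permutation-testing idea concrete, the sketch below simulates case-control points with a central circular cluster and permutes case labels to obtain a p-value. To stay self-contained it permutes a simple nearest-neighbour case-clustering statistic rather than the GAM/LOESS deviance used in the paper; the cluster geometry, sample size and statistic are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(11)

# Simulated case-control points with a circular cluster of elevated risk at the centre.
n = 400
coords = rng.uniform(-1, 1, size=(n, 2))
risk = 0.2 + 0.4 * (np.linalg.norm(coords, axis=1) < 0.3)
labels = (rng.uniform(size=n) < risk).astype(int)

# Precompute each point's k nearest neighbours (geometry is fixed under permutation).
k = 10
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
np.fill_diagonal(d, np.inf)
nn = np.argsort(d, axis=1)[:, :k]

def case_clustering(lbl):
    """Average fraction of cases among each case's k nearest neighbours
    (a simple stand-in for the GAM deviance statistic permuted in the paper)."""
    return lbl[nn[lbl == 1]].mean()

observed = case_clustering(labels)
perms = np.array([case_clustering(rng.permutation(labels)) for _ in range(999)])
p_value = (1 + np.sum(perms >= observed)) / (len(perms) + 1)
print(f"observed statistic {observed:.3f}, permutation p-value {p_value:.3f}")
```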

  12. Micro and mini hydroelectric power assessment in Uruguay

    International Nuclear Information System (INIS)

    The School of Engineering in Montevideo, Uruguay, within the framework of Agreements made with the National Utility, has carried out an assessment of the potential and studies of the feasibility of the use of renewable energy for the generation of electrical power, both at the industrial level and the autonomous level for rural electrification. Original assessment methodologies were developed, including calculation tools which allow, for example, to analyze historical meteorological data, to calculate the available energy in different kinds of energy generators and also to stimulate the operation and design of autonomous systems with established load requirements and service quality. At the micro and mini hydropower assessment, the main role was placed on the census of potential users and the preliminary analysis of the representative places for the different technical solutions adequate to the variety of topographic conditions and load requirements. For power above 1 MW and up to 5 MW, the generating potential was assessed all over the country. If power lower than 1 MW or lower than 100kW (mini and micro) is considered, the information available in maps with contour lines, including in those of a 1:50,000 scale, is not enough to identify the most adequate places. Instead, knowledge of the place is indispensable in these cases. A preliminary plan of several installations was worked out. (Author)

  13. Statistical Dimensioning of Nutrient Loading Reduction: LLR Assessment Tool for Lake Managers

    Science.gov (United States)

    Kotamäki, Niina; Pätynen, Anita; Taskinen, Antti; Huttula, Timo; Malve, Olli

    2015-08-01

    Implementation of the EU Water Framework Directive (WFD) has set a great challenge on river basin management planning. Assessing the water quality of lakes and coastal waters as well as setting the accepted nutrient loading levels requires appropriate decision supporting tools and models. Uncertainty that is inevitably related to the assessment results and arises from several sources calls for more precise quantification and consideration. In this study, we present a modeling tool, called lake load response (LLR), which can be used for statistical dimensioning of the nutrient loading reduction. LLR calculates the reduction that is needed to achieve good ecological status in a lake in terms of total nutrients and chlorophyll a (chl-a) concentration. We show that by combining an empirical nutrient retention model with a hierarchical chl-a model, the national lake monitoring data can be used more efficiently for predictions to a single lake. To estimate the uncertainties, we separate the residual variability and the parameter uncertainty of the modeling results with the probabilistic Bayesian modeling framework. LLR has been developed to answer the urgent need for fast and simple assessment methods, especially when implementing WFD at such an extensive scale as in Finland. With a case study for a eutrophic Finnish lake, we demonstrate how the model can be utilized to set the target loadings and to see how the uncertainties are quantified and how they accumulate within the modeling chain.

  14. Slovenian national landslide database as a basis for statistical assessment of landslide phenomena in Slovenia

    Science.gov (United States)

    Komac, Marko; Hribernik, Katarina

    2015-11-01

    Landslide databases on a national scale are an important tool for good spatial planning and for planning prevention measures or remediation activities. We have developed a modern national landslide database that enabled a better understanding of landslide occurrence, and will in the future help to assess landslide hazard, risk and potential damage, and enable more efficient landslide mitigation. In the paper, landslide database construction steps and their properties are described. Following the collection of the landslide data from various sources and their input into the database, the consistency of the database was assessed. Based on the data collected, we have assessed basic statistical landslide properties, such as their overall spatial distribution, size and volume and the relation between them, landslide distribution in relation to engineering-geological units and different land use, and past landslide mitigation activities. Analysis of landslide distribution also indicated areas in Slovenia where no landslide mapping was performed in the past, yet it should be, due to the high landslide susceptibility of these areas. Consequently, future national activities in relation to landslide problems should be governed primarily based on the findings of the database analyses to achieve the highest efficiency.

  15. Probabilistic assessment of power system transient stability incorporating SMES

    Science.gov (United States)

    Fang, Jiakun; Yao, Wei; Wen, Jinyu; Cheng, Shijie; Tang, Yuejin; Cheng, Zhuo

    2013-01-01

    This paper presents a stochastic-based approach to evaluate the probabilistic transient stability index of the power system incorporating the wind farm and the SMES. Uncertain factors include both sequence of disturbance in power grid and stochastic generation of the wind farm. The spectrums of disturbance in the grid as the fault type, the fault location, the fault clearing time and the automatic reclosing process with their probabilities of occurrence are used to calculate the probability indices, while the wind speed statistics and parameters of the wind generator are used in a Monte Carlo simulation to generate samples for the studies. With the proposed method, system stability is "measured". Quantitative relationship of penetration level, SMES coil size and system stability is established. Considering the stability versus coil size to be the production curve, together with the cost function, the coil size is optimized economically.
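
    The Monte Carlo layer of such a study can be sketched in a few lines: draw the fault type, location, clearing time and wind speed from assumed distributions, run a transient simulation for each sample, and report the fraction of stable outcomes as the probabilistic stability index. In the illustration below the time-domain simulation is replaced by a toy stand-in function, and every distribution, parameter and the SMES effect are invented; only the sampling structure reflects the approach described above.

```python
import numpy as np

rng = np.random.default_rng(42)

def is_stable(fault_type, location, clearing_time, wind_speed, smes_mj):
    """Stand-in for a time-domain transient simulation (purely illustrative logic)."""
    severity = {"1ph": 0.3, "2ph": 0.6, "3ph": 1.0}[fault_type]
    margin = (0.35 + 0.002 * smes_mj - severity * clearing_time * 2.0
              - 0.004 * wind_speed - 0.001 * location)
    return margin > 0

def transient_stability_index(n_samples=20000, smes_mj=50):
    stable = 0
    for _ in range(n_samples):
        fault_type = rng.choice(["1ph", "2ph", "3ph"], p=[0.7, 0.2, 0.1])
        location = rng.integers(1, 40)                    # faulted line, 39-bus style system
        clearing_time = rng.uniform(0.08, 0.20)           # s
        wind_speed = rng.weibull(2.0) * 9.0               # m/s
        stable += is_stable(fault_type, location, clearing_time, wind_speed, smes_mj)
    return stable / n_samples

for size in (0, 50, 100):
    print(f"SMES {size:>3} MJ -> P(stable) = {transient_stability_index(smes_mj=size):.3f}")
```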

  16. Statistical annual report 2008 - Furnas Electric Power Plants Inc. - Calendar year 2007

    International Nuclear Information System (INIS)

    This 30th edition of the statistical annual report of Furnas reports the performance of the company in 2007 and in recent years, providing a general view of: the Furnas system; production and supply; financial and economic data; personnel and indicators

  17. PowerStaTim 1.0 – a new statistical program for computing effect size and statistical power

    OpenAIRE

    Sava, Florin A.; Laurențiu P. Maricuțoiu

    2008-01-01

    The present paper presents the main characteristics of a new software for computing effect size and statistical power indicators: PowerStaTim 1.0 (Maricuțoiu & Sava, 2007). The first part of the present paper presents the rationale for computing effect size and statistical power in psychological research. The second part of the article introduces the reader to the technical characteristics of PowerStaTim 1.0 and to the processing options of this software.

  18. Accelerating pairwise statistical significance estimation for local alignment by harvesting GPU's power

    OpenAIRE

    Zhang, Yuhong; Misra, Sanchit; Agrawal, Ankit; Patwary, Md Mostofa Ali; Liao, Wei-keng; Qin, Zhiguang; Choudhary, Alok

    2012-01-01

    Background Pairwise statistical significance has been recognized to be able to accurately identify related sequences, which is a very important cornerstone procedure in numerous bioinformatics applications. However, it is both computationally and data intensive, which poses a big challenge in terms of performance and scalability. Results We present a GPU implementation to accelerate pairwise statistical significance estimation of local sequence alignment using standard substitution matrices. ...

  19. The significance of structural power in Strategic Environmental Assessment

    International Nuclear Information System (INIS)

    This article presents a study of how power dynamics enables and constrains the influence of actors upon decision-making and Strategic Environmental Assessment (SEA). Based on structuration theory, a model for studying power dynamics in strategic decision-making processes is developed. The model is used to map and analyse key decision arenas in the decision process of aluminium production in Greenland. The analysis shows that communication lines are an important resource through which actors exercise power and influence decision-making on the location of the aluminium production. The SEA process involved not only reproduction of formal communication and decision competence but also production of alternative informal communication structures in which the SEA had the capability to influence. It is concluded that actors influence strategic decision-making, and attention needs to be paid not only to the formal interactions between the SEA process and the strategic decision-making process but also to informal interaction and communication between actors, as these informal structures can be crucial to the outcome of the decision-making process. This article is meant as a supplement to the understanding of the influence of power dynamics in IA processes and as a contribution to the IA research field, offering a method to analyse power dynamics in strategic decision-making processes. The article also offers reflections on the strengths and weaknesses of using structuration theory as an approach to power analysis. - Highlights: ► Informal interaction influenced process despite the presence of formalised rules. ► Interdependence of actors influenced SEA effectiveness. ► SEA practitioners successfully exercised power to influence decision-making. ► Power dynamics are properties of actors' interactions in decision-making. ► Power structures can be enabling and not solely limiting.

  20. Sustainability assessment of renewable power and heat generation technologies

    International Nuclear Information System (INIS)

    Rationalisation of consumption, more efficient energy usage and a new energy structure need to be achieved in order to shift the structure of the energy system towards sustainability. The required energy system is among others characterised by intensive utilisation of renewable energy sources (RES). RES technologies have their own advantages and disadvantages. Nevertheless, for strategic planning there is a great demand for the comparison of RES technologies. Furthermore, additional benefits of RES utilisation are expected beyond climate change mitigation, e.g. increased employment, economic growth and rural development. The aim of the study was to reveal the most beneficial RES technologies with special respect to sustainability. Ten technologies of power generation and seven technologies of heat supply were examined in a multi-criteria sustainability assessment frame of seven attributes which were evaluated based on a choice experiment (CE) survey. According to experts, the most important characteristics of RES utilisation technologies are land demand and social impacts, i.e. increase in employment and local income generation. Concentrated solar power (CSP), hydropower and geothermal power plants are favourable technologies for power generation, while geothermal district heating, pellet-based non-grid heating and solar thermal heating can offer significant advantages in the case of heat supply. - Highlights: • We used a choice experiment to estimate the weights of criteria for the sustainability assessment of RES technologies. • The most important attributes of RES technologies according to experts are land demand and social impacts. • Concentrated solar power (CSP), hydropower and geothermal power plants are advantageous technologies for power generation. • Geothermal district heating, pellet-based non-grid heating and solar thermal heating are favourable in the case of heat supply

  1. A statistical method for assessing network stability using the Chow test.

    Science.gov (United States)

    Sotirakopoulos, Kostas; Barham, Richard; Piper, Ben; Nencini, Luca

    2015-10-01

    A statistical method is proposed for the assessment of stability in noise monitoring networks. The technique makes use of a variation of the Chow test applied between multiple measurement nodes placed at different locations, and its novelty lies in the way it utilises a simple statistical test based on linear regression to uncover complex issues that can be difficult to expose otherwise. Measurements collected by a noise monitoring network deployed in the center of Pisa are used to demonstrate the capabilities and limitations of the test. It is shown that even in urban environments, where great soundscape variations are exhibited, accurate and robust results can be produced regardless of the proximity of the compared sensors as long as they are located in acoustically similar environments. It is also shown that variations of the same method can be applied for self-testing on data collected by single stations. Finally, it is shown that the versatility of the test makes it suitable for detecting various types of issues that can occur in real-life network implementations: from slow drifts away from calibration, to severe, abrupt failures and noise floor shifts. PMID:26370835
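
    As a minimal illustration of the underlying idea (not the authors' exact variant), the sketch below regresses one node's simulated noise levels on a reference node's over two time windows and applies the classical Chow F-test for a structural break; the simulated drift, window split and significance interpretation are all assumptions.

```python
import numpy as np
from scipy import stats

def chow_test(x, y, split):
    """Chow test for a structural break at index `split` in the regression y = a + b*x."""
    def rss(xs, ys):
        X = np.column_stack([np.ones_like(xs), xs])
        beta, res, *_ = np.linalg.lstsq(X, ys, rcond=None)
        return float(res[0]) if res.size else float(np.sum((ys - X @ beta) ** 2))
    k = 2                                   # number of regression parameters
    n1, n2 = split, len(x) - split
    rss_pooled = rss(x, y)
    rss_split = rss(x[:split], y[:split]) + rss(x[split:], y[split:])
    f = ((rss_pooled - rss_split) / k) / (rss_split / (n1 + n2 - 2 * k))
    p = 1 - stats.f.cdf(f, k, n1 + n2 - 2 * k)
    return f, p

# Simulated daily noise levels (dB) at a reference node and a node that drifts after day 60.
rng = np.random.default_rng(1)
ref = 60 + 5 * rng.standard_normal(120)
node = ref + rng.normal(0, 1, 120)
node[60:] += 3.0                            # calibration drift in the second window

f_stat, p_val = chow_test(ref, node, split=60)
print(f"Chow F = {f_stat:.2f}, p = {p_val:.4f}")
```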

  2. Proper Assessment of the JFK Assassination Bullet Lead Evidence from Metallurgical and Statistical Perspectives

    Energy Technology Data Exchange (ETDEWEB)

    Randich, E; Grant, P M

    2006-08-29

    The bullet evidence in the JFK assassination investigation was reexamined from metallurgical and statistical standpoints. The questioned specimens are comprised of soft lead, possibly from full-metal-jacketed Mannlicher-Carcano, 6.5-mm ammunition. During lead refining, contaminant elements are removed to specified levels for a desired alloy or composition. Microsegregation of trace and minor elements during lead casting and processing can account for the experimental variabilities measured in various evidentiary and comparison samples by laboratory analysts. Thus, elevated concentrations of antimony and copper at crystallographic grain boundaries, the widely varying sizes of grains in Mannlicher-Carcano bullet lead, and the 5-60 mg bullet samples analyzed for assassination intelligence effectively resulted in operational sampling error for the analyses. This deficiency was not considered in the original data interpretation and resulted in an invalid conclusion in favor of the single-bullet theory of the assassination. Alternate statistical calculations, based on the historic analytical data, incorporating weighted averaging and propagation of experimental uncertainties also considerably weaken support for the single-bullet theory. In effect, this assessment of the material composition of the lead specimens from the assassination concludes that the extant evidence is consistent with any number between two and five rounds fired in Dealey Plaza during the shooting.
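
    The alternate calculations mentioned above rest on standard tools: inverse-variance weighted averaging and propagation of measurement uncertainties. The sketch below shows those two steps on invented replicate antimony concentrations; the numbers are purely illustrative and are not the historic analytical data.

```python
import numpy as np

def weighted_mean(values, sigmas):
    """Inverse-variance weighted mean and its propagated standard uncertainty."""
    w = 1.0 / np.asarray(sigmas, dtype=float) ** 2
    mean = np.sum(w * np.asarray(values, dtype=float)) / np.sum(w)
    sigma = np.sqrt(1.0 / np.sum(w))
    return mean, sigma

# Hypothetical replicate antimony concentrations (ppm) with analytical uncertainties.
sb_values = [833.0, 797.0, 852.0]
sb_sigmas = [25.0, 30.0, 28.0]

mean, sigma = weighted_mean(sb_values, sb_sigmas)
print(f"Weighted mean Sb: {mean:.0f} +/- {sigma:.0f} ppm")

# Two specimens are indistinguishable at roughly the 95% level if their difference
# is within two propagated standard uncertainties.
diff = abs(sb_values[0] - sb_values[1])
diff_sigma = np.hypot(sb_sigmas[0], sb_sigmas[1])
print(f"Difference {diff:.0f} ppm vs 2-sigma bound {2 * diff_sigma:.0f} ppm")
```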

  3. A Statistical Method for Assessing Peptide Identification Confidence in Accurate Mass and Time Tag Proteomics

    Energy Technology Data Exchange (ETDEWEB)

    Stanley, Jeffrey R.; Adkins, Joshua N.; Slysz, Gordon W.; Monroe, Matthew E.; Purvine, Samuel O.; Karpievitch, Yuliya V.; Anderson, Gordon A.; Smith, Richard D.; Dabney, Alan R.

    2011-07-15

    High-throughput proteomics is rapidly evolving to require high mass measurement accuracy for a variety of different applications. Increased mass measurement accuracy in bottom-up proteomics specifically allows for an improved ability to distinguish and characterize detected MS features, which may in turn be identified by, e.g., matching to entries in a database for both precursor and fragmentation mass identification methods. Many tools exist with which to score the identification of peptides from LC-MS/MS measurements or to assess matches to an accurate mass and time (AMT) tag database, but these two calculations remain distinctly unrelated. Here we present a statistical method, Statistical Tools for AMT tag Confidence (STAC), which extends our previous work incorporating prior probabilities of correct sequence identification from LC-MS/MS, as well as the quality with which LC-MS features match AMT tags, to evaluate peptide identification confidence. Compared to existing tools, we are able to obtain significantly more high-confidence peptide identifications at a given false discovery rate and additionally assign confidence estimates to individual peptide identifications. Freely available software implementations of STAC are available in both command line and as a Windows graphical application.

  4. Sea cliff instability susceptibility at regional scale: a statistically based assessment in southern Algarve, Portugal

    Directory of Open Access Journals (Sweden)

    F. M. S. F. Marques

    2013-05-01

    along their top. The study was based on the application of the bivariate Information Value and multivariate Logistic Regression statistical methods, using a set of predisposing factors for cliff failures, mainly related to geology (lithology, bedding dip, faults) and geomorphology (maximum and mean slope, height, aspect, plan curvature, toe protection), which were correlated with a photogrammetry-based inventory of cliff failures that occurred in a 60 yr period (1947–2007). The susceptibility models were validated against the inventory data using standard success rate and ROC curves, and provided encouraging results, indicating that the proposed approaches are effective for susceptibility assessment. The results obtained also stress the need for improvement of the predisposing factors to be used in this type of study and the need for detailed and systematic cliff failure inventories.

  5. Robust statistical approaches to assess the degree of agreement of clinical data

    Science.gov (United States)

    Grilo, Luís M.; Grilo, Helena L.

    2016-06-01

    To analyze the blood of patients who took vitamin B12 for a period of time, two different measurement methods were used (one is the established method, with more human intervention, and the other relies essentially on machines). Given the non-normality of the differences between both measurement methods, the limits of agreement are estimated using also a non-parametric approach to assess the degree of agreement of the clinical data. The bootstrap resampling method is applied in order to obtain robust confidence intervals for the mean and median of the differences. The approaches used are easy to apply, using user-friendly software, and their outputs are also easy to interpret. In this case study the results obtained with the (non)parametric approaches led us to different statistical conclusions, but the decision whether agreement is acceptable or not is always a clinical judgment.
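
    A minimal sketch of the nonparametric side of such an analysis is shown below: empirical limits of agreement taken from percentiles of the paired differences, plus percentile-bootstrap confidence intervals for their mean and median. The simulated B12 values, sample size and bootstrap settings are assumptions, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated paired vitamin B12 measurements by two methods (pmol/L, illustrative only).
method_a = rng.lognormal(mean=5.8, sigma=0.3, size=80)
method_b = method_a * rng.lognormal(mean=0.02, sigma=0.05, size=80)
diff = method_b - method_a

# Nonparametric limits of agreement: central 95% of the observed differences.
loa_low, loa_high = np.percentile(diff, [2.5, 97.5])

def bootstrap_ci(data, stat, n_boot=10000, alpha=0.05):
    """Percentile bootstrap confidence interval for an arbitrary statistic."""
    boot = np.array([stat(rng.choice(data, size=len(data), replace=True))
                     for _ in range(n_boot)])
    return np.percentile(boot, [100 * alpha / 2, 100 * (1 - alpha / 2)])

print(f"Limits of agreement: [{loa_low:.1f}, {loa_high:.1f}]")
print("95% bootstrap CI for mean difference:  ", bootstrap_ci(diff, np.mean))
print("95% bootstrap CI for median difference:", bootstrap_ci(diff, np.median))
```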

  6. Comparison of statistical downscaling methods for climate change assessment in Manitoba

    International Nuclear Information System (INIS)

    Downscaling of global climate model output is a necessary step in many climate change impact analyses. Manitoba Hydro has recently funded a project to investigate the merits of different downscaling methods for assessing the impact of climate change on river runoff in the Prairie region. The presentation will provide an overview of various statistical downscaling methods and highlight the strength and weaknesses of each method. The downscaling methods were evaluated based on a range of characteristics, including their theoretical foundation, their ability to reproduce observed climate, their ease of use, and availability of software. The downscaled climate data were used as input to the hydrologic model SLURP calibrated for several watersheds in northern Manitoba and in the Winnipeg River basin. The hydrologic model can be considered an integrator that smoothes out much of the day-to-day errors in the down-scaled climate data. The downscaling methods are also evaluated after post-processing through a hydrologic model. (author)

  7. Probabilistic assessment of power system transient stability incorporating SMES

    International Nuclear Information System (INIS)

    Highlights: ► Probabilistic study of power system with wind farm and SMES is proposed. ► Quantitative relationship between system stability and SMES capacity is given. ► System stability increases with the capacity of the SMES. ► System stability decreases with the penetration of wind power. ► Together with the cost function, the coil size is optimized. -- Abstract: This paper presents a stochastic-based approach to evaluate the probabilistic transient stability index of the power system incorporating the wind farm and the SMES. Uncertain factors include both sequence of disturbance in power grid and stochastic generation of the wind farm. The spectrums of disturbance in the grid as the fault type, the fault location, the fault clearing time and the automatic reclosing process with their probabilities of occurrence are used to calculate the probability indices, while the wind speed statistics and parameters of the wind generator are used in a Monte Carlo simulation to generate samples for the studies. With the proposed method, system stability is "measured". Quantitative relationship of penetration level, SMES coil size and system stability is established. Considering the stability versus coil size to be the production curve, together with the cost function, the coil size is optimized economically

  8. Probabilistic assessment of power system transient stability incorporating SMES

    Energy Technology Data Exchange (ETDEWEB)

    Fang, Jiakun, E-mail: Jiakun.Fang@gmail.com [State Key Lab of Advanced Electromagnetic Engineering and Technology, Huazhong University of Science and Technology, No. 1037, Luoyu Road, Wuhan 430074 (China); Yao, Wei [State Key Lab of Advanced Electromagnetic Engineering and Technology, Huazhong University of Science and Technology, No. 1037, Luoyu Road, Wuhan 430074 (China); Wen, Jinyu, E-mail: jinyu.wen@hust.edu.cn [State Key Lab of Advanced Electromagnetic Engineering and Technology, Huazhong University of Science and Technology, No. 1037, Luoyu Road, Wuhan 430074 (China); Cheng, Shijie; Tang, Yuejin; Cheng, Zhuo [State Key Lab of Advanced Electromagnetic Engineering and Technology, Huazhong University of Science and Technology, No. 1037, Luoyu Road, Wuhan 430074 (China)

    2013-01-15

    Highlights: ► Probabilistic study of power system with wind farm and SMES is proposed. ► Quantitative relationship between system stability and SMES capacity is given. ► System stability increases with the capacity of the SMES. ► System stability decreases with the penetration of wind power. ► Together with the cost function, the coil size is optimized. -- Abstract: This paper presents a stochastic-based approach to evaluate the probabilistic transient stability index of the power system incorporating the wind farm and the SMES. Uncertain factors include both sequence of disturbance in power grid and stochastic generation of the wind farm. The spectrums of disturbance in the grid as the fault type, the fault location, the fault clearing time and the automatic reclosing process with their probabilities of occurrence are used to calculate the probability indices, while the wind speed statistics and parameters of the wind generator are used in a Monte Carlo simulation to generate samples for the studies. With the proposed method, system stability is "measured". Quantitative relationship of penetration level, SMES coil size and system stability is established. Considering the stability versus coil size to be the production curve, together with the cost function, the coil size is optimized economically.

  9. Multivariate statistical assessment of predictors of firefighters' muscular and aerobic work capacity.

    Directory of Open Access Journals (Sweden)

    Ann-Sofie Lindberg

    Full Text Available Physical capacity has previously been deemed important for firefighters' physical work capacity, and aerobic fitness, muscular strength, and muscular endurance are the most frequently investigated parameters of importance. Traditionally, bivariate and multivariate linear regression statistics have been used to study relationships between physical capacities and work capacities among firefighters. An alternative way to handle datasets consisting of numerous correlated variables is to use multivariate projection analyses, such as Orthogonal Projection to Latent Structures. The first aim of the present study was to evaluate the prediction and predictive power of field and laboratory tests, respectively, on firefighters' physical work capacity on selected work tasks. A further aim was to study whether valid predictions could be achieved without anthropometric data. The second aim was to externally validate selected models. The third aim was to validate selected models on firefighters and on civilians. A total of 38 (26 men and 12 women) + 90 (38 men and 52 women) subjects were included in the models and the external validation, respectively. The best prediction (R2) and predictive power (Q2) of Stairs, Pulling, Demolition, Terrain, and Rescue work capacities included field tests (R2 = 0.73 to 0.84, Q2 = 0.68 to 0.82). The best external validation was for Stairs work capacity (R2 = 0.80) and worst for Demolition work capacity (R2 = 0.40). In conclusion, field and laboratory tests could equally well predict physical work capacities for firefighting work tasks, and models excluding anthropometric data were valid. The predictive power was satisfactory for all included work tasks except Demolition.

  10. Fighting bias with statistics: Detecting gender differences in responses to items on a preschool science assessment

    Science.gov (United States)

    Greenberg, Ariela Caren

    Differential item functioning (DIF) and differential distractor functioning (DDF) are methods used to screen for item bias (Camilli & Shepard, 1994; Penfield, 2008). Using an applied empirical example, this mixed-methods study examined the congruency and relationship of DIF and DDF methods in screening multiple-choice items. Data for Study I were drawn from item responses of 271 female and 236 male low-income children on a preschool science assessment. Item analyses employed a common statistical approach of the Mantel-Haenszel log-odds ratio (MH-LOR) to detect DIF in dichotomously scored items (Holland & Thayer, 1988), and extended the approach to identify DDF (Penfield, 2008). Findings demonstrated that using MH-LOR to detect DIF and DDF supported the theoretical relationship that the magnitude and form of DIF are dependent on the DDF effects, and demonstrated the advantages of studying DIF and DDF in multiple-choice items. A total of 4 items with DIF and DDF and 5 items with only DDF were detected. Study II incorporated an item content review, an important but often overlooked and under-published step of DIF and DDF studies (Camilli & Shepard). Interviews with 25 female and 22 male low-income preschool children and an expert review helped to interpret the DIF and DDF results and their comparison, and determined that a content review process of studied items can reveal reasons for potential item bias that are often congruent with the statistical results. Patterns emerged and are discussed in detail. The quantitative and qualitative analyses were conducted in an applied framework of examining the validity of the preschool science assessment scores for evaluating science programs serving low-income children; however, the techniques can be generalized for use with measures across various disciplines of research.
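
    For readers unfamiliar with the statistic, the sketch below computes the Mantel-Haenszel common log-odds ratio from 2x2 tables stratified by matching total score, which is the usual form of MH-LOR DIF screening; the item counts and strata are invented and do not reproduce the study's data.

```python
import numpy as np

def mantel_haenszel_log_odds(tables):
    """Mantel-Haenszel common odds ratio (log scale) across score-level strata.

    Each stratum is (a, b, c, d):
      a = reference group correct, b = reference group incorrect,
      c = focal group correct,     d = focal group incorrect.
    """
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return np.log(num / den)

# Hypothetical strata (grouped by matching total score) for one science item:
# girls as the reference group, boys as the focal group.
strata = [
    (12, 18, 9, 21),   # low scorers
    (25, 15, 19, 21),  # middle scorers
    (30, 5, 24, 11),   # high scorers
]

mh_lor = mantel_haenszel_log_odds(strata)
print(f"MH log-odds ratio: {mh_lor:.2f}  (larger absolute values suggest possible DIF)")
```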

  11. Water Quality Assessment of Gufu River in Three Gorges Reservoir (China Using Multivariable Statistical Methods

    Directory of Open Access Journals (Sweden)

    Jiwen Ge

    2013-07-01

    Full Text Available To provide a reasonable basis for scientific management of water resources and certain directive significance for sustaining the health of Gufu River and even maintaining the stability of the water ecosystem of the Three-Gorge Reservoir of the Yangtze River, central China, multiple statistical methods including Cluster Analysis (CA), Discriminant Analysis (DA) and Principal Component Analysis (PCA) were performed to assess the spatial-temporal variations and interpret water quality data. The data were obtained during one year (2010–2011) of monitoring of 13 parameters at 21 different sites (3003 observations). Hierarchical CA classified 11 months into 2 periods (the first and second periods) and 21 sampling sites into 2 clusters, namely, respectively, upper reaches with little anthropogenic interference (UR) and lower reaches running through the farming areas and towns that are subjected to some human interference (LR), based on similarities in the water quality characteristics. Eight significant parameters (total phosphorus, total nitrogen, temperature, nitrate nitrogen, total organic carbon, total hardness, total alkalinity and silicon dioxide) were identified by DA, affording 100% correct assignations for temporal variation analysis, and five significant parameters (total phosphorus, total nitrogen, ammonia nitrogen, electrical conductivity and total organic carbon) were confirmed with 88% correct assignations for spatial variation analysis. PCA (varimax functionality) was applied to identify potential pollution sources based on the two clustered regions. Four Principal Components (PCs) with 91.19 and 80.57% total variances were obtained for the Upper Reaches (UR) and Lower Reaches (LR) regions, respectively. For the UR region, the rainfall runoff, soil erosion, scouring weathering of crustal materials and forest areas are the main sources of pollution. The pollution sources for the LR region are anthropogenic sources (domestic and agricultural runoff

  12. West European nuclear power generation research and development. (Foreign Applied Sciences Assessment Center Technical Assessment Report)

    International Nuclear Information System (INIS)

    The report assesses the current and projected future status of nuclear power generation research and development (R and D) in Western Europe. The primary focus is on light-water reactor technology, but alternative concepts-specifically, high-temperature gas-cooled reactors and liquid-metal reactors-are also assessed. Nuclear power R and D for light-water reactors can have immediate commercial significance, and therefore is mostly conducted within single organizations or countries. Thus, the assessments presented in the report are, in most instances, organized around countries rather than Western Europe collectively. The advancement of nuclear power is dependent upon advances in each stage of the nuclear fuel cycle. To bound the study, the assessment includes only the following nuclear fuel cycle stages: fuel fabrication, power generation, and fuel reprocessing. Specific topics addressed within these fuel cycle stages include core reactor physics, materials, instrumentation and control systems, nuclear power safety, power plant fabrication and construction, fuel fabrication, and reprocessing technology. Excluded are the front-end fuel cycle stages of mining and milling, conversion, and enrichment, and the back-end fuel cycle stages of waste conditioning and disposal. Four nations in Western Europe emerge as having the dominant R and D base for light-water reactors: in order of significance, France, Germany, Sweden, and the United Kingdom

  13. Nuclear power plants: 2006 atw compact statistics; atw Schnellstatistik Kernkraftwerke 2006

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    2007-01-15

    At the turn of 2006/2007, nuclear power plants were available for energy supply, or under construction, in 32 countries of the world. A total of 437 nuclear power plants, which is 7 plants less than at the 2005/2006 turn, were in operation in 31 countries with an aggregate gross power of approx. 388 GWe and an aggregate net power, respectively, of 369 GWe. The available gross power of nuclear power plants dropped by approx. 1.6 GWe, the available net power, by approx. 1.2 GWe. The Tarapur 3 nuclear generating unit was commissioned in India, a D2O PWR of 540 MWe gross power. Power operation was discontinued for good in 2006 only in nuclear power plants in Europe: Bohunice 1 (Slovak Republic, 440/408 MWe, VVER PWR); Kozloduy 3 and Kozloduy 4 (Bulgaria, 440/408 MWe each, VVER PWR); Dungeness A1 and Dungeness A2 (United Kingdom, 245/219 MWe each, Magnox GGR); Sizewell A1 and Sizewell A2 (United Kingdom, 236/210 MWe each, Magnox GGR), and Jose Cabrera 1 (Zorita) (Spain, 160/153 MWe, PWR). 29 nuclear generating units, i.e. 8 plants more than at the end of 2005, with an aggregate gross power of approx. 28 GWe, were under construction in 10 countries end of 2006. In China, construction of the Qinshan II-3, Qinshan II-4 nuclear generating units was started. In the Republic of Korea, construction work began on 4 new projects: Shin Kori 1, Shin Kori 2, and Shin Wolsong 1, Shin Wolsong 2. In Russia, work was resumed on the BN-800 sodium-cooled fast breeder reactor project at Beloyarsk and the RBMK Kursk 5. Some 40 new nuclear power plants are in the concrete project design, planning and licensing phases worldwide; on some of them, contracts have already been awarded. Another approximately seventy units are in their preliminary project phases. (orig.)

  14. Thermal impact assessment of multi power plant operations on estuaries

    International Nuclear Information System (INIS)

    The assessment of the thermal impact of multi power plant operations on large estuaries requires careful consideration of the problems associated with: re-entrainment, re-circulation, thermal interaction, delay in the attainment of thermal equilibrium state, and uncertainty in specifying open boundaries and open boundary conditions of the regions, which are critically important in the analysis of the thermal conditions in receiving water bodies with tidal dominated, periodically reversing flow conditions. The results of an extensive study in the Hudson River at Indian Point, 42 miles upstream of the ocean end at the Battery, concluded that the tidal-transient, multi-dimensional discrete-element (UTA) thermal transport models (ESTONE, FLOTWO, TMPTWO computer codes) and the near-field far-field zone-matching methodology can be employed with a high degree of reliability in the assessment of the thermal impact of multi power plant operations on tidal dominated estuaries

  15. The assessment of tornado missile hazard to nuclear power plants

    International Nuclear Information System (INIS)

    Numerical methods and computer codes for assessing tornado missile hazards to nuclear power plants are developed. The method of calculation has been based on the theoretical model developed earlier by authors. Historical data for tornado characteristics are taken from computerized files of the National Severe Storms Forecast Center and potential missiles characteristics are adopted from an EPRI report. Due to the uncertainty and randomness of tornado and tornado-generated missiles' characteristics, the damage probability of targets has highly spread distribution. The proposed method is very useful for assessing the risk of not providing protection to some nonsafety-related targets whose failure can create a hazard to the safe operation of nuclear power plants

  16. Statistical and regulatory considerations in assessments of interchangeability of biological drug products.

    Science.gov (United States)

    Tóthfalusi, Lászlo; Endrényi, László; Chow, Shein-Chung

    2014-05-01

    When the patent of a brand-name, marketed drug expires, new, generic products are usually offered. Small-molecule generic and originator drug products are expected to be chemically identical. Their pharmaceutical similarity can be typically assessed by simple regulatory criteria such as the expectation that the 90% confidence interval for the ratio of geometric means of some pharmacokinetic parameters be between 0.80 and 1.25. When such criteria are satisfied, the drug products are generally considered to exhibit therapeutic equivalence. They are then usually interchanged freely within individual patients. Biological drugs are complex proteins, for instance, because of their large size, intricate structure, sensitivity to environmental conditions, difficult manufacturing procedures, and the possibility of immunogenicity. Generic and brand-name biologic products can be expected to show only similarity but not identity in their various features and clinical effects. Consequently, the determination of biosimilarity is also a complicated process which involves assessment of the totality of the evidence for the close similarity of the two products. Moreover, even when biosimilarity has been established, it may not be assumed that the two biosimilar products can be automatically substituted by pharmacists. This generally requires additional, careful considerations. Without declaring interchangeability, a new product could be prescribed, i.e. it is prescribable. However, two products can be automatically substituted only if they are interchangeable. Interchangeability is a statistical term and it means that products can be used in any order in the same patient without considering the treatment history. The concepts of interchangeability and prescribability have been widely discussed in the past but only in relation to small molecule generics. In this paper we apply these concepts to biosimilars and we discuss: definitions of prescribability and interchangeability and
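
    The small-molecule criterion cited above can be illustrated directly: compute the 90% confidence interval for the ratio of geometric means of a pharmacokinetic parameter on the log scale and check that it lies within 0.80-1.25. The sketch below uses simulated parallel-group AUC data for brevity (real studies typically use crossover designs and ANOVA); all values are assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Simulated AUC values for reference and test products (parallel groups, illustrative only).
auc_ref = rng.lognormal(mean=4.00, sigma=0.25, size=24)
auc_test = rng.lognormal(mean=4.05, sigma=0.25, size=24)

log_ref, log_test = np.log(auc_ref), np.log(auc_test)
diff = log_test.mean() - log_ref.mean()
df = len(log_test) + len(log_ref) - 2
sp2 = ((len(log_test) - 1) * log_test.var(ddof=1)
       + (len(log_ref) - 1) * log_ref.var(ddof=1)) / df
se = np.sqrt(sp2 * (1 / len(log_test) + 1 / len(log_ref)))
t90 = stats.t.ppf(0.95, df)                      # 90% two-sided CI (two one-sided 5% tests)

ci = np.exp([diff - t90 * se, diff + t90 * se])  # back-transform to the ratio scale
print(f"Geometric mean ratio: {np.exp(diff):.3f}, 90% CI: [{ci[0]:.3f}, {ci[1]:.3f}]")
print("0.80-1.25 criterion:", "met" if ci[0] >= 0.80 and ci[1] <= 1.25 else "not met")
```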

  17. Application of statistical parametric mapping to SPET in the assessment of intractable childhood epilepsy

    International Nuclear Information System (INIS)

    Statistical parametric mapping (SPM) quantification and analysis has been successfully applied to functional imaging studies of partial epilepsy syndromes in adults. The present study evaluated whether localisation of the epileptogenic zone (determined by SPM) improves upon visually examined single-photon emission tomography (SPET) imaging in presurgical assessment of children with temporal lobe epilepsy (TLE) and frontal lobe epilepsy (FLE). The patient sample consisted of 24 children (15 males) aged 2.1-17.8 years (9.8±4.3 years; mean±SD) with intractable TLE or FLE. SPET imaging was acquired routinely in presurgical evaluation. All patient images were transformed into the standard stereotactic space of the adult SPM SPET template prior to SPM statistical analysis. Individual patient images were contrasted with an adult control group of 22 healthy adult females. Resultant statistical parametric maps were rendered over the SPM canonical magnetic resonance imaging (MRI). Two corresponding sets of ictal and interictal SPM and SPET images were then generated for each patient. Experienced clinicians independently reviewed the image sets, blinded to clinical details. Concordance of the reports between SPM and SPET images, syndrome classification and MRI abnormality was studied. A fair level of inter-rater reliability (kappa=0.73) was evident for SPM localisation. SPM was concordant with SPET in 71% of all patients, the majority of the discordance being from the FLE group. SPM and SPET localisation were concordant with epilepsy syndrome in 80% of the TLE cases. Concordant localisation to syndrome was worse for both SPM (33%) and SPET (44%) in the FLE group. Data from a small sample of patients with varied focal structural pathologies suggested that SPM performed poorly relative to SPET in these cases. Concordance of SPM and SPET with syndrome was lower in patients younger than 6 years than in those aged 6 years and above. SPM is effective in localising the potential
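
    The inter-rater reliability figure quoted above (kappa = 0.73) is a Cohen's kappa-type statistic; the sketch below shows how such a value is computed from two raters' categorical calls. The lateralisation labels and the twelve hypothetical cases are invented for illustration.

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters assigning categorical labels to the same cases."""
    n = len(rater1)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    expected = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical lateralisation calls by two blinded reviewers for 12 SPM maps
# (L = left, R = right, NL = non-localising).
reviewer_a = ["L", "L", "R", "R", "L", "NL", "R", "L", "R", "NL", "L", "R"]
reviewer_b = ["L", "L", "R", "L", "L", "NL", "R", "L", "R", "R", "L", "R"]

print(f"kappa = {cohens_kappa(reviewer_a, reviewer_b):.2f}")
```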

  18. Assessing incentive policies for integrating centralized solar power generation in the Brazilian electric power system

    International Nuclear Information System (INIS)

    This study assesses the impacts of promoting, through auctions, centralized solar power generation (concentrated solar power – CSP, and photovoltaic solar panels – PV) on the Brazilian power system. Four types of CSP plants with parabolic troughs were simulated at two sites, Bom Jesus da Lapa and Campo Grande, and PV plants were simulated at two other sites, Recife and Rio de Janeiro. The main parameters obtained for each plant were expanded to other suitable sites in the country (totaling 17.2 GW in 2040), as inputs in an optimization model for evaluating the impacts of the introduction of centralized solar power on the expansion of the electricity grid up to 2040. This scenario would be about USD$ 185 billion more expensive than a business as usual scenario, where expansion solely relies on least-cost options. Hence, for the country to incentivize the expansion of centralized solar power, specific auctions for solar energy should be adopted, as well as complementary policies to promote investments in R and D and the use of hybrid systems based on solar and fuels in CSP plants. - Highlights: • We assess the impacts of promoting centralized CSP and PV by auctions in Brazil. • We simulate energy scenarios with and without solar power. • Our solar scenario leads to 17 GW of solar capacity installed between 2020 and 2040. • This solar scenario is some USD$ 185 billion more expensive than the base case

  19. Enhanced seismic risk assessment of the Diablo Canyon power plant

    International Nuclear Information System (INIS)

    A seismic probabilistic risk assessment (PRA) was performed of the Diablo Canyon Nuclear Power Plant. The PRA is part of a reevaluation of the seismic design bases for the plant that is required by a license condition specified by the U.S. Nuclear Regulatory Commission. The overall effort generated extensive new geologic and seismologic data as well as new models and evaluations that are reported in this paper

  20. Statistical analysis of fuel failures in large break loss-of-coolant accident (LBLOCA) in EPR type nuclear power plant

    International Nuclear Information System (INIS)

    Highlights: • The number of failing fuel rods in a LB-LOCA in an EPR is evaluated. • 59 scenarios are simulated with the system code APROS. • 1000 rods per scenario are simulated with the fuel performance code FRAPTRAN-GENFLO. • All the rods in the reactor are simulated in the worst scenario. • Results suggest that the regulations set by the Finnish safety authority are met. - Abstract: In this paper, the number of failing fuel rods in a large break loss-of-coolant accident (LB-LOCA) in EPR-type nuclear power plant is evaluated using statistical methods. For this purpose, a statistical fuel failure analysis procedure has been developed. The developed method utilizes the results of nonparametric statistics, the Wilks’ formula in particular, and is based on the selection and variation of parameters that are important in accident conditions. The accident scenario is simulated with the coupled fuel performance – thermal hydraulics code FRAPTRAN-GENFLO using various parameter values and thermal hydraulic and power history boundary conditions between the simulations. The number of global scenarios is 59 (given by the Wilks’ formula), and 1000 rods are simulated in each scenario. The boundary conditions are obtained from a new statistical version of the system code APROS. As a result, in the worst global scenario, 1.2% of the simulated rods failed, and it can be concluded that the Finnish safety regulations are hereby met (max. 10% of the rods allowed to fail)
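    As a rough illustration of where figures such as 59 come from, the sketch below computes the minimum number of code runs required by the one-sided, non-parametric Wilks tolerance limit for a given coverage and confidence level. It is a generic sketch of the formula only; the function name and defaults are ours, and it does not reproduce the APROS/FRAPTRAN-GENFLO workflow.

```python
from scipy.stats import binom

def wilks_runs(coverage=0.95, confidence=0.95, order=1):
    """Smallest number of code runs N such that the `order`-th largest output
    bounds the `coverage` quantile with probability `confidence`
    (one-sided, non-parametric Wilks tolerance limit)."""
    n = order
    while True:
        # The order-th largest of N runs exceeds the coverage quantile
        # iff at most N - order runs fall below it: Binomial(N, coverage) <= N - order.
        if binom.cdf(n - order, n, coverage) >= confidence:
            return n
        n += 1

print(wilks_runs())          # 59 runs for a first-order 95%/95% limit
print(wilks_runs(order=2))   # 93 runs if the second-largest value is used
```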

  1. On-line Dynamic Security Assessment in Power Systems

    DEFF Research Database (Denmark)

    Weckesser, Johannes Tilman Gabriel

    … solar radiation. Moreover, ongoing research suggests that demand response will be introduced to maintain power balance between generation and consumption at all times. Due to these changes the operating point of the power system will be less predictable and today’s stability and security assessment … in-depth study of the mechanism causing the voltage sags. The first sensitivity type is called load voltage sensitivity and allows identifying which bus voltages are affected by a change in rotor angle of a particular generator. The second proposed type is called generator power sensitivity … development of a method for early prediction of critical voltage sags is described. The method’s performance is compared to other prediction approaches. The results show that the proposed method succeeds in early, accurately and consistently predicting critically low voltage sags. An efficient on-line DSA not…

  2. Enhanced statistical tests for GWAS in admixed populations: assessment using African Americans from CARe and a Breast Cancer Consortium.

    Directory of Open Access Journals (Sweden)

    Bogdan Pasaniuc

    2011-04-01

    While genome-wide association studies (GWAS) have primarily examined populations of European ancestry, more recent studies often involve additional populations, including admixed populations such as African Americans and Latinos. In admixed populations, linkage disequilibrium (LD) exists both at a fine scale in ancestral populations and at a coarse scale (admixture-LD) due to chromosomal segments of distinct ancestry. Disease association statistics in admixed populations have previously considered SNP association (LD mapping) or admixture association (mapping by admixture-LD), but not both. Here, we introduce a new statistical framework for combining SNP and admixture association in case-control studies, as well as methods for local ancestry-aware imputation. We illustrate the gain in statistical power achieved by these methods by analyzing data of 6,209 unrelated African Americans from the CARe project genotyped on the Affymetrix 6.0 chip, in conjunction with both simulated and real phenotypes, as well as by analyzing the FGFR2 locus using breast cancer GWAS data from 5,761 African-American women. We show that, at typed SNPs, our method yields an 8% increase in statistical power for finding disease risk loci compared to the power achieved by standard methods in case-control studies. At imputed SNPs, we observe an 11% increase in statistical power for mapping disease loci when our local ancestry-aware imputation framework and the new scoring statistic are jointly employed. Finally, we show that our method increases statistical power in regions harboring the causal SNP in the case when the causal SNP is untyped and cannot be imputed. Our methods and our publicly available software are broadly applicable to GWAS in admixed populations.
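    The following sketch only illustrates the general idea of combining two association signals at the same loci: it sums a hypothetical 1-df SNP-association chi-square with a 1-df admixture-association chi-square and refers the result to a 2-df distribution. The data are synthetic and the combination rule is a generic stand-in, not the paper's actual scoring statistic.

```python
import numpy as np
from scipy.stats import chi2

# chi2_snp: hypothetical 1-df statistics from standard SNP (LD) association;
# chi2_adm: hypothetical 1-df statistics from admixture association at the
# same loci.  Both are drawn from the null here, purely to show the mechanics.
rng = np.random.default_rng(0)
chi2_snp = rng.chisquare(df=1, size=1_000)
chi2_adm = rng.chisquare(df=1, size=1_000)

# Treating the two signals as roughly independent, sum them and use a
# 2-df reference distribution (a generic combination, not the paper's test).
p_snp_only = chi2.sf(chi2_snp, df=1)
p_combined = chi2.sf(chi2_snp + chi2_adm, df=2)
print("median p-value, SNP only:", float(np.median(p_snp_only)))
print("median p-value, combined:", float(np.median(p_combined)))
```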

  3. Health assessment of electric power distribution system in NPP

    International Nuclear Information System (INIS)

    Uninterrupted electrical power is required to continue decay heat removal and containment of radioactive leakage subsequent to reactor shut-down following an accident. Assessment of the integrity of the electrical power supply system after an accident, and restoration of supply, at least to the electrical networks feeding the critical loads, is therefore of utmost importance. The availability of one or more of the emergency power sources - DGs, batteries, offsite power etc. - and the survival of the electrical distribution network up to the critical loads are the necessary conditions for recovery. External sources may be required to be connected in-situ if none of the emergency sources are available. A few cables may have to be laid if the distribution network is not recoverable. To carry out these activities in a post-accident scenario, it is highly desirable to have an in-situ diagnostic system that can assess and present the health of the electrical distribution network to the plant personnel. This paper discusses a diagnostic technique capable of monitoring and detecting hard faults (e.g., open circuits) as well as soft faults caused by rapid degradation of the cables due to fire/heat or other mechanisms. (author)

  4. Sites and social assessment of nuclear power plants

    International Nuclear Information System (INIS)

    Nuclear power plant sites in Japan have two notable features: first, strong expectations of regional development, because most locations are selected in depopulated districts; and second, apprehension among local people, both because nuclear power generation technology has not taken root in society and because of the handling of radioactive materials. To cope with these problems, development plans for the regions around reactor sites must be compiled systematically. Their premise is a ''social assessment'' that estimates the economic and social influences and weighs the merits and demerits of a nuclear power plant prior to construction. Such an assessment is, of course, indispensable. The objects of the assessment may be divided as follows: the human effect on individuals, the institutional effect on the local community, the economic effect on the region, and the influence on the country as a whole. The developmental action at a location includes the stages of examination, planning, construction and operation, and three location patterns are recognized according to the emphasized function: improvement of the national economy, upgrading of environmental quality, and highest priority on local welfare. In the assessment process, the following points should be noted: each item sometimes requires weighting; a pattern of abandoning the location may exist; and positive and negative effects should be distributed evenly within a triangle whose apexes each represent one of the above three patterns. (Wakatsuki, Y.)

  5. Improved statistical power with a sparse shape model in detecting an aging effect in the hippocampus and amygdala

    Science.gov (United States)

    Chung, Moo K.; Kim, Seung-Goo; Schaefer, Stacey M.; van Reekum, Carien M.; Peschke-Schmitz, Lara; Sutterer, Matthew J.; Davidson, Richard J.

    2014-03-01

    The sparse regression framework has been widely used in medical image processing and analysis. However, it has been rarely used in anatomical studies. We present a sparse shape modeling framework using the Laplace-Beltrami (LB) eigenfunctions of the underlying shape and show its improvement of statistical power. Traditionally, the LB-eigenfunctions are used as a basis for intrinsically representing surface shapes as a form of Fourier descriptors. To reduce high frequency noise, only the first few terms are used in the expansion and higher frequency terms are simply thrown away. However, some lower frequency terms may not necessarily contribute significantly in reconstructing the surfaces. Motivated by this idea, we present a LB-based method to filter out only the significant eigenfunctions by imposing a sparse penalty. For dense anatomical data such as deformation fields on a surface mesh, the sparse regression behaves like a smoothing process, which will reduce the error of incorrectly detecting false negatives. Hence the statistical power improves. The sparse shape model is then applied in investigating the influence of age on amygdala and hippocampus shapes in the normal population. The advantage of the LB sparse framework is demonstrated by showing the increased statistical power.

  6. Improved Statistical Power with a Sparse Shape Model in Detecting an Aging Effect in the Hippocampus and Amygdala.

    Science.gov (United States)

    Chung, Moo K; Kim, Seung-Goo; Schaefer, Stacey M; van Reekum, Carien M; Peschke-Schmitz, Lara; Sutterer, Matthew J; Davidson, Richard J

    2014-03-21

    The sparse regression framework has been widely used in medical image processing and analysis. However, it has been rarely used in anatomical studies. We present a sparse shape modeling framework using the Laplace-Beltrami (LB) eigenfunctions of the underlying shape and show its improvement of statistical power. Traditionally, the LB-eigenfunctions are used as a basis for intrinsically representing surface shapes as a form of Fourier descriptors. To reduce high frequency noise, only the first few terms are used in the expansion and higher frequency terms are simply thrown away. However, some lower frequency terms may not necessarily contribute significantly in reconstructing the surfaces. Motivated by this idea, we present a LB-based method to filter out only the significant eigenfunctions by imposing a sparse penalty. For dense anatomical data such as deformation fields on a surface mesh, the sparse regression behaves like a smoothing process, which will reduce the error of incorrectly detecting false negatives. Hence the statistical power improves. The sparse shape model is then applied in investigating the influence of age on amygdala and hippocampus shapes in the normal population. The advantage of the LB sparse framework is demonstrated by showing the increased statistical power. PMID:25302007
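    A minimal sketch of the sparse-selection idea, assuming the surface signal has already been expanded in some basis: an L1 (LASSO) penalty keeps only the basis terms that help reconstruct the signal, rather than truncating at a fixed frequency. The basis matrix here is random synthetic data standing in for Laplace-Beltrami eigenfunctions sampled on a mesh, and the penalty weight is arbitrary.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Synthetic stand-in: B holds basis functions evaluated at mesh vertices and
# Y is a surface measurement (e.g., one component of a deformation field).
rng = np.random.default_rng(1)
n_vertices, n_basis = 2000, 100
B = rng.normal(size=(n_vertices, n_basis))
true_coef = np.zeros(n_basis)
true_coef[[0, 3, 17, 42]] = [2.0, -1.5, 0.8, 0.5]      # only a few terms matter
Y = B @ true_coef + rng.normal(scale=0.5, size=n_vertices)

# The sparse (L1) penalty zeroes out basis terms that do not help reconstruct Y,
# instead of discarding everything above an arbitrary frequency cutoff.
model = Lasso(alpha=0.05).fit(B, Y)
kept = np.flatnonzero(model.coef_)
print("retained basis terms:", kept)
```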

  7. A low-power network search engine based on statistical partitioning

    OpenAIRE

    Basci, F; Kocak, T

    2004-01-01

    Network search engines based on ternary CAMs (content addressable memories) are widely used in routers. However, due to the parallel search nature of TCAMs, power consumption becomes a critical issue. We propose an architecture that partitions the lookup table into multiple TCAM chips, based on the individual TCAM cell status, and achieves lower power figures

  8. Understanding Statistical Power in Cluster Randomized Trials: Challenges Posed by Differences in Notation and Terminology

    Science.gov (United States)

    Spybrook, Jessaca; Hedges, Larry; Borenstein, Michael

    2014-01-01

    Research designs in which clusters are the unit of randomization are quite common in the social sciences. Given the multilevel nature of these studies, the power analyses for these studies are more complex than in a simple individually randomized trial. Tools are now available to help researchers conduct power analyses for cluster randomized…
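    One standard ingredient of such power analyses is the design effect, which inflates the sample size needed under individual randomization by 1 + (m - 1) * ICC for clusters of size m. The sketch below uses a normal-approximation formula; the effect size, ICC and cluster size are illustrative assumptions, not values from the article.

```python
import math
from scipy.stats import norm

def clusters_per_arm(effect_size, icc, cluster_size, alpha=0.05, power=0.80):
    """Approximate clusters per arm for a two-arm cluster randomized trial:
    individually randomized sample size inflated by the design effect
    1 + (m - 1) * ICC, then divided by the cluster size."""
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    n_per_arm = 2 * (z / effect_size) ** 2            # individual randomization
    deff = 1 + (cluster_size - 1) * icc               # variance inflation from clustering
    return math.ceil(n_per_arm * deff / cluster_size)

# e.g. standardized effect d = 0.25, ICC = 0.05, 30 students per school
print(clusters_per_arm(0.25, 0.05, 30))   # about 21 schools per arm
```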

  9. The Power of Student's t and Wilcoxon W Statistics: A Comparison.

    Science.gov (United States)

    Rasmussen, Jeffrey Lee

    1985-01-01

    A recent study (Blair and Higgins, 1980) indicated a power advantage for the Wilcoxon W test over Student's t-test when calculated from a common mixed-normal sample. Results of the present study indicate that the t-test corrected for outliers shows a superior power curve to the Wilcoxon W.
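    A comparison of this kind can be reproduced in outline with a small Monte Carlo experiment: draw contaminated-normal samples, apply both tests, and count rejections. The sketch below does this for the plain t-test and the Wilcoxon rank-sum test only; the contamination level, shift and sample size are our own choices, and the outlier-corrected t-test of the study is not implemented.

```python
import numpy as np
from scipy.stats import ttest_ind, ranksums

rng = np.random.default_rng(0)

def mixed_normal(n, shift, contamination=0.10, scale=10.0):
    """Contaminated normal: most points N(shift, 1), a few wide outliers."""
    outlier = rng.random(n) < contamination
    return shift + np.where(outlier, rng.normal(0.0, scale, n), rng.normal(0.0, 1.0, n))

def power(pvalue_fn, shift, n=30, reps=2000, alpha=0.05):
    """Fraction of replications in which the test rejects at level alpha."""
    hits = sum(pvalue_fn(mixed_normal(n, 0.0), mixed_normal(n, shift)) < alpha
               for _ in range(reps))
    return hits / reps

print("t-test power:  ", power(lambda x, y: ttest_ind(x, y).pvalue, shift=1.0))
print("Wilcoxon power:", power(lambda x, y: ranksums(x, y).pvalue, shift=1.0))
```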

  10. A Powerful Test of the Autoregressive Unit Root Hypothesis Based on a Tuning Parameter Free Statistic

    DEFF Research Database (Denmark)

    Nielsen, Morten Ørregaard

    bandwidth, lag length, etc., but have none of these three properties. It is shown that members of the family with d < 1 have higher asymptotic local power than the Breitung (2002) test, and when d is small the asymptotic local power of the proposed nonparametric test is relatively close to the parametric...

  11. Suppressing the non-Gaussian statistics of Renewable Power from Wind and Solar

    CERN Document Server

    Anvari, M; Tabar, M Reza Rahimi; Wächter, M; Milan, P; Heinemann, D; Peinke, Joachim; Lorenz, E

    2015-01-01

    The power from wind and solar exhibits a nonlinear flickering variability, which typically occurs at time scales of a few seconds. We show that high-frequency monitoring of such renewable powers enables us to detect a transition, controlled by the field size, where the output power qualitatively changes its behaviour from a flickering type to a diffusive stochastic behaviour. We find that the intermittency and strong non-Gaussian behaviour in the cumulative power of the total field, even for a country-wide installation, still survives for both renewable sources. To overcome the short-time intermittency, we introduce a time-delayed feedback method for the power output of a wind farm and solar field that can further change the underlying stochastic process and suppress its strong non-Gaussian fluctuations.

  12. Using Saliency-Weighted Disparity Statistics for Objective Visual Comfort Assessment of Stereoscopic Images

    Science.gov (United States)

    Zhang, Wenlan; Luo, Ting; Jiang, Gangyi; Jiang, Qiuping; Ying, Hongwei; Lu, Jing

    2016-06-01

    Visual comfort assessment (VCA) for stereoscopic images is a particularly significant yet challenging task in the 3D quality-of-experience research field. Although the subjective assessment given by human observers is known as the most reliable way to evaluate the experienced visual discomfort, it is time-consuming and non-systematic. Therefore, it is of great importance to develop objective VCA approaches that can faithfully predict the degree of visual discomfort as human beings do. In this paper, a novel two-stage objective VCA framework is proposed. The main contribution of this study is that the important visual attention mechanism of the human visual system is incorporated for visual comfort-aware feature extraction. Specifically, in the first stage, we construct an adaptive 3D visual saliency detection model to derive the saliency map of a stereoscopic image, and then a set of saliency-weighted disparity statistics are computed and combined to form a single feature vector to represent the stereoscopic image in terms of visual comfort. In the second stage, this high-dimensional feature vector is fused into a single visual comfort score using a random forest algorithm. Experimental results on two benchmark databases confirm the superior performance of the proposed approach.

  13. No-Reference Image Quality Assessment for ZY3 Imagery in Urban Areas Using Statistical Model

    Science.gov (United States)

    Zhang, Y.; Cui, W. H.; Yang, F.; Wu, Z. C.

    2016-06-01

    More and more high-spatial-resolution satellite images are produced as satellite technology improves. However, image quality is not always satisfactory for application. Because of complicated atmospheric conditions and the complex radiation transmission involved in imaging, the images often suffer deterioration. In order to assess the quality of remote sensing images over urban areas, we propose a general-purpose image quality assessment method based on feature extraction and machine learning. We use two types of features at multiple scales: one derived from the shape of the histogram, the other from natural scene statistics based on the Generalized Gaussian Distribution (GGD). A 20-D feature vector is extracted for each scale and is assumed to capture the quality degradation characteristics of the remote sensing image. We use an SVM to learn to predict image quality scores from these features. For evaluation, we construct a medium-scale dataset for training and testing, with human subjects providing opinion scores of the degraded images. We use ZY3 satellite images over the Wuhan area (a city in China) to conduct the experiments. Experimental results show the correlation between the predicted scores and the subjective perceptions.

  14. Sustainability indicators for the assessment of nuclear power

    International Nuclear Information System (INIS)

    Electricity supplies an increasing share of the world's total energy demand and that contribution is set to increase. At the same time, there is increasing socio-political will to mitigate impacts of climate change as well as to improve energy security. This, in combination with the desire to ensure social and economic prosperity, creates a pressing need to consider the sustainability implications of future electricity generation. However, approaches to sustainability assessment differ greatly in their scope and methodology as currently there is no standardised approach. With this in mind, this paper reviews sustainability indicators that have previously been used to assess energy options and proposes a new sustainability assessment methodology based on a life cycle approach. In total, 43 indicators are proposed, addressing the techno-economic, environmental and social sustainability issues associated with energy systems. The framework has been developed primarily to address concerns associated with nuclear power in the UK, but is applicable to other energy technologies as well as to other countries. -- Highlights: → New framework for life cycle sustainability assessment of nuclear power developed. → The framework comprises 43 indicators addressing techno-economic, environmental and social sustainability. → Completely new indicators developed to address different sustainability issues, including nuclear proliferation, energy supply diversity and intergenerational equity. → The framework enables sustainability comparisons of nuclear and other electricity technologies. → Indicators can be used by various stakeholders, including industry, policy makers and NGOs to help identify more sustainable electricity options.

  15. Impact assessment of tornado against nuclear power plant

    International Nuclear Information System (INIS)

    Impact assessment of tornadoes against nuclear power plants conforms to the 'Assessment guide for tornado effect on nuclear power plants' stipulated by the Nuclear Regulation Authority. In this assessment, the important items are setting the maximum wind speed considered in design and setting a flying-object evaluation model, both on the basis of observation results. The Japan Society of Maintenology has summarized its verification of the concepts behind the design-tornado setting and the flying-object evaluation model, and the contents are explained here: (1) validity of the design-tornado setting in the Assessment Guide, (2) analysis of the synoptic field, (3) study of the regional characteristics of the environmental field for tornado occurrence by means of the synoptic-field analysis and gust-associated indices, and (4) setting of the design tornado based on items (1)-(3). For the flying-object evaluation model, the authors took up the Rankine vortex model and the Fujita model and verified the reproducibility of each model using its features and the actual state of tornado damage. (A.O.)

  16. Assessing Regional Scale Variability in Extreme Value Statistics Under Altered Climate Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Brunsell, Nathaniel [University of Kansas; Mechem, David [University of Kansas; Ma, Chunsheng [Wichita State University

    2015-02-20

    Recent studies have suggested that low-frequency modes of climate variability can significantly influence regional climate. The climatology associated with extreme events has been shown to be particularly sensitive. This has profound implications for droughts, heat waves, and food production. We propose to examine regional climate simulations conducted over the continental United States by applying a recently developed technique which combines wavelet multi-resolution analysis with information theory metrics. This research is motivated by two fundamental questions concerning the spatial and temporal structure of extreme events. These questions are 1) what temporal scales of the extreme value distributions are most sensitive to alteration by low-frequency climate forcings and 2) what is the nature of the spatial structure of variation in these timescales? The primary objective is to assess to what extent information theory metrics can be useful in characterizing the nature of extreme weather phenomena. Specifically, we hypothesize that (1) changes in the nature of extreme events will impact the temporal probability density functions and that information theory metrics will be sensitive to these changes and (2) via a wavelet multi-resolution analysis, we will be able to characterize the relative contribution of different timescales to the stochastic nature of extreme events. In order to address these hypotheses, we propose a unique combination of an established regional climate modeling approach and advanced statistical techniques to assess the effects of low-frequency modes on climate extremes over North America. The behavior of climate extremes in RCM simulations for the 20th century will be compared with statistics calculated from the United States Historical Climatology Network (USHCN) and simulations from the North American Regional Climate Change Assessment Program (NARCCAP). This effort will serve to establish the baseline behavior of climate extremes, the
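    As a generic illustration of pairing wavelet multi-resolution analysis with an information-theory summary (not the project's actual technique or data), the sketch below decomposes a synthetic series into detail scales with PyWavelets and reports the Shannon entropy of the relative energy per scale.

```python
import numpy as np
import pywt

def scale_energy_entropy(series, wavelet="db4", level=5):
    """Wavelet multiresolution decomposition of a series, followed by the
    Shannon entropy of the relative energy per detail scale; a simple summary
    of which timescales carry the variability."""
    coeffs = pywt.wavedec(series, wavelet, level=level)
    energy = np.array([np.sum(c ** 2) for c in coeffs[1:]])   # detail levels only
    p = energy / energy.sum()
    return -np.sum(p * np.log2(p)), p

rng = np.random.default_rng(0)
synthetic_temps = rng.normal(size=4096).cumsum() * 0.02 + rng.normal(size=4096)
entropy, shares = scale_energy_entropy(synthetic_temps)
print("scale-energy entropy:", round(float(entropy), 3))
print("relative energy per scale:", np.round(shares, 3))
```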

  17. Reliability assessment of distribution power systems including distributed generations

    International Nuclear Information System (INIS)

    Nowadays, power systems have reached a good level of reliability. Nevertheless, considering the modifications induced by the connection of small independent producers to distribution networks, there is a need to assess the reliability of these new systems. Distribution networks present several functional characteristics, highlighted by a qualitative study of failures, such as loads dispersed at several places, variable topology and some electrotechnical phenomena, which must be taken into account to model the events that can occur. The adopted reliability calculation method is Monte Carlo simulation, the most powerful and flexible probabilistic method for modelling the complex operation of the distribution system. The first part is devoted to the case of a 20 kV feeder to which a cogeneration unit is connected. The method was applied using stochastic Petri net simulation software. The second part concerns the study of a low-voltage power system supplied by dispersed generation. Here, the complexity of the events required coding the method in a programming environment that allows power system calculations (load flow, short-circuit, load shedding, management of unit powers) in order to analyse the system state after each new event. (author)

  18. Quantitative assessment of aquatic impacts of power plants

    International Nuclear Information System (INIS)

    Progress is reported in a continuing study of the design and analysis of aquatic environmental monitoring programs for assessing the impacts of nuclear power plants. Analysis of data from Calvert Cliffs, Pilgrim, and San Onofre nuclear power plants confirmed the generic applicability of the control-treatment pairing design suggested by McKenzie et al. (1977). Substantial progress was made on the simulation model evaluation task. A process notebook was compiled in which each model equation was translated into a standardized notation. Individual model testing and evaluating was started. The Aquatic Generalized Environmental Impact Simulator (AGEIS) was developed and will be tested using data from Lake Keowee, South Carolina. Further work is required to test the various models and perfect AGEIS for impact analyses at actual power plant sites. Efforts on the hydrologic modeling task resulted in a compendium of models commonly applied to nuclear power plants and the application of two well-received hydrodynamic models to data from the Surry Nuclear Power Plant in Virginia. Conclusions from the study of these models indicate that slight inaccuracies of boundary data have little influence on mass conservation and accurate bathymetry data are necessary for conservation of mass through the model calculations. The hydrologic modeling task provides valuable reference information for model users and monitoring program designers

  19. Quantitative assessment of aquatic impacts of power plants

    Energy Technology Data Exchange (ETDEWEB)

    McKenzie, D.H.; Arnold, E.M.; Skalski, J.R.; Fickeisen, D.H.; Baker, K.S.

    1979-08-01

    Progress is reported in a continuing study of the design and analysis of aquatic environmental monitoring programs for assessing the impacts of nuclear power plants. Analysis of data from Calvert Cliffs, Pilgrim, and San Onofre nuclear power plants confirmed the generic applicability of the control-treatment pairing design suggested by McKenzie et al. (1977). Substantial progress was made on the simulation model evaluation task. A process notebook was compiled in which each model equation was translated into a standardized notation. Individual model testing and evaluating was started. The Aquatic Generalized Environmental Impact Simulator (AGEIS) was developed and will be tested using data from Lake Keowee, South Carolina. Further work is required to test the various models and perfect AGEIS for impact analyses at actual power plant sites. Efforts on the hydrologic modeling task resulted in a compendium of models commonly applied to nuclear power plants and the application of two well-received hydrodynamic models to data from the Surry Nuclear Power Plant in Virginia. Conclusions from the study of these models indicate that slight inaccuracies of boundary data have little influence on mass conservation and accurate bathymetry data are necessary for conservation of mass through the model calculations. The hydrologic modeling task provides valuable reference information for model users and monitoring program designers.

  20. Preliminary environmental assessment for the satellite power system (SPS)

    Energy Technology Data Exchange (ETDEWEB)

    1978-10-01

    A preliminary assessment of the impact of the Satellite Power System (SPS) on the environment is presented. Information that has appeared in documents referenced herein is integrated and assimilated. The state-of-knowledge as perceived from recently completed DOE-sponsored studies is disclosed, and prospective research and study programs that can advance the state-of-knowledge and provide an expanded data base for use in an assessment planned for 1980 are defined. Alternatives for research that may be implemented in order to achieve this advancement are also discussed in order that a plan can be selected which will be consistent with the fiscal and time constraints on the SPS Environmental Assessment Program. Health and ecological effects of microwave radiation, nonmicrowave effects on health and the environment (terrestrial operations and space operations), effects on the atmosphere, and effects on communications systems are examined in detail. (WHK)

  1. Two universal physical principles shape the power-law statistics of real-world networks

    CERN Document Server

    Lorimer, Tom; Stoop, Ruedi

    2015-01-01

    The study of complex networks has pursued an understanding of macroscopic behavior by focusing on power-laws in microscopic observables. Here, we uncover two universal fundamental physical principles that are at the basis of complex networks generation. These principles together predict the generic emergence of deviations from ideal power laws, which were previously discussed away by reference to the thermodynamic limit. Our approach proposes a paradigm shift in the physics of complex networks, toward the use of power-law deviations to infer meso-scale structure from macroscopic observations.

  2. Two universal physical principles shape the power-law statistics of real-world networks

    Science.gov (United States)

    Lorimer, Tom; Gomez, Florian; Stoop, Ruedi

    2015-07-01

    The study of complex networks has pursued an understanding of macroscopic behaviour by focusing on power-laws in microscopic observables. Here, we uncover two universal fundamental physical principles that are at the basis of complex network generation. These principles together predict the generic emergence of deviations from ideal power laws, which were previously discussed away by reference to the thermodynamic limit. Our approach proposes a paradigm shift in the physics of complex networks, toward the use of power-law deviations to infer meso-scale structure from macroscopic observations.

  3. Statistical Assessment of Proton Treatment Plans Under Setup and Range Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Park, Peter C.; Cheung, Joey P.; Zhu, X. Ronald [Department of Radiation Physics, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Lee, Andrew K. [Department of Radiation Oncology, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Sahoo, Narayan [Department of Radiation Physics, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Tucker, Susan L. [Department of Bioinformatics and Computational Biology, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Liu, Wei; Li, Heng; Mohan, Radhe; Court, Laurence E. [Department of Radiation Physics, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Dong, Lei, E-mail: dong.lei@scrippshealth.org [Scripps Proton Therapy Center, San Diego, California (United States)

    2013-08-01

    Purpose: To evaluate a method for quantifying the effect of setup errors and range uncertainties on dose distribution and dose–volume histogram using statistical parameters; and to assess existing planning practice in selected treatment sites under setup and range uncertainties. Methods and Materials: Twenty passively scattered proton lung cancer plans, 10 prostate plans, and 1 brain cancer scanning-beam proton plan were analyzed. To account for the dose under uncertainties, we performed a comprehensive simulation in which the dose was recalculated 600 times per given plan under the influence of random and systematic setup errors and proton range errors. On the basis of simulation results, we determined the probability of dose variations and calculated the expected values and standard deviations of dose–volume histograms. The uncertainties in dose were spatially visualized on the planning CT as a probability map of failure to achieve target coverage or of overdose of critical structures. Results: The expected value of target coverage under the uncertainties was consistently lower than the nominal value determined from the clinical target volume coverage without setup error or range uncertainty, with a mean difference of −1.1% (−0.9% for breath-hold), −0.3%, and −2.2% for the lung, prostate, and brain cases, respectively. The organs whose dose was most sensitive to the uncertainties were the esophagus and spinal cord for lung, the rectum for prostate, and the brain stem for brain cancer. Conclusions: A clinically feasible robustness plan analysis tool based on direct dose calculation and statistical simulation has been developed. Both the expected value and standard deviation are useful for evaluating the impact of uncertainties. The existing proton beam planning method used in this institution seems to be adequate in terms of target coverage. However, structures that are small in volume or located near the target area showed greater sensitivity to uncertainties.
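    The kind of post-processing described (expected values, standard deviations and failure probabilities computed from repeated dose recalculations) can be sketched as below. The dose array is synthetic and a single target structure is assumed; the 600-scenario count follows the abstract, everything else is illustrative.

```python
import numpy as np

# Synthetic stand-in for the described simulation: `doses` holds repeated dose
# recalculations under sampled setup/range errors, shape (n_scenarios, n_voxels);
# all voxels here are assumed to belong to the clinical target volume.
rng = np.random.default_rng(2)
n_scenarios, n_voxels, prescription = 600, 5000, 60.0
doses = rng.normal(loc=60.0, scale=1.5, size=(n_scenarios, n_voxels))

# Per-scenario V95: fraction of the target receiving at least 95% of prescription.
v95 = (doses >= 0.95 * prescription).mean(axis=1)
print(f"expected V95: {v95.mean():.3f} +/- {v95.std(ddof=1):.3f}")

# Voxel-wise probability of failing 95% coverage (the 'probability map of failure').
fail_prob = (doses < 0.95 * prescription).mean(axis=0)
print("voxels with >5% failure probability:", int((fail_prob > 0.05).sum()))
```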

  4. ASSESSMENT OF OIL PALM PLANTATION AND TROPICAL PEAT SWAMP FOREST WATER QUALITY BY MULTIVARIATE STATISTICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Seca Gandaseca

    2014-01-01

    This study reports the spatio-temporal changes in river and canal water quality at peat swamp forest and oil palm plantation sites in Sarawak, Malaysia. To investigate temporal changes, 192 water samples were collected at four stations of BatangIgan, an oil palm plantation site of Sarawak, during July-November 2009 and April-July 2010. Nine water quality parameters were analysed: Electrical Conductivity (EC), pH, Turbidity (TER), Dissolved Oxygen (DO), Temperature (TEMP), Chemical Oxygen Demand (COD), five-day Biochemical Oxygen Demand (BOD5), ammonia-Nitrogen (NH3-N) and Total Suspended Solids (TSS). To investigate spatial changes, 432 water samples were collected from six different sites including BatangIgan during June-August 2010, and six water quality parameters (pH, DO, COD, BOD5, NH3-N and TSS) were analysed to see the spatial variations. The parameters contributing most significantly to the spatio-temporal variations were assessed by statistical techniques such as Hierarchical Agglomerative Cluster Analysis (HACA), Factor Analysis/Principal Components Analysis (FA/PCA) and Discriminant Function Analysis (DFA). HACA identified three different classes of sites: Relatively Unimpaired, Impaired and Less Impaired Regions, on the basis of similarity among physicochemical characteristics and pollutant levels between the sampling sites. DFA produced the best results for identifying the main variables: it separated three parameters for temporal analysis (EC, TER, COD) and identified three parameters for spatial analysis (pH, NH3-N and BOD5). The results signify that the parameters identified by the statistical analyses were responsible for water quality change and suggest agricultural and oil palm plantation activities as a possible source of pollutants. The results suggest a dire need for proper watershed management measures to restore the water quality of this tributary for a
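    A minimal sketch of the HACA and FA/PCA steps on a site-by-parameter matrix, assuming standardized columns; the data are synthetic stand-ins and the number of clusters is fixed at three only to mirror the three site classes reported.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Rows are sampling sites, columns are water-quality parameters
# (pH, DO, COD, BOD5, NH3-N, TSS); the values here are synthetic stand-ins.
rng = np.random.default_rng(5)
X = np.vstack([rng.normal(loc, 0.5, size=(4, 6)) for loc in (0.0, 1.5, 3.0)])

Z = StandardScaler().fit_transform(X)
scores = PCA(n_components=2).fit_transform(Z)                             # FA/PCA step
labels = fcluster(linkage(Z, method="ward"), t=3, criterion="maxclust")   # HACA step
print("cluster label per site:", labels)
print("first two principal-component scores:\n", np.round(scores, 2))
```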

  5. A follow-up power analysis of the statistical tests used in the Journal of Research in Science Teaching

    Science.gov (United States)

    Woolley, Thomas W.; Dawson, George O.

    It has been two decades since the first power analysis of a psychological journal and 10 years since the Journal of Research in Science Teaching made its contribution to this debate. One purpose of this article is to investigate what power-related changes, if any, have occurred in science education research over the past decade as a result of the earlier survey. In addition, previous recommendations are expanded and expounded upon within the context of more recent work in this area. The absence of any consistent mode of presenting statistical results, as well as little change with regard to power-related issues are reported. Guidelines for reporting the minimal amount of information demanded for clear and independent evaluation of research results by readers are also proposed.
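    A follow-up power survey of this kind typically tabulates the achieved power of common tests at the sample sizes found in the literature. The sketch below shows one such calculation for a two-sample t-test at Cohen's conventional small, medium and large effect sizes; the per-group n of 30 is an assumed illustration, not a figure from the surveyed journals.

```python
from statsmodels.stats.power import TTestIndPower

# Achieved power of a two-sample t-test for conventional effect sizes at an
# illustrative per-group sample size of 30.
analysis = TTestIndPower()
for label, d in [("small", 0.2), ("medium", 0.5), ("large", 0.8)]:
    p = analysis.power(effect_size=d, nobs1=30, alpha=0.05, ratio=1.0)
    print(f"{label:6s} effect (d = {d}): power = {p:.2f}")
```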

  6. Wide Area Measurement Based Security Assessment & Monitoring of Modern Power System: A Danish Power System Case Study

    DEFF Research Database (Denmark)

    Rather, Zakir Hussain; Chen, Zhe; Thøgersen, Paul

    2013-01-01

    Power system security has become a major concern across the global power system community. This paper presents wide area measurement system (WAMS) based security assessment and monitoring of a modern power system. A new three-dimensional security index (TDSI) has been proposed for online security monitoring … demonstrated in the DigSILENT PowerFactory environment.

  7. Statistical power to detect change in a mangrove shoreline fish community adjacent to a nuclear power plant.

    Science.gov (United States)

    Dolan, T E; Lynch, P D; Karazsia, J L; Serafy, J E

    2016-03-01

    An expansion is underway of a nuclear power plant on the shoreline of Biscayne Bay, Florida, USA. While the precise effects of its construction and operation are unknown, impacts on surrounding marine habitats and biota are considered by experts to be likely. The objective of the present study was to determine the adequacy of an ongoing monitoring survey of fish communities associated with mangrove habitats directly adjacent to the power plant to detect fish community changes, should they occur, at three spatial scales. Using seasonally resolved data recorded during 532 fish surveys over an 8-year period, power analyses were performed for four mangrove fish metrics (fish diversity, fish density, and the occurrence of two ecologically important fish species: gray snapper (Lutjanus griseus) and goldspotted killifish (Floridichthys carpio)). Results indicated that the monitoring program at current sampling intensity allows for detection of <33 % changes in fish density and diversity metrics in both the wet and the dry season in the two larger study areas. Sampling effort was found to be insufficient in either season to detect changes at this level (<33 %) in species-specific occurrence metrics for the two fish species examined. The option of supplementing ongoing, biological monitoring programs for improved, focused change detection deserves consideration from both ecological and cost-benefit perspectives. PMID:26903208
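    The logic of such an assessment can be sketched with a simple normal-approximation calculation of the smallest change detectable at a given sampling effort; the density mean, standard deviation and survey counts below are hypothetical and are not the study's data.

```python
import numpy as np
from scipy.stats import norm

def detectable_change(sd, n_before, n_after, alpha=0.05, power=0.80):
    """Smallest absolute difference in mean density detectable by a two-sample
    comparison at the given alpha and power (normal approximation)."""
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return z * sd * np.sqrt(1.0 / n_before + 1.0 / n_after)

# Hypothetical survey numbers (not the study's data): mean density 12 fish per
# transect with SD 9, and 60 surveys in each of the 'before' and 'after' periods.
mean_density = 12.0
delta = detectable_change(sd=9.0, n_before=60, n_after=60)
print(f"detectable change: {delta:.1f} fish ({100 * delta / mean_density:.0f}% of the mean)")
```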

  8. Reactor noise analysis applications in Ontario Hydro: a statistical technique used for systems surveillance and condition assessment

    International Nuclear Information System (INIS)

    Reactor noise analysis is a non-intrusive statistical technique regularly used in surveillance and diagnostics tasks. Valuable information on reactor system dynamics can be extracted from the fluctuations of instrumentation signals measured during steady-state operation. The small and measurable fluctuations of process signals are the results of stochastic effects inherent in physical processes, such as heat transfer, boiling, coolant flow turbulence, fission process, structural vibrations and pressure oscillations. The goal of reactor noise analysis is to monitor and assess the conditions of technological processes and their instrumentation in the nuclear reactor in a non-intrusive passive way. The noise measurements are usually performed at a steady-state operation, while the availability of the signals in their respected systems (i.e. shutdown systems, regulating system) is not interrupted. This paper concentrates on recent applications of reactor noise analysis in Ontario Hydro's CANDU stations, related to the dynamics of in-core flux detectors (ICFDs) and ion chambers. These applications include (1) detecting anomalies in the dynamics of ICFDs and ion chambers, (2) estimating the effective prompt fractions of ICFDs in power rundown tests and in noise measurements, (3) detecting the mechanical vibration of ICFD instrument tubes induced by moderator flow, (4) detecting the mechanical vibration of fuel channels induced by coolant flow, (5) identifying the cause of excessive signal fluctuations in certain flux detectors, (6) validating the dynamic coupling between liquid zone control signals. (author)

  9. A subsampling approach to estimating the distribution of diverging statistics with application to assessing financial market risks

    OpenAIRE

    Bertail, Patrice; Haefke, Christian; Politis, Dimitris N.; White, Halbert

    2001-01-01

    In this paper we propose a subsampling estimator for the distribution of statistics diverging at either known or unknown rates when the underlying time series is strictly stationary and strong mixing. Based on our results we provide a detailed discussion of how to estimate extreme order statistics with dependent data and present two applications to assessing financial market risk. Our method performs well in estimating Value at Risk and provides a superior alternative to Hill's estimator ...

  10. A PowerPoint®-based guide to assist in choosing the suitable statistical test

    Directory of Open Access Journals (Sweden)

    David Normando

    2010-02-01

    Selecting appropriate methods for statistical analysis may seem complex, especially for graduate students and researchers at the beginning of their scientific careers. On the other hand, PowerPoint presentations are a familiar tool for students and researchers, so a Biostatistics tutorial built as a PowerPoint presentation could narrow the gap between orthodontists and Biostatistics. This guide provides useful and objective information about several statistical methods, using examples related to dentistry and, more specifically, to orthodontics. The tutorial is intended mainly to help the user answer common questions about the most appropriate test for comparing groups, examining correlations and regressions, or analysing the error of the method. It also helps in checking the distribution of the data (normal or non-normal) and in choosing the most suitable graph for presenting the results. The guide may also be quite useful for journal reviewers to quickly examine the adequacy of the statistical method presented in a manuscript submitted for publication.

  11. Surveys Assessing Students' Attitudes toward Statistics: A Systematic Review of Validity and Reliability

    Science.gov (United States)

    Nolan, Meaghan M.; Beran, Tanya; Hecker, Kent G.

    2012-01-01

    Students with positive attitudes toward statistics are likely to show strong academic performance in statistics courses. Multiple surveys measuring students' attitudes toward statistics exist; however, a comparison of the validity and reliability of interpretations based on their scores is needed. A systematic review of relevant electronic…

  12. Assessment of Problem-Based Learning in the Undergraduate Statistics Course

    Science.gov (United States)

    Karpiak, Christie P.

    2011-01-01

    Undergraduate psychology majors (N = 51) at a mid-sized private university took a statistics examination on the first day of the research methods course, a course for which a grade of "C" or higher in statistics is a prerequisite. Students who had taken a problem-based learning (PBL) section of the statistics course (n = 15) were compared to those…

  13. Of Disasters and Dragon Kings: A Statistical Analysis of Nuclear Power Incidents & Accidents

    OpenAIRE

    Wheatley, Spencer; Sovacool, Benjamin; Sornette, Didier

    2015-01-01

    We provide, and perform a risk theoretic statistical analysis of, a dataset that is 75 percent larger than the previous best dataset on nuclear incidents and accidents, comparing three measures of severity: INES (International Nuclear Event Scale), radiation released, and damage dollar losses. The annual rate of nuclear accidents, with size above 20 Million US$, per plant, decreased from the 1950s until dropping significantly after Chernobyl (April, 1986). The rate is now roughly stable at 0....

  14. Air-chemistry "turbulence": power-law scaling and statistical regularity

    OpenAIRE

    Hsu, H.-m.; Lin, C.-Y.; Guenther, A.; J. J. Tribbia; Liu, S. C.

    2011-01-01

    With the intent to gain further knowledge on the spectral structures and statistical regularities of surface atmospheric chemistry, the chemical gases (NO, NO2, NOx, CO, SO2, and O3) and aerosol (PM10) measured at 74 air quality monitoring stations over the island of Taiwan are analyzed for the year of 2004 at hourly resolution. They represent a range of surface air quality with ...

  15. Statistical analysis of safety related occurrences in nuclear power plants. A report of a pilot study

    International Nuclear Information System (INIS)

    The purpose is to study the possibility of using the relatively few data on safety-related incidents at reactors to estimate hazard rates and give the corresponding confidence bounds. Related variables, such as the time to the first accident of a certain class, and the statistical properties of such quantities are also considered. Finally, some examples using Swedish data are given. (author)
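    For a constant hazard rate estimated from a small number of events over a known exposure, two-sided confidence bounds can be obtained from the chi-square representation of Poisson confidence limits, as in the sketch below; the event count and reactor-years are illustrative assumptions, not the Swedish data.

```python
from scipy.stats import chi2

def rate_confidence_bounds(events, exposure_years, conf=0.95):
    """Two-sided confidence bounds on a constant hazard rate (events per
    reactor-year) from a small observed count, using the chi-square
    representation of Poisson confidence limits."""
    alpha = 1.0 - conf
    lower = chi2.ppf(alpha / 2, 2 * events) / (2 * exposure_years) if events else 0.0
    upper = chi2.ppf(1 - alpha / 2, 2 * (events + 1)) / (2 * exposure_years)
    return lower, upper

# e.g. 3 occurrences of a given incident class observed in 400 reactor-years
low, high = rate_confidence_bounds(3, 400)
print(f"rate in [{low:.5f}, {high:.5f}] per reactor-year")
```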

  16. Selection, competency development and assessment of nuclear power plant managers

    International Nuclear Information System (INIS)

    This publication provides information on proven methods and good practices with respect to the selection, development and assessment of nuclear power plant (NPP) managers. The report is organized into four sections, a glossary, two appendices, and several annexes. The Introduction (Section 1) provides the framework for the report. Section 2 describes how appropriate management competencies can be used for the selection, development and assessment of NPP managers, including: -Selection which includes recruitment, promotion and succession management. -Management development programmes including formal training, job rotation, on the job training, mentoring, and outside assignments. -Assessment of individual performance. Section 3 describes a systematic process for identifying the competencies needed by NPP managers. This section culminates in a set of suggested core competencies for NPP managers, which are further expanded in Appendix A. The annexes provide specific examples of competency-based management selection, development, and assessment programmes in several Member States. -Annex A is one method to organize and display competencies. -Annex B is an example of using competencies for selection of first line managers. -Annex C is an example of using management competencies for succession management. -Annexes D-H are examples of management development programmes. -Annexes I and J are examples of management assessment programmes. A glossary of terms is provided at the end of the report to explain the use of some key terms.

  17. Development of safety assessment of nuclear power plants using indicators

    International Nuclear Information System (INIS)

    The study is based on an indicator system which is under development at the Radiation and Nuclear Safety Authority (STUK). The goal of this study was to define and develop both PSA-based indicators and indicators from failure statistics. As PSA-based indicators the possibility was studied to define and express the risk importance of exemptions from the Technical Specifications, failures, preventive maintenance and other disconnections of devices covered by the Technical Specifications, operating events covered by Guide YVL 1.5 and plant modifications. In this piece of research the applicability of plant specific living PSA-models used for calculation of indicators was examined. The research included both Loviisa and Olkiluoto nuclear power plants in Finland

  18. Preliminary nuclear power reactor technology qualitative assessment for Malaysia

    International Nuclear Information System (INIS)

    Since the world's first nuclear reactor achieved its major breakthrough on December 2, 1942, the nuclear power industry has undergone tremendous development and evolution for more than half a century. After surpassing the moratorium on nuclear power plant construction caused by the catastrophic accidents at Three Mile Island (1979) and Chernobyl (1986), nuclear energy is today back on the policy agendas of many countries, both developed and developing, signaling a nuclear revival or nuclear renaissance. Selection of a suitable nuclear power technology has thus received primary attention. This short paper attempts to draw a preliminary technology assessment for the first nuclear power reactor technology for Malaysia. The methodology employed is a qualitative analysis collating recent findings of the TNB-KEPCO preliminary feasibility study for a nuclear power programme in Peninsular Malaysia and other published presentations and/or papers by multiple experts. The results suggest that the pressurized water reactor (PWR) is the prevailing technology in terms of numbers and plant performance, and while the commercialization of Generation IV reactors is remote (e.g. not until 2030), Generation III/III+ NPP models are commercially available on the market today. Five (5) major steps involved in reactor technology selection were introduced, with a focus on important aspects of the selection criteria. Three (3) categories of criteria for reactor technology selection were used for the cursory evaluation. The outcome of these analyses shall lead to deeper and fuller analyses of the recommended reactor technologies for a comprehensive feasibility study in the near future. Recommendations for the reactor technology option were also provided, covering both strategic and technical aspects. The paper also explores the best way to systematically select the first civilian nuclear power reactor. (Author)

  19. The power of 41%: A glimpse into the life of a statistic.

    Science.gov (United States)

    Tanis, Justin

    2016-01-01

    "Forty-one percent?" the man said with anguish on his face as he addressed the author, clutching my handout. "We're talking about my granddaughter here." He was referring to the finding from the National Transgender Discrimination Survey (NTDS) that 41% of 6,450 respondents said they had attempted suicide at some point in their lives. The author had passed out the executive summary of the survey's findings during a panel discussion at a family conference to illustrate the critical importance of acceptance of transgender people. During the question and answer period, this gentleman rose to talk about his beloved 8-year-old granddaughter who was in the process of transitioning socially from male to female in her elementary school. The statistics that the author was citing were not just numbers to him; and he wanted strategies-effective ones-to keep his granddaughter alive and thriving. The author has observed that the statistic about suicide attempts has, in essence, developed a life of its own. It has had several key audiences-academics and researchers, public policymakers, and members of the community, particularly transgender people and our families. This article explores some of the key takeaways from the survey and the ways in which the 41% statistic has affected conversations about the injustices transgender people face and the importance of family and societal acceptance. (PsycINFO Database Record PMID:27380151

  20. Online Sensor Calibration Assessment in Nuclear Power Systems

    International Nuclear Information System (INIS)

    Safe, efficient, and economic operation of nuclear systems (nuclear power plants, fuel fabrication and storage, used fuel processing, etc.) relies on transmission of accurate and reliable measurements. During operation, sensors degrade due to age, environmental exposure, and maintenance interventions. Sensor degradation can affect the measured and transmitted signals, including sensor failure, signal drift, sensor response time, etc. Currently, periodic sensor recalibration is performed to avoid these problems. Sensor recalibration activities include both calibration assessment and adjustment (if necessary). In nuclear power plants, periodic recalibration of safety-related sensors is required by the plant technical specifications. Recalibration typically occurs during refueling outages (about every 18 to 24 months). Non-safety-related sensors also undergo recalibration, though not as frequently. However, this approach to maintaining sensor calibration and performance is time-consuming and expensive, leading to unnecessary maintenance, increased radiation exposure to maintenance personnel, and potential damage to sensors. Online monitoring (OLM) of sensor performance is a non-invasive approach to assess instrument calibration. OLM can mitigate many of the limitations of the current periodic recalibration practice by providing more frequent assessment of calibration and identifying those sensors that are operating outside of calibration tolerance limits without removing sensors or interrupting operation. This can support extended operating intervals for unfaulted sensors and target recalibration efforts to only degraded sensors

  1. Real-time determination of total radiated power by bolometric cameras with statistical methods

    International Nuclear Information System (INIS)

    A simpler and faster method for determining the total radiated power emitted from a tokamak plasma in real-time has been developed. This quantity is normally calculated after the discharge by a deconvolution of line integrals from a bolometer camera. This time-consuming algorithm assumes constant emissivity on closed flux surfaces and therefore needs the exact magnetic equilibrium information. Thus, it is highly desirable to have a different, simpler way to determine the total radiated power in real-time without additional magnetic equilibrium information. The real-time calculation of the total radiated power is done by a summation over ten or 18 lines of sight selected out of a bolometer camera with 40 channels. The number of channels is restricted by the summation hardware. A new selection scheme, which uses a singular value decomposition, has been developed to select the required subset of line integrals from the camera. With this subset, a linear regression analysis was done against the radiated power calculated by the conventional algorithm. The selected channels are finally used with the regression coefficients as weighting factors to determine an estimation of the radiated power for subsequent discharges. This selection and the corresponding weighting factors can only be applied to discharges with a similar plasma shape, e.g., in our case the typical ASDEX upgrade elliptical divertor plasma. copyright 1998 American Institute of Physics
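    A rough sketch of the described pipeline (select a channel subset, fit regression weights against the slow reference calculation, reuse the weights in real time) might look like the following; the data are synthetic, and the SVD-based ranking is one plausible reading of the selection scheme rather than a reproduction of it.

```python
import numpy as np

# Synthetic stand-in: X holds line-integral signals from 40 bolometer channels
# over past discharges, y the total radiated power from the full (slow)
# deconvolution.  The channel count of 10 matches the summation-hardware limit
# mentioned in the abstract; the selection rule itself is only illustrative.
rng = np.random.default_rng(4)
n_samples, n_channels, n_keep = 5000, 40, 10
X = rng.normal(size=(n_samples, n_channels))
y = X @ rng.normal(size=n_channels) + rng.normal(scale=0.1, size=n_samples)

# Rank channels by their weight in the leading right-singular vectors.
_, s, vt = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)
importance = ((s[:5, None] * vt[:5]) ** 2).sum(axis=0)
selected = np.sort(np.argsort(importance)[::-1][:n_keep])

# Regression weights for the selected channels, to be reused in real time
# on later discharges with a similar plasma shape.
w, *_ = np.linalg.lstsq(X[:, selected], y, rcond=None)
print("selected channels:", selected)
print("residual std of the fast estimate:", float((y - X[:, selected] @ w).std()))
```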

  2. Statistical Characterization of Solar Photovoltaic Power Variability at Small Timescales: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Shedd, S.; Hodge, B.-M.; Florita, A.; Orwig, K.

    2012-08-01

    Integrating large amounts of variable and uncertain solar photovoltaic power into the electricity grid is a growing concern for power system operators in a number of different regions. Power system operators typically accommodate variability, whether from load, wind, or solar, by carrying reserves that can quickly change their output to match the changes in the solar resource. At timescales in the seconds-to-minutes range, this is known as regulation reserve. Previous studies have shown that increasing the geographic diversity of solar resources can reduce the short-term variability of the power output. As the price of solar has decreased, the emergence of very large PV plants (greater than 10 MW) has become more common. These plants present an interesting case because they are large enough to exhibit some spatial smoothing by themselves. This work examines the variability of solar PV output among different arrays in a large (~50 MW) PV plant in the western United States, including the correlation in power output changes between different arrays, as well as the aggregated plant output, at timescales ranging from one second to five minutes.
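    The kind of timescale-dependent variability analysis described can be sketched by computing ramp statistics of per-array and aggregated plant output over different intervals; the synthetic series below (a shared slow component plus array-specific noise) merely illustrates the spatial-smoothing effect and is not the plant data.

```python
import numpy as np

# Synthetic stand-in for 1-second output of several arrays within one plant:
# a shared, slowly varying component (plant-wide irradiance) plus
# array-specific fluctuations.
rng = np.random.default_rng(3)
n_seconds, n_arrays = 3600, 8
shared = rng.normal(0.0, 0.2, n_seconds).cumsum() * 0.01
power = 5.0 + shared[:, None] + rng.normal(0.0, 0.3, (n_seconds, n_arrays))  # MW per array

def ramp_std(p, step):
    """Standard deviation of power changes over `step`-second intervals."""
    return np.std(p[step:] - p[:-step], axis=0)

for step in (1, 60, 300):                      # 1 s, 1 min, 5 min timescales
    per_array = ramp_std(power, step)          # one value per array
    plant = ramp_std(power.sum(axis=1), step)  # aggregated plant output
    print(f"{step:4d} s: plant ramp std = {plant:.2f} MW, "
          f"sum of array ramp stds = {per_array.sum():.2f} MW")
```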

  3. Statistical analysis of the behaviour of the mechanical equipment of EDF's power plants - evaluation of the availability and safety of thermal and nuclear units

    International Nuclear Information System (INIS)

    The investigation and research directorate of EDF has undertaken a statistical analysis of the behaviour of large mechanical equipment at conventional power stations over ten years, based on the operating reports of these stations. It has thus been possible to determine the intrinsic reliability, the failure rate, the mean repair time, and the mean good operating time of feed water reheating points, power turbines, pumps and boilers of the various EDF plants (125 and 250 MW), leading to a consideration of the feasibility of an extrapolation to present and future plants. Based on these elementary investigations, two methods of calculation have been developed. One is used to assess the overall availability of a thermal or nuclear power station based on knowledge of the failure rates of the equipment, each piece of equipment being weighted by its technical importance to the functioning of the plant. A numerical application is given for 125 and 250 MW conventional plants. The purpose of the other method is to estimate the operational safety of the safety equipment of nuclear power stations, based on the development of tree diagrams for faults in basic equipment. A numerical example is given for the cooling systems of Phenix and of one of the Super Phenix versions. (author)

  4. Developing new methodology for nuclear power plants vulnerability assessment

    International Nuclear Information System (INIS)

    Research highlights: → Paper presents a new methodology for vulnerability assessment of nuclear power plants. → First universal quantitative risk assessment model for terrorist attack on NPPs. → New model enhances security, reliability and safe operation of all energy infrastructure. → Significant research benefits: increased NPP security, reliability and availability. → Useful new tool for PRA application to evaluation of terrorist threats on NPPs. - Abstract: The fundamental aim of an efficient regulatory emergency preparedness and response system is to provide sustained emergency readiness and to prevent emergency situations and accidents. But when an event occurs, the regulatory mission is to mitigate consequences and to protect people and the environment against nuclear and radiological damage. The regulatory emergency response system, which would be activated in the case of a nuclear and/or radiological emergency and a release of radioactivity to the environment, is an important element of a comprehensive national regulatory system of nuclear and radiation safety. In the past, national emergency systems explicitly did not include vulnerability assessments of the critical nuclear infrastructure as an important part of a comprehensive preparedness framework. But after the terrorist attacks of 11 September 2001, decision-makers became aware that critical nuclear infrastructure could also be an attractive target of terrorism, with the purpose of using the physical and radioactive properties of the nuclear material to cause mass casualties, property damage, and detrimental economic and/or environmental impacts. The necessity to evaluate critical nuclear infrastructure vulnerability to threats like human errors, terrorist attacks and natural disasters, as well as the preparation of emergency response plans with estimation of optimized costs, is of vital importance for the assurance of safe nuclear facility operation and national security. Such a new methodology is presented in this paper

  5. Assessment of environmental external effects in power generation

    International Nuclear Information System (INIS)

    This report summarises some of the results achieved in a project carried out in Denmark in 1994 concerning externalities. The main objective was to identify, quantify and - if possible - monetize the external effects in the production of energy, especially in relation to renewable technologies. The report compares environmental externalities in the production of energy using renewable and non-renewable energy sources, respectively. The comparison is demonstrated on two specific case studies. The first case is the production of electricity based on wind power plants compared to the production of electricity based on a coal-fired conventional plant. In the second case heat/power generation by means of a combined heat and power plant based on biomass-generated gas is compared to that of a combined heat and power plant fuelled by natural gas. In the report the individual externalities from the different ways of producing energy are identified, the stress caused by the effect is assessed, and finally the monetary value of the damage is estimated. The method is applied to the local as well as the regional and global externalities. (au) 8 tabs., 7 ills., 4 refs

  6. Quadrennial Technology Review 2015: Technology Assessments--Wind Power

    Energy Technology Data Exchange (ETDEWEB)

    none,

    2015-10-07

    Wind power has become a mainstream power source in the U.S. electricity portfolio, supplying 4.9% of the nation’s electricity demand in 2014. With more than 65 GW installed across 39 states at the end of 2014, utility-scale wind power is a cost-effective source of low-emissions power generation throughout much of the nation. The United States has significant sustainable land-based and offshore wind resource potential, greater than 10 times current total U.S. electricity consumption. A technical wind resource assessment conducted by the Department of Energy (DOE) in 2009 estimated that the land-based wind energy potential for the contiguous United States is equivalent to 10,500 GW of capacity at an 80 meter (m) hub height and 12,000 GW at a 100 m hub height, assuming a capacity factor of at least 30%. A subsequent 2010 DOE report estimated the technical offshore wind energy potential to be 4,150 GW. The estimate was calculated from the total offshore area within 50 nautical miles of shore in areas where average annual wind speeds are at least 7 m per second at a hub height of 90 m.

  7. Market assessment of photovoltaic power systems for agricultural applications worldwide

    Science.gov (United States)

    Cabraal, A.; Delasanta, D.; Rosen, J.; Nolfi, J.; Ulmer, R.

    1981-11-01

    Agricultural sector PV market assessments conducted in the Philippines, Nigeria, Mexico, Morocco, and Colombia are extrapolated worldwide. The types of applications evaluated are those requiring less than 15 kW of power and operating in a stand-alone mode. The major conclusions were as follows: PV will be competitive in applications requiring 2 to 3 kW of power prior to 1983; by 1986 PV system competitiveness will extend to applications requiring 4 to 6 kW of power; due to capital constraints, the private sector market may be restricted to applications requiring less than about 2 kW of power; the ultimate purchasers of larger systems will be governments, either through direct purchase or loans from development banks. Though fragmented, a significant agriculture sector market for PV exists; however, the market for PV in telecommunications, signalling, rural services, and TV will be larger. Major market-related factors influencing the potential for U.S. PV sales are: lack of awareness; high first costs; shortage of long-term capital; competition from German, French and Japanese companies who have government support; and low fuel prices in capital-surplus countries. Strategies that may aid in overcoming some of these problems are: setting up a trade association aimed at overcoming problems due to lack of awareness, innovative financing schemes such as lease arrangements, and designing products to match current user needs as opposed to attempting to change consumer behavior.

  8. An aging assessment of transformers in 1E power systems

    International Nuclear Information System (INIS)

    This paper presents the findings of the Idaho National Engineering Laboratory (INEL) Phase 1 study of the effects of age on nuclear power plant Class 1E power transformers, the significance of transformer aging on plant safety, and the capabilities to mitigate the effects of transformer aging to prevent risk significant failures. The following areas were included in the INEL study: The characteristics of Class 1E power transformers used in nuclear power plants were determined. All known transformer stressors were identified, their effect on transformer materials determined, and the effects of these stressors with time assessed. On- and off-line techniques to detect the degradation of all types of transformers were identified and their effectiveness examined. The information provided in reactor operation data bases was evaluated. The capability of various recommendations to mitigate the effects of transformer aging and prevent unexpected transformer failures was evaluated. The risk significance of transformer aging was determined. Conclusions have been made on the effects of aging on transformer performance, the capability to prevent/detect/mitigate aging effects prior to transformer failure, and the effectiveness of current maintenance and monitoring methods

  10. Performance assessment of topologically diverse power systems subjected to hurricane events

    International Nuclear Information System (INIS)

    Large tropical cyclones cause severe damage to major cities along the United States Gulf Coast annually. A diverse collection of engineering and statistical models are currently used to estimate the geographical distribution of power outage probabilities stemming from these hurricanes to aid in storm preparedness and recovery efforts. Graph theoretic studies of power networks have separately attempted to link abstract network topology to transmission and distribution system reliability. However, few works have employed both techniques to unravel the intimate connection between network damage arising from storms, topology, and system reliability. This investigation presents a new methodology combining hurricane damage predictions and topological assessment to characterize the impact of hurricanes upon power system reliability. Component fragility models are applied to predict failure probability for individual transmission and distribution power network elements simultaneously. The damage model is calibrated using power network component failure data for Harris County, TX, USA caused by Hurricane Ike in September of 2008, resulting in a mean outage prediction error of 15.59% and low standard deviation. Simulated hurricane events are then applied to measure the hurricane reliability of three topologically distinct transmission networks. The rate of system performance decline is shown to depend on their topological structure. Reliability is found to correlate directly with topological features, such as network meshedness, centrality, and clustering, and the compact irregular ring mesh topology is identified as particularly favorable, which can influence regional lifeline policy for retrofit and hardening activities to withstand hurricane events.
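
    The study combines component fragility models with graph-theoretic indicators such as meshedness, clustering and centrality. The sketch below, using networkx, shows how such indicators and a simple Monte Carlo storm simulation over per-component failure probabilities might be computed; the graph, the fragility probabilities and the surviving-giant-component metric are illustrative assumptions rather than the paper's calibrated model.

```python
import networkx as nx
import numpy as np

def topology_indicators(G):
    """Simple topological features of a transmission network graph G."""
    n, m = G.number_of_nodes(), G.number_of_edges()
    return {
        # Meshedness: ratio of independent loops to the maximum possible in a planar graph.
        "meshedness": (m - n + 1) / (2 * n - 5) if n > 2 else 0.0,
        "avg_clustering": nx.average_clustering(G),
        "avg_degree_centrality": float(np.mean(list(nx.degree_centrality(G).values()))),
    }

def simulate_storm(G, failure_prob, n_trials=1000, rng=None):
    """Monte Carlo estimate of the surviving giant-component fraction when each
    line fails independently with its fragility-model probability.
    failure_prob : dict mapping each edge (u, v) to a failure probability."""
    rng = rng or np.random.default_rng()
    fractions = []
    for _ in range(n_trials):
        failed = [e for e in G.edges if rng.random() < failure_prob[e]]
        H = G.copy()
        H.remove_edges_from(failed)
        giant = max(nx.connected_components(H), key=len)
        fractions.append(len(giant) / G.number_of_nodes())
    return float(np.mean(fractions))
```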

  11. OVERVIEW OF ENVIRONMENTAL ASSESSMENT FOR CHINA NUCLEAR POWER INDUSTRY AND COAL-FIRED POWER INDUSTRY

    Institute of Scientific and Technical Information of China (English)

    张少华; 潘自强; et al.

    1994-01-01

    A quantitative environmental assessment method and the corresponding computer code are introduced in this paper. By considering all fuel cycle steps, it gives that the public health risk of the China nuclear power industry is 5.2×10⁻¹ man/(GW·a), the occupational health risk is 2.5 man/(GW·a), and the total health risk is 3.0 man/(GW·a). After the health risk calculation for coal mining, transport, burning up and ash disposal, it gives that the public health risk of the China coal-fired power industry is 3.6 man/(GW·a), the occupational health risk is 50 man/(GW·a), and the total is 54 man/(GW·a). Accordingly, the conclusion that the China nuclear power industry is an industry with high safety and cleanness is derived at the end.

  12. Overview of environmental assessment for China nuclear power industry and coal-fired power industry

    International Nuclear Information System (INIS)

    A quantitative environmental assessment method and the corresponding computer code are introduced. By considering all fuel cycle steps, it gives that the public health risk of the China nuclear power industry is 5.2×10⁻¹ man/(GW·a), the occupational health risk is 2.5 man/(GW·a), and the total health risk is 3.0 man/(GW·a). After the health risk calculation for coal mining, transport, burning up and ash disposal, it gives that the public health risk of the China coal-fired power industry is 3.6 man/(GW·a), the occupational health risk is 50 man/(GW·a), and the total is 54 man/(GW·a). Accordingly, the conclusion that the China nuclear power industry is one with high safety and cleanness is derived at the end

  13. Climatic change of summer temperature and precipitation in the Alpine region - a statistical-dynamical assessment

    Energy Technology Data Exchange (ETDEWEB)

    Heimann, D.; Sept, V.

    1998-12-01

    Climatic changes in the Alpine region due to increasing greenhouse gas concentrations are assessed by using statistical-dynamical downscaling. The downscaling procedure is applied to two 30-year periods (1971-2000 and 2071-2100, summer months only) of the output of a transient coupled ocean/atmosphere climate scenario simulation. The downscaling results for the present-day climate are in sufficient agreement with observations. The estimated regional climate change during the next 100 years shows a general warming. The mean summer temperatures increase by about 3 to more than 5 Kelvin. The most intense climatic warming is predicted in the western parts of the Alps. The amount of summer precipitation decreases in most parts of central Europe by more than 20 percent. Only over the Adriatic area and parts of eastern central Europe is an increase in precipitation simulated. The results are compared with observed trends and results of regional climate change simulations by other authors. The observed trends and the majority of the simulated trends agree with our results. However, there are also climate change estimates which completely contradict ours. (orig.) 29 refs.

  14. Using Statistical and Probabilistic Methods to Evaluate Health Risk Assessment: A Case Study

    Directory of Open Access Journals (Sweden)

    Hongjing Wu

    2014-06-01

    The toxic chemicals and heavy metals within wastewater can cause serious adverse impacts on human health. Health risk assessment (HRA) is an effective tool for supporting decision-making and corrective actions in water quality management. HRA can also help people understand the water quality and quantify the adverse effects of pollutants on human health. Due to the imprecision of data, measurement error and limited available information, uncertainty is inevitable in the HRA process. The purpose of this study is to integrate statistical and probabilistic methods to deal with censored and limited numbers of input data to improve the reliability of the non-cancer HRA of dermal contact exposure to contaminated river water by considering uncertainty. A case study in the Kelligrews River in St. John’s, Canada, was conducted to demonstrate the feasibility and capacity of the proposed approach. Five heavy metals were selected to evaluate the risk level, including arsenic, molybdenum, zinc, uranium and manganese. The results showed that the probability of the total hazard index of dermal exposure exceeding 1 is very low, and there is no obvious evidence of risk in the study area.
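
    A minimal sketch of a probabilistic non-cancer dermal assessment in the spirit of the study: draw water concentrations from fitted distributions, compute a dermally absorbed dose with a standard EPA-style equation, divide by the reference dose to get hazard quotients, and report the probability that the summed hazard index exceeds 1. All concentrations, exposure factors and reference doses below are illustrative assumptions, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # Monte Carlo draws

# Illustrative water concentrations (mg/L), lognormal to mimic censored/skewed data.
metals = {
    # name: (median Cw, geometric std, dermal permeability Kp cm/h, reference dose RfD mg/kg-day)
    "arsenic":   (0.002, 2.0, 1.0e-3, 3.0e-4),
    "manganese": (0.050, 2.5, 1.0e-3, 9.6e-4),
    "zinc":      (0.030, 2.0, 6.0e-4, 6.0e-2),
}

# Exposure factors (illustrative point values).
SA, ET, EF, ED = 5700.0, 0.5, 45.0, 30.0   # skin area cm2, h/event, events/yr, years
BW, AT, CF = 70.0, 30 * 365.0, 1.0e-3      # body weight kg, averaging days, L/cm3

hazard_index = np.zeros(N)
for name, (cw_med, gsd, kp, rfd) in metals.items():
    cw = rng.lognormal(mean=np.log(cw_med), sigma=np.log(gsd), size=N)
    # Dermally absorbed dose (mg/kg-day), standard EPA-style formulation.
    dad = cw * kp * SA * ET * EF * ED * CF / (BW * AT)
    hazard_index += dad / rfd   # hazard quotient per metal

print("P(HI > 1) =", np.mean(hazard_index > 1.0))
```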

  15. Assessment of Reservoir Water Quality Using Multivariate Statistical Techniques: A Case Study of Qiandao Lake, China

    Directory of Open Access Journals (Sweden)

    Qing Gu

    2016-03-01

    Qiandao Lake (Xin’an Jiang reservoir) plays a significant role in drinking water supply for eastern China, and it is an attractive tourist destination. Three multivariate statistical methods were comprehensively applied to assess the spatial and temporal variations in water quality as well as potential pollution sources in Qiandao Lake. Data sets of nine parameters from 12 monitoring sites during 2010–2013 were obtained for analysis. Cluster analysis (CA) was applied to classify the 12 sampling sites into three groups (Groups A, B and C) and the 12 monitoring months into two clusters (April-July, and the remaining months). Discriminant analysis (DA) identified Secchi disc depth, dissolved oxygen, permanganate index and total phosphorus as the significant variables for distinguishing variations of different years, with 79.9% correct assignments. Dissolved oxygen, pH and chlorophyll-a were determined to discriminate between the two sampling periods classified by CA, with 87.8% correct assignments. For spatial variation, DA identified Secchi disc depth and ammonia nitrogen as the significant discriminating parameters, with 81.6% correct assignments. Principal component analysis (PCA) identified organic pollution, nutrient pollution, domestic sewage, and agricultural and surface runoff as the primary pollution sources, explaining 84.58%, 81.61% and 78.68% of the total variance in Groups A, B and C, respectively. These results demonstrate the effectiveness of the integrated use of CA, DA and PCA for reservoir water quality evaluation and could assist managers in improving water resources management.
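
    A compact scikit-learn sketch of the three multivariate steps (CA, DA, PCA) on a samples-by-parameters water quality matrix. The clustering algorithm, the number of groups, and the demo data are illustrative assumptions; the study's exact settings are not reproduced.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import AgglomerativeClustering
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.decomposition import PCA

def ca_da_pca(X, n_site_groups=3, n_components=3):
    """X : (n_samples, n_parameters) matrix of water-quality measurements."""
    Xs = StandardScaler().fit_transform(X)

    # Cluster analysis (CA): hierarchical clustering of monitoring samples.
    groups = AgglomerativeClustering(n_clusters=n_site_groups).fit_predict(Xs)

    # Discriminant analysis (DA): which parameters best separate the CA groups,
    # and how many samples are correctly assigned.
    da = LinearDiscriminantAnalysis().fit(Xs, groups)
    correct_assignments = da.score(Xs, groups)

    # Principal component analysis (PCA): candidate pollution-source factors.
    pca = PCA(n_components=n_components).fit(Xs)
    explained = pca.explained_variance_ratio_.sum()

    return groups, correct_assignments, explained

# Example with random stand-in data (12 sites x 12 months, 9 parameters).
X_demo = np.random.default_rng(1).normal(size=(144, 9))
print(ca_da_pca(X_demo))
```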

  16. Statistical assessment of the 137Cs levels of the Chernihiv oblast's milk

    International Nuclear Information System (INIS)

    The article deals with research directed at overcoming the consequences of the Chornobyl accident on the territory of Ukraine. Results are considered of the use of the log-normal distribution law to evaluate results of 137Cs milk contamination. Critical farms of Chernihiv oblast, where agreement criteria for assessing the primary data on milk contamination were applied, became the object of the study. An algorithm was applied to calculate factual and forecast repetitions of gradations according to the stages of statistical processing of milk samples contaminated with 137Cs. Results of the milk contamination analysis at a later stage (1991-2001) are described by the log-normal distribution law, which can be used for forecasts for the subsequent years. The maximum repeatability of the gradations of the contaminated milk (from 10 to 40 Bq/l) is determined for factual and forecast frequencies of the levels of contamination thereof. The results of the study are proposed to be used while taking measures directed at diminishing the levels of contamination of agricultural products with 137Cs
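
    A small SciPy sketch of the underlying calculation: fit a log-normal distribution to measured 137Cs milk activities and compare observed with predicted repetitions per contamination gradation. The sample activities and bin edges are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Illustrative 137Cs activities in milk (Bq/l); real data would come from farm sampling.
activity = rng.lognormal(mean=np.log(25.0), sigma=0.6, size=400)

# Fit a two-parameter lognormal (location fixed at zero, as for activity data).
shape, loc, scale = stats.lognorm.fit(activity, floc=0)

# Observed vs predicted repetitions per contamination gradation (Bq/l).
edges = np.array([0, 10, 20, 40, 80, 160, np.inf])
observed, _ = np.histogram(activity, bins=edges)
cdf = stats.lognorm.cdf(edges, shape, loc=loc, scale=scale)
predicted = np.diff(cdf) * activity.size

for lo, hi, o, p in zip(edges[:-1], edges[1:], observed, predicted):
    print(f"{lo:>5.0f}-{hi:<5.0f} Bq/l  observed {o:4d}  predicted {p:6.1f}")
```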

  17. POWER LOSSES ASSESSMENT IN TRANSFORMERS AFTER THE NORMATIVE OPERATING PERIOD

    Directory of Open Access Journals (Sweden)

    M. I. Fursanov

    2015-01-01

    The load and no-load power losses are key parameters characterizing the operating effectiveness of distribution-network customers' transformers. Precise determination of these values facilitates a well-founded choice of optimization measures. The topic is all the more relevant because modern electric grids contain many oil transformers whose time in service considerably exceeds the statutory 25 years. Under conditions of continued operation, measuring power losses according to the standard operating guidelines is not always possible. The authors present an improved power-loss assessment technique based on the currently accepted thermal model of the oil transformer. They indicate the deficiencies of the existing technique and substantiate changes in the practical application of the mathematical model. The article emphasizes the peculiarities of temperature changes in the oil transformer and offers a prototype open-architecture device for implementing the improved power-loss measurement technique. The paper describes the device's design features and functionality and presents its outline schematic. The authors note that, in addition to assessing power losses, the device can transmit the obtained information to the dispatcher via a GSM connection to simplify transformer status monitoring, and that it can be integrated into the system of transformer thermal protection. The practical merit and scope of application of the results lie in the development and choice of optimization measures in distribution grids, e.g. transformer replacement.

  18. Assessment of control rooms of nuclear power plants

    International Nuclear Information System (INIS)

    To identify and correct shortcomings in the control rooms of operating power plants and plants under construction, an extensive program has been started in the USA. In Finland, as in other countries using nuclear power, developments in the USA, particularly with regard to the requirements imposed on nuclear power plants, are carefully followed. Changes in these requirements are sooner or later also reflected in the guidelines given by the Finnish authorities. It is therefore important to be able to form a notion of how the new requirements apply to Finnish conditions. In particular, it is important to review the latest assessment guidelines for control room implementation (NUREG-0700), so that possible over-hasty conclusions can be avoided. The aim of the analysis of the method and experiments presented in the NUREG-0700 report was to create a basis for assessing the suitability of the method for Finnish control room implementation. The task group has made a general methodological analysis of the method and has partly tried it in an assessment of the TVO2 control room. It is obvious that direct conclusions from the American situation are misleading, and following the American requirements as such must be considered unfeasible, because they can lead to unwanted results. If the review is limited to control room details, the NRC program (checklist) can be considered successful; it can also be used during planning for the observation of small discrepancies. However, the applicability of some requirements can be questioned. More essentially, the control room as a whole has not been addressed or standardized in this or several other programs. In spite of the difficulties, we should try to reach this most important goal. (author)

  19. Assessment of tritium breeding requirements for fusion power reactors

    International Nuclear Information System (INIS)

    This report presents an assessment of tritium-breeding requirements for fusion power reactors. The analysis is based on an evaluation of time-dependent tritium inventories in the reactor system. The method presented can be applied to any fusion system operating in a steady-state mode as well as in a pulsed mode. As an example, the UWMAK-I design was analyzed, and it has been found that the startup inventory requirement calculated by the present method differs significantly from those previously calculated. The effect of reactor-parameter changes on the required tritium breeding ratio is also analyzed for a variety of reactor operation scenarios

  20. Planck 2013 results. XXI. All-sky Compton parameter power spectrum and high-order statistics

    CERN Document Server

    Ade, P.A.R.; Armitage-Caplan, C.; Arnaud, M.; Ashdown, M.; Atrio-Barandela, F.; Aumont, J.; Baccigalupi, C.; Banday, A.J.; Barreiro, R.B.; Bartlett, J.G.; Battaner, E.; Benabed, K.; Benoit, A.; Benoit-Levy, A.; Bernard, J.P.; Bersanelli, M.; Bielewicz, P.; Bobin, J.; Bock, J.J.; Bonaldi, A.; Bond, J.R.; Borrill, J.; Bouchet, F.R.; Bridges, M.; Bucher, M.; Burigana, C.; Butler, R.C.; Cardoso, J.F.; Carvalho, P.; Catalano, A.; Challinor, A.; Chamballu, A.; Chiang, L.Y.; Chiang, H.C.; Christensen, P.R.; Church, S.; Clements, D.L.; Colombi, S.; Colombo, L.P.L.; Comis, B.; Couchot, F.; Coulais, A.; Crill, B.P.; Curto, A.; Cuttaia, F.; Da Silva, A.; Danese, L.; Davies, R.D.; Davis, R.J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Delouis, J.M.; Desert, F.X.; Dickinson, C.; Diego, J.M.; Dolag, K.; Dole, H.; Donzelli, S.; Dore, O.; Douspis, M.; Dupac, X.; Efstathiou, G.; Ensslin, T.A.; Eriksen, H.K.; Finelli, F.; Flores-Cacho, I.; Forni, O.; Frailis, M.; Franceschi, E.; Galeotta, S.; Ganga, K.; Genova-Santos, R.T.; Giard, M.; Giardino, G.; Giraud-Heraud, Y.; Gonzalez-Nuevo, J.; Gorski, K.M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Hansen, F.K.; Hanson, D.; Harrison, D.; Henrot-Versille, S.; Hernandez-Monteagudo, C.; Herranz, D.; Hildebrandt, S.R.; Hivon, E.; Hobson, M.; Holmes, W.A.; Hornstrup, A.; Hovest, W.; Huffenberger, K.M.; Hurier, G.; Jaffe, T.R.; Jaffe, A.H.; Jones, W.C.; Juvela, M.; Keihanen, E.; Keskitalo, R.; Kisner, T.S.; Kneissl, R.; Knoche, J.; Knox, L.; Kunz, M.; Kurki-Suonio, H.; Lacasa, F.; Lagache, G.; Lahteenmaki, A.; Lamarre, J.M.; Lasenby, A.; Laureijs, R.J.; Lawrence, C.R.; Leahy, J.P.; Leonardi, R.; Leon-Tavares, J.; Lesgourgues, J.; Liguori, M.; Lilje, P.B.; Linden-Vornle, M.; Lopez-Caniego, M.; Lubin, P.M.; Macias-Perez, J.F.; Maffei, B.; Maino, D.; Mandolesi, N.; Marcos-Caballero, A.; Maris, M.; Marshall, D.J.; Martin, P.G.; Martinez-Gonzalez, E.; Masi, S.; Matarrese, S.; Matthai, F.; Mazzotta, P.; Melchiorri, A.; Melin, J.B.; Mendes, L.; Mennella, A.; Migliaccio, M.; Mitra, S.; Miville-Deschenes, M.A.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C.B.; Norgaard-Nielsen, H.U.; Noviello, F.; Novikov, D.; Novikov, I.; Osborne, S.; Oxborrow, C.A.; Paci, F.; Pagano, L.; Pajot, F.; Paoletti, D.; Partridge, B.; Pasian, F.; Patanchon, G.; Perdereau, O.; Perotto, L.; Perrotta, F.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Popa, L.; Poutanen, T.; Pratt, G.W.; Prezeau, G.; Prunet, S.; Puget, J.L.; Rachen, J.P.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Ricciardi, S.; Riller, T.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Rossetti, M.; Roudier, G.; Rubino-Martin, J.A.; Rusholme, B.; Sandri, M.; Santos, D.; Savini, G.; Scott, D.; Seiffert, M.D.; Shellard, E.P.S.; Spencer, L.D.; Starck, J.L.; Stolyarov, V.; Stompor, R.; Sudiwala, R.; Sunyaev, R.; Sureau, F.; Sutton, D.; Suur-Uski, A.S.; Sygnet, J.F.; Tauber, J.A.; Tavagnacco, D.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Tuovinen, J.; Umana, G.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Varis, J.; Vielva, P.; Villa, F.; Vittorio, N.; Wade, L.A.; Wandelt, B.D.; White, S.D.M.; Yvon, D.; Zacchei, A.; Zonca, A.

    2014-01-01

    We have constructed the first all-sky map of the thermal Sunyaev-Zeldovich (tSZ) effect by applying specifically tailored component separation algorithms to the 100 to 857 GHz frequency channel maps from the Planck survey. These maps show an obvious galaxy cluster tSZ signal that is well matched with blindly detected clusters in the Planck SZ catalogue. To characterize the signal in the tSZ map we have computed its angular power spectrum. At large angular scales ($\ell < 60$), the major foreground contaminant is the diffuse Galactic thermal dust emission, while at small angular scales ($\ell > 500$) the clustered Cosmic Infrared Background (CIB) and residual point sources are the major contaminants. These foregrounds are carefully modelled and subtracted. We measure the tSZ power spectrum in angular scales, $0.17^{\circ} \lesssim \theta \lesssim 3.0^{\circ}$, that were previously unexplored. The measured tSZ power spectrum is consistent with that expected from the Planck catalogue of SZ sources, with additional clear evidence of signal from unresolved clusters and, potentially, diffuse warm baryons. We use the tSZ power spectrum to ...

  1. Point Processes Modeling of Time Series Exhibiting Power-Law Statistics

    CERN Document Server

    Kaulakys, B; Gontis, V

    2010-01-01

    We consider stochastic point processes generating time series exhibiting power laws of spectrum and distribution density (Phys. Rev. E 71, 051105 (2005)) and apply them for modeling the trading activity in the financial markets and for the frequencies of word occurrences in the language.

  2. Semi-autonomous competency assessment of powered mobility device users.

    Science.gov (United States)

    Miro, Jaime Valls; Black, Ross; De Bruijn, Freek; Dissanayake, Dissanayake

    2011-01-01

    This paper describes a stand-alone sensor package and algorithms for aiding an occupational therapist in assessing whether a person has the capacity to safely and effectively operate a powered mobility device such as a walking aid or a wheelchair. The sensor package employed consists of a laser range finder, an RGB camera and an inertial measurement unit that can be attached to any mobility device with minimal modifications. Algorithms for capturing the data received by the sensor package and for generating the map of the environment as well as the trajectory of the mobility device have been developed. Such information presents occupational therapists with the capability to provide a quantitative assessment of whether patients are ready to be safely deployed with mobility aids for their daily activities. A preliminary evaluation of the sensor package and associated algorithms, based on experiments conducted at the premises of the Prince of Wales Hospital in Sydney, is presented. PMID:22275568

  3. Power plant system assessment. Final report. SP-100 Program

    International Nuclear Information System (INIS)

    The purpose of this assessment was to provide system-level insights into 100-kWe-class space reactor electric systems. Using these insights, Rockwell was to select and perform conceptual design studies on a "most attractive" system that met the preliminary design goals and requirements of the SP-100 Program. About 4 of the 6 months were used in the selection process. The remaining 2 months were used for the system conceptual design studies. Rockwell completed these studies at the end of FY 1983. This report summarizes the results of the power plant system assessment and describes our choice for the most attractive system - the Rockwell SR-100G System (Space Reactor, 100 kWe, Growth) - a lithium-cooled UN-fueled fast reactor/Brayton turboelectric converter system

  4. Security assessment for intentional island operation in modern power system

    DEFF Research Database (Denmark)

    Chen, Yu; Xu, Zhao; Østergaard, Jacob

    2011-01-01

    In case of an emergency in the power system, some distribution networks may be intentionally separated from the main grid to avoid complete system collapse. If DGs in those networks could continuously run instead of immediately being shut down, the blackout could be avoided and the reliability of supply could be increased. However, when to island, or how to ensure the islanded systems can survive the islanding transition, is uncertain. This article proposes an Islanding Security Region (ISR) concept to provide security assessment of island operation. By comparing the system operating state with the ISR, the system operator can clearly know if it is suitable to conduct island operation at one specific moment. Besides, in order to improve the computation efficiency, the Artificial Neural Network (ANN) is applied for fast ISR formation. Thus, online application of ISR-based islanding security assessment could be achieved.

  5. Safety assessment of nuclear power plants equipped with VVER reactors

    International Nuclear Information System (INIS)

    The safety studies of the nuclear generating units of Greifswald and Rheinsberg have produced important basic findings for the assessment of nuclear power plants equipped with VVER reactors. Deficits in engineered safeguards design have been found in the three lines, i.e. VVER-440/V-230, VVER-440/V-213, and VVER-1000. The oldest line is thought to be beyond backfitting, while design deficits in the two other lines can largely be corrected by backfitting measures. In Eastern Europe, safety studies are conducted by GRS and IPSN in close cooperation with national authorities. This is demonstrated by the safety assessment of units 1 to 4 on the Kosloduj site. Studies performed on a national level are supported by the German Federal Ministry for the Environment and, internationally, by the European Community. (orig.)

  6. A Framework for Assessing the Commercialization of Photovoltaic Power Generation

    Science.gov (United States)

    Yaqub, Mahdi

    An effective framework does not currently exist with which to assess the viability of commercializing photovoltaic (PV) power generation in the US energy market. Adopting a new technology, such as utility-scale PV power generation, requires a commercialization assessment framework. The framework developed here assesses the economic viability of a set of alternatives of identified factors. Economic viability focuses on simulating the levelized cost of electricity (LCOE) as a key performance measure to realize `grid parity', or the equivalence between PV electricity prices and grid electricity prices for established energy technologies. Simulation results confirm that `grid parity' could be achieved without the current federal 30% investment tax credit (ITC) via a combination of three strategies: 1) using economies of scale to reduce the LCOE by 30% from its current value of 3.6 cents/kWh to 2.5 cents/kWh, 2) employing a longer power purchase agreement (PPA) over 30 years at a 4% interest rate, and 3) improving by 15% the "capacity factor", which is the ratio of the total annual generated energy to the full potential annual generation when the utility is continuously operating at its rated output. The lower-than-commercial-market interest rate of 4% that is needed to realize `grid parity' is intended to replace the current federal 30% ITC subsidy, which does not have a cash inflow to offset the outflow of subsidy payments. The 4% interest rate can be realized through two proposed finance plans: The first plan involves the implementation of carbon fees on polluting power plants to produce the capital needed to lower the utility PPA loan term interest rate from its current 7% to the necessary 4% rate. The second plan entails a proposed public debt finance plan. Under this plan, the US Government leverages its guarantee power to issue bonds and uses the proceeds to finance the construction and operation of PV power plants with a PPA loan with a 4% interest rate for a
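
    The analysis hinges on the levelized cost of electricity, i.e. discounted lifetime costs divided by discounted lifetime generation, and on how PPA term, interest rate and capacity factor move it. The sketch below implements that standard LCOE definition with illustrative plant numbers; it is not the dissertation's cost model or data.

```python
def lcoe(capex, annual_om, annual_energy_kwh, rate, years):
    """Levelized cost of electricity ($/kWh) = discounted lifetime costs
    divided by discounted lifetime generation (standard definition)."""
    disc = [(1 + rate) ** -t for t in range(1, years + 1)]
    costs = capex + sum(annual_om * d for d in disc)
    energy = sum(annual_energy_kwh * d for d in disc)
    return costs / energy

# Illustrative utility-scale PV plant: the PPA term, interest rate and capacity
# factor are the levers discussed above (longer term, lower rate, higher factor).
capacity_kw = 100_000
for rate, years, capacity_factor in [(0.07, 20, 0.20), (0.04, 30, 0.23)]:
    energy = capacity_kw * capacity_factor * 8760
    print(rate, years, capacity_factor,
          round(lcoe(capex=1.0e8, annual_om=1.5e6, annual_energy_kwh=energy,
                     rate=rate, years=years), 4), "$/kWh")
```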

  7. Intrinsic Variability and Field Statistics for the Vela Pulsar: 3. Two-Component Fits and Detailed Assessment of Stochastic Growth Theory

    OpenAIRE

    Cairns, Iver H.; Das, P; P A Robinson; Johnston, S

    2003-01-01

    The variability of the Vela pulsar (PSR B0833-45) corresponds to well-defined field statistics that vary with pulsar phase, ranging from Gaussian intensity statistics off-pulse to approximately power-law statistics in a transition region and then lognormal statistics on-pulse, excluding giant micropulses. These data are analyzed here in terms of two superposed wave populations, using a new calculation for the amplitude statistics of two vectorially-combined transverse fields. Detailed analyse...

  8. Assessment of statistical characteristics of point rainfall in the Onkaparinga catchment in South Australia

    OpenAIRE

    Rashid, M.M.; S. Beecham; Chowdhury, R

    2013-01-01

    Spatial and temporal variations in statistical characteristics of point rainfall are important for rainfall modelling. The main objective of this study was to investigate the statistical characteristics of point rainfall and to identify a probability distribution that can model the full spectrum of daily rainfall in the Onkaparinga catchment in South Australia. Daily rainfall data from 1960 to 2010 at thirteen rainfall stations were considered. Statistical moments and auto...

  9. Inferential, non-parametric statistics to assess the quality of probabilistic forecast systems

    OpenAIRE

    Maia, A.H.N.; Meinke, H.B.; Lennox, S; Stone, R.C.

    2007-01-01

    Many statistical forecast systems are available to interested users. To be useful for decision making, these systems must be based on evidence of underlying mechanisms. Once causal connections between the mechanism and its statistical manifestation have been firmly established, the forecasts must also provide some quantitative evidence of "quality." However, the quality of statistical climate forecast systems (forecast quality) is an ill-defined and frequently misunderstood property. Often, p...

  10. Cost/benefit assessment in electric power systems

    International Nuclear Information System (INIS)

    The basic function of a modern power system is to satisfy the system load requirements as economically as possible and with a reasonable assurance of continuity and quality. The question of what is reasonable can be examined in terms of the costs and the worth to the consumer associated with providing an adequate supply. The process of preparing reliability worth estimates based on customer cost-of-interruption data is presented. These data can be derived for a particular utility service area and are used to determine appropriate customer damage functions. These indicators can be used with the basic loss of energy expectation (LOEE) index to obtain a factor that can be utilized to relate the customer losses to the worth of electric service reliability. This factor is designated as the interrupted energy assessment rate (IEAR). The developed IEAR values can be utilized in both generating capacity and composite generation and transmission system assessment. Methods for using these estimates in power system optimization at the planning stages are described and examples are used to illustrate the procedures. 106 refs., 77 figs., 64 tabs

  11. Assessment of electrical equipment aging for nuclear power plant

    International Nuclear Information System (INIS)

    The electrical and instrumentation equipment, especially equipment whose parts are made of polymer materials, is gradually degraded by the thermal and radiation environment during normal operation, and the degradation is thought to progress rapidly when exposed to the environment of the design basis event (DBE). The integrity of the equipment is evaluated by the environmental qualification (EQ) test simulating the environment of normal operation and the DBE. The outcome of the project 'Assessment of Cable Aging for Nuclear Power Plants' (ACA, 2002-2008) indicated that it is important to expose the test specimens to both thermal and radiation conditions at the same time to simulate the aging in normal operation. The project 'Assessment of Electrical Equipment Aging for Nuclear Power Plants' (AEA) was initiated in FY2008 to apply the outcome of ACA to the other electrical and instrumentation equipment and to establish an advanced EQ test method that can appropriately simulate the environment in actual plants. In FY2010, aging characteristics of thermal aging and simultaneous aging were obtained for the epoxy resin of electrical penetrations and the O-ring of connectors, and the advanced EQ test guide for electrical penetrations was established. In addition, aging evaluation tests in accordance with the guide were conducted for electrical penetrations simulating operation of 40, 60, and 80 years. (author)

  12. Assessment of electrical equipment aging for nuclear power plant

    International Nuclear Information System (INIS)

    The electrical and instrumentation equipment, especially equipment whose parts are made of polymer materials, is gradually degraded by the thermal and radiation environment during normal operation, and the degradation is thought to progress rapidly when it is exposed to the environment of the design basis event (DBE). The integrity of the equipment is evaluated by the environmental qualification (EQ) test simulating the environment of normal operation and the DBE. The project 'Assessment of Cable Aging for Nuclear Power Plants' (ACA, 2002-2008) indicated the importance of applying simultaneous thermal and radiation aging for simulating the aging in normal operation. The project 'Assessment of Electrical Equipment Aging for Nuclear Power Plants' (AEA) was initiated in FY2008 to apply the outcome of ACA to the other electrical and instrumentation equipment and to establish an advanced EQ test method that can appropriately simulate the environment in actual plants. In FY2012, aging characteristics of thermal aging and simultaneous aging were obtained for the epoxy resin of electrical penetrations and the O-ring of connectors. Physical property measurements were carried out for the epoxy resin of electrical penetrations subjected to the type testing in FY2010. (author)

  13. Assessment of electrical equipment aging for nuclear power plant

    International Nuclear Information System (INIS)

    The electrical and instrumentation equipment, especially equipment whose parts are made of polymer materials, is gradually degraded by the thermal and radiation environment during normal operation, and the degradation is thought to progress rapidly when it is exposed to the environment of the design basis event (DBE). The integrity of the equipment is evaluated by the environmental qualification (EQ) test simulating the environment of normal operation and the DBE. The project 'Assessment of Cable Aging for Nuclear Power Plants' (ACA, 2002-2008) indicated the importance of the use of simultaneous thermal and radiation aging for simulating the aging in normal operation. The project 'Assessment of Electrical Equipment Aging for Nuclear Power Plants' (AEA) was initiated in FY2008 to apply the outcome of ACA to the other electrical and instrumentation equipment and to establish an advanced EQ test method that can appropriately simulate the environment in actual plants. In FY2011, aging characteristics of thermal aging and simultaneous aging were obtained for the epoxy resin of electrical penetrations and the O-ring of connectors, and outlines of the advanced EQ test guides for valve actuators, connectors, junction boxes, transmitters, thermometers, radiation detectors and electrical motors were established. (author)

  14. A statistical simulation model for field testing of non-target organisms in environmental risk assessment of genetically modified plants

    OpenAIRE

    Goedhart, P.W.; Voet, van der, E.; Baldacchino, F.; Arpaia, S.

    2014-01-01

    Genetic modification of plants may result in unintended effects causing potentially adverse effects on the environment. A comparative safety assessment is therefore required by authorities, such as the European Food Safety Authority, in which the genetically modified plant is compared with its conventional counterpart. Part of the environmental risk assessment is a comparative field experiment in which the effect on non-target organisms is compared. Statistical analysis of such trials come in...

  15. Use of Non-Parametric Statistical Method in Identifying Repetitive High Dose Jobs in a Nuclear Power Plant

    International Nuclear Information System (INIS)

    The cost-effective reduction of occupational radiation dose (ORD) at a nuclear power plant could not be achieved without going through an extensive analysis of the accumulated ORD data of existing plants. Through the data analysis, it is required to identify the jobs with repetitive high ORD at the nuclear power plant. In this study, the Percentile Rank Sum Method (PRSM) is proposed to identify repetitive high ORD jobs; it is based on non-parametric statistical theory. As a case study, the method is applied to ORD data of maintenance and repair jobs at Kori units 3 and 4, which are pressurized water reactors with 950 MWe capacity that have been operated since 1986 and 1987, respectively, in Korea. The results were verified and validated, and PRSM has been demonstrated to be an efficient method of analyzing the data.
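
    A plausible reading of a percentile-rank-sum screening is sketched below: rank each job's collective dose within each year (non-parametrically), sum the percentile ranks across years, and flag the jobs whose rank sum stays near the top. The job names, doses and cutoff are illustrative assumptions, not the authors' exact formulation.

```python
import pandas as pd

def repetitive_high_dose_jobs(dose, top_fraction=0.2):
    """dose : DataFrame of collective dose (person-mSv), rows = jobs, columns = years.
    Returns jobs ordered by their percentile-rank sum across years."""
    # Percentile rank of each job within each year (non-parametric, scale-free).
    pct_ranks = dose.rank(pct=True)
    rank_sum = pct_ranks.sum(axis=1).sort_values(ascending=False)
    cutoff = rank_sum.quantile(1 - top_fraction)
    return rank_sum[rank_sum >= cutoff]

# Illustrative data for four maintenance jobs over three refuelling outages.
dose = pd.DataFrame(
    {"1996": [12.0, 3.1, 8.4, 1.2],
     "1997": [10.5, 2.8, 9.9, 0.9],
     "1998": [11.2, 4.0, 7.7, 1.5]},
    index=["SG nozzle dam", "valve overhaul", "RCP seal work", "HVAC filter"],
)
print(repetitive_high_dose_jobs(dose))
```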

  16. Preliminary environmental assessment for the Satellite Power System (SPS). Revision 1. Volume 2. Detailed assessment

    Energy Technology Data Exchange (ETDEWEB)

    1980-01-01

    The Department of Energy (DOE) is considering several options for generating electrical power to meet future energy needs. The satellite power system (SPS), one of these options, would collect solar energy through a system of satellites in space and transfer this energy to earth. A reference system has been described that would convert the energy to microwaves and transmit the microwave energy via directive antennas to large receiving/rectifying antennas (rectennas) located on the earth. At the rectennas, the microwave energy would be converted into electricity. The potential environmental impacts of constructing and operating the satellite power system are being assessed as a part of the Department of Energy's SPS Concept Development and Evaluation Program. This report is Revision 1 of the Preliminary Environmental Assessment for the Satellite Power System published in October 1978. It refines and extends the 1978 assessment and provides a basis for a 1980 revision that will guide and support DOE recommendations regarding future SPS development. This is Volume 2 of two volumes. It contains the technical detail suitable for peer review and integrates information appearing in documents referenced herein. The key environmental issues associated with the SPS concern human health and safety, ecosystems, climate, and electromagnetic systems interactions. In order to address these issues in an organized manner, five tasks are reported: (I) microwave-radiation health and ecological effects; (II) nonmicrowave health and ecological effects; (III) atmospheric effects; (IV) effects on communication systems due to ionospheric disturbance; and (V) electromagnetic compatibility. (WHK)

  17. Aging assessment of surge protective devices in nuclear power plants

    International Nuclear Information System (INIS)

    An assessment was performed to determine the effects of aging on the performance and availability of surge protective devices (SPDs), used in electrical power and control systems in nuclear power plants. Although SPDs have not been classified as safety-related, they are risk-important because they can minimize the initiating event frequencies associated with loss of offsite power and reactor trips. Conversely, their failure due to age might cause some of those initiating events, e.g., through short circuit failure modes, or by allowing deterioration of the safety-related component(s) they are protecting from overvoltages, perhaps preventing a reactor trip, from an open circuit failure mode. From the data evaluated during 1980--1994, it was found that failures of surge arresters and suppressers by short circuits were neither a significant risk nor safety concern, and there were no failures of surge suppressers preventing a reactor trip. Simulations, using the ElectroMagnetic Transients Program (EMTP) were performed to determine the adequacy of high voltage surge arresters

  18. Aging assessment of surge protective devices in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Davis, J.F.; Subudhi, M. [Brookhaven National Lab., Upton, NY (United States); Carroll, D.P. [Florida Univ., Gainesville, FL (United States)

    1996-01-01

    An assessment was performed to determine the effects of aging on the performance and availability of surge protective devices (SPDs), used in electrical power and control systems in nuclear power plants. Although SPDs have not been classified as safety-related, they are risk-important because they can minimize the initiating event frequencies associated with loss of offsite power and reactor trips. Conversely, their failure due to age might cause some of those initiating events, e.g., through short circuit failure modes, or by allowing deterioration of the safety-related component(s) they are protecting from overvoltages, perhaps preventing a reactor trip, from an open circuit failure mode. From the data evaluated during 1980--1994, it was found that failures of surge arresters and suppressers by short circuits were neither a significant risk nor safety concern, and there were no failures of surge suppressers preventing a reactor trip. Simulations, using the ElectroMagnetic Transients Program (EMTP) were performed to determine the adequacy of high voltage surge arresters.

  19. Modular power system topology assessment using Gaussian potential functions

    Energy Technology Data Exchange (ETDEWEB)

    Garcia-Lagos, F.; Joya, G.; Sandoval, F. [Universidad de Malaga, ETSI Telecomunicacion (Spain). Dpto. Tecnologia Electronica; Marin, F.J. [Universidad de Malaga, ETSI Informatica (Spain). Dpto. Electronica

    2003-09-01

    A topology assessment system for power systems using active power measurements as input data is presented. The method is designed to be incorporated into a state estimator working with a bus-branch orientated network model. The system architecture contains two stages: (i) the preprocessing stage; and (ii) the classification stage. The preprocessing stage transforms each current measurement set to produce a vector in the [0,1]^n space. This stage produces clusters of very similar preprocessing output vectors for each grid topology. The classification stage consists of a layer of Gaussian potential units with Mahalanobis distance, and classifies the preprocessing output vectors to identify the actual topology. The main features of this method are: (i) local topology identification; (ii) linear growth of the complexity with the power system size; (iii) correction of multiple errors; and (iv) insensitivity to bad data. Tests have been carried out using the IEEE 14, 30, 57, 118 and 300 standard networks and different topological and measurement configurations. These tests have demonstrated the successful application of the technique. (Author)
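
    The classification stage described above can be illustrated with one Gaussian potential unit per known topology, each evaluating a Mahalanobis distance to its cluster of preprocessed measurement vectors. The sketch below is a generic implementation of that idea and makes no claim about the authors' preprocessing or training details.

```python
import numpy as np

class GaussianPotentialClassifier:
    """One Gaussian potential unit per known topology; the unit with the
    highest activation (smallest Mahalanobis distance) wins."""

    def fit(self, X, labels):
        self.units = {}
        for lab in np.unique(labels):
            V = X[labels == lab]                      # preprocessed vectors in [0,1]^n
            mean = V.mean(axis=0)
            cov = np.cov(V, rowvar=False) + 1e-6 * np.eye(X.shape[1])  # regularized
            self.units[lab] = (mean, np.linalg.inv(cov))
        return self

    def predict(self, x):
        def activation(unit):
            mean, cov_inv = unit
            d2 = (x - mean) @ cov_inv @ (x - mean)    # squared Mahalanobis distance
            return np.exp(-0.5 * d2)                  # Gaussian potential
        return max(self.units, key=lambda lab: activation(self.units[lab]))
```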

  20. Toward a No-Reference Image Quality Assessment Using Statistics of Perceptual Color Descriptors.

    Science.gov (United States)

    Lee, Dohyoung; Plataniotis, Konstantinos N

    2016-08-01

    Analysis of the statistical properties of natural images has played a vital role in the design of no-reference (NR) image quality assessment (IQA) techniques. In this paper, we propose parametric models describing the general characteristics of chromatic data in natural images. They provide informative cues for quantifying visual discomfort caused by the presence of chromatic image distortions. The established models capture the correlation of chromatic data between spatially adjacent pixels by means of color invariance descriptors. The use of color invariance descriptors is inspired by their relevance to visual perception, since they provide less sensitive descriptions of image scenes against viewing geometry and illumination variations than luminances. In order to approximate the visual quality perception of chromatic distortions, we devise four parametric models derived from invariance descriptors representing independent aspects of color perception: 1) hue; 2) saturation; 3) opponent angle; and 4) spherical angle. The practical utility of the proposed models is examined by deploying them in our new general-purpose NR IQA metric. The metric initially estimates the parameters of the proposed chromatic models from an input image to constitute a collection of quality-aware features (QAF). Thereafter, a machine learning technique is applied to predict visual quality given a set of extracted QAFs. Experimentation performed on large-scale image databases demonstrates that the proposed metric correlates well with the provided subjective ratings of image quality over commonly encountered achromatic and chromatic distortions, indicating that it can be deployed on a wide variety of color image processing problems as a generalized IQA solution. PMID:27305678

  1. Using Innovative Statistical Analyses to Assess Soil Degradation due to Land Use Change

    Science.gov (United States)

    Khaledian, Yones; Kiani, Farshad; Ebrahimi, Soheila; Brevik, Eric C.; Aitkenhead-Peterson, Jacqueline

    2016-04-01

    Soil erosion and overall loss of soil fertility are serious issues for the loess soils of the Golestan province, northern Iran. The assessment of soil degradation at large watershed scales is urgently required. This research investigated the role of land use change and its effect on soil degradation in cultivated, pasture and urban lands, when compared to native forest, in terms of declines in soil fertility. Some novel statistical methods, including partial least squares (PLS), principal component regression (PCR), and ordinary least squares regression (OLS), were used to predict soil cation-exchange capacity (CEC) from soil characteristics. PCA identified five primary components of soil quality. The PLS model was used to predict soil CEC from soil characteristics including bulk density (BD), electrical conductivity (EC), pH, calcium carbonate equivalent (CCE), soil particle density (DS), mean weight diameter (MWD), soil porosity (F), organic carbon (OC), labile carbon (LC), mineral carbon, saturation percentage (SP), soil particle size (clay, silt and sand), exchangeable cations (Ca2+, Mg2+, K+, Na+), and soil microbial respiration (SMR) collected in the Ziarat watershed. In order to evaluate the best fit, two other methods, PCR and OLS, were also examined. An exponential semivariogram using the PLS predictions revealed stronger spatial dependence of CEC (r² = 0.80, RMSE = 1.99) than the other methods, PCR (r² = 0.84, RMSE = 2.45) and OLS (r² = 0.84, RMSE = 2.45). Therefore, the PLS method provided the best model for the data. In stepwise regression analysis, MWD and LC were selected as influential variables in all soils, whereas the other influential parameters differed among land uses. This study quantified reductions in numerous soil quality parameters resulting from extensive land-use changes and urbanization in the Ziarat watershed in Northern Iran.
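
    The PLS step can be sketched with scikit-learn: predict CEC from the soil properties with a small number of latent components and report cross-validated r² and RMSE. The column choices and the number of components below are illustrative assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import r2_score, mean_squared_error

def pls_cec_model(X, cec, n_components=3):
    """X   : (n_samples, n_soil_properties), e.g. BD, EC, pH, OC, clay, silt, sand, ...
    cec : measured cation-exchange capacity (same number of samples)."""
    pls = PLSRegression(n_components=n_components)
    # Cross-validated predictions give an honest estimate of predictive skill.
    pred = cross_val_predict(pls, X, cec, cv=5).ravel()
    r2 = r2_score(cec, pred)
    rmse = float(np.sqrt(mean_squared_error(cec, pred)))
    return r2, rmse, pls.fit(X, cec)   # final model refit on all samples
```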

  2. Quantitative hazard assessment at Vulcano (Aeolian islands): integration of geology, event statistics and physical modelling

    Science.gov (United States)

    Dellino, Pierfrancesco; de Astis, Gianfilippo; La Volpe, Luigi; Mele, Daniela; Sulpizio, Roberto

    2010-05-01

    The analysis of stratigraphy and of pyroclastic deposit particle features allowed the reconstruction of the volcanic history of La Fossa di Vulcano. An eruptive scenario driven by superficial phreatomagmatic explosions emerged. A statistical analysis of the pyroclastic successions led to defining a repetitive sequence of dilute pyroclastic density currents as the most probable events in the short term, followed by fallout of dense ballistic blocks. The scale of such events is related to the amount of magma involved in each explosion. Events involving a million cubic meters of magma are probable in view of what happened in the most recent eruptions. They led to the formation of dilute pyroclastic density currents hundreds of meters thick, moving down the volcano slope at velocities exceeding 50 m/sec. The dispersion of density currents affected the whole Vulcano Porto area and the Vulcanello area, and also overrode the Fossa Caldera's rim, spreading over the Piano area. Similarly, older pyroclastic deposits erupted at different times (Piano Grotte dei Rossi formation, ~20-7.7 ka) from vents within La Fossa Caldera and before La Fossa Cone formation. They were also phreatomagmatic in origin and fed dilute pyroclastic density currents (PDC). They represent the eruptions with the highest magnitude on the island. Therefore, for the aim of hazard assessment, these deposits from La Fossa Cone and La Fossa Caldera were used to depict eruptive scenarios at short term and at long term. On the basis of physical models that make use of pyroclastic deposit particle features, the impact parameters for each scenario have been calculated. They are the dynamic pressure and particle volumetric concentration of the density currents, and the impact energy of the ballistic blocks. On this basis, a quantitative hazard map is presented, which could be of direct use for territory planning and for the calculation of the expected damage.

  3. From probabilistic forecasts to statistical scenarios of short-term wind power production

    DEFF Research Database (Denmark)

    Pinson, Pierre; Papaefthymiou, George; Klockl, Bernd;

    2009-01-01

    Short-term (up to 2-3 days ahead) probabilistic forecasts of wind power provide forecast users with highly valuable information on the uncertainty of expected wind generation. Whatever the type of these probabilistic forecasts, they are produced on a per-horizon basis, and hence do not inform on the development of the forecast uncertainty through forecast series. However, this additional information may be paramount for a large class of time-dependent and multistage decision-making problems, e.g. optimal operation of combined wind-storage systems or multiple-market trading with different gate...

  4. Enhancing an Undergraduate Business Statistics Course: Linking Teaching and Learning with Assessment Issues

    Science.gov (United States)

    Fairfield-Sonn, James W.; Kolluri, Bharat; Rogers, Annette; Singamsetti, Rao

    2009-01-01

    This paper examines several ways in which teaching effectiveness and student learning in an undergraduate Business Statistics course can be enhanced. First, we review some key concepts in Business Statistics that are often challenging to teach and show how using real data sets assists students in developing a deeper understanding of the concepts.…

  5. An Assessment of LAC's Vital Statistics System : The Foundation of Maternal and Infant Mortality Monitoring

    OpenAIRE

    Danel, Isabella; Bortman, Marcelo

    2008-01-01

    Vital records, the registration of births, deaths, marriages and divorces, and the vital statistics derived from these records serve two important purposes. Firstly, vital records are legal documents; the focus of this review, however, is their second role: creating the demographic and epidemiological statistics that are used in monitoring trends and developing health policies and programs....

  6. Inferential, non-parametric statistics to assess the quality of probabilistic forecast systems

    NARCIS (Netherlands)

    Maia, A.H.N.; Meinke, H.B.; Lennox, S.; Stone, R.C.

    2007-01-01

    Many statistical forecast systems are available to interested users. To be useful for decision making, these systems must be based on evidence of underlying mechanisms. Once causal connections between the mechanism and its statistical manifestation have been firmly established, the forecasts must al

  7. Comparison of Asian Aquaculture Products by Use of Statistically Supported Life Cycle Assessment.

    Science.gov (United States)

    Henriksson, Patrik J G; Rico, Andreu; Zhang, Wenbo; Ahmad-Al-Nahid, Sk; Newton, Richard; Phan, Lam T; Zhang, Zongfeng; Jaithiang, Jintana; Dao, Hai M; Phu, Tran M; Little, David C; Murray, Francis J; Satapornvanit, Kriengkrai; Liu, Liping; Liu, Qigen; Haque, M Mahfujul; Kruijssen, Froukje; de Snoo, Geert R; Heijungs, Reinout; van Bodegom, Peter M; Guinée, Jeroen B

    2015-12-15

    We investigated aquaculture production of Asian tiger shrimp, whiteleg shrimp, giant river prawn, tilapia, and pangasius catfish in Bangladesh, China, Thailand, and Vietnam by using life cycle assessments (LCAs), with the purpose of evaluating the comparative eco-efficiency of producing different aquatic food products. Our starting hypothesis was that different production systems are associated with significantly different environmental impacts, as the production of these aquatic species differs in intensity and management practices. In order to test this hypothesis, we estimated each system's global warming, eutrophication, and freshwater ecotoxicity impacts. The contribution to these impacts and the overall dispersions relative to results were propagated by Monte Carlo simulations and dependent sampling. Paired testing showed significant (p < 0.05) differences among systems, but only for shrimp did more than 95% of the propagated Monte Carlo results favor certain farming systems. The major environmental hot-spots driving the differences in environmental performance among systems were fishmeal from mixed fisheries for global warming, pond runoff and sediment discards for eutrophication, and agricultural pesticides, metals, benzalkonium chloride, and other chlorine-releasing compounds for freshwater ecotoxicity. The Asian aquaculture industry should therefore strive toward farming systems relying upon pelleted species-specific feeds, where the fishmeal inclusion is limited and sourced sustainably. Also, excessive nutrients should be recycled in integrated organic agriculture together with efficient aeration solutions powered by renewable energy sources. PMID:26512735

  8. Techno-economic assessment of thorium power in Canada

    International Nuclear Information System (INIS)

    Highlights: • Costs of replacing uranium in Canada’s nuclear reactors with thorium evaluated. • Results show a thorium plant to be more financially lucrative than a uranium plant. • Results were most sensitive to electricity price, then capital and decommissioning cost. • Abatement cost analysis showed nuclear power offers cost savings over fossil fuels. - Abstract: Thorium fission is a large yet relatively unexplored energy source and could help meet increasing energy demands. An analysis was performed on the feasibility of replacing the uranium in Canada’s nuclear reactors with thorium. Thorium only exists as a fertile isotope, and so an external fissile source such as 235U, 233U, or 239Pu is required to stimulate the fission process. A uranium plant and a similar thorium-fuelled plant were compared over a 40 year operational life based on a comprehensive economic analysis. The results from the economic analysis were used to estimate the greenhouse gas (GHG) abatement cost compared to coal- and natural gas-based power. The economic analysis determined that a thorium plant is more financially lucrative in Canada than a uranium plant. An abatement cost assessment in relation to gas-fired and coal-fired power plants demonstrated that nuclear power offers a cost savings per tonne of CO2 equivalent greenhouse gas (GHG) when compared to both fossil fuel alternatives. From the values determined for a plant potentially fuelled on thorium, the abatement cost when compared to the coal-fired and gas-fired plants is −$10.4/tonne-CO2eq and −$15.7/tonne-CO2eq, respectively

  9. Assessing the economic wind power potential in Austria

    International Nuclear Information System (INIS)

    In the European Union, electricity production from wind energy is projected to increase by approximately 16% until 2020. The Austrian energy plan aims at increasing the currently installed wind power capacity from approximately 1 GW to 3 GW by 2020, including an additional capacity of 700 MW by 2015. The aim of this analysis is to assess economically viable wind turbine sites under current feed-in tariffs, considering constraints imposed by infrastructure, the natural environment and ecological preservation zones in Austria. We analyze whether the policy target of installing an additional wind power capacity of 700 MW by 2015 is attainable under current legislation and develop a GIS-based decision system for wind turbine site selection. Results show that the current feed-in tariff of 9.7 ct kW h−1 may trigger an additional installation of 3544 MW. The current feed-in tariff can therefore be considered too high, as wind power deployment would exceed the target by far. Our results indicate that the targets may be attained more cost-effectively by applying a lower feed-in tariff of 9.1 ct kW h−1. Thus, windfall profits at favorable sites and deadweight losses of policy intervention can be minimized while still guaranteeing the deployment of additional wind power capacities. - Highlights: ► Wind supply curves with high spatial resolution for the whole of Austria are derived. ► Current feed-in tariff higher than necessary to attain targets. Previous feed-in tariffs were too low to achieve targets. ► Current support scheme leads to high social welfare losses. ► Policy makers face high information asymmetry when setting feed-in tariffs.

  10. Network Theory Integrated Life Cycle Assessment for an Electric Power System

    Directory of Open Access Journals (Sweden)

    Heetae Kim

    2015-08-01

    In this study, we allocate greenhouse gas (GHG) emissions of electricity transmission to consumers. As an allocation basis, we introduce the energy distance, which takes the transmission load on the electricity system into account in addition to the amount of electricity consumed. As a case study, we estimate regional GHG emissions of electricity transmission loss in Chile. Life cycle assessment (LCA) is used to estimate the total GHG emissions of the Chilean electric power system, from which the regional GHG emissions of transmission loss are calculated. We construct the network model of the Chilean electric power grid as an undirected network with 466 nodes and 543 edges, preserving the topology of the power grid based on the statistical record. The total annual GHG emissions of the Chilean electricity system amount to 23.07 Mt CO2-eq., of which 1.61 Mt CO2-eq. is attributable to transmission loss. The total energy distance for electricity transmission amounts to 12,842.10 TWh km based on the network analysis. We argue that when the GHG emissions of electricity transmission loss are estimated, the electricity transmission load should be considered separately. We propose network theory as a useful complement to LCA for this complex allocation. Energy distance is especially useful for very large-scale electric power grids such as intercontinental transmission networks.
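    The allocation idea can be sketched in a few lines. The snippet below is a hypothetical toy example, not the paper's model: it takes the energy distance of a consumer node as its consumption multiplied by the network path length from a single generator node, and allocates a fixed transmission-loss emission total in proportion to it. The grid, demands and emission figure are made up.

```python
# Toy allocation of transmission-loss GHG emissions by "energy distance".
import networkx as nx

G = nx.Graph()
G.add_edges_from([("gen", "a"), ("a", "b"), ("b", "c"), ("a", "d")])
consumption_twh = {"b": 10.0, "c": 5.0, "d": 8.0}   # hypothetical demand per node
loss_emissions_mt = 1.61                             # hypothetical GHG of transmission loss

energy_distance = {
    node: twh * nx.shortest_path_length(G, "gen", node)   # consumption x path length
    for node, twh in consumption_twh.items()
}
total_ed = sum(energy_distance.values())
allocation = {node: loss_emissions_mt * ed / total_ed
              for node, ed in energy_distance.items()}
print(allocation)   # share of loss emissions allocated to each consumer node
```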

  11. Investigation and assessment of tritium concentration of aquatic environment surrounding haiyang nuclear power plant

    International Nuclear Information System (INIS)

    Objective: To investigate tritium concentrations in the aquatic environment surrounding the Haiyang nuclear power plant, to analyze the factors influencing the tritium concentration, and to assess the accumulated effective dose to residents living near the plant. Methods: Water samples were collected at 16 sampling points, including surface water, groundwater, drinking water and sea water, within 30 km of the Haiyang nuclear power plant during the wet and dry periods. The pretreatment and preparation of the samples followed the methods recommended in the national standard GB 12375-90. A low-background liquid scintillation spectrometer was used to measure the tritium concentration. Results: The average tritium concentration of the water samples was (0.62 ± 0.163) Bq·L-1, with a range from 0.27 Bq·L-1 to 0.93 Bq·L-1. The difference in tritium concentrations between the two periods, analyzed by the paired t test, was statistically significant (P < 0.05); the corresponding accumulated effective doses were 0.008 μSv·a-1 and 0.007 μSv·a-1, respectively. Conclusion: The tritium activity concentration in the aquatic environment surrounding the Haiyang nuclear power plant was lower than levels reported elsewhere; according to the dose limit (2 mSv) set by the Basic Standards for Protection against Ionizing Radiation and for the Safety of Radiation Sources (GB 18871-2002), the accumulated effective dose received by residents was at the background level of radiation. (authors)

  12. Selection for Environmental Variation: A Statistical Analysis and Power Calculations to Detect Response

    OpenAIRE

    Ibáñez-Escriche, Noelia; Sorensen, Daniel; Waagepetersen, Rasmus; Blasco, Agustín

    2008-01-01

    Data from uterine capacity in rabbits (litter size) were analyzed to determine whether the environmental variance was partly genetically determined. The fit of a classical homogeneous variance mixed linear (HOM) model and that of a genetically structured heterogeneous variance mixed linear (HET) model were compared. Various methods to assess the quality of fit favor the HET model. The posterior mean (95% posterior interval) of the additive genetic variance affecting the environmental variance...

  13. Assessing Colour-dependent Occupation Statistics Inferred from Galaxy Group Catalogues

    CERN Document Server

    Campbell, Duncan; Hearin, Andrew; Padmanabhan, Nikhil; Berlind, Andreas; Mo, H J; Tinker, Jeremy; Yang, Xiaohu

    2015-01-01

    We investigate the ability of current implementations of galaxy group finders to recover colour-dependent halo occupation statistics. To test the fidelity of group catalogue inferred statistics, we run three different group finders used in the literature over a mock that includes galaxy colours in a realistic manner. Overall, the resulting mock group catalogues are remarkably similar, and most colour-dependent statistics are recovered with reasonable accuracy. However, it is also clear that certain systematic errors arise as a consequence of correlated errors in group membership determination, central/satellite designation, and halo mass assignment. We introduce a new statistic, the halo transition probability (HTP), which captures the combined impact of all these errors. As a rule of thumb, errors tend to equalize the properties of distinct galaxy populations (i.e. red vs. blue galaxies or centrals vs. satellites), and to result in inferred occupation statistics that are more accurate for red galaxies than f...

  14. Use assessment of electronic power sources for SMAW

    Directory of Open Access Journals (Sweden)

    Scotti, A.

    1999-04-01

    The aim of the present work was to assess the efficacy of the use of modern power supply technologies in Shielded Metal Arc Welding (SMAW). Coupon tests were welded using a series of five different classes of commercial electrodes, covering their current ranges. Both a conventional electromagnetic power source and an electronic (inverter) power source were employed. Fusion rate, deposition efficiency, bead finish and weld geometry were measured in each experiment. Current and voltage signals were acquired at a high rate to evaluate the dynamic behavior of the power sources. The static performance of both power sources was also determined. The results showed that, despite the remarkable differences between the power supplies based on the static and dynamic characterizations, no significant difference was noticed in the operational behavior of the electrodes under the given conditions, apart from a better anti-stick performance obtained with the electronic power source.


  15. Descriptive statistics of occupational employment in nuclear power utilities. Final working paper

    International Nuclear Information System (INIS)

    The Institute of Nuclear Power Operations conducted a survey of its 58 member utilities during the Spring of 1982. This was the second such survey performed to identify employment trends and to project needs for trained personnel in the industry to 1991. The first was performed in 1981. The 1982 employment survey consisted of four questionnaires, asking for information on: (1) on-site employment; (2) on-site turnover; (3) off-site employment; and (4) off-site turnover. The survey instruments were designed to reflect approaches used by the utilities to meet the labor requirements for operation of nuclear power plants through off-site support personnel, contractors, and holding company personnel, as well as utility employees working at the plant site. On-site information was received from all 83 plants at the 58 utilities. However, employment information from Surry of VEPCO arrived too late to be included in the analysis. Therefore, their numbers are reflected in the adjusted totals. Responses to requests for off-site employment information were received from 55 of the 58 utilities

  16. Fast fMRI provides high statistical power in the analysis of epileptic networks.

    Science.gov (United States)

    Jacobs, Julia; Stich, Julia; Zahneisen, Benjamin; Assländer, Jakob; Ramantani, Georgia; Schulze-Bonhage, Andreas; Korinthenberg, Rudolph; Hennig, Jürgen; LeVan, Pierre

    2014-03-01

    EEG-fMRI is a unique method to combine the high temporal resolution of EEG with the high spatial resolution of MRI to study generators of intrinsic brain signals such as sleep grapho-elements or epileptic spikes. While the standard EPI sequence in fMRI experiments has a temporal resolution of around 2.5-3 s, a newly established fast fMRI sequence called MREG (Magnetic-Resonance-Encephalography) provides a temporal resolution of around 100 ms. This technical novelty promises to improve statistics, facilitate correction of physiological artifacts and improve the understanding of epileptic networks in fMRI. The present study compares simultaneous EEG-EPI and EEG-MREG analyzing epileptic spikes to determine the yield of fast MRI in the analysis of intrinsic brain signals. Patients with frequent interictal spikes (>3/20 min) underwent EEG-MREG and EEG-EPI (3T, 20 min each, voxel size 3×3×3 mm, EPI TR=2.61 s, MREG TR=0.1 s). Timings of the spikes were used in an event-related analysis to generate activation maps of t-statistics (FMRISTAT, |t|>3.5, cluster size: 7 voxels, p<0.05 corrected). For both sequences, the amplitude and location of significant BOLD activations were compared with the spike topography. 13 patients were recorded and 33 different spike types could be analyzed. Peak t-values were significantly higher in MREG than in EPI (p<0.0001). Positive BOLD effects correlating with the spike topography were found in 8/29 spike types using the EPI and in 22/33 spike types using the MREG sequence. Negative BOLD responses in the default mode network could be observed in 3/29 spike types with the EPI and in 19/33 with the MREG sequence. With the latter method, BOLD changes were observed even when few spikes occurred during the investigation. Simultaneous EEG-MREG thus is possible with good EEG quality and shows higher sensitivity in regard to the localization of spike-related BOLD responses than EEG-EPI. The development of new methods of analysis for this sequence such as

  17. A flexible and comprehensive approach to the assessment of large-scale power system security under uncertainty

    International Nuclear Information System (INIS)

    A background of increasing uncertainty in all time horizons of power system planning and operation has prompted the development of a comprehensive methodology and an advanced new practical tool for assessing both the static and dynamic security of a real network facing a large number of uncertainties. This paper describes this development, the steps of a study methodology and the incorporation of advanced security-constrained optimal power flow, dynamic simulation, statistical analysis tools and a specially developed set of sampling functions into a unique environment. The possible hardware set-ups are outlined and, finally, some examples of the tool's application are given. (author)

  18. PowerStaTim 1.0 – a new statistical program for computing effect size and statistical power

    Directory of Open Access Journals (Sweden)

    Florin A. Sava

    2008-01-01

    The present paper presents the main characteristics of a new software package for computing effect size and statistical power indicators: PowerStaTim 1.0 (Maricuțoiu & Sava, 2007). The first part of the paper presents the rationale for computing effect size and statistical power in psychological research. The second part introduces the reader to the technical characteristics of PowerStaTim 1.0 and to the processing options of this software.
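    The kind of computation such a program automates can be sketched briefly. The snippet below is an illustrative example with made-up data: it computes Cohen's d for two independent samples and the corresponding post-hoc power of a two-sample t test using statsmodels.

```python
# Illustrative effect size and power calculation (not PowerStaTim itself).
import numpy as np
from statsmodels.stats.power import TTestIndPower

group1 = np.array([5.1, 4.8, 6.0, 5.5, 5.9, 4.7])   # made-up measurements
group2 = np.array([4.2, 4.5, 4.9, 4.1, 5.0, 4.4])

# Cohen's d with a pooled standard deviation
n1, n2 = len(group1), len(group2)
pooled_sd = np.sqrt(((n1 - 1) * group1.var(ddof=1) +
                     (n2 - 1) * group2.var(ddof=1)) / (n1 + n2 - 2))
d = (group1.mean() - group2.mean()) / pooled_sd

# post-hoc power of a two-sided independent-samples t test at alpha = 0.05
power = TTestIndPower().power(effect_size=d, nobs1=n1, ratio=n2 / n1, alpha=0.05)
print(f"Cohen's d = {d:.2f}, power = {power:.2f}")
```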

  19. Communications and control for electric power systems: Power flow classification for static security assessment

    Science.gov (United States)

    Niebur, D.; Germond, A.

    1993-01-01

    This report investigates the classification of power system states using an artificial neural network model, Kohonen's self-organizing feature map. The ultimate goal of this classification is to assess power system static security in real-time. Kohonen's self-organizing feature map is an unsupervised neural network which maps N-dimensional input vectors to an array of M neurons. After learning, the synaptic weight vectors exhibit a topological organization which represents the relationship between the vectors of the training set. This learning is unsupervised, which means that the number and size of the classes are not specified beforehand. In the application developed in this report, the input vectors used as the training set are generated by off-line load-flow simulations. The learning algorithm and the results of the organization are discussed.
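    A minimal sketch of a Kohonen self-organizing map is given below to illustrate the unsupervised classification step; it is not the report's implementation, and the "operating state" vectors are random placeholders rather than load-flow results.

```python
# Bare-bones Kohonen self-organizing map in NumPy (illustrative sketch only).
import numpy as np

rng = np.random.default_rng(42)
data = rng.random((500, 8))         # placeholder N-dimensional operating-state vectors
rows, cols, dim = 6, 6, data.shape[1]
weights = rng.random((rows, cols, dim))
grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)

n_iter, lr0, sigma0 = 2000, 0.5, 2.0
for t in range(n_iter):
    x = data[rng.integers(len(data))]
    # best-matching unit for the current sample
    bmu = np.unravel_index(np.argmin(((weights - x) ** 2).sum(axis=2)), (rows, cols))
    # learning rate and neighbourhood radius shrink over time
    lr = lr0 * np.exp(-t / n_iter)
    sigma = sigma0 * np.exp(-t / n_iter)
    dist2 = ((grid - np.array(bmu)) ** 2).sum(axis=2)
    h = np.exp(-dist2 / (2 * sigma ** 2))[..., None]
    weights += lr * h * (x - weights)

# after training, each state vector maps to its best-matching neuron (its class)
labels = [np.unravel_index(np.argmin(((weights - x) ** 2).sum(axis=2)), (rows, cols))
          for x in data]
```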

  20. Of Disasters and Dragon Kings: A Statistical Analysis of Nuclear Power Incidents & Accidents

    CERN Document Server

    Wheatley, Spencer; Sornette, Didier

    2015-01-01

    We provide, and perform a risk theoretic statistical analysis of, a dataset that is 75 percent larger than the previous best dataset on nuclear incidents and accidents, comparing three measures of severity: INES (International Nuclear Event Scale), radiation released, and damage dollar losses. The annual rate of nuclear accidents, with size above 20 Million US$, per plant, decreased from the 1950s until dropping significantly after Chernobyl (April, 1986). The rate is now roughly stable at 0.002 to 0.003, i.e., around 1 event per year across the current fleet. The distribution of damage values changed after Three Mile Island (TMI; March, 1979), where moderate damages were suppressed but the tail became very heavy, being described by a Pareto distribution with tail index 0.55. Further, there is a runaway disaster regime, associated with the "dragon-king" phenomenon, amplifying the risk of extreme damage. In fact, the damage of the largest event (Fukushima; March, 2011) is equal to 60 percent of the total damag...
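    The heavy-tail fit mentioned above can be illustrated with a maximum-likelihood estimate of a Pareto tail index for losses above a threshold. The sketch below uses synthetic damage values, not the article's dataset, and the threshold is an assumed figure.

```python
# Illustrative Pareto tail-index estimate from losses above a threshold.
import numpy as np

rng = np.random.default_rng(1)
damages = (rng.pareto(0.55, size=200) + 1) * 20e6   # synthetic losses above 20 MUSD
threshold = 20e6

tail = damages[damages >= threshold]
alpha_hat = len(tail) / np.log(tail / threshold).sum()   # MLE for the Pareto index
print(f"estimated tail index: {alpha_hat:.2f} from {len(tail)} events")
```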

  1. Wind power prognosis statistical system; Sistema estadistico de pronostico de la energia eoloelectrica

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez Garcia, Alfredo; De la Torre Vega, Eli [Instituto de Investigaciones Electricas, Cuernavaca, Morelos (Mexico)

    2009-07-01

    The integration of the first large-scale wind farm (La Venta II) into the National Interconnected System requires taking into account the random and intermittent nature of wind energy. An important tool for this task is a system for short-term wind power forecasting. For this reason, the Instituto de Investigaciones Electricas (IIE) developed a statistical model to produce this forecast. The prediction is made through an adaptive linear combination of competing alternative models, where the weight given to each model is based on its recent forecast quality. The results of applying the forecasting system are also presented and analyzed.
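    A hypothetical sketch of such an adaptive combination is given below: each competing model receives a weight inversely proportional to an exponentially weighted average of its recent squared errors. The models, numbers and weighting rule are illustrative assumptions, not the IIE system.

```python
# Toy adaptive linear combination of competing wind power forecasts.
import numpy as np

def combine(forecasts, errors_ewma):
    """forecasts: current forecasts per model; errors_ewma: recent MSE per model."""
    w = 1.0 / (errors_ewma + 1e-9)   # better recent quality -> larger weight
    w /= w.sum()
    return float(w @ forecasts), w

n_models, horizon, lam = 3, 48, 0.9
errors_ewma = np.ones(n_models)
rng = np.random.default_rng(0)
for t in range(horizon):
    forecasts = 50 + rng.normal(0, [2.0, 5.0, 8.0])   # synthetic model outputs (MW)
    combined, w = combine(forecasts, errors_ewma)
    observed = 50 + rng.normal(0, 1.0)                # synthetic later measurement
    errors_ewma = lam * errors_ewma + (1 - lam) * (forecasts - observed) ** 2

print("final weights:", np.round(w, 3))
```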

  2. Assessment of statistical procedures used in papers in the Australian Veterinary Journal.

    Science.gov (United States)

    McCance, I

    1995-09-01

    One hundred and thirty-three papers (80 Original Articles and 53 Short Contributions) of 279 papers in 23 consecutive issues of the Australian Veterinary Journal were examined for their statistical content. Only 38 (29%) would have been acceptable to a statistical referee without revision, revision would have been indicated in 88 (66%), and the remaining 7 (5%) had major flaws. Weaknesses in design were found in 40 (30%), chiefly with respect to randomisation and the size of the experiment. Deficiencies in analysis in 60 (45%) were in methods, application and calculation, and in the failure to use appropriate methods for multiple comparisons and repeated measures. Problems with presentation were detected in 44 (33%) of papers, the main ones being insufficient information about the data or its statistical analysis, and inappropriate presentation of statistics (appropriate statistics missing or inappropriate ones shown). Conclusions were considered to be inconsistent with the analysis in 35 (26%) of papers, due mainly to the interpretation of the results of significance testing. It is suggested that statistical refereeing, the publication of statistical guidelines for authors and statistical advice to Animal Experimentation Ethics Committees could all play a part in achieving improvement. PMID:8585846

  3. The Power (Law) of Indian Markets: Analysing NSE and BSE trading statistics

    CERN Document Server

    Sinha, S; Sinha, Sitabhra; Pan, Raj Kumar

    2006-01-01

    The nature of fluctuations in the Indian financial market is analyzed in this paper. We have looked at the price returns of individual stocks, with tick-by-tick data from the National Stock Exchange (NSE) and daily closing price data from both NSE and the Bombay Stock Exchange (BSE), the two largest exchanges in India. We find that the price returns in Indian markets follow a fat-tailed cumulative distribution, consistent with a power law having exponent $\alpha \sim 3$, similar to that observed in developed markets. However, the distributions of trading volume and the number of trades have a different nature than that seen in the New York Stock Exchange (NYSE). Further, the price movements of different stocks are highly correlated in Indian markets.

  4. Generation adequacy assessment for power systems with wind turbine and energy storage

    OpenAIRE

    Zhong, J.(Department of Physics, Oxford University, Oxford, United Kingdom); R Zheng

    2010-01-01

    Wind power has been considered an environmentally friendly electricity generation resource; however, high wind power penetration can lead to high risk levels in power system reliability. An energy storage system (ESS) is a promising means to smooth the variations of wind power and improve system reliability. Simulation models for assessing the generation adequacy of power systems with a wind power generation system (WPGS) and ESS are presented in this paper. The impacts of different wind power p...

  5. THE APPLICATION OF STATISTICAL PARAMETERS OF PHASE RESOLVED PD DISTRIBUTION TO AGING EXTENT ASSESSMENT OF LARGE GENERATOR INSULATION

    Institute of Scientific and Technical Information of China (English)

    谢恒堃; 乐波; 孙翔; 宋建成

    2003-01-01

    Objective To investigate the characteristic parameters employed to describe the aging extent of the stator insulation of large generators and to study the aging laws. Methods Multi-stress aging tests of model generator stator bar specimens were performed, and PD measurements were conducted using a digital PD detector with a frequency range from 40 kHz to 400 kHz at different aging stages. Results From the test results of the model specimens it was found that the skewness of the phase-resolved PD distribution may be taken as a characterization parameter for aging extent assessment of generator insulation. Furthermore, the measurement results of actual generator stator bars showed that the method based on statistical parameters of PD distributions is promising for aging extent assessment and residual lifetime estimation of large generator insulation. Conclusion Statistical parameters of the phase-resolved PD distribution were proposed for aging extent assessment of large generator insulation.

  6. Decrease in risk erroneous classification in the multivariate statistical data describing the technical condition of the equipment of power supply systems

    International Nuclear Information System (INIS)

    Objective estimation of individual reliability parameters is a prerequisite for reducing the operational costs of maintenance and repair of the equipment and devices of electric power systems. A method for reducing the risk of erroneous classification of multivariate statistical data is proposed, based on simulation modeling and the theory of statistical hypothesis testing.

  7. Sensitivity and uncertainty analyses in external cost assessments of fusion power

    Energy Technology Data Exchange (ETDEWEB)

    Aquilonius, K. E-mail: karin.aquilonius@studsvik.se; Hallberg, B.; Hofman, D.; Bergstroem, U.; Lechon, Y.; Cabal, H.; Saez, R.M.; Schneider, T.; Lepicard, S.; Ward, D.; Hamacher, T.; Korhonen, R

    2001-11-01

    Analyses of the sensitivity and uncertainty of the models used to assess the external costs (i.e., the monetization of environmental impacts) of a commercial fusion plant were performed. The assessments covered the plant's entire life cycle and adopted the ExternE methodology, which had been used to calculate external costs of other energy sources. Based on the SEAFP study, three different power plant designs were considered. The method developed in ExternE to estimate uncertainty gave very large ranges, so a statistical error propagation method was employed for this study. Rather than as single values, model input parameters were given as distributions, from which random input sets of data were constructed. The models were then run with these sets, and the ensemble of output results was analysed statistically, yielding estimates of the uncertainty due to variation of the model parameters. More information on parameter variation is needed, however, for a more realistic estimation of model uncertainty. Sensitivity analyses were performed by varying all input parameters in a similar fashion: all model parameters were assumed to have a Gaussian distribution with standard deviations of 10% of the mean value. The results pointed out the most essential parameters of the models. The sensitivity analyses are also useful for estimating the most effective ways to reduce the model-computed external costs.
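    The propagation and sensitivity setup described above can be sketched as follows: each parameter is drawn from a Gaussian with a standard deviation of 10% of its mean, the model is evaluated for every sample, and the spread of the output is summarized. The "external cost model" and its parameter values are placeholders, not the ExternE models.

```python
# Illustrative Monte Carlo error propagation with Gaussian-perturbed parameters.
import numpy as np

def external_cost(params):
    # placeholder model: cost = emission_factor * dose_coefficient * valuation
    return params["emission"] * params["dose"] * params["value"]

means = {"emission": 2.0e-3, "dose": 5.0e-2, "value": 4.0e4}   # hypothetical means
rng = np.random.default_rng(7)
n_runs = 10_000

# each parameter drawn from a Gaussian with SD = 10% of its mean
samples = {k: rng.normal(m, 0.1 * m, n_runs) for k, m in means.items()}
costs = external_cost(samples)          # model evaluated for every sampled set
print(f"mean = {costs.mean():.3g}, 5-95% range = "
      f"{np.percentile(costs, 5):.3g} .. {np.percentile(costs, 95):.3g}")
```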

  8. ASTRID power conversion system: Assessment on steam and gas options

    International Nuclear Information System (INIS)

    Conclusion: ◆ Two power conversion systems have been investigated for the ASTRID prototype. ◆ Steam PCS: • Most mature system, based on well-developed turbomachinery technology. • High plant efficiency. • Studies on steam generator designs and leak detection systems are in progress with the aim of reducing the risk of large SWRs and of limiting their consequences. • Design and licensing safety assessment of an SFR must deal with the sodium-water-air reaction (SWAR). ◆ Gas PCS: • Strong advantage as it inherently eliminates the SWR and SWAR risks. • Very innovative option: major breakthroughs, but feasibility and viability not yet demonstrated. • Remaining technological challenges, but no showstopper identified. • General architecture: investigations in progress to improve performance, operability and maintainability

  9. ASSESSMENT OF THE DRUM REMAINING LIFETIME IN THERMAL POWER PLANT

    Directory of Open Access Journals (Sweden)

    Miroslav M Živković

    2010-01-01

    In this paper, an analysis of the stress and the thermal elastic-plastic strain of the drum is performed. The influence of modified thickness, yield stress and the finite element model of the welded joint between the pipe and the drum on the assessment of the remaining lifetime of the drum in the thermal power plant is analyzed. Two analyses are compared. In the first, the drum is modeled with shell and with 3D finite elements using the design geometrical and material data of the drum. In the second, the drum is modeled with shell and with 3D finite elements with modified thickness and yield stress. The analyses show that detailed modeling of the stress concentration zones is necessary. Adequate modeling gives a lower maximum effective plastic strain and an increased number of cycles; in that case, 3D finite elements perform better than shell finite elements.

  10. Regional wind energy assessment program progress report, October 1980-September 1981. Appendix. Wind statistics summaries

    Energy Technology Data Exchange (ETDEWEB)

    Baker, R W; Wade, J E; Persson, P O.G.; Armstrong, B

    1981-12-01

    The wind statistics summarized include monthly wind speed and spectrum analyzer summaries, diurnal wind speed tables, high wind summaries (greater than or equal to 50 mph), wind rose tables, and wind speed and direction frequency distributions. (LEW)

  11. Statistical decision theory and its application to PRA result evaluation for nuclear power plant designing process

    International Nuclear Information System (INIS)

    Decision theory is applied to derive the 'α-th' percentile and the mean-value decision rules, which have often been referenced in Probabilistic Risk Assessment (PRA) results. It is shown that the decision problem, with certain kinds of utility functions, yields the above decision rules as well as their criteria. Decision lines are developed as a function of the median and uncertainty factor for an a priori lognormal distribution, and are shown to be useful for the decision maker's immediate judgement based on the 'α-th' percentile and the mean-value decision rules. Finally, the PWR and BWR release categories of WASH-1400 are evaluated with the developed decision lines, assuming as example criteria 10^-4/reactor-year and 10^-3/reactor-year for the 95th percentile decision rule, and 10^-5/reactor-year and 10^-4/reactor-year, respectively
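    For an a priori lognormal distribution, the quantities behind the percentile and mean-value rules follow directly from the median and the error factor. The sketch below illustrates this relationship with made-up numbers; it is not taken from the paper or from WASH-1400.

```python
# Illustrative lognormal percentile/mean computation from median and error factor.
import numpy as np
from scipy.stats import norm

median, ef = 1.0e-5, 10.0                 # hypothetical median (per reactor-year) and EF
sigma = np.log(ef) / norm.ppf(0.95)       # lognormal shape parameter, EF = p95 / median
p95 = median * ef                         # 95th percentile
mean = median * np.exp(0.5 * sigma ** 2)  # mean of a lognormal
print(f"95th percentile = {p95:.2e}, mean = {mean:.2e} per reactor-year")
```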

  12. Exon array data analysis using Affymetrix power tools and R statistical software

    Science.gov (United States)

    2011-01-01

    The use of microarray technology to measure gene expression on a genome-wide scale has been well established for more than a decade. Methods to process and analyse the vast quantity of expression data generated by a typical microarray experiment are similarly well-established. The Affymetrix Exon 1.0 ST array is a relatively new type of array, which has the capability to assess expression at the individual exon level. This allows a more comprehensive analysis of the transcriptome, and in particular enables the study of alternative splicing, a gene regulation mechanism important in both normal conditions and in diseases. Some aspects of exon array data analysis are shared with those for standard gene expression data but others present new challenges that have required development of novel tools. Here, I will introduce the exon array and present a detailed example tutorial for analysis of data generated using this platform. PMID:21498550

  13. Combining Statistical Tools and Ecological Assessments in the Study of Biodeterioration Patterns of Stone Temples in Angkor (Cambodia).

    Science.gov (United States)

    Caneva, G; Bartoli, F; Savo, V; Futagami, Y; Strona, G

    2016-01-01

    Biodeterioration is a major problem for the conservation of cultural heritage materials. We provide a new and original approach to analyzing changes in patterns of colonization (Biodeterioration patterns, BPs) by biological agents responsible for the deterioration of outdoor stone materials. Here we analyzed BPs of four Khmer temples in Angkor (Cambodia) exposed to variable environmental conditions, using qualitative ecological assessments and statistical approaches. The statistical analyses supported the findings obtained with the qualitative approach. Both approaches provided additional information not otherwise available using one single method. Our results indicate that studies on biodeterioration can benefit from integrating diverse methods so that conservation efforts might become more precise and effective. PMID:27597658

  14. Scan statistic tail probability assessment based on process covariance and window size

    OpenAIRE

    Reiner-Benaim, Anat

    2013-01-01

    A scan statistic is examined for the purpose of testing the existence of a global peak in a random process with dependent variables of any distribution. The scan statistic tail probability is obtained based on the covariance of the moving sums process, thereby accounting for the spatial nature of the data as well as the size of the searching window. Exact formulas linking this covariance to the window size and the correlation coefficient are developed under general, common and auto covariance...

  15. OPR-PPR, a Computer Program for Assessing Data Importance to Model Predictions Using Linear Statistics

    Science.gov (United States)

    Tonkin, Matthew J.; Tiedeman, Claire R.; Ely, D. Matthew; Hill, Mary C.

    2007-01-01

    The OPR-PPR program calculates the Observation-Prediction (OPR) and Parameter-Prediction (PPR) statistics that can be used to evaluate the relative importance of various kinds of data to simulated predictions. The data considered fall into three categories: (1) existing observations, (2) potential observations, and (3) potential information about parameters. The first two are addressed by the OPR statistic; the third is addressed by the PPR statistic. The statistics are based on linear theory and measure the leverage of the data, which depends on the location, the type, and possibly the time of the data being considered. For example, in a ground-water system the type of data might be a head measurement at a particular location and time. As a measure of leverage, the statistics do not take into account the value of the measurement. As linear measures, the OPR and PPR statistics require minimal computational effort once sensitivities have been calculated. Sensitivities need to be calculated for only one set of parameter values; commonly these are the values estimated through model calibration. OPR-PPR can calculate the OPR and PPR statistics for any mathematical model that produces the necessary OPR-PPR input files. In this report, OPR-PPR capabilities are presented in the context of using the ground-water model MODFLOW-2000 and the universal inverse program UCODE_2005. The method used to calculate the OPR and PPR statistics is based on the linear equation for prediction standard deviation. Using sensitivities and other information, OPR-PPR calculates (a) the percent increase in the prediction standard deviation that results when one or more existing observations are omitted from the calibration data set; (b) the percent decrease in the prediction standard deviation that results when one or more potential observations are added to the calibration data set; or (c) the percent decrease in the prediction standard deviation that results when potential information on one

  16. Statistical approach for assessing the influence of synoptic and meteorological conditions on ozone concentrations over Europe

    Science.gov (United States)

    Otero, Noelia; Butler, Tim; Sillmann, Jana

    2015-04-01

    Air pollution has become a serious problem in many industrialized and densely populated urban areas due to its negative effects on human health, agricultural crops and ecosystems. The concentration of air pollutants is the result of several factors, including emission sources, lifetime and spatial distribution of the pollutants, atmospheric properties and interactions, wind speed and direction, and topographic features. Episodes of air pollution are often associated with stationary or slowly migrating anticyclonic (high-pressure) systems that reduce advection, diffusion, and deposition of atmospheric pollutants. Certain weather conditions facilitate the concentration of pollutants, such as light winds that contribute to an increase in stagnation episodes affecting air quality. The atmospheric circulation therefore plays an important role in air quality, which is affected by both synoptic- and local-scale processes. This study assesses the influence of the large-scale circulation along with meteorological conditions on tropospheric ozone in Europe. The frequency of weather types (WTs) is examined under a novel approach, which is based on an automated version of the Lamb Weather Types catalog (Jenkinson and Collison, 1977). Here, we present an implementation of this classification point-by-point over the European domain. Moreover, the analysis uses a new grid-averaged climatology (1°x1°) of daily surface ozone concentrations from observations of individual sites that matches the resolution of global models (Schnell et al., 2014). Daily frequencies of WTs and meteorological conditions are combined in a multiple regression approach to investigate their influence on ozone concentrations. Different subsets of predictors are examined within multiple linear regression models (MLRs) for each grid cell in order to identify the best regression model. Several statistical metrics are applied for estimating the robustness of the

  17. Statistical Power Law due to Reservoir Fluctuations and the Universal Thermostat Independence Principle

    Directory of Open Access Journals (Sweden)

    Tamás Sándor Biró

    2014-12-01

    Certain fluctuations in particle number, \(n\), at fixed total energy, \(E\), lead exactly to a cut-power law distribution in the one-particle energy, \(\omega\), via the induced fluctuations in the phase-space volume ratio, \(\Omega_n(E-\omega)/\Omega_n(E) = (1-\omega/E)^n\). The only parameters are \(1/T = \langle \beta \rangle = \langle n \rangle/E\) and \(q = 1 - 1/\langle n \rangle + \Delta n^2/\langle n \rangle^2\). For the binomial distribution of \(n\) one obtains \(q = 1 - 1/k\), for the negative binomial \(q = 1 + 1/(k+1)\). These results also represent an approximation for general particle number distributions in the reservoir up to second order in the canonical expansion \(\omega \ll E\). For general systems the average phase-space volume ratio \(\langle e^{S(E-\omega)}/e^{S(E)} \rangle\) to second order delivers \(q = 1 - 1/C + \Delta\beta^2/\langle \beta \rangle^2\) with \(\beta = S'(E)\) and \(C = dE/dT\) the heat capacity. However, \(q \

  18. Citation Statistics

    OpenAIRE

    Adler, Robert; Ewing, John; Taylor, Peter

    2009-01-01

    This is a report about the use and misuse of citation data in the assessment of scientific research. The idea that research assessment must be done using ``simple and objective'' methods is increasingly prevalent today. The ``simple and objective'' methods are broadly interpreted as bibliometrics, that is, citation data and the statistics derived from them. There is a belief that citation statistics are inherently more accurate because they substitute simple numbers for complex judgments, and...

  19. Animal-powered tillage erosion assessment in the southern Andes region of Ecuador

    Science.gov (United States)

    Dercon, G.; Govers, G.; Poesen, J.; Sánchez, H.; Rombaut, K.; Vandenbroeck, E.; Loaiza, G.; Deckers, J.

    2007-06-01

    While water erosion has been the focus of past research in the Andes, former studies show that soil erosion could also be related to the methods used in cultivating the fields. The main objective of the present study was to assess (i) tillage erosion caused by the traditional animal-powered "yunta" or ard plough in the Andes and the factors controlling the process and (ii) the implications for soil conservation. Erosion rates were experimentally measured on 27 sites, having slopes from ca. 0% to 60% and soils ranging from Andosols to Cambisols, in the Andes region of Ecuador (Gima, Azuay). Different tillage methods were assessed: (i) tillage parallel to the contour lines ('Paralelo') and (ii) tillage at an angle with the contour lines. Statistical analysis points out that erosion caused by animal-powered tillage is gravity-driven. A strong correlation exists between slope and downslope displacement; furthermore, tillage depth and initial soil condition are important. For the 'Paralelo' tillage method the tillage transportation coefficient (k) is below 100 kg m-1 per tillage pass; for the combined 'Arado'-'Cruzado' tillage method k may exceed 300 kg m-1. Tillage erosion is responsible for the reduction of the slope between the contour strips over a relatively short time period of 20 years, resulting in the formation of terraces and therefore the reduction of the water erosion risk. However, at the same time it may negatively affect soil quality.

  20. Investigation of competition within the international wind power market. Supplementary report 1. Supplement 1: Actor profiles. Supplement 2: Note on India. Supplement 3: Statistics - tables

    International Nuclear Information System (INIS)

    The supplement to the report with the same title presents profiles of some wind turbine manufacturers located in European countries, U.S.A. and Japan, notes on wind power in India, and statistics and tables relevant to the wind power market. (AB)

  1. A Fractional Lower Order Statistics-Based MIMO Detection Method in Impulse Noise for Power Line Channel

    Directory of Open Access Journals (Sweden)

    CHEN, Z.

    2014-11-01

    Impulse noise in the power line communication (PLC) channel seriously degrades the performance of Multiple-Input Multiple-Output (MIMO) systems. To remedy this problem, a MIMO detection method based on fractional lower order statistics (FLOS) for the PLC channel with impulse noise is proposed in this paper. The alpha-stable distribution is used to model the impulse noise, and FLOS is applied to construct the MIMO detection criterion. The optimal detection solution is then obtained by a recursive least squares algorithm. Finally, the transmitted signals in the PLC MIMO system are restored with the obtained detection matrix. The proposed method does not require channel estimation and has low computational complexity. The simulation results show that the proposed method has better PLC MIMO detection performance than existing ones in an impulsive noise environment.
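    A small illustration of the quantities involved is given below: symmetric alpha-stable samples stand in for the impulse noise, and a fractional lower-order moment E|x|^p with p < alpha is computed. The parameter values are arbitrary assumptions, and this is a sketch of the noise model and FLOS quantity only, not the proposed detector.

```python
# Illustrative sketch: alpha-stable "impulse noise" and a fractional lower-order moment.
import numpy as np
from scipy.stats import levy_stable

alpha, p = 1.5, 0.8                       # characteristic exponent and moment order, p < alpha
noise = levy_stable.rvs(alpha, 0.0, size=10_000, random_state=3)  # symmetric alpha-stable

flom = np.mean(np.abs(noise) ** p)        # fractional lower-order moment E|x|^p (finite)
var = np.var(noise)                       # second-order moment diverges in theory
print(f"E|x|^{p} = {flom:.3f}; sample variance = {var:.1f} (unstable across runs)")
```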

  2. Rainfall Downscaling Conditional on Upper-air Atmospheric Predictors: Improved Assessment of Rainfall Statistics in a Changing Climate

    Science.gov (United States)

    Langousis, Andreas; Mamalakis, Antonis; Deidda, Roberto; Marrocu, Marino

    2015-04-01

    regional level. This is done for an intermediate-sized catchment in Italy, i.e. the Flumendosa catchment, using climate model rainfall and atmospheric data from the ENSEMBLES project (http://ensembleseu.metoffice.com). In doing so, we split the historical rainfall record of mean areal precipitation (MAP) in 15-year calibration and 45-year validation periods, and compare the historical rainfall statistics to those obtained from: a) Q-Q corrected climate model rainfall products, and b) synthetic rainfall series generated by the suggested downscaling scheme. To our knowledge, this is the first time that climate model rainfall and statistically downscaled precipitation are compared to catchment-averaged MAP at a daily resolution. The obtained results are promising, since the proposed downscaling scheme is more accurate and robust in reproducing a number of historical rainfall statistics, independent of the climate model used and the length of the calibration period. This is particularly the case for the yearly rainfall maxima, where direct statistical correction of climate model rainfall outputs shows increased sensitivity to the length of the calibration period and the climate model used. The robustness of the suggested downscaling scheme in modeling rainfall extremes at a daily resolution, is a notable feature that can effectively be used to assess hydrologic risk at a regional level under changing climatic conditions. Acknowledgments The research project is implemented within the framework of the Action «Supporting Postdoctoral Researchers» of the Operational Program "Education and Lifelong Learning" (Action's Beneficiary: General Secretariat for Research and Technology), and is co-financed by the European Social Fund (ESF) and the Greek State. CRS4 highly acknowledges the contribution of the Sardinian regional authorities.

  3. Discrimination power of short-term heart rate variability measures for CHF assessment.

    Science.gov (United States)

    Pecchia, Leandro; Melillo, Paolo; Sansone, Mario; Bracale, Marcello

    2011-01-01

    In this study, we investigated the discrimination power of short-term heart rate variability (HRV) for discriminating normal subjects from chronic heart failure (CHF) patients. We analyzed 1914.40 h of ECG recordings from 83 subjects, of which 54 were normal and 29 were suffering from CHF with New York Heart Association (NYHA) class I, II, or III, extracted from public databases. Following guidelines, we performed time- and frequency-domain analysis in order to measure the HRV features. To assess the discrimination power of the HRV features, we designed a classifier based on the classification and regression tree (CART) method, a nonparametric statistical technique that is effective for mining non-normally distributed medical data. The best subset of features for subject classification includes the square root of the mean of the squared differences between adjacent NN intervals (RMSSD), total power, high-frequency power, and the ratio between low- and high-frequency power (LF/HF). The classifier we developed achieved sensitivity and specificity values of 79.3% and 100%, respectively. Moreover, we demonstrated that it is possible to achieve sensitivity and specificity of 89.7% and 100%, respectively, by introducing two nonstandard features, ΔAVNN and ΔLF/HF, which account, respectively, for the variation over 24 h of the average of consecutive normal intervals (AVNN) and of LF/HF. Our results are comparable with those of similar studies, but the method we used is particularly valuable because it allows a fully human-understandable description of the classification procedure, in terms of intelligible "if … then …" rules. PMID:21075731
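    The CART step can be sketched with scikit-learn's decision tree, which likewise yields human-readable "if ... then ..." rules. The features below are synthetic stand-ins for RMSSD, total power, HF power and LF/HF, and the labeling rule is invented for illustration only; this is not the study's classifier or data.

```python
# Illustrative CART-style classification of synthetic HRV features.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([
    rng.normal(40, 15, n),     # RMSSD (ms), synthetic
    rng.normal(2500, 800, n),  # total power (ms^2), synthetic
    rng.normal(600, 250, n),   # HF power (ms^2), synthetic
    rng.normal(1.8, 0.7, n),   # LF/HF, synthetic
])
y = (X[:, 0] < 30) & (X[:, 3] > 2.0)      # toy "CHF" label for illustration

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
print(export_text(tree, feature_names=["RMSSD", "TotalPower", "HF", "LF_HF"]))
print("test accuracy:", tree.score(X_te, y_te))
```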

  4. Inter-speaker speech variability assessment using statistical deformable models from 3.0 Tesla magnetic resonance images

    OpenAIRE

    Maria JM Vasconcelos; Sandra MR Ventura; Diamantino RS Freitas; João Manuel RS Tavares

    2012-01-01

    The morphological and dynamic characterization of the vocal tract during speech production has been gaining greater attention due to the motivation of the latest improvements in Magnetic Resonance (MR) imaging; namely, with the use of higher magnetic fields, such as 3.0 Tesla. In this work, the automatic study of the vocal tract from 3.0 Tesla MR images was assessed through the application of statistical deformable models. Therefore, the primary goal focused on the analysis of the shape of th...

  5. Assessing water footprint of wheat production in China using a crop-model-coupled-statistics approach

    Directory of Open Access Journals (Sweden)

    X. C. Cao

    2014-01-01

    The aim of this study is to estimate the green and blue water footprint of wheat, distinguishing irrigated and rain-fed crops, from a production perspective. The assessment focuses on China and improves upon earlier research by taking a crop-model-coupled-statistics approach to estimate the water footprint of the crop in 30 provinces. We calculated the water footprint at regional scale based on actual data collected from 442 typical irrigation districts. Crop evapotranspiration and water conveyance loss are both considered in calculating the irrigated water footprint at the regional scale. We also compared the water footprint per unit product between irrigated and rain-fed crops and analyzed the relationship between promoting yield and saving water resources. National wheat production in the year 2010 took about 142.5 billion cubic meters of water. The major portion of the water footprint (WF) (80.9%) comes from irrigated farmland, and the remaining 19.1% from rain-fed farmland. Green water (50.3%) and blue water (49.7%) carry almost equal shares of the total cropland WF. Green water dominates the south of the Yangtze River, whereas low green water proportions are found in the provinces of northern China, especially northwest China. Approximately 38.5% of the water footprint related to wheat production is consumed not as crop evapotranspiration but as conveyance loss during the irrigation process. Proportions of blue water for conveyance loss (BWCL) in the arid Xinjiang, Ningxia and Neimenggu (Inner Mongolia) exceed 40% due to low irrigation efficiency. The national average water footprint of wheat per unit of crop (WFP) is 1.237 m3 kg−1 in 2010. WFP differs considerably among provinces. Compared to rain-fed cultivation (with no irrigation), irrigation has promoted crop yield both provincially and nationally, by about 170% nationally. As a result, more water resources are demanded in

  6. Indicators for assessing the safety level of nuclear power plants

    International Nuclear Information System (INIS)

    Since the political opening of the states of Central and Eastern Europe roughly one decade ago, Western industrialized countries in particular have been striving to achieve sustainable improvements in the safety of nuclear reactors in those countries. One objective of these efforts is to ensure a high level of nuclear safety and safety culture in line with worldwide endeavors. The enlargement of the European Union in the very near future offers an opportunity for reaching this goal in the participating countries. Existing international framework agreements refer to the appropriate safety guidelines. At EU level, the harmonization of nuclear safety standards has been an important topic for years, with specific constructive activities being initiated, e.g., by the industry and by regulatory authorities. Uniform safety standards should not be the basis of proven reviews conducted by the national licensing and supervisory authorities. The objective should be the development of key requirements as framework conditions, irrespective of their practical implementation. They could be applied to any nuclear power plant in an accession country, but likewise to plants in member states, in order to provide an overview of the current safety status of a nuclear power plant and the rules by which it is run. As deriving uniform safety standards is a very expensive and lengthy procedure, the approach shown here identifies six main areas of review for light water reactors (safety systems; integrity of the safety barriers; risk assessment; radiation exposure of the plant personnel and the environment; plant operations management; plant safety) and the associated safety indicators, with reference criteria formulated as concretely as possible. This proposal also lends itself to international individual evaluations of safety levels and could facilitate the review process already under way for the EU candidate countries. (orig.)

  7. Probabilistic risk assessment course documentation. Volume 2. Probability and statistics for PRA applications

    International Nuclear Information System (INIS)

    This course is intended to provide the necessary probabilistic and statistical skills to perform a PRA. Fundamental background information is reviewed, but the principal purpose is to address specific techniques used in PRAs and to illustrate them with applications. Specific examples and problems are presented for most of the topics

  8. A Mixed-Methods Assessment of Using an Online Commercial Tutoring System to Teach Introductory Statistics

    Science.gov (United States)

    Xu, Yonghong Jade; Meyer, Katrina A.; Morgan, Dianne D.

    2009-01-01

    This study used a mixed-methods approach to evaluate a hybrid teaching format that incorporated an online tutoring system, ALEKS, to address students' learning needs in a graduate-level introductory statistics course. Student performance in the hybrid course with ALEKS was found to be no different from that in a course taught in a traditional…

  9. Assessing the Disconnect between Grade Expectation and Achievement in a Business Statistics Course

    Science.gov (United States)

    Berenson, Mark L.; Ramnarayanan, Renu; Oppenheim, Alan

    2015-01-01

    In an institutional review board--approved study aimed at evaluating differences in learning between a large-sized introductory business statistics course section using courseware assisted examinations compared with small-sized sections using traditional paper-and-pencil examinations, there appeared to be a severe disconnect between the final…

  10. Dynamic Assessment of Radon Source in Buildings, Based on Tracer Gas Experiment Statistical Modeling

    Czech Academy of Sciences Publication Activity Database

    Brabec, Marek; Jílek, K.

    Hauppauge: Nova Science Publishers, 2012 - (Li, Z.; Feng, C.), s. 211-242 ISBN 978-1-62100-177-5 Institutional research plan: CEZ:AV0Z10300504 Keywords : radon * dynamic modeling * functional data analysis Subject RIV: BB - Applied Statistics, Operational Research https://www.novapublishers.com/catalog/product_info.php?products_id=23545

  11. Statistical Analysis of Wind Power Density Based on the Weibull and Rayleigh Models of Selected Site in Malaysia

    Directory of Open Access Journals (Sweden)

    Aliashim Albani

    2014-02-01

    Full Text Available The demand for electricity in Malaysia is growing in tandem with its Gross Domestic Product (GDP) growth. Malaysia is going to need even more energy as it strives to grow towards a high-income economy. Malaysia has taken steps toward exploring renewable energy (RE), including wind energy, as an alternative source for generating electricity. In the present study, the wind energy potential of the sites is statistically analyzed based on one year of measured hourly time-series wind speed data. Wind data were obtained from the Malaysian Meteorological Department (MMD) weather stations at nine selected sites in Malaysia. The data were processed in MATLAB to determine the Weibull and Rayleigh distribution functions. Both Weibull and Rayleigh models were fitted and compared to the field data probability distributions of the year 2011. The analysis showed that the Weibull distribution fits the field data better than the Rayleigh distribution for the whole of 2011. The wind power density of every site has been studied based on the Weibull and Rayleigh functions. The Weibull distribution provides a good approximation for estimating wind power density in Malaysia.
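
As a hedged illustration of the workflow described above (the record itself uses MATLAB and the MMD measurements), the following Python sketch fits a Weibull distribution to an hourly wind-speed series, ties a Rayleigh distribution to the same mean speed, and estimates the mean wind power density from each. The wind-speed data are synthetic and the air density is an assumed standard value.

```python
# Fit Weibull and Rayleigh models to hourly wind speeds and estimate mean wind
# power density. The "measured" series below is synthetic stand-in data.
import numpy as np
from scipy import stats
from scipy.special import gamma

rho = 1.225  # assumed air density, kg/m^3
speeds = stats.weibull_min.rvs(2.0, scale=6.0, size=8760, random_state=0)  # synthetic hourly data

# Weibull fit (shape k, scale c); location fixed at zero as is conventional for wind data.
k, _, c = stats.weibull_min.fit(speeds, floc=0)
pd_weibull = 0.5 * rho * c**3 * gamma(1 + 3.0 / k)  # mean power density, W/m^2

# Rayleigh is a Weibull with k = 2; its scale follows from the mean speed.
sigma = speeds.mean() / np.sqrt(np.pi / 2)
pd_rayleigh = 0.5 * rho * stats.rayleigh.moment(3, scale=sigma)

print(f"Weibull: k={k:.2f}, c={c:.2f} m/s, power density={pd_weibull:.1f} W/m^2")
print(f"Rayleigh: power density={pd_rayleigh:.1f} W/m^2")
```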

  12. Steam generator assessment for sustainable power plant operation

    International Nuclear Information System (INIS)

    Water and steam serve in the water-steam cycle as the energy transport and work media. These fluids must not impair undisturbed plant operation through corrosion processes on the construction materials and their consequences. The main objectives of the steam-water cycle chemistry consequently are: - The metal release rates of the structural materials shall be minimal. - The probability of selective / localized forms of corrosion shall be minimal. - The deposition of corrosion products on heat transfer surfaces shall be minimized. - The formation of aggressive media, particularly local aggressive environments under deposits, shall be avoided. These objectives are especially important for the steam generators (SGs) because their condition is a key factor for plant performance, high plant availability and lifetime extension, and is important to NPP safety. The major threats are corrosion and fouling of the heating tubes. Effective ways of counteracting these degradation problems, and thus of improving SG performance, are to keep the SGs in clean condition or, if necessary, to plan cleaning measures such as mechanical tube sheet lancing or chemical cleaning. Based on more than 40 years of experience in steam-water cycle water chemistry treatment, AREVA developed an overall methodology for assessing the steam generator cleanliness condition by evaluating all available operational and inspection data together. In order to gain a complete picture, all relevant water chemistry data (e.g. corrosion product mass balances, impurity ingress), inspection data (e.g. visual inspections and tube sheet lancing results) and thermal performance data (e.g. heat transfer calculations) are evaluated, structured and indexed using the AREVA Fouling Index Tool Box. This Fouling Index Tool Box is more than a database or statistical approach for assessment of plant chemistry data. Furthermore AREVA's approach combines manufacturer's experience with plant data and operates with an

  13. Application of statistical methods (SPC) for an optimized control of the irradiation process of high-power semiconductors

    International Nuclear Information System (INIS)

    High-power bipolar semiconductor devices (thyristors and diodes) in a disc-type shape are key components (semiconductor switches) for high-power electronic systems. These systems are important for the economic design of energy transmission systems, i.e. high-power drive systems, static compensation and high-voltage DC transmission lines. In their factory located in Pretzfeld, Germany, the company eupec GmbH+Co.KG (eupec) is producing disc-type devices with ceramic encapsulation in the high-end range for the world market. These elements have to fulfill special customer requirements and therefore deliver tailor-made trade-offs between their on-state voltage and dynamic switching behaviour. This task can be achieved by applying a dedicated electron irradiation to the semiconductor pellets, which tunes this trade-off. In this paper, the requirements placed on the irradiation company Mediscan GmbH, from the point of view of the semiconductor manufacturer, are described. The actual strategy for controlling the irradiation results to fulfill these requirements is presented, together with the choice of relevant parameters from the viewpoint of the irradiation company. The set of process parameters monitored, using statistical process control (SPC) techniques, includes beam current and energy, conveyor speed and irradiation geometry. The results are highlighted and show the successful co-operation in this business. Viewing this process in reverse, an idea is presented and discussed to develop a highly sensitive dose detection device using modified diodes, which could function as accurate yet cheap and easy-to-use routine dosimeters for irradiation institutes. (author)
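
The record describes monitoring irradiation parameters with SPC charts. A minimal individuals control chart, of the kind such monitoring typically relies on, can be set up as in the sketch below; the beam-current readings are invented and the chart type is our assumption, not necessarily the one used at Mediscan.

```python
# Individuals (X) control chart limits for a monitored irradiation parameter.
# The +/- 3-sigma limits are estimated from the average moving range (d2 = 1.128 for n = 2).
import numpy as np

readings = np.array([10.2, 10.1, 10.4, 9.9, 10.0, 10.3, 10.2, 9.8, 10.1, 10.5])  # hypothetical beam current, mA

moving_range = np.abs(np.diff(readings))
sigma_hat = moving_range.mean() / 1.128
center = readings.mean()
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

violations = (readings > ucl) | (readings < lcl)
print(f"CL={center:.2f}  UCL={ucl:.2f}  LCL={lcl:.2f}  out-of-control points={violations.sum()}")
```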

  14. Application of statistical methods (SPC) for an optimized control of the irradiation process of high-power semiconductors

    Science.gov (United States)

    Mittendorfer, J.; Zwanziger, P.

    2000-03-01

    High-power bipolar semiconductor devices (thyristors and diodes) in a disc-type shape are key components (semiconductor switches) for high-power electronic systems. These systems are important for the economic design of energy transmission systems, i.e. high-power drive systems, static compensation and high-voltage DC transmission lines. In their factory located in Pretzfeld, Germany, the company eupec GmbH+Co.KG (eupec) is producing disc-type devices with ceramic encapsulation in the high-end range for the world market. These elements have to fulfil special customer requirements and therefore deliver tailor-made trade-offs between their on-state voltage and dynamic switching behaviour. This task can be achieved by applying a dedicated electron irradiation to the semiconductor pellets, which tunes this trade-off. In this paper, the requirements placed on the irradiation company Mediscan GmbH, from the point of view of the semiconductor manufacturer, are described. The actual strategy for controlling the irradiation results to fulfil these requirements is presented, together with the choice of relevant parameters from the viewpoint of the irradiation company. The set of process parameters monitored, using statistical process control (SPC) techniques, includes beam current and energy, conveyor speed and irradiation geometry. The results are highlighted and show the successful co-operation in this business. Viewing this process in reverse, an idea is presented and discussed to develop a highly sensitive dose detection device using modified diodes, which could function as accurate yet cheap and easy-to-use detectors for routine dosimetry at irradiation institutes.

  15. Assessing the decennial, reassessing the global:Understanding European Union normative power in global politics

    OpenAIRE

    Manners, Ian James

    2013-01-01

    This concluding article assesses the past decade of international scholarship on the European Union (EU) and normative power as represented by the contributions to the special issue. It argues that the normative power approach (NPA) makes it possible to explain, understand and judge the EU in global politics by rethinking the nature of power and actorness in a globalizing, multilateralizing and multipolarizing era. To do this, the article assesses the past decade in terms of normative power e...

  16. Transient Stability Assessment of Smart Power System using Complex Networks Framework

    CERN Document Server

    Nasiruzzaman, A B M

    2011-01-01

    In this paper, a new methodology for stability assessment of a smart power system is proposed. The key to this assessment is an index, called the betweenness index, which is based on ideas from complex network theory. The proposed betweenness index improves upon previous works since it considers the actual real power flow through the transmission lines of the network. Furthermore, this work initiates a new area of complex-system research for assessing the stability of the power system.
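
The record's betweenness index is not reproduced here, but the general idea of ranking buses by how much power-weighted "traffic" passes through them can be sketched with a standard graph library. In the sample below the five-bus grid, the flow values, and the use of inverse flow as an edge distance are all illustrative assumptions.

```python
# Rough sketch of a flow-weighted betweenness ranking for buses in a toy grid.
import networkx as nx

G = nx.Graph()
edges = [(1, 2, 120.0), (2, 3, 80.0), (1, 4, 60.0), (4, 3, 40.0), (3, 5, 100.0)]  # (bus, bus, assumed MW flow)
for u, v, flow in edges:
    # inverse of flow as "distance" so heavily loaded lines attract shortest paths
    G.add_edge(u, v, weight=1.0 / flow, flow=flow)

index = nx.betweenness_centrality(G, weight="weight")
for bus, score in sorted(index.items(), key=lambda kv: -kv[1]):
    print(f"bus {bus}: betweenness {score:.3f}")
```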

  17. Assessing welfare impact of entry into power market

    International Nuclear Information System (INIS)

    This paper calculates the welfare impact of a new entrant based on the location of entry in the Korean electricity market. We use two different models. One is the optimal fuel mix model to estimate the effect of a new entry in the long run. The other is the variable cost minimization model to assess the contribution of an existing installed private generator in the short run. A specific private generator, which has a cost advantage, saves a substantial amount of system-wide variable costs. We show that the right location for a new entrant can save power generation costs significantly, even if a new entrant does not have a cost advantage. - Highlights: • This paper calculates the welfare impact of a new entrant based on the location of entry. • We use two different models to estimate the entry effect. • The minimum and maximum cost savings of a new entrant are about 0.3% and 0.84% of total generation cost. • Even if a new entrant has no cost advantage, its choice of location could save money

  18. Wind power in Eritrea, Africa: A preliminary resource assessment

    Energy Technology Data Exchange (ETDEWEB)

    Garbesi, K.; Rosen, K. [San Jose State Univ., CA (United States); Van Buskirk, R. [Dept. of Energy, Eritrea (Ethiopia)

    1997-12-31

    The authors' preliminary assessment of Eritrean wind energy potential identified two promising regions: (1) the southeastern Red Sea coast and (2) the mountain passes that channel winds between the coastal lowlands and the interior highlands. The coastal site, near the port city of Aseb, has an exceptionally good resource, with estimated average annual wind speeds at 10-m height above 9 m/s at the airport and 7 m/s in the port. Furthermore, the southern 200 km of coastline has offshore annual-average wind speeds above 6 m/s. This area has strong potential for development, having a local 20 MW grid and unmet demand from the fishing industry and development projects. Although the highland sites contain only marginal wind resources (approximately 5 m/s), they warrant further investigation because of their proximity to the capital city, Asmera, which has the largest unmet demand and a larger power grid (40 MW, with an additional 80 MW planned) to absorb an intermittent source without storage.

  19. Assessment of Feeder Wall Thinning of Wolsong Nuclear Power Plants

    International Nuclear Information System (INIS)

    The reactor of the CANDU units at the Wolsong nuclear power generating station is composed of 380 pressure tubes. The primary heat transport circuit of a CANDU connects each pressure tube to headers on the way to and from the steam generators. The feeders are made of A-106 carbon steel and suffer wall thinning by flow-accelerated corrosion. Excessive thinning deteriorates the pressure-retaining capability of the piping, so the minimum allowable thickness of the feeders must be maintained throughout their life. Feeder wall thinning should be monitored by in-service inspection. A knowledge-based inspection strategy needs to be developed, since the combination of a high radiation field and geometric restrictions near the tight bend makes extensive inspection very difficult. A thermal-hydraulic assessment using computational fluid dynamics software and feeder wall thinning simulation experiments using plaster of Paris may provide valuable information for understanding the characteristic features of feeder wall thinning. The plant in-service inspection database may be another source of valuable information. This paper summarizes a review of feeder wall thinning in the Wolsong CANDU station. The W-1 feeders suffered significant thinning and are being replaced as part of the plant refurbishment campaign. The other units, W-2∼4, are still in the early portion of their operational life. A result of a feeder wall thinning simulation test using plaster of Paris is presented. The knowledge presented in this paper is important information for designing a knowledge-based in-service inspection program for feeder wall thinning

  20. The assessment of the environmental external costs of power plants for both coal-fired plant and nuclear power plant

    International Nuclear Information System (INIS)

    Efforts were made to assess the environmental external costs of power plants for both the Samchonpo coal-fired plant and the Younggwang nuclear power plant by using the computer program developed by the IAEA. In the case that emission control devices such as a precipitator for particulate reduction, a wet scrubber for SO2, and a low-NOx burner for NOx were installed in the coal-fired power plant, the total environmental external cost was estimated as 33.97 Won/kWh, much higher than the 0.76 Won/kWh of the Younggwang nuclear power plant. This study also assessed and compared the environmental external costs when the Younggwang nuclear power plant was replaced by a coal-fired power plant at the same site and with the same capacity. According to the result, the total environmental external cost of the coal-fired power plant, with the emission control devices installed, was estimated as 792 million US$, about 50 times higher than the 15 million US$ of the Younggwang nuclear power plant. Although the result of this study had some limits due to using a simplified model, it was still true that nuclear power was a much more environmentally friendly power source than coal-fired power

  1. Hanford groundwater modeling: statistical methods for evaluating uncertainty and assessing sampling effectiveness

    Energy Technology Data Exchange (ETDEWEB)

    McLaughlin, D.B.

    1979-01-01

    This report is the first in a series of three documents which address the role of uncertainty in the Rockwell Hanford Operations groundwater model development and application program at Hanford Site. Groundwater data collection activities at Hanford are reviewed as they relate to Rockwell groundwater modeling. Methods of applying statistical and probability theory in quantifying the propagation of uncertainty from field measurements to model predictions are discussed. It is shown that measures of model accuracy or uncertainty provided by a statistical analysis can be useful in guiding model development and sampling network design. Recommendations are presented in the areas of model input data needs, parameter estimation data needs, and model verification and variance estimation data needs. 8 figures.
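
The propagation of measurement uncertainty into model predictions that this report discusses can be illustrated with a simple Monte Carlo sketch. The toy model below (a one-dimensional Darcy-type flux) and the distributions assigned to conductivity and gradient are assumptions for illustration only, not the Hanford model or data.

```python
# Monte Carlo propagation of measurement uncertainty into a model prediction.
import numpy as np

rng = np.random.default_rng(42)
n = 10_000
conductivity = rng.lognormal(mean=np.log(5e-5), sigma=0.4, size=n)  # m/s, uncertain field measurement
gradient = rng.normal(loc=0.002, scale=0.0004, size=n)              # dimensionless head gradient

flux = conductivity * gradient  # toy model: specific discharge, m/s

print(f"mean flux = {flux.mean():.2e} m/s")
print(f"95% interval = [{np.percentile(flux, 2.5):.2e}, {np.percentile(flux, 97.5):.2e}] m/s")
```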

  2. Assessing whether there is a cancer premium for the value of a statistical life.

    Science.gov (United States)

    Viscusi, W Kip; Huber, Joel; Bell, Jason

    2014-04-01

    This article estimates whether there is a cancer risk premium for the value of a statistical life using stated preference valuations of cancer risks for a large, nationally representative US sample. The present value of an expected cancer case that occurs after a one-decade latency period is $10.85m, consistent with a cancer premium that is 21% greater than the median value-of-statistical-life estimate for acute fatalities. This cancer premium is smaller than the premium proposed for policy analyses in the UK and the USA. There is also a greater premium for policies that reduce cancer risks to zero and for risk reductions affecting those who perceive themselves to have a greater than average probability of having cancer. PMID:23520055
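
Only to make the structure of such an estimate concrete, the sketch below applies a cancer premium to a base value of a statistical life and discounts it over a latency period. The base VSL and discount rate are invented; the 21% premium and ten-year latency are taken from the record, and the result is not expected to reproduce the article's $10.85m figure.

```python
# Hedged illustration: premium applied to a base VSL, then discounted over the latency period.
base_vsl = 9.0e6        # assumed median VSL for an acute fatality, USD
cancer_premium = 0.21   # premium reported in the record
latency_years = 10
discount_rate = 0.03    # assumed

future_value = base_vsl * (1 + cancer_premium)
present_value = future_value / (1 + discount_rate) ** latency_years
print(f"Present value of a latent cancer case: ${present_value / 1e6:.2f} million")
```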

  3. Hanford groundwater modeling: statistical methods for evaluating uncertainty and assessing sampling effectiveness

    International Nuclear Information System (INIS)

    This report is the first in a series of three documents which address the role of uncertainty in the Rockwell Hanford Operations groundwater model development and application program at Hanford Site. Groundwater data collection activities at Hanford are reviewed as they relate to Rockwell groundwater modeling. Methods of applying statistical and probability theory in quantifying the propagation of uncertainty from field measurements to model predictions are discussed. It is shown that measures of model accuracy or uncertainty provided by a statistical analysis can be useful in guiding model development and sampling network design. Recommendations are presented in the areas of model input data needs, parameter estimation data needs, and model verification and variance estimation data needs. 8 figures

  4. A statistical toolbox for metagenomics: assessing functional diversity in microbial communities

    OpenAIRE

    Handelsman Jo; Schloss Patrick D

    2008-01-01

    Abstract Background The 99% of bacteria in the environment that are recalcitrant to culturing have spurred the development of metagenomics, a culture-independent approach to sample and characterize microbial genomes. Massive datasets of metagenomic sequences have been accumulated, but analysis of these sequences has focused primarily on the descriptive comparison of the relative abundance of proteins that belong to specific functional categories. More robust statistical methods are needed to ...

  5. Assessment of Surface Water Quality in Hyderabad Lakes by Using Multivariate Statistical Techniques, Hyderabad-India

    OpenAIRE

    A. Sridhar Kumar; A. Madhava Reddy; L. Srinivas; P. Manikya Reddy

    2015-01-01

    Multivariate statistical techniques such as cluster analysis (CA), principal component analysis (PCA) and factor analysis (FA) were applied for the evaluation of temporal variations and the interpretation of a large, complex water quality data set for Hyderabad city, generated during the 2013-14 monitoring of 16 parameters at 23 different sites with an average depth of 1 m. Hierarchical cluster analysis (CA) is first applied to distinguish the three general water quality patterns among the stat...

  6. Integrating Expert Knowledge with Statistical Analysis for Landslide Susceptibility Assessment at Regional Scale

    OpenAIRE

    Christos Chalkias; Christos Polykretis; Maria Ferentinou; Efthimios Karymbalis

    2016-01-01

    In this paper, an integrated landslide susceptibility model combining expert-based and bivariate statistical analysis (Landslide Susceptibility Index—LSI) approaches is presented. Factors related to the occurrence of landslides—such as elevation, slope angle, slope aspect, lithology, land cover, Mean Annual Precipitation (MAP) and Peak Ground Acceleration (PGA)—were analyzed within a GIS environment. This integrated model produced a landslide susceptibility map which categorized the stu...

  7. A statistical concept to assess the uncertainty in Bayesian model weights and its impact on model ranking

    Science.gov (United States)

    Schöniger, Anneli; Wöhling, Thomas; Nowak, Wolfgang

    2015-09-01

    Bayesian model averaging (BMA) ranks the plausibility of alternative conceptual models according to Bayes' theorem. A prior belief about each model's adequacy is updated to a posterior model probability based on the skill to reproduce observed data and on the principle of parsimony. The posterior model probabilities are then used as model weights for model ranking, selection, or averaging. Despite the statistically rigorous BMA procedure, model weights can become uncertain quantities due to measurement noise in the calibration data set or due to uncertainty in model input. Uncertain weights may in turn compromise the reliability of BMA results. We present a new statistical concept to investigate this weighting uncertainty, and thus, to assess the significance of model weights and the confidence in model ranking. Our concept is to resample the uncertain input or output data and then to analyze the induced variability in model weights. In the special case of weighting uncertainty due to measurement noise in the calibration data set, we interpret statistics of Bayesian model evidence to assess the distance of a model's performance from the theoretical upper limit. To illustrate our suggested approach, we investigate the reliability of soil-plant model selection following up on a study by Wöhling et al. (2015). Results show that the BMA routine should be equipped with our suggested upgrade to (1) reveal the significant but otherwise undetected impact of measurement noise on model ranking results and (2) to decide whether the considered set of models should be extended with better performing alternatives.
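
The resampling idea described above can be sketched in a few lines: compute model weights from the misfit to the calibration data, then redraw the measurement noise many times and watch how much the weights (and hence the ranking) move. Everything in the example below, the two toy models, the noise level, and the Gaussian-likelihood shortcut, is an illustrative assumption rather than the authors' implementation.

```python
# Bootstrap-style assessment of Bayesian model averaging (BMA) weight stability.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0, 1, 20)
sigma_noise = 0.1
data = 2.0 * x + 0.3 + rng.normal(0, sigma_noise, x.size)  # synthetic calibration data

# Two competing "conceptual models" with fixed (pre-calibrated) parameters.
predictions = {"linear": 2.1 * x + 0.25, "quadratic": 1.5 * x**2 + 0.9 * x + 0.2}

def bma_weights(obs):
    # Gaussian likelihood with known noise level and equal prior model probabilities.
    log_like = {m: -0.5 * np.sum((obs - p) ** 2) / sigma_noise**2 for m, p in predictions.items()}
    shift = max(log_like.values())
    w = {m: np.exp(v - shift) for m, v in log_like.items()}
    total = sum(w.values())
    return {m: v / total for m, v in w.items()}

# Resample the measurement noise and record the induced spread of the weights.
samples = [bma_weights(data + rng.normal(0, sigma_noise, x.size)) for _ in range(500)]
for model in predictions:
    ws = np.array([s[model] for s in samples])
    print(f"{model:>9}: weight {ws.mean():.2f} +/- {ws.std():.2f}")
```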

  8. Toxicity guidelines for waste assessment of nuclear power scenarios

    International Nuclear Information System (INIS)

    Under the provisions of the 1991 French radioactive waste management law, various fuel cycle scenarios will be assessed and compared in terms of feasibility, flexibility, cost, and ultimate waste radio-toxic inventory. The latter criterion may be further broken down into 'potential radio-toxic inventory' (the radio-toxic inventory of all the radionuclides produced) and 'residual radio-toxic inventory' (the radionuclide fraction reaching the biosphere alter migration from the repository). The innovative scientific contribution of this study is to consider a third type of radio-toxic inventory: the potential radio-toxic inventory alter conditioning, i.e. taking into account the containment capacity of the radionuclide conditioning matrices. The matrix fraction subjected to alteration over time determines the potential for radionuclide release, hence the notion of the potential radio-toxic inventory alter conditioning. An initial comparison of possible scenarios is proposed by considering orders of magnitude for the radionuclide containment capacity of the disposal matrices and for their mobilization potential. All the scenarios investigated are normalized to the same annual electric power production so that a legitimate comparison can be established for the ultimate wasteform produced per year of operation. This approach reveals significant differences among the scenarios considered that do not appear when only the raw potential radio-toxic inventory is taken into account. The matrix containment performance has a decisive effect on the final impact of a given scenario or type of scenario. Pu recycling scenarios thus reduce the potential radio-toxicity by roughly a factor of 50 compared with an open cycle; the gain rises to a factor of about 300 for scenarios in which Pu and the minor actinides are recycled. Interestingly, the results obtained by the use of a dedicated containment matrix for the minor actinides in a scenario limited to Pu recycling were comparable to

  9. Integration of Remote Sensing Techniques With Statistical Methods For Landslide Monitoring and Risk Assessment

    Science.gov (United States)

    van Westen, Cees; Wunderle, Stefan; Pasquali, Paolo

    In the frame of the Data User Programme 2 (DUP) of the European Space Agency (ESA), a new method will be presented to derive landslide hazards, which was developed in close co-operation with the end users in Honduras and Switzerland, respectively. The objective of this project is to define a sustainable service using a novel approach based on the fusion of two independent methods, namely combining differential SAR interferometry techniques (DInSAR) with a statistical approach. The bivariate statistical analysis is based on parameter maps (slope, geomorphology, land use) derived from remote sensing data and field checks, as well as on historical aerial photos. The hybrid method is based on SAR data of recent years and new ENVISAT-ASAR data as well as historical data (i.e. former landslides detected in aerial photos), respectively. The historical occurrence of landslides will be combined with actual land sliding and creeping obtained from DInSAR. The resulting high-quality landslide occurrence map forms the input for the statistical landslide hazard analysis. The method intends to derive information on landslide hazards, preferably in the form of probabilities, which will be combined with information on building stock, infrastructure and population density. The vulnerability of population and infrastructure will be taken into account by a weighting factor. The resulting risk maps will be of great value for local authorities, the Comisión Permanente de Contingencias (COPECO) of Honduras, local GIS specialists, policy makers and reinsurance companies. We will show the results of the Service Definition Project with some examples of the new method, especially for Tegucigalpa, the capital of Honduras, with approximately 1 million inhabitants.
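
A common bivariate ingredient of such statistical landslide hazard analyses is a frequency ratio (or weight) per parameter class. The sketch below shows that calculation for an invented slope-angle map; the class boundaries and pixel counts are hypothetical and stand in for the GIS parameter maps and landslide inventory the record refers to.

```python
# Frequency-ratio style bivariate analysis for one landslide conditioning factor.
classes = {  # class name: (pixels in class, landslide pixels in class), all invented
    "0-10 deg": (50_000, 40),
    "10-20 deg": (30_000, 90),
    "20-30 deg": (15_000, 120),
    ">30 deg": (5_000, 50),
}

total_pixels = sum(n for n, _ in classes.values())
total_slides = sum(s for _, s in classes.values())

for name, (n_pix, n_slide) in classes.items():
    fr = (n_slide / total_slides) / (n_pix / total_pixels)
    print(f"{name:>10}: frequency ratio = {fr:.2f}")
```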

  10. Archival Legacy Investigations of Circumstellar Environments (ALICE): Statistical assessment of point source detections

    Science.gov (United States)

    Choquet, Élodie; Pueyo, Laurent; Soummer, Rémi; Perrin, Marshall D.; Hagan, J. Brendan; Gofas-Salas, Elena; Rajan, Abhijith; Aguilar, Jonathan

    2015-09-01

    The ALICE program, for Archival Legacy Investigation of Circumstellar Environment, is currently conducting a virtual survey of about 400 stars by re-analyzing the HST-NICMOS coronagraphic archive with advanced post-processing techniques. We present here the strategy that we adopted to identify detections and potential candidates for follow-up observations, and we give a preliminary overview of our detections. We present a statistical analysis conducted to evaluate the confidence level of these detections and the completeness of our candidate search.

  11. Archival Legacy Investigations of Circumstellar Environments (ALICE): Statistical assessment of point source detections

    CERN Document Server

    Choquet, É; Soummer, R; Perrin, M D; Hagan, J B; Gofas-Salas, E; Rajan, A; Aguilar, J

    2015-01-01

    The ALICE program, for Archival Legacy Investigation of Circumstellar Environment, is currently conducting a virtual survey of about 400 stars by re-analyzing the HST-NICMOS coronagraphic archive with advanced post-processing techniques. We present here the strategy that we adopted to identify detections and potential candidates for follow-up observations, and we give a preliminary overview of our detections. We present a statistical analysis conducted to evaluate the confidence level of these detections and the completeness of our candidate search.

  12. Safety assessment for the passive system of the nuclear power plants (NPPs) using safety margin estimation

    Energy Technology Data Exchange (ETDEWEB)

    Woo, Tae-Ho; Lee, Un-Chul [Department of Nuclear Engineering, Seoul National University, Gwanak 599, Gwanak-ro, Gwanak-gu, Seoul 151-742 (Korea)

    2010-04-15

    The probabilistic safety assessment (PSA) for gas-cooled nuclear power plants has been investigated for the case where operational data are deficient, because there is no commercial gas-cooled nuclear power plant in operation. Therefore, it is necessary to use statistical data to construct the basic events. Several estimates of the safety margin are introduced for quantifying the failure frequency of the basic events, built on the concepts of impact and affordability. The trend of probability of failure (TPF) and a fuzzy converter (FC) are introduced using the safety margin, which provides a simplified and convenient representation of the event characteristics. The mass flow rate in natural circulation is studied for the modeling. The potential energy in gravity, the temperature and pressure in heat conduction, and the heat transfer rate in the internal stored energy are also investigated. The values in the probability set are compared with those of the fuzzy set modeling. The non-linearity of the safety margin is expressed by the fuzziness of the membership function. This artificial-intelligence analysis based on fuzzy sets could enhance the reliability of the system assessment compared to the probabilistic analysis. (author)
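
As a very rough illustration of the fuzzy-set element mentioned in the record, the snippet below maps a normalized safety margin onto a membership degree with a triangular membership function. The function shape, its breakpoints and the margin value are all assumptions made for the example.

```python
# Triangular fuzzy membership for a normalized safety margin (illustrative only).
def triangular_membership(x, a, b, c):
    """Degree of membership of x in a triangular fuzzy set with support (a, c) and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

margin = 0.35  # hypothetical normalized safety margin
print(f"membership in 'adequate margin': {triangular_membership(margin, 0.1, 0.5, 0.9):.2f}")
```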

  13. Assessment of statistical uncertainty in the quantitative analysis of solid samples in motion using laser-induced breakdown spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Cabalin, L.M.; Gonzalez, A. [Department of Analytical Chemistry, University of Malaga, E-29071 Malaga (Spain); Ruiz, J. [Department of Applied Physics I, University of Malaga, E-29071 Malaga (Spain); Laserna, J.J., E-mail: laserna@uma.e [Department of Analytical Chemistry, University of Malaga, E-29071 Malaga (Spain)

    2010-08-15

    Statistical uncertainty in the quantitative analysis of solid samples in motion by laser-induced breakdown spectroscopy (LIBS) has been assessed. For this purpose, a LIBS demonstrator was designed and constructed in our laboratory. The LIBS system consisted of a laboratory-scale conveyor belt, a compact optical module and a Nd:YAG laser operating at 532 nm. The speed of the conveyor belt was variable and could be adjusted up to a maximum speed of 2 m/s. Statistical uncertainty in the analytical measurements was estimated in terms of precision (reproducibility and repeatability) and accuracy. The results obtained by LIBS on shredded scrap samples under real conditions have demonstrated that the analytical precision and accuracy of LIBS is dependent on the sample geometry, position on the conveyor belt and surface cleanliness. Flat, relatively clean scrap samples exhibited acceptable reproducibility and repeatability; by contrast, samples with an irregular shape or a dirty surface exhibited a poor relative standard deviation.
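
The precision figures the record refers to reduce to relative standard deviations over repeated measurements. The short sketch below computes repeatability and reproducibility RSDs for invented LIBS intensity readings; it is only meant to make the terminology concrete.

```python
# Relative standard deviation (RSD) as a simple precision metric.
import numpy as np

same_run = np.array([1020.0, 1005.0, 998.0, 1012.0, 1030.0, 995.0])    # repeated shots, one sample, one run
across_runs = np.array([1010.0, 960.0, 1045.0, 985.0, 1055.0, 930.0])  # run means on different occasions

def rsd(x):
    return 100.0 * x.std(ddof=1) / x.mean()

print(f"repeatability RSD   = {rsd(same_run):.1f}%")
print(f"reproducibility RSD = {rsd(across_runs):.1f}%")
```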

  14. Assessment and improvement of statistical tools for comparative proteomics analysis of sparse data sets with few experimental replicates

    DEFF Research Database (Denmark)

    Schwämmle, Veit; León, Ileana R.; Jensen, Ole Nørregaard

    2013-01-01

    Large-scale quantitative analyses of biological systems are often performed with few replicate experiments, leading to multiple nonidentical data sets due to missing values. For example, mass spectrometry driven proteomics experiments are frequently performed with few biological or technical ... changes on the peptide level, for example, in phospho-proteomics experiments. In order to assess the extent of this problem and the implications for large-scale proteome analysis, we investigated and optimized the performance of three statistical approaches by using simulated and experimental data sets ... to work with data sets containing missing values. Extensive analysis of simulated and experimental data sets revealed that the performance of the statistical analysis tools depended on simple properties of the data sets. High-confidence results were obtained by using the limma and rank products methods ...

  15. New statistical methodology, mathematical models, and data bases relevant to the assessment of health impacts of energy technologies

    International Nuclear Information System (INIS)

    The present research develops new statistical methodology, mathematical models, and data bases of relevance to the assessment of health impacts of energy technologies, and uses these to identify, quantify, and predict adverse health effects of energy related pollutants. Efforts are in five related areas including: (1) evaluation and development of statistical procedures for the analysis of death rate data, disease incidence data, and large scale data sets; (2) development of dose response and demographic models useful in the prediction of the health effects of energy technologies; (3) application of our methods and models to analyses of the health risks of energy production; (4) a reanalysis of the Tri-State leukemia survey data, focusing on the relationship between myelogenous leukemia risk and diagnostic x-ray exposure; and (5) investigation of human birth weights as a possible early warning system for the effects of environmental pollution

  16. A Participatory Approach to Develop the Power Mobility Screening Tool and the Power Mobility Clinical Driving Assessment Tool

    Directory of Open Access Journals (Sweden)

    Deepan C. Kamaraj

    2014-01-01

    Full Text Available The electric powered wheelchair (EPW) is an indispensable assistive device that increases participation among individuals with disabilities. However, due to the lack of standardized assessment tools, developing evidence-based training protocols to improve EPW users' driving skills has been a challenge. In this study, we adopt the principles of participatory research and employ qualitative methods to develop the Power Mobility Screening Tool (PMST) and the Power Mobility Clinical Driving Assessment (PMCDA). Qualitative data from professional experts and expert EPW users who participated in a focus group and a discussion forum were used to establish content validity of the PMCDA and the PMST. These tools collectively could assess a user's current level of bodily function and current EPW driving capacity. Further multicenter studies are necessary to evaluate the psychometric properties of these tests and to develop EPW driving training protocols based on these assessment tools.

  17. Statistical mechanics of aggregation in anisotropic solvents: kinetic energy of aggregates and universal power-law behavior far from criticality

    International Nuclear Information System (INIS)

    We propose and study analytically a statistical mechanical model of reversible aggregation in anisotropic and isotropic solvents for small solute concentrations c. An aggregate comprising n solute molecules is a one-dimensional structureless flexible rod, n-mer, which interacts with the solvent anisotropy. The solvent is a nematic liquid crystal described by its scalar order parameter. The kinetic energy of n-mers is shown to play a unique role in the thermodynamic equilibrium. The kinetic energy contribution to the partition function is modeled by the term n^q, where q is determined by the persistence lengths of different translation–rotation modes (e.g. q = 5 for a rigid rod and q ≈ 0 for a very flexible chain). The n-mer concentration is found to depend on c via its powers which are fully determined by the parameter q. The solvent anisotropy results in a larger fraction of longer aggregates and gives rise to two different aggregation regimes: a low n regime for lower solute concentration c and a high n regime for higher c. The total aggregate concentration is found to be a sum of universal power laws of c with the exponents that are different for anisotropic and isotropic solvents, but in both cases are determined solely by the parameter q. The analytical formulae for the two regimes and the crossover point (which can be naturally associated with the critical micelle concentration) are in a quantitative agreement with the numerical solution of the model. The model is pertinent to self-assemblies of plank-like dye molecules dissolved in an isotropic solvent (related to chromonic liquid crystals) and in a nematic liquid crystal

  18. The price of electricity from private power producers: Stage 2, Expansion of sample and preliminary statistical analysis

    Energy Technology Data Exchange (ETDEWEB)

    Comnes, G.A.; Belden, T.N.; Kahn, E.P.

    1995-02-01

    The market for long-term bulk power is becoming increasingly competitive and mature. Given that many privately developed power projects have been or are being developed in the US, it is possible to begin to evaluate the performance of the market by analyzing its revealed prices. Using a consistent method, this paper presents levelized contract prices for a sample of privately developed US generation properties. The sample includes 26 projects with a total capacity of 6,354 MW. Contracts are described in terms of their choice of technology, choice of fuel, treatment of fuel price risk, geographic location, dispatchability, expected dispatch niche, and size. The contract price analysis shows that gas technologies clearly stand out as the most attractive. At an 80% capacity factor, coal projects have an average 20-year levelized price of $0.092/kWh, whereas natural gas combined cycle and/or cogeneration projects have an average price of $0.069/kWh. Within each technology type subsample, however, there is considerable variation. Prices for natural gas combustion turbines and one wind project are also presented. A preliminary statistical analysis is conducted to understand the relationship between price and four categories of explanatory factors including product heterogeneity, geographic heterogeneity, economic and technological change, and other buyer attributes (including avoided costs). Because of residual price variation, we are unable to accept the hypothesis that electricity is a homogeneous product. Instead, the analysis indicates that buyer value still plays an important role in the determination of price for competitively-acquired electricity.
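
The levelized contract prices compared in this record follow the usual definition: discounted payment stream divided by discounted energy over the contract term. The sketch below shows that calculation for a single hypothetical 20-year contract; capacity, capacity factor, starting price, escalation and discount rate are all invented.

```python
# 20-year levelized price of a hypothetical power purchase contract.
import numpy as np

years = np.arange(1, 21)
discount_rate = 0.08
capacity_mw = 100.0
capacity_factor = 0.80
energy_kwh = capacity_mw * 1000 * 8760 * capacity_factor  # annual generation, kWh
payments = 0.075 * energy_kwh * 1.02 ** (years - 1)       # $0.075/kWh starting price, escalating 2%/yr

disc = (1 + discount_rate) ** -years
levelized_price = (payments * disc).sum() / (energy_kwh * disc).sum()
print(f"20-year levelized price: ${levelized_price:.3f}/kWh")
```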

  19. A statistical toolbox for metagenomics: assessing functional diversity in microbial communities

    Directory of Open Access Journals (Sweden)

    Handelsman Jo

    2008-01-01

    Full Text Available Abstract Background The 99% of bacteria in the environment that are recalcitrant to culturing have spurred the development of metagenomics, a culture-independent approach to sample and characterize microbial genomes. Massive datasets of metagenomic sequences have been accumulated, but analysis of these sequences has focused primarily on the descriptive comparison of the relative abundance of proteins that belong to specific functional categories. More robust statistical methods are needed to make inferences from metagenomic data. In this study, we developed and applied a suite of tools to describe and compare the richness, membership, and structure of microbial communities using peptide fragment sequences extracted from metagenomic sequence data. Results Application of these tools to acid mine drainage, soil, and whale fall metagenomic sequence collections revealed groups of peptide fragments with a relatively high abundance and no known function. When combined with analysis of 16S rRNA gene fragments from the same communities these tools enabled us to demonstrate that although there was no overlap in the types of 16S rRNA gene sequence observed, there was a core collection of operational protein families that was shared among the three environments. Conclusion The results of comparisons between the three habitats were surprising considering the relatively low overlap of membership and the distinctively different characteristics of the three habitats. These tools will facilitate the use of metagenomics to pursue statistically sound genome-based ecological analyses.

  20. Soil Quality Assessment in Different Land Uses Using Multivariate Statistical Analysis

    Directory of Open Access Journals (Sweden)

    W. Zarei

    2015-03-01

    Full Text Available The aim of the study was to investigate the effects of land use on soil quality parameters using multivariate statistical analysis. Soil samples (0-25 and 25-50 cm depths) were taken from three land uses in the forest area of Marivan, including forest, rangeland, and cultivated land. Soil characteristics of pH, EC, sand, silt, clay and CaCO3 content, water-stable aggregates and their organic carbon content were measured. Principal component, cluster and discriminant analyses were used to evaluate soil quality. Principal component analysis classified the soil properties into five factors. The most important factors were soil aggregate organic carbon content and aggregate stability indices. The schematic distribution of factors and the cluster analysis showed the same pattern. Soil aggregate organic carbon content, water-stable aggregates and aggregate stability indices were the factors most sensitive to land use change. These soil properties and factors had the same pattern in forest and rangeland, but were significantly reduced under cultivation. Land use change from forest to cultivated land resulted in a significant decrease in aggregate organic carbon content and water-stable aggregates, and also an increase in pH. The results showed the usefulness of multivariate statistical methods for integrating soil properties and determining different soil quality indices.
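
The multivariate workflow summarized above (standardize the soil variables, extract principal components, inspect the factor structure) can be sketched as follows. The data matrix here is random stand-in data with assumed dimensions; a real analysis would use the measured 0-25 and 25-50 cm samples.

```python
# Standardize soil variables and extract principal components (illustrative data).
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 8))  # 30 samples x 8 soil properties (e.g. pH, EC, sand, silt, clay, CaCO3, WSA, OC)

Z = StandardScaler().fit_transform(X)
pca = PCA(n_components=5)
scores = pca.fit_transform(Z)
print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 2))
```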