WorldWideScience

Sample records for assessment statistical power

  1. Assessment and statistics of Brazilian hydroelectric power plants: Dam areas versus installed and firm power

    International Nuclear Information System (INIS)

    Caetano de Souza, Antonio Carlos

    2008-01-01

    The Brazilian relief, predominantly composed of small mountains and plateaus, contributed to the formation of rivers with a high number of falls. With the exception of North-eastern Brazil, the climate of the country is rainy, which helps keep water flows high. These elements are essential to a high hydroelectric potential and contributed to the choice of hydroelectric power plants as the main technology for electricity generation in Brazil. Although this is a renewable source whose resource is free, dams must be established, which generates high environmental and social impacts. The objective of this study is to evaluate the impact caused by these dams through the use of environmental indexes. These indexes are the ratio of installed power to the dam area of a hydro power plant, and the ratio of firm power to that dam area. In this study, the greatest mean values were found in the South, Southeast, and Northeast regions, respectively, and the smallest mean values were found in the North and Mid-West regions, respectively. The greatest mean indexes were also found in dams established in the 1950s. Over the last six decades, the smallest indexes were registered by dams established in the 1980s. These indexes could be used as important instruments for environmental impact assessments, and could help ensure that a dam is established so that it depletes an ecosystem as little as possible. (author)
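
    As a rough worked illustration of these two indexes (the figures here are hypothetical, not taken from the study), they can be written as

$$
I_{\mathrm{inst}} = \frac{P_{\mathrm{installed}}}{A_{\mathrm{dam}}},
\qquad
I_{\mathrm{firm}} = \frac{P_{\mathrm{firm}}}{A_{\mathrm{dam}}},
$$

    so a hypothetical plant with 1000 MW installed capacity, 700 MW of firm power and a 500 km² reservoir would score 2.0 MW/km² and 1.4 MW/km², respectively; higher values mean more power per unit of flooded area.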

  2. Statistical Diagnosis of the Best Weibull Methods for Wind Power Assessment for Agricultural Applications

    Directory of Open Access Journals (Sweden)

    Abul Kalam Azad

    2014-05-01

    Full Text Available The best Weibull distribution methods for the assessment of wind energy potential at different altitudes in desired locations are statistically diagnosed in this study. Seven different methods, namely the graphical method (GM), method of moments (MOM), standard deviation method (STDM), maximum likelihood method (MLM), power density method (PDM), modified maximum likelihood method (MMLM) and equivalent energy method (EEM), were used to estimate the Weibull parameters, and six statistical tools, namely relative percentage of error, root mean square error (RMSE), mean percentage of error, mean absolute percentage of error, chi-square error and analysis of variance, were used to precisely rank the methods. The statistical fits of the measured and calculated wind speed data are assessed to justify the performance of the methods. The capacity factor and total energy generated by a small model wind turbine are calculated by numerical integration using trapezoidal sums and Simpson’s rule. The results show that MOM and MLM are the most efficient methods for determining the values of k and c to fit Weibull distribution curves.
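
    The following is a minimal Python sketch of two of the seven methods named above, the method of moments (MOM) and the maximum likelihood method (MLM), together with a capacity-factor estimate by trapezoidal integration. The wind-speed sample, the turbine power curve and its cut-in/rated/cut-out speeds are invented for illustration and are not taken from the paper.

```python
# Hedged sketch: method-of-moments (MOM) and maximum-likelihood (MLM) fits of a
# Weibull distribution to wind-speed data, plus a capacity-factor estimate by
# trapezoidal integration.  Wind data, the power curve and the cut-in/rated/
# cut-out speeds are hypothetical, not values from the paper.
import numpy as np
from scipy.special import gamma
from scipy.optimize import brentq
from scipy.integrate import trapezoid

def weibull_mom(v):
    """Method of moments: match the sample mean and coefficient of variation."""
    mean, std = v.mean(), v.std(ddof=1)
    cv = std / mean
    f = lambda k: np.sqrt(gamma(1 + 2/k) - gamma(1 + 1/k)**2) / gamma(1 + 1/k) - cv
    k = brentq(f, 0.2, 20.0)          # the coefficient of variation depends on k only
    return k, mean / gamma(1 + 1/k)   # shape k, scale c

def weibull_mlm(v):
    """Maximum likelihood: solve the profile equation for k, then recover c."""
    lnv = np.log(v)
    f = lambda k: (v**k * lnv).sum() / (v**k).sum() - 1.0/k - lnv.mean()
    k = brentq(f, 0.2, 20.0)
    return k, ((v**k).mean())**(1.0/k)

def capacity_factor(k, c, v_in=3.0, v_rated=12.0, v_out=25.0):
    """Expected power / rated power, integrating P(v)*f(v) with the trapezoid rule."""
    v = np.linspace(0.0, 30.0, 3001)
    pdf = (k/c) * (v/c)**(k - 1) * np.exp(-(v/c)**k)
    power = np.where((v >= v_in) & (v < v_rated),
                     (v**3 - v_in**3) / (v_rated**3 - v_in**3), 0.0)
    power = np.where((v >= v_rated) & (v <= v_out), 1.0, power)
    return trapezoid(power * pdf, v)

rng = np.random.default_rng(0)
speeds = rng.weibull(2.0, size=5000) * 7.0        # synthetic sample: k=2, c=7 m/s
for name, fit in (("MOM", weibull_mom), ("MLM", weibull_mlm)):
    k, c = fit(speeds)
    print(f"{name}: k={k:.2f}, c={c:.2f} m/s, capacity factor={capacity_factor(k, c):.2f}")
```

    Both fits reduce to one-dimensional root finding: MOM matches the sample coefficient of variation to its Weibull expression, while MLM solves the profile likelihood equation for k.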

  3. Distance matters. Assessing socioeconomic impacts of the Dukovany nuclear power plant in the Czech Republic: Local perceptions and statistical evidence

    Directory of Open Access Journals (Sweden)

    Frantál Bohumil

    2016-03-01

    Full Text Available The effect of geographical distance on the extent of socioeconomic impacts of the Dukovany nuclear power plant in the Czech Republic is assessed by combining two different research approaches. First, we survey how people living in municipalities in the vicinity of the power plant perceive impacts on their personal quality of life. Second, we explore the effects of the power plant on regional development by analysing long-term statistical data about the unemployment rate, the share of workers in the energy sector and overall job opportunities in the respective municipalities. The results indicate that the power plant has had significant positive impacts on surrounding communities both as perceived by residents and as evidenced by the statistical data. The level of impacts is, however, significantly influenced by the spatial and social distances of communities and individuals from the power plant. The perception of positive impacts correlates with geographical proximity to the power plant, while the hypothetical distance where positive effects on the quality of life are no longer perceived was estimated at about 15 km. Positive effects are also more likely to be reported by highly educated, young and middle-aged and economically active persons, whose work is connected to the power plant.

  4. Power generation statistics

    International Nuclear Information System (INIS)

    Kangas, H.

    2001-01-01

    The frost in February increased the power demand in Finland significantly. The total power consumption in Finland during January-February 2001 was about 4% higher than a year before. In January 2001 the average temperature in Finland was only about -4 deg C, which is nearly 2 degrees higher than in 2000 and about 6 degrees higher than the long-term average. Power demand in January was slightly less than 7.9 TWh, about 0.5% less than in 2000. The power consumption in Finland during the past 12 months exceeded 79.3 TWh, which is less than 2% higher than during the previous 12 months. In February 2001 the average temperature was -10 deg C, about 5 degrees lower than in February 2000. Because of this, the power consumption in February 2001 increased by 5%. Power consumption in February was 7.5 TWh. The maximum hourly output of power plants in Finland was 13310 MW. Power consumption of Finnish households in February 2001 was about 10% higher than in February 2000, while in industry the increase was nearly zero. The utilization rate in the forest industry in February 2001 decreased by 5% from the value of February 2000, to only about 89%. The power consumption of the past 12 months (Feb. 2000 - Feb. 2001) was 79.6 TWh. Generation of hydroelectric power in Finland during January-February 2001 was 10% higher than a year before. The generation of hydroelectric power in Jan.-Feb. 2001 was nearly 2.7 TWh, corresponding to 17% of the power demand in Finland. The output of hydroelectric power in Finland during the past 12 months was 14.7 TWh; the increase from the previous 12 months was 17%, corresponding to over 18% of the power demand in Finland. Wind power generation in Jan.-Feb. 2001 slightly exceeded 10 GWh, while in 2000 the corresponding output was 20 GWh. The degree of utilization of Finnish nuclear power plants in Jan.-Feb. 2001 was high. The output of these plants was 3.8 TWh, about 1% less than in Jan.-Feb. 2000. The main cause for the

  5. Nuclear power statistics 1985

    International Nuclear Information System (INIS)

    Oelgaard, P.L.

    1986-06-01

    In this report an attempt is made to collect literature data on nuclear power production and to present it in graphical form. Data are given not only for 1985, but for a number of years, so that the trends in the development of nuclear power can be seen. The global capacity of nuclear power plants in operation, and of those in operation, under construction, or on order, is considered. Furthermore, the average capacity factor for nuclear plants of a specific type and for various geographical areas is given. The contribution of nuclear power to the total electricity production is considered for a number of countries and areas. Finally, the accumulated years of commercial operation for the various reactor types up to the end of 1985 are presented. (author)

  6. Powerful Statistical Inference for Nested Data Using Sufficient Summary Statistics

    Science.gov (United States)

    Dowding, Irene; Haufe, Stefan

    2018-01-01

    Hierarchically-organized data arise naturally in many psychology and neuroscience studies. As the standard assumption of independent and identically distributed samples does not hold for such data, two important problems are to accurately estimate group-level effect sizes, and to obtain powerful statistical tests against group-level null hypotheses. A common approach is to summarize subject-level data by a single quantity per subject, which is often the mean or the difference between class means, and treat these as samples in a group-level t-test. This “naive” approach is, however, suboptimal in terms of statistical power, as it ignores information about the intra-subject variance. To address this issue, we review several approaches to deal with nested data, with a focus on methods that are easy to implement. With what we call the sufficient-summary-statistic approach, we highlight a computationally efficient technique that can improve statistical power by taking into account within-subject variances, and we provide step-by-step instructions on how to apply this approach to a number of frequently-used measures of effect size. The properties of the reviewed approaches and the potential benefits over a group-level t-test are quantitatively assessed on simulated data and demonstrated on EEG data from a simulated-driving experiment. PMID:29615885
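
    A minimal sketch of the weighting idea, assuming a generic inverse-variance scheme on synthetic data (not necessarily the exact estimator proposed in the paper): each subject's mean is weighted by the inverse of the estimated variance of that mean before forming the group-level test, in contrast to the "naive" t-test on unweighted subject means.

```python
# Hedged sketch of the weighting idea: each subject's mean is weighted by the
# inverse of the estimated variance of that mean before the group-level test,
# in contrast to a "naive" t-test on unweighted subject means.  This is a
# generic inverse-variance scheme on synthetic data, not necessarily the exact
# estimator of the paper.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_subjects, true_effect = 20, 0.3

subject_means, subject_vars = [], []
for _ in range(n_subjects):
    n_trials = rng.integers(20, 200)                    # unequal within-subject data
    trials = rng.normal(true_effect, 2.0, n_trials)
    subject_means.append(trials.mean())
    subject_vars.append(trials.var(ddof=1) / n_trials)  # variance of the subject mean
m, v = np.array(subject_means), np.array(subject_vars)

t_naive, p_naive = stats.ttest_1samp(m, 0.0)            # naive group-level t-test

w = 1.0 / v                                             # inverse-variance weights
effect = np.sum(w * m) / np.sum(w)
z = effect / np.sqrt(1.0 / np.sum(w))
p_weighted = 2 * stats.norm.sf(abs(z))

print(f"naive t-test:    t={t_naive:.2f}, p={p_naive:.3f}")
print(f"weighted z-test: z={z:.2f}, p={p_weighted:.3f}")
```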

  7. DISTRIBUTED GRID-CONNECTED PHOTOVOLTAIC POWER SYSTEM EMISSION OFFSET ASSESSMENT: STATISTICAL TEST OF SIMULATED- AND MEASURED-BASED DATA

    Science.gov (United States)

    This study assessed the pollutant emission offset potential of distributed grid-connected photovoltaic (PV) power systems. Computer-simulated performance results were utilized for 211 PV systems located across the U.S. The PV systems' monthly electrical energy outputs were based ...

  8. Electric power statistics from independence to establishment

    International Nuclear Information System (INIS)

    1997-02-01

    This paper reports power statistics from independence to the establishment of KEPIC. It includes lists covering the electricity industry; electric equipment across the whole country; power equipment at independence and the development of power facilities; power generation and the merit of power plants; demand by type and use; power losses; charges for electric power distribution; power generation and generating cost; financial statements on income measurement and financing; meteorological phenomena and the amount of rainfall; electric power development; international statistics on power generation in major countries; and a comparison of power rates with general prices.

  9. Statistical Power in Meta-Analysis

    Science.gov (United States)

    Liu, Jin

    2015-01-01

    Statistical power is important in a meta-analysis study, although few studies have examined the performance of simulated power in meta-analysis. The purpose of this study is to inform researchers about statistical power estimation for the two-sample mean difference test under different situations: (1) the discrepancy between the analytical power and…

  10. Statistical modeling to support power system planning

    Science.gov (United States)

    Staid, Andrea

    This dissertation focuses on data-analytic approaches that improve our understanding of power system applications to promote better decision-making. It tackles issues of risk analysis, uncertainty management, resource estimation, and the impacts of climate change. Tools of data mining and statistical modeling are used to bring new insight to a variety of complex problems facing today's power system. The overarching goal of this research is to improve the understanding of the power system risk environment for improved operation, investment, and planning decisions. The first chapter introduces some challenges faced in planning for a sustainable power system. Chapter 2 analyzes the driving factors behind the disparity in wind energy investments among states with a goal of determining the impact that state-level policies have on incentivizing wind energy. Findings show that policy differences do not explain the disparities; physical and geographical factors are more important. Chapter 3 extends conventional wind forecasting to a risk-based focus of predicting maximum wind speeds, which are dangerous for offshore operations. Statistical models are presented that issue probabilistic predictions for the highest wind speed expected in a three-hour interval. These models achieve a high degree of accuracy and their use can improve safety and reliability in practice. Chapter 4 examines the challenges of wind power estimation for onshore wind farms. Several methods for wind power resource assessment are compared, and the weaknesses of the Jensen model are demonstrated. For two onshore farms, statistical models outperform other methods, even when very little information is known about the wind farm. Lastly, chapter 5 focuses on the power system more broadly in the context of the risks expected from tropical cyclones in a changing climate. Risks to U.S. power system infrastructure are simulated under different scenarios of tropical cyclone behavior that may result from climate

  11. Distance matters. Assessing socioeconomic impacts of the Dukovany nuclear power plant in the Czech Republic: Local perceptions and statistical evidence

    Czech Academy of Sciences Publication Activity Database

    Frantál, Bohumil; Malý, Jiří; Ouředníček, M.; Nemeškal, J.

    2016-01-01

    Vol. 24, No. 1 (2016), pp. 2-13 ISSN 1210-8812 R&D Projects: GA MŠk(CZ) EE2.3.20.0025 Institutional support: RVO:68145535 Keywords: nuclear power plant impacts * spatial analysis * risk perceptions Subject RIV: DE - Earth Magnetism, Geodesy, Geography Impact factor: 2.149, year: 2016 http://www.degruyter.com/view/j/mgr.2016.24.issue-1/mgr-2016-0001/mgr-2016-0001.xml?format=INT

  12. Swiss solar power statistics 2007 - Significant expansion

    International Nuclear Information System (INIS)

    Hostettler, T.

    2008-01-01

    This article presents and discusses the 2007 statistics for solar power in Switzerland. A significant number of new installations is noted, as are the high production figures from newer installations. The basics behind the compilation of the Swiss solar power statistics are briefly reviewed and an overview for the period 1989 to 2007 is presented, which includes figures on the number of photovoltaic plants in service and the installed peak power. Typical production figures in kilowatt-hours (kWh) per installed kilowatt-peak (kWp) are presented and discussed for installations of various sizes. Increased production after inverter replacement in older installations is noted. Finally, the general political situation in Switzerland as far as solar power is concerned is briefly discussed, as are international developments.

  13. The power of statistical tests using field trial count data of non-target organisms in environmental risk assessment of genetically modified plants

    NARCIS (Netherlands)

    Voet, van der H.; Goedhart, P.W.

    2015-01-01

    Publications on power analyses for field trial count data comparing transgenic and conventional crops have reported widely varying requirements for the replication needed to obtain statistical tests with adequate power. These studies are critically reviewed and complemented with a new simulation

  14. Statistical Power in Plant Pathology Research.

    Science.gov (United States)

    Gent, David H; Esker, Paul D; Kriss, Alissa B

    2018-01-01

    In null hypothesis testing, failure to reject a null hypothesis may have two potential interpretations. One interpretation is that the treatments being evaluated do not have a significant effect, and a correct conclusion was reached in the analysis. Alternatively, a treatment effect may have existed but the conclusion of the study was that there was none. This is termed a Type II error, which is most likely to occur when studies lack sufficient statistical power to detect a treatment effect. In basic terms, the power of a study is the ability to identify a true effect through a statistical test. The power of a statistical test is 1 - (the probability of Type II errors), and depends on the size of treatment effect (termed the effect size), variance, sample size, and significance criterion (the probability of a Type I error, α). Low statistical power is prevalent in scientific literature in general, including plant pathology. However, power is rarely reported, creating uncertainty in the interpretation of nonsignificant results and potentially underestimating small, yet biologically significant relationships. The appropriate level of power for a study depends on the impact of Type I versus Type II errors and no single level of power is acceptable for all purposes. Nonetheless, by convention 0.8 is often considered an acceptable threshold and studies with power less than 0.5 generally should not be conducted if the results are to be conclusive. The emphasis on power analysis should be in the planning stages of an experiment. Commonly employed strategies to increase power include increasing sample sizes, selecting a less stringent threshold probability for Type I errors, increasing the hypothesized or detectable effect size, including as few treatment groups as possible, reducing measurement variability, and including relevant covariates in analyses. Power analysis will lead to more efficient use of resources and more precisely structured hypotheses, and may even
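
    As a hedged numerical illustration of the quantities described above, the sketch below computes the power of a two-sided, two-sample t-test from the noncentral t distribution for Cohen's conventional small, medium and large effect sizes (d = 0.2, 0.5, 0.8); the sample size and α are arbitrary choices, not values from the article.

```python
# Hedged illustration: power of a two-sided, two-sample t-test from the
# noncentral t distribution, for Cohen's d = 0.2, 0.5, 0.8 (small/medium/large).
# Sample size and alpha are arbitrary illustrative choices.
import numpy as np
from scipy import stats

def ttest_power(d, n_per_group, alpha=0.05):
    df = 2 * n_per_group - 2
    nc = d * np.sqrt(n_per_group / 2.0)            # noncentrality parameter
    t_crit = stats.t.ppf(1 - alpha / 2, df)
    return (1 - stats.nct.cdf(t_crit, df, nc)) + stats.nct.cdf(-t_crit, df, nc)

for d in (0.2, 0.5, 0.8):
    print(f"d={d}: power with n=30 per group = {ttest_power(d, 30):.2f}")
```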

  15. Statistical Power in Longitudinal Network Studies

    NARCIS (Netherlands)

    Stadtfeld, Christoph; Snijders, Tom A. B.; Steglich, Christian; van Duijn, Marijtje

    2018-01-01

    Longitudinal social network studies may easily suffer from a lack of statistical power. This is the case in particular for studies that simultaneously investigate change of network ties and change of nodal attributes. Such selection and influence studies have become increasingly popular due to the

  16. Availability statistics for thermal power plants

    International Nuclear Information System (INIS)

    1989-01-01

    Denmark, Finland and Sweden have adopted almost the same methods of recording and calculation of availability data. For a number of years comparable availability and outage data for thermal power have been summarized and published in one report. The purpose of the report now presented for 1989 containing general statistical data is to produce basic information on existing kinds of thermal power in the countries concerned. With this information as a basis additional and more detailed information can be exchanged in direct contacts between bodies in the above mentioned countries according to forms established for that purpose. The report includes fossil steam power, nuclear power and gas turbines. The information is presented in separate diagrams for each country, but for plants burning fossil fuel also in a joint NORDEL statistics with data grouped according to type of fuel used. The grouping of units into classes of capacity has been made in accordance with the classification adopted by UNIPEDE/WEC. Values based on energy have been adopted as basic availability data. The same applies to the preference made in the definitions outlined by UNIPEDE and UNIPEDE/WEC. Some data based on time have been included to make possible comparisons with certain international values and for further illustration of the performance. For values given in the report, the definitions in the NORDEL document ''Concepts of Availability for Thermal Power, September 1977'', have been applied. (author)

  17. Statistical power and the Rorschach: 1975-1991.

    Science.gov (United States)

    Acklin, M W; McDowell, C J; Orndoff, S

    1992-10-01

    The Rorschach Inkblot Test has been the source of long-standing controversies as to its nature and its psychometric properties. Consistent with behavioral science research in general, the concept of statistical power has been entirely ignored by Rorschach researchers. The concept of power is introduced and discussed, and a power survey of the Rorschach literature published between 1975 and 1991 in the Journal of Personality Assessment, Journal of Consulting and Clinical Psychology, Journal of Abnormal Psychology, Journal of Clinical Psychology, Journal of Personality, Psychological Bulletin, American Journal of Psychiatry, and Journal of Personality and Social Psychology was undertaken. Power was calculated for 2,300 statistical tests in 158 journal articles. Power to detect small, medium, and large effect sizes was .13, .56, and .85, respectively. Similar to the findings in other power surveys conducted on behavioral science research, we concluded that Rorschach research is underpowered to detect the differences under investigation. This undoubtedly contributes to the inconsistency of research findings which has been a source of controversy and criticism over the decades. It appears that research conducted according to the Comprehensive System for the Rorschach is more powerful. Recommendations are offered for improving power and strengthening the design sensitivity of Rorschach research, including increasing sample sizes, use of parametric statistics, reduction of error variance, more accurate reporting of findings, and editorial policies reflecting concern about the magnitude of relationships beyond an exclusive focus on levels of statistical significance.

  18. Availability statistics for thermal power plants

    International Nuclear Information System (INIS)

    1990-01-01

    Denmark, Finland and Sweden have adopted almost the same methods of recording and calculation of availability data. For a number of years comparable availability and outage data for thermal power have been summarized and published in one report. The purpose of the report now presented for 1990 containing general statistical data is to produce basic information on existing kinds of thermal power in the countries concerned. With this information as a basis additional and more detailed information can be exchanged in direct contacts between bodies in the above mentioned countries according to forms established for that purpose. The report includes fossil steam power, nuclear power and gas turbines. The information is presented in separate diagrams for each country, but for plants burning fossil fuel also in a joint NORDEL statistics with data grouped according to type of fuel used. The grouping of units into classes of capacity has been made in accordance with the classification adopted by UNIPEDE/WEC. Values based on energy have been adopted as basic availability data. The same applies to the preference made in the definitions outlined by UNIPEDE and UNIPEDE/WEC. Some data based on time have been included to make possible comparisons with certain international values and for further illustration of the performance. (au)

  19. Statistical studies of powerful extragalactic radio sources

    Energy Technology Data Exchange (ETDEWEB)

    Macklin, J T

    1981-01-01

    This dissertation is mainly about the use of efficient statistical tests to study the properties of powerful extragalactic radio sources. Most of the analysis is based on subsets of a sample of 166 bright (3CR) sources selected at 178 MHz. The first chapter is introductory and it is followed by three on the misalignment and symmetry of double radio sources. The properties of nuclear components in extragalactic sources are discussed in the next chapter, using statistical tests which make efficient use of upper limits, often the only available information on the flux density from the nuclear component. Multifrequency observations of four 3CR sources are presented in the next chapter. The penultimate chapter is about the analysis of correlations involving more than two variables. The Spearman partial rank correlation coefficient is shown to be the most powerful test available which is based on non-parametric statistics. It is therefore used to study the dependences of the properties of sources on their size at constant redshift, and the results are interpreted in terms of source evolution. Correlations of source properties with luminosity and redshift are then examined.
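
    A minimal sketch of a Spearman partial rank correlation on synthetic data, assuming the common first-order construction in which pairwise Spearman coefficients are combined with the partial-correlation formula; this illustrates the statistic itself and is not a reproduction of the dissertation's analysis.

```python
# Hedged sketch of a Spearman partial rank correlation on synthetic data:
# pairwise Spearman coefficients combined with the first-order partial
# correlation formula, controlling for a confounder z (e.g. redshift).
import numpy as np
from scipy import stats

def spearman_partial(x, y, z):
    """Partial Spearman correlation of x and y, controlling for z."""
    r_xy, _ = stats.spearmanr(x, y)
    r_xz, _ = stats.spearmanr(x, z)
    r_yz, _ = stats.spearmanr(y, z)
    return (r_xy - r_xz * r_yz) / np.sqrt((1 - r_xz**2) * (1 - r_yz**2))

rng = np.random.default_rng(1)
z = rng.uniform(0.1, 2.0, 200)                       # confounding variable
size = 50 / (1 + z) + rng.normal(0, 5, 200)          # both depend on z ...
lum = 10 * (1 + z) + rng.normal(0, 3, 200)           # ... but not on each other

raw, _ = stats.spearmanr(size, lum)
print(f"raw Spearman correlation:      {raw:.2f}")
print(f"partial correlation (given z): {spearman_partial(size, lum, z):.2f}")
```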

  20. Wind Power Statistics Sweden 2009; Vindkraftstatistik 2009

    Energy Technology Data Exchange (ETDEWEB)

    2010-04-15

    In 2009, wind power produced 2.5 TWh, an increase of 26 percent over the previous year. Throughout the period 2003-2009, production of electricity from wind power almost quadrupled. Sweden's total net production of electricity amounted, according to provisional statistics for 2009, to 133.7 TWh. In 2007, wind energy's share passed 1.0 percent of total net production of electricity for the first time. In 2008 the proportion was 1.4 percent, and in 2009 almost 1.9 percent of total net production. Total installed power in 2009 was 1448 MW and the number of plants was 1359, an increase of 363 MW and 198 plants, respectively, from 2008. In 2009, there were three main support systems for wind power in Sweden: the certificate system, the wind pilot project, and the environmental bonus. The electricity certificate system is a market-based support system for electricity generation from renewables which includes wind power as one of the approved techniques. The system was introduced in 2003 and aims to increase the production of electricity from renewable energy sources by 25 TWh from 2002 levels by 2020. Wind pilot support is a support to the market for large-scale wind power. The support aims to reduce the cost of creating new wind energy and to promote new technologies. The wind pilot aid, which has existed since 2003, has been extended until 2012 and has been increased by 350 million SEK (about 36 M Euro) for the period 2008-2012. The environmental bonus, a tax subsidy, has been stepped down each year up to 2009, which was the last year. In 2009, the environmental bonus was 0.12 SEK/kWh for electricity from offshore wind. For onshore wind power the environmental bonus ceased in 2008.

  1. Assessment and statistics of surgically induced astigmatism.

    Science.gov (United States)

    Naeser, Kristian

    2008-05-01

    The aim of the thesis was to develop methods for assessment of surgically induced astigmatism (SIA) in individual eyes, and in groups of eyes. The thesis is based on 12 peer-reviewed publications, published over a period of 16 years. In these publications older and contemporary literature was reviewed(1). A new method (the polar system) for analysis of SIA was developed. Multivariate statistical analysis of refractive data was described(2-4). Clinical validation studies were performed. Descriptions of a cylinder surface with polar values and with differential geometry were compared. The main results were: refractive data in the form of sphere, cylinder and axis may define an individual patient or data set, but are unsuited for mathematical and statistical analyses(1). The polar value system converts net astigmatisms to orthonormal components in dioptric space. A polar value is the difference in meridional power between two orthogonal meridians(5,6). Any pair of polar values, separated by an arc of 45 degrees, characterizes a net astigmatism completely(7). The two polar values represent the net curvital and net torsional power over the chosen meridian(8). The spherical component is described by the spherical equivalent power. Several clinical studies demonstrated the efficiency of multivariate statistical analysis of refractive data(4,9-11). Polar values and formal differential geometry describe astigmatic surfaces with similar concepts and mathematical functions(8). Other contemporary methods, such as Long's power matrix, Holladay's and Alpins' methods, Zernike(12) and Fourier analyses(8), are correlated with the polar value system. In conclusion, analysis of SIA should be performed with polar values or other contemporary component systems. The study was supported by Statens Sundhedsvidenskabeligt Forskningsråd, Cykelhandler P. Th. Rasmussen og Hustrus Mindelegat, Hotelejer Carl Larsen og Hustru Nicoline Larsens Mindelegat, Landsforeningen til Vaern om Synet
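
    A hedged sketch of the conversion implied by this description (sign and notation conventions differ between publications, so the formulas below are an assumption rather than Naeser's exact formulation): for a cylinder of power $C$ with axis $\theta$, the meridional power along meridian $\varphi$ is $F(\varphi) = C\sin^2(\varphi-\theta)$, and the polar values over $\varphi$ and $\varphi+45^\circ$ can be written as

$$
\mathrm{KP}(\varphi) = F(\varphi+90^\circ) - F(\varphi) = C\cos\bigl(2(\varphi-\theta)\bigr),
\qquad
\mathrm{KP}(\varphi+45^\circ) = -\,C\sin\bigl(2(\varphi-\theta)\bigr),
$$

    so the pair determines $C = \sqrt{\mathrm{KP}(\varphi)^2 + \mathrm{KP}(\varphi+45^\circ)^2}$ and the axis $\theta$, consistent with the statement that two polar values separated by an arc of 45 degrees characterize a net astigmatism completely.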

  2. Forecasting winds over nuclear power plants statistics

    International Nuclear Information System (INIS)

    Marais, Ch.

    1997-01-01

    In the event of an accident at a nuclear power plant, it is essential to forecast the wind velocity at the level where the efflux occurs (about 100 m). At present, meteorologists refine the wind forecast from the coarse grid of numerical weather prediction (NWP) models. The purpose of this study is to improve the forecasts by developing a statistical adaptation method which corrects the NWP forecasts using statistical comparisons between wind forecasts and observations. The multiple linear regression method is used here to forecast the 100 m wind at 12- and 24-hour ranges for three Electricite de France (EDF) sites. It turns out that this approach gives better forecasts than the NWP model alone and is worthy of operational use. (author)
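
    The sketch below illustrates the general idea of such a statistical adaptation (a model-output-statistics style correction) with a multiple linear regression mapping raw NWP fields onto the observed 100 m wind speed. The predictors, coefficients and data are invented for illustration; this is not the operational EDF configuration.

```python
# Hedged sketch of a model-output-statistics style correction: a multiple
# linear regression maps raw NWP fields onto the observed 100 m wind speed.
# Predictors, coefficients and data are invented for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)
n = 1000
nwp_wind_100m = rng.gamma(4.0, 2.0, n)              # raw model forecast (m/s)
nwp_wind_10m = 0.7 * nwp_wind_100m + rng.normal(0, 1, n)
nwp_stability = rng.normal(0, 1, n)                 # e.g. a stability index

# "Observed" 100 m wind: a biased, noisy function of the model fields.
obs = 1.15 * nwp_wind_100m - 0.5 * nwp_stability + 0.8 + rng.normal(0, 1.5, n)

X = np.column_stack([nwp_wind_100m, nwp_wind_10m, nwp_stability])
train, test = slice(0, 800), slice(800, None)

mos = LinearRegression().fit(X[train], obs[train])
raw_rmse = np.sqrt(np.mean((nwp_wind_100m[test] - obs[test]) ** 2))
mos_rmse = np.sqrt(np.mean((mos.predict(X[test]) - obs[test]) ** 2))
print(f"raw NWP RMSE: {raw_rmse:.2f} m/s   MOS-corrected RMSE: {mos_rmse:.2f} m/s")
```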

  3. How Should We Assess the Fit of Rasch-Type Models? Approximating the Power of Goodness-of-Fit Statistics in Categorical Data Analysis

    Science.gov (United States)

    Maydeu-Olivares, Alberto; Montano, Rosa

    2013-01-01

    We investigate the performance of three statistics, R₁, R₂ (Glas in "Psychometrika" 53:525-546, 1988), and M₂ (Maydeu-Olivares & Joe in "J. Am. Stat. Assoc." 100:1009-1020, 2005, "Psychometrika" 71:713-732, 2006), to assess the overall fit of a one-parameter logistic model…

  4. Power performance assessment. Final report

    International Nuclear Information System (INIS)

    Frandsen, S.

    1998-12-01

    In the increasingly commercialised wind power marketplace, the lack of precise assessment methods for the output of an investment is becoming a barrier to wider penetration of wind power. Thus, addressing this problem, the overall objectives of the project are to reduce the financial risk in investment in wind power projects by significantly improving the power performance assessment methods. Ultimately, if this objective is successfully met, the project may also result in improved tuning of the individual wind turbines and in optimisation methods for wind farm operation. The immediate, measurable objectives of the project are: to prepare a review of existing contractual aspects of power performance verification procedures of wind farms; to provide information on production sensitivity to specific terrain characteristics and wind turbine parameters by analyses of a larger number of wind farm power performance data available to the proposers; to improve the understanding of the physical parameters connected to power performance in complex environments by comparing real-life wind farm power performance data with 3D computational flow models and 3D-turbulence wind turbine models; to develop the statistical framework including uncertainty analysis for power performance assessment in complex environments; and to propose one or more procedures for power performance evaluation of wind power plants in complex environments to be applied in contractual agreements between purchasers and manufacturers on production warranties. Although the focus in this project is on power performance assessment, the possible results will also be of benefit to energy yield forecasting, since the two tasks are strongly related. (au) JOULE III. 66 refs.; In co-operation with Renewable Energy System Ltd. (GB); Centre for Renewable Energy (GR); Aeronautic Research Centre (SE); National Engineering Lab. (GB); Public Power Corporation (GR)

  5. Statistical operation of nuclear power plants

    International Nuclear Information System (INIS)

    Gauzit, Maurice; Wilmart, Yves

    1976-01-01

    A comparison of the statistical operating results of nuclear power stations, as issued in the literature, shows that the values given for availability and the load factor often differ considerably from each other. This may be due to different definitions given to these terms, or even to a poor translation from one language into another. A critical analysis of these terms is proposed, together with the choice of a parameter that gives a quantitative idea of the actual quality of the operation obtained. The second section gives, on a homogeneous basis and from the results supplied by 83 nuclear power stations now in operation, a statistical analysis of their operating results: in particular, the two light water lines during 1975, as well as the evolution, in terms of age, of the units and the starting conditions of the units during their first two operating years. The values thus obtained are also compared with those taken a priori as hypotheses in some economic studies [fr]

  6. Power quality assessment

    International Nuclear Information System (INIS)

    Fathi, H.M.E.

    2012-01-01

    Electrical power systems are exposed to different types of power quality disturbance problems. Assessment of power quality is necessary for maintaining the accurate operation of sensitive equipment, especially in nuclear installations; it also ensures that unnecessary energy losses in a power system are kept to a minimum, which leads to more profit. With advances in technology, industrial and commercial facilities are growing in many regions. Power quality problems have therefore been a major concern among engineers, particularly in industrial environments with many large-scale types of equipment. Thus, it is useful to investigate and mitigate power quality problems. Assessment of power quality requires the identification of any anomalous behavior on a power system which adversely affects the normal operation of electrical or electronic equipment. The choice of monitoring equipment in a survey is also important to ascertain a solution to these power quality problems. A power quality assessment involves gathering data resources, analyzing the data (with reference to power quality standards) and then, if problems exist, recommending mitigation techniques. The main objective of the present work is to investigate and mitigate power quality problems in nuclear installations. Normally, electrical power is supplied to the installations via two sources to maintain good reliability; each source is designed to carry the full load. The assessment of power quality was performed at the nuclear installations for both sources under different operating conditions. The thesis begins with a discussion of power quality definitions and the results of previous studies in power quality monitoring. The assessment found that one source of electricity had relatively good power quality, although several disturbances exceeded the thresholds, among them the fifth harmonic, voltage swell, overvoltage and flicker. While the second

  7. Evaluating and Reporting Statistical Power in Counseling Research

    Science.gov (United States)

    Balkin, Richard S.; Sheperis, Carl J.

    2011-01-01

    Despite recommendations from the "Publication Manual of the American Psychological Association" (6th ed.) to include information on statistical power when publishing quantitative results, authors seldom include analysis or discussion of statistical power. The rationale for discussing statistical power is addressed, approaches to using "G*Power" to…

  8. DESIGNING ENVIRONMENTAL MONITORING DATABASES FOR STATISTIC ASSESSMENT

    Science.gov (United States)

    Databases designed for statistical analyses have characteristics that distinguish them from databases intended for general use. EMAP uses a probabilistic sampling design to collect data to produce statistical assessments of environmental conditions. In addition to supporting the ...

  9. Power and environmental assessment

    DEFF Research Database (Denmark)

    Cashmore, Matthew Asa; Richardson, Tim

    2013-01-01

    The significance of politics and power dynamics has long been recognised in environmental assessment (EA) research, but there has not been sustained attention to power, either theoretically or empirically. The aim of this special issue is to encourage the EA community to engage more consistently...

  10. STATISTICS IN SERVICE QUALITY ASSESSMENT

    Directory of Open Access Journals (Sweden)

    Dragana Gardašević

    2012-09-01

    Full Text Available For any quality evaluation in sports, science, education, and so on, it is useful to collect data in order to construct a strategy to improve the quality of services offered to the user. For this purpose, we use statistical software packages to process the data collected, with the aim of increasing customer satisfaction. The principle is demonstrated with the example of student satisfaction ratings at Belgrade Polytechnic, where students, as users, rate the quality of the institution. Here the emphasis is on statistical analysis as a tool for quality control aimed at improvement, and not on the interpretation of results. Therefore, the above can be used as a model in sport to improve the overall results.

  11. Potential for accidents in a nuclear power plant: probabilistic risk assessment, applied statistical decision theory, and implications of such considerations to mathematics education

    International Nuclear Information System (INIS)

    Dios, R.A.

    1984-01-01

    This dissertation focuses upon the field of probabilistic risk assessment and its development. It investigates the development of probabilistic risk assessment in nuclear engineering. To provide background for its development, the related areas of population dynamics (demography), epidemiology and actuarial science are studied by presenting information upon how risk has been viewed in these areas over the years. A second major problem involves presenting an overview of the mathematical models related to risk analysis to mathematics educators and making recommendations for presenting this theory in classes of probability and statistics for mathematics and engineering majors at the undergraduate and graduate levels

  12. Statistical assessment of numerous Monte Carlo tallies

    International Nuclear Information System (INIS)

    Kiedrowski, Brian C.; Solomon, Clell J.

    2011-01-01

    Four tests are developed to assess the statistical reliability of collections of tallies that number in thousands or greater. To this end, the relative-variance density function is developed and its moments are studied using simplified, non-transport models. The statistical tests are performed upon the results of MCNP calculations of three different transport test problems and appear to show that the tests are appropriate indicators of global statistical quality. (author)

  13. PRIS-STATISTICS: Power Reactor Information System Statistical Reports. User's Manual

    International Nuclear Information System (INIS)

    2013-01-01

    The IAEA developed the Power Reactor Information System (PRIS)-Statistics application to assist PRIS end users with generating statistical reports from PRIS data. Statistical reports provide an overview of the status, specification and performance results of every nuclear power reactor in the world. This user's manual was prepared to facilitate the use of the PRIS-Statistics application and to provide guidelines and detailed information for each report in the application. Statistical reports support analyses of nuclear power development and strategies, and the evaluation of nuclear power plant performance. The PRIS database can be used for comprehensive trend analyses and benchmarking against best performers and industrial standards.

  14. Statistical methods in personality assessment research.

    Science.gov (United States)

    Schinka, J A; LaLone, L; Broeckel, J A

    1997-06-01

    Emerging models of personality structure and advances in the measurement of personality and psychopathology suggest that research in personality and personality assessment has entered a stage of advanced development. In this article we examine whether researchers in these areas have taken advantage of new and evolving statistical procedures. We conducted a review of articles published in the Journal of Personality Assessment during the past 5 years. Of the 449 articles that included some form of data analysis, 12.7% used only descriptive statistics, most employed only univariate statistics, and fewer than 10% used multivariate methods of data analysis. We discuss the cost of using limited statistical methods, the possible reasons for the apparent reluctance to employ advanced statistical procedures, and potential solutions to this technical shortcoming.

  15. Nuclear power plants: 2004 atw compact statistics

    International Nuclear Information System (INIS)

    Anon.

    2005-01-01

    In late 2004, nuclear power plants were available for power supply or were under construction in 32 countries worldwide. A total of 441 nuclear power plants, i.e. two plants more than in late 2003, were in operation with an aggregate gross power of approx. 386 GWe and an aggregate net power, respectively, of 362 GWe, in 31 countries. The available capacity of nuclear power plants increased by approx. 5 GWe as a result of the additions by the six units newly commissioned: Hamaoka 5 (Japan), Ulchin 6 (Korea), Kalinin 3 (Russia), Khmelnitski 2 (Ukraine), Qinshan II-2 (People's Republic of China), and Rowno 4 (Ukraine). In addition, unit 3 of the Bruce A nuclear power plant in Canada with a power of 825 MWe was restarted after an outage of many years. Contrary to earlier plans, a recommissioning program was initiated for the Bruce A-1 and A-2 units, which are also down at present. Five plants were decommissioned for good in 2004: Chapelcross 1 to 4 with 50 MWe each in the United Kingdom, and Ignalina 1 with 1300 MWe in Lithuania. 22 nuclear generating units with an aggregate gross power of 19 GWe in nine countries were under construction in late 2004. In India, construction work was started on a new project, the 500 MWe PFBR prototype fast breeder reactor. In France, the EDF utility announced its intention to build an EPR on the Flamanville site beginning in 2007. (orig.)

  16. Nuclear power plants: 2013 atw compact statistics

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    2014-03-15

    At the end of 2013, nuclear power plants were available for energy supply in 31 countries of the world. A total of 437 nuclear power plants were in operation with an aggregate gross power of approx. 393 GWe and an aggregate net power, respectively, of 372 GWe. This means that the number was unchanged compared to the previous year's number on 31 December 2012. The available gross power of nuclear power plants increased by approx. 2 GWe from 2012 to the end of 2013. In total 4 nuclear generating units were commissioned in 2013 in China (+2) and in the Republic of Korea (+1). 6 nuclear generating units were decommissioned in 2013. Four units in the U.S.A. (-4) were shut down due to economic reasons. In Canada (-2) the operation status of 2 units was changed from long-term shutdown to permanent shutdown. 70 nuclear generating units, with an aggregate gross power of approx. 73 GWe, were under construction in 15 countries at the end of 2013. New or continued projects are notified from (in brackets: number of new projects) China (+3), Belarus (+1), Rep. of Korea (+1) and the United Arab Emirates (+1). Some 115 new nuclear power plants are in the concrete project design, planning and licensing phases worldwide; on some of them, contracts have already been awarded. Other units are in their preliminary project phases. (orig.)

  17. Nuclear power plants: 2009 atw compact statistics

    International Nuclear Information System (INIS)

    Anon.

    2010-01-01

    At the turn of 2009/2010, nuclear power plants were available for energy supply in 30 countries of the world. A total of 437 nuclear power plants, which is one plant less than at the 2008/2009 turn, were in operation with an aggregate gross power of approx. 391 GWe and an aggregate net power, respectively, of 371 GWe. The available gross power of nuclear power plants did not change noticeably from 2008 to the end of 2009. In total 2 nuclear generating units were commissioned in 2009. One NPP started operation in India and one in Japan. Three nuclear generating units in Japan (2) and Lithuania (1) were decommissioned in 2009. 52 nuclear generating units, i.e. 10 plants more than at the end of 2008, with an aggregate gross power of approx. 51 GWe, were under construction in 14 countries at the end of 2009. New or continued projects are notified from (number of new projects): China (+9), Russia (1), and South Korea (1). Some 84 new nuclear power plants are in the concrete project design, planning and licensing phases worldwide; on some of them, contracts have already been awarded. Other units are in their preliminary project phases. (orig.)

  18. Nuclear power plants: 2013 atw compact statistics

    International Nuclear Information System (INIS)

    Anon.

    2014-01-01

    At the end of 2013, nuclear power plants were available for energy supply in 31 countries of the world. A total of 437 nuclear power plants were in operation with an aggregate gross power of approx. 393 GWe and an aggregate net power, respectively, of 372 GWe. This means that the number was unchanged compared to the previous year's number on 31 December 2012. The available gross power of nuclear power plants increased by approx. 2 GWe from 2012 to the end of 2013. In total 4 nuclear generating units were commissioned in 2013 in China (+2) and in the Republic of Korea (+1). 6 nuclear generating units were decommissioned in 2013. Four units in the U.S.A. (-4) were shut down due to economic reasons. In Canada (-2) the operation status of 2 units was changed from long-term shutdown to permanent shutdown. 70 nuclear generating units, with an aggregate gross power of approx. 73 GWe, were under construction in 15 countries at the end of 2013. New or continued projects are notified from (in brackets: number of new projects) China (+3), Belarus (+1), Rep. of Korea (+1) and the United Arab Emirates (+1). Some 115 new nuclear power plants are in the concrete project design, planning and licensing phases worldwide; on some of them, contracts have already been awarded. Other units are in their preliminary project phases. (orig.)

  19. Nuclear power plants: 2005 atw compact statistics

    International Nuclear Information System (INIS)

    Anon.

    2006-01-01

    Nuclear power plants were available for power supply and under construction, respectively, in 32 countries of the world as per the end of 2005. A total of 444 nuclear power plants, i.e. three plants more than at the end of 2004, with an aggregate gross power of approx. 389 GWe and an aggregate net power of 370 GWe, respectively, were in operation in 31 countries. The available capacity of nuclear power plants increased by some 4.5 GWe as a result of the capacities added by the four newly commissioned units of Higashidori 1 (Japan), Shika 2 (Japan), Tarapur 4 (India), and Tianwan 1 (China). In addition, unit A-1 of the Pickering nuclear power station in Canada, with 825 MWe, was restarted after a downtime of several years. Two plants were decommissioned for good in 2005: Obrigheim in Germany, and Barsebaeck 2 in Sweden. 23 nuclear generating units, i.e. one unit more than in late 2004, with an aggregate gross power of approx. 19 GWe, were still under construction in nine countries by late 2005. In Pakistan, construction of a new project, Chasnupp 2, was started; in China, construction began on two units, Lingao Phase 2, units 3 and 4; and in Japan, the Shimane 3 generating unit is being built. (orig.)

  20. Nuclear power plants: 2008 atw compact statistics

    International Nuclear Information System (INIS)

    Anon.

    2009-01-01

    At the turn of 2008/2009, nuclear power plants were available for energy supply in 31 countries of the world. A total of 438 nuclear power plants, which is one plant less than at the 2007/2008 turn, were in operation with an aggregate gross power of approx. 393 GWe and an aggregate net power, respectively, of 372 GWe. The available gross power of nuclear power plants did not change noticeably from 2007 to the end of 2008. No nuclear generating unit was commissioned in 2008. One nuclear generating unit in the Slovak Republic was decommissioned in 2008. 42 nuclear generating units, i.e. 10 plants more than at the end of 2007, with an aggregate gross power of approx. 38 GWe, were under construction in 14 countries at the end of 2008. New or continued projects are notified from (in brackets: number of new projects): Bulgaria (2), China (5), South Korea (2), Russia (1), and the Slovak Republic (2). Some 80 new nuclear power plants are in the concrete project design, planning and licensing phases worldwide; on some of them, contracts have already been awarded. Another approximately 120 units are in their preliminary project phases. (orig.)

  1. Statistical aspects of fish stock assessment

    DEFF Research Database (Denmark)

    Berg, Casper Willestofte

    for stock assessment by application of state-of-the-art statistical methodology. The main contributions are presented in the form of six research papers. The major part of the thesis deals with age-structured assessment models, which is the most common approach. Conversion from length to age distributions...... statistical aspects of fish stocks assessment, which includes topics such as time series analysis, generalized additive models (GAMs), and non-linear state-space/mixed models capable of handling missing data and a high number of latent states and parameters. The aim is to improve the existing methods...

  2. IAEA releases nuclear power statistics for 2000

    International Nuclear Information System (INIS)

    2001-01-01

    According to data reported to the IAEA Power Reactor Information System, a total of 438 NPPs were operating around the world at the end of 2000. The total installed power from NPPs was 351 GWe. During 2000, six plants were connected to the grid, construction of three new nuclear reactors started, bringing the total number of reactors under construction to 31. Worldwide in 2000, total nuclear generated electricity increased to 2447.53 terawatt-hours. Cumulative worldwide operating experience from civil nuclear power reactors at the end of 2000 exceeded 9800 reactor years

  3. Statistical power analysis for the behavioral sciences

    National Research Council Canada - National Science Library

    Cohen, Jacob

    1988-01-01

    .... A chapter has been added for power analysis in set correlation and multivariate methods (Chapter 10). Set correlation is a realization of the multivariate general linear model, and incorporates the standard multivariate methods...

  4. Statistical power analysis for the behavioral sciences

    National Research Council Canada - National Science Library

    Cohen, Jacob

    1988-01-01

    ... offers a unifying framework and some new data-analytic possibilities. 2. A new chapter (Chapter 11) considers some general topics in power analysis in more integrated form than is possible in the earlier...

  5. Statistical tests for power-law cross-correlated processes

    Science.gov (United States)

    Podobnik, Boris; Jiang, Zhi-Qiang; Zhou, Wei-Xing; Stanley, H. Eugene

    2011-12-01

    For stationary time series, the cross-covariance and the cross-correlation as functions of time lag n serve to quantify the similarity of two time series. The latter measure is also used to assess whether the cross-correlations are statistically significant. For nonstationary time series, the analogous measures are detrended cross-correlations analysis (DCCA) and the recently proposed detrended cross-correlation coefficient, ρDCCA(T,n), where T is the total length of the time series and n the window size. For ρDCCA(T,n), we numerically calculated the Cauchy inequality -1≤ρDCCA(T,n)≤1. Here we derive -1≤ρDCCA(T,n)≤1 for a standard variance-covariance approach and for a detrending approach. For overlapping windows, we find the range of ρDCCA within which the cross-correlations become statistically significant. For overlapping windows we numerically determine—and for nonoverlapping windows we derive—that the standard deviation of ρDCCA(T,n) tends with increasing T to 1/T. Using ρDCCA(T,n) we show that the Chinese financial market's tendency to follow the U.S. market is extremely weak. We also propose an additional statistical test that can be used to quantify the existence of cross-correlations between two power-law correlated time series.
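
    A minimal Python sketch of the detrended cross-correlation coefficient described above, assuming linear detrending in overlapping windows of length n; details such as the treatment of window overlap may differ from the paper's exact construction.

```python
# Hedged sketch of the detrended cross-correlation coefficient rho_DCCA(T, n):
# integrated profiles, linear detrending in overlapping windows of length n,
# detrended covariance normalised by the two detrended variances.  Overlap
# handling and other details may differ from the paper's exact construction.
import numpy as np

def rho_dcca(x, y, n):
    X, Y = np.cumsum(x - x.mean()), np.cumsum(y - y.mean())   # integrated profiles
    t = np.arange(n)
    f_xy = f_xx = f_yy = 0.0
    windows = len(X) - n + 1
    for start in range(windows):                              # overlapping windows
        xs, ys = X[start:start + n], Y[start:start + n]
        xd = xs - np.polyval(np.polyfit(t, xs, 1), t)          # remove linear trend
        yd = ys - np.polyval(np.polyfit(t, ys, 1), t)
        f_xy += np.mean(xd * yd)
        f_xx += np.mean(xd * xd)
        f_yy += np.mean(yd * yd)
    return (f_xy / windows) / np.sqrt((f_xx / windows) * (f_yy / windows))

rng = np.random.default_rng(3)
common = rng.standard_normal(2000)
x = common + rng.standard_normal(2000)                        # share a common component
y = common + rng.standard_normal(2000)
print(f"rho_DCCA(n=50), correlated noise:  {rho_dcca(x, y, 50):.2f}")
print(f"rho_DCCA(n=50), independent noise: "
      f"{rho_dcca(rng.standard_normal(2000), rng.standard_normal(2000), 50):.2f}")
```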

  6. Statistical methods for assessment of blend homogeneity

    DEFF Research Database (Denmark)

    Madsen, Camilla

    2002-01-01

    In this thesis the use of various statistical methods to address some of the problems related to assessment of the homogeneity of powder blends in tablet production is discussed. It is not straightforward to assess the homogeneity of a powder blend. The reason is partly that in bulk materials......, it is shown how to set up parametric acceptance criteria for the batch that give a high confidence that future samples will, with a probability larger than a specified value, pass the USP three-class criteria. Properties and robustness of proposed changes to the USP test for content uniformity are investigated...

  7. The power and statistical behaviour of allele-sharing statistics when ...

    Indian Academy of Sciences (India)

    , using seven statistics, of which five are implemented in the computer program SimWalk2, and two are implemented in GENEHUNTER. Unlike most previous reports which involve evaluations of the power of allele-sharing statistics for a single ...

  8. Statistical power to detect violation of the proportional hazards assumption when using the Cox regression model.

    Science.gov (United States)

    Austin, Peter C

    2018-01-01

    The use of the Cox proportional hazards regression model is widespread. A key assumption of the model is that of proportional hazards. Analysts frequently test the validity of this assumption using statistical significance testing. However, the statistical power of such assessments is frequently unknown. We used Monte Carlo simulations to estimate the statistical power of two different methods for detecting violations of this assumption. When the covariate was binary, we found that a model-based method had greater power than a method based on cumulative sums of martingale residuals. Furthermore, the parametric nature of the distribution of event times had an impact on power when the covariate was binary. Statistical power to detect a strong violation of the proportional hazards assumption was low to moderate even when the number of observed events was high. In many data sets, power to detect a violation of this assumption is likely to be low to modest.
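
    A hedged sketch of such a power simulation: event times are drawn with group-specific Weibull shapes so the hazard ratio varies over time (a proportional-hazards violation), a Cox model is fitted, and the rejection rate of a proportional-hazards test estimates power. The data-generating values are arbitrary, and the lifelines API (CoxPHFitter, proportional_hazard_test) is an assumed tool choice, not part of the cited paper.

```python
# Hedged sketch of a power simulation for a proportional-hazards (PH) test.
# Event times get group-specific Weibull shapes, so the hazard ratio changes
# over time (a PH violation); a Cox model is fitted and the rejection rate of
# a PH test estimates power.  The lifelines API used here (CoxPHFitter,
# proportional_hazard_test) is an assumed tool choice, not the paper's code.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import proportional_hazard_test

rng = np.random.default_rng(11)
n, n_sim, alpha, rejections = 200, 200, 0.05, 0

for _ in range(n_sim):
    group = rng.integers(0, 2, n)                      # binary covariate
    shape = np.where(group == 1, 1.8, 1.0)             # different shapes -> non-PH
    time = rng.weibull(shape) * 10.0
    df = pd.DataFrame({"T": np.minimum(time, 8.0),     # administrative censoring at t=8
                       "E": (time <= 8.0).astype(int),
                       "group": group})
    cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
    res = proportional_hazard_test(cph, df, time_transform="rank")
    if float(np.asarray(res.p_value).min()) < alpha:   # single covariate -> one p-value
        rejections += 1

print(f"estimated power of the PH test: {rejections / n_sim:.2f}")
```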

  9. When Mathematics and Statistics Collide in Assessment Tasks

    Science.gov (United States)

    Bargagliotti, Anna; Groth, Randall

    2016-01-01

    Because the disciplines of mathematics and statistics are naturally intertwined, designing assessment questions that disentangle mathematical and statistical reasoning can be challenging. We explore the writing of statistics assessment tasks that take into consideration the mathematical reasoning they may inadvertently activate.

  10. Statistical power of intervention analyses: simulation and empirical application to treated lumber prices

    Science.gov (United States)

    Jeffrey P. Prestemon

    2009-01-01

    Timber product markets are subject to large shocks deriving from natural disturbances and policy shifts. Statistical modeling of shocks is often done to assess their economic importance. In this article, I simulate the statistical power of univariate and bivariate methods of shock detection using time series intervention models. Simulations show that bivariate methods...
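
    As a simplified stand-in for the intervention models discussed above (an assumption for illustration, not the article's specification), the sketch below estimates the power to detect a step shift in an AR(1) series by Monte Carlo simulation: the series is regressed on its own lag and a step dummy, and the rejection rate of the step coefficient is recorded.

```python
# Hedged, simplified stand-in for an intervention-analysis power simulation:
# an AR(1) series receives a level shift (delta) at a known date, the series is
# regressed on its own lag and a step dummy, and the rejection rate of the step
# coefficient is taken as the estimated power.  All numbers are illustrative.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)

def simulated_power(delta, n=200, break_at=150, n_sim=500, alpha=0.05, phi=0.6):
    step = (np.arange(n) >= break_at).astype(float)
    rejections = 0
    for _ in range(n_sim):
        y = np.zeros(n)
        for t in range(1, n):                          # AR(1) with a level shift
            y[t] = phi * y[t - 1] + delta * step[t] + rng.normal(0, 1)
        X = sm.add_constant(np.column_stack([y[:-1], step[1:]]))
        res = sm.OLS(y[1:], X).fit()
        if res.pvalues[2] < alpha:                     # p-value of the step dummy
            rejections += 1
    return rejections / n_sim

for delta in (0.3, 0.6, 1.0):
    print(f"delta={delta}: estimated power = {simulated_power(delta):.2f}")
```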

  11. Replication unreliability in psychology: elusive phenomena or elusive statistical power?

    Directory of Open Access Journals (Sweden)

    Patrizio E Tressoldi

    2012-07-01

    Full Text Available The focus of this paper is to analyse whether the unreliability of results related to certain controversial psychological phenomena may be a consequence of their low statistical power. Applying Null Hypothesis Statistical Testing (NHST), still the most widely used statistical approach, unreliability derives from the failure to refute the null hypothesis, in particular when exact or quasi-exact replications of experiments are carried out. Taking as examples the results of meta-analyses related to four different controversial phenomena (subliminal semantic priming, the incubation effect for problem solving, unconscious thought theory, and non-local perception), it was found that, except for semantic priming on categorization, the statistical power to detect the expected effect size of the typical study is low or very low. The low power in most studies undermines the use of NHST to study phenomena with moderate or low effect sizes. We conclude by providing some suggestions on how to increase statistical power or use different statistical approaches to help discriminate whether the results obtained may or may not be used to support or to refute the reality of a phenomenon with a small effect size.

  12. Statistical Analysis of the Impact of Wind Power on Market Quantities and Power Flows

    DEFF Research Database (Denmark)

    Pinson, Pierre; Jónsson, Tryggvi; Zugno, Marco

    2012-01-01

    In view of the increasing penetration of wind power in a number of power systems and markets worldwide, we discuss some of the impacts that wind energy may have on market quantities and cross-border power flows. These impacts are uncovered through statistical analyses of actual market and flow data...... of load and wind power forecasts on Danish and German electricity markets....

  13. Statistical analysis in MSW collection performance assessment.

    Science.gov (United States)

    Teixeira, Carlos Afonso; Avelino, Catarina; Ferreira, Fátima; Bentes, Isabel

    2014-09-01

    The increase in Municipal Solid Waste (MSW) generated over recent years forces waste managers to pursue more effective collection schemes that are technically viable, environmentally effective and economically sustainable. The assessment of MSW services using performance indicators plays a crucial role in improving service quality. In this work, we focus on the relevance of regular system monitoring as a service assessment tool. In particular, we select and test a core set of MSW collection performance indicators (effective collection distance, effective collection time and effective fuel consumption) that highlights collection system strengths and weaknesses and supports pro-active management decision-making and strategic planning. A statistical analysis was conducted with data collected in the mixed collection system of the Oporto Municipality, Portugal, during one year, one week per month. This analysis provides an operational assessment of the collection circuits and supports effective short-term municipal collection strategies at the level of, e.g., collection frequency, timetables and type of containers.

  14. The power and robustness of maximum LOD score statistics.

    Science.gov (United States)

    Yoo, Y J; Mendell, N R

    2008-07-01

    The maximum LOD score statistic is extremely powerful for gene mapping when calculated using the correct genetic parameter value. When the mode of genetic transmission is unknown, the maximum of the LOD scores obtained using several genetic parameter values is reported. This latter statistic requires higher critical value than the maximum LOD score statistic calculated from a single genetic parameter value. In this paper, we compare the power of maximum LOD scores based on three fixed sets of genetic parameter values with the power of the LOD score obtained after maximizing over the entire range of genetic parameter values. We simulate family data under nine generating models. For generating models with non-zero phenocopy rates, LOD scores maximized over the entire range of genetic parameters yielded greater power than maximum LOD scores for fixed sets of parameter values with zero phenocopy rates. No maximum LOD score was consistently more powerful than the others for generating models with a zero phenocopy rate. The power loss of the LOD score maximized over the entire range of genetic parameters, relative to the maximum LOD score calculated using the correct genetic parameter value, appeared to be robust to the generating models.

  15. Assessing the need for power

    International Nuclear Information System (INIS)

    Chern, W.S.; Just, R.E.

    1982-01-01

    The growing controversy over nuclear power has demanded a critical evaluation of the need for power to justify proposed nuclear power plants. This paper discusses the use of an econometric model developed for the US Nuclear Regulatory Commission to conduct an independent assessment of electricity demand forecasts related to the licensing of nuclear power plants. The model forecasts electricity demand and price by sector and by state. The estimation and forecasting results for the New England region are presented as a case in point where an econometric model has been used to analyse alternative fuel price scenarios and to aid substantive public decision making regarding new nuclear power plant decisions. (author)

  16. Data management and statistical analysis for environmental assessment

    International Nuclear Information System (INIS)

    Wendelberger, J.R.; McVittie, T.I.

    1995-01-01

    Data management and statistical analysis for environmental assessment are important issues at the interface of computer science and statistics. Data collection for environmental decision making can generate large quantities of various types of data. A database/GIS system is described which provides efficient data storage as well as visualization tools that may be integrated into the data analysis process. FIMAD is a living database and GIS system. The system has changed and developed over time to meet the needs of the Los Alamos National Laboratory Restoration Program. The system provides a repository for data which may be accessed by different individuals for different purposes. The database structure is driven by the large amount and varied types of data required for environmental assessment. The integration of the database with the GIS system provides the foundation for powerful visualization and analysis capabilities.

  17. Wind power error estimation in resource assessments.

    Directory of Open Access Journals (Sweden)

    Osvaldo Rodríguez

    Full Text Available Estimating the power output is one of the elements that determine the techno-economic feasibility of a renewable project. At present, there is a need to develop reliable methods that achieve this goal, thereby contributing to wind power penetration. In this study, we propose a method for wind power error estimation based on the wind speed measurement error, probability density function, and wind turbine power curves. This method uses the actual wind speed data without prior statistical treatment based on 28 wind turbine power curves, which were fitted by Lagrange's method, to calculate the estimated wind power output and the corresponding error propagation. We found that wind speed percentage errors of 10% were propagated into the power output estimates, thereby yielding an error of 5%. The proposed error propagation complements the traditional power resource assessments. The wind power estimation error also allows us to estimate intervals for the power production leveled cost or the investment time return. The implementation of this method increases the reliability of techno-economic resource assessment studies.
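    The following sketch illustrates the general idea of propagating a wind-speed measurement error through a turbine power curve. It uses a single hypothetical power curve and simple linear interpolation rather than the 28 Lagrange-fitted curves of the paper, so the numbers are purely illustrative.

    ```python
    import numpy as np

    # Hypothetical turbine power curve (assumed values, cut-in to rated).
    v_curve = np.array([3, 5, 7, 9, 11, 13, 15])              # wind speed, m/s
    p_curve = np.array([0, 100, 400, 900, 1600, 2000, 2000])  # power, kW

    def power(v):
        return np.interp(v, v_curve, p_curve, left=0.0, right=2000.0)

    v_measured = np.array([6.2, 8.4, 10.1, 7.7, 12.3])  # example measured wind speeds, m/s
    rel_err_v = 0.10                                     # assumed 10% wind-speed error

    p_nominal = power(v_measured).mean()
    p_low = power(v_measured * (1 - rel_err_v)).mean()
    p_high = power(v_measured * (1 + rel_err_v)).mean()
    print(f"mean power {p_nominal:.0f} kW, propagated band [{p_low:.0f}, {p_high:.0f}] kW")
    ```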

  18. Wind power error estimation in resource assessments.

    Science.gov (United States)

    Rodríguez, Osvaldo; Del Río, Jesús A; Jaramillo, Oscar A; Martínez, Manuel

    2015-01-01

    Estimating the power output is one of the elements that determine the techno-economic feasibility of a renewable project. At present, there is a need to develop reliable methods that achieve this goal, thereby contributing to wind power penetration. In this study, we propose a method for wind power error estimation based on the wind speed measurement error, probability density function, and wind turbine power curves. This method uses the actual wind speed data without prior statistical treatment based on 28 wind turbine power curves, which were fitted by Lagrange's method, to calculate the estimated wind power output and the corresponding error propagation. We found that wind speed percentage errors of 10% were propagated into the power output estimates, thereby yielding an error of 5%. The proposed error propagation complements the traditional power resource assessments. The wind power estimation error also allows us to estimate intervals for the power production leveled cost or the investment time return. The implementation of this method increases the reliability of techno-economic resource assessment studies.

  19. Assessment Methods in Statistical Education An International Perspective

    CERN Document Server

    Bidgood, Penelope; Jolliffe, Flavia

    2010-01-01

    This book is a collaboration from leading figures in statistical education and is designed primarily for academic audiences involved in teaching statistics and mathematics. The book is divided into four sections: (1) assessment using real-world problems, (2) assessing statistical thinking, (3) individual assessment, and (4) successful assessment strategies.

  20. Statistic method of research reactors maximum permissible power calculation

    International Nuclear Information System (INIS)

    Grosheva, N.A.; Kirsanov, G.A.; Konoplev, K.A.; Chmshkyan, D.V.

    1998-01-01

    A technique is presented for calculating the maximum permissible power of a research reactor at which the probability of a thermal-process accident does not exceed a specified value. A statistical method is used for the calculations. The determining function related to reactor safety is treated as a known function of the reactor power and of many statistically independent variables, including the reactor process parameters, the geometrical characteristics of the reactor core and fuel elements, and random factors connected with the specific features of the reactor. Heat flux density or temperature is taken as the limiting factor. The program implementation of the method is briefly described. As an example, the results of calculating the PIK reactor margin coefficients for different probabilities of a thermal-process accident are considered. It is shown that the probability of an accident with fuel element melting in the hot zone is lower than 10^-8 per year at the rated reactor power.
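    A schematic Monte Carlo version of this kind of calculation is sketched below: the peak heat flux is written as a function of power and of a few statistically independent uncertain parameters, and the highest power is sought for which the probability of exceeding the thermal limit stays below a target value. All parameter values are invented for illustration and have nothing to do with the PIK reactor.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    q_limit = 4.0e6          # assumed critical heat flux limit, W/m^2

    def peak_heat_flux(power_mw, n=200_000):
        # Peak flux = power * hot-channel peaking factor / effective area,
        # with both factors treated as random variables (assumed distributions).
        peaking = rng.normal(2.5, 0.1, n)
        area = rng.normal(10.0, 0.3, n)    # effective heat-transfer area, m^2
        return power_mw * 1e6 * peaking / area

    for power in (10, 12, 14, 16):
        prob = np.mean(peak_heat_flux(power) > q_limit)
        print(f"{power} MW: estimated P(exceeding the limit) = {prob:.1e}")
    ```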

  1. Maximizing Statistical Power When Verifying Probabilistic Forecasts of Hydrometeorological Events

    Science.gov (United States)

    DeChant, C. M.; Moradkhani, H.

    2014-12-01

    Hydrometeorological events (i.e. floods, droughts, precipitation) are increasingly being forecasted probabilistically, owing to the uncertainties in the underlying causes of the phenomenon. In these forecasts, the probability of the event, over some lead time, is estimated based on some model simulations or predictive indicators. By issuing probabilistic forecasts, agencies may communicate the uncertainty in the event occurring. Assuming that the assigned probability of the event is correct, which is referred to as a reliable forecast, the end user may perform some risk management based on the potential damages resulting from the event. Alternatively, an unreliable forecast may give false impressions of the actual risk, leading to improper decision making when protecting resources from extreme events. Due to this requisite for reliable forecasts to perform effective risk management, this study takes a renewed look at reliability assessment in event forecasts. Illustrative experiments will be presented, showing deficiencies in the commonly available approaches (Brier Score, Reliability Diagram). Overall, it is shown that the conventional reliability assessment techniques do not maximize the ability to distinguish between a reliable and unreliable forecast. In this regard, a theoretical formulation of the probabilistic event forecast verification framework will be presented. From this analysis, hypothesis testing with the Poisson-Binomial distribution is the most exact model available for the verification framework, and therefore maximizes one's ability to distinguish between a reliable and unreliable forecast. Application of this verification system was also examined within a real forecasting case study, highlighting the additional statistical power provided with the use of the Poisson-Binomial distribution.
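    A minimal sketch of the advocated verification statistic is given below: under the hypothesis that the issued probabilities are reliable, the number of observed events follows a Poisson-binomial distribution, whose exact probability mass function can be built by convolution. The forecast probabilities and outcomes are invented for illustration.

    ```python
    import numpy as np

    def poisson_binomial_pmf(probs):
        # P(k events) for independent Bernoulli trials with success probabilities probs,
        # built by successive convolution (exact, O(n^2)).
        pmf = np.array([1.0])
        for p in probs:
            pmf = np.convolve(pmf, [1.0 - p, p])
        return pmf

    forecast_probs = np.array([0.1, 0.4, 0.7, 0.2, 0.9, 0.5])  # issued event probabilities
    observed = np.array([0, 1, 1, 0, 1, 1])                    # whether the event occurred

    pmf = poisson_binomial_pmf(forecast_probs)
    k_obs = int(observed.sum())
    # Two-sided "exact" p-value: total probability of counts no more likely than the observed one.
    p_value = pmf[pmf <= pmf[k_obs]].sum()
    print(f"{k_obs} events observed, p-value under the reliability hypothesis = {p_value:.3f}")
    ```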

  2. Effect size, confidence intervals and statistical power in psychological research.

    Directory of Open Access Journals (Sweden)

    Téllez A.

    2015-07-01

    Full Text Available Quantitative psychological research is focused on detecting the occurrence of certain population phenomena by analyzing data from a sample, and statistics is a particularly helpful mathematical tool that is used by researchers to evaluate hypotheses and make decisions to accept or reject such hypotheses. In this paper, the various statistical tools in psychological research are reviewed. The limitations of null hypothesis significance testing (NHST and the advantages of using effect size and its respective confidence intervals are explained, as the latter two measurements can provide important information about the results of a study. These measurements also can facilitate data interpretation and easily detect trivial effects, enabling researchers to make decisions in a more clinically relevant fashion. Moreover, it is recommended to establish an appropriate sample size by calculating the optimum statistical power at the moment that the research is designed. Psychological journal editors are encouraged to follow APA recommendations strictly and ask authors of original research studies to report the effect size, its confidence intervals, statistical power and, when required, any measure of clinical significance. Additionally, we must account for the teaching of statistics at the graduate level. At that level, students do not receive sufficient information concerning the importance of using different types of effect sizes and their confidence intervals according to the different types of research designs; instead, most of the information is focused on the various tools of NHST.
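    For illustration, the sketch below computes a standardized effect size (Cohen's d) with an approximate large-sample confidence interval alongside the usual t-test p-value, which is the style of reporting the paper recommends; the two groups are simulated and the normal-approximation standard error is an assumption of the sketch.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    a = rng.normal(0.5, 1.0, 40)   # simulated treatment group
    b = rng.normal(0.0, 1.0, 40)   # simulated control group

    n1, n2 = len(a), len(b)
    sp = np.sqrt(((n1 - 1) * a.var(ddof=1) + (n2 - 1) * b.var(ddof=1)) / (n1 + n2 - 2))
    d = (a.mean() - b.mean()) / sp                                   # Cohen's d
    se_d = np.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))   # approximate SE
    lo, hi = d - 1.96 * se_d, d + 1.96 * se_d

    t, p = stats.ttest_ind(a, b)
    print(f"d = {d:.2f}, approximate 95% CI = ({lo:.2f}, {hi:.2f}), t-test p = {p:.3f}")
    ```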

  3. Development and testing of improved statistical wind power forecasting methods.

    Energy Technology Data Exchange (ETDEWEB)

    Mendes, J.; Bessa, R.J.; Keko, H.; Sumaili, J.; Miranda, V.; Ferreira, C.; Gama, J.; Botterud, A.; Zhou, Z.; Wang, J. (Decision and Information Sciences); (INESC Porto)

    2011-12-06

    Wind power forecasting (WPF) provides important inputs to power system operators and electricity market participants. It is therefore not surprising that WPF has attracted increasing interest within the electric power industry. In this report, we document our research on improving statistical WPF algorithms for point, uncertainty, and ramp forecasting. Below, we provide a brief introduction to the research presented in the following chapters. For a detailed overview of the state-of-the-art in wind power forecasting, we refer to [1]. Our related work on the application of WPF in operational decisions is documented in [2]. Point forecasts of wind power are highly dependent on the training criteria used in the statistical algorithms that are used to convert weather forecasts and observational data to a power forecast. In Chapter 2, we explore the application of information theoretic learning (ITL) as opposed to the classical minimum square error (MSE) criterion for point forecasting. In contrast to the MSE criterion, ITL criteria do not assume a Gaussian distribution of the forecasting errors. We investigate to what extent ITL criteria yield better results. In addition, we analyze time-adaptive training algorithms and how they enable WPF algorithms to cope with non-stationary data and, thus, to adapt to new situations without requiring additional offline training of the model. We test the new point forecasting algorithms on two wind farms located in the U.S. Midwest. Although there have been advancements in deterministic WPF, a single-valued forecast cannot provide information on the dispersion of observations around the predicted value. We argue that it is essential to generate, together with (or as an alternative to) point forecasts, a representation of the wind power uncertainty. Wind power uncertainty representation can take the form of probabilistic forecasts (e.g., probability density function, quantiles), risk indices (e.g., prediction risk index) or scenarios

  4. The assessment of fusion power

    International Nuclear Information System (INIS)

    Bickerton, Roy

    1990-01-01

    It is argued that the recent 'Science and Technology Options Assessments' of fusion power produced for the European Parliament is incorrect and misleading. The report takes no account of the complex organizational structure of the European fusion programme, it misrepresents history, and it presents incomprehensible graphical evidence and criteria which are narrowly based and largely platitudinous. (author)

  5. Statistical Approaches to Assess Biosimilarity from Analytical Data.

    Science.gov (United States)

    Burdick, Richard; Coffey, Todd; Gutka, Hiten; Gratzl, Gyöngyi; Conlon, Hugh D; Huang, Chi-Ting; Boyne, Michael; Kuehne, Henriette

    2017-01-01

    Protein therapeutics have unique critical quality attributes (CQAs) that define their purity, potency, and safety. The analytical methods used to assess CQAs must be able to distinguish clinically meaningful differences in comparator products, and the most important CQAs should be evaluated with the most statistical rigor. High-risk CQA measurements assess the most important attributes that directly impact the clinical mechanism of action or have known implications for safety, while the moderate- to low-risk characteristics may have a lower direct impact and thereby may have a broader range to establish similarity. Statistical equivalence testing is applied for high-risk CQA measurements to establish the degree of similarity (e.g., highly similar fingerprint, highly similar, or similar) of selected attributes. Notably, some high-risk CQAs (e.g., primary sequence or disulfide bonding) are qualitative (e.g., the same as the originator or not the same) and therefore not amenable to equivalence testing. For biosimilars, an important step is the acquisition of a sufficient number of unique originator drug product lots to measure the variability in the originator drug manufacturing process and provide sufficient statistical power for the analytical data comparisons. Together, these analytical evaluations, along with PK/PD and safety data (immunogenicity), provide the data necessary to determine if the totality of the evidence warrants a designation of biosimilarity and subsequent licensure for marketing in the USA. In this paper, a case study approach is used to provide examples of analytical similarity exercises and the appropriateness of statistical approaches for the example data.
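    As an illustration of equivalence testing for a quantitative attribute, the sketch below applies a two one-sided tests (TOST) procedure to simulated lot data, using a margin of 1.5 standard deviations of the originator lots (a commonly cited convention, here simply assumed). It is a generic statistical sketch, not a regulatory procedure.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    originator = rng.normal(100.0, 4.0, 12)   # simulated originator lot measurements
    biosimilar = rng.normal(101.0, 4.0, 10)   # simulated biosimilar lot measurements
    margin = 1.5 * originator.std(ddof=1)     # assumed equivalence margin

    diff = biosimilar.mean() - originator.mean()
    se = np.sqrt(biosimilar.var(ddof=1) / len(biosimilar)
                 + originator.var(ddof=1) / len(originator))
    df = len(biosimilar) + len(originator) - 2   # pooled df (a Welch correction would be finer)

    p_lower = stats.t.sf((diff + margin) / se, df)   # H0: diff <= -margin
    p_upper = stats.t.cdf((diff - margin) / se, df)  # H0: diff >= +margin
    print(f"TOST p-value = {max(p_lower, p_upper):.3f} (equivalence claimed if < 0.05)")
    ```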

  6. Demographic statistics pertaining to nuclear power reactor sites

    International Nuclear Information System (INIS)

    1979-10-01

    Population statistics are presented for 145 nuclear power plant sites. Summary tables and figures are included that were developed to aid in the evaluation of trends and general patterns associated with the various parameters of interest, such as the proximity of nuclear plant sites to centers of population. The primary reason for publishing this information at this time is to provide a factual basis for use in discussions on the subject of reactor siting policy. The report is a revised and updated version of a draft report published in December 1977. Errors in the population data base have been corrected and new data tabulations added

  7. Multivariate statistical assessment of coal properties

    Czech Academy of Sciences Publication Activity Database

    Klika, Z.; Serenčíšová, J.; Kožušníková, Alena; Kolomazník, I.; Študentová, S.; Vontorová, J.

    2014-01-01

    Roč. 128, č. 128 (2014), s. 119-127 ISSN 0378-3820 R&D Projects: GA MŠk ED2.1.00/03.0082 Institutional support: RVO:68145535 Keywords : coal properties * structural,chemical and petrographical properties * multivariate statistics Subject RIV: DH - Mining, incl. Coal Mining Impact factor: 3.352, year: 2014 http://dx.doi.org/10.1016/j.fuproc.2014.06.029

  8. Model output statistics applied to wind power prediction

    Energy Technology Data Exchange (ETDEWEB)

    Joensen, A; Giebel, G; Landberg, L [Risoe National Lab., Roskilde (Denmark); Madsen, H; Nielsen, H A [The Technical Univ. of Denmark, Dept. of Mathematical Modelling, Lyngby (Denmark)

    1999-03-01

    Being able to predict the output of a wind farm online for a day or two in advance has significant advantages for utilities, such as better possibility to schedule fossil fuelled power plants and a better position on electricity spot markets. In this paper prediction methods based on Numerical Weather Prediction (NWP) models are considered. The spatial resolution used in NWP models implies that these predictions are not valid locally at a specific wind farm. Furthermore, due to the non-stationary nature and complexity of the processes in the atmosphere, and occasional changes of NWP models, the deviation between the predicted and the measured wind will be time dependent. If observational data is available, and if the deviation between the predictions and the observations exhibits systematic behavior, this should be corrected for; if statistical methods are used, this approaches is usually referred to as MOS (Model Output Statistics). The influence of atmospheric turbulence intensity, topography, prediction horizon length and auto-correlation of wind speed and power is considered, and to take the time-variations into account, adaptive estimation methods are applied. Three estimation techniques are considered and compared, Extended Kalman Filtering, recursive least squares and a new modified recursive least squares algorithm. (au) EU-JOULE-3. 11 refs.
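    A compact sketch of the adaptive estimation idea is given below: a recursive least squares update with a forgetting factor maps the NWP wind speed to observed farm power, so the correction tracks slow changes in the model bias. The quadratic regressor and all data are synthetic assumptions; the implementations described in the record are considerably richer.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    v_nwp = rng.uniform(3, 15, 500)                   # forecast wind speeds, m/s
    p_obs = 0.9 * v_nwp**2 + rng.normal(0, 5, 500)    # "observed" farm power, arbitrary units

    lam = 0.99                    # forgetting factor: recent errors weigh more
    theta = np.zeros(3)           # coefficients of the regressor [1, v, v^2]
    P = np.eye(3) * 1e3           # large initial covariance (weak prior)

    for v, y in zip(v_nwp, p_obs):
        x = np.array([1.0, v, v**2])
        k = P @ x / (lam + x @ P @ x)        # RLS gain
        theta = theta + k * (y - x @ theta)  # update on the current prediction error
        P = (P - np.outer(k, x @ P)) / lam

    print("adaptive MOS coefficients [1, v, v^2]:", np.round(theta, 3))
    ```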

  9. Risk assessment and nuclear power

    International Nuclear Information System (INIS)

    Bodansky, D.

    1982-01-01

    The range of risk perceptions involving nuclear power is so great that there is little hope of bridging extreme positions, but a consensus based upon reasoned discussion among uncommitted people could determine a sensible path. Our concerns over the uncertainties of risk assessment have made it increasingly difficult to make responsible decisions fast enough to deal with modern needs. The result is an immobility in energy matters that can point to a 2% reduction in oil use as its only triumph. The risk of nuclear war as a result of military action over energy issues suggests to some that the solution is to abolish nuclear power (however impractical) and to others that a rapid spread of nuclear power will eliminate energy as an incentive for war. If nuclear war is the major risk to consider, risk assessments need to include the risks of war, as well as those of carbon dioxide buildup and socio-economic disruptions, all of which loom larger than the risks of nuclear-plant accidents. Energy choices should be aimed at diminishing these major risks, even if they include the use of nuclear power. 26 references

  10. Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.

    Science.gov (United States)

    Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg

    2009-11-01

    G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.
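    For a rough sense of what such calculations return, the sketch below approximates the power of a two-sided test that a Pearson correlation is zero, using the standard Fisher-z approximation rather than G*Power's exact routines; the values r = 0.3 and n = 50 are arbitrary.

    ```python
    import math
    from scipy import stats

    def correlation_power(r, n, alpha=0.05):
        # Fisher z-transform of r, scaled by its approximate precision sqrt(n - 3).
        z_r = math.atanh(r) * math.sqrt(n - 3)
        z_crit = stats.norm.ppf(1 - alpha / 2)
        return stats.norm.sf(z_crit - z_r) + stats.norm.cdf(-z_crit - z_r)

    print(f"r = 0.3, n = 50: approximate power = {correlation_power(0.3, 50):.2f}")
    ```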

  11. Statistical reliability assessment of software-based systems

    International Nuclear Information System (INIS)

    Korhonen, J.; Pulkkinen, U.; Haapanen, P.

    1997-01-01

    Plant vendors nowadays propose software-based systems even for the most critical safety functions. The reliability estimation of safety critical software-based systems is difficult since the conventional modeling techniques do not necessarily apply to the analysis of these systems, and the quantification seems to be impossible. Due to lack of operational experience and due to the nature of software faults, the conventional reliability estimation methods can not be applied. New methods are therefore needed for the safety assessment of software-based systems. In the research project Programmable automation systems in nuclear power plants (OHA), financed together by the Finnish Centre for Radiation and Nuclear Safety (STUK), the Ministry of Trade and Industry and the Technical Research Centre of Finland (VTT), various safety assessment methods and tools for software based systems are developed and evaluated. This volume in the OHA-report series deals with the statistical reliability assessment of software based systems on the basis of dynamic test results and qualitative evidence from the system design process. Other reports to be published later on in OHA-report series will handle the diversity requirements in safety critical software-based systems, generation of test data from operational profiles and handling of programmable automation in plant PSA-studies. (orig.) (25 refs.)
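    One standard statistical argument in this setting, shown below as a generic sketch (not the OHA project's specific method), is the one-sided binomial bound: after n failure-free tests drawn from the operational profile, the per-demand failure probability can be bounded with a stated confidence.

    ```python
    def upper_bound_failure_prob(n_tests, confidence=0.95):
        # If n independent demands all succeed, then with the stated confidence
        # the per-demand failure probability satisfies p <= 1 - (1 - confidence)**(1/n).
        return 1.0 - (1.0 - confidence) ** (1.0 / n_tests)

    for n in (1_000, 10_000, 100_000):
        print(f"n = {n:>7} failure-free tests: p <= {upper_bound_failure_prob(n):.2e} (95% confidence)")
    ```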

  12. Statistical learning: a powerful mechanism that operates by mere exposure.

    Science.gov (United States)

    Aslin, Richard N

    2017-01-01

    How do infants learn so rapidly and with little apparent effort? In 1996, Saffran, Aslin, and Newport reported that 8-month-old human infants could learn the underlying temporal structure of a stream of speech syllables after only 2 min of passive listening. This demonstration of what was called statistical learning, involving no instruction, reinforcement, or feedback, led to dozens of confirmations of this powerful mechanism of implicit learning in a variety of modalities, domains, and species. These findings reveal that infants are not nearly as dependent on explicit forms of instruction as we might have assumed from studies of learning in which children or adults are taught facts such as math or problem solving skills. Instead, at least in some domains, infants soak up the information around them by mere exposure. Learning and development in these domains thus appear to occur automatically and with little active involvement by an instructor (parent or teacher). The details of this statistical learning mechanism are discussed, including how exposure to specific types of information can, under some circumstances, generalize to never-before-observed information, thereby enabling transfer of learning. WIREs Cogn Sci 2017, 8:e1373. doi: 10.1002/wcs.1373 For further resources related to this article, please visit the WIREs website. © 2016 Wiley Periodicals, Inc.

  13. Prediction of lacking control power in power plants using statistical models

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Mataji, B.; Stoustrup, Jakob

    2007-01-01

    Prediction of the performance of plants like power plants is of interest, since the plant operator can use these predictions to optimize the plant production. In this paper the focus is addressed on a special case where a combination of high coal moisture content and a high load limits the possible plant load, meaning that the requested plant load cannot be met. The available models are in this case uncertain. Instead statistical methods are used to predict upper and lower uncertainty bounds on the prediction. Two different methods are used. The first relies on statistics of recent prediction errors; the second uses operating-point-dependent statistics of prediction errors. Using these methods on the previously mentioned case, it can be concluded that the second method can be used to predict the power plant performance, while the first method has problems predicting the uncertain performance...
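    A toy version of the first method is sketched below: an empirical uncertainty band is built from the quantiles of recent prediction errors and attached to the next prediction. The load series and noise level are synthetic assumptions, not plant data.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    predicted = np.linspace(200, 400, 300)            # model-predicted plant load, MW
    actual = predicted + rng.normal(0, 8, 300)        # "measured" load with noise

    window, alpha = 50, 0.10
    errors = (actual - predicted)[:-1]                # prediction errors seen so far
    lo, hi = np.quantile(errors[-window:], [alpha / 2, 1 - alpha / 2])

    next_pred = predicted[-1]
    print(f"next prediction {next_pred:.0f} MW, "
          f"empirical 90% band [{next_pred + lo:.0f}, {next_pred + hi:.0f}] MW")
    ```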

  14. A Framework for Assessing High School Students' Statistical Reasoning.

    Science.gov (United States)

    Chan, Shiau Wei; Ismail, Zaleha; Sumintono, Bambang

    2016-01-01

    Based on a synthesis of literature, earlier studies, analyses and observations on high school students, this study developed an initial framework for assessing students' statistical reasoning about descriptive statistics. Framework descriptors were established across five levels of statistical reasoning and four key constructs. The former consisted of idiosyncratic reasoning, verbal reasoning, transitional reasoning, procedural reasoning, and integrated process reasoning. The latter include describing data, organizing and reducing data, representing data, and analyzing and interpreting data. In contrast to earlier studies, this initial framework formulated a complete and coherent statistical reasoning framework. A statistical reasoning assessment tool was then constructed from this initial framework. The tool was administered to 10 tenth-grade students in a task-based interview. The initial framework was refined, and the statistical reasoning assessment tool was revised. The ten students then participated in the second task-based interview, and the data obtained were used to validate the framework. The findings showed that the students' statistical reasoning levels were consistent across the four constructs, and this result confirmed the framework's cohesion. Developed to contribute to statistics education, this newly developed statistical reasoning framework provides a guide for planning learning goals and designing instruction and assessments.

  15. Power, effects, confidence, and significance: an investigation of statistical practices in nursing research.

    Science.gov (United States)

    Gaskin, Cadeyrn J; Happell, Brenda

    2014-05-01

    To (a) assess the statistical power of nursing research to detect small, medium, and large effect sizes; (b) estimate the experiment-wise Type I error rate in these studies; and (c) assess the extent to which (i) a priori power analyses, (ii) effect sizes (and interpretations thereof), and (iii) confidence intervals were reported. Statistical review. Papers published in the 2011 volumes of the 10 highest ranked nursing journals, based on their 5-year impact factors. Papers were assessed for statistical power, control of experiment-wise Type I error, reporting of a priori power analyses, reporting and interpretation of effect sizes, and reporting of confidence intervals. The analyses were based on 333 papers, from which 10,337 inferential statistics were identified. The median power to detect small, medium, and large effect sizes was .40 (interquartile range [IQR]=.24-.71), .98 (IQR=.85-1.00), and 1.00 (IQR=1.00-1.00), respectively. The median experiment-wise Type I error rate was .54 (IQR=.26-.80). A priori power analyses were reported in 28% of papers. Effect sizes were routinely reported for Spearman's rank correlations (100% of papers in which this test was used), Poisson regressions (100%), odds ratios (100%), Kendall's tau correlations (100%), Pearson's correlations (99%), logistic regressions (98%), structural equation modelling/confirmatory factor analyses/path analyses (97%), and linear regressions (83%), but were reported less often for two-proportion z tests (50%), analyses of variance/analyses of covariance/multivariate analyses of variance (18%), t tests (8%), Wilcoxon's tests (8%), Chi-squared tests (8%), and Fisher's exact tests (7%), and not reported for sign tests, Friedman's tests, McNemar's tests, multi-level models, and Kruskal-Wallis tests. Effect sizes were infrequently interpreted. Confidence intervals were reported in 28% of papers. The use, reporting, and interpretation of inferential statistics in nursing research need substantial

  16. HVDC power transmission technology assessment

    Energy Technology Data Exchange (ETDEWEB)

    Hauth, R.L.; Tatro, P.J.; Railing, B.D. [New England Power Service Co., Westborough, MA (United States); Johnson, B.K.; Stewart, J.R. [Power Technologies, Inc., Schenectady, NY (United States); Fink, J.L.

    1997-04-01

    The purpose of this study was to develop an assessment of the national utility system's needs for electric transmission during the period 1995-2020 that could be met by future reduced-cost HVDC systems. The assessment was to include an economic evaluation of HVDC as a means for meeting those needs as well as a comparison with competing technologies such as ac transmission with and without Flexible AC Transmission System (FACTS) controllers. The role of force commutated dc converters was to be assumed where appropriate. The assessment begins by identifying the general needs for transmission in the U.S. in the context of a future deregulated power industry. The possible roles for direct current transmission are then postulated in terms of representative scenarios. A few of the scenarios are illustrated with the help of actual U.S. system examples. Non-traditional applications as well as traditional applications such as long lines and asynchronous interconnections are discussed. The classical 'break-even distance' concept for comparing HVDC and ac lines is used to assess the selected scenarios. The impact of reduced-cost converters is reflected in terms of the break-even distance. This report presents a comprehensive review of the functional benefits of HVDC transmission and updated cost data for both ac and dc system components. It also provides some provocative thoughts on how direct current transmission might be applied to better utilize and expand our nation's increasingly stressed transmission assets.

  17. Assessment of alternatives to correct inventory difference statistical treatment deficiencies

    International Nuclear Information System (INIS)

    Byers, K.R.; Johnston, J.W.; Bennett, C.A.; Brouns, R.J.; Mullen, M.F.; Roberts, F.P.

    1983-11-01

    This document presents an analysis of alternatives to correct deficiencies in the statistical treatment of inventory differences in the NRC guidance documents and licensee practice. Pacific Northwest Laboratory's objective for this study was to assess alternatives developed by the NRC and a panel of safeguards statistical experts. Criteria were developed for the evaluation and the assessment was made considering the criteria. The results of this assessment are PNL recommendations, which are intended to provide NRC decision makers with a logical and statistically sound basis for correcting the deficiencies

  18. Self-assessed performance improves statistical fusion of image labels

    International Nuclear Information System (INIS)

    Bryan, Frederick W.; Xu, Zhoubing; Asman, Andrew J.; Allen, Wade M.; Reich, Daniel S.; Landman, Bennett A.

    2014-01-01

    Purpose: Expert manual labeling is the gold standard for image segmentation, but this process is difficult, time-consuming, and prone to inter-individual differences. While fully automated methods have successfully targeted many anatomies, automated methods have not yet been developed for numerous essential structures (e.g., the internal structure of the spinal cord as seen on magnetic resonance imaging). Collaborative labeling is a new paradigm that offers a robust alternative that may realize both the throughput of automation and the guidance of experts. Yet, distributing manual labeling expertise across individuals and sites introduces potential human factors concerns (e.g., training, software usability) and statistical considerations (e.g., fusion of information, assessment of confidence, bias) that must be further explored. During the labeling process, it is simple to ask raters to self-assess the confidence of their labels, but this is rarely done and has not been previously quantitatively studied. Herein, the authors explore the utility of self-assessment in relation to automated assessment of rater performance in the context of statistical fusion. Methods: The authors conducted a study of 66 volumes manually labeled by 75 minimally trained human raters recruited from the university undergraduate population. Raters were given 15 min of training during which they were shown examples of correct segmentation, and the online segmentation tool was demonstrated. The volumes were labeled 2D slice-wise, and the slices were unordered. A self-assessed quality metric was produced by raters for each slice by marking a confidence bar superimposed on the slice. Volumes produced by both voting and statistical fusion algorithms were compared against a set of expert segmentations of the same volumes. Results: Labels for 8825 distinct slices were obtained. Simple majority voting resulted in statistically poorer performance than voting weighted by self-assessed performance
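    The contrast at the heart of the result can be sketched in a few lines: for a single voxel, compare a simple majority vote with a vote weighted by each rater's self-assessed confidence (toy numbers below; the paper's statistical fusion is far more elaborate).

    ```python
    import numpy as np

    labels = np.array([1, 1, 0, 1, 0])                 # labels from five raters for one voxel
    confidence = np.array([0.9, 0.8, 0.3, 0.7, 0.4])   # their self-assessed confidences

    majority = int(labels.mean() >= 0.5)
    weighted = int(np.average(labels, weights=confidence) >= 0.5)
    print(f"majority vote: {majority}, confidence-weighted vote: {weighted}")
    ```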

  19. Self-assessed performance improves statistical fusion of image labels

    Energy Technology Data Exchange (ETDEWEB)

    Bryan, Frederick W., E-mail: frederick.w.bryan@vanderbilt.edu; Xu, Zhoubing; Asman, Andrew J.; Allen, Wade M. [Electrical Engineering, Vanderbilt University, Nashville, Tennessee 37235 (United States); Reich, Daniel S. [Translational Neuroradiology Unit, National Institute of Neurological Disorders and Stroke, National Institutes of Health, Bethesda, Maryland 20892 (United States); Landman, Bennett A. [Electrical Engineering, Vanderbilt University, Nashville, Tennessee 37235 (United States); Biomedical Engineering, Vanderbilt University, Nashville, Tennessee 37235 (United States); and Radiology and Radiological Sciences, Vanderbilt University, Nashville, Tennessee 37235 (United States)

    2014-03-15

    Purpose: Expert manual labeling is the gold standard for image segmentation, but this process is difficult, time-consuming, and prone to inter-individual differences. While fully automated methods have successfully targeted many anatomies, automated methods have not yet been developed for numerous essential structures (e.g., the internal structure of the spinal cord as seen on magnetic resonance imaging). Collaborative labeling is a new paradigm that offers a robust alternative that may realize both the throughput of automation and the guidance of experts. Yet, distributing manual labeling expertise across individuals and sites introduces potential human factors concerns (e.g., training, software usability) and statistical considerations (e.g., fusion of information, assessment of confidence, bias) that must be further explored. During the labeling process, it is simple to ask raters to self-assess the confidence of their labels, but this is rarely done and has not been previously quantitatively studied. Herein, the authors explore the utility of self-assessment in relation to automated assessment of rater performance in the context of statistical fusion. Methods: The authors conducted a study of 66 volumes manually labeled by 75 minimally trained human raters recruited from the university undergraduate population. Raters were given 15 min of training during which they were shown examples of correct segmentation, and the online segmentation tool was demonstrated. The volumes were labeled 2D slice-wise, and the slices were unordered. A self-assessed quality metric was produced by raters for each slice by marking a confidence bar superimposed on the slice. Volumes produced by both voting and statistical fusion algorithms were compared against a set of expert segmentations of the same volumes. Results: Labels for 8825 distinct slices were obtained. Simple majority voting resulted in statistically poorer performance than voting weighted by self-assessed performance

  20. Statistical analysis about corrosion in nuclear power plants

    International Nuclear Information System (INIS)

    Naquid G, C.; Medina F, A.; Zamora R, L.

    1999-01-01

    Investigations have been carried out into the degradation mechanisms of structures, systems and components in nuclear power plants, since many of the processes involved determine their reliability, the integrity of their components, and safety, among other aspects. This work presents statistics of the studies related to materials corrosion in its wide variety of specific mechanisms, as reported worldwide for PWR, BWR and WWER reactors, analysing the AIRS (Advanced Incident Reporting System) records for the period 1993-1998 for the first two reactor types and for the period 1982-1995 for the WWER. The identification of factors allows the cases to be characterized as those which apply, i.e. those that occurred through the presence of some corrosion mechanism, and those which do not apply, being due instead to incidental natural factors, mechanical failures and human errors. Finally, the total number of cases analysed corresponds to the sum of the cases which apply and which do not apply. (Author)

  1. Statistics

    International Nuclear Information System (INIS)

    2005-01-01

    For the years 2004 and 2005 the figures shown in the tables of Energy Review are partly preliminary. The annual statistics published in Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time-series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004.) The applied energy units and conversion coefficients are shown in the back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures presents: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossile fuels use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord pool power exchange, Total energy consumption by source and CO 2 -emissions, Supplies and total consumption of electricity GWh, Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes, precautionary stock fees and oil pollution fees

  2. Improved power performance assessment methods

    Energy Technology Data Exchange (ETDEWEB)

    Frandsen, S; Antoniou, I; Dahlberg, J A [and others]

    1999-03-01

    The uncertainty of presently-used methods for retrospective assessment of the productive capacity of wind farms is unacceptably large. The possibilities of improving the accuracy have been investigated and are reported. A method is presented that includes an extended power curve and site calibration. In addition, blockage effects with respect to reference wind speed measurements are analysed. It is found that significant accuracy improvements are possible by the introduction of more input variables such as turbulence and wind shear, in addition to mean wind speed and air density. Also, the testing of several or all machines in the wind farm - instead of only one or two - may provide a better estimate of the average performance. (au)

  3. A statistical model of uplink inter-cell interference with slow and fast power control mechanisms

    KAUST Repository

    Tabassum, Hina

    2013-09-01

    Uplink power control is in essence an interference mitigation technique that aims at minimizing the inter-cell interference (ICI) in cellular networks by reducing the transmit power levels of the mobile users while maintaining their target received signal quality levels at base stations. Power control mechanisms directly impact the interference dynamics and, thus, affect the overall achievable capacity and consumed power in cellular networks. Due to the stochastic nature of wireless channels and mobile users' locations, it is important to derive theoretical models for ICI that can capture the impact of design alternatives related to power control mechanisms. To this end, we derive and verify a novel statistical model for uplink ICI in Generalized-K composite fading environments as a function of various slow and fast power control mechanisms. The derived expressions are then utilized to quantify numerically key network performance metrics that include average resource fairness, average reduction in power consumption, and ergodic capacity. The accuracy of the derived expressions is validated via Monte-Carlo simulations. Results are generated for multiple network scenarios, and insights are extracted to assess various power control mechanisms as a function of system parameters. © 1972-2012 IEEE.

  4. A statistical model of uplink inter-cell interference with slow and fast power control mechanisms

    KAUST Repository

    Tabassum, Hina; Yilmaz, Ferkan; Dawy, Zaher; Alouini, Mohamed-Slim

    2013-01-01

    Uplink power control is in essence an interference mitigation technique that aims at minimizing the inter-cell interference (ICI) in cellular networks by reducing the transmit power levels of the mobile users while maintaining their target received signal quality levels at base stations. Power control mechanisms directly impact the interference dynamics and, thus, affect the overall achievable capacity and consumed power in cellular networks. Due to the stochastic nature of wireless channels and mobile users' locations, it is important to derive theoretical models for ICI that can capture the impact of design alternatives related to power control mechanisms. To this end, we derive and verify a novel statistical model for uplink ICI in Generalized-K composite fading environments as a function of various slow and fast power control mechanisms. The derived expressions are then utilized to quantify numerically key network performance metrics that include average resource fairness, average reduction in power consumption, and ergodic capacity. The accuracy of the derived expressions is validated via Monte-Carlo simulations. Results are generated for multiple network scenarios, and insights are extracted to assess various power control mechanisms as a function of system parameters. © 1972-2012 IEEE.

  5. Statistical problems in the assessment of nuclear risks

    International Nuclear Information System (INIS)

    Easterling, R.G.

    1980-01-01

    Information on nuclear power plant risk assessment is presented concerning attitudinal problems and methodological problems involving expert opinions, human error probabilities, non-independent events, uncertainty analysis, and acceptable risk criteria.

  6. Statistics

    International Nuclear Information System (INIS)

    2003-01-01

    For the year 2002, part of the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also includes historical time-series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures presents: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossile fuels use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord pool power exchange, Total energy consumption by source and CO 2 -emissions, Supply and total consumption of electricity GWh, Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees on oil pollution fees on energy products

  7. Statistics

    International Nuclear Information System (INIS)

    2004-01-01

    For the year 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also includes historical time-series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures presents: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossile fuels use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord pool power exchange, Total energy consumption by source and CO 2 -emissions, Supplies and total consumption of electricity GWh, Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees on oil pollution fees

  8. Power spectra as a diagnostic tool in probing statistical/nonstatistical behavior in unimolecular reactions

    Science.gov (United States)

    Chang, Xiaoyen Y.; Sewell, Thomas D.; Raff, Lionel M.; Thompson, Donald L.

    1992-11-01

    The possibility of utilizing different types of power spectra obtained from classical trajectories as a diagnostic tool to identify the presence of nonstatistical dynamics is explored by using the unimolecular bond-fission reactions of 1,2-difluoroethane and the 2-chloroethyl radical as test cases. In previous studies, the reaction rates for these systems were calculated by using a variational transition-state theory and classical trajectory methods. A comparison of the results showed that 1,2-difluoroethane is a nonstatistical system, while the 2-chloroethyl radical behaves statistically. Power spectra for these two systems have been generated under various conditions. The characteristics of these spectra are as follows: (1) The spectra for the 2-chloroethyl radical are always broader and more coupled to other modes than is the case for 1,2-difluoroethane. This is true even at very low levels of excitation. (2) When an internal energy near or above the dissociation threshold is initially partitioned into a local C-H stretching mode, the power spectra for 1,2-difluoroethane broaden somewhat, but discrete and somewhat isolated bands are still clearly evident. In contrast, the analogous power spectra for the 2-chloroethyl radical exhibit a near complete absence of isolated bands. The general appearance of the spectrum suggests a very high level of mode-to-mode coupling, large intramolecular vibrational energy redistribution (IVR) rates, and global statistical behavior. (3) The appearance of the power spectrum for the 2-chloroethyl radical is unaltered regardless of whether the initial C-H excitation is in the CH2 or the CH2Cl group. This result also suggests statistical behavior. These results are interpreted to mean that power spectra may be used as a diagnostic tool to assess the statistical character of a system. The presence of a diffuse spectrum exhibiting a nearly complete loss of isolated structures indicates that the dissociation dynamics of the molecule will
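    The diagnostic itself is easy to reproduce in outline: compute the power spectrum of an internal-coordinate time series from a trajectory and inspect how diffuse or band-like it is. The sketch below uses a synthetic two-frequency signal in place of a real trajectory; the time step and frequencies are assumptions.

    ```python
    import numpy as np

    dt = 0.5e-15                       # assumed time step, s
    t = np.arange(2**14) * dt
    rng = np.random.default_rng(6)
    # Stand-in for a bond-length time series: two vibrational frequencies plus noise.
    x = np.sin(2 * np.pi * 9.0e13 * t) + 0.4 * np.sin(2 * np.pi * 4.5e13 * t)
    x += 0.05 * rng.normal(size=t.size)

    spectrum = np.abs(np.fft.rfft(x - x.mean()))**2
    freqs = np.fft.rfftfreq(t.size, dt)
    print(f"dominant frequency: {freqs[spectrum.argmax()]:.2e} Hz")
    ```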

  9. Caveats for using statistical significance tests in research assessments

    DEFF Research Database (Denmark)

    Schneider, Jesper Wiborg

    2013-01-01

    This article raises concerns about the advantages of using statistical significance tests in research assessments, as has recently been suggested in the debate about proper normalization procedures for citation indicators by Opthof and Leydesdorff (2010). Statistical significance tests are highly controversial and numerous criticisms have been leveled against their use. Based on examples from articles by proponents of the use of statistical significance tests in research assessments, we address some of the numerous problems with such tests. The issues specifically discussed are the ritual practice... We argue that applying statistical significance tests and mechanically adhering to their results are highly problematic and detrimental to critical thinking. We claim that the use of such tests does not provide any advantages in relation to deciding whether differences between citation indicators...

  10. Statistical analysis applied to safety culture self-assessment

    International Nuclear Information System (INIS)

    Macedo Soares, P.P.

    2002-01-01

    Interviews and opinion surveys are instruments used to assess the safety culture in an organization as part of the Safety Culture Enhancement Programme. Specific statistical tools are used to analyse the survey results. This paper presents an example of an opinion survey with the corresponding application of the statistical analysis and the conclusions obtained. Survey validation, frequency statistics, the Kolmogorov-Smirnov non-parametric test, Student's t-test and ANOVA means-comparison tests, and the LSD post-hoc multiple comparison test are discussed. (author)
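    The statistical toolchain named above can be illustrated with a few scipy calls on simulated survey scores (the departments and score distributions below are invented; the LSD post-hoc step is omitted).

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    ops = rng.normal(4.1, 0.5, 40)      # simulated mean Likert scores, operations staff
    maint = rng.normal(3.8, 0.5, 35)    # maintenance staff
    eng = rng.normal(4.0, 0.5, 30)      # engineering staff

    z = (ops - ops.mean()) / ops.std(ddof=1)
    print("Kolmogorov-Smirnov normality p (ops):", stats.kstest(z, "norm").pvalue)
    print("t-test, ops vs maintenance p:        ", stats.ttest_ind(ops, maint).pvalue)
    print("one-way ANOVA across departments p:  ", stats.f_oneway(ops, maint, eng).pvalue)
    ```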

  11. Nuclear power plant insurance - experience and loss statistics

    International Nuclear Information System (INIS)

    Feldmann, J.; Dangelmaier, P.

    1982-01-01

    Nuclear power plants are treated separately when concluding insurance contracts. National insurance pools have been established in industrial countries, co-operating on an international basis, for insuring a nuclear power plant. In combined property insurance, the nuclear risk is combined with the fire risk. In addition, there are the engineering insurances. Of these, the one of significance for nuclear power plants is the machinery insurance, which can be covered on the free insurance market. Nuclear power plants have had fewer instances of damage than other, conventional installations. (orig.)

  12. Environmental Assessment for power marketing policy for Southwestern Power Administration

    Energy Technology Data Exchange (ETDEWEB)

    1993-12-01

    Southwestern Power Administration (Southwestern) needs to renew expiring power sales contracts with new term (10 year) sales contracts. The existing contracts have been in place for several years and many will expire over the next ten years. Southwestern completed an Environmental Assessment on the existing power allocation in June, 1979 (a copy of the EA is attached), and there are no proposed additions of any major new generation resources, service to discrete major new loads, or major changes in operating parameters, beyond those included in the existing power allocation. Impacts from a no action plan, proposed alternative, and market power for less than 10 years are described.

  13. Environmental Assessment for power marketing policy for Southwestern Power Administration

    International Nuclear Information System (INIS)

    1993-01-01

    Southwestern Power Administration (Southwestern) needs to renew expiring power sales contracts with new term (10 year) sales contracts. The existing contracts have been in place for several years and many will expire over the next ten years. Southwestern completed an Environmental Assessment on the existing power allocation in June, 1979 (a copy of the EA is attached), and there are no proposed additions of any major new generation resources, service to discrete major new loads, or major changes in operating parameters, beyond those included in the existing power allocation. Impacts from a no action plan, proposed alternative, and market power for less than 10 years are described

  14. Statistical Power Analysis with Missing Data A Structural Equation Modeling Approach

    CERN Document Server

    Davey, Adam

    2009-01-01

    Statistical power analysis has revolutionized the ways in which we conduct and evaluate research. Similar developments in the statistical analysis of incomplete (missing) data are gaining more widespread application. This volume brings statistical power and incomplete data together under a common framework, in a way that is readily accessible to those with only an introductory familiarity with structural equation modeling. It answers many practical questions, such as: how missing data affect the statistical power of a study, and how much power is likely with different amounts and types

  15. Application of nonparametric statistics to material strength/reliability assessment

    International Nuclear Information System (INIS)

    Arai, Taketoshi

    1992-01-01

    An advanced material technology requires data base on a wide variety of material behavior which need to be established experimentally. It may often happen that experiments are practically limited in terms of reproducibility or a range of test parameters. Statistical methods can be applied to understanding uncertainties in such a quantitative manner as required from the reliability point of view. Statistical assessment involves determinations of a most probable value and the maximum and/or minimum value as one-sided or two-sided confidence limit. A scatter of test data can be approximated by a theoretical distribution only if the goodness of fit satisfies a test criterion. Alternatively, nonparametric statistics (NPS) or distribution-free statistics can be applied. Mathematical procedures by NPS are well established for dealing with most reliability problems. They handle only order statistics of a sample. Mathematical formulas and some applications to engineering assessments are described. They include confidence limits of median, population coverage of sample, required minimum number of a sample, and confidence limits of fracture probability. These applications demonstrate that a nonparametric statistical estimation is useful in logical decision making in the case a large uncertainty exists. (author)
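
    As a hedged illustration of two of the distribution-free calculations described above, the sketch below computes an order-statistic confidence interval for the median and the classical minimum sample size for 95 % population coverage at 95 % confidence. The strength values are made up, not taken from the paper.

    import numpy as np
    from scipy import stats

    x = np.sort(np.array([212., 225., 231., 238., 244., 250., 255., 262.,
                          268., 275., 283., 290., 301., 315., 330.]))  # e.g. strengths (MPa)
    n = len(x)

    # Distribution-free confidence interval for the median: the interval
    # (x_(k), x_(n+1-k)) covers the median with confidence
    # 1 - 2*P[Bin(n, 0.5) <= k-1]; pick the largest k giving at least 95 %.
    target = 0.95
    for k in range(n // 2, 0, -1):
        conf = 1.0 - 2.0 * stats.binom.cdf(k - 1, n, 0.5)
        if conf >= target:
            break
    print(f"median CI: ({x[k-1]:.0f}, {x[n-k]:.0f}) MPa, confidence {conf:.3f}")

    # Minimum sample size so that, with 95 % confidence, the smallest observation
    # lies below the population 5th percentile (the classical 95/95 rule):
    # need 1 - 0.95**n >= 0.95, i.e. n >= log(0.05)/log(0.95), which gives 59.
    n_min = int(np.ceil(np.log(0.05) / np.log(0.95)))
    print(f"minimum sample size for 95/95 coverage: {n_min}")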

  16. Evaluating the statistical power of DNA-based identification, exemplified by 'The missing grandchildren of Argentina'.

    Science.gov (United States)

    Kling, Daniel; Egeland, Thore; Piñero, Mariana Herrera; Vigeland, Magnus Dehli

    2017-11-01

    Methods and implementations of DNA-based identification are well established in several forensic contexts. However, assessing the statistical power of these methods has been largely overlooked, except in the simplest cases. In this paper we outline general methods for such power evaluation, and apply them to a large set of family reunification cases, where the objective is to decide whether a person of interest (POI) is identical to the missing person (MP) in a family, based on the DNA profile of the POI and available family members. As such, this application closely resembles database searching and disaster victim identification (DVI). If parents or children of the MP are available, they will typically provide sufficient statistical evidence to settle the case. However, if one must resort to more distant relatives, it is not a priori obvious that a reliable conclusion is likely to be reached. In these cases power evaluation can be highly valuable, for instance in the recruitment of additional family members. To assess the power in an identification case, we advocate the combined use of two statistics: the Probability of Exclusion, and the Probability of Exceedance. The former is the probability that the genotypes of a random, unrelated person are incompatible with the available family data. If this is close to 1, it is likely that a conclusion will be achieved regarding general relatedness, but not necessarily the specific relationship. To evaluate the ability to recognize a true match, we use simulations to estimate exceedance probabilities, i.e. the probability that the likelihood ratio will exceed a given threshold, assuming that the POI is indeed the MP. All simulations are done conditionally on available family data. Such conditional simulations have a long history in medical linkage analysis, but to our knowledge this is the first systematic forensic genetics application. Also, for forensic markers mutations cannot be ignored and therefore current models and

  17. The power and statistical behaviour of allele-sharing statistics when ...

    Indian Academy of Sciences (India)

    Unknown

    Human Genetics Division, School of Medicine, University of Southampton, Southampton SO16 6YD, UK. Abstract (fragment): ... that the statistic S-#alleles gives good performance for recessive ... (H50) of the families are linked to the single marker ...

  18. Statistical assessment of the learning curves of health technologies.

    Science.gov (United States)

    Ramsay, C R; Grant, A M; Wallace, S A; Garthwaite, P H; Monk, A F; Russell, I T

    2001-01-01

    (1) To describe systematically studies that directly assessed the learning curve effect of health technologies. (2) Systematically to identify 'novel' statistical techniques applied to learning curve data in other fields, such as psychology and manufacturing. (3) To test these statistical techniques in data sets from studies of varying designs to assess health technologies in which learning curve effects are known to exist. METHODS - STUDY SELECTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): For a study to be included, it had to include a formal analysis of the learning curve of a health technology using a graphical, tabular or statistical technique. METHODS - STUDY SELECTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): For a study to be included, it had to include a formal assessment of a learning curve using a statistical technique that had not been identified in the previous search. METHODS - DATA SOURCES: Six clinical and 16 non-clinical biomedical databases were searched. A limited amount of handsearching and scanning of reference lists was also undertaken. METHODS - DATA EXTRACTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): A number of study characteristics were abstracted from the papers such as study design, study size, number of operators and the statistical method used. METHODS - DATA EXTRACTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): The new statistical techniques identified were categorised into four subgroups of increasing complexity: exploratory data analysis; simple series data analysis; complex data structure analysis, generic techniques. METHODS - TESTING OF STATISTICAL METHODS: Some of the statistical methods identified in the systematic searches for single (simple) operator series data and for multiple (complex) operator series data were illustrated and explored using three data sets. The first was a case series of 190 consecutive laparoscopic fundoplication procedures performed by a single surgeon; the second

  19. statistical analysis of wind speed for electrical power generation

    African Journals Online (AJOL)

    HOD

    In order to predict and model the potential of any site, ... gamma, and Rayleigh distributions for 8 locations in Nigeria. ... probability density function is used to model the average power in ... mathematical expression of the Weibull distribution is ...

  20. statistical analysis of wind speed for electrical power generation

    African Journals Online (AJOL)

    HOD

    sites are suitable for the generation of electrical energy. Also, the results ... Nigerian Journal of Technology (NIJOTECH), Vol. 36, No. ... parameter in the wind-power generation system ...

  1. Fundamentals of modern statistical methods substantially improving power and accuracy

    CERN Document Server

    Wilcox, Rand R

    2001-01-01

    Conventional statistical methods have a very serious flaw: they routinely miss differences among groups or associations among variables that are detected by more modern techniques, even under very small departures from normality. Hundreds of journal articles have described the reasons standard techniques can be unsatisfactory, but simple, intuitive explanations are generally unavailable. Improved methods have been derived, but they are far from obvious or intuitive based on the training most researchers receive. Situations arise where even highly nonsignificant results become significant when analyzed with more modern methods. Without assuming any prior training in statistics, Part I of this book describes basic statistical principles from a point of view that makes their shortcomings intuitive and easy to understand. The emphasis is on verbal and graphical descriptions of concepts. Part II describes modern methods that address the problems covered in Part I. Using data from actual studies, many examples are included...

  2. Enrichment of statistical power for genome-wide association studies

    Science.gov (United States)

    The inheritance of most human diseases and agriculturally important traits is controlled by many genes with small effects. Identifying these genes, while simultaneously controlling false positives, is challenging. Among available statistical methods, the mixed linear model (MLM) has been the most fl...

  3. The application of statistical methods to assess economic assets

    Directory of Open Access Journals (Sweden)

    D. V. Dianov

    2017-01-01

    The article is devoted to the consideration and evaluation of machinery, equipment and special equipment, to methodological aspects of the use of standards for the assessment of buildings and structures in current prices, to the valuation of residential and specialized houses and office premises, to the assessment and reassessment of existing and inactive military assets, and to the application of statistical methods to obtain the relevant cost estimates. The objective of the article is to consider the possible application of statistical tools in the valuation of the assets composing the core group of elements of national wealth, the fixed assets. Capital tangible assets constitute the material basis for the creation of new value, products and non-financial services, and the accumulation of tangible capital assets is part of the gross domestic product; from its volume and share in GDP one can judge the scope of reproductive processes in the country. Based on the methodological materials of the state statistics bodies of the Russian Federation and on the theory of statistics, which describes methods of statistical analysis such as indices, averages and regression, a methodical approach is structured for applying statistical tools to obtain value estimates of property, plant and equipment with significant accumulated depreciation. Until now, the use of statistical methodology in the practice of economic assessment of assets has been only fragmentary. This applies both to federal legislation (Federal Law No. 135 «On valuation activities in the Russian Federation» dated 16.07.1998, as amended 05.07.2016) and to the methodological documents and regulations of valuation activities, in particular the valuation standards. A particular problem is the use of the digital database of Rosstat (Federal State Statistics Service), as for specific fixed assets the comparison should be carried

  4. In vivo Comet assay--statistical analysis and power calculations of mice testicular cells.

    Science.gov (United States)

    Hansen, Merete Kjær; Sharma, Anoop Kumar; Dybdahl, Marianne; Boberg, Julie; Kulahci, Murat

    2014-11-01

    The in vivo Comet assay is a sensitive method for evaluating DNA damage. A recurrent concern is how to analyze the data appropriately and efficiently. A popular approach is to summarize the raw data into a summary statistic prior to the statistical analysis. However, consensus on which summary statistic to use has yet to be reached. Another important consideration concerns the assessment of proper sample sizes in the design of Comet assay studies. This study aims to identify a statistic suitably summarizing the % tail DNA of mice testicular samples in Comet assay studies. A second aim is to provide curves for this statistic outlining the number of animals and gels to use. The current study was based on 11 compounds administered via oral gavage in three doses to male mice: CAS no. 110-26-9, CAS no. 512-56-1, CAS no. 111873-33-7, CAS no. 79-94-7, CAS no. 115-96-8, CAS no. 598-55-0, CAS no. 636-97-5, CAS no. 85-28-9, CAS no. 13674-87-8, CAS no. 43100-38-5 and CAS no. 60965-26-6. Testicular cells were examined using the alkaline version of the Comet assay and the DNA damage was quantified as % tail DNA using a fully automatic scoring system. From the raw data 23 summary statistics were examined. A linear mixed-effects model was fitted to the summarized data and the estimated variance components were used to generate power curves as a function of sample size. The statistic that most appropriately summarized the within-sample distributions was the median of the log-transformed data, as it most consistently conformed to the assumptions of the statistical model. Power curves for 1.5-, 2-, and 2.5-fold changes of the highest dose group compared to the control group when 50 and 100 cells were scored per gel are provided to aid in the design of future Comet assay studies on testicular cells. Copyright © 2014 Elsevier B.V. All rights reserved.
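
    The sketch below is a much-simplified power-by-simulation exercise in the spirit of the study: per-animal medians of log-transformed % tail DNA compared between a control and a treated group with a t-test, rather than the paper's linear mixed-effects model. All variance components, the effect size and the group sizes are invented.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)

    def simulate_power(n_animals, fold_change, n_gels=2, n_cells=100,
                       sd_animal=0.25, sd_cell=0.60, base_log=np.log(5.0),
                       n_sim=500, alpha=0.05):
        hits = 0
        for _ in range(n_sim):
            summaries = []          # per-animal medians of log(% tail DNA)
            for group, shift in [("control", 0.0), ("treated", np.log(fold_change))]:
                animal_eff = rng.normal(0.0, sd_animal, n_animals)
                med = []
                for a in range(n_animals):
                    logs = base_log + shift + animal_eff[a] + \
                           rng.normal(0.0, sd_cell, size=n_gels * n_cells)
                    med.append(np.median(logs))
                summaries.append(np.array(med))
            _, p = stats.ttest_ind(summaries[0], summaries[1])
            hits += (p < alpha)
        return hits / n_sim

    for n in (3, 5, 7, 10):
        print(n, "animals/group, 2-fold change: power =", simulate_power(n, 2.0))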

  5. A Statistical Method for Aggregated Wind Power Plants to Provide Secondary Frequency Control

    DEFF Research Database (Denmark)

    Hu, Junjie; Ziras, Charalampos; Bindner, Henrik W.

    2017-01-01

    The increasing penetration of wind power brings significant challenges to power system operators due to the wind’s inherent uncertainty and variability. Traditionally, power plants and more recently demand response have been used to balance the power system. However, the use of wind power as a balancing-power source has also been investigated, especially for wind power dominated power systems such as Denmark. The main drawback is that wind power must be curtailed by setting a lower operating point in order to offer upward regulation. We propose a statistical approach to reduce wind power curtailment for aggregated wind power plants providing secondary frequency control (SFC) to the power system. By using historical SFC signals and wind speed data, we calculate metrics for the reserve provision error as a function of the scheduled wind power. We show that wind curtailment can be significantly...

  6. Important statistics on engineering and construction of nuclear power plants

    International Nuclear Information System (INIS)

    Budwani, R.N.

    1976-01-01

    During the past seven years, a study was made of the engineering and craft manpower/manhour requirements, craft breakdowns by totals and peaks, material requirements, unit man-hours, rate of manhour/capital expenditures, and schedule requirements of representative nuclear power plants across the United States. The study is based on information received from electric utilities, engineer-constructors, site visits, the Nuclear Regulatory Commission (NRC), personal contacts, and the exchange of information with knowledgeable people. Preliminary data in the form of tables and figures are presented. Factors which have and will influence manpower, manhours, material requirements, building volumes, and schedules are outlined, and a list of recommendations is presented. The objective of this study has been to show in a concise fashion what the trend has been and what may be anticipated for future nuclear power plants

  7. Risk assessments ensure safer power

    Energy Technology Data Exchange (ETDEWEB)

    1982-02-19

    A growth industry is emerging devoted to the study and comparison of the economic, social and health risks posed by large industrial installations. Electricity generation is one area coming under particularly close scrutiny. Types of risk, ways of assessing risk and the difference between experts' analyses and the public perception of risk are given. An example of improved risk assessment helping to reduce deaths and injuries in coal mining is included.

  8. Statistical methods for assessing agreement between continuous measurements

    DEFF Research Database (Denmark)

    Sokolowski, Ineta; Hansen, Rikke Pilegaard; Vedsted, Peter

    Background: Clinical research often involves study of agreement amongst observers. Agreement can be measured in different ways, and one can obtain quite different values depending on which method one uses. Objective: We review the approaches that have been discussed to assess the agreement between continuous measures and discuss their strengths and weaknesses. Different methods are illustrated using actual data from the `Delay in diagnosis of cancer in general practice´ project in Aarhus, Denmark. Subjects and Methods: We use the weighted kappa statistic, intraclass correlation coefficient (ICC), concordance coefficient, Bland-Altman limits of agreement and percentage of agreement to assess the agreement between patient-reported delay and doctor-reported delay in diagnosis of cancer in general practice. Key messages: The correct statistical approach is not obvious. Many studies give the product...
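
    For two of the measures listed above, the following sketch computes Bland-Altman limits of agreement and Lin's concordance correlation coefficient on simulated paired delays; it does not reproduce the project data, the weighted kappa or the ICC analyses.

    import numpy as np

    rng = np.random.default_rng(7)
    true_delay = rng.gamma(shape=2.0, scale=20.0, size=80)     # delay in days
    patient = true_delay + rng.normal(0, 10, 80)               # patient-reported
    doctor = true_delay + rng.normal(5, 10, 80)                # doctor-reported (simulated offset)

    # Bland-Altman: mean difference (bias) and 95 % limits of agreement.
    diff = patient - doctor
    bias = diff.mean()
    loa = (bias - 1.96 * diff.std(ddof=1), bias + 1.96 * diff.std(ddof=1))

    # Lin's concordance correlation coefficient.
    sxy = np.cov(patient, doctor, ddof=1)[0, 1]
    ccc = 2 * sxy / (patient.var(ddof=1) + doctor.var(ddof=1)
                     + (patient.mean() - doctor.mean()) ** 2)

    print(f"bias = {bias:.1f} days, 95% limits of agreement = "
          f"({loa[0]:.1f}, {loa[1]:.1f}) days")
    print(f"Lin's concordance correlation coefficient = {ccc:.2f}")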

  9. Wind power statistics and an evaluation of wind energy density

    Energy Technology Data Exchange (ETDEWEB)

    Jamil, M.; Parsa, S.; Majidi, M. [Materials and Energy Research Centre, Tehran (Iran, Islamic Republic of)

    1995-11-01

    In this paper the statistical data of fifty days' wind speed measurements at the MERC solar site are used to find out the wind energy density and other wind characteristics with the help of the Weibull probability distribution function. It is emphasized that the Weibull and Rayleigh probability functions are useful tools for wind energy density estimation but are not quite appropriate for properly fitting the actual wind data of low mean speed, short-time records. One has to use either the actual wind data (histogram) or look for a better fit by other models of the probability function. (Author)
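
    A minimal sketch of the Weibull-based energy-density calculation described above; the wind-speed sample is synthetic, not the MERC site data.

    import numpy as np
    from scipy import stats
    from scipy.special import gamma

    rng = np.random.default_rng(2)
    v = rng.weibull(2.0, size=1200) * 6.0          # synthetic 10-min mean speeds (m/s)

    # Fit a two-parameter Weibull (location fixed at zero): shape k, scale c.
    k, _, c = stats.weibull_min.fit(v, floc=0.0)

    rho = 1.225                                    # air density (kg/m^3)
    # Mean wind power density: E = 0.5*rho*E[v^3] = 0.5*rho*c^3*Gamma(1 + 3/k)
    p_weibull = 0.5 * rho * c**3 * gamma(1.0 + 3.0 / k)
    p_data = 0.5 * rho * np.mean(v**3)             # directly from the data/histogram

    print(f"k = {k:.2f}, c = {c:.2f} m/s")
    print(f"power density (Weibull fit): {p_weibull:.1f} W/m^2")
    print(f"power density (raw data):    {p_data:.1f} W/m^2")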

  10. Powerful Inference with the D-Statistic on Low-Coverage Whole-Genome Data.

    Science.gov (United States)

    Soraggi, Samuele; Wiuf, Carsten; Albrechtsen, Anders

    2018-02-02

    The detection of ancient gene flow between human populations is an important issue in population genetics. A common tool for detecting ancient admixture events is the D-statistic. The D-statistic is based on the hypothesis of a genetic relationship that involves four populations, whose correctness is assessed by evaluating specific coincidences of alleles between the groups. When working with high-throughput sequencing data, calling genotypes accurately is not always possible; therefore, the D-statistic currently samples a single base from the reads of one individual per population. This implies ignoring much of the information in the data, an issue especially striking in the case of ancient genomes. We provide a significant improvement to overcome the problems of the D-statistic by considering all reads from multiple individuals in each population. We also apply type-specific error correction to combat the problems of sequencing errors, and show a way to correct for introgression from an external population that is not part of the supposed genetic relationship, and how this leads to an estimate of the admixture rate. We prove that the D-statistic is approximated by a standard normal distribution. Furthermore, we show that our method outperforms the traditional D-statistic in detecting admixtures. The power gain is most pronounced for low and medium sequencing depth (1-10×), and performances are as good as with perfectly called genotypes at a sequencing depth of 2×. We show the reliability of error correction in scenarios with simulated errors and ancient data, and correct for introgression in known scenarios to estimate the admixture rates. Copyright © 2018 Soraggi et al.
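
    For orientation, the toy sketch below computes the classical frequency-based D-statistic with a block jackknife Z-score; the allele frequencies are simulated, and the read-based, error-corrected estimator proposed in the paper is not reproduced here.

    import numpy as np

    rng = np.random.default_rng(3)
    n_sites = 50_000
    # Derived-allele frequencies in populations P1, P2, P3 and an outgroup P4.
    p1 = rng.beta(0.5, 0.5, n_sites)
    p2 = np.clip(p1 + rng.normal(0, 0.05, n_sites), 0, 1)   # P2 close to P1
    p3 = rng.beta(0.5, 0.5, n_sites)
    p4 = np.zeros(n_sites)                                   # outgroup fixed ancestral

    abba = (1 - p1) * p2 * p3 * (1 - p4)
    baba = p1 * (1 - p2) * p3 * (1 - p4)
    D = (abba.sum() - baba.sum()) / (abba.sum() + baba.sum())

    # Delete-one block jackknife over contiguous blocks for a standard error.
    n_blocks = 50
    blocks = np.array_split(np.arange(n_sites), n_blocks)
    d_del = []
    for b in blocks:
        keep = np.ones(n_sites, dtype=bool)
        keep[b] = False
        d_del.append((abba[keep].sum() - baba[keep].sum()) /
                     (abba[keep].sum() + baba[keep].sum()))
    d_del = np.array(d_del)
    se = np.sqrt((n_blocks - 1) / n_blocks * ((d_del - d_del.mean()) ** 2).sum())
    print(f"D = {D:.4f}, Z = {D / se:.2f}")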

  11. Reliability assessment for safety critical systems by statistical random testing

    International Nuclear Information System (INIS)

    Mills, S.E.

    1995-11-01

    In this report we present an overview of reliability assessment for software and focus on some basic aspects of assessing reliability for safety critical systems by statistical random testing. We also discuss possible deviations from some essential assumptions on which the general methodology is based. These deviations appear quite likely in practical applications. We present and discuss possible remedies and adjustments and then undertake applying this methodology to a portion of the SDS1 software. We also indicate shortcomings of the methodology and possible avenues to follow to address these problems. (author). 128 refs., 11 tabs., 31 figs

  12. Reliability assessment for safety critical systems by statistical random testing

    Energy Technology Data Exchange (ETDEWEB)

    Mills, S E [Carleton Univ., Ottawa, ON (Canada). Statistical Consulting Centre

    1995-11-01

    In this report we present an overview of reliability assessment for software and focus on some basic aspects of assessing reliability for safety critical systems by statistical random testing. We also discuss possible deviations from some essential assumptions on which the general methodology is based. These deviations appear quite likely in practical applications. We present and discuss possible remedies and adjustments and then undertake applying this methodology to a portion of the SDS1 software. We also indicate shortcomings of the methodology and possible avenues to follow to address these problems. (author). 128 refs., 11 tabs., 31 figs.

  13. Statistical Power of Psychological Research: What Have We Gained in 20 Years?

    Science.gov (United States)

    Rossi, Joseph S.

    1990-01-01

    Calculated power for 6,155 statistical tests in 221 journal articles published in 1982 volumes of "Journal of Abnormal Psychology,""Journal of Consulting and Clinical Psychology," and "Journal of Personality and Social Psychology." Power to detect small, medium, and large effects was .17, .57, and .83, respectively. Concluded that power of…
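
    The kind of calculation summarized above can be reproduced under assumed inputs; the sketch below computes the power of a two-sided two-sample t-test at alpha = .05 for Cohen's small, medium and large effects with an illustrative group size of 50 (the design and sample size are assumptions, not taken from the article).

    from statsmodels.stats.power import TTestIndPower

    analysis = TTestIndPower()
    for label, d in [("small", 0.2), ("medium", 0.5), ("large", 0.8)]:
        power = analysis.power(effect_size=d, nobs1=50, alpha=0.05,
                               ratio=1.0, alternative="two-sided")
        print(f"{label:6s} effect (d={d}): power = {power:.2f}")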

  14. Statistical utility theory for comparison of nuclear versus fossil power plant alternatives

    International Nuclear Information System (INIS)

    Garribba, S.; Ovi, A.

    1977-01-01

    A statistical formulation of utility theory is developed for decision problems concerned with the choice among alternative strategies in electric energy production. Four alternatives are considered: nuclear power, fossil power, solar energy, and conservation policy. Attention is focused on a public electric utility thought of as a rational decision-maker. A framework for decisions is then suggested where the admissible strategies and their possible consequences represent the information available to the decision-maker. Once the objectives of the decision process are assessed, consequences can be quantified in terms of measures of effectiveness. Maximum expected utility is the criterion of choice among alternatives. Steps toward expected values are the evaluation of the multidimensional utility function and the assessment of subjective probabilities for consequences. In this respect, the multiplicative form of the utility function seems less restrictive than the additive form and almost as manageable to implement. Probabilities are expressed through subjective marginal probability density functions given at a discrete number of points. The final stage of the decision model is to establish the value of each strategy. To this scope, expected utilities are computed and scaled. The result is that nuclear power offers the best alternative. 8 figures, 9 tables, 32 references
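
    As a hedged illustration of the multiplicative utility form mentioned above, the sketch below aggregates single-attribute utilities and computes expected utilities over subjective scenario probabilities; the attributes, scaling constants, utilities and probabilities are all invented, not the paper's assessments.

    import numpy as np
    from scipy.optimize import brentq

    k = np.array([0.40, 0.30, 0.20])        # scaling constants (sum != 1 -> multiplicative form)

    # Master constant K solves 1 + K = prod(1 + K*k_i), with K > -1 and K != 0.
    f = lambda K: np.prod(1.0 + K * k) - (1.0 + K)
    K = brentq(f, 1e-6, 50.0)               # sum(k) < 1, so K is positive

    def utility(u):
        """Multiplicative aggregation of single-attribute utilities u_i in [0, 1]."""
        return (np.prod(1.0 + K * k * np.asarray(u)) - 1.0) / K

    # Two hypothetical strategies, each with three scenarios (subjective
    # probabilities) and single-attribute utilities for cost/impact/reliability.
    strategies = {
        "nuclear": ([0.5, 0.3, 0.2], [[0.9, 0.6, 0.8], [0.7, 0.6, 0.8], [0.4, 0.5, 0.7]]),
        "fossil":  ([0.6, 0.3, 0.1], [[0.7, 0.4, 0.9], [0.6, 0.4, 0.8], [0.5, 0.3, 0.6]]),
    }
    for name, (probs, outcomes) in strategies.items():
        eu = sum(p * utility(u) for p, u in zip(probs, outcomes))
        print(f"{name}: expected utility = {eu:.3f}")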

  15. The issue of statistical power for overall model fit in evaluating structural equation models

    Directory of Open Access Journals (Sweden)

    Richard HERMIDA

    2015-06-01

    Statistical power is an important concept for psychological research. However, examining the power of a structural equation model (SEM) is rare in practice. This article provides an accessible review of the concept of statistical power for the Root Mean Square Error of Approximation (RMSEA) index of overall model fit in structural equation modeling. By way of example, we examine the current state of power in the literature by reviewing studies in top Industrial-Organizational (I/O) Psychology journals using SEMs. Results indicate that in many studies power is very low, which implies acceptance of invalid models. Additionally, we examined methodological situations which may have an influence on the statistical power of SEMs. Results showed that power varies significantly as a function of model type and whether or not the model is the main model for the study. Finally, results indicated that power is significantly related to the model fit statistics used in evaluating SEMs. The results from this quantitative review imply that researchers should be more vigilant with respect to power in structural equation modeling. We therefore conclude by offering methodological best practices to increase confidence in the interpretation of structural equation modeling results with respect to statistical power issues.
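
    A common way to compute RMSEA-based power is the MacCallum, Browne and Sugawara (1996) noncentral chi-square approach, which underlies much RMSEA power work; the sketch below implements that calculation for the test of close fit with illustrative degrees of freedom, sample sizes and RMSEA values (these inputs are assumptions, not figures from the article).

    from scipy.stats import ncx2

    def rmsea_power(n, df, rmsea0=0.05, rmsea_a=0.08, alpha=0.05):
        """Power to reject H0: RMSEA <= rmsea0 when the true RMSEA is rmsea_a."""
        ncp0 = (n - 1) * df * rmsea0**2       # noncentrality under H0
        ncp_a = (n - 1) * df * rmsea_a**2     # noncentrality under the alternative
        crit = ncx2.ppf(1.0 - alpha, df, ncp0)
        return 1.0 - ncx2.cdf(crit, df, ncp_a)

    for n in (100, 200, 400, 800):
        print(f"N = {n:4d}, df = 50: power = {rmsea_power(n, df=50):.2f}")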

  16. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses.

    Science.gov (United States)

    Dinov, Ivo D; Sanchez, Juana; Christou, Nicolas

    2008-01-01

    Technology-based instruction represents a recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools of communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present the results on the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We have used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment per individual

  17. Causality in Statistical Power: Isomorphic Properties of Measurement, Research Design, Effect Size, and Sample Size

    Directory of Open Access Journals (Sweden)

    R. Eric Heidel

    2016-01-01

    Statistical power is the ability to detect a significant effect, given that the effect actually exists in a population. Like most statistical concepts, statistical power tends to induce cognitive dissonance in hepatology researchers. However, planning for statistical power by an a priori sample size calculation is of paramount importance when designing a research study. There are five specific empirical components that make up an a priori sample size calculation: the scale of measurement of the outcome, the research design, the magnitude of the effect size, the variance of the effect size, and the sample size. A framework grounded in the phenomenon of isomorphism, or interdependencies amongst different constructs with similar forms, will be presented to understand the isomorphic effects of decisions made on each of the five aforementioned components of statistical power.

  18. Assessing Capacity Value of Wind Power

    Energy Technology Data Exchange (ETDEWEB)

    Frew, Bethany A.

    2017-04-18

    This presentation provides a high-level overview of assessing the capacity value of wind power, including impacts of multiple-year data sets, impacts of transmission assumptions, and future research needs.

  19. Statistical considerations of graphite strength for assessing design allowable stresses

    International Nuclear Information System (INIS)

    Ishihara, M.; Mogi, H.; Ioka, I.; Arai, T.; Oku, T.

    1987-01-01

    Several aspects of statistics need to be considered to determine design allowable stresses for graphite structures. These include: 1) statistical variation of graphite material strength; 2) uncertainty of the calculated stress; 3) reliability (survival probability) required from the operational and safety performance of graphite structures. This paper deals with some statistical considerations of structural graphite for assessing design allowable stress. Firstly, probability distribution functions of tensile and compressive strengths are investigated on experimental Very High Temperature candidate graphites. Normal, logarithmic normal and Weibull distribution functions are compared in terms of the coefficient of correlation to measured strength data. This leads to the adoption of the normal distribution function. Then, the relation between factor of safety and fracture probability is discussed on the following items: 1) as graphite strength is more variable than the strength of metallic materials, the effect of strength variation on the fracture probability is evaluated; 2) fracture probability depending on a survival probability of 99 ∼ 99.9 (%) with a confidence level of 90 ∼ 95 (%) is discussed; 3) as the material properties used in the design analysis are usually the mean values of their variation, the additional effect of these variations on the fracture probability is discussed. Finally, the way to assure the minimum ultimate strength with the required survival probability and confidence level is discussed in view of the statistical treatment of strength data from varying sample numbers in a material acceptance test. (author)
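
    One standard way to assure a minimum strength with a required survival probability and confidence level for normally distributed data is a one-sided tolerance limit based on the noncentral t distribution; the sketch below shows that calculation on synthetic strength data (the data and the 99/95 choice are illustrative assumptions, not values from the paper).

    import numpy as np
    from scipy.stats import norm, nct

    rng = np.random.default_rng(4)
    strength = rng.normal(30.0, 3.0, size=25)      # synthetic tensile strengths (MPa)
    n, xbar, s = len(strength), strength.mean(), strength.std(ddof=1)

    coverage, confidence = 0.99, 0.95              # 99 % survival, 95 % confidence
    # One-sided tolerance factor via the noncentral t distribution.
    k = nct.ppf(confidence, df=n - 1, nc=norm.ppf(coverage) * np.sqrt(n)) / np.sqrt(n)
    lower_limit = xbar - k * s
    print(f"n = {n}, mean = {xbar:.1f} MPa, s = {s:.1f} MPa, k = {k:.2f}")
    print(f"99/95 lower tolerance limit: {lower_limit:.1f} MPa")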

  20. A statistical proposal for environmental impact assessment of development projects

    International Nuclear Information System (INIS)

    Plazas C, Julian A; De J Lema T, Alvaro; Leon P, Juan Diego

    2009-01-01

    Environmental impact assessment of development projects is a fundamental process whose main goal is to avoid that their construction and operation lead to serious negative consequences for the environment. Some of the most important limitations of the models employed to assess environmental impacts are the subjectivity of their parameters and weights, and the multicollinearity among the variables, which represent large quantities of similar information. This work presents a multivariate statistics-based method that tries to diminish such limitations. For this purpose, environmental impact is evaluated through different environmental impact attributes and environmental elements, synthesized in an environmental quality index (ICA in Spanish). ICA can be applied at different levels, such as at the project level, or only at a partial level on one or a few environmental components.

  1. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the

  2. Safety assessment of emergency power systems for nuclear power plants

    International Nuclear Information System (INIS)

    1992-01-01

    This publication is intended to assist the safety assessor within a regulatory body, or one working as a consultant, in assessing the safety of a given design of the emergency power systems (EPS) for a nuclear power plant. The present publication refers closely to the NUSS Safety Guide 50-SG-D7 (Rev. 1), Emergency Power Systems at Nuclear Power Plants. It covers therefore exactly the same technical subject as that Safety Guide. In view of its objective, however, it attempts to help in the evaluation of possible technical solutions which are intended to fulfill the safety requirements. Section 2 clarifies the scope further by giving an outline of the assessment steps in the licensing process. After a general outline of the assessment process in relation to the licensing of a nuclear power plant, the publication is divided into two parts. First, all safety issues are presented in the form of questions that have to be answered in order for the assessor to be confident of a safe design. The second part presents the same topics in tabulated form, listing the required documentation which the assessor has to consult and those international and national technical standards pertinent to the topics. An extensive reference list provides information on standards. 1 tab

  3. Statistical models for thermal ageing of steel materials in nuclear power plants

    International Nuclear Information System (INIS)

    Persoz, M.

    1996-01-01

    Some categories of steel materials in nuclear power plants may be subjected to thermal ageing, whose extent depends on the steel chemical composition and the ageing parameters, i.e. temperature and duration. This ageing affects the 'impact strength' of the materials, which is a mechanical property. In order to assess the residual lifetime of these components, a probabilistic study has been launched, which takes into account the scatter over the input parameters of the mechanical model. Predictive formulae for estimating the impact strength of aged materials are important input data of the model. A data base has been created with impact strength results obtained from an ageing program in the laboratory, and statistical treatments have been undertaken. Two kinds of model have been developed, with non-linear regression methods (PROC NLIN, available in SAS/STAT). The first one, using a hyperbolic tangent function, is partly based on physical considerations, and the second one, of an exponential type, is purely statistically built. The difficulties consist in selecting the significant parameters and attributing initial values to the coefficients, which is a requirement of the NLIN procedure. This global statistical analysis has led to general models that are functions of the chemical variables and the ageing parameters. These models are at least as precise as the local models that had been developed earlier for some specific values of ageing temperature and ageing duration. This paper describes the data and the methodology used to build the models and analyses the results given by the SAS system. (author)

  4. GWAPower: a statistical power calculation software for genome-wide association studies with quantitative traits.

    Science.gov (United States)

    Feng, Sheng; Wang, Shengchu; Chen, Chia-Cheng; Lan, Lan

    2011-01-21

    In designing genome-wide association (GWA) studies it is important to calculate statistical power. General statistical power calculation procedures for quantitative measures often require information concerning summary statistics of distributions such as mean and variance. However, with genetic studies, the effect size of quantitative traits is traditionally expressed as heritability, a quantity defined as the amount of phenotypic variation in the population that can be ascribed to the genetic variants among individuals. Heritability is hard to transform into summary statistics. Therefore, general power calculation procedures cannot be used directly in GWA studies. The development of appropriate statistical methods and a user-friendly software package to address this problem would be welcomed. This paper presents GWAPower, a statistical software package of power calculation designed for GWA studies with quantitative traits, where genetic effect is defined as heritability. Based on several popular one-degree-of-freedom genetic models, this method avoids the need to specify the non-centrality parameter of the F-distribution under the alternative hypothesis. Therefore, it can use heritability information directly without approximation. In GWAPower, the power calculation can be easily adjusted for adding covariates and linkage disequilibrium information. An example is provided to illustrate GWAPower, followed by discussions. GWAPower is a user-friendly free software package for calculating statistical power based on heritability in GWA studies with quantitative traits. The software is freely available at: http://dl.dropbox.com/u/10502931/GWAPower.zip.
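
    The sketch below is a simplified, textbook-style approximation of the kind of calculation GWAPower automates: power for a single variant explaining a fraction h2 of the phenotypic variance, via a 1-df noncentral chi-square with noncentrality roughly n*h2/(1-h2) at a genome-wide threshold. It is not the GWAPower implementation, and the h2, n and alpha values are illustrative.

    from scipy.stats import chi2, ncx2

    def gwas_power(n, h2, alpha=5e-8):
        """Approximate power for one quantitative-trait variant with heritability h2."""
        ncp = n * h2 / (1.0 - h2)
        crit = chi2.ppf(1.0 - alpha, df=1)
        return 1.0 - ncx2.cdf(crit, df=1, nc=ncp)

    for h2 in (0.001, 0.005, 0.01):
        for n in (2_000, 5_000, 10_000):
            print(f"h2 = {h2:.3f}, n = {n:6d}: power = {gwas_power(n, h2):.2f}")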

  5. Combining heuristic and statistical techniques in landslide hazard assessments

    Science.gov (United States)

    Cepeda, Jose; Schwendtner, Barbara; Quan, Byron; Nadim, Farrokh; Diaz, Manuel; Molina, Giovanni

    2014-05-01

    As a contribution to the Global Assessment Report 2013 - GAR2013, coordinated by the United Nations International Strategy for Disaster Reduction - UNISDR, a drill-down exercise for landslide hazard assessment was carried out by entering the results of both heuristic and statistical techniques into a new but simple combination rule. The data available for this evaluation included landslide inventories, both historical and event-based. In addition to the application of a heuristic method used in the previous editions of GAR, the availability of inventories motivated the use of statistical methods. The heuristic technique is largely based on the Mora & Vahrson method, which estimates hazard as the product of susceptibility and triggering factors, where classes are weighted based on expert judgment and experience. Two statistical methods were also applied: the landslide index method, which estimates weights of the classes for the susceptibility and triggering factors based on the evidence provided by the density of landslides in each class of the factors; and the weights of evidence method, which extends the previous technique to include both positive and negative evidence of landslide occurrence in the estimation of weights for the classes. One key aspect during the hazard evaluation was the decision on the methodology to be chosen for the final assessment. Instead of opting for a single methodology, it was decided to combine the results of the three implemented techniques using a combination rule based on a normalization of the results of each method. The hazard evaluation was performed for both earthquake- and rainfall-induced landslides. The country chosen for the drill-down exercise was El Salvador. The results indicate that highest hazard levels are concentrated along the central volcanic chain and at the centre of the northern mountains.
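
    A minimal sketch of the weights-of-evidence idea described above, for a single binary predisposing factor; the landslide inventory and factor map below are random stand-ins, not the El Salvador data, and the multi-method combination rule used in the exercise is not reproduced.

    import numpy as np

    rng = np.random.default_rng(5)
    n_cells = 100_000
    factor = rng.random(n_cells) < 0.30                     # e.g. "steep slope" cells
    landslide = rng.random(n_cells) < np.where(factor, 0.02, 0.005)

    def weights(factor, landslide):
        # W+ / W-: log ratios of the conditional probabilities of the factor
        # given landslide presence and absence (Bonham-Carter formulation).
        w_plus = np.log((factor & landslide).sum() / landslide.sum()
                        / ((factor & ~landslide).sum() / (~landslide).sum()))
        w_minus = np.log((~factor & landslide).sum() / landslide.sum()
                         / ((~factor & ~landslide).sum() / (~landslide).sum()))
        return w_plus, w_minus

    w_plus, w_minus = weights(factor, landslide)
    print(f"W+ = {w_plus:.2f}, W- = {w_minus:.2f}, contrast C = {w_plus - w_minus:.2f}")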

  6. Statistical modelling of space-time processes with application to wind power

    DEFF Research Database (Denmark)

    Lenzi, Amanda

    This thesis aims at contributing to the wind power literature by building and evaluating new statistical techniques for producing forecasts at multiple locations and lead times using spatio-temporal information. By exploring the features of a rich portfolio of wind farms in western Denmark, we investigate ... We propose spatial models for predicting wind power generation at two different time scales: for annual average wind power generation and for a high temporal resolution (typically wind power averages over 15-min time steps). In both cases, we use a spatial hierarchical statistical model in which spatial...

  7. Empirical Statistical Power for Testing Multilocus Genotypic Effects under Unbalanced Designs Using a Gibbs Sampler

    Directory of Open Access Journals (Sweden)

    Chaeyoung Lee

    2012-11-01

    Epistasis, which may explain a large portion of the phenotypic variation for complex economic traits of animals, has been ignored in many genetic association studies. A Bayesian method was introduced to draw inferences about multilocus genotypic effects based on their marginal posterior distributions obtained by a Gibbs sampler. A simulation study was conducted to provide statistical powers under various unbalanced designs using this method. Data were simulated by combined designs of number of loci, within-genotype variance, and sample size in unbalanced designs with or without null combined-genotype cells. Mean empirical statistical power was estimated for testing the posterior mean estimate of the combined genotype effect. A practical example of obtaining empirical statistical power estimates with a given sample size is provided under unbalanced designs. The empirical statistical powers would be useful for determining an optimal design when interactive associations of multiple loci with complex phenotypes are examined.

  8. On the statistical assessment of classifiers using DNA microarray data

    Directory of Open Access Journals (Sweden)

    Carella M

    2006-08-01

    Background: In this paper we present a method for the statistical assessment of cancer predictors which make use of gene expression profiles. The methodology is applied to a new data set of microarray gene expression data collected in Casa Sollievo della Sofferenza Hospital, Foggia, Italy. The data set is made up of normal (22) and tumor (25) specimens extracted from 25 patients affected by colon cancer. We propose to give answers to some questions which are relevant for the automatic diagnosis of cancer, such as: Is the size of the available data set sufficient to build accurate classifiers? What is the statistical significance of the associated error rates? In what ways can accuracy be considered dependent on the adopted classification scheme? How many genes are correlated with the pathology and how many are sufficient for an accurate colon cancer classification? The method we propose answers these questions whilst avoiding the potential pitfalls hidden in the analysis and interpretation of microarray data. Results: We estimate the generalization error, evaluated through the Leave-K-Out Cross Validation error, for three different classification schemes by varying the number of training examples and the number of genes used. The statistical significance of the error rate is measured by using a permutation test. We provide a statistical analysis in terms of the frequencies of the genes involved in the classification. Using the whole set of genes, we found that the Weighted Voting Algorithm (WVA) classifier learns the distinction between normal and tumor specimens with 25 training examples, providing e = 21% (p = 0.045) as an error rate. This remains constant even when the number of examples increases. Moreover, Regularized Least Squares (RLS) and Support Vector Machines (SVM) classifiers can learn with only 15 training examples, with error rates of e = 19% (p = 0.035) and e = 18% (p = 0.037), respectively. Moreover, the error rate

  9. Nuclear power plant training simulator fidelity assessment

    International Nuclear Information System (INIS)

    Carter, R.J.; Laughery, K.R.

    1985-01-01

    The fidelity assessment portion of a methodology for evaluating nuclear power plant simulation facilities, in regard to their appropriateness for conducting the Nuclear Regulatory Commission's operating test, is described. The need for fidelity assessment, data sources, and the fidelity data to be collected are addressed. Fidelity data recording, collection, and analysis are discussed. The processes for drawing conclusions from the fidelity assessment and for evaluating the adequacy of the simulator control-room layout are presented. 3 refs

  10. Statistics

    Science.gov (United States)

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  11. Low statistical power in biomedical science: a review of three human research domains

    Science.gov (United States)

    Dumas-Mallet, Estelle; Button, Katherine S.; Boraud, Thomas; Gonon, Francois

    2017-01-01

    Studies with low statistical power increase the likelihood that a statistically significant finding represents a false positive result. We conducted a review of meta-analyses of studies investigating the association of biological, environmental or cognitive parameters with neurological, psychiatric and somatic diseases, excluding treatment studies, in order to estimate the average statistical power across these domains. Taking the effect size indicated by a meta-analysis as the best estimate of the likely true effect size, and assuming a threshold for declaring statistical significance of 5%, we found that approximately 50% of studies have statistical power in the 0–10% or 11–20% range, well below the minimum of 80% that is often considered conventional. Studies with low statistical power appear to be common in the biomedical sciences, at least in the specific subject areas captured by our search strategy. However, we also observe evidence that this depends in part on research methodology, with candidate gene studies showing very low average power and studies using cognitive/behavioural measures showing high average power. This warrants further investigation. PMID:28386409

  12. Statistical assessment of quality of credit activity of Ukrainian banks

    Directory of Open Access Journals (Sweden)

    Moldavska Olena V.

    2013-03-01

    The article conducts an economic and statistical analysis of the current state of credit activity of Ukrainian banks and the main tendencies of its development. It justifies the urgency of the statistical study of the credit activity of banks. It offers a complex system of assessment of bank lending at two levels: the level of the banking system and the level of an individual bank. The use of system analysis allows reflection of the interconnection between the effectiveness of functioning of the banking system and the quality of the credit portfolio. The article considers the main aspects of management of the quality of the credit portfolio, namely the level of troubled debt and credit risk. The article touches on the problem of adequate quantitative assessment of troubled loans in the credit portfolios of banks, since the methodologies of its calculation used by the National Bank of Ukraine and international rating agencies are quite different. The article presents a system of methods of management of credit risk, both theoretically and with specific examples, in the context of prevention of the occurrence of risk situations or elimination of their consequences.

  13. Intelligent Techniques for Power Systems Vulnerability Assessment

    OpenAIRE

    Mohamed A. El-Sharkawi

    2002-01-01

    With power grids considered national security matters, the reliable operation of the system is of top priority to utilities.  This concern is amplified by the utility’s deregulation, which increases the system’s openness while simultaneously decreasing the applied degree of control.  Vulnerability Assessment (VA) deals with the power system’s ability to continue to provide service in case of an unforeseen catastrophic contingency.  Such contingencies may include unauthorized tripping, breaks ...

  14. SIESE - trimestrial bulletin - Synthesis 1995. Electric power summary statistics for Brazil

    International Nuclear Information System (INIS)

    1995-01-01

    This bulletin presents the electric power summary statistics, which cover the performance of the power system for the whole of the utilities in 1995. It offers tables with revised data concerning the last two years based on updated information supplied by both the electric utilities and the SIESE's responsibility centers. 6 figs., 36 tabs

  15. Statistical annual report 2003 of Furnas - Electrical Power plants and Co., RJ, Brazil. Calendar year 2003

    International Nuclear Information System (INIS)

    2003-01-01

    This document presents the statistical annual report of Furnas Power Plants and Co., reporting the results obtained during the calendar year 2003 and the evolution over the last five years, allowing a general and comparative view of the company's performance, focusing on power generation and transmission and on economic and financial results

  16. Ten-year statistics of the electric power supply. Status and tendencies

    International Nuclear Information System (INIS)

    2001-12-01

    The ten-year statistics of the electric power supply in Denmark for 1991-2000 presents in tables and figures the trend of the electric power supply sector during the last ten years. The tables and figures present information on total energy consumption, combined heat and power generation, fuel consumption and the environment, the technical systems, economy and pricing, organization of the electricity supply, and information on electricity prices and taxes for households and industry in various countries. (LN)

  17. A critical discussion of null hypothesis significance testing and statistical power analysis within psychological research

    DEFF Research Database (Denmark)

    Jones, Allan; Sommerlund, Bo

    2007-01-01

    The uses of null hypothesis significance testing (NHST) and statistical power analysis within psychological research are critically discussed. The article looks at the problems of relying solely on NHST when dealing with small and large sample sizes. The use of power analysis in estimating the potential error introduced by small and large samples is advocated. Power analysis is not recommended as a replacement for NHST but as an additional source of information about the phenomena under investigation. Moreover, the importance of conceptual analysis in relation to statistical analysis of hypothesis...

  18. Ground assessment methods for nuclear power plant

    International Nuclear Information System (INIS)

    1985-01-01

    It goes without saying that a nuclear power plant must be constructed on the most stable and safe ground, and a reliable assessment method is required for the purpose. The Ground Integrity Sub-committee of the Committee of Civil Engineering of Nuclear Power Plant started five working groups whose purpose is to systematize the assessment procedures, including geological survey, ground examination and construction design. The tasks of the working groups are to establish a method for assessing fault activity, to standardize the rock classification method, to standardize the assessment and indication of ground properties, to standardize test methods, and to establish the application standard for design and construction. Flow diagrams were established for the procedures of geological survey and for the investigation of fault activity and ground properties of areas where the nuclear reactor and important outdoor equipment are to be constructed. Further, flow diagrams for applying the investigation results to the design and construction of the plant, and for determining the liquefaction potential of the ground, were also established. These systematized and standardized methods of investigation are expected to yield reliable data for the assessment of nuclear power plant construction sites and to lead to safe construction and operation in the future. In addition, the execution of such systematized and detailed preliminary investigations for determining the construction site of a nuclear power plant will contribute much to obtaining nationwide understanding of and confidence in the project. (Ishimitsu, A.)

  19. Statistics

    International Nuclear Information System (INIS)

    2001-01-01

    For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  20. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g., Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  1. Statistics

    International Nuclear Information System (INIS)

    1999-01-01

    For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  2. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 2000, Energy exports by recipient country in January-June 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products.

  3. Millstone nuclear power plant emergency system assessment

    International Nuclear Information System (INIS)

    Akhmad Khusyairi

    2011-01-01

    The U.S. NRC requires nuclear power plant licensees to establish an emergency response organization for both on-site and off-site response. The Millstone nuclear power station has three reactors, two of which are still in commercial operation. Unit 1, a BWR, was permanently shut down in 1998, while units 2 and 3 have obtained extended operating licenses until 2035 and 2045, respectively. Because a nuclear installation carries a high potential radiological impact, the Millstone emergency response organization must be established both on-site and off-site. The organization must involve several agencies, both state and municipal, each with specific duties and functions in a state of emergency, so that the protective measures planned for the community can be carried out. Meanwhile, the NRC conducts its own independent assessments of nuclear power plant emergency arrangements. (author)

  4. Monte Carlo based statistical power analysis for mediation models: methods and software.

    Science.gov (United States)

    Zhang, Zhiyong

    2014-12-01

    The existing literature on statistical power analysis for mediation models often assumes data normality and is based on a less powerful Sobel test instead of the more powerful bootstrap test. This study proposes to estimate statistical power to detect mediation effects on the basis of the bootstrap method through Monte Carlo simulation. Nonnormal data with excessive skewness and kurtosis are allowed in the proposed method. A free R package called bmem is developed to conduct the power analysis discussed in this study. Four examples, including a simple mediation model, a multiple-mediator model with a latent mediator, a multiple-group mediation model, and a longitudinal mediation model, are provided to illustrate the proposed method.
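
    The record's own implementation is the R package bmem; purely as an illustration of the general recipe it describes (simulate data, test the indirect effect with a bootstrap confidence interval, repeat, count rejections), here is a minimal Python sketch for a simple X -> M -> Y mediation model. All parameter values, sample sizes and replication counts are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(n, a=0.39, b=0.39, c=0.0):
    """One data set from a simple mediation model X -> M -> Y (illustrative coefficients)."""
    x = rng.normal(size=n)
    m = a * x + rng.normal(size=n)          # mediator
    y = c * x + b * m + rng.normal(size=n)  # outcome
    return x, m, y

def indirect(x, m, y):
    """Product-of-coefficients estimate a*b from two OLS fits."""
    a_hat = np.polyfit(x, m, 1)[0]
    design = np.column_stack([np.ones_like(x), x, m])
    b_hat = np.linalg.lstsq(design, y, rcond=None)[0][2]
    return a_hat * b_hat

def bootstrap_significant(x, m, y, n_boot=300, alpha=0.05):
    """Percentile bootstrap CI for a*b; 'significant' if the CI excludes 0."""
    n = len(x)
    est = np.empty(n_boot)
    for i in range(n_boot):
        idx = rng.integers(0, n, n)
        est[i] = indirect(x[idx], m[idx], y[idx])
    lo, hi = np.percentile(est, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return lo > 0 or hi < 0

def power(n, n_rep=100):
    """Monte Carlo power: fraction of simulated studies with a significant bootstrap CI."""
    return sum(bootstrap_significant(*simulate(n)) for _ in range(n_rep)) / n_rep

# Small replication counts keep the sketch quick; increase them for a stable estimate.
print(power(n=100))
```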

  5. Simulations and cosmological inference: A statistical model for power spectra means and covariances

    International Nuclear Information System (INIS)

    Schneider, Michael D.; Knox, Lloyd; Habib, Salman; Heitmann, Katrin; Higdon, David; Nakhleh, Charles

    2008-01-01

    We describe an approximate statistical model for the sample variance distribution of the nonlinear matter power spectrum that can be calibrated from limited numbers of simulations. Our model retains the common assumption of a multivariate normal distribution for the power spectrum band powers but takes full account of the (parameter-dependent) power spectrum covariance. The model is calibrated using an extension of the framework in Habib et al. (2007) to train Gaussian processes for the power spectrum mean and covariance given a set of simulation runs over a hypercube in parameter space. We demonstrate the performance of this machinery by estimating the parameters of a power-law model for the power spectrum. Within this framework, our calibrated sample variance distribution is robust to errors in the estimated covariance and shows rapid convergence of the posterior parameter constraints with the number of training simulations.

  6. Statistical reliability assessment of UT round-robin test data for piping welds

    International Nuclear Information System (INIS)

    Kim, H.M.; Park, I.K.; Park, U.S.; Park, Y.W.; Kang, S.C.; Lee, J.H.

    2004-01-01

    Ultrasonic NDE is one of the important technologies in the lifetime maintenance of nuclear power plants. An ultrasonic inspection system consists of the operator, the equipment and the procedure, and the reliability of the inspection system is governed by their capability. A performance demonstration round robin was conducted to quantify the capability of ultrasonic in-service inspection. Several teams, employing procedures that met or exceeded ASME Sec. XI code requirements, inspected nuclear power plant piping containing various cracks to evaluate the capability of detection and sizing. In this paper, a statistical reliability assessment of ultrasonic nondestructive inspection data using the probability of detection (POD) is presented. The POD results obtained with a logistic model proved useful for the reliability assessment of NDE hit or miss data. (orig.)
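
    The record does not give the fitted model itself, but the kind of hit/miss POD analysis it describes can be sketched generically: fit a logistic curve for detection probability as a function of flaw size by maximum likelihood and read off a figure of merit such as a90. The flaw sizes and outcomes below are invented for illustration and are not the round-robin data.

```python
import numpy as np
from scipy.optimize import minimize

# Invented hit/miss data: flaw depth (mm) and detected (1) / missed (0)
size = np.array([1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6, 7, 8, 9, 10], float)
hit = np.array([0, 0, 0, 1, 0, 1, 1, 0, 1, 1, 1, 1, 1, 1, 1, 1], float)

def neg_log_lik(theta):
    """Negative log-likelihood of a logistic POD curve POD(a) = 1/(1 + exp(-(b0 + b1*a)))."""
    b0, b1 = theta
    p = 1.0 / (1.0 + np.exp(-(b0 + b1 * size)))
    eps = 1e-12
    return -np.sum(hit * np.log(p + eps) + (1 - hit) * np.log(1 - p + eps))

res = minimize(neg_log_lik, x0=[0.0, 0.5], method="Nelder-Mead")
b0, b1 = res.x

# Flaw size detected with 90% probability (the usual a90 figure of merit)
a90 = (np.log(0.9 / 0.1) - b0) / b1
print(f"fitted b0={b0:.2f}, b1={b1:.2f}, a90={a90:.2f} mm")
```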

  7. Statistical power as a function of Cronbach alpha of instrument questionnaire items.

    Science.gov (United States)

    Heo, Moonseong; Kim, Namhee; Faith, Myles S

    2015-10-14

    In countless clinical trials, measurements of outcomes rely on instrument questionnaire items, which however often suffer from measurement error problems that in turn affect the statistical power of study designs. The Cronbach alpha or coefficient alpha, here denoted by C(α), can be used as a measure of internal consistency of parallel instrument items that are developed to measure a target unidimensional outcome construct. The scale score for the target construct is often represented by the sum of the item scores. However, power functions based on C(α) have been lacking for various study designs. We formulate a statistical model for parallel items to derive power functions as a function of C(α) under several study designs. To this end, we adopt a fixed true-score variance assumption as opposed to the usual fixed total variance assumption. That assumption is critical and practically relevant for showing that smaller measurement errors are associated with higher inter-item correlations, and thus that greater C(α) is associated with greater statistical power. We compare the derived theoretical statistical power with empirical power obtained through Monte Carlo simulations for the following comparisons: one-sample comparison of pre- and post-treatment mean differences, two-sample comparison of pre-post mean differences between groups, and two-sample comparison of mean differences between groups. It is shown that C(α) is the same as a test-retest correlation of the scale scores of parallel items, which enables testing the significance of C(α). Closed-form power functions and sample size determination formulas are derived in terms of C(α) for all of the aforementioned comparisons. Power functions are shown to be an increasing function of C(α), regardless of the comparison of interest. The derived power functions are well validated by simulation studies that show that the magnitudes of theoretical power are virtually identical to those of the empirical power. Regardless
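
    The paper's closed-form power functions are not reproduced in the record; the sketch below only illustrates the qualitative claim with a generic attenuation argument: if the sum-score reliability equals C(α), the observed standardized effect shrinks to roughly d·sqrt(C(α)), so the approximate power of a two-sample comparison increases with C(α). The formula choice and the numbers are assumptions for illustration, not the derivations of the paper.

```python
import numpy as np
from scipy.stats import norm

def approx_power(d_true, cronbach_alpha, n_per_group, sig_level=0.05):
    """Approximate two-sided power of a two-sample comparison of scale scores when the
    true standardized effect d_true is attenuated by measurement error: with sum-score
    reliability C(alpha), the observed effect is roughly d_true * sqrt(C(alpha))."""
    d_obs = d_true * np.sqrt(cronbach_alpha)
    z_crit = norm.ppf(1 - sig_level / 2)
    ncp = d_obs * np.sqrt(n_per_group / 2)
    return norm.sf(z_crit - ncp) + norm.cdf(-z_crit - ncp)

for ca in (0.5, 0.7, 0.9):
    print(ca, round(approx_power(d_true=0.5, cronbach_alpha=ca, n_per_group=64), 3))
# power increases with Cronbach alpha, consistent with the result reported above
```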

  8. A statistical estimator for the boiler power and its related parameters

    International Nuclear Information System (INIS)

    Tang, H.

    2001-01-01

    Determining the boiler power accurately is important both for controlling the plant and for maximizing plant productivity. There are two computed boiler powers for each boiler: the steam-based boiler power and the feedwater-based boiler power. The steam-based boiler power is computed from the difference between the feedwater enthalpy and the boiler steam enthalpy, while the feedwater-based boiler power is computed from the enthalpy absorbed by the feedwater. The steam-based boiler power is computed in the RRS program and used in calibrating the measured reactor power, while the feedwater-based boiler power is computed in the CSTAT program and used for indication. Since the steam-based boiler power is used as feedback in the reactor control, it is the one estimated in this work. Because the boiler power calculation employs steam flow, feedwater flow and feedwater temperature measurements, and because any measurement contains constant or drifting noise and bias, reconciliation and rectification procedures are needed to determine the boiler power more accurately. A statistical estimator is developed to perform data reconciliation, gross error detection and instrument performance monitoring.
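
    The estimator in the record is not specified in detail, but the data-reconciliation step it mentions is commonly done by weighted least squares subject to a balance constraint. The sketch below applies that textbook closed form to two redundant flow measurements; the flow values, uncertainties and the single mass-balance constraint are illustrative assumptions, not the RRS/CSTAT implementation.

```python
import numpy as np

def reconcile(x, sigma, A, b):
    """Weighted least-squares data reconciliation:
    minimize (x_hat - x)^T V^-1 (x_hat - x)  subject to  A @ x_hat = b,
    with V = diag(sigma^2); closed-form solution via Lagrange multipliers."""
    V = np.diag(np.asarray(sigma, float) ** 2)
    K = V @ A.T @ np.linalg.inv(A @ V @ A.T)
    return x - K @ (A @ x - b)

# Illustrative steady-state measurements [kg/s]: steam flow and feedwater flow
x_meas = np.array([262.0, 255.0])
sigma = np.array([3.0, 1.5])        # assumed measurement standard deviations
A = np.array([[1.0, -1.0]])         # steam flow - feedwater flow = 0 at steady state
b = np.array([0.0])

print(reconcile(x_meas, sigma, A, b))   # the noisier sensor receives the larger adjustment
```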

  9. Statistical algorithm for automated signature analysis of power spectral density data

    International Nuclear Information System (INIS)

    Piety, K.R.

    1977-01-01

    A statistical algorithm has been developed and implemented on a minicomputer system for on-line surveillance applications. Power spectral density (PSD) measurements on process signals are the performance signatures that characterize the ''health'' of the monitored equipment. Statistical methods provide a quantitative basis for automating the detection of anomalous conditions. The surveillance algorithm has been tested on signals from neutron sensors, proximeter probes, and accelerometers to determine its potential for monitoring nuclear reactors and rotating machinery.
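
    The surveillance algorithm itself is not given in the record; the sketch below only illustrates the general idea of PSD-signature monitoring: build baseline band-power statistics from healthy signals and flag new signatures whose band powers deviate by more than a chosen number of standard deviations. The signal model, frequency bands and threshold are assumptions for illustration.

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(0)
fs = 1000.0  # sampling rate, Hz

def band_powers(x, bands=((0, 50), (50, 150), (150, 300))):
    """Welch PSD integrated over a few frequency bands."""
    f, pxx = welch(x, fs=fs, nperseg=1024)
    df = f[1] - f[0]
    return np.array([pxx[(f >= lo) & (f < hi)].sum() * df for lo, hi in bands])

def healthy_signal(n=10_000):
    """Stand-in for a 'healthy' sensor signal: a 60 Hz component plus noise."""
    t = np.arange(n) / fs
    return np.sin(2 * np.pi * 60 * t) + 0.5 * rng.normal(size=n)

# Baseline statistics from repeated healthy signatures
baseline = np.array([band_powers(healthy_signal()) for _ in range(30)])
mu, sd = baseline.mean(axis=0), baseline.std(axis=0, ddof=1)

def anomalous(x, k=4.0):
    """Flag any band whose power departs from the baseline by more than k sigma."""
    return np.abs((band_powers(x) - mu) / sd) > k

# A degraded signature: an extra narrow-band component at 200 Hz
t = np.arange(10_000) / fs
degraded = healthy_signal() + 0.4 * np.sin(2 * np.pi * 200 * t)
print(anomalous(degraded))   # a flag is expected in the 150-300 Hz band
```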

  10. Statistical power analysis a simple and general model for traditional and modern hypothesis tests

    CERN Document Server

    Murphy, Kevin R; Wolach, Allen

    2014-01-01

    Noted for its accessible approach, this text applies the latest approaches of power analysis to both null hypothesis and minimum-effect testing using the same basic unified model. Through the use of a few simple procedures and examples, the authors show readers with little expertise in statistical analysis how to obtain the values needed to carry out the power analysis for their research. Illustrations of how these analyses work and how they can be used to choose the appropriate criterion for defining statistically significant outcomes are sprinkled throughout. The book presents a simple and g

  11. On Improving the Quality and Interpretation of Environmental Assessments using Statistical Analysis and Geographic Information Systems

    Science.gov (United States)

    Karuppiah, R.; Faldi, A.; Laurenzi, I.; Usadi, A.; Venkatesh, A.

    2014-12-01

    An increasing number of studies are focused on assessing the environmental footprint of different products and processes, especially using life cycle assessment (LCA). This work shows how combining statistical methods and Geographic Information Systems (GIS) with environmental analyses can help improve the quality of results and their interpretation. Most environmental assessments in the literature yield single numbers that characterize the environmental impact of a process/product - typically global or country averages, often unchanging in time. In this work, we show how statistical analysis and GIS can help address these limitations. For example, we demonstrate a method to separately quantify uncertainty and variability in the results of LCA models using a power generation case study. This is important for rigorous comparisons between the impacts of different processes. Another challenge is the lack of data, which can affect the rigor of LCAs. We have developed an approach to estimate the environmental impacts of incompletely characterized processes using predictive statistical models. This method is applied to estimate unreported coal power plant emissions in several world regions. There is also a general lack of spatio-temporal characterization of the results in environmental analyses. For instance, studies that focus on water usage do not put in context where and when water is withdrawn. Through the use of hydrological modeling combined with GIS, we quantify water stress on a regional and seasonal basis to understand water supply and demand risks for multiple users. Another example where it is important to consider regional dependency of impacts is when characterizing how agricultural land occupation affects biodiversity in a region. We developed a data-driven methodology used in conjunction with GIS to determine if there is a statistically significant difference between the impacts of growing different crops on different species in various biomes of the world.

  12. A Statistical Model for Uplink Intercell Interference with Power Adaptation and Greedy Scheduling

    KAUST Repository

    Tabassum, Hina

    2012-10-03

    This paper deals with the statistical modeling of uplink inter-cell interference (ICI) considering greedy scheduling with power adaptation based on channel conditions. The derived model is implicitly generalized for any kind of shadowing and fading environments. More precisely, we develop a generic model for the distribution of ICI based on the locations of the allocated users and their transmit powers. The derived model is utilized to evaluate important network performance metrics such as ergodic capacity, average fairness and average power preservation numerically. Monte-Carlo simulation details are included to support the analysis and show the accuracy of the derived expressions. In parallel to the literature, we show that greedy scheduling with power adaptation reduces the ICI, average power consumption of users, and enhances the average fairness among users, compared to the case without power adaptation. © 2012 IEEE.

  13. A Statistical Model for Uplink Intercell Interference with Power Adaptation and Greedy Scheduling

    KAUST Repository

    Tabassum, Hina; Yilmaz, Ferkan; Dawy, Zaher; Alouini, Mohamed-Slim

    2012-01-01

    This paper deals with the statistical modeling of uplink inter-cell interference (ICI) considering greedy scheduling with power adaptation based on channel conditions. The derived model is implicitly generalized for any kind of shadowing and fading environments. More precisely, we develop a generic model for the distribution of ICI based on the locations of the allocated users and their transmit powers. The derived model is utilized to evaluate important network performance metrics such as ergodic capacity, average fairness and average power preservation numerically. Monte-Carlo simulation details are included to support the analysis and show the accuracy of the derived expressions. In parallel to the literature, we show that greedy scheduling with power adaptation reduces the ICI, average power consumption of users, and enhances the average fairness among users, compared to the case without power adaptation. © 2012 IEEE.

  14. Sunspot activity and influenza pandemics: a statistical assessment of the purported association.

    Science.gov (United States)

    Towers, S

    2017-10-01

    Since 1978, a series of papers in the literature have claimed to find a significant association between sunspot activity and the timing of influenza pandemics. This paper examines these analyses and attempts to recreate the three most recent statistical analyses by Ertel (1994), Tapping et al. (2001), and Yeung (2006), all of which purported to find a significant relationship between sunspot numbers and pandemic influenza. As will be discussed, each analysis had errors in the data. In addition, in each analysis arbitrary selections or assumptions were made, and the authors did not assess the robustness of their analyses to changes in those arbitrary assumptions. Varying the arbitrary assumptions to other, equally valid, assumptions negates the claims of significance. Indeed, an arbitrary selection made in one of the analyses appears to have resulted in almost maximal apparent significance; changing it only slightly yields a null result. This analysis applies statistically rigorous methodology to examine the purported sunspot/pandemic link, using more statistically powerful un-binned analysis methods rather than relying on arbitrarily binned data. The analyses are repeated using both the Wolf and Group sunspot numbers. In all cases, no statistically significant evidence of any association was found. However, while the focus of this particular analysis was on the purported relationship of influenza pandemics to sunspot activity, the faults found in the past analyses are common pitfalls; inattention to analysis reproducibility and robustness assessment are common problems in the sciences that are unfortunately not noted often enough in review.

  15. Intelligent Techniques for Power Systems Vulnerability Assessment

    Directory of Open Access Journals (Sweden)

    Mohamed A. El-Sharkawi

    2002-06-01

    Full Text Available With power grids considered national security matters, the reliable operation of the system is of top priority to utilities. This concern is amplified by utility deregulation, which increases the system's openness while simultaneously decreasing the applied degree of control. Vulnerability Assessment (VA) deals with the power system's ability to continue to provide service in case of an unforeseen catastrophic contingency. Such contingencies may include unauthorized tripping, breaks in communication links, sabotage or intrusion by external agents, human errors, natural calamities and faults. These contingencies could lead to a disruption of service to part or all of the system. The service disruption is known as an outage or blackout. The paper outlines an approach by which feature extraction and boundary tracking can be implemented to achieve on-line vulnerability assessment.

  16. The relation between statistical power and inference in fMRI.

    Directory of Open Access Journals (Sweden)

    Henk R Cremers

    Full Text Available Statistically underpowered studies can result in experimental failure even when all other experimental considerations have been addressed impeccably. In fMRI the combination of a large number of dependent variables, a relatively small number of observations (subjects), and a need to correct for multiple comparisons can decrease statistical power dramatically. This problem has been clearly addressed yet remains controversial, especially in regard to the expected effect sizes in fMRI, and especially for between-subjects effects such as group comparisons and brain-behavior correlations. We aimed to clarify the power problem by considering and contrasting two simulated scenarios of such possible brain-behavior correlations: weak diffuse effects and strong localized effects. Sampling from these scenarios shows that, particularly in the weak diffuse scenario, common sample sizes (n = 20-30) display extremely low statistical power, poorly represent the actual effects in the full sample, and show large variation on subsequent replications. Empirical data from the Human Connectome Project resemble the weak diffuse scenario much more than the localized strong scenario, which underscores the extent of the power problem for many studies. Possible solutions to the power problem include increasing the sample size, using less stringent thresholds, or focusing on a region-of-interest. However, these approaches are not always feasible and some have major drawbacks. The most prominent solutions that may help address the power problem include model-based (multivariate) prediction methods and meta-analyses with related synthesis-oriented approaches.
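
    A back-of-envelope version of the power problem for brain-behavior correlations can be computed with the standard Fisher-z approximation; the correlation strength and the number of tests assumed below are illustrative, not the simulation scenarios of the paper.

```python
import numpy as np
from scipy.stats import norm

def corr_power(r, n, alpha):
    """Approximate two-sided power to detect a correlation r with n subjects
    (Fisher z transformation)."""
    z_r = np.arctanh(r)
    se = 1.0 / np.sqrt(n - 3)
    z_crit = norm.ppf(1 - alpha / 2)
    return norm.sf(z_crit - z_r / se) + norm.cdf(-z_crit - z_r / se)

r_weak = 0.2                   # a weak, diffuse brain-behavior effect (assumed)
for n in (20, 30, 100):
    p_unc = corr_power(r_weak, n, alpha=0.05)
    p_cor = corr_power(r_weak, n, alpha=0.05 / 10_000)   # crude Bonferroni over ~10k tests
    print(n, round(p_unc, 3), round(p_cor, 4))
# power at n = 20-30 is low even uncorrected, and essentially nil after correction
```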

  17. Cyber security assessment of a power plant

    Energy Technology Data Exchange (ETDEWEB)

    Nai Fovino, Igor; Masera, Marcelo; Stefanini, Alberto [Joint Research Centre, Institute for the Protection and Security of the Citizen, Ispra (Italy); Guidi, Luca [Enel Ingegneria e Innovazione, Pisa (Italy)

    2011-02-15

    Critical infrastructures and systems are today exposed not only to traditional safety and availability problems, but also to new kinds of security threats. These are mainly due to the large number of new vulnerabilities and architectural weaknesses introduced by the extensive use of information and communication technologies (ICT) into such complex systems. In this paper we present the outcomes of an exhaustive ICT security assessment, targeting an operational power plant, which consisted also of the simulation of potential cyber attacks. The assessment shows that the plant is considerably vulnerable to malicious attacks. This situation cannot be ignored, because the potential outcomes of an induced plant malfunction can be severe. (author)

  18. The case for increasing the statistical power of eddy covariance ecosystem studies: why, where and how?

    Science.gov (United States)

    Hill, Timothy; Chocholek, Melanie; Clement, Robert

    2017-06-01

    Eddy covariance (EC) continues to provide invaluable insights into the dynamics of Earth's surface processes. However, despite its many strengths, spatial replication of EC at the ecosystem scale is rare. High equipment costs are likely to be partially responsible. This contributes to the low sampling, and even lower replication, of ecoregions in Africa, Oceania (excluding Australia) and South America. The level of replication matters as it directly affects statistical power. While the ergodicity of turbulence and temporal replication allow an EC tower to provide statistically robust flux estimates for its footprint, these principles do not extend to larger ecosystem scales. Despite the challenge of spatially replicating EC, it is clearly of interest to be able to use EC to provide statistically robust flux estimates for larger areas. We ask: How much spatial replication of EC is required for statistical confidence in our flux estimates of an ecosystem? We provide the reader with tools to estimate the number of EC towers needed to achieve a given statistical power. We show that for a typical ecosystem, around four EC towers are needed to have 95% statistical confidence that the annual flux of an ecosystem is nonzero. Furthermore, if the true flux is small relative to instrument noise and spatial variability, the number of towers needed can rise dramatically. We discuss approaches for improving statistical power and describe one solution: an inexpensive EC system that could help by making spatial replication more affordable. However, we note that diverting limited resources from other key measurements in order to allow spatial replication may not be optimal, and a balance needs to be struck. While individual EC towers are well suited to providing fluxes from the flux footprint, we emphasize that spatial replication is essential for statistically robust fluxes if a wider ecosystem is being studied. © 2016 The Authors Global Change Biology Published by John Wiley
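
    The headline figure above (roughly four towers for 95% confidence that the annual flux is nonzero) is the kind of number a standard one-sample t-test sample-size calculation produces; a generic version is sketched below, with the ratio of flux to tower-to-tower variability taken as an assumed input rather than a value from the paper.

```python
import numpy as np
from scipy.stats import nct, t

def towers_needed(flux_over_sd, alpha=0.05, power=0.95, n_max=200):
    """Smallest number of EC towers n such that a one-sample t-test of
    mean annual flux != 0 reaches the requested power, given the ratio of
    the true flux to the tower-to-tower (spatial + instrument) std dev."""
    for n in range(2, n_max + 1):
        df = n - 1
        tcrit = t.ppf(1 - alpha / 2, df)
        ncp = flux_over_sd * np.sqrt(n)
        if nct.sf(tcrit, df, ncp) + nct.cdf(-tcrit, df, ncp) >= power:
            return n
    return None

print(towers_needed(2.5))   # a strong flux relative to spatial variability: a handful of towers
print(towers_needed(0.5))   # a weak flux relative to the noise: many more towers
```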

  19. Effect size and statistical power in the rodent fear conditioning literature - A systematic review.

    Science.gov (United States)

    Carneiro, Clarissa F D; Moulin, Thiago C; Macleod, Malcolm R; Amaral, Olavo B

    2018-01-01

    Proposals to increase research reproducibility frequently call for focusing on effect sizes instead of p values, as well as for increasing the statistical power of experiments. However, it is unclear to what extent these two concepts are indeed taken into account in basic biomedical science. To study this in a real-case scenario, we performed a systematic review of effect sizes and statistical power in studies on learning of rodent fear conditioning, a widely used behavioral task to evaluate memory. Our search criteria yielded 410 experiments comparing control and treated groups in 122 articles. Interventions had a mean effect size of 29.5%, and amnesia caused by memory-impairing interventions was nearly always partial. Mean statistical power to detect the average effect size observed in well-powered experiments with significant differences (37.2%) was 65%, and was lower among studies with non-significant results. Only one article reported a sample size calculation, and our estimated sample size to achieve 80% power considering typical effect sizes and variances (15 animals per group) was reached in only 12.2% of experiments. Actual effect sizes correlated with effect size inferences made by readers on the basis of textual descriptions of results only when findings were non-significant, and neither effect size nor power correlated with study quality indicators, number of citations or impact factor of the publishing journal. In summary, effect sizes and statistical power have a wide distribution in the rodent fear conditioning literature, but do not seem to have a large influence on how results are described or cited. Failure to take these concepts into consideration might limit attempts to improve reproducibility in this field of science.
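
    The sample-size estimate quoted above (about 15 animals per group for 80% power) follows from a standard two-sample t-test calculation once an effect size is assumed; a generic version is sketched below with illustrative effect sizes, not the distributions extracted in the review.

```python
import numpy as np
from scipy.stats import nct, t

def n_per_group(d, alpha=0.05, power=0.80, n_max=1000):
    """Smallest n per group for a two-sided two-sample t-test to reach the
    requested power at standardized effect size d."""
    for n in range(2, n_max + 1):
        df = 2 * n - 2
        tcrit = t.ppf(1 - alpha / 2, df)
        ncp = d * np.sqrt(n / 2)
        if nct.sf(tcrit, df, ncp) + nct.cdf(-tcrit, df, ncp) >= power:
            return n
    return None

for d in (0.5, 0.8, 1.1):
    print(d, n_per_group(d))
# a requirement near 15 animals per group corresponds to d in the vicinity of 1 under this formula
```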

  20. Effect size and statistical power in the rodent fear conditioning literature – A systematic review

    Science.gov (United States)

    Macleod, Malcolm R.

    2018-01-01

    Proposals to increase research reproducibility frequently call for focusing on effect sizes instead of p values, as well as for increasing the statistical power of experiments. However, it is unclear to what extent these two concepts are indeed taken into account in basic biomedical science. To study this in a real-case scenario, we performed a systematic review of effect sizes and statistical power in studies on learning of rodent fear conditioning, a widely used behavioral task to evaluate memory. Our search criteria yielded 410 experiments comparing control and treated groups in 122 articles. Interventions had a mean effect size of 29.5%, and amnesia caused by memory-impairing interventions was nearly always partial. Mean statistical power to detect the average effect size observed in well-powered experiments with significant differences (37.2%) was 65%, and was lower among studies with non-significant results. Only one article reported a sample size calculation, and our estimated sample size to achieve 80% power considering typical effect sizes and variances (15 animals per group) was reached in only 12.2% of experiments. Actual effect sizes correlated with effect size inferences made by readers on the basis of textual descriptions of results only when findings were non-significant, and neither effect size nor power correlated with study quality indicators, number of citations or impact factor of the publishing journal. In summary, effect sizes and statistical power have a wide distribution in the rodent fear conditioning literature, but do not seem to have a large influence on how results are described or cited. Failure to take these concepts into consideration might limit attempts to improve reproducibility in this field of science. PMID:29698451

  1. Ten-year statistics of the electric power supply. Status and tendencies

    International Nuclear Information System (INIS)

    2000-12-01

    The ten-year statistics of the electric power supply in Denmark for 1990-1999 presents in tables and figures the trend of the electric power supply sector during the last ten years. The tables and figures present information on total energy consumption, combined heat and power generation, fuel consumption and the environment, the technical systems, economy and pricing, organization of the electricity supply, auto-production of electricity and information on electricity prices and taxes for households and industry in various countries. (LN)

  2. A testing procedure for wind turbine generators based on the power grid statistical model

    DEFF Research Database (Denmark)

    Farajzadehbibalan, Saber; Ramezani, Mohammad Hossein; Nielsen, Peter

    2017-01-01

    In this study, a comprehensive test procedure is developed to test wind turbine generators with a hardware-in-loop setup. The procedure employs the statistical model of the power grid considering the restrictions of the test facility and system dynamics. Given the model in the latent space...

  3. Groundwater quality assessment of urban Bengaluru using multivariate statistical techniques

    Science.gov (United States)

    Gulgundi, Mohammad Shahid; Shetty, Amba

    2018-03-01

    Groundwater quality deterioration due to anthropogenic activities has become a subject of prime concern. The objective of the study was to assess the spatial and temporal variations in groundwater quality and to identify the sources in the western half of the Bengaluru city using multivariate statistical techniques. A water quality index rating was calculated for the pre- and post-monsoon seasons to quantify overall water quality for human consumption. The post-monsoon samples show poorer quality for drinking purposes than the pre-monsoon samples. Cluster analysis (CA), principal component analysis (PCA) and discriminant analysis (DA) were applied to the groundwater quality data measured on 14 parameters from 67 sites distributed across the city. Hierarchical cluster analysis (CA) grouped the 67 sampling stations into two groups, cluster 1 with higher pollution and cluster 2 with lower pollution. Discriminant analysis (DA) was applied to delineate the most meaningful parameters accounting for temporal and spatial variations in groundwater quality of the study area. Temporal DA identified pH as the most important parameter, which discriminates between water quality in the pre-monsoon and post-monsoon seasons and accounts for 72% seasonal assignation of cases. Spatial DA identified Mg, Cl and NO3 as the three most important parameters discriminating between the two clusters and accounting for 89% spatial assignation of cases. Principal component analysis was applied to the dataset obtained from the two clusters, which yielded three factors in each cluster, explaining 85.4 and 84% of the total variance, respectively. Varifactors obtained from principal component analysis showed that groundwater quality variation is mainly explained by dissolution of minerals from rock-water interactions in the aquifer, the effect of anthropogenic activities and ion exchange processes in water.
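
    As a generic illustration of the workflow described above (standardize the parameters, cluster the wells, then run PCA within clusters), here is a minimal sketch on synthetic data of the same dimensions; the data, the cluster structure and the explained-variance figures are stand-ins, not the Bengaluru measurements.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import AgglomerativeClustering
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)

# Stand-in for the study's data matrix: 67 wells x 14 water-quality parameters
X = rng.normal(size=(67, 14))
X[:30] += rng.normal(loc=2.0, scale=0.5, size=14)   # a more polluted group of wells

Xs = StandardScaler().fit_transform(X)

# Hierarchical (Ward) clustering into two groups, cf. cluster 1 vs cluster 2 above
labels = AgglomerativeClustering(n_clusters=2, linkage="ward").fit_predict(Xs)

# PCA within each cluster: variance explained by the first three components
for c in (0, 1):
    pca = PCA(n_components=3).fit(Xs[labels == c])
    print(c, int(np.sum(labels == c)), round(float(pca.explained_variance_ratio_.sum()), 2))
```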

  4. The MAX Statistic is Less Powerful for Genome Wide Association Studies Under Most Alternative Hypotheses.

    Science.gov (United States)

    Shifflett, Benjamin; Huang, Rong; Edland, Steven D

    2017-01-01

    Genotypic association studies are prone to inflated type I error rates if multiple hypothesis testing is performed, e.g., sequentially testing for recessive, multiplicative, and dominant risk. Alternatives to multiple hypothesis testing include the model independent genotypic χ2 test, the efficiency robust MAX statistic, which corrects for multiple comparisons but with some loss of power, or a single Armitage test for multiplicative trend, which has optimal power when the multiplicative model holds but with some loss of power when dominant or recessive models underlie the genetic association. We used Monte Carlo simulations to describe the relative performance of these three approaches under a range of scenarios. All three approaches maintained their nominal type I error rates. The genotypic χ2 and MAX statistics were more powerful when testing a strictly recessive genetic effect or when testing a dominant effect when the allele frequency was high. The Armitage test for multiplicative trend was most powerful for the broad range of scenarios where heterozygote risk is intermediate between recessive and dominant risk. Moreover, all tests had limited power to detect recessive genetic risk unless the sample size was large, and conversely all tests were relatively well powered to detect dominant risk. Taken together, these results suggest the general utility of the multiplicative trend test when the underlying genetic model is unknown.
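
    For reference, the 1-df Armitage trend test and the 2-df genotypic test compared above can both be computed from a 2 x 3 genotype table as sketched below; the counts are invented for illustration, and the trend-test formula is the standard textbook Cochran-Armitage form rather than anything specific to this paper.

```python
import numpy as np
from scipy.stats import norm, chi2_contingency

def armitage_trend(cases, controls, scores=(0, 1, 2)):
    """Cochran-Armitage trend test for a 2 x 3 genotype table.
    cases/controls: counts for genotypes (aa, aA, AA)."""
    cases, controls, t = map(np.asarray, (cases, controls, scores))
    R1, R2, C = cases.sum(), controls.sum(), cases + controls
    N = R1 + R2
    T = np.sum(t * (cases * R2 - controls * R1))
    var = R1 * R2 / N * (np.sum(t**2 * C * (N - C))
                         - 2 * sum(t[i] * t[j] * C[i] * C[j]
                                   for i in range(len(t)) for j in range(i + 1, len(t))))
    z = T / np.sqrt(var)
    return z, 2 * norm.sf(abs(z))

# Invented genotype counts (aa, aA, AA) with a roughly multiplicative risk pattern
cases, controls = [60, 250, 290], [90, 300, 210]

z, p_trend = armitage_trend(cases, controls)
chi2_stat, p_geno, _, _ = chi2_contingency([cases, controls])   # 2-df genotypic test
print(f"trend test p = {p_trend:.2e}, genotypic 2-df p = {p_geno:.2e}")
# with an intermediate heterozygote risk, the 1-df trend test is the more powerful of the two
```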

  5. [Effect sizes, statistical power and sample sizes in "the Japanese Journal of Psychology"].

    Science.gov (United States)

    Suzukawa, Yumi; Toyoda, Hideki

    2012-04-01

    This study analyzed the statistical power of research studies published in the "Japanese Journal of Psychology" in 2008 and 2009. Sample effect sizes and sample statistical powers were calculated for each statistical test and analyzed with respect to the analytical methods and the fields of the studies. The results show that in the fields like perception, cognition or learning, the effect sizes were relatively large, although the sample sizes were small. At the same time, because of the small sample sizes, some meaningful effects could not be detected. In the other fields, because of the large sample sizes, meaningless effects could be detected. This implies that researchers who could not get large enough effect sizes would use larger samples to obtain significant results.

  6. A multivariate statistical study on a diversified data gathering system for nuclear power plants

    International Nuclear Information System (INIS)

    Samanta, P.K.; Teichmann, T.; Levine, M.M.; Kato, W.Y.

    1989-02-01

    In this report, multivariate statistical methods are presented and applied to demonstrate their use in analyzing nuclear power plant operational data. For analyses of nuclear power plant events, approaches are presented for detecting malfunctions and degradations within the course of the event. At the system level, approaches are investigated as a means of diagnosis of system level performance. This involves the detection of deviations from normal performance of the system. The input data analyzed are the measurable physical parameters, such as steam generator level, pressurizer water level, auxiliary feedwater flow, etc. The study provides the methodology and illustrative examples based on data gathered from simulation of nuclear power plant transients and computer simulation of a plant system performance (due to lack of easily accessible operational data). Such an approach, once fully developed, can be used to explore statistically the detection of failure trends and patterns and prevention of conditions with serious safety implications. 33 refs., 18 figs., 9 tabs
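
    The report's specific multivariate methods are not listed in the abstract; as one generic example of detecting deviations from normal multivariate behaviour, the sketch below uses a Hotelling-T²/Mahalanobis-distance check on synthetic "plant parameter" data. The parameters, covariance structure and control limit are assumptions for illustration only.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(2)

# Synthetic "normal operation" data: three correlated plant parameters, 500 samples
cov = np.array([[1.0, 0.6, 0.3],
                [0.6, 1.0, 0.4],
                [0.3, 0.4, 1.0]])
train = rng.multivariate_normal(mean=[0.0, 0.0, 0.0], cov=cov, size=500)

mu = train.mean(axis=0)
S_inv = np.linalg.inv(np.cov(train, rowvar=False))

def t2(x):
    """Squared Mahalanobis distance of an observation from normal operation."""
    d = x - mu
    return float(d @ S_inv @ d)

limit = chi2.ppf(0.999, df=3)   # approximate large-sample control limit

fault = mu + np.array([0.0, 3.0, -2.5])    # a hypothetical degraded operating point
print(t2(mu) <= limit, t2(fault) > limit)  # nominal point stays inside, degraded point is flagged
```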

  7. Statistical analysis of human maintenance failures of a nuclear power plant

    International Nuclear Information System (INIS)

    Pyy, P.

    2000-01-01

    In this paper, a statistical study of faults caused by maintenance activities is presented. The objective of the study was to draw conclusions on the unplanned effects of maintenance on nuclear power plant safety and system availability. More than 4400 maintenance history reports from the years 1992-1994 of the Olkiluoto BWR nuclear power plant (NPP) were analysed together with the maintenance personnel. The human-action-induced faults were classified, e.g., according to their multiplicity and effects. This paper presents and discusses the results of a statistical analysis of the data. Instrumentation and electrical components are especially prone to human failures. Many human failures were found in safety related systems. Similarly, several failures remained latent from outages to power operation. The safety significance was generally small. Modifications are an important source of multiple human failures. Plant maintenance data is a good source of human reliability data and it should be used more in the future. (orig.)

  8. Assessment of statistical education in Indonesia: Preliminary results and initiation to simulation-based inference

    Science.gov (United States)

    Saputra, K. V. I.; Cahyadi, L.; Sembiring, U. A.

    2018-01-01

    In this paper, we assess our traditional elementary statistics education and introduce elementary statistics with simulation-based inference. To assess our statistics class, we adapt the well-known CAOS (Comprehensive Assessment of Outcomes in Statistics) test, which serves as an external measure of students' basic statistical literacy and is generally accepted as a measure of statistical literacy. We also introduce a new teaching method for the elementary statistics class. Unlike the traditional elementary statistics course, hypothesis testing is taught through a simulation-based inference method. The literature has shown that this new teaching method works very well in increasing students' understanding of statistics.

  9. Fatigue assessments in operating nuclear power plants

    International Nuclear Information System (INIS)

    Gosselin, S.R.; Deardorff, A.F.; Peltola, D.W.

    1994-01-01

    In November 1991, the ASME Section XI Task Group on Operating Plant Fatigue Assessment was formed to develop criteria and evaluation methodology for evaluating the effects of cyclic operation in operating nuclear power plants. The objective was to develop guidelines for inclusion in Section XI that could be used by plant operators in evaluating fatigue concerns and their impact on serviceability. This paper discusses the work performed by the Task Group. It explores the concept of ''Fatigue Design Basis'' versus ''Fatigue Operating Basis'' by examining the roles of ASME Section III and ASME Section XI in the design and operation of the nuclear power plants. Guidelines are summarized that may help plant operators perform effective design transient cycle evaluations and optimize cycle counting and fatigue usage tracking. The alternative fatigue evaluation approach using flaw tolerance is also introduced

  10. Validation of statistical assessment method for the optimization of the inspection need for nuclear steam generators

    International Nuclear Information System (INIS)

    Wallin, K.; Voskamp, R.; Schmibauer, J.; Ostermeyer, H.; Nagel, G.

    2011-01-01

    The cost of steam generator inspections in nuclear power plants is high. A new quantitative assessment methodology for the accumulation of flaws due to stochastic causes like fretting has been developed for cases where limited inspection data are available. Additionally, a new quantitative assessment methodology has been developed for the accumulation of environment-related flaws, caused e.g. by corrosion in steam generator tubes. The method, which combines deterministic information regarding flaw initiation and growth with stochastic elements connected to environmental aspects, requires only knowledge of the experimental flaw accumulation history. The method, combining both flaw types, provides a complete description of the flaw accumulation, and there are several possible uses of the method. It can be used to evaluate the total life expectancy of the steam generator, and simple statistically defined plugging criteria can be established based on flaw behaviour. This way the inspection interval and inspection coverage can be optimized with respect to allowable flaws, and the method can recognize flaw-type subsets requiring more frequent inspection intervals. The method can also be used to develop statistically realistic safety factors accounting for uncertainties in inspection flaw sizing and detection. The statistical assessment method has been shown to be robust and insensitive to different assessments of plugged tubes. Because the procedure is re-calibrated after each inspection, it reacts effectively to possible changes in the steam generator environment. Validation of the assessment method is provided for real steam generators, both in the case of stochastic damage and in the case of environment-related flaws. (authors)

  11. On detection and assessment of statistical significance of Genomic Islands

    Directory of Open Access Journals (Sweden)

    Chaudhuri Probal

    2008-04-01

    Full Text Available Abstract Background Many of the available methods for detecting Genomic Islands (GIs) in prokaryotic genomes use markers such as transposons, proximal tRNAs, flanking repeats etc., or they use other supervised techniques requiring training datasets. Most of these methods are primarily based on the biases in GC content or codon and amino acid usage of the islands. However, these methods either do not use any formal statistical test of significance or use statistical tests for which the critical values and the P-values are not adequately justified. We propose a method, which is unsupervised in nature and uses Monte-Carlo statistical tests based on randomly selected segments of a chromosome. Such tests are supported by precise statistical distribution theory, and consequently, the resulting P-values are quite reliable for making the decision. Results Our algorithm (named Design-Island, an acronym for Detection of Statistically Significant Genomic Island) runs in two phases. Some 'putative GIs' are identified in the first phase, and those are refined into smaller segments containing horizontally acquired genes in the refinement phase. This method is applied to the Salmonella typhi CT18 genome, leading to the discovery of several new pathogenicity, antibiotic resistance and metabolic islands that were missed by earlier methods. Many of these islands contain mobile genetic elements like phage-mediated genes, transposons, integrase and IS elements, confirming their horizontal acquirement. Conclusion The proposed method is based on statistical tests supported by precise distribution theory and reliable P-values, along with a technique for visualizing statistically significant islands. The performance of our method is better than many other well known methods in terms of their sensitivity and accuracy, and in terms of specificity, it is comparable to other methods.
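
    The essence of the Monte Carlo test described above (compare a window against randomly selected chromosome segments of the same length) can be sketched generically as follows; the GC-content statistic, window length and toy chromosome are illustrative stand-ins, not the published Design-Island algorithm.

```python
import random

random.seed(0)

def gc_content(seq):
    """Fraction of G/C bases in a sequence."""
    return sum(base in "GC" for base in seq) / len(seq)

def monte_carlo_p(genome, start, length, n_rand=1000):
    """Monte Carlo P-value: how unusual is the window's GC content relative to
    randomly placed segments of the same length from the same chromosome?"""
    genome_gc = gc_content(genome)
    obs = abs(gc_content(genome[start:start + length]) - genome_gc)
    hits = 0
    for _ in range(n_rand):
        s = random.randrange(0, len(genome) - length)
        if abs(gc_content(genome[s:s + length]) - genome_gc) >= obs:
            hits += 1
    return (hits + 1) / (n_rand + 1)   # add-one correction keeps the P-value valid

# Toy chromosome: AT-rich background with a GC-rich 2 kb insert at position 25000
bg = random.choices("ACGT", weights=[0.35, 0.15, 0.15, 0.35], k=50_000)
island = random.choices("ACGT", weights=[0.15, 0.35, 0.35, 0.15], k=2_000)
genome = "".join(bg[:25_000] + island + bg[25_000:])

print(monte_carlo_p(genome, start=25_000, length=2_000))   # small P-value: a putative island
print(monte_carlo_p(genome, start=5_000, length=2_000))    # background window: large P-value
```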

  12. Criteria for the assessment of fusion power

    International Nuclear Information System (INIS)

    Sweet, Colin.

    1989-01-01

    Fusion power requires an exceptionally long development time and its future depends on the changing perspectives society uses to evaluate resources in the long term. For 40 years fusion technology developed within a decision making context dominated by technical-political interests, and characterized by a bias towards overoptimism about the future. That is now changing. This article contends that we are still a long way from making rational assessments of large technological projects. However, feasibility for fusion will have to be tested by social criteria at least as important as those used for scientific feasibility. (author)

  13. Science assessment of fusion power plant

    International Nuclear Information System (INIS)

    Nagai, Toru; Shimazu, Yasuo

    1984-01-01

    A concept of SCIENCE ASSESSMENT (SA) is proposed to support a research program of so-called big science. The SA system should be established before the demonstration reactor is realized, and the system is classified into four categories: (1) Resource Economy Assessment (REA) (cost evaluation and availability of rare resource materials), (2) Risk Assessment (RA) (structural safety during operation and accidents), (3) Environmental Assessment (EA) (adaptability to environments), and (4) Socio-Political Assessment (SPA) (from local public acceptance to national policy acceptance). Here, REA of the published conceptual designs of commercial fusion power plants (most of them TOKAMAKs) is carried out as the first step. The energy analysis method is employed because the final goal of a fusion plant is to supply energy. The evaluation index is the energy ratio (= output/input). A computer code for energy analysis was developed, for which the material inventory table from the conceptual design and a database of energy intensities (= energy required to obtain a unit amount of material) were prepared. (Nogami, K.)

  14. Statistical power of model selection strategies for genome-wide association studies.

    Directory of Open Access Journals (Sweden)

    Zheyang Wu

    2009-07-01

    Full Text Available Genome-wide association studies (GWAS aim to identify genetic variants related to diseases by examining the associations between phenotypes and hundreds of thousands of genotyped markers. Because many genes are potentially involved in common diseases and a large number of markers are analyzed, it is crucial to devise an effective strategy to identify truly associated variants that have individual and/or interactive effects, while controlling false positives at the desired level. Although a number of model selection methods have been proposed in the literature, including marginal search, exhaustive search, and forward search, their relative performance has only been evaluated through limited simulations due to the lack of an analytical approach to calculating the power of these methods. This article develops a novel statistical approach for power calculation, derives accurate formulas for the power of different model selection strategies, and then uses the formulas to evaluate and compare these strategies in genetic model spaces. In contrast to previous studies, our theoretical framework allows for random genotypes, correlations among test statistics, and a false-positive control based on GWAS practice. After the accuracy of our analytical results is validated through simulations, they are utilized to systematically evaluate and compare the performance of these strategies in a wide class of genetic models. For a specific genetic model, our results clearly reveal how different factors, such as effect size, allele frequency, and interaction, jointly affect the statistical power of each strategy. An example is provided for the application of our approach to empirical research. The statistical approach used in our derivations is general and can be employed to address the model selection problems in other random predictor settings. We have developed an R package markerSearchPower to implement our formulas, which can be downloaded from the

  15. An investigation of the statistical power of neutrality tests based on comparative and population genetic data

    DEFF Research Database (Denmark)

    Zhai, Weiwei; Nielsen, Rasmus; Slatkin, Montgomery

    2009-01-01

    In this report, we investigate the statistical power of several tests of selective neutrality based on patterns of genetic diversity within and between species. The goal is to compare tests based solely on population genetic data with tests using comparative data or a combination of comparative...... and population genetic data. We show that in the presence of repeated selective sweeps on relatively neutral background, tests based on the d(N)/d(S) ratios in comparative data almost always have more power to detect selection than tests based on population genetic data, even if the overall level of divergence...... selection. The Hudson-Kreitman-Aguadé test is the most powerful test for detecting positive selection among the population genetic tests investigated, whereas McDonald-Kreitman test typically has more power to detect negative selection. We discuss our findings in the light of the discordant results obtained...

  16. A Note on Comparing the Power of Test Statistics at Low Significance Levels.

    Science.gov (United States)

    Morris, Nathan; Elston, Robert

    2011-01-01

    It is an obvious fact that the power of a test statistic is dependent upon the significance (alpha) level at which the test is performed. It is perhaps a less obvious fact that the relative performance of two statistics in terms of power is also a function of the alpha level. Through numerous personal discussions, we have noted that even some competent statisticians have the mistaken intuition that relative power comparisons at traditional levels such as α = 0.05 will be roughly similar to relative power comparisons at very low levels, such as the level α = 5 × 10⁻⁸, which is commonly used in genome-wide association studies. In this brief note, we demonstrate that this notion is in fact quite wrong, especially with respect to comparing tests with differing degrees of freedom. In fact, at very low alpha levels the cost of additional degrees of freedom is often comparatively low. Thus we recommend that statisticians exercise caution when interpreting the results of power comparison studies which use alpha levels that will not be used in practice.
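
    The point can be checked with a generic noncentral chi-square calculation: compare the noncentrality (equivalently, sample size) that 1-df and 2-df tests need for a given power at the two alpha levels. The target power below is an illustrative choice, not a value from the note.

```python
from scipy.stats import chi2, ncx2
from scipy.optimize import brentq

def required_ncp(df, alpha, power=0.8):
    """Noncentrality a chi-square test with df degrees of freedom needs to reach
    the requested power at significance level alpha (ncp scales with sample size)."""
    crit = chi2.ppf(1 - alpha, df)
    return brentq(lambda ncp: ncx2.sf(crit, df, ncp) - power, 1e-6, 1e4)

for alpha in (0.05, 5e-8):
    n1, n2 = required_ncp(1, alpha), required_ncp(2, alpha)
    print(f"alpha={alpha:g}: required ncp 1 df={n1:.1f}, 2 df={n2:.1f}, inflation={n2 / n1:.2f}")
# the relative sample-size inflation caused by the extra degree of freedom
# is much smaller at alpha = 5e-8 than at alpha = 0.05
```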

  17. CAPABILITY ASSESSMENT OF MEASURING EQUIPMENT USING STATISTIC METHOD

    Directory of Open Access Journals (Sweden)

    Pavel POLÁK

    2014-10-01

    Full Text Available Capability assessment of the measuring device is one of the methods of process quality control. Only if the measuring device is capable can the capability of the measurement, and consequently of the production process, be assessed. This paper deals with the assessment of the capability of the measuring device using the indices Cg and Cgk.
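
    Definitions of Cg and Cgk vary between measurement-system-analysis guidelines (the share of the tolerance used and the number of standard deviations differ); the sketch below implements one common convention with those constants exposed as parameters, and the measurement data are invented for illustration.

```python
import numpy as np

def cg_cgk(measurements, reference, tol, k=0.2, L=4.0):
    """Measuring-device capability indices under one common convention:
        Cg  = (k * tol) / (L * s_g)
        Cgk = (k/2 * tol - |bias|) / (L/2 * s_g)
    where s_g and bias come from repeated measurements of a reference standard.
    The constants k (share of the tolerance) and L (number of sigmas) differ
    between guidelines, so they are left as parameters here."""
    x = np.asarray(measurements, float)
    s_g = x.std(ddof=1)
    bias = abs(x.mean() - reference)
    return (k * tol) / (L * s_g), (k / 2 * tol - bias) / (L / 2 * s_g)

# Invented example: 25 repeat measurements of a 10.000 mm reference, tolerance 0.2 mm
rng = np.random.default_rng(3)
meas = 10.002 + 0.004 * rng.standard_normal(25)
cg, cgk = cg_cgk(meas, reference=10.000, tol=0.2)
print(round(cg, 2), round(cgk, 2))   # values >= 1.33 are a commonly used acceptance criterion
```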

  18. Using Pre-Statistical Analysis to Streamline Monitoring Assessments

    International Nuclear Information System (INIS)

    Reed, J.K.

    1999-01-01

    A variety of statistical methods exist to aid evaluation of groundwater quality and subsequent decision making in regulatory programs. These methods are applied because of the large temporal and spatial extrapolations commonly applied to these data. In short, statistical conclusions often serve as a surrogate for knowledge. However, facilities with mature monitoring programs that have generated abundant data have inherently less uncertainty because of the sheer quantity of analytical results. In these cases, statistical tests can be less important, and ''expert'' data analysis should assume an important screening role. The WSRC Environmental Protection Department, working with the General Separations Area BSRI Environmental Restoration project team, has developed a method for an Integrated Hydrogeological Analysis (IHA) of historical water quality data from the F and H Seepage Basins groundwater remediation project. The IHA combines common-sense analytical techniques and a GIS presentation that force direct interactive evaluation of the data. The IHA can perform multiple data analysis tasks required by the RCRA permit. These include: (1) Development of a groundwater quality baseline prior to remediation startup, (2) Targeting of constituents for removal from the RCRA GWPS, (3) Targeting of constituents for removal from the UIC permit, (4) Targeting of constituents for reduced, (5) Targeting of monitoring wells not producing representative samples, (6) Reduction in statistical evaluation, and (7) Identification of contamination from other facilities.

  19. Environmental assessment of submarine power cables

    Energy Technology Data Exchange (ETDEWEB)

    Isus, Daniel; Martinez, Juan D. [Grupo General Cable Sistemas, S.A., 08560-Manlleu, Barcelona (Spain); Arteche, Amaya; Del Rio, Carmen; Madina, Virginia [Tecnalia Research and Innovation, 20009 San Sebastian (Spain)

    2011-03-15

    Extensive analyses conducted by the European Community revealed that offshore wind energy has relatively benign effects on the marine environment by comparison with other forms of electric power generation [1]. However, the materials employed in offshore wind power farms undergo major changes when confined to the marine environment under extreme conditions (saline medium, hydrostatic pressure...), which can produce an important corrosion effect. This phenomenon can affect, on the one hand, the material from the structural viewpoint and, on the other hand, the marine environment. In this sense, to better understand the environmental impacts of generating electricity from offshore wind energy, this study evaluated the life cycle assessment for some new designs of submarine power cables developed by General Cable. To achieve this goal, three approaches were carried out: leaching tests, eco-toxicity tests and Life Cycle Assessment (LCA) methodologies, all aimed at obtaining quantitative data for the environmental assessment of the selected submarine cables. LCA is a method used to assess environmental aspects and potential impacts of a product or activity. LCA does not include financial and social factors, which means that the results of an LCA cannot exclusively form the basis for assessment of a product's sustainability. The leaching test results allowed the conclusion that the pH of seawater was not significantly changed by the presence of submarine three-core cables. Although pH was slightly higher in the case of the broken cable, the values were nearly equal. Concerning the heavy metals which could migrate to the aquatic medium, there were significant differences between the two scenarios. The leaching of zinc is the major environmental concern during undersea operation of undamaged cables, whereas the fully sectioned three-core cable produced the migration of significant quantities of copper and iron apart from the zinc migrated from the galvanized steel. Thus, the tar

  20. Environmental assessment of submarine power cables

    International Nuclear Information System (INIS)

    Isus, Daniel; Martinez, Juan D.; Arteche, Amaya; Del Rio, Carmen; Madina, Virginia

    2011-03-01

    Extensive analyses conducted by the European Community revealed that offshore wind energy has relatively benign effects on the marine environment by comparison with other forms of electric power generation [1]. However, the materials employed in offshore wind power farms undergo major changes when confined to the marine environment under extreme conditions (saline medium, hydrostatic pressure...), which can produce an important corrosion effect. This phenomenon can affect, on the one hand, the material from the structural viewpoint and, on the other hand, the marine environment. In this sense, to better understand the environmental impacts of generating electricity from offshore wind energy, this study evaluated the life cycle assessment for some new designs of submarine power cables developed by General Cable. To achieve this goal, three approaches were carried out: leaching tests, eco-toxicity tests and Life Cycle Assessment (LCA) methodologies, all aimed at obtaining quantitative data for the environmental assessment of the selected submarine cables. LCA is a method used to assess environmental aspects and potential impacts of a product or activity. LCA does not include financial and social factors, which means that the results of an LCA cannot exclusively form the basis for assessment of a product's sustainability. The leaching test results allowed the conclusion that the pH of seawater was not significantly changed by the presence of submarine three-core cables. Although pH was slightly higher in the case of the broken cable, the values were nearly equal. Concerning the heavy metals which could migrate to the aquatic medium, there were significant differences between the two scenarios. The leaching of zinc is the major environmental concern during undersea operation of undamaged cables, whereas the fully sectioned three-core cable produced the migration of significant quantities of copper and iron apart from the zinc migrated from the galvanized steel. Thus, the tar

  1. Nordel - Availability statistics for thermal power plants 1995. (Denmark, Finland, Sweden)

    International Nuclear Information System (INIS)

    1996-01-01

    The power companies of Denmark, Finland and Sweden have agreed on almost identical procedures for the recording and analysing of data describing the availability of power producing units over a certain capacity. Since 1975 the data for all three countries have been summarized and published in a joint report. The purpose of this report is to present some basic information about the operation of power producing units in the three countries. Referring to the report, companies or bodies will be able to exchange more detailed information with other companies or bodies in any of the countries. The report includes power producing units using fossil fuels, nuclear power plants and gas turbines. The information is presented separately for each country with a joint NORDEL statistics for units using fossil fuels, arranged in separate groups according to the type of fossil fuel which is used. The grouping of power producing units into classes of capacity has been made in accordance with the classification adopted by UNIPEDE/WEC. The definitions in NORDEL's 'Tillgaenglighetsbegrepp foer vaermekraft' ('The Concept of Availability for Thermal Power'), September 1977, are used in this report. The basic data for the availability are in accordance with the recommendations of UNIPEDE/WEC. (author)

  2. Simulating European wind power generation applying statistical downscaling to reanalysis data

    International Nuclear Information System (INIS)

    González-Aparicio, I.; Monforti, F.; Volker, P.; Zucker, A.; Careri, F.; Huld, T.; Badger, J.

    2017-01-01

    Highlights: •Wind speed spatial resolution highly influences calculated wind power peaks and ramps. •Reduction of wind power generation uncertainties using statistical downscaling. •Publicly available dataset of wind power generation hourly time series at NUTS2. -- Abstract: The growing share of electricity production from solar and, mainly, wind resources constantly increases the stochastic nature of the power system. Modelling a high share of renewable energy sources – and in particular wind power – crucially depends on an adequate representation of the intermittency and characteristics of the wind resource, which in turn is related to the accuracy of the approach used to convert wind speed data into power values. One of the main factors contributing to the uncertainty in these conversion methods is the selection of the spatial resolution. Although numerical weather prediction models can simulate wind speeds at higher spatial resolution (up to 1 × 1 km) than a reanalysis (generally ranging from about 25 km to 70 km), they require high computational resources and massive storage systems; the most common alternative is therefore to use reanalysis data. However, local wind features may not be captured by a reanalysis, which can translate into misinterpretations of wind power peaks, ramping capacities, the behaviour of power prices, and bidding strategies for the electricity market. This study contributes to understanding what is captured by wind speed datasets of different spatial resolutions, the importance of using high-resolution data for the conversion into power, and the implications for power system analyses. A methodology is proposed to increase the spatial resolution of a reanalysis. This study presents an open-access renewable generation time series dataset for the EU-28 and neighbouring countries at hourly intervals and at different geographical aggregation levels (country, bidding zone and administrative

  3. Production-distribution of electric power in France: 1997-98 statistical data

    International Nuclear Information System (INIS)

    1999-01-01

    This document was prepared using the annual survey carried out by the French directorate of gas, electricity and coal (Digec). It brings together the main statistical data about the production, transport and consumption of electric power in France: 1997 and 1998 balance sheets, foreign exchanges, long-term evolutions, production by energy source, and consumption in the different departments and regions. (J.S.)

  4. A Statistical Approach to Planning Reserved Electric Power for Railway Infrastructure Administration

    OpenAIRE

    Brabec, M. (Marek); Pelikán, E. (Emil); Konár, O. (Ondřej); Kasanický, I. (Ivan); Juruš, P. (Pavel); Sadil, J.; Blažek, P.

    2013-01-01

    One of the requirements on railway infrastructure administration is to provide electricity for day-to-day operation of railways. We propose a statistically based approach for the estimation of maximum 15-minute power within a calendar month for a given region. This quantity serves as a basis of contracts between railway infrastructure administration and electricity distribution system operator. We show that optimization of the prediction is possible, based on underlying loss function deriv...

  5. Statistics for products of traces of high powers of the frobenius class of hyperelliptic curves

    OpenAIRE

    Roditty-Gershon, Edva

    2011-01-01

    We study the averages of products of traces of high powers of the Frobenius class of hyperelliptic curves of genus g over a fixed finite field. We show that for increasing genus g, the limiting expectation of these products equals the expectation when the curve varies over the unitary symplectic group USp(2g). We also consider the scaling limit of linear statistics for eigenphases of the Frobenius class of hyperelliptic curves, and show that their first few moments are Gaussian.

  6. Development of nuclear power plant online monitoring system using statistical quality control

    International Nuclear Information System (INIS)

    An, Sang Ha

    2006-02-01

    Statistical quality control (SQC) techniques have been applied to many aspects of industrial engineering. An application to nuclear power plant maintenance and control is presented here that can greatly improve plant safety. As a demonstration of such an approach, a specific system is analyzed: the reactor coolant pumps (RCP) and the fouling resistance of the heat exchanger. This research uses Shewhart X-bar and R charts, cumulative sum (CUSUM) charts, and the sequential probability ratio test (SPRT) to analyze whether the process is in a state of statistical control. We also developed a Control Chart Analyzer (CCA) to support these analyses and to flag out-of-control conditions in the process. The analysis shows that statistical process control methods can be applied as an early warning system capable of identifying significant equipment problems well in advance of traditional control room alarm indicators. Such a system would provide operators with enough time to respond to possible emergency situations and thus improve plant safety and reliability.
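
    As a rough illustration of the monitoring idea described above (not the CCA tool developed in the thesis), the sketch below computes Shewhart X-bar limits and a tabular CUSUM for a synthetic, slowly drifting signal such as a fouling resistance; the control-chart constants, drift magnitude and alarm thresholds are illustrative assumptions.

```python
import numpy as np

def xbar_limits(subgroup_means, subgroup_ranges, n):
    """Shewhart X-bar control limits from subgroup means and ranges
    (A2 constants tabulated here for subgroup sizes 2-5 only)."""
    A2 = {2: 1.880, 3: 1.023, 4: 0.729, 5: 0.577}[n]
    xbarbar = np.mean(subgroup_means)
    rbar = np.mean(subgroup_ranges)
    return xbarbar - A2 * rbar, xbarbar + A2 * rbar

def tabular_cusum(x, target, sigma, k=0.5, h=5.0):
    """One-sided tabular CUSUM in sigma units; returns indices where C+ or C- exceeds h."""
    c_plus = c_minus = 0.0
    alarms = []
    for i, xi in enumerate(x):
        z = (xi - target) / sigma
        c_plus = max(0.0, c_plus + z - k)
        c_minus = max(0.0, c_minus - z - k)
        if c_plus > h or c_minus > h:
            alarms.append(i)
    return alarms

# Synthetic signal: 200 in-control samples followed by a slow upward drift,
# loosely mimicking a fouling-resistance trend.
rng = np.random.default_rng(0)
signal = np.concatenate([rng.normal(1.0, 0.05, 200),
                         rng.normal(1.0, 0.05, 200) + np.linspace(0.0, 0.2, 200)])

subgroups = signal[:200].reshape(40, 5)          # in-control data only
lcl, ucl = xbar_limits(subgroups.mean(axis=1), np.ptp(subgroups, axis=1), n=5)
print("X-bar limits:", lcl, ucl)
print("first CUSUM alarms at indices:", tabular_cusum(signal, 1.0, 0.05)[:3])
```

    In this toy setting the CUSUM typically signals well inside the drift region, which is the early-warning behaviour the abstract describes.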

  7. Statistical measurement of power spectrum density of large aperture optical component

    International Nuclear Information System (INIS)

    Xu Jiancheng; Xu Qiao; Chai Liqun

    2010-01-01

    According to the requirements of ICF, a method based on statistical theory has been proposed to measure the power spectral density (PSD) of large-aperture optical components. The method breaks the large-aperture wavefront into small regions and obtains the PSD of the large-aperture wavefront by weighted averaging of the PSDs of the regions, where the weight factor is each region's area. Simulation and experiment demonstrate the effectiveness of the proposed method. They also show that the PSDs of the large-aperture wavefront obtained by the statistical method and by the sub-aperture stitching method agree well when the number of small regions is no less than 8 x 8. The statistical method is not sensitive to translation stage errors and environmental instabilities, and is thus appropriate for PSD measurement during the process of optical fabrication. (authors)
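
    A minimal numerical sketch of the area-weighted averaging step described above (not the authors' metrology code): per-region PSDs are estimated from a 2-D FFT and combined with each region's area as the weight. The grid size, sampling step and synthetic wavefront are assumptions made only for illustration.

```python
import numpy as np

def region_psd(wavefront, dx):
    """Rough 1-D-collapsed power spectral density of a small wavefront region.

    For simplicity this sketch averages the 2-D periodogram over one frequency
    axis; real metrology codes often add windowing and detrending as well."""
    ny, nx = wavefront.shape
    w = wavefront - wavefront.mean()                 # remove piston
    spec = np.abs(np.fft.fft2(w))**2 * dx**2 / (nx * ny)
    return spec.mean(axis=0)                         # collapse to a 1-D PSD

def weighted_psd(regions, areas, dx):
    """Combine per-region PSDs into a full-aperture estimate, weighting by area."""
    psds = np.array([region_psd(r, dx) for r in regions])
    w = np.asarray(areas, dtype=float)
    return (psds * w[:, None]).sum(axis=0) / w.sum()

# Example: an 8 x 8 grid of 64 x 64-pixel regions cut from a synthetic wavefront
rng = np.random.default_rng(1)
full = rng.normal(0.0, 1e-9, (512, 512))             # metres of wavefront error
regions = [full[i:i+64, j:j+64] for i in range(0, 512, 64) for j in range(0, 512, 64)]
psd = weighted_psd(regions, areas=[64 * 64] * len(regions), dx=1e-3)
```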

  8. Application of extended statistical combination of uncertainties methodology for digital nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    In, Wang Ki; Uh, Keun Sun; Chul, Kim Heui [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-02-01

    A technically more direct statistical combination of uncertainties methodology, the extended SCU (XSCU), was applied to statistically combine the uncertainties associated with the DNBR alarm setpoint and the DNBR trip setpoint of digital nuclear power plants. The modified SCU (MSCU) methodology is currently used as the USNRC-approved design methodology to perform the same function. In this report, the MSCU and XSCU methodologies were compared in terms of the total uncertainties and the net margins to the DNBR alarm and trip setpoints. The MSCU methodology resulted in small total penalties owing to a significantly negative bias of quite large magnitude, whereas the XSCU methodology gave virtually unbiased total uncertainties. The net margins to the DNBR alarm and trip setpoints obtained by the MSCU methodology agree with those obtained by the XSCU methodology within statistical variations. (Author) 12 refs., 17 figs., 5 tabs.

  9. Powerful Inference With the D-Statistic on Low-Coverage Whole-Genome Data

    DEFF Research Database (Denmark)

    Soraggi, Samuele; Wiuf, Carsten; Albrechtsen, Anders

    2018-01-01

    The detection of ancient gene flow between human populations is an important issue in population genetics. A common tool for detecting ancient admixture events is the D-statistic. The D-statistic is based on the hypothesis of a genetic relationship that involves four populations, whose correctness...... is assessed by evaluating specific coincidences of alleles between the groups. When working with high-throughput sequencing data, calling genotypes accurately is not always possible; therefore the D-statistic currently samples a single base from the reads of one individual per population. This implies ignoring...... much of the information in the data, an issue especially striking in the case of ancient genomes. We provide a significant improvement to overcome the problems of the D-statistic by considering all reads from multiple individuals in each population. We also apply type-specific error correction...
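
    For orientation, a minimal sketch of the classic single-sample D-statistic with a block-jackknife Z-score is given below; it does not implement the article's extension to multiple individuals and all reads, and the simulated ABBA/BABA site patterns are placeholders.

```python
import numpy as np

def d_statistic(abba, baba):
    """Classic ABBA-BABA D statistic from per-site pattern counts."""
    abba, baba = np.sum(abba), np.sum(baba)
    return (abba - baba) / (abba + baba)

def block_jackknife_z(abba, baba, n_blocks=50):
    """Z-score for D via a delete-one block jackknife over contiguous blocks."""
    abba, baba = np.asarray(abba, float), np.asarray(baba, float)
    d_all = d_statistic(abba, baba)
    blocks = np.array_split(np.arange(abba.size), n_blocks)
    d_loo = np.array([d_statistic(np.delete(abba, b), np.delete(baba, b)) for b in blocks])
    var = (n_blocks - 1) / n_blocks * np.sum((d_loo - d_loo.mean()) ** 2)
    return d_all / np.sqrt(var)

# Example with simulated per-site ABBA/BABA indicator counts
rng = np.random.default_rng(2)
abba = rng.binomial(1, 0.012, 200_000)
baba = rng.binomial(1, 0.010, 200_000)
print(d_statistic(abba, baba), block_jackknife_z(abba, baba))
```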

  10. Statistical and RBF NN models : providing forecasts and risk assessment

    OpenAIRE

    Marček, Milan

    2009-01-01

    Forecast accuracy of economic and financial processes is a popular measure for quantifying the risk in decision making. In this paper, we develop forecasting models based on statistical (stochastic) methods, sometimes called hard computing, and on a soft method using granular computing. We consider the accuracy of forecasting models as a measure for risk evaluation. It is found that the risk estimation process based on soft methods is simplified and less critical to the question w...

  11. Comparison and validation of statistical methods for predicting power outage durations in the event of hurricanes.

    Science.gov (United States)

    Nateghi, Roshanak; Guikema, Seth D; Quiring, Steven M

    2011-12-01

    This article compares statistical methods for modeling power outage durations during hurricanes and examines the predictive accuracy of these methods. Being able to make accurate predictions of power outage durations is valuable because the information can be used by utility companies to plan their restoration efforts more efficiently. This information can also help inform customers and public agencies of the expected outage times, enabling better collective response planning, and coordination of restoration efforts for other critical infrastructures that depend on electricity. In the long run, outage duration estimates for future storm scenarios may help utilities and public agencies better allocate risk management resources to balance the disruption from hurricanes with the cost of hardening power systems. We compare the out-of-sample predictive accuracy of five distinct statistical models for estimating power outage duration times caused by Hurricane Ivan in 2004. The methods compared include both regression models (accelerated failure time (AFT) and Cox proportional hazard models (Cox PH)) and data mining techniques (regression trees, Bayesian additive regression trees (BART), and multivariate additive regression splines). We then validate our models against two other hurricanes. Our results indicate that BART yields the best prediction accuracy and that it is possible to predict outage durations with reasonable accuracy. © 2011 Society for Risk Analysis.
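
    As a hedged illustration of one of the regression approaches compared in the article (the Cox proportional hazards model), the sketch below fits synthetic outage durations with the third-party lifelines package and scores out-of-sample error; the covariates, data and package choice are assumptions, not the authors' actual pipeline.

```python
# Requires: pip install lifelines
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 500
df = pd.DataFrame({
    "wind_speed": rng.uniform(20, 60, n),           # hypothetical covariates
    "tree_density": rng.uniform(0, 1, n),
})
# Synthetic outage durations (hours) that lengthen with both covariates
df["duration"] = rng.exponential(5 + 0.5 * df.wind_speed + 20 * df.tree_density)
df["observed"] = 1                                   # no censoring in this toy set

train, test = df.iloc[:400], df.iloc[400:]
cph = CoxPHFitter()
cph.fit(train, duration_col="duration", event_col="observed")
pred = cph.predict_expectation(test)                 # expected durations
mae = np.mean(np.abs(np.ravel(pred.values) - test.duration.values))
print(cph.summary[["coef", "p"]])
print(f"out-of-sample MAE: {mae:.2f} h")
```

    Swapping in the other estimators compared in the article (AFT, regression trees, BART, MARS) would change only the model-fitting call, not the train/test protocol sketched here.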

  12. Determinants of Judgments of Explanatory Power: Credibility, Generality, and Statistical Relevance

    Science.gov (United States)

    Colombo, Matteo; Bucher, Leandra; Sprenger, Jan

    2017-01-01

    Explanation is a central concept in human psychology. Drawing upon philosophical theories of explanation, psychologists have recently begun to examine the relationship between explanation, probability and causality. Our study advances this growing literature at the intersection of psychology and philosophy of science by systematically investigating how judgments of explanatory power are affected by (i) the prior credibility of an explanatory hypothesis, (ii) the causal framing of the hypothesis, (iii) the perceived generalizability of the explanation, and (iv) the relation of statistical relevance between hypothesis and evidence. Collectively, the results of our five experiments support the hypothesis that the prior credibility of a causal explanation plays a central role in explanatory reasoning: first, because of the presence of strong main effects on judgments of explanatory power, and second, because of the gate-keeping role it has for other factors. Highly credible explanations are not susceptible to causal framing effects, but they are sensitive to the effects of normatively relevant factors: the generalizability of an explanation, and its statistical relevance for the evidence. These results advance current literature in the philosophy and psychology of explanation in three ways. First, they yield a more nuanced understanding of the determinants of judgments of explanatory power, and the interaction between these factors. Second, they show the close relationship between prior beliefs and explanatory power. Third, they elucidate the nature of abductive reasoning. PMID:28928679

  13. Statistical modeling of an integrated boiler for coal fired thermal power plant

    Directory of Open Access Journals (Sweden)

    Sreepradha Chandrasekharan

    2017-06-01

    Full Text Available Coal-fired thermal power plants play a major role in power production worldwide because coal is available in abundance. Many of the existing power plants are based on subcritical technology, which can produce power with an efficiency of around 33%, whereas newer plants are built on either supercritical or ultra-supercritical technology, whose efficiency can be up to 50%. The main objective of this work is to enhance the efficiency of existing subcritical power plants to compensate for the increasing demand. To achieve this objective, statistical models of the boiler units, namely the economizer, drum and superheater, are first developed. The effectiveness of the developed models is tested using analysis methods such as R2 analysis and ANOVA (analysis of variance). The dependence of the process variable (temperature) on the different manipulated variables is analyzed in the paper. Validations of the models are provided with their error analysis. Response surface methodology (RSM), supported by DOE (design of experiments), is implemented to optimize the operating parameters. The individual models, along with the integrated model, are used to study and design predictive control of the coal-fired thermal power plant. Keywords: Chemical engineering, Applied mathematics

  14. Statistical modeling of an integrated boiler for coal fired thermal power plant.

    Science.gov (United States)

    Chandrasekharan, Sreepradha; Panda, Rames Chandra; Swaminathan, Bhuvaneswari Natrajan

    2017-06-01

    Coal-fired thermal power plants play a major role in power production worldwide because coal is available in abundance. Many of the existing power plants are based on subcritical technology, which can produce power with an efficiency of around 33%, whereas newer plants are built on either supercritical or ultra-supercritical technology, whose efficiency can be up to 50%. The main objective of this work is to enhance the efficiency of existing subcritical power plants to compensate for the increasing demand. To achieve this objective, statistical models of the boiler units, namely the economizer, drum and superheater, are first developed. The effectiveness of the developed models is tested using analysis methods such as R2 analysis and ANOVA (analysis of variance). The dependence of the process variable (temperature) on the different manipulated variables is analyzed in the paper. Validations of the models are provided with their error analysis. Response surface methodology (RSM), supported by DOE (design of experiments), is implemented to optimize the operating parameters. The individual models, along with the integrated model, are used to study and design predictive control of the coal-fired thermal power plant.
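
    A minimal sketch of the kind of model checking the paper describes (R2 and an ANOVA table for a regression of a process temperature on manipulated variables), using statsmodels on synthetic data; the variable names, ranges and coefficients are illustrative assumptions rather than the paper's boiler data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 200
df = pd.DataFrame({
    "fuel_flow": rng.uniform(0.8, 1.2, n),       # normalised coal feed rate
    "air_flow": rng.uniform(0.9, 1.1, n),        # normalised combustion air
    "spray_flow": rng.uniform(0.0, 0.2, n),      # attemperator spray fraction
})
# Synthetic superheater outlet temperature responding to the manipulated variables
df["steam_temp"] = (540 + 35 * df.fuel_flow + 12 * df.air_flow
                    - 60 * df.spray_flow + rng.normal(0, 2.0, n))

model = smf.ols("steam_temp ~ fuel_flow + air_flow + spray_flow", data=df).fit()
print("R^2:", model.rsquared)
print(sm.stats.anova_lm(model, typ=2))           # ANOVA table for the fitted model
```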

  15. Statistical modeling of the power grid from a wind farm standpoint

    DEFF Research Database (Denmark)

    Farajzadehbibalan, Saber; Ramezani, Mohammad H.; Nielsen, Peter

    2017-01-01

    In this study, we derive a statistical model of a power grid from the wind farm's standpoint based on dynamic principal component analysis. The main advantages of our model compared to the previously developed models are twofold. Firstly, our proposed model benefits from logged data of an offshore...... wind farm over several years which results in the development of a useful model for practical purposes. Secondly, the derived model is computationally inexpensive. Considering an arbitrary wind turbine generator, we show that the behavior of the power grid at the connection point can be represented...... by 4 out of 9 registered variables, i.e. 3-phase voltages, 3-phase currents, frequency, and generated active and reactive powers. We further prove that the dynamic nature of the system can be optimally captured by a time lag shift of two samples. To extend the derived model of a wind turbine generator...
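
    The sketch below illustrates the basic mechanics of dynamic PCA with a time lag of two samples, as mentioned in the abstract: each observation is augmented with its two previous samples before an ordinary PCA. The nine-variable synthetic data stand in for the logged wind farm measurements and the component count of four mirrors the abstract's finding, but nothing else of the authors' model is reproduced.

```python
import numpy as np

def lagged_matrix(X, lags=2):
    """Stack X(t), X(t-1), ..., X(t-lags) column-wise (the 'dynamic' augmentation)."""
    T, p = X.shape
    cols = [X[lags - k: T - k] for k in range(lags + 1)]
    return np.hstack(cols)

def pca(X, n_components):
    """Plain PCA via SVD of the mean-centred data matrix."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:n_components].T
    explained = s[:n_components] ** 2 / np.sum(s ** 2)
    return scores, Vt[:n_components], explained

rng = np.random.default_rng(5)
X = rng.normal(size=(1000, 9))            # 9 registered variables per sample (synthetic)
Xa = lagged_matrix(X, lags=2)             # time lag shift of two samples
scores, loadings, explained = pca(Xa, n_components=4)
print(explained)                          # variance captured by the 4 components
```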

  16. Statistical interpretation of transient current power-law decay in colloidal quantum dot arrays

    Energy Technology Data Exchange (ETDEWEB)

    Sibatov, R T, E-mail: ren_sib@bk.ru [Ulyanovsk State University, 432000, 42 Leo Tolstoy Street, Ulyanovsk (Russian Federation)

    2011-08-01

    A new statistical model of the charge transport in colloidal quantum dot arrays is proposed. It takes into account Coulomb blockade forbidding multiple occupancy of nanocrystals and the influence of energetic disorder of interdot space. The model explains power-law current transients and the presence of the memory effect. The fractional differential analogue of the Ohm law is found phenomenologically for nanocrystal arrays. The model combines ideas that were considered as conflicting by other authors: the Scher-Montroll idea about the power-law distribution of waiting times in localized states for disordered semiconductors is applied taking into account Coulomb blockade; Novikov's condition about the asymptotic power-law distribution of time intervals between successful current pulses in conduction channels is fulfilled; and the carrier injection blocking predicted by Ginger and Greenham (2000 J. Appl. Phys. 87 1361) takes place.

  17. Statistical interpretation of transient current power-law decay in colloidal quantum dot arrays

    International Nuclear Information System (INIS)

    Sibatov, R T

    2011-01-01

    A new statistical model of the charge transport in colloidal quantum dot arrays is proposed. It takes into account Coulomb blockade forbidding multiple occupancy of nanocrystals and the influence of energetic disorder of interdot space. The model explains power-law current transients and the presence of the memory effect. The fractional differential analogue of the Ohm law is found phenomenologically for nanocrystal arrays. The model combines ideas that were considered as conflicting by other authors: the Scher-Montroll idea about the power-law distribution of waiting times in localized states for disordered semiconductors is applied taking into account Coulomb blockade; Novikov's condition about the asymptotic power-law distribution of time intervals between successful current pulses in conduction channels is fulfilled; and the carrier injection blocking predicted by Ginger and Greenham (2000 J. Appl. Phys. 87 1361) takes place.
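
    As a toy illustration of the Scher-Montroll ingredient mentioned above (power-law distributed waiting times in localized states), the simulation below estimates the ensemble hop rate for Pareto waiting times and checks its approximate t^(alpha-1) decay; it does not include the Coulomb-blockade or injection-blocking elements of the proposed model.

```python
import numpy as np

def transient_hop_rate(n_carriers=20000, n_hops=400, alpha=0.6, seed=6):
    """Ensemble hop rate vs. time for Pareto waiting times with tail exponent alpha.

    In the Scher-Montroll picture the pre-transit current decays roughly as
    t**(alpha - 1); this toy simulation only illustrates that shape."""
    rng = np.random.default_rng(seed)
    waits = rng.pareto(alpha, size=(n_carriers, n_hops)) + 1.0   # waiting times >= 1
    hop_times = np.cumsum(waits, axis=1).ravel()
    edges = np.logspace(0, 3, 50)
    counts, _ = np.histogram(hop_times, bins=edges)
    rate = counts / np.diff(edges)               # hops per unit time ~ transient current
    centres = np.sqrt(edges[1:] * edges[:-1])
    return centres, rate

t, rate = transient_hop_rate()
mask = (t > 10) & (t < 1000) & (rate > 0)
slope = np.polyfit(np.log(t[mask]), np.log(rate[mask]), 1)[0]
print(f"log-log slope ~ {slope:.2f} (roughly alpha - 1 = -0.4)")
```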

  18. Waste Heat to Power Market Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Elson, Amelia [ICF International, Fairfax, VA (United States); Tidball, Rick [ICF International, Fairfax, VA (United States); Hampson, Anne [ICF International, Fairfax, VA (United States)

    2015-03-01

    Waste heat to power (WHP) is the process of capturing heat discarded by an existing process and using that heat to generate electricity. In the industrial sector, waste heat streams are generated by kilns, furnaces, ovens, turbines, engines, and other equipment. In addition to processes at industrial plants, waste heat streams suitable for WHP are generated at field locations, including landfills, compressor stations, and mining sites. Waste heat streams are also produced in the residential and commercial sectors, but compared to industrial sites these waste heat streams typically have lower temperatures and much lower volumetric flow rates. The economic feasibility for WHP declines as the temperature and flow rate decline, and most WHP technologies are therefore applied in industrial markets where waste heat stream characteristics are more favorable. This report provides an assessment of the potential market for WHP in the industrial sector in the United States.

  19. A statistical assessment of zero-polarization catalogues

    Science.gov (United States)

    Clarke, D.; Naghizadeh-Khouei, J.; Simmons, J. F. L.; Stewart, B. G.

    1993-03-01

    The statistical behavior associated with polarization measurements is presented. The cumulative distribution function for measurements of unpolarized sources normalized by the measurement error is considered, and Kolmogorov tests have been applied to data which might be considered as representative of assemblies of unpolarized stars. Tinbergen's (1979, 1982) and Piirola's (1977) catalogs have been examined and reveal shortcomings, the former indicating the presence of uncorrected instrumental polarization in part of the data and both suggesting that the quoted errors are in general slightly underestimated. Citations of these catalogs as providing evidence that middle-type stars in general exhibit weak intrinsic polarizations are shown to be invalid.
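
    A small simulation of the statistical behaviour described (not the catalogue analysis itself): for a truly unpolarized source with Gaussian errors on the Stokes parameters, the error-normalized degree of polarization follows a Rayleigh distribution, which a Kolmogorov-Smirnov test can check. The error level and the size of the instrumental offset below are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(14)
sigma = 3e-4                         # assumed per-star measurement error on q and u
n_stars = 300
q = rng.normal(0.0, sigma, n_stars)
u = rng.normal(0.0, sigma, n_stars)
p_norm = np.sqrt(q**2 + u**2) / sigma

# Under the null hypothesis (unpolarized stars), p_norm ~ Rayleigh(scale=1)
print(stats.kstest(p_norm, "rayleigh"))

# An uncorrected instrumental polarization of about one sigma in q shifts the
# distribution and the test is then typically rejected, the kind of shortcoming
# the paper reports for part of the catalogue data.
p_shift = np.sqrt((q + sigma)**2 + u**2) / sigma
print(stats.kstest(p_shift, "rayleigh"))
```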

  20. Can statistical data qualify assessments of children at risk?

    DEFF Research Database (Denmark)

    Søbjerg, Lene Mosegaard; Villumsen, Anne Marie Anker; Klitbjerg-Nielsen, Christina

    Every day municipalities across Europe (and beyond) receive notifications about children at risk. The notifications come from teachers, health professionals, social workers, neighbors, or anyone else who sees a child which appears not to thrive. The assessment and validation of whether the child...... significantly from case to case. Third, the relative importance of the different risk and protection factors is complex and difficult to assess – especially when the social worker has to assess both immediate danger as well as risk of long-term failure to thrive. Internationally, different risk assessment tools...... and parents that are already registered in the municipality, such as home address and school records. A similar tool is being developed in a social work research project in Denmark. The idea is to include risk and protection factors such as information about health, school absenteeism and family circumstances......

  1. Environmental impact assessment of coal power plants in operation

    Directory of Open Access Journals (Sweden)

    Bartan Ayfer

    2017-01-01

    Full Text Available Coal power plants constitute an important component of the energy mix in many countries. However, coal power plants can cause several environmental risks, such as climate change and biodiversity loss. In this study, a tool has been proposed to calculate the environmental impact of a coal-fired thermal power plant in operation by using multi-criteria scoring and a fuzzy logic method. We take into account the following environmental parameters in our tool: CO, SO2, NOx, particulate matter, fly ash, bottom ash, the cooling water intake impact on aquatic biota, and thermal pollution. In the proposed tool, the boundaries of the fuzzy logic membership functions were established taking into account the threshold values of the environmental parameters defined in the environmental legislation. Scoring of these environmental parameters was done using statistical analysis of the environmental monitoring data of the power plant and the documented evidence obtained during site visits. The proposed method estimates each environmental impact factor level separately and then aggregates them by calculating the Environmental Impact Score (EIS). The proposed method uses environmental monitoring data and documented evidence instead of simulation models. The proposed method has been applied to 4 coal-fired power plants in operation in Turkey. The Environmental Impact Score was obtained for each power plant and their environmental performances were compared. It is expected that these environmental impact assessments will contribute to the decision-making process for environmental investments in those plants. The main advantage of the proposed method is its flexibility and ease of use.

  2. Environmental impact assessment of coal power plants in operation

    Science.gov (United States)

    Bartan, Ayfer; Kucukali, Serhat; Ar, Irfan

    2017-11-01

    Coal power plants constitute an important component of the energy mix in many countries. However, coal power plants can cause several environmental risks, such as climate change and biodiversity loss. In this study, a tool has been proposed to calculate the environmental impact of a coal-fired thermal power plant in operation by using multi-criteria scoring and a fuzzy logic method. We take into account the following environmental parameters in our tool: CO, SO2, NOx, particulate matter, fly ash, bottom ash, the cooling water intake impact on aquatic biota, and thermal pollution. In the proposed tool, the boundaries of the fuzzy logic membership functions were established taking into account the threshold values of the environmental parameters defined in the environmental legislation. Scoring of these environmental parameters was done using statistical analysis of the environmental monitoring data of the power plant and the documented evidence obtained during site visits. The proposed method estimates each environmental impact factor level separately and then aggregates them by calculating the Environmental Impact Score (EIS). The proposed method uses environmental monitoring data and documented evidence instead of simulation models. The proposed method has been applied to 4 coal-fired power plants in operation in Turkey. The Environmental Impact Score was obtained for each power plant and their environmental performances were compared. It is expected that these environmental impact assessments will contribute to the decision-making process for environmental investments in those plants. The main advantage of the proposed method is its flexibility and ease of use.
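
    A toy sketch of the scoring-and-aggregation idea (not the authors' membership functions, parameters or weights): each monitored parameter is mapped to a fuzzy "impact" degree anchored at an assumed legal threshold, and the degrees are combined into a single Environmental Impact Score.

```python
import numpy as np

def triangular(x, lo, mid, hi):
    """Triangular membership function on [lo, hi] peaking at mid."""
    if x <= lo or x >= hi:
        return 0.0
    return (x - lo) / (mid - lo) if x <= mid else (hi - x) / (hi - mid)

def impact_score(measurements, thresholds, weights):
    """Weighted aggregation of per-parameter 'high impact' membership degrees."""
    scores = {}
    for name, value in measurements.items():
        limit = thresholds[name]
        # degree to which the measurement approaches / exceeds its legal limit
        scores[name] = triangular(value, 0.5 * limit, limit, 2.0 * limit)
    total = sum(weights[n] * s for n, s in scores.items()) / sum(weights.values())
    return scores, total

measurements = {"SO2": 180.0, "NOx": 150.0, "PM": 25.0}   # mg/Nm3, illustrative values
thresholds   = {"SO2": 200.0, "NOx": 200.0, "PM": 30.0}   # assumed legal limits
weights      = {"SO2": 1.0, "NOx": 1.0, "PM": 1.5}        # assumed relative weights
print(impact_score(measurements, thresholds, weights))
```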

  3. Appropriate statistical methods are required to assess diagnostic tests for replacement, add-on, and triage

    NARCIS (Netherlands)

    Hayen, Andrew; Macaskill, Petra; Irwig, Les; Bossuyt, Patrick

    2010-01-01

    To explain which measures of accuracy and which statistical methods should be used in studies to assess the value of a new binary test as a replacement test, an add-on test, or a triage test. Selection and explanation of statistical methods, illustrated with examples. Statistical methods for

  4. Development and Assessment of a Preliminary Randomization-Based Introductory Statistics Curriculum

    Science.gov (United States)

    Tintle, Nathan; VanderStoep, Jill; Holmes, Vicki-Lynn; Quisenberry, Brooke; Swanson, Todd

    2011-01-01

    The algebra-based introductory statistics course is the most popular undergraduate course in statistics. While there is a general consensus for the content of the curriculum, the recent Guidelines for Assessment and Instruction in Statistics Education (GAISE) have challenged the pedagogy of this course. Additionally, some arguments have been made…

  5. Probabilistic assessment of fatigue life including statistical uncertainties in the S-N curve

    International Nuclear Information System (INIS)

    Sudret, B.; Hornet, P.; Stephan, J.-M.; Guede, Z.; Lemaire, M.

    2003-01-01

    A probabilistic framework is set up to assess the fatigue life of components of nuclear power plants. It is intended to incorporate all kinds of uncertainties, such as those appearing in the specimen fatigue life, the design sub-factor, the mechanical model and the applied loading. This paper details the first step, which corresponds to the statistical treatment of the fatigue specimen test data. The specimen fatigue life at stress amplitude S is represented by a lognormal random variable whose mean and standard deviation depend on S. This characterization is then used to compute the random fatigue life of a component submitted to a single kind of cycles. Specifically, the mean and coefficient of variation of this quantity are studied, as well as the reliability associated with the (deterministic) design value. (author)
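
    A minimal Monte Carlo sketch of the statistical ingredient described above, under assumed Basquin-type S-N parameters: the specimen life at a given stress amplitude is lognormal, and sampling it yields the mean, coefficient of variation and the probability of falling below a deterministic design value. All numbers are illustrative, not the paper's data.

```python
import numpy as np

def sample_lives(S, n_samples, A=1e12, m=3.0, cov=0.4, seed=7):
    """Lognormal fatigue lives with median N_med = A * S**(-m) and a given CoV."""
    rng = np.random.default_rng(seed)
    median = A * S ** (-m)
    sigma_log = np.sqrt(np.log(1.0 + cov ** 2))   # lognormal shape from the CoV
    return rng.lognormal(mean=np.log(median), sigma=sigma_log, size=n_samples)

S = 200.0                                          # stress amplitude, MPa (illustrative)
lives = sample_lives(S, n_samples=100_000)
design_life = np.median(lives) / 20.0              # crude stand-in for design sub-factors
print("mean life:", lives.mean())
print("coefficient of variation:", lives.std() / lives.mean())
print("P(life < design value):", np.mean(lives < design_life))
```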

  6. Computer-aided assessment in statistics: the CAMPUS project

    Directory of Open Access Journals (Sweden)

    Neville Hunt

    1998-12-01

    Full Text Available The relentless drive for 'efficiency' in higher education, and the consequent increase in workloads, has given university teachers a compelling incentive to investigate alternative forms of assessment. Some forms of assessment with a clear educational value can no longer be entertained because of the burden placed on the teacher. An added concern is plagiarism, which anecdotal evidence would suggest is on the increase yet which is difficult to detect in large modules with more than one assessor. While computer-aided assessment (CAA has an enthusiastic following, it is not clear to many teachers that it either reduces workloads or reduces the risk of cheating. In an ideal world, most teachers would prefer to give individual attention and personal feedback to each student when marking their work. In this sense CAA must be seen as second best and will therefore be used only if it is seen to offer significant benefits in terms of reduced workloads or increased validity.

  7. In vivo Comet assay – statistical analysis and power calculations of mice testicular cells

    DEFF Research Database (Denmark)

    Hansen, Merete Kjær; Sharma, Anoop Kumar; Dybdahl, Marianne

    2014-01-01

    ……is to provide curves for this statistic outlining the number of animals and gels to use. The current study was based on 11 compounds administered via oral gavage in three doses to male mice: CAS no. 110-26-9, CAS no. 512-56-1, CAS no. 111873-33-7, CAS no. 79-94-7, CAS no. 115-96-8, CAS no. 598-55-0, CAS no. 636.... A linear mixed-effects model was fitted to the summarized data and the estimated variance components were used to generate power curves as a function of sample size. The statistic that most appropriately summarized the within-sample distributions was the median of the log-transformed data, as it most...... consistently conformed to the assumptions of the statistical model. Power curves for 1.5-, 2-, and 2.5-fold changes of the highest dose group compared to the control group when 50 and 100 cells were scored per gel are provided to aid in the design of future Comet assay studies on testicular cells.
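
    A rough simulation sketch of how such power curves can be generated from nested variance components (animals within groups, gels within animals); the variance values, gel counts and the simple t-test on animal means are assumptions and do not reproduce the study's mixed-model calculation.

```python
import numpy as np
from scipy import stats

def simulate_power(fold_change, n_animals, n_gels=2,
                   sd_animal=0.25, sd_gel=0.15, n_sim=2000, seed=8):
    """Power to detect a fold change between control and high dose,
    using per-animal means of the (log-scale) gel summary statistic."""
    rng = np.random.default_rng(seed)
    effect = np.log(fold_change)                      # work on the log scale

    def group_means(shift):
        animal = rng.normal(shift, sd_animal, n_animals)
        gels = rng.normal(animal[:, None], sd_gel, (n_animals, n_gels))
        return gels.mean(axis=1)                      # per-animal summary

    hits = 0
    for _ in range(n_sim):
        control, treated = group_means(0.0), group_means(effect)
        if stats.ttest_ind(control, treated).pvalue < 0.05:
            hits += 1
    return hits / n_sim

for n in (4, 6, 8, 10):
    print(n, "animals per group ->", simulate_power(2.0, n))
```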

  8. Statistical assessment of coal charge effect on metallurgical coke quality

    Directory of Open Access Journals (Sweden)

    Pavlína Pustějovská

    2016-06-01

    Full Text Available The paper studies coke quality. Blast furnace research has traditionally focused on the iron ore charge, while coke received less attention because, under previous conditions, its quality seemed good enough. Nowadays, requirements for blast furnace coke have risen, especially requirements for coke reactivity. The reactivity level is determined primarily by the composition and properties of the coal mixtures used for coking. The paper presents a statistical analysis of the strength and character of the relationship between selected properties of the coal mixture and coke reactivity. The Statgraphics software, using both simple and multiple linear regression, was used for the calculations. The obtained regression equations provide a statistically significant prediction of the reactivity of coke, or its strength after reaction with CO2, and thus allow their subsequent management through changes in the composition and properties of the coal mixture. The CSR/CRI indexes were determined for the coke. Fifty-four results were acquired in the experimental part, where the correlation between the CRI index and coal components was studied. For simple linear regression the coefficient of determination was 55.0204%; between the parameters CRI and inertinite it was 21.5873%; for regression between CRI and coal components it was 31.03%; and for multiple linear regression between CRI and 3 feedstock components it was 34.0691%. The final correlation showed that higher ash decreases the final coke reactivity, higher content of volatile matter in the coal increases the total coke reactivity, and a higher amount of inertinite in the coal increases the reactivity. Generally, coke quality is significantly affected by coal processing, carbonization and the maceral content of the coal mixture.

  9. Power analysis as a tool to identify statistically informative indicators for monitoring coral reef disturbances.

    Science.gov (United States)

    Van Wynsberge, Simon; Gilbert, Antoine; Guillemot, Nicolas; Heintz, Tom; Tremblay-Boyer, Laura

    2017-07-01

    Extensive biological field surveys are costly and time consuming. To optimize sampling and ensure regular monitoring on the long term, identifying informative indicators of anthropogenic disturbances is a priority. In this study, we used 1800 candidate indicators by combining metrics measured from coral, fish, and macro-invertebrate assemblages surveyed from 2006 to 2012 in the vicinity of an ongoing mining project in the Voh-Koné-Pouembout lagoon, New Caledonia. We performed a power analysis to identify a subset of indicators which would best discriminate temporal changes due to a simulated chronic anthropogenic impact. Only 4% of tested indicators were likely to detect a 10% annual decrease of values with sufficient power (>0.80). Corals generally exerted higher statistical power than macro-invertebrates and fishes because of lower natural variability and higher occurrence. For the same reasons, higher taxonomic ranks provided higher power than lower taxonomic ranks. Nevertheless, a number of families of common sedentary or sessile macro-invertebrates and fishes also performed well in detecting changes: Echinometridae, Isognomidae, Muricidae, Tridacninae, Arcidae, and Turbinidae for macro-invertebrates and Pomacentridae, Labridae, and Chaetodontidae for fishes. Interestingly, these families did not provide high power in all geomorphological strata, suggesting that the ability of indicators in detecting anthropogenic impacts was closely linked to reef geomorphology. This study provides a first operational step toward identifying statistically relevant indicators of anthropogenic disturbances in New Caledonia's coral reefs, which can be useful in similar tropical reef ecosystems where little information is available regarding the responses of ecological indicators to anthropogenic disturbances.

  10. WORKSHOP ON APPLICATION OF STATISTICAL METHODS TO BIOLOGICALLY-BASED PHARMACOKINETIC MODELING FOR RISK ASSESSMENT

    Science.gov (United States)

    Biologically-based pharmacokinetic models are being increasingly used in the risk assessment of environmental chemicals. These models are based on biological, mathematical, statistical and engineering principles. Their potential uses in risk assessment include extrapolation betwe...

  11. A statistical assessment of differences and equivalences between genetically modified and reference plant varieties

    NARCIS (Netherlands)

    Voet, van der H.; Perry, J.N.; Amzal, B.; Paoletti, C.

    2011-01-01

    Background - Safety assessment of genetically modified organisms is currently often performed by comparative evaluation. However, natural variation of plant characteristics between commercial varieties is usually not considered explicitly in the statistical computations underlying the assessment.

  12. Tools for Assessing Readability of Statistics Teaching Materials

    Science.gov (United States)

    Lesser, Lawrence; Wagler, Amy

    2016-01-01

    This article provides tools and rationale for instructors in math and science to make their assessment and curriculum materials (more) readable for students. The tools discussed (MSWord, LexTutor, Coh-Metrix TEA) are readily available linguistic analysis applications that are grounded in current linguistic theory, but present output that can…

  13. Power flow as a complement to statistical energy analysis and finite element analysis

    Science.gov (United States)

    Cuschieri, J. M.

    1987-01-01

    Present methods of analysis of the structural response and the structure-borne transmission of vibrational energy use either finite element (FE) techniques or statistical energy analysis (SEA) methods. The FE methods are a very useful tool at low frequencies, where the number of resonances involved in the analysis is rather small. On the other hand, SEA methods can predict with acceptable accuracy the response and energy transmission between coupled structures at relatively high frequencies, where the structural modal density is high and a statistical approach is the appropriate solution. In the mid-frequency range, a relatively large number of resonances exist, which makes the finite element method too costly; on the other hand, SEA methods can only predict an average level. In this mid-frequency range a possible alternative is to use power flow techniques, where the input and flow of vibrational energy to excited and coupled structural components can be expressed in terms of input and transfer mobilities. This power flow technique can be extended from low to high frequencies and can be integrated with established FE models at low frequencies and SEA models at high frequencies to provide a verification of the method. This method of structural analysis using power flow and mobility methods, and its integration with SEA and FE analysis, is applied to the case of two thin beams joined together at right angles.

  14. A new Markov-chain-related statistical approach for modelling synthetic wind power time series

    International Nuclear Information System (INIS)

    Pesch, T; Hake, J F; Schröders, S; Allelein, H J

    2015-01-01

    The integration of rising shares of volatile wind power in the generation mix is a major challenge for the future energy system. To address the uncertainties involved in wind power generation, models analysing and simulating the stochastic nature of this energy source are becoming increasingly important. One statistical approach that has been frequently used in the literature is the Markov chain approach. Recently, the method was identified as being of limited use for generating wind time series with time steps shorter than 15–40 min as it is not capable of reproducing the autocorrelation characteristics accurately. This paper presents a new Markov-chain-related statistical approach that is capable of solving this problem by introducing a variable second lag. Furthermore, additional features are presented that allow for the further adjustment of the generated synthetic time series. The influences of the model parameter settings are examined by meaningful parameter variations. The suitability of the approach is demonstrated by an application analysis with the example of the wind feed-in in Germany. It shows that—in contrast to conventional Markov chain approaches—the generated synthetic time series do not systematically underestimate the required storage capacity to balance wind power fluctuation. (paper)
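
    For orientation, the sketch below shows a plain first-order Markov chain generator for synthetic wind power series (discretize, estimate a transition matrix, sample); the paper's contribution, the variable second lag and the additional adjustment features, is not reproduced here, and the "measured" series is a synthetic placeholder.

```python
import numpy as np

def fit_transition_matrix(series, n_states=20):
    """Discretise the series into bins and count first-order transitions."""
    bins = np.linspace(series.min(), series.max(), n_states + 1)
    states = np.clip(np.digitize(series, bins) - 1, 0, n_states - 1)
    P = np.zeros((n_states, n_states))
    for a, b in zip(states[:-1], states[1:]):
        P[a, b] += 1.0
    row_sums = P.sum(axis=1, keepdims=True)
    # normalise rows; unvisited states fall back to a uniform row
    P = np.where(row_sums > 0, P / np.where(row_sums == 0, 1.0, row_sums), 1.0 / n_states)
    centres = 0.5 * (bins[:-1] + bins[1:])
    return P, centres, states[0]

def sample_series(P, centres, start_state, length, seed=9):
    """Sample a synthetic series by walking the estimated Markov chain."""
    rng = np.random.default_rng(seed)
    s, out = start_state, []
    for _ in range(length):
        s = rng.choice(len(centres), p=P[s])
        out.append(centres[s])
    return np.array(out)

# Toy 'measured' feed-in series standing in for real hourly wind power data
rng = np.random.default_rng(10)
measured = np.abs(np.sin(np.linspace(0.0, 60.0, 5000)) + 0.3 * rng.standard_normal(5000))
P, centres, s0 = fit_transition_matrix(measured)
synthetic = sample_series(P, centres, s0, length=5000)
print(measured.mean(), synthetic.mean())          # compare basic statistics
```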

  15. Assessing Research Data Deposits and Usage Statistics within IDEALS

    Directory of Open Access Journals (Sweden)

    Christie A. Wiley

    2017-12-01

    Full Text Available Objectives: This study follows up on previous work that began examining data deposited in an institutional repository. The work here extends the earlier study by answering the following research questions: (1) What is the file composition of datasets ingested into the University of Illinois at Urbana-Champaign (UIUC) campus repository? Are datasets more likely to be single-file or multiple-file items? (2) What is the usage data associated with these datasets? Which items are most popular? Methods: The dataset records collected in this study were identified by filtering item types categorized as "data" or "dataset" using the advanced search function in IDEALS. Returned search results were collected in an Excel spreadsheet, including the Handle identifier, date ingested, file formats, composition code, and the download count from the item's statistics report. The Handle identifier represents the dataset record's persistent identifier. Composition represents codes that categorize items as single- or multiple-file deposits. Date available represents the date the dataset record was published in the campus repository. Download statistics were collected via a website link for each dataset record and indicate the number of times the dataset record has been downloaded. Once collected, the data were used to evaluate datasets deposited into IDEALS. Results: A total of 522 datasets were identified for analysis, covering the period between January 2007 and August 2016. This study revealed two influxes of deposits, one during 2008-2009 and one in 2014. During the first period a large number of PDFs were deposited by the Illinois Department of Agriculture, whereas Microsoft Excel files were deposited in 2014 by the Rare Books and Manuscript Library. Single-file datasets clearly dominate the deposits in the campus repository. The total download count for all datasets was 139,663 and the average downloads per month per

  16. WATER POLO GAME-RELATED STATISTICS IN WOMEN'S INTERNATIONAL CHAMPIONSHIPS: DIFFERENCES AND DISCRIMINATORY POWER

    Directory of Open Access Journals (Sweden)

    Yolanda Escalante

    2012-09-01

    Full Text Available The aims of this study were (i) to compare women's water polo game-related statistics by match outcome (winning and losing teams) and phase (preliminary, classificatory, and semi-final/bronze medal/gold medal), and (ii) to identify characteristics that discriminate performances for each phase. The game-related statistics of the 124 women's matches played in five International Championships (World and European Championships) were analyzed. Differences between winning and losing teams in each phase were determined using the chi-squared test. A discriminant analysis was then performed according to context in each of the three phases. It was found that the game-related statistics differentiate the winning from the losing teams in each phase of an international championship. The differentiating variables were both offensive (centre goals, power-play goals, counterattack goals, assists, offensive fouls, steals, blocked shots, and won sprints) and defensive (goalkeeper-blocked shots, goalkeeper-blocked inferiority shots, and goalkeeper-blocked 5-m shots). The discriminant analysis showed the game-related statistics to discriminate performance in all phases: preliminary, classificatory, and final phases (92%, 90%, and 83%, respectively). Two variables were discriminatory by match outcome (winning or losing teams) in all three phases: goals and goalkeeper-blocked shots.

  17. A Powerful Approach to Estimating Annotation-Stratified Genetic Covariance via GWAS Summary Statistics.

    Science.gov (United States)

    Lu, Qiongshi; Li, Boyang; Ou, Derek; Erlendsdottir, Margret; Powles, Ryan L; Jiang, Tony; Hu, Yiming; Chang, David; Jin, Chentian; Dai, Wei; He, Qidu; Liu, Zefeng; Mukherjee, Shubhabrata; Crane, Paul K; Zhao, Hongyu

    2017-12-07

    Despite the success of large-scale genome-wide association studies (GWASs) on complex traits, our understanding of their genetic architecture is far from complete. Jointly modeling multiple traits' genetic profiles has provided insights into the shared genetic basis of many complex traits. However, large-scale inference sets a high bar for both statistical power and biological interpretability. Here we introduce a principled framework to estimate annotation-stratified genetic covariance between traits using GWAS summary statistics. Through theoretical and numerical analyses, we demonstrate that our method provides accurate covariance estimates, thereby enabling researchers to dissect both the shared and distinct genetic architecture across traits to better understand their etiologies. Among 50 complex traits with publicly accessible GWAS summary statistics (N total ≈ 4.5 million), we identified more than 170 pairs with statistically significant genetic covariance. In particular, we found strong genetic covariance between late-onset Alzheimer disease (LOAD) and amyotrophic lateral sclerosis (ALS), two major neurodegenerative diseases, in single-nucleotide polymorphisms (SNPs) with high minor allele frequencies and in SNPs located in the predicted functional genome. Joint analysis of LOAD, ALS, and other traits highlights LOAD's correlation with cognitive traits and hints at an autoimmune component for ALS. Copyright © 2017 American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.

  18. Statistically based uncertainty assessments in nuclear risk analysis

    International Nuclear Information System (INIS)

    Spencer, F.W.; Diegert, K.V.; Easterling, R.G.

    1987-01-01

    Over the last decade, the problems of estimation and uncertainty assessment in probabilistic risk assessments (PRAs) have been addressed in a variety of NRC and industry-sponsored projects. These problems have received attention because of a recognition that major uncertainties in risk estimation exist, which can be reduced by collecting more and better data and other information, and because of a recognition that better methods for assessing these uncertainties are needed. In particular, a clear understanding of the nature and magnitude of various sources of uncertainty is needed to facilitate decision-making on possible plant changes and research options. Recent PRAs have employed methods of probability propagation, sometimes involving the use of Bayes' theorem, and intended to formalize the use of 'engineering judgment' or 'expert opinion.' All sources, or feelings, of uncertainty are expressed probabilistically, so that uncertainty analysis becomes simply a matter of probability propagation. Alternatives to forcing a probabilistic framework at all stages of a PRA are a major concern in this paper, however.

  19. Using DEWIS and R for Multi-Staged Statistics e-Assessments

    Science.gov (United States)

    Gwynllyw, D. Rhys; Weir, Iain S.; Henderson, Karen L.

    2016-01-01

    We demonstrate how the DEWIS e-Assessment system may use embedded R code to facilitate the assessment of students' ability to perform involved statistical analyses. The R code has been written to emulate SPSS output and thus the statistical results for each bespoke data set can be generated efficiently and accurately using standard R routines.…

  20. Implications of Monte Carlo Statistical Errors in Criticality Safety Assessments

    International Nuclear Information System (INIS)

    Pevey, Ronald E.

    2005-01-01

    Most criticality safety calculations are performed using Monte Carlo techniques because of Monte Carlo's ability to handle complex three-dimensional geometries. For Monte Carlo calculations, the more histories sampled, the lower the standard deviation of the resulting estimates. The common intuition is, therefore, that the more histories, the better; as a result, analysts tend to run Monte Carlo analyses as long as possible (or at least to a minimum acceptable uncertainty). For Monte Carlo criticality safety analyses, however, the optimization situation is complicated by the fact that procedures usually require that an extra margin of safety be added because of the statistical uncertainty of the Monte Carlo calculations. This additional safety margin affects the impact of the choice of the calculational standard deviation, both on production and on safety. This paper shows that, under the assumptions of normally distributed benchmarking calculational errors and exact compliance with the upper subcritical limit (USL), the standard deviation that optimizes production is zero, but there is a non-zero value of the calculational standard deviation that minimizes the risk of inadvertently labeling a supercritical configuration as subcritical. Furthermore, this value is shown to be a simple function of the typical benchmarking step outcomes--the bias, the standard deviation of the bias, the upper subcritical limit, and the number of standard deviations added to calculated k-effectives before comparison to the USL
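
    As context for the margins discussed above, one common form of the criticality-safety acceptance test is sketched below: the calculated k-effective plus a chosen number of calculational standard deviations must stay below an upper subcritical limit built from the validation bias, its uncertainty and an administrative margin. The formula variant and all numbers are illustrative assumptions, not the paper's derivation.

```python
def upper_subcritical_limit(bias, bias_sigma, admin_margin=0.05, k_bias_sigma=2.0):
    """USL = 1 + bias (credited only if negative) - k*sigma_bias - administrative margin."""
    credited_bias = min(bias, 0.0)          # positive bias is conservatively not credited
    return 1.0 + credited_bias - k_bias_sigma * bias_sigma - admin_margin

def acceptable(k_calc, sigma_calc, usl, n_sigma=2.0):
    """Accept the configuration only if k_calc + n_sigma * sigma_calc <= USL."""
    return k_calc + n_sigma * sigma_calc <= usl

usl = upper_subcritical_limit(bias=-0.005, bias_sigma=0.004)
print("USL:", usl)
print("accepted:", acceptable(k_calc=0.92, sigma_calc=0.001, usl=usl))
```

    The paper's point can be read directly off this check: shrinking sigma_calc relaxes the n_sigma penalty but costs computing time, so the calculational standard deviation becomes an optimization variable rather than something to drive toward zero by default.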

  1. Development of a statistical oil spill model for risk assessment.

    Science.gov (United States)

    Guo, Weijun

    2017-11-01

    To gain a better understanding of the impacts from potential risk sources, we developed an oil spill model using a probabilistic method, which simulates numerous oil spill trajectories under varying environmental conditions. Statistical results were quantified from hypothetical oil spills under multiple scenarios, including the probability of an area being affected, mean oil slick thickness, and duration of water surface exposure to floating oil. The three sub-indices, together with marine area vulnerability, are merged to compute a composite index characterizing the spatial distribution of risk degree. The integral of the index can be used to identify the overall risk from an emission source. The developed model has been successfully applied to the comparison and selection of an appropriate oil port construction location adjacent to a marine protected area for Phoca largha in China. The results highlight the importance of evaluating candidate sites before project construction, since risk estimates from two adjacent potential sources may turn out to be significantly different owing to hydrodynamic conditions and eco-environmental sensitivity. Copyright © 2017. Published by Elsevier Ltd.

  2. Climate change assessment for Mediterranean agricultural areas by statistical downscaling

    Directory of Open Access Journals (Sweden)

    L. Palatella

    2010-07-01

    Full Text Available In this paper we produce projections of seasonal precipitation for four Mediterranean areas: the Apulia region (Italy), the Ebro river basin (Spain), the Po valley (Italy) and the Antalya province (Turkey). We performed the statistical downscaling using Canonical Correlation Analysis (CCA) in two versions: in one case the Principal Component Analysis (PCA) filter is applied only to the predictor, and in the other to both predictor and predictand. After performing a validation test, CCA with the PCA filter applied to both predictor and predictand was chosen. Sea level pressure (SLP) is used as the predictor. Downscaling has been carried out for the scenarios A2 and B2 on the basis of three GCMs: the CCCma-GCM2, the Csiro-MK2 and HadCM3. Three consecutive 30-year periods have been considered. For summer precipitation in the Apulia region we also use the 500 hPa temperature (T500) as predictor, obtaining comparable results. Results show different climate change signals in the four areas and confirm the need for an analysis capable of resolving internal differences within the Mediterranean region. The most robust signal is the reduction of summer precipitation in the Ebro river basin. Other significant results are the increase of precipitation over Apulia in summer, the reduction over the Po valley in spring and autumn, and the increase over the Antalya province in summer and autumn.

  3. Statistical testing and power analysis for brain-wide association study.

    Science.gov (United States)

    Gong, Weikang; Wan, Lin; Lu, Wenlian; Ma, Liang; Cheng, Fan; Cheng, Wei; Grünewald, Stefan; Feng, Jianfeng

    2018-04-05

    The identification of connexel-wise associations, which involves examining functional connectivities between pairwise voxels across the whole brain, is both statistically and computationally challenging. Although such a connexel-wise methodology has recently been adopted by brain-wide association studies (BWAS) to identify connectivity changes in several mental disorders, such as schizophrenia, autism and depression, the multiple correction and power analysis methods designed specifically for connexel-wise analysis are still lacking. Therefore, we herein report the development of a rigorous statistical framework for connexel-wise significance testing based on the Gaussian random field theory. It includes controlling the family-wise error rate (FWER) of multiple hypothesis testings using topological inference methods, and calculating power and sample size for a connexel-wise study. Our theoretical framework can control the false-positive rate accurately, as validated empirically using two resting-state fMRI datasets. Compared with Bonferroni correction and false discovery rate (FDR), it can reduce false-positive rate and increase statistical power by appropriately utilizing the spatial information of fMRI data. Importantly, our method bypasses the need of non-parametric permutation to correct for multiple comparison, thus, it can efficiently tackle large datasets with high resolution fMRI images. The utility of our method is shown in a case-control study. Our approach can identify altered functional connectivities in a major depression disorder dataset, whereas existing methods fail. A software package is available at https://github.com/weikanggong/BWAS. Copyright © 2018 Elsevier B.V. All rights reserved.

  4. Generation of statistical scenarios of short-term wind power production

    DEFF Research Database (Denmark)

    Pinson, Pierre; Papaefthymiou, George; Klockl, Bernd

    2007-01-01

    Short-term (up to 2-3 days ahead) probabilistic forecasts of wind power provide forecast users with paramount information on the uncertainty of expected wind generation. Whatever the type of these probabilistic forecasts, they are produced on a per-horizon basis, and hence do not inform...... on the development of the forecast uncertainty through forecast series. This issue is addressed here by describing a method that permits the generation of statistical scenarios of wind generation that account for the interdependence structure of prediction errors, in addition to respecting the predictive distributions of wind...

  5. From probabilistic forecasts to statistical scenarios of short-term wind power production

    DEFF Research Database (Denmark)

    Pinson, Pierre; Papaefthymiou, George; Klockl, Bernd

    2009-01-01

    Short-term (up to 2-3 days ahead) probabilistic forecasts of wind power provide forecast users with highly valuable information on the uncertainty of expected wind generation. Whatever the type of these probabilistic forecasts, they are produced on a per-horizon basis, and hence do not inform...... on the development of the forecast uncertainty through forecast series. However, this additional information may be paramount for a large class of time-dependent and multistage decision-making problems, e.g. optimal operation of combined wind-storage systems or multiple-market trading with different gate closures...... This issue is addressed here by describing a method that permits the generation of statistical scenarios of short-term wind generation that accounts for both the interdependence structure of prediction errors and the predictive distributions of wind power production. The method is based on the conversion
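
    One common way to implement such a conversion is through a Gaussian copula linking the forecast horizons; the sketch below samples correlated uniforms and maps them through per-horizon predictive distributions. The Gaussian marginals, the exponential correlation structure and all numbers are assumptions for illustration and are not taken from the paper.

```python
import numpy as np
from scipy import stats

def generate_scenarios(pred_mean, pred_std, corr, n_scenarios=10, seed=11):
    """Scenarios whose marginals follow per-horizon (here Gaussian) predictive
    distributions and whose temporal dependence comes from a correlation
    matrix of forecast errors (a Gaussian copula)."""
    H = len(pred_mean)
    z = stats.multivariate_normal(mean=np.zeros(H), cov=corr).rvs(
        size=n_scenarios, random_state=seed)
    u = stats.norm.cdf(z)                            # correlated uniforms across horizons
    scen = stats.norm.ppf(u, loc=pred_mean, scale=pred_std)
    return np.clip(scen, 0.0, 1.0)                   # stay within [0, 1] of rated power

H = 24                                               # hourly horizons for one day ahead
pred_mean = 0.4 + 0.2 * np.sin(np.linspace(0.0, np.pi, H))
pred_std = np.full(H, 0.08)
corr = np.array([[np.exp(-abs(i - j) / 6.0) for j in range(H)] for i in range(H)])
scenarios = generate_scenarios(pred_mean, pred_std, corr)
print(scenarios.shape)                               # (10, 24)
```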

  6. The statistical power to detect cross-scale interactions at macroscales

    Science.gov (United States)

    Wagner, Tyler; Fergus, C. Emi; Stow, Craig A.; Cheruvelil, Kendra S.; Soranno, Patricia A.

    2016-01-01

    Macroscale studies of ecological phenomena are increasingly common because stressors such as climate and land-use change operate at large spatial and temporal scales. Cross-scale interactions (CSIs), where ecological processes operating at one spatial or temporal scale interact with processes operating at another scale, have been documented in a variety of ecosystems and contribute to complex system dynamics. However, studies investigating CSIs are often dependent on compiling multiple data sets from different sources to create multithematic, multiscaled data sets, which results in structurally complex, and sometimes incomplete data sets. The statistical power to detect CSIs needs to be evaluated because of their importance and the challenge of quantifying CSIs using data sets with complex structures and missing observations. We studied this problem using a spatially hierarchical model that measures CSIs between regional agriculture and its effects on the relationship between lake nutrients and lake productivity. We used an existing large multithematic, multiscaled database, LAke multiscaled GeOSpatial, and temporal database (LAGOS), to parameterize the power analysis simulations. We found that the power to detect CSIs was more strongly related to the number of regions in the study rather than the number of lakes nested within each region. CSI power analyses will not only help ecologists design large-scale studies aimed at detecting CSIs, but will also focus attention on CSI effect sizes and the degree to which they are ecologically relevant and detectable with large data sets.

  7. Dose assessments in nuclear power plant siting

    International Nuclear Information System (INIS)

    1988-03-01

    This document is mainly intended to provide information on dose estimations and assessments for the purpose of nuclear power plant (NPP) siting. It is not aimed at giving radiation protection guidance, criteria or procedures to be applied during the process of NPP siting, nor even at providing recommendations on this subject matter. The document may however be of help for implementing some of the Nuclear Safety Standards (NUSS) documents on siting. The document was prepared before April 26, 1986, when a severe accident occurred at Unit 4 of the Chernobyl NPP in the USSR. It should be emphasized that this document does not bridge the gap which exists in the NUSS programme as far as radiation protection guidance for the specific case of siting of NPPs is concerned. The Agency will continue to work on this subject with the aim of preparing a Safety Series document on radiation protection requirements for NPP siting. This document could serve as a working document for this purpose. Refs, figs and tabs

  8. Addressing the "Replication Crisis": Using Original Studies to Design Replication Studies with Appropriate Statistical Power.

    Science.gov (United States)

    Anderson, Samantha F; Maxwell, Scott E

    2017-01-01

    Psychology is undergoing a replication crisis. The discussion surrounding this crisis has centered on mistrust of previous findings. Researchers planning replication studies often use the original study sample effect size as the basis for sample size planning. However, this strategy ignores uncertainty and publication bias in estimated effect sizes, resulting in overly optimistic calculations. A psychologist who intends to obtain power of .80 in the replication study, and performs calculations accordingly, may have an actual power lower than .80. We performed simulations to reveal the magnitude of the difference between actual and intended power based on common sample size planning strategies and assessed the performance of methods that aim to correct for effect size uncertainty and/or bias. Our results imply that even if original studies reflect actual phenomena and were conducted in the absence of questionable research practices, popular approaches to designing replication studies may result in a low success rate, especially if the original study is underpowered. Methods correcting for bias and/or uncertainty generally had higher actual power, but were not a panacea for an underpowered original study. Thus, it becomes imperative that 1) original studies are adequately powered and 2) replication studies are designed with methods that are more likely to yield the intended level of power.
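
    The sample-size arithmetic behind this point can be sketched as follows: planning on the original point estimate of the effect size versus planning on a lower confidence bound (one simple way to correct for uncertainty). The original effect size, sample size and confidence level are hypothetical, and the standard error uses a common large-sample approximation for Cohen's d.

```python
import numpy as np
from scipy import stats
from statsmodels.stats.power import TTestIndPower

# Hypothetical original study: observed d = 0.45 with n = 40 per group.
d_hat, n_orig = 0.45, 40
analysis = TTestIndPower()

# Naive plan: take the point estimate at face value.
n_naive = analysis.solve_power(effect_size=d_hat, power=0.80, alpha=0.05)
print("n per group planned on d_hat:", int(np.ceil(n_naive)))

# Safeguard plan: use a lower confidence bound of d instead (one way to
# correct for uncertainty); SE uses a common large-sample approximation.
se_d = np.sqrt(2.0 / n_orig + d_hat**2 / (4.0 * n_orig))
d_low = d_hat - stats.norm.ppf(0.80) * se_d       # one-sided 80% lower bound
n_safe = analysis.solve_power(effect_size=d_low, power=0.80, alpha=0.05)
print("n per group planned on the lower bound:", int(np.ceil(n_safe)))

# If the true effect is smaller, the naive plan's actual power falls short.
actual = analysis.power(effect_size=0.30, nobs1=np.ceil(n_naive), alpha=0.05)
print("actual power of the naive plan if true d = 0.30:", round(actual, 2))
```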

  9. Fast Computation and Assessment Methods in Power System Analysis

    Science.gov (United States)

    Nagata, Masaki

    Power system analysis is essential for efficient and reliable power system operation and control. Recently, online security assessment systems have become important, as more efficient use of power networks is increasingly required. In this article, fast power system analysis techniques such as contingency screening, parallel processing and intelligent systems application are briefly surveyed from the viewpoint of their application to online dynamic security assessment.

  10. Assessment of ceramic composites for MMW space nuclear power systems

    International Nuclear Information System (INIS)

    Besmann, T.M.

    1987-01-01

    Proposed multimegawatt nuclear power systems which operate at high temperatures, high levels of stress, and in hostile environments, including corrosive working fluids, have created interest in the use of ceramic composites as structural materials. This report assesses the applicability of several ceramic composites in both Brayton and Rankine cycle power systems. This assessment considers an equilibrium thermodynamic analysis and also a nonequilibrium assessment. (FI)

  11. Classification of Underlying Causes of Power Quality Disturbances: Deterministic versus Statistical Methods

    Directory of Open Access Journals (Sweden)

    Emmanouil Styvaktakis

    2007-01-01

    Full Text Available This paper presents the two main types of classification methods for power quality disturbances based on underlying causes: deterministic classification, giving an expert system as an example, and statistical classification, with support vector machines (a novel method) as an example. An expert system is suitable when one has a limited amount of data and sufficient power system expert knowledge; however, its application requires a set of threshold values. Statistical methods are suitable when a large amount of data is available for training. Two important issues for guaranteeing the effectiveness of a classifier, data segmentation and feature extraction, are discussed. Segmentation of a sequence of data recordings is a preprocessing step that partitions the data into segments, each representing a duration containing either an event or a transition between two events. Extraction of features is applied to each segment individually. Some useful features and their effectiveness are then discussed. Some experimental results are included to demonstrate the effectiveness of both systems. Finally, conclusions are given together with a discussion of some future research directions.
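
    A minimal sketch of the statistical (support vector machine) route, trained on per-segment features, is given below; the two feature dimensions, class definitions and all numbers are synthetic stand-ins rather than data from the paper.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(2)

# Synthetic per-segment features: residual voltage magnitude (pu) and event
# duration (s) for two hypothetical dip classes.
n = 400
fault_dip = np.column_stack([rng.normal(0.55, 0.10, n), rng.normal(0.08, 0.03, n)])
energizing = np.column_stack([rng.normal(0.80, 0.08, n), rng.normal(0.40, 0.10, n)])
X = np.vstack([fault_dip, energizing])
y = np.repeat([0, 1], n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X_tr, y_tr)
print("held-out accuracy:", round(clf.score(X_te, y_te), 3))
```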

  12. Statistical Analysis of Solar PV Power Frequency Spectrum for Optimal Employment of Building Loads

    Energy Technology Data Exchange (ETDEWEB)

    Olama, Mohammed M [ORNL; Sharma, Isha [ORNL; Kuruganti, Teja [ORNL; Fugate, David L [ORNL

    2017-01-01

    In this paper, a statistical analysis of the frequency spectrum of solar photovoltaic (PV) power output is conducted. This analysis quantifies the frequency content that can be used for purposes such as developing optimal employment of building loads and distributed energy resources. One year of solar PV power output data was collected and analyzed using one-second resolution to find ideal bounds and levels for the different frequency components. The annual, seasonal, and monthly statistics of the PV frequency content are computed and illustrated in boxplot format. To examine the compatibility of building loads for PV consumption, a spectral analysis of building loads such as Heating, Ventilation and Air-Conditioning (HVAC) units and water heaters was performed. This defined the bandwidth over which these devices can operate. Results show that nearly all of the PV output (about 98%) is contained within frequencies lower than 1 mHz (equivalent to ~15 min), which is compatible for consumption with local building loads such as HVAC units and water heaters. Medium frequencies in the range of ~15 min to ~1 min are likely to be suitable for consumption by fan equipment of variable air volume HVAC systems that have time constants in the range of few seconds to few minutes. This study indicates that most of the PV generation can be consumed by building loads with the help of proper control strategies, thereby reducing impact on the grid and the size of storage systems.
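
    The core computation, estimating the power spectral density of a one-second-resolution PV series and the fraction of spectral content below 1 mHz, can be sketched as follows; the PV series here is synthetic, so the resulting fraction is only illustrative.

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(3)

# One week of synthetic 1-second PV output: diurnal envelope plus cloud noise.
t = np.arange(7 * 24 * 3600)
diurnal = np.clip(np.sin(2 * np.pi * t / 86400.0), 0.0, None)
clouds = 0.05 * np.cumsum(rng.normal(size=t.size)) / np.sqrt(t.size)
pv = np.clip(diurnal + clouds, 0.0, None)

f, psd = welch(pv, fs=1.0, nperseg=2**16)       # fs = 1 Hz (1-second data)
cum = np.cumsum(psd) / np.sum(psd)
idx = np.searchsorted(f, 1e-3)                  # 1 mHz ~ 15-minute variations
print(f"fraction of spectral content below 1 mHz: {cum[idx]:.3f}")
```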

  13. Indoor Soiling Method and Outdoor Statistical Risk Analysis of Photovoltaic Power Plants

    Science.gov (United States)

    Rajasekar, Vidyashree

    This is a two-part thesis. Part 1 presents an approach for working towards the development of a standardized artificial soiling method for laminated photovoltaic (PV) cells or mini-modules. Construction of an artificial chamber to maintain controlled environmental conditions and the components/chemicals used in artificial soil formulation are briefly explained. Both poly-Si mini-modules and single-cell mono-Si coupons were soiled, and characterization tests such as I-V, reflectance and quantum efficiency (QE) were carried out on both soiled and cleaned coupons. From the results obtained, poly-Si mini-modules proved to be a good measure of soil uniformity, as any non-uniformity present would not result in a smooth curve during I-V measurements. The challenges faced while executing reflectance and QE characterization tests on poly-Si, due to its smaller cells, were eliminated on the mono-Si coupons with large cells, allowing highly repeatable measurements. This study indicates that reflectance measurements between 600-700 nm wavelengths can be used as a direct measure of soil density on the modules. Part 2 determines the most dominant failure modes of field-aged PV modules using experimental data obtained in the field and statistical analysis, FMECA (Failure Mode, Effect, and Criticality Analysis). The failure and degradation modes of about 744 poly-Si glass/polymer frameless modules fielded for 18 years under the cold-dry climate of New York were evaluated. A defect chart, degradation rates (at both string and module levels) and a safety map were generated using the field-measured data. A statistical reliability tool, FMECA, which uses the Risk Priority Number (RPN), is used to determine the dominant failure or degradation modes in the strings and modules by ranking and prioritizing the modes. This study on PV power plants considers all the failure and degradation modes from both safety and performance perspectives. The indoor and outdoor soiling studies were jointly
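
    The FMECA ranking step reduces to computing a Risk Priority Number per failure mode, RPN = severity x occurrence x detection, and sorting. A minimal sketch with assumed ratings (not the thesis data) is:

```python
import pandas as pd

# Assumed ratings on 1-10 scales (not the thesis data); RPN = S x O x D.
modes = pd.DataFrame({
    "mode":       ["encapsulant discoloration", "solder bond fatigue",
                   "glass breakage", "backsheet cracking"],
    "severity":   [4, 7, 9, 6],
    "occurrence": [8, 5, 2, 4],
    "detection":  [3, 6, 2, 7],
})
modes["RPN"] = modes["severity"] * modes["occurrence"] * modes["detection"]
print(modes.sort_values("RPN", ascending=False).to_string(index=False))
```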

  14. A powerful score-based test statistic for detecting gene-gene co-association.

    Science.gov (United States)

    Xu, Jing; Yuan, Zhongshang; Ji, Jiadong; Zhang, Xiaoshuai; Li, Hongkai; Wu, Xuesen; Xue, Fuzhong; Liu, Yanxun

    2016-01-29

    The genetic variants identified by genome-wide association studies (GWAS) can only account for a small proportion of the total heritability of complex disease. The existence of gene-gene joint effects, which contain the main effects and their co-association, is one of the possible explanations for the "missing heritability" problem. Gene-gene co-association refers to the extent to which the joint effects of two genes differ from the main effects, not only due to the traditional interaction under a nearly independent condition but also due to the correlation between genes. Generally, genes tend to work collaboratively within specific pathways or networks contributing to the disease, and specific disease-associated loci will often be highly correlated (e.g. single nucleotide polymorphisms (SNPs) in linkage disequilibrium). Therefore, we proposed a novel score-based statistic (SBS) as a gene-based method for detecting gene-gene co-association. Various simulations illustrate that, under different sample sizes, marginal effects of causal SNPs and co-association levels, the proposed SBS performs better than other existing methods, including single SNP-based and principal component analysis (PCA)-based logistic regression models, the statistics based on canonical correlations (CCU), kernel canonical correlation analysis (KCCU), partial least squares path modeling (PLSPM) and the delta-square (δ(2)) statistic. The real data analysis of rheumatoid arthritis (RA) further confirmed its advantages in practice. SBS is a powerful and efficient gene-based method for detecting gene-gene co-association.

  15. Air-chemistry "turbulence": power-law scaling and statistical regularity

    Directory of Open Access Journals (Sweden)

    H.-m. Hsu

    2011-08-01

    Full Text Available With the intent to gain further knowledge on the spectral structures and statistical regularities of surface atmospheric chemistry, the chemical gases (NO, NO2, NOx, CO, SO2, and O3) and aerosol (PM10) measured at 74 air quality monitoring stations over the island of Taiwan are analyzed for the year 2004 at hourly resolution. They represent a range of surface air quality with a mixed combination of geographic settings, including urban/rural, coastal/inland, plain/hill, and industrial/agricultural locations. In addition to the well-known semi-diurnal and diurnal oscillations, weekly and intermediate (20-30 day) peaks are also identified with the continuous wavelet transform (CWT). The spectra indicate power-law scaling regions for the frequencies higher than the diurnal and those lower than the diurnal, with average exponents of −5/3 and −1, respectively. These dual exponents are corroborated by those obtained with detrended fluctuation analysis in the corresponding time-lag regions. These exponents are mostly independent of the averages and standard deviations of the time series measured at the various geographic settings, i.e., the spatial inhomogeneities. In other words, they possess dominant universal structures. After the spectral coefficients from the CWT decomposition are grouped according to spectral bands and inverted separately, the PDFs of the reconstructed time series for the high-frequency band consistently demonstrate an interesting statistical regularity: −3 power-law scaling in the heavy tails. Such spectral peaks, dual-exponent structures, and power-law scaling in heavy tails are important structural information, but their relations to turbulence and mesoscale variability require further investigation. This could lead to a better understanding of the processes controlling air quality.
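
    Estimating such spectral exponents amounts to fitting straight lines to the log-log power spectrum above and below the diurnal frequency. The sketch below does this on a synthetic hourly series, so the fitted exponents will not reproduce the −5/3 and −1 values reported above.

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(4)

# Synthetic hourly record (one year) with a diurnal cycle and red noise.
n = 24 * 365
x = np.cumsum(rng.normal(size=n)) + 5.0 * np.sin(2 * np.pi * np.arange(n) / 24.0)

f, psd = welch(x, fs=1.0, nperseg=1024)         # frequency in cycles per hour
f_diurnal = 1.0 / 24.0

def exponent(mask):
    """Slope of log10(PSD) vs log10(f), i.e. the power-law scaling exponent."""
    return np.polyfit(np.log10(f[mask]), np.log10(psd[mask]), 1)[0]

hi = (f > f_diurnal) & (f < 0.5)
lo = (f > 0) & (f < f_diurnal)
print(f"high-frequency exponent: {exponent(hi):.2f}, "
      f"low-frequency exponent: {exponent(lo):.2f}")
```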

  16. A statistical simulation model for field testing of non-target organisms in environmental risk assessment of genetically modified plants.

    Science.gov (United States)

    Goedhart, Paul W; van der Voet, Hilko; Baldacchino, Ferdinando; Arpaia, Salvatore

    2014-04-01

    Genetic modification of plants may result in unintended effects causing potentially adverse effects on the environment. A comparative safety assessment is therefore required by authorities, such as the European Food Safety Authority, in which the genetically modified plant is compared with its conventional counterpart. Part of the environmental risk assessment is a comparative field experiment in which the effect on non-target organisms is compared. Statistical analysis of such trials comes in two flavors: difference testing and equivalence testing. It is important to know the statistical properties of these, for example the power to detect environmental change of a given magnitude, before the start of an experiment. Such prospective power analysis can best be studied by means of a statistical simulation model. This paper describes a general framework for simulating data typically encountered in environmental risk assessment of genetically modified plants. The simulation model, available as Supplementary Material, can be used to generate count data having different statistical distributions, possibly with excess zeros. In addition, the model employs completely randomized or randomized block experiments, can be used to simulate single or multiple trials across environments, enables genotype-by-environment interaction by adding random variety effects, and finally includes repeated measures in time following a constant, linear or quadratic pattern in time, possibly with some form of autocorrelation. The model also allows adding a set of reference varieties to the GM plant and its comparator to assess the natural variation, which can then be used to set limits of concern for equivalence testing. The different count distributions are described in some detail and some examples of how to use the simulation model to study various aspects, including a prospective power analysis, are provided.
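
    A stripped-down version of such a prospective power analysis, simulating zero-inflated negative binomial counts for a GM and a comparator arm and recording the rejection rate of a simple difference test, is sketched below; the distributions, plot numbers and test choice are illustrative and much simpler than the full simulation model described above.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

def zinb(n, mean, k, p_zero):
    """Zero-inflated negative binomial draws (mean/dispersion parameterisation)."""
    counts = rng.negative_binomial(k, k / (k + mean), size=n)
    counts[rng.random(n) < p_zero] = 0
    return counts

def prospective_power(effect=0.5, n_plots=8, n_sim=500, alpha=0.05):
    """Rejection rate of a difference test between GM and comparator plots."""
    hits = 0
    for _ in range(n_sim):
        comparator = zinb(n_plots, mean=10.0, k=2.0, p_zero=0.2)
        gm = zinb(n_plots, mean=10.0 * (1.0 - effect), k=2.0, p_zero=0.2)
        _, pval = stats.ttest_ind(np.log1p(gm), np.log1p(comparator))
        hits += pval < alpha
    return hits / n_sim

print("power to detect a 50% reduction with 8 plots per arm:",
      prospective_power())
```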

  17. How Teachers Understand and Use Power in Alternative Assessment

    Directory of Open Access Journals (Sweden)

    Kelvin H. K. Tan

    2012-01-01

    Full Text Available “Alternative assessment” is an increasingly common and popular discourse in education. The potential benefit of alternative assessment practices is premised on significant changes in assessment practices. However, assessment practices embody power relations between institutions, teachers and students, and these power relationships determine the possibility and the extent of actual changes in assessment practices. Labelling a practice as “alternative assessment” does not guarantee meaningful departure from existing practice. Recent research has warned that assessment practices in education cannot be presumed to empower students in ways that enhance their learning. This is partly due to a tendency to speak of power in assessment in undefined terms. Hence, it would be useful to identify the types of power present in assessment practices and the contexts which give rise to them. This paper seeks to examine power in the context of different ways that alternative assessment is practiced and understood by teachers. Research on teachers’ conceptions of alternative assessment is presented, and each of the conceptions is then analysed for insights into teachers’ meanings and practices of power. In particular, instances of sovereign, epistemological and disciplinary power in alternative assessment are identified to illuminate new ways of understanding and using alternative assessment.

  18. Methodology for Assessment of Inertial Response from Wind Power Plants

    DEFF Research Database (Denmark)

    Altin, Müfit; Teodorescu, Remus; Bak-Jensen, Birgitte

    2012-01-01

    High wind power penetration levels result in additional requirements from wind power in order to improve frequency stability. Replacement of conventional power plants with wind power plants reduces the power system inertia due to the wind turbine technology. Consequently, the rate of change...... of frequency and the maximum frequency deviation increase after a disturbance such as generation loss, load increase, etc. Having no inherent inertial response, wind power plants need additional control concepts in order to provide an additional active power following a disturbance. Several control concepts...... have been implemented in the literature, but the assessment of these control concepts with respect to power system requirements has not been specified. In this paper, a methodology to assess the inertial response from wind power plants is proposed. Accordingly, the proposed methodology is applied...

  19. A generalized model to estimate the statistical power in mitochondrial disease studies involving 2×k tables.

    Directory of Open Access Journals (Sweden)

    Jacobo Pardo-Seco

    Full Text Available BACKGROUND: Mitochondrial DNA (mtDNA) variation (i.e. haplogroups) has been analyzed in regard to a number of multifactorial diseases. The statistical power of a case-control study determines the a priori probability of rejecting the null hypothesis of homogeneity between cases and controls. METHODS/PRINCIPAL FINDINGS: We critically review previous approaches to the estimation of statistical power based on the restricted scenario where the number of cases equals the number of controls, and propose a methodology that broadens procedures to more general situations. We developed statistical procedures that consider different disease scenarios, variable sample sizes in cases and controls, and variable numbers of haplogroups and effect sizes. The results indicate that the statistical power of a particular study can improve substantially by increasing the number of controls with respect to cases. In the opposite direction, the power decreases substantially when testing a growing number of haplogroups. We developed mitPower (http://bioinformatics.cesga.es/mitpower/), a web-based interface that implements the new statistical procedures and allows for the computation of the a priori statistical power in variable scenarios of case-control study designs, or, e.g., the number of controls needed to reach fixed effect sizes. CONCLUSIONS/SIGNIFICANCE: The present study provides statistical procedures for the computation of statistical power in common as well as complex case-control study designs involving 2×k tables, with special (but not exclusive) application to mtDNA studies. In order to reach a wide range of researchers, we also provide a friendly web-based tool--mitPower--that can be used in both retrospective and prospective case-control disease studies.
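
    For orientation, the textbook approximation behind this kind of calculation uses the noncentral chi-square distribution: for a 2×k table with Cohen's effect size w and total sample size N, the test has k-1 degrees of freedom and noncentrality w^2*N. The sketch below is not the mitPower implementation, and the effect sizes and counts are made up.

```python
import numpy as np
from scipy.stats import chi2, ncx2

def power_2xk(n_cases, n_controls, w, k, alpha=0.05):
    """Approximate power of a chi-square test on a 2 x k haplogroup table.

    w is Cohen's effect size; df = k - 1 and the noncentrality is w**2 times
    the total sample size."""
    df = k - 1
    crit = chi2.ppf(1.0 - alpha, df)
    return ncx2.sf(crit, df, w**2 * (n_cases + n_controls))

# More controls per case raises power; more haplogroups (larger k) lowers it.
for ratio in (1, 2, 4):
    print(f"{ratio}:1 controls:cases, k=10 ->",
          round(power_2xk(500, 500 * ratio, w=0.08, k=10), 3))
print("1:1 but k=20 haplogroups ->", round(power_2xk(500, 500, w=0.08, k=20), 3))
```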

  20. Assessing biomass of diverse coastal marsh ecosystems using statistical and machine learning models

    Science.gov (United States)

    Mo, Yu; Kearney, Michael S.; Riter, J. C. Alexis; Zhao, Feng; Tilley, David R.

    2018-06-01

    The importance and vulnerability of coastal marshes necessitate effective ways to closely monitor them. Optical remote sensing is a powerful tool for this task, yet its application to diverse coastal marsh ecosystems consisting of different marsh types is limited. This study samples spectral and biophysical data from freshwater, intermediate, brackish, and saline marshes in Louisiana, and develops statistical and machine learning models to assess the marshes' biomass with combined ground, airborne, and spaceborne remote sensing data. It is found that linear models derived from NDVI and EVI are most favorable for assessing Leaf Area Index (LAI) using multispectral data (R2 = 0.7 and 0.67, respectively), and the random forest models are most useful in retrieving LAI and Aboveground Green Biomass (AGB) using hyperspectral data (R2 = 0.91 and 0.84, respectively). It is also found that marsh type and plant species significantly impact the linear model development (P < 0.05). The study thus demonstrates approaches for assessing the biomass of Louisiana's coastal marshes using various optical remote sensing techniques, and highlights the impacts of the marshes' species composition on the model development and of the sensors' spatial resolution on biomass mapping, thereby providing useful tools for monitoring the biomass of coastal marshes in Louisiana and diverse coastal marsh ecosystems elsewhere.
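
    The two modelling routes described above can be sketched with scikit-learn as follows; the reflectances, hyperspectral bands and biomass values are synthetic stand-ins, so the cross-validated R2 values are illustrative only.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)
n = 300

# Synthetic stand-ins: red/NIR reflectance -> NDVI -> LAI, and a 50-band
# "hyperspectral" matrix loosely related to aboveground green biomass (AGB).
red, nir = rng.uniform(0.02, 0.10, n), rng.uniform(0.20, 0.50, n)
ndvi = (nir - red) / (nir + red)
lai = 1.5 + 4.0 * ndvi + rng.normal(scale=0.4, size=n)

hyper = rng.normal(size=(n, 50))
agb = 200.0 + 40.0 * hyper[:, :5].sum(axis=1) + rng.normal(scale=30.0, size=n)

lin_r2 = cross_val_score(LinearRegression(), ndvi.reshape(-1, 1), lai,
                         cv=5, scoring="r2").mean()
rf = RandomForestRegressor(n_estimators=200, random_state=0)
rf_r2 = cross_val_score(rf, hyper, agb, cv=5, scoring="r2").mean()
print(f"NDVI linear model R2: {lin_r2:.2f}, random forest R2: {rf_r2:.2f}")
```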

  1. Dark matter statistics for large galaxy catalogs: power spectra and covariance matrices

    Science.gov (United States)

    Klypin, Anatoly; Prada, Francisco

    2018-06-01

    Large-scale surveys of galaxies require accurate theoretical predictions of the dark matter clustering for thousands of mock galaxy catalogs. We demonstrate that this goal can be achieved with the new Parallel Particle-Mesh (PM) N-body code GLAM at a very low computational cost. We run ~22,000 simulations with ~2 billion particles that provide ~1% accuracy of the dark matter power spectra P(k) for wave-numbers up to k ~ 1 h Mpc^-1. Using this large data set we study the power spectrum covariance matrix. In contrast to many previous analytical and numerical results, we find that the covariance matrix normalised to the power spectrum, C(k, k')/P(k)P(k'), has a complex structure of non-diagonal components: an upturn at small k, followed by a minimum at k ≈ 0.1-0.2 h Mpc^-1, and a maximum at k ≈ 0.5-0.6 h Mpc^-1. The normalised covariance matrix strongly evolves with redshift: C(k, k') ∝ δ^α(t)P(k)P(k'), where δ is the linear growth factor and α ≈ 1-1.25, which indicates that the covariance matrix depends on cosmological parameters. We also show that waves longer than 1 h^-1 Gpc have very little impact on the power spectrum and covariance matrix. This significantly reduces the computational costs and complexity of theoretical predictions: relatively small-volume ~(1 h^-1 Gpc)^3 simulations capture the necessary properties of dark matter clustering statistics. As our results also indicate, achieving ~1% errors in the covariance matrix for k < 0.50 h Mpc^-1 requires a resolution better than ε ~ 0.5 h^-1 Mpc.
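
    The covariance estimate itself is a standard sample covariance across realisations, normalised by the mean spectrum. A small sketch with synthetic stand-in spectra (not GLAM outputs) is:

```python
import numpy as np

rng = np.random.default_rng(7)

# Stand-in spectra: n_mocks realisations of P(k) with partly correlated scatter
# (the real inputs would be the spectra measured from the simulations).
n_mocks, n_k = 2000, 40
k = np.linspace(0.02, 1.0, n_k)
p_true = 2.0e4 * k**-1.5
scatter = rng.normal(size=(n_mocks, n_k)) + 0.5 * rng.normal(size=(n_mocks, 1))
pk = p_true * (1.0 + 0.05 * scatter)

p_mean = pk.mean(axis=0)
cov = np.cov(pk, rowvar=False)                  # C(k, k')
cov_norm = cov / np.outer(p_mean, p_mean)       # C(k, k') / P(k) P(k')
print("diagonal of the normalised covariance (first 5 bins):",
      np.round(np.diag(cov_norm)[:5], 5))
```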

  2. Theoretical remarks on the statistics of three discriminants in Piety's automated signature analysis of PSD [Power Spectral Density] data

    International Nuclear Information System (INIS)

    Behringer, K.; Spiekerman, G.

    1984-01-01

    Piety (1977) proposed an automated signature analysis of power spectral density data. Eight statistical decision discriminants are introduced. For nearly all the discriminants, improved confidence statements can be made. The statistical characteristics of the last three discriminants, which are applications of non-parametric tests, are considered. (author)

  3. The influence of the presence of deviant item score patterns on the power of a person-fit statistic

    NARCIS (Netherlands)

    Meijer, R.R.

    1994-01-01

    In studies investigating the power of person-fit statistics it is often assumed that the item parameters that are used to calculate the statistics can be estimated in a sample without aberrant persons. However, in practical test applications calibration samples most likely will contain aberrant

  4. Use of Statistical Information for Damage Assessment of Civil Engineering Structures

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Andersen, P.

    This paper considers the problem of damage assessment of civil engineering structures using statistical information. The aim of the paper is to review how researchers recently have tried to solve the problem. It is pointed out that the problem consists of not only how to use the statistical...

  5. Citizen preference assessment for power supply visions using choice experiments

    International Nuclear Information System (INIS)

    Nakatani, Jun; Tahara, Kiyotaka; Tanaka, Koji; Matsumoto, Shinya; Mizuno, Tateki

    2015-01-01

    In this paper, citizen preferences for power supply visions were assessed using choice experiments. In particular, preferences for the composition of power generation including renewable energy and nuclear power were analyzed. We also investigated how the need and consciousness for electricity saving affected the preferences for power supply visions. The results indicated that a respondent group who felt negative about resuming the operations at nuclear power plants had discriminative preferences for attributes of the power supply visions, and that the priority of carbon dioxide emissions as a criterion for evaluating the power supply visions became lower when the composition of power generation was presented. Consciousness for electricity saving, as well as preferences for nuclear power generation, differed depending on regions of residence, while their relationship was similar among respondent groups who lived in the jurisdictional areas of the electric power companies that had experienced risks of demand-supply gaps. (author)

  6. Impact of Statistical Learning Methods on the Predictive Power of Multivariate Normal Tissue Complication Probability Models

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Schilstra, Cornelis; Langendijk, Johannes A.; Veld, Aart A. van' t [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands)

    2012-03-15

    Purpose: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. Methods and Materials: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. Results: It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. Conclusions: The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended.
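
    A minimal LASSO-style NTCP sketch in the same spirit, using L1-penalised logistic regression with cross-validated performance, is given below; it uses scikit-learn on synthetic predictors rather than the authors' clinical data and validation scheme.

```python
import numpy as np
from sklearn.linear_model import LogisticRegressionCV
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(8)

# Synthetic stand-ins: 20 dose-volume/clinical predictors, binary complication.
n, p = 250, 20
X = rng.normal(size=(n, p))
logit = -0.5 + 0.9 * X[:, 0] + 0.6 * X[:, 1]     # only two truly predictive
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

lasso_ntcp = make_pipeline(
    StandardScaler(),
    LogisticRegressionCV(penalty="l1", solver="liblinear", Cs=20, cv=5))
auc = cross_val_score(lasso_ntcp, X, y, cv=5, scoring="roc_auc").mean()
print("cross-validated AUC of the L1-penalised NTCP model:", round(auc, 2))

lasso_ntcp.fit(X, y)
coef = lasso_ntcp.named_steps["logisticregressioncv"].coef_.ravel()
print("predictors retained by the penalty:", int(np.count_nonzero(coef)))
```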

  7. Impact of statistical learning methods on the predictive power of multivariate normal tissue complication probability models.

    Science.gov (United States)

    Xu, Cheng-Jian; van der Schaaf, Arjen; Schilstra, Cornelis; Langendijk, Johannes A; van't Veld, Aart A

    2012-03-15

    To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended. Copyright © 2012 Elsevier Inc. All rights reserved.

  8. Hysteresis and Power-Law Statistics during temperature induced martensitic transformation

    International Nuclear Information System (INIS)

    Paul, Arya; Sengupta, Surajit; Rao, Madan

    2011-01-01

    We study hysteresis in temperature-induced martensitic transformation using a 2D model solid exhibiting a square-to-rhombic structural transition. We find that, upon quenching the high-temperature square phase, martensites are nucleated at sites having large non-affineness and ultimately invade the whole of the high-temperature square phase. On heating the martensite, the high-temperature square phase is restored. The transformation proceeds through avalanches. The amplitude and the time duration of these avalanches exhibit power-law statistics both during heating and during cooling of the system. The exponents corresponding to heating and cooling are different, thereby indicating that the nucleation and dissolution of the product phase follow different transformation mechanisms.

  9. Impact of Statistical Learning Methods on the Predictive Power of Multivariate Normal Tissue Complication Probability Models

    International Nuclear Information System (INIS)

    Xu Chengjian; Schaaf, Arjen van der; Schilstra, Cornelis; Langendijk, Johannes A.; Veld, Aart A. van’t

    2012-01-01

    Purpose: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. Methods and Materials: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. Results: It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. Conclusions: The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended.

  10. Novel approach to assess local market power considering transmission constraints

    International Nuclear Information System (INIS)

    Li, Canbing; Xia, Qing; Kang, Chongqing; Jiang, Jianjian

    2008-01-01

    Market power (MP) assessment and mitigation affect the efficiency of the generation market. Traditional indices such as the HHI and the Lerner index cannot express local market power, which is caused by transmission constraints. Transmission constraints divide the market into smaller parts. Some generators can abuse their MP in one part but not in the whole market. This paper describes a new approach to assess market power. The main contributions of the new method can be summarized as follows. First, the concept of the local market is developed, and the whole power system is divided into several local markets, as transmission congestion divides the market. In the local markets there are no transmission constraints, so local market power does not exist within them. Then the local market power index (LMPI) is calculated according to market concentration, transmission constraints, and the demand-supply ratio. Based on the LMPI, an integrated local market power index that describes the whole picture of the market can be obtained. It has been proved that the new approach can assess market power exactly and identify the critical factors that result in market power, as well as where generators can most easily exercise market power. The findings in this paper are helpful for market monitoring and for mitigating market power. Moreover, the new index can be used to evaluate the availability of the power grid for generation competition and to support power transmission expansion planning. (author)

  11. Assessment of Power Quality Problems for TRIGA PUSPATI Reactor (RTP)

    International Nuclear Information System (INIS)

    Mohd Fazli Zakaria; Ramachandaramurthy, V.K.

    2016-01-01

    Electrical power systems are exposed to different types of power quality disturbances. Investigation and monitoring of power quality are necessary to maintain accurate operation of sensitive equipment, especially for nuclear installations. This paper discusses the power quality problems observed at the electrical sources of the PUSPATI TRIGA Reactor (RTP). Assessment of power quality requires the identification of any anomalous behaviour on a power system which adversely affects the normal operation of electrical or electronic equipment. A power quality assessment involves gathering data resources, analyzing the data with reference to power quality standards and then, if problems exist, recommending mitigation techniques. Field power quality data were collected by a power quality recorder and analyzed with reference to power quality standards. Normally, electrical power is supplied to the RTP via two sources in order to maintain good reliability, with each source designed to carry the full load. The assessment of power quality during reactor operation was performed for both electrical sources. There were several disturbances, such as voltage harmonics and flicker, that exceeded the thresholds. (author)

  12. Application of probabilistic safety assessment for Macedonian electric power system

    International Nuclear Information System (INIS)

    Kancev, D.; Causevski, A.; Cepin, M.; Volkanovski, A.

    2007-01-01

    Due to the complex and integrated nature of a power system, failures in any part of the system can cause interruptions which range from inconveniencing a small number of local residents to a major and widespread catastrophic disruption of supply known as a blackout. The objective of the paper is to show that the methods and tools of probabilistic safety assessment are applicable for the assessment and improvement of real power systems. The method used in this paper is developed based on fault tree analysis and is adapted for power system reliability analysis. A particular power system, i.e. the Macedonian power system, is the object of the analysis. The results show that the method is suitable for application to real systems. The reliability of the Macedonian power system, treated as a static system, is assessed. The components which can significantly impact the power system are identified and analysed in more detail. (author)

  13. Statistically based evaluation of toughness properties of components in older nuclear power stations

    International Nuclear Information System (INIS)

    Aurich, D.; Jaenicke, B.; Veith, H.

    1996-01-01

    The KTA code 3201.2 contains provisions for the evaluation of K_Ic values measured in components, but there are no instructions on how to proceed. According to the present state of the art in science and technology, fracture toughness values K_Ic(T) should be evaluated statistically in order to specify their relationship to the loading values K_I(T). The 'Master Curve' concept of Wallin yields too flat a curve shape at high temperatures. The statistical evaluation of K_Ic values can also be carried out with the KTA-K_Ic reference temperature function, assuming a normal distribution of the measured values. The KTA-K_Ic reference temperature curve corresponds approximately to a fracture probability of 5% when the KTA-K_Ic reference temperature function is used for the statistical evaluation of the test results. Conclusions for the assessment of the safety margins can be drawn from the steeper shape of the KTA-K_Ic reference temperature function in comparison to the 'Master Curve'. (orig.) [de
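
    For orientation, the Wallin 'Master Curve' referred to above is commonly written as a three-parameter Weibull with shape 4, K_min = 20 MPa*sqrt(m) and a temperature-dependent scale, as sketched below; this is the generic literature form, not the KTA reference temperature function, whose steeper shape is the point of the comparison.

```python
import numpy as np

def master_curve_k(T, T0, pf):
    """Wallin Master Curve toughness (MPa*sqrt(m)) at fracture probability pf.

    Three-parameter Weibull with shape 4 and K_min = 20 MPa*sqrt(m); the
    median reduces to the familiar 30 + 70*exp(0.019*(T - T0))."""
    k0 = 31.0 + 77.0 * np.exp(0.019 * (T - T0))
    return 20.0 + (k0 - 20.0) * (-np.log(1.0 - pf)) ** 0.25

T = np.array([-50.0, 0.0, 50.0])          # test temperatures, deg C (assumed)
for pf in (0.05, 0.50, 0.95):
    print(f"P_f = {pf:.2f}:", np.round(master_curve_k(T, T0=-20.0, pf=pf), 1))
```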

  14. Sparse Power-Law Network Model for Reliable Statistical Predictions Based on Sampled Data

    Directory of Open Access Journals (Sweden)

    Alexander P. Kartun-Giles

    2018-04-01

    Full Text Available A projective network model is a model that enables predictions to be made based on a subsample of the network data, with the predictions remaining unchanged if a larger sample is taken into consideration. An exchangeable model is a model that does not depend on the order in which nodes are sampled. Despite a large variety of non-equilibrium (growing) and equilibrium (static) sparse complex network models that are widely used in network science, how to reconcile sparseness (constant average degree) with the desired statistical properties of projectivity and exchangeability is currently an outstanding scientific problem. Here we propose a network process with hidden variables which is projective and can generate sparse power-law networks. Despite the model not being exchangeable, it can be closely related to exchangeable uncorrelated networks, as indicated by its information theory characterization and its network entropy. The use of the proposed network process as a null model is here tested on real data, indicating that the model offers a promising avenue for statistical network modelling.

  15. Assessing Power System Stability Following Load Changes and Considering Uncertainty

    Directory of Open Access Journals (Sweden)

    D. V. Ngo

    2018-04-01

    Full Text Available An increase in load capacity during the operation of a power system usually causes voltage drops and leads to system instability, so it is necessary to monitor the effect of load changes. This article presents a method for assessing power system stability as a function of load node capacity while considering uncertainty factors in the system. The proposed approach can be applied to large-scale power systems for voltage stability assessment in real time.

  16. Safety Assessment - Swedish Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Kjellstroem, B. [Luleaa Univ. of Technology (Sweden)

    1996-12-31

    After the reactor accident at Three Mile Island, the Swedish nuclear power plants were equipped with filtered venting of the containment. Several types of accidents can be identified where the filtered venting has no effect on the radioactive release. The probability for such accidents is hopefully very small. It is not possible however to estimate the probability accurately. Experiences gained in the last years, which have been documented in official reports from the Nuclear Power Inspectorate indicate that the probability for core melt accidents in Swedish reactors can be significantly larger than estimated earlier. A probability up to one in a thousand operating years can not be excluded. There are so far no indications that aging of the plants has contributed to an increased accident risk. Maintaining the safety level with aging nuclear power plants can however be expected to be increasingly difficult. It is concluded that the 12 Swedish plants remain a major threat for severe radioactive pollution of the Swedish environment despite measures taken since 1980 to improve their safety. Closing of the nuclear power plants is the only possibility to eliminate this threat. It is recommended that until this is done, quantitative safety goals, same for all Swedish plants, shall be defined and strictly enforced. It is also recommended that utilities distributing misleading information about nuclear power risks shall have their operating license withdrawn. 37 refs.

  17. Safety Assessment - Swedish Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kjellstroem, B.

    1996-01-01

    After the reactor accident at Three Mile Island, the Swedish nuclear power plants were equipped with filtered venting of the containment. Several types of accidents can be identified where the filtered venting has no effect on the radioactive release. The probability for such accidents is hopefully very small. It is not possible however to estimate the probability accurately. Experiences gained in the last years, which have been documented in official reports from the Nuclear Power Inspectorate indicate that the probability for core melt accidents in Swedish reactors can be significantly larger than estimated earlier. A probability up to one in a thousand operating years can not be excluded. There are so far no indications that aging of the plants has contributed to an increased accident risk. Maintaining the safety level with aging nuclear power plants can however be expected to be increasingly difficult. It is concluded that the 12 Swedish plants remain a major threat for severe radioactive pollution of the Swedish environment despite measures taken since 1980 to improve their safety. Closing of the nuclear power plants is the only possibility to eliminate this threat. It is recommended that until this is done, quantitative safety goals, same for all Swedish plants, shall be defined and strictly enforced. It is also recommended that utilities distributing misleading information about nuclear power risks shall have their operating license withdrawn. 37 refs

  18. Implementation of a Model Output Statistics based on meteorological variable screening for short‐term wind power forecast

    DEFF Research Database (Denmark)

    Ranaboldo, Matteo; Giebel, Gregor; Codina, Bernat

    2013-01-01

    A combination of physical and statistical treatments to post-process numerical weather prediction (NWP) outputs is needed for successful short-term wind power forecasts. One of the most promising and effective approaches for the statistical treatment is the Model Output Statistics (MOS) technique....... The proposed MOS performed well at both wind farms, and its forecasts compare positively with an actual operative model in use at Risø DTU and with other MOS types, showing minimum BIAS and improving the NWP power forecast by around 15% in terms of root mean square error. Further improvements could be obtained...

  19. Statistical inquiry on the reliability of emergency diesel stations in German nuclear power plants

    International Nuclear Information System (INIS)

    1983-01-01

    This statistical inquiry is based on 692 occurrences in 40 diesel stations of 10 German nuclear power plants. Various parameters influencing the failure behaviour of diesel stations were investigated; only significant plant-specific influences and the impact of diesel station circuitry on failure behaviour were established. According to the results of this inquiry, running time, start-up number and increasing operational experience do not appear to influence the failure behaviour of diesel stations. The expected failure probability of diesel stations varies among the different nuclear power plants. Taking into account both start-up and operational failures (with monthly inspections and running times of up to 2 h), this value is in the range of 1.6 x 10^-2 to 1.7 x 10^-3 per application. Considering the failure data of all diesel stations, the failure probability (start-up and operational failures) is 8.1 x 10^-3 per application. On account of the two common-mode failures registered, a common-mode failure probability of 10^-3 was established. The inquiry also showed that the non-availability of diesel stations is essentially determined by maintenance intervals. (orig.) [de
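
    A failure probability per application of this kind is a simple proportion, and its uncertainty can be summarised with a binomial confidence interval; the counts below are assumed for illustration (chosen to reproduce the quoted 8.1 x 10^-3), not the report's raw tallies.

```python
from statsmodels.stats.proportion import proportion_confint

# Assumed counts, chosen to reproduce the quoted aggregate of 8.1 x 10^-3.
failures, demands = 28, 3450
p_hat = failures / demands
low, high = proportion_confint(failures, demands, alpha=0.05, method="beta")
print(f"failure probability per application: {p_hat:.2e} "
      f"(95% Clopper-Pearson interval {low:.2e} to {high:.2e})")
```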

  20. Statistical analysis of wind power in the region of Veracruz (Mexico)

    Energy Technology Data Exchange (ETDEWEB)

    Cancino-Solorzano, Yoreley [Departamento de Ing Electrica-Electronica, Instituto Tecnologico de Veracruz, Calzada Miguel A. de Quevedo 2779, 91860 Veracruz (Mexico); Xiberta-Bernat, Jorge [Departamento de Energia, Escuela Tecnica Superior de Ingenieros de Minas, Universidad de Oviedo, C/Independencia, 13, 2a Planta, 33004 Oviedo (Spain)

    2009-06-15

    The capacity of the Mexican electricity sector faces the challenge of satisfying the demand of 80 GW forecast by 2016. This value supposes a steady yearly average increase of some 4.9%. The capacity additions in the electricity sector over the next eight years will be mainly made up of combined cycle power plants, which could be a threat to the energy supply of the country due to the fact that the country is not self-sufficient in natural gas. As an alternative, the wind energy resource could be a more suitable option compared with combined cycle power plants. This option is backed by market trends indicating that wind technology costs will continue to decrease in the near future, as has happened in recent years. Evaluation of the eolic potential in different areas of the country must be carried out in order to achieve the best possible use of this option. This paper gives a statistical analysis of the wind characteristics in the region of Veracruz. The daily, monthly and annual wind speed values have been studied together with their prevailing direction. The data analyzed correspond to five meteorological stations and two anemometric stations located in the aforementioned area. (author)
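
    A typical first step in such an analysis is a Weibull fit to the measured wind speeds; a short sketch with synthetic hourly data (not the Veracruz station records) is:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)

# Synthetic hourly wind speeds (m/s) standing in for a station record.
wind = stats.weibull_min.rvs(c=2.1, scale=6.5, size=24 * 365, random_state=rng)

# Maximum-likelihood Weibull fit with the location fixed at zero.
k_shape, _, c_scale = stats.weibull_min.fit(wind, floc=0)
mean_speed = stats.weibull_min.mean(k_shape, loc=0, scale=c_scale)
print(f"k = {k_shape:.2f}, c = {c_scale:.2f} m/s, mean speed = {mean_speed:.2f} m/s")
```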

  1. Automated Data Collection for Determining Statistical Distributions of Module Power Undergoing Potential-Induced Degradation: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Hacke, P.; Spataru, S.

    2014-08-01

    We propose a method for increasing the frequency of data collection and reducing the time and cost of accelerated lifetime testing of photovoltaic modules undergoing potential-induced degradation (PID). This consists of in-situ measurements of dark current-voltage curves of the modules at elevated stress temperature, their use to determine the maximum power at 25 degrees C standard test conditions (STC), and distribution statistics for determining degradation rates as a function of stress level. The semi-continuous data obtained by this method clearly show degradation curves of the maximum power, including an incubation phase, rates and extent of degradation, precise time to failure, and partial recovery. Stress tests were performed on crystalline silicon modules at 85% relative humidity and 60 degrees C, 72 degrees C, and 85 degrees C. Activation energy for the mean time to failure (1% relative) of 0.85 eV was determined and a mean time to failure of 8,000 h at 25 degrees C and 85% relative humidity is predicted. No clear trend in maximum degradation as a function of stress temperature was observed.
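
    The extrapolation from stress temperature to use conditions follows an Arrhenius acceleration factor with the quoted activation energy of 0.85 eV; in the sketch below, the time to failure at 85 degrees C is an assumed placeholder chosen so that the extrapolation lands near the reported ~8,000 h at 25 degrees C.

```python
import numpy as np

K_B = 8.617e-5  # Boltzmann constant in eV/K

def arrhenius_extrapolation(ttf_stress_h, T_stress_C, Ea_eV, T_use_C):
    """Scale a time to failure from stress to use temperature (humidity fixed)."""
    T_stress, T_use = T_stress_C + 273.15, T_use_C + 273.15
    accel = np.exp(Ea_eV / K_B * (1.0 / T_use - 1.0 / T_stress))
    return ttf_stress_h * accel

# Ea = 0.85 eV as quoted; 31 h at 85 C is an assumed placeholder chosen so the
# extrapolation lands near the reported ~8,000 h at 25 C and 85% RH.
print(f"predicted time to failure at 25 C: "
      f"{arrhenius_extrapolation(31.0, 85.0, 0.85, 25.0):.0f} h")
```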

  2. Statistical analysis of wind power in the region of Veracruz (Mexico)

    International Nuclear Information System (INIS)

    Cancino-Solorzano, Yoreley; Xiberta-Bernat, Jorge

    2009-01-01

    The capacity of the Mexican electricity sector faces the challenge of satisfying the demand of 80 GW forecast by 2016. This value supposes a steady yearly average increase of some 4.9%. The capacity additions in the electricity sector over the next eight years will be mainly made up of combined cycle power plants, which could be a threat to the energy supply of the country due to the fact that the country is not self-sufficient in natural gas. As an alternative, the wind energy resource could be a more suitable option compared with combined cycle power plants. This option is backed by market trends indicating that wind technology costs will continue to decrease in the near future, as has happened in recent years. Evaluation of the eolic potential in different areas of the country must be carried out in order to achieve the best possible use of this option. This paper gives a statistical analysis of the wind characteristics in the region of Veracruz. The daily, monthly and annual wind speed values have been studied together with their prevailing direction. The data analyzed correspond to five meteorological stations and two anemometric stations located in the aforementioned area. (author)

  3. Nuclear power plant performance statistics. Comparison with fossil-fired units

    International Nuclear Information System (INIS)

    Tabet, C.; Laue, H.J.; Qureshi, A.; Skjoeldebrand, R.; White, D.

    1983-01-01

    The joint UNIPEDE/World Energy Conference Committee on Availability of Thermal Generating Plants has a mandate to study the availability of thermal plants and the different factors that influence it. This has led to the collection and publication at the Congress of the World Energy Conference (WEC) every third year of availability and unavailability factors to be used in systems reliability studies and operations and maintenance planning. For nuclear power plants the joint UNIPEDE/WEC Committee relies on the IAEA to provide availability and unavailability data. The IAEA has published an annual report with operating data from nuclear plants in its Member States since 1971, covering in addition back data from the early 1960s. These reports have developed over the years and in the early 1970s the format was brought into close conformity with that used by UNIPEDE and WEC to report performance of fossil-fired generating plants. Since 1974 an annual analytical summary report has been prepared. In 1981 all information on operating experience with nuclear power plants was placed in a computer file for easier reference. The computerized Power Reactor Information System (PRIS) ensures that data are easily retrievable and at its present level it remains compatible with various national systems. The objectives for the IAEA data collection and evaluation have developed significantly since 1970. At first, the IAEA primarily wanted to enable the individual power plant operator to compare the performance of his own plant with that of others of the same type; when enough data had been collected, they provided the basis for assessment of the fundamental performance parameters used in economic project studies; now, the data base merits being used in setting availability objectives for power plant operations. (author)

  4. Methods of assessing nuclear power plant risks

    International Nuclear Information System (INIS)

    Skvarka, P.; Kovacz, Z.

    1985-01-01

    The concept of safety evaluation is based on safety criteria - standards or set qualitative values of parameters and indices used in designing nuclear power plants, incorporating demands on the quality of equipment and operation of the plant, its siting and the technical means for achieving nuclear safety. The concepts of basic and optimal risk values are presented. Factors indispensable for the evaluation of nuclear power plant risk are summed up, and the present world trend towards probability-based evaluation is discussed. (J.C.)

  5. Developing a statistically powerful measure for quartet tree inference using phylogenetic identities and Markov invariants.

    Science.gov (United States)

    Sumner, Jeremy G; Taylor, Amelia; Holland, Barbara R; Jarvis, Peter D

    2017-12-01

    Recently there has been renewed interest in phylogenetic inference methods based on phylogenetic invariants, alongside the related Markov invariants. Broadly speaking, both these approaches give rise to polynomial functions of sequence site patterns that, in expectation value, either vanish for particular evolutionary trees (in the case of phylogenetic invariants) or have well understood transformation properties (in the case of Markov invariants). While both approaches have been valued for their intrinsic mathematical interest, it is not clear how they relate to each other, and to what extent they can be used as practical tools for inference of phylogenetic trees. In this paper, by focusing on the special case of binary sequence data and quartets of taxa, we are able to view these two different polynomial-based approaches within a common framework. To motivate the discussion, we present three desirable statistical properties that we argue any invariant-based phylogenetic method should satisfy: (1) sensible behaviour under reordering of input sequences; (2) stability as the taxa evolve independently according to a Markov process; and (3) explicit dependence on the assumption of a continuous-time process. Motivated by these statistical properties, we develop and explore several new phylogenetic inference methods. In particular, we develop a statistically bias-corrected version of the Markov invariants approach which satisfies all three properties. We also extend previous work by showing that the phylogenetic invariants can be implemented in such a way as to satisfy property (3). A simulation study shows that, in comparison to other methods, our new proposed approach based on bias-corrected Markov invariants is extremely powerful for phylogenetic inference. The binary case is of particular theoretical interest as-in this case only-the Markov invariants can be expressed as linear combinations of the phylogenetic invariants. A wider implication of this is that, for

  6. Reliability and statistical power analysis of cortical and subcortical FreeSurfer metrics in a large sample of healthy elderly.

    Science.gov (United States)

    Liem, Franziskus; Mérillat, Susan; Bezzola, Ladina; Hirsiger, Sarah; Philipp, Michel; Madhyastha, Tara; Jäncke, Lutz

    2015-03-01

    FreeSurfer is a tool to quantify cortical and subcortical brain anatomy automatically and noninvasively. Previous studies have reported reliability and statistical power analyses in relatively small samples or only selected one aspect of brain anatomy. Here, we investigated reliability and statistical power of cortical thickness, surface area, volume, and the volume of subcortical structures in a large sample (N=189) of healthy elderly subjects (64+ years). Reliability (intraclass correlation coefficient) of cortical and subcortical parameters is generally high (cortical: ICCs>0.87, subcortical: ICCs>0.95). Surface-based smoothing increases reliability of cortical thickness maps, while it decreases reliability of cortical surface area and volume. Nevertheless, statistical power of all measures benefits from smoothing. When aiming to detect a 10% difference between groups, the number of subjects required to test effects with sufficient power over the entire cortex varies between cortical measures (cortical thickness: N=39, surface area: N=21, volume: N=81; 10mm smoothing, power=0.8, α=0.05). For subcortical regions this number is between 16 and 76 subjects, depending on the region. We also demonstrate the advantage of within-subject designs over between-subject designs. Furthermore, we publicly provide a tool that allows researchers to perform a priori power analysis and sensitivity analysis to help evaluate previously published studies and to design future studies with sufficient statistical power. Copyright © 2014 Elsevier Inc. All rights reserved.
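
    The sample-size side of such an analysis can be sketched by converting a 10% group difference and a between-subject coefficient of variation into Cohen's d and solving a standard two-sample power equation; the CVs below are assumed for illustration (picked so the outputs land broadly near the subject counts quoted above), not the paper's measured values.

```python
import numpy as np
from statsmodels.stats.power import TTestIndPower

def n_per_group(percent_diff, cv, power=0.80, alpha=0.05):
    """Subjects per group to detect a given percent difference in a
    morphometric measure, for a between-subject design with coefficient of
    variation cv."""
    d = (percent_diff / 100.0) / cv            # Cohen's d of the mean difference
    n = TTestIndPower().solve_power(effect_size=d, power=power, alpha=alpha)
    return int(np.ceil(n))

# Assumed CVs, picked so the outputs land broadly near the counts quoted above.
for measure, cv in [("cortical thickness", 0.15), ("surface area", 0.11),
                    ("subcortical volume", 0.23)]:
    print(f"{measure:20s} -> n = {n_per_group(10.0, cv)} per group")
```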

  7. Effectiveness of mouse minute virus inactivation by high temperature short time treatment technology: a statistical assessment.

    Science.gov (United States)

    Murphy, Marie; Quesada, Guillermo Miro; Chen, Dayue

    2011-11-01

    Viral contamination of mammalian cell cultures in GMP manufacturing facility represents a serious safety threat to biopharmaceutical industry. Such adverse events usually require facility shutdown for cleaning/decontamination, and thus result in significant loss of production and/or delay of product development. High temperature short time (HTST) treatment of culture media has been considered as an effective method to protect GMP facilities from viral contaminations. Log reduction factor (LRF) has been commonly used to measure the effectiveness of HTST treatment for viral inactivation. However, in order to prevent viral contaminations, HTST treatment must inactivate all infectious viruses (100%) in the medium batch since a single virus is sufficient to cause contamination. Therefore, LRF may not be the most appropriate indicator for measuring the effectiveness of HTST in preventing viral contaminations. We report here the use of the probability to achieve complete (100%) virus inactivation to assess the effectiveness of HTST treatment. By using mouse minute virus (MMV) as a model virus, we have demonstrated that the effectiveness of HTST treatment highly depends upon the level of viral contaminants in addition to treatment temperature and duration. We believe that the statistical method described in this report can provide more accurate information about the power and potential limitation of technologies such as HTST in our shared quest to mitigate the risk of viral contamination in manufacturing facilities. Copyright © 2011 The International Alliance for Biological Standardization. Published by Elsevier Ltd. All rights reserved.
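
    The probability-of-complete-inactivation idea can be illustrated with a simple Poisson survivor model (an assumption, not necessarily the authors' statistical method): with an initial load N0 and a log reduction factor LRF, the expected number of surviving infectious particles is N0*10^-LRF and the chance that none survive is exp(-N0*10^-LRF).

```python
import numpy as np

def p_complete_inactivation(initial_load, lrf):
    """P(no infectious virus survives), with survivors Poisson-distributed
    around initial_load * 10**(-lrf)."""
    return np.exp(-initial_load * 10.0 ** (-lrf))

# Effectiveness depends strongly on the contamination level, not only the LRF.
for load in (1e2, 1e5, 1e8):                     # infectious units per batch
    print(f"load = {load:.0e}, LRF = 6 -> "
          f"P(all inactivated) = {p_complete_inactivation(load, 6):.4f}")
```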

  8. Assessment and financing of electric power projects

    International Nuclear Information System (INIS)

    Moscote, R.A.

    1976-01-01

    The aim of the appraisal of a project is to examine the economic need which a project is designed to meet, to judge whether the project is likely to meet this need in an efficient way, and to conclude what conditions should be attached to eventual Bank financing. Bank involvement continues throughout the life of the project helping to ensure that each project is carried out at the least possible cost and that it makes the expected contribution to the country's development. This paper gives an idea about the origin, nature and functions of the World Bank Group, describes the criteria used by the Bank in its power project appraisals, discusses the Bank's views on nuclear power, and concludes with a review of past lending and probable future sources of financing of electrical expansion in the less developed countries. (orig./UA) [de

  9. Specification of life cycle assessment in nuclear power plants

    International Nuclear Information System (INIS)

    Abbaspour, M.; Kargari, N.; Mastouri, R.

    2008-01-01

    Life cycle assessment is an environmental management tool for assessing the environmental impacts of a product or a process. Life cycle assessment involves the evaluation of environmental impacts through all stages of the life cycle of a product or process; in other words, it takes a 'cradle to grave' approach. Outcomes of life cycle assessment include pollution prevention, energy-efficient systems, material conservation, economic benefits and sustainable development. All power generation technologies affect the environment in one way or another, and the main environmental impact does not always occur during operation of the power plant. The life cycle assessment of nuclear power entails studying the entire fuel cycle from the mine to the deep repository, as well as the construction, operation and demolition of the power station. Nuclear power plays an important role in electricity production in several countries, even though its use remains controversial. Due to the shortage of fossil fuel energy resources, many countries have started to seek alternatives for their sources of energy production. A life cycle assessment can capture all environmental impacts of nuclear power, from extracting resources, building facilities and transporting material through to the final conversion into useful energy services

  10. Calibrating the Difficulty of an Assessment Tool: The Blooming of a Statistics Examination

    Science.gov (United States)

    Dunham, Bruce; Yapa, Gaitri; Yu, Eugenia

    2015-01-01

    Bloom's taxonomy is proposed as a tool by which to assess the level of complexity of assessment tasks in statistics. Guidelines are provided for how to locate tasks at each level of the taxonomy, along with descriptions and examples of suggested test questions. Through the "Blooming" of an examination--that is, locating its constituent…

  11. Statistical Methods for Assessments in Simulations and Serious Games. Research Report. ETS RR-14-12

    Science.gov (United States)

    Fu, Jianbin; Zapata, Diego; Mavronikolas, Elia

    2014-01-01

    Simulation or game-based assessments produce outcome data and process data. In this article, some statistical models that can potentially be used to analyze data from simulation or game-based assessments are introduced. Specifically, cognitive diagnostic models that can be used to estimate latent skills from outcome data so as to scale these…

  12. An assessment of machine and statistical learning approaches to inferring networks of protein-protein interactions

    Directory of Open Access Journals (Sweden)

    Browne Fiona

    2006-12-01

    Protein-protein interactions (PPI) play a key role in many biological systems. Over the past few years, an explosion in the availability of functional biological data obtained from high-throughput technologies to infer PPI has been observed. However, results obtained from such experiments show high rates of false positive and false negative predictions as well as systematic predictive bias. Recent research has revealed that several machine and statistical learning methods applied to integrate relatively weak, diverse sources of large-scale functional data may provide improved predictive accuracy and coverage of PPI. In this paper we describe the effects of applying different computational, integrative methods to predict PPI in Saccharomyces cerevisiae. We investigated the predictive ability of combining different sets of relatively strong and weak predictive datasets. We analysed several genomic datasets ranging from mRNA co-expression to marginal essentiality. Moreover, we expanded an existing multi-source dataset from S. cerevisiae by constructing a new set of putative interactions extracted from Gene Ontology (GO)-driven annotations in the Saccharomyces Genome Database. Several classification techniques were evaluated: Simple Naive Bayesian (SNB), Multilayer Perceptron (MLP) and K-Nearest Neighbors (KNN). Relatively simple classification methods (i.e. less computationally intensive and mathematically complex), such as SNB, have proven to be proficient at predicting PPI. SNB produced the "highest" predictive quality, obtaining an area under the Receiver Operating Characteristic (ROC) curve (AUC) value of 0.99. The lowest AUC value of 0.90 was obtained by the KNN classifier. This assessment also demonstrates the strong predictive power of GO-driven models, which offered predictive performance above 0.90 using the different machine learning and statistical techniques. As the predictive power of single-source datasets became weaker, MLP and SNB performed
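
    As a loose illustration of the kind of classifier comparison described above (synthetic data, not the yeast PPI datasets), the following Python sketch scores a naive Bayes, an MLP and a KNN classifier by cross-validated ROC AUC:

      # Illustrative comparison of simple classifiers by ROC AUC on synthetic data,
      # in the spirit of the SNB / MLP / KNN comparison above (the dataset and
      # scores here are synthetic, not the yeast PPI data).
      from sklearn.datasets import make_classification
      from sklearn.model_selection import cross_val_score
      from sklearn.naive_bayes import GaussianNB
      from sklearn.neural_network import MLPClassifier
      from sklearn.neighbors import KNeighborsClassifier

      X, y = make_classification(n_samples=2000, n_features=12, n_informative=6,
                                 weights=[0.9, 0.1], random_state=0)

      models = {
          "Naive Bayes": GaussianNB(),
          "MLP": MLPClassifier(hidden_layer_sizes=(20,), max_iter=1000, random_state=0),
          "KNN": KNeighborsClassifier(n_neighbors=15),
      }
      for name, model in models.items():
          auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
          print(f"{name}: mean AUC = {auc:.3f}")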

  13. Survey design, statistical analysis, and basis for statistical inferences in coastal habitat injury assessment: Exxon Valdez oil spill

    International Nuclear Information System (INIS)

    McDonald, L.L.; Erickson, W.P.; Strickland, M.D.

    1995-01-01

    The objective of the Coastal Habitat Injury Assessment study was to document and quantify injury to biota of the shallow subtidal, intertidal, and supratidal zones throughout the shoreline affected by oil or cleanup activity associated with the Exxon Valdez oil spill. The results of these studies were to be used to support the Trustees' Type B Natural Resource Damage Assessment under the Comprehensive Environmental Response, Compensation, and Liability Act of 1980 (CERCLA). A probability-based stratified random sample of shoreline segments was selected with probability proportional to size from each of 15 strata (5 habitat types crossed with 3 levels of potential oil impact) based on the data available in July 1989. Three study regions were used: Prince William Sound, Cook Inlet/Kenai Peninsula, and Kodiak/Alaska Peninsula. A Geographic Information System was utilized to combine oiling and habitat data and to select the probability sample of study sites. Quasi-experiments were conducted in which randomly selected oiled sites were compared to matched reference sites. Two levels of statistical inferences, their philosophical bases, and their limitations are discussed and illustrated with example data from the resulting studies. 25 refs., 4 figs., 1 tab

  14. Knowledge based system for fouling assessment of power plant boiler

    International Nuclear Information System (INIS)

    Afgan, N.H.; He, X.; Carvalho, M.G.; Azevedo, J.L.T.

    1999-01-01

    The paper presents the design of an expert system for fouling assessment in power plant boilers. It is an on-line expert system based on selected criteria for fouling assessment. Using criteria for fouling assessment based on 'clean' and 'not-clean' radiation heat flux measurements, diagnostic variables are defined for the boiler heat transfer surface. The development of the prototype knowledge-based system for fouling assessment in power plant boilers comprises the integration of elements including the knowledge base, the inference procedure and the prototype configuration. Demonstration of the prototype knowledge-based system for fouling assessment was performed on the Sines power plant, a 300 MW coal-fired power plant; 12 measurement fields are used, with 3 on each side of the boiler

  15. [Development of "assessment guideline of family power for healthy life"].

    Science.gov (United States)

    Fukushima, M; Shimanouchi, S; Kamei, T; Takagai, E; Hoshino, Y; Sugiyama, I

    1997-12-01

    The purpose of this study is to develop an "assessment guideline of family power for healthy life" aimed at expanding the self-care power of families in community nursing practice. The subjects of this study were 156 families that we had taken up as subjects for nursing care and study. The assessment guideline was constructed inductively from each case and then modified by applying it to cases of families with health problems and others. As a result, we formed nine items of "family power for healthy life" and three items of "conditions influencing family power for healthy life" for the "assessment guideline of family power for healthy life".

  16. Statistical analysis of nuclear power plant pump failure rate variability: some preliminary results

    International Nuclear Information System (INIS)

    Martz, H.F.; Whiteman, D.E.

    1984-02-01

    In-Plant Reliability Data System (IPRDS) pump failure data on over 60 selected pumps in four nuclear power plants are statistically analyzed using the Failure Rate Analysis Code (FRAC). A major purpose of the analysis is to determine which environmental, system, and operating factors adequately explain the variability in the failure data. Catastrophic, degraded, and incipient failure severity categories are considered for both demand-related and time-dependent failures. For catastrophic demand-related pump failures, the variability is explained by the following factors listed in their order of importance: system application, pump driver, operating mode, reactor type, pump type, and unidentified plant-specific influences. Quantitative failure rate adjustments are provided for the effects of these factors. In the case of catastrophic time-dependent pump failures, the failure rate variability is explained by three factors: reactor type, pump driver, and unidentified plant-specific influences. Finally, point and confidence interval failure rate estimates are provided for each selected pump by considering the influential factors. Both types of estimates represent an improvement over the estimates computed exclusively from the data on each pump
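
    A log-linear (Poisson) regression of failure counts on plant factors, with operating time as exposure, is a standard way to express this kind of analysis; the sketch below is only analogous in spirit to FRAC (whose exact formulation may differ) and uses invented data:

      # Sketch of a log-linear (Poisson) regression of failure counts on plant
      # factors, analogous in spirit to the FRAC analysis above (FRAC's exact
      # formulation may differ). Data values are purely illustrative.
      import pandas as pd
      import statsmodels.api as sm

      df = pd.DataFrame({
          "failures": [3, 1, 7, 2, 5, 0],                         # hypothetical counts
          "hours":    [8760, 8760, 17520, 8760, 17520, 8760],     # operating hours
          "driver":   ["motor", "motor", "turbine", "motor", "turbine", "motor"],
          "reactor":  ["PWR", "BWR", "PWR", "BWR", "PWR", "BWR"],
      })

      X = pd.get_dummies(df[["driver", "reactor"]], drop_first=True).astype(float)
      X = sm.add_constant(X)
      model = sm.GLM(df["failures"], X, family=sm.families.Poisson(),
                     exposure=df["hours"])
      result = model.fit()
      print(result.summary())   # coefficients are log failure-rate multipliers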

  17. The Novel Quantitative Technique for Assessment of Gait Symmetry Using Advanced Statistical Learning Algorithm

    OpenAIRE

    Wu, Jianning; Wu, Bin

    2015-01-01

    The accurate identification of gait asymmetry is very beneficial to the assessment of at-risk gait in clinical applications. This paper investigated the application of a classification method based on a statistical learning algorithm to quantify gait symmetry, based on the assumption that the degree of intrinsic change in the dynamical system of gait is associated with different statistical distributions between gait variables from the left and right sides of the lower limbs; that is, the discrimination of...

  18. Statistical aspects of carbon fiber risk assessment modeling. [fire accidents involving aircraft

    Science.gov (United States)

    Gross, D.; Miller, D. R.; Soland, R. M.

    1980-01-01

    The probabilistic and statistical aspects of the carbon fiber risk assessment modeling of fire accidents involving commercial aircraft are examined. Three major sources of uncertainty in the modeling effort are identified. These are: (1) imprecise knowledge in establishing the model; (2) parameter estimation; and (3) Monte Carlo sampling error. All three sources of uncertainty are treated and statistical procedures are utilized and/or developed to control them wherever possible.

  19. Statistical Models to Assess the Health Effects and to Forecast Ground Level Ozone

    Czech Academy of Sciences Publication Activity Database

    Schlink, U.; Herbath, O.; Richter, M.; Dorling, S.; Nunnari, G.; Cawley, G.; Pelikán, Emil

    2006-01-01

    Vol. 21, No. 4 (2006), pp. 547-558 ISSN 1364-8152 R&D Projects: GA AV ČR 1ET400300414 Institutional research plan: CEZ:AV0Z10300504 Keywords: statistical models * ground level ozone * health effects * logistic model * forecasting * prediction performance * neural network * generalised additive model * integrated assessment Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.992, year: 2006

  20. Exposure time independent summary statistics for assessment of drug dependent cell line growth inhibition.

    Science.gov (United States)

    Falgreen, Steffen; Laursen, Maria Bach; Bødker, Julie Støve; Kjeldsen, Malene Krag; Schmitz, Alexander; Nyegaard, Mette; Johnsen, Hans Erik; Dybkær, Karen; Bøgsted, Martin

    2014-06-05

    In vitro generated dose-response curves of human cancer cell lines are widely used to develop new therapeutics. The curves are summarised by simplified statistics that ignore the conventionally used dose-response curves' dependency on drug exposure time and growth kinetics. This may lead to suboptimal exploitation of data and biased conclusions on the potential of the drug in question. Therefore we set out to improve the dose-response assessments by eliminating the impact of time dependency. First, a mathematical model for drug induced cell growth inhibition was formulated and used to derive novel dose-response curves and improved summary statistics that are independent of time under the proposed model. Next, a statistical analysis workflow for estimating the improved statistics was suggested consisting of 1) nonlinear regression models for estimation of cell counts and doubling times, 2) isotonic regression for modelling the suggested dose-response curves, and 3) resampling based method for assessing variation of the novel summary statistics. We document that conventionally used summary statistics for dose-response experiments depend on time so that fast growing cell lines compared to slowly growing ones are considered overly sensitive. The adequacy of the mathematical model is tested for doxorubicin and found to fit real data to an acceptable degree. Dose-response data from the NCI60 drug screen were used to illustrate the time dependency and demonstrate an adjustment correcting for it. The applicability of the workflow was illustrated by simulation and application on a doxorubicin growth inhibition screen. The simulations show that under the proposed mathematical model the suggested statistical workflow results in unbiased estimates of the time independent summary statistics. Variance estimates of the novel summary statistics are used to conclude that the doxorubicin screen covers a significant diverse range of responses ensuring it is useful for biological
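
    A minimal sketch of step 2 of the proposed workflow, assuming a monotone decreasing dose-response and using an illustrative normalized area-under-the-curve summary (the data and the summary choice are hypothetical, not the paper's exact statistics):

      # Minimal sketch of a monotone (isotonic) dose-response fit and a simple
      # area-under-the-curve summary, loosely following step 2 of the workflow
      # described above; data and summary are illustrative, not the paper's.
      import numpy as np
      from sklearn.isotonic import IsotonicRegression

      log_dose = np.linspace(-3, 2, 12)                      # hypothetical log10 doses
      viability = np.array([1.00, 0.98, 0.99, 0.95, 0.90, 0.80,
                            0.62, 0.45, 0.30, 0.22, 0.18, 0.17])  # observed fractions

      iso = IsotonicRegression(increasing=False, out_of_bounds="clip")
      fitted = iso.fit_transform(log_dose, viability)        # monotone decreasing curve

      # Manual trapezoid rule, normalized by the dose range
      auc = np.sum((fitted[:-1] + fitted[1:]) / 2 * np.diff(log_dose))
      auc /= (log_dose[-1] - log_dose[0])
      print(f"normalized area under the dose-response curve = {auc:.3f}")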

  1. Windfarm Generation Assessment for Reliability Analysis of Power Systems

    DEFF Research Database (Denmark)

    Negra, Nicola Barberis; Holmstrøm, Ole; Bak-Jensen, Birgitte

    2007-01-01

    Due to the fast development of wind generation in the past ten years, increasing interest has been paid to techniques for assessing different aspects of power systems with a large amount of installed wind generation. One of these aspects concerns power system reliability. Windfarm modelling plays...

  2. Windfarm generation assessment for reliability analysis of power systems

    DEFF Research Database (Denmark)

    Negra, N.B.; Holmstrøm, O.; Bak-Jensen, B.

    2007-01-01

    Due to the fast development of wind generation in the past ten years, increasing interest has been paid to techniques for assessing different aspects of power systems with a large amount of installed wind generation. One of these aspects concerns power system reliability. Windfarm modelling plays...

  3. Assessing the Army Power and Energy Efforts for the Warfighter

    Science.gov (United States)

    2011-03-01

    The report places the Army's energy challenges in three categories: greatest use, greatest difficulty, and greatest impact (details are in Appendix B). Technology topics noted include power and energy testing, silicon carbide, two new energy facilities, new types of solar photovoltaic systems, and smaller, lighter cogeneration systems. (Report: Assessing the Army Power and Energy Efforts for the Warfighter, by John W. Lyons, Richard Chait, and James J. Valdes.)

  4. Application of a Statistical Linear Time-Varying System Model of High Grazing Angle Sea Clutter for Computing Interference Power

    Science.gov (United States)

    2017-12-08

    Fragment of the report's introduction: a statistical linear time-varying system model of high grazing angle sea clutter is developed for computing interference power. One of the sinc factors is approximated with the Dirichlet kernel to facilitate computation of the clutter integral, the resultant autocorrelation is then obtained by substituting intermediate expressions, and the Python code used to generate the report's figures is included.

  5. The assessment of tornado missile hazard to nuclear power plants

    International Nuclear Information System (INIS)

    Goodman, J.; Koch, J.E.

    1983-01-01

    Numerical methods and computer codes for assessing tornado missile hazards to nuclear power plants are developed. Due to the uncertainty and randomness of tornado and tornado-generated missiles' characteristics, the damage probability of targets has a highly spread distribution. The proposed method is useful for assessing the risk of not providing protection to some nonsafety-related targets whose failure can create a hazard to the safe operation of nuclear power plants

  6. Method for assessing wind power integration in a hydro based power system

    International Nuclear Information System (INIS)

    Norheim, I.; Palsson, M.; Tande, J.O.G.; Uhlen, K.

    2006-01-01

    The present paper demonstrates a method for assessing how much wind power can be integrated in a system with limited transmission capacity. Based on hydro inflow data and wind measurements (for different locations of planned wind farms in an area), it is possible to assess how much wind power can be fed into a certain point in the transmission network without violating the transmission capacity limits. The proposed method combines the use of market modelling and detailed network analysis in order to assess the probability of network congestion rather than focusing on extreme cases. By computing the probability distribution of power flow on critical corridors in the grid, it is possible to assess the likelihood of network congestion and the amount of energy that must be curtailed to fulfil power system security requirements (n-1). In this way the assessment is not based only on worst-case scenarios that assume maximal flow from hydro plants and maximal wind power production. As extreme case scenarios are short term and may be solved by market mechanisms or automatic system protection schemes (disconnection of wind power or hydro power), the proposed method may reveal that it would be economic to install more wind power than if the assessment were based only on analysis of worst-case scenarios. (orig.)
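
    The probabilistic idea can be caricatured in a few lines: sample hourly hydro and wind production, compute the flow on a constrained corridor, and estimate the probability of congestion and the energy that would have to be curtailed. All numbers below are hypothetical, and the simple sum of productions stands in for the market model and network analysis used in the actual method:

      # Rough Monte Carlo sketch of congestion probability and curtailed energy on
      # a constrained corridor; all parameters are hypothetical.
      import numpy as np

      rng = np.random.default_rng(1)
      hours = 8760
      corridor_limit = 700.0                                  # MW, hypothetical

      hydro = rng.uniform(200.0, 600.0, hours)                # MW, inflow-driven
      wind_capacity = 400.0                                   # MW installed, hypothetical
      wind = wind_capacity * rng.beta(1.5, 3.5, hours)        # MW, skewed production

      flow = hydro + wind                                     # simplistic export flow
      overload = np.maximum(flow - corridor_limit, 0.0)

      p_congestion = np.mean(overload > 0)
      curtailed_energy = overload.sum()                       # MWh over the year
      print(f"P(congestion) = {p_congestion:.3f}, "
            f"curtailed energy = {curtailed_energy:.0f} MWh")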

  7. Technology assessment Jordan Nuclear Power Plant Project

    International Nuclear Information System (INIS)

    2010-01-01

    A preliminary regional analysis was carried out to identify potential sites for a NPP, followed by screening of these sites and selection of candidate sites. Sites near Aqaba are proposed, where sea water can be used for cooling: (i) Site 1, at the sea, where sea water can be used for direct cooling; (ii) Site 2, 10 km to the east of the Gulf of Aqaba shoreline at the border with Saudi Arabia; (iii) Site 3, 4 km to the east of the Gulf of Aqaba shoreline. Only the granitic basement in the east of the 6 km² site should be considered as a potential site for a NPP. A preliminary probabilistic seismic hazard assessment gives an Operating-Basis Earthquake (OBE, 475-year return period) in the range of 0.163-0.182 g and a Safe Shutdown Earthquake (SSE, 10,000-year return period) in the range of 0.333-0.502 g. The process also includes setting up a nuclear company and other organizational matters. Regulations in development cover: site approval; construction permitting; overall licensing; safety (design, construction, training, operations, QA); emergency planning; decommissioning; and spent fuel and RW management. JAEC's technology assessment strategy and evaluation methodology are presented

  8. Assessment of a satellite power system and six alternative technologies

    Energy Technology Data Exchange (ETDEWEB)

    Wolsko, T.; Whitfield, R.; Samsa, M.; Habegger, L.S.; Levine, E.; Tanzman, E.

    1981-04-01

    The satellite power system is assessed in comparison to six alternative technologies. The alternatives are: central-station terrestrial photovoltaic systems, conventional coal-fired power plants, coal-gasification/combined-cycle power plants, light water reactor power plants, liquid-metal fast-breeder reactors, and fusion. The comparison is made regarding issues of cost and performance, health and safety, environmental effects, resources, socio-economic factors, and institutional issues. The criteria for selecting the issues and the alternative technologies are given, and the methodology of the comparison is discussed. Brief descriptions of each of the technologies considered are included. (LEW)

  9. Risk Assessment of Power System considering the CPS of Transformers

    Science.gov (United States)

    Zhou, Long; Peng, Zewu; Liu, Xindong; Li, Canbing; Chen, Can

    2018-02-01

    This paper constructs a risk assessment framework for power systems that accounts for device-level information security. It analyzes the typical protection configuration of power transformers and, taking transformer gas protection and differential protection as examples, puts forward a method for analyzing cyber security in the electric power system that targets transformer protection parameters. The risk of the power system is estimated, accounting for the cyber security of transformers, using the Monte Carlo method and two indexes: the loss of load probability and the expected demand not supplied. The proposed approach is tested with the IEEE 9-bus system and the IEEE 118-bus system.
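
    A minimal Monte Carlo sketch of the two indexes named above, loss of load probability (LOLP) and expected demand not supplied (EDNS), with a crude additional outage probability standing in for a cyber-compromised transformer protection; the unit data are illustrative and not those of the IEEE test systems:

      # Minimal Monte Carlo sketch of LOLP and EDNS with an extra outage
      # probability representing protection mis-operation due to a cyber
      # compromise; all parameters are illustrative.
      import numpy as np

      rng = np.random.default_rng(0)
      n_samples = 100_000

      unit_capacity = np.array([200.0, 200.0, 150.0, 100.0, 100.0])   # MW
      base_outage_p = np.array([0.02, 0.02, 0.03, 0.04, 0.04])        # forced outage rates
      cyber_outage_p = 0.01     # extra trip probability via protection mis-operation
      load = 600.0              # MW, constant demand for simplicity

      outage_p = 1.0 - (1.0 - base_outage_p) * (1.0 - cyber_outage_p)
      available = rng.random((n_samples, unit_capacity.size)) > outage_p
      capacity = available.astype(float) @ unit_capacity

      shortfall = np.maximum(load - capacity, 0.0)
      lolp = np.mean(shortfall > 0)
      edns = shortfall.mean()
      print(f"LOLP = {lolp:.4f}, EDNS = {edns:.2f} MW")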

  10. Hybrid algorithm for rotor angle security assessment in power systems

    Directory of Open Access Journals (Sweden)

    D. Prasad Wadduwage

    2015-08-01

    Transient rotor angle stability assessment and oscillatory rotor angle stability assessment subsequent to a contingency are integral components of dynamic security assessment (DSA) in power systems. This study proposes a hybrid algorithm to determine whether the post-fault power system is secure with respect to both transient rotor angle stability and oscillatory rotor angle stability subsequent to a set of known contingencies. The hybrid algorithm first uses a new security measure developed based on the concept of Lyapunov exponents (LEs) to determine the transient security of the post-fault power system. The transiently secure power swing curves are then analysed using an improved Prony algorithm, which extracts the dominant oscillatory modes and estimates their damping ratios. The damping ratio serves as a measure of the oscillatory security of the post-fault power system subsequent to the contingency. The suitability of the proposed hybrid algorithm for DSA in power systems is illustrated using different contingencies of a 16-generator 68-bus test system and a 50-generator 470-bus test system. The accuracy of the stability conclusions and the acceptable computational burden indicate that the proposed hybrid algorithm is suitable for real-time security assessment with respect to both transient rotor angle stability and oscillatory rotor angle stability under multiple contingencies of the power system.

  11. Assessment of statistical methods used in library-based approaches to microbial source tracking.

    Science.gov (United States)

    Ritter, Kerry J; Carruthers, Ethan; Carson, C Andrew; Ellender, R D; Harwood, Valerie J; Kingsley, Kyle; Nakatsu, Cindy; Sadowsky, Michael; Shear, Brian; West, Brian; Whitlock, John E; Wiggins, Bruce A; Wilbur, Jayson D

    2003-12-01

    Several commonly used statistical methods for fingerprint identification in microbial source tracking (MST) were examined to assess the effectiveness of pattern-matching algorithms to correctly identify sources. Although numerous statistical methods have been employed for source identification, no widespread consensus exists as to which is most appropriate. A large-scale comparison of several MST methods, using identical fecal sources, presented a unique opportunity to assess the utility of several popular statistical methods. These included discriminant analysis, nearest neighbour analysis, maximum similarity and average similarity, along with several measures of distance or similarity. Threshold criteria for excluding uncertain or poorly matched isolates from final analysis were also examined for their ability to reduce false positives and increase prediction success. Six independent libraries used in the study were constructed from indicator bacteria isolated from fecal materials of humans, seagulls, cows and dogs. Three of these libraries were constructed using the rep-PCR technique and three relied on antibiotic resistance analysis (ARA). Five of the libraries were constructed using Escherichia coli and one using Enterococcus spp. (ARA). Overall, the outcome of this study suggests a high degree of variability across statistical methods. Despite large differences in correct classification rates among the statistical methods, no single statistical approach emerged as superior. Thresholds failed to consistently increase rates of correct classification and improvement was often associated with substantial effective sample size reduction. Recommendations are provided to aid in selecting appropriate analyses for these types of data.

  12. Dynamic vulnerability assessment and intelligent control for sustainable power systems

    CERN Document Server

    Gonzalez-Longatt, Francisco

    2018-01-01

    Identifying, assessing, and mitigating electric power grid vulnerabilities is a growing focus in short-term operational planning of power systems. Through illustrated application, this important guide surveys state-of-the-art methodologies for the assessment and enhancement of power system security in short-term operational planning and real-time operation. The methodologies employ advanced methods from probabilistic theory, data mining, artificial intelligence, and optimization, to provide knowledge-based support for monitoring, control (preventive and corrective), and decision making tasks. Key features: Introduces behavioural recognition in wide-area monitoring and security constrained optimal power flow for intelligent control and protection and optimal grid management. Provides in-depth understanding of risk-based reliability and security assessment, dynamic vulnerability assessment methods, supported by the underpinning mathematics. Develops expertise in mitigation techniques using intelligent protect...

  13. Social assessment and location of nuclear and thermal power plants

    International Nuclear Information System (INIS)

    Nemoto, Kazuyasu; Nishio, Mitsuo.

    1979-01-01

    Most of the locations of nuclear and thermal power plants in Japan are depopulated villages of remote rural character, but the policy for the development of such districts is not yet clearly established, and appropriate measures have not been taken. The living regions of residents and the production regions of enterprises are becoming more and more estranged. Social assessment is a scientific method for perceiving the future changes due to the installation of power stations. The features particular to the assessment of the natural environment and the social environment related to the location of power stations are considered, the technical problems involved in the method of assessing the natural environment are addressed, and an actual method of assessing the social environment is developed. Then, the possibility of establishing this method and the problems in its application are investigated. The plan for developing the surroundings of power generation facilities is reviewed critically, and the coordination of the location plans of power companies with the regional projects of municipalities is discussed. Finally, the mechanism of consensus formation concerning the location of power stations is considered, divided into regional consensus formation and administrative consensus formation, and the possibility of instituting social assessment is examined. (Kako, I.)

  14. Assessing Statistical Competencies in Clinical and Translational Science Education: One Size Does Not Fit All

    Science.gov (United States)

    Lindsell, Christopher J.; Welty, Leah J.; Mazumdar, Madhu; Thurston, Sally W.; Rahbar, Mohammad H.; Carter, Rickey E.; Pollock, Bradley H.; Cucchiara, Andrew J.; Kopras, Elizabeth J.; Jovanovic, Borko D.; Enders, Felicity T.

    2014-01-01

    Introduction: Statistics is an essential training component for a career in clinical and translational science (CTS). Given the increasing complexity of statistics, learners may have difficulty selecting appropriate courses. Our question was: what depth of statistical knowledge do different CTS learners require? Methods: For three types of CTS learners (principal investigator, co-investigator, informed reader of the literature), each with different backgrounds in research (no previous research experience, reader of the research literature, previous research experience), 18 experts in biostatistics, epidemiology, and research design proposed levels for 21 statistical competencies. Results: Statistical competencies were categorized as fundamental, intermediate, or specialized. CTS learners who intend to become independent principal investigators require more specialized training, while those intending to become informed consumers of the medical literature require more fundamental education. For most competencies, less training was proposed for those with more research background. Discussion: When selecting statistical coursework, the learner's research background and career goal should guide the decision. Some statistical competencies are considered to be more important than others. Baseline knowledge assessments may help learners identify appropriate coursework. Conclusion: Rather than one size fits all, tailoring education to baseline knowledge, learner background, and future goals increases learning potential while minimizing classroom time. PMID:25212569

  15. Satellite Power System (SPS) societal assessment

    Energy Technology Data Exchange (ETDEWEB)

    1980-12-01

    Construction and operation of a 60-unit (300 GW) domestic SPS over the period 2000 to 2030 would stress many segments of US society. A significant commitment of resources (land, energy, materials) would be required, and a substantial proportion of them would have to be committed prior to the production of any SPS electricity. Estimated resource demands, however, seem to be within US capabilities. Modifications will be required of institutions called upon to deal with SPS. These include financial, managerial and regulatory entities and, most particularly, the utility industry. Again, the required changes, while certainly profound, seem to be well within the realm of possibility. Enhanced cooperation in international affairs will be necessary to accommodate development and operation of the SPS. To remove its potential as a military threat and to reduce its vulnerability, either the SPS itself must become an international enterprise, or it must be subject to unrestricted international inspection. How either of these objectives could, in fact, be achieved, or which is preferable, remains unclear. Forty-four concerns about the SPS were identified via a public outreach experiment involving 9000 individuals from three special interest organizations. The concerns focused on environmental impacts (particularly the effects of microwave radiation) and the centralizing tendency of the SPS on society. The interim results of the public outreach experiment influenced the scope and direction of the CDEP; the final results will be instrumental in defining further societal assessment efforts.

  16. Evaluation and assessment of nuclear power plant seismic methodology

    Energy Technology Data Exchange (ETDEWEB)

    Bernreuter, D.; Tokarz, F.; Wight, L.; Smith, P.; Wells, J.; Barlow, R.

    1977-03-01

    The major emphasis of this study is to develop a methodology that can be used to assess the current methods used for assuring the seismic safety of nuclear power plants. The proposed methodology makes use of system-analysis techniques and Monte Carlo schemes. Also, in this study, we evaluate previous assessments of the current seismic-design methodology.

  17. Evaluation and assessment of nuclear power plant seismic methodology

    International Nuclear Information System (INIS)

    Bernreuter, D.; Tokarz, F.; Wight, L.; Smith, P.; Wells, J.; Barlow, R.

    1977-01-01

    The major emphasis of this study is to develop a methodology that can be used to assess the current methods used for assuring the seismic safety of nuclear power plants. The proposed methodology makes use of system-analysis techniques and Monte Carlo schemes. Also, in this study, we evaluate previous assessments of the current seismic-design methodology

  18. Probabilistic risk assessment in the nuclear power industry

    International Nuclear Information System (INIS)

    Fullwood, R.R.; Hall, R.E.

    1988-01-01

    This book describes the more important improvements in risk assessment methodology developed over the last decade. The book covers the following areas: a general view of risk pertaining to nuclear power, the mathematics necessary to understand the text, a concise overview of light water reactors and their features for protecting the public, the probabilities and consequences calculated to form a risk assessment of a plant, and 34 applications of probabilistic risk assessment (PRA) in the power generation industry. There is a glossary of acronyms and unusual words and a list of references. (author)

  19. Business Statistics and Management Science Online: Teaching Strategies and Assessment of Student Learning

    Science.gov (United States)

    Sebastianelli, Rose; Tamimi, Nabil

    2011-01-01

    Given the expected rise in the number of online business degrees, issues regarding quality and assessment in online courses will become increasingly important. The authors focus on the suitability of online delivery for quantitative business courses, specifically business statistics and management science. They use multiple approaches to assess…

  20. ENVIRONMENTAL MONITORING AND ASSESSMENT PROGRAM (EMAP): WESTERN STREAMS AND RIVERS STATISTICAL SUMMARY

    Science.gov (United States)

    This statistical summary reports data from the Environmental Monitoring and Assessment Program (EMAP) Western Pilot (EMAP-W). EMAP-W was a sample survey (or probability survey, often simply called 'random') of streams and rivers in 12 states of the western U.S. (Arizona, Californ...

  1. QQ-plots for assessing distributions of biomarker measurements and generating defensible summary statistics

    Science.gov (United States)

    One of the main uses of biomarker measurements is to compare different populations to each other and to assess risk in comparison to established parameters. This is most often done using summary statistics such as central tendency, variance components, confidence intervals, excee...

  2. Sample Size Requirements for Assessing Statistical Moments of Simulated Crop Yield Distributions

    NARCIS (Netherlands)

    Lehmann, N.; Finger, R.; Klein, T.; Calanca, P.

    2013-01-01

    Mechanistic crop growth models are becoming increasingly important in agricultural research and are extensively used in climate change impact assessments. In such studies, statistics of crop yields are usually evaluated without the explicit consideration of sample size requirements. The purpose of

  3. Assessing Statistical Change Indices in Selected Social Work Intervention Research Studies

    Science.gov (United States)

    Ham, Amanda D.; Huggins-Hoyt, Kimberly Y.; Pettus, Joelle

    2016-01-01

    Objectives: This study examined how evaluation and intervention research (IR) studies assessed statistical change to ascertain effectiveness. Methods: Studies from six core social work journals (2009-2013) were reviewed (N = 1,380). Fifty-two evaluation (n = 27) and intervention (n = 25) studies met the inclusion criteria. These studies were…

  4. Using Critical Thinking Drills to Teach and Assess Proficiency in Methodological and Statistical Thinking

    Science.gov (United States)

    Cascio, Ted V.

    2017-01-01

    This study assesses the effectiveness of critical thinking drills (CTDs), a repetitious classroom activity designed to improve methodological and statistical thinking in relation to psychological claims embedded in popular press articles. In each of four separate CTDs, students critically analyzed a brief article reporting a recent psychological…

  5. SIESE - trimestrial bulletin - Synthesis 1994. Electric power summary statistics for Brazil

    International Nuclear Information System (INIS)

    1994-01-01

    The performance of the power system of all the Brazilian electrical utilities is presented. The data is given for each region in the country and includes, among other things, the electric power consumption and generation; the number of consumers and the electric power rates. 10 figs., 42 tabs

  6. Statistical model based iterative reconstruction (MBIR) in clinical CT systems: Experimental assessment of noise performance

    Energy Technology Data Exchange (ETDEWEB)

    Li, Ke; Tang, Jie [Department of Medical Physics, University of Wisconsin-Madison, 1111 Highland Avenue, Madison, Wisconsin 53705 (United States); Chen, Guang-Hong, E-mail: gchen7@wisc.edu [Department of Medical Physics, University of Wisconsin-Madison, 1111 Highland Avenue, Madison, Wisconsin 53705 and Department of Radiology, University of Wisconsin-Madison, 600 Highland Avenue, Madison, Wisconsin 53792 (United States)

    2014-04-15

    Purpose: To reduce radiation dose in CT imaging, the statistical model based iterative reconstruction (MBIR) method has been introduced for clinical use. Based on the principle of MBIR and its nonlinear nature, the noise performance of MBIR is expected to be different from that of the well-understood filtered backprojection (FBP) reconstruction method. The purpose of this work is to experimentally assess the unique noise characteristics of MBIR using a state-of-the-art clinical CT system. Methods: Three physical phantoms, including a water cylinder and two pediatric head phantoms, were scanned in axial scanning mode using a 64-slice CT scanner (Discovery CT750 HD, GE Healthcare, Waukesha, WI) at seven different mAs levels (5, 12.5, 25, 50, 100, 200, 300). At each mAs level, each phantom was repeatedly scanned 50 times to generate an image ensemble for noise analysis. Both the FBP method with a standard kernel and the MBIR method (Veo®, GE Healthcare, Waukesha, WI) were used for CT image reconstruction. Three-dimensional (3D) noise power spectrum (NPS), two-dimensional (2D) NPS, and zero-dimensional NPS (noise variance) were assessed both globally and locally. Noise magnitude, noise spatial correlation, noise spatial uniformity and their dose dependence were examined for the two reconstruction methods. Results: (1) At each dose level and at each frequency, the magnitude of the NPS of MBIR was smaller than that of FBP. (2) While the shape of the NPS of FBP was dose-independent, the shape of the NPS of MBIR was strongly dose-dependent; lower dose led to a "redder" NPS with a lower mean frequency value. (3) The noise standard deviation (σ) of MBIR and dose were found to be related through a power law of σ ∝ (dose)^(−β) with the exponent β ≈ 0.25, which violated the classical σ ∝ (dose)^(−0.5) power law of FBP. (4) With MBIR, noise reduction was most prominent for thin image slices. (5) MBIR led to better noise spatial
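
    Estimating the exponent β in σ ∝ (dose)^(−β) from repeated scans at several dose levels reduces to a straight-line fit in log-log coordinates; the sketch below uses simulated noise values, not the measured ones:

      # Sketch of estimating beta in sigma ∝ dose^(-beta) from noise measurements
      # at several dose (mAs) levels; the noise values here are simulated.
      import numpy as np

      mAs = np.array([5, 12.5, 25, 50, 100, 200, 300], dtype=float)

      # Hypothetical noise standard deviations following dose^-0.25 plus small scatter
      rng = np.random.default_rng(3)
      sigma = 40.0 * mAs ** -0.25 * (1.0 + 0.02 * rng.standard_normal(mAs.size))

      slope, intercept = np.polyfit(np.log(mAs), np.log(sigma), 1)
      beta = -slope
      print(f"estimated beta = {beta:.3f} (FBP theory predicts 0.5, MBIR here ~0.25)")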

  7. Risk assessment of power systems models, methods, and applications

    CERN Document Server

    Li, Wenyuan

    2014-01-01

    Risk Assessment of Power Systems addresses the regulations and functions of risk assessment with regard to its relevance in system planning, maintenance, and asset management. Brimming with practical examples, this edition introduces the latest risk information on renewable resources, the smart grid, voltage stability assessment, and fuzzy risk evaluation. It is a comprehensive reference of a highly pertinent topic for engineers, managers, and upper-level students who seek examples of risk theory applications in the workplace.

  8. Probabilistic safety assessment in nuclear power plant management

    International Nuclear Information System (INIS)

    Holloway, N.J.

    1989-06-01

    Probabilistic Safety Assessment (PSA) techniques have been widely used over the past few years to assist in understanding how engineered systems respond to abnormal conditions, particularly during a severe accident. The use of PSAs in the design and operation of such systems thus contributes to the safety of nuclear power plants. Probabilistic safety assessments can be maintained to provide a continuous up-to-date assessment (Living PSA), supporting the management of plant operations and modifications

  9. Statistical study of undulator radiated power by a classical detection system in the mm-wave regime

    Directory of Open Access Journals (Sweden)

    A. Eliran

    2009-05-01

    The statistics of FEL spontaneous emission power detected with a detector integration time much larger than the slippage time have been measured in many previous works at high frequencies. In such cases the quantum (shot) noise generated in the detection process is dominant. We have measured spontaneous emission in the Israeli electrostatic accelerator FEL (EA-FEL) operating in the mm-wave regime. In this regime the detector is based on a diode rectifier, for which the detector quantum noise is negligible. The measurements were repeated numerous times in order to create a sample space with sufficient data, enabling evaluation of the statistical features of the radiated power. The probability density function of the radiated power was found and its moments were calculated. The results of analytical and numerical models are compared to those obtained in experimental measurements.

  10. Kappa statistic to measure agreement beyond chance in free-response assessments.

    Science.gov (United States)

    Carpentier, Marc; Combescure, Christophe; Merlini, Laura; Perneger, Thomas V

    2017-04-19

    The usual kappa statistic requires that all observations be enumerated. However, in free-response assessments, only positive (or abnormal) findings are notified, but negative (or normal) findings are not. This situation occurs frequently in imaging or other diagnostic studies. We propose here a kappa statistic that is suitable for free-response assessments. We derived the equivalent of Cohen's kappa statistic for two raters under the assumption that the number of possible findings for any given patient is very large, as well as a formula for sampling variance that is applicable to independent observations (for clustered observations, a bootstrap procedure is proposed). The proposed statistic was applied to a real-life dataset, and compared with the common practice of collapsing observations within a finite number of regions of interest. The free-response kappa is computed from the total numbers of discordant (b and c) and concordant positive (d) observations made in all patients, as 2d/(b + c + 2d). In 84 full-body magnetic resonance imaging procedures in children that were evaluated by 2 independent raters, the free-response kappa statistic was 0.820. Aggregation of results within regions of interest resulted in overestimation of agreement beyond chance. The free-response kappa provides an estimate of agreement beyond chance in situations where only positive findings are reported by raters.
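
    The free-response kappa given above, 2d/(b + c + 2d), can be transcribed directly; the counts in the example are hypothetical, not the study's data:

      # Direct transcription of the free-response kappa given above,
      # kappa_FR = 2d / (b + c + 2d), where b and c are findings reported by only
      # one of the two raters and d is the number of concordant positive findings.
      def free_response_kappa(b, c, d):
          return 2.0 * d / (b + c + 2.0 * d)

      # Hypothetical counts pooled over all patients (not the study's data):
      b, c, d = 14, 10, 110
      print(f"free-response kappa = {free_response_kappa(b, c, d):.3f}")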

  11. Statistical analysis of the variation of floor vibrations in nuclear power plants subject to seismic loads

    Energy Technology Data Exchange (ETDEWEB)

    Jussila, Vilho [VTT Technical Research Centre of Finland Ltd, Kemistintie 3, 02230 Espoo (Finland); Li, Yue [Dept. of Civil Engineering, Case Western Reserve University, Cleveland, OH 44106 (United States); Fülöp, Ludovic, E-mail: ludovic.fulop@vtt.fi [VTT Technical Research Centre of Finland Ltd, Kemistintie 3, 02230 Espoo (Finland)

    2016-12-01

    Highlights: • Floor flexibility plays a non-negligible role in amplifying horizontal vibrations. • COV of in-floor horizontal and vertical acceleration are 0.15–0.25 and 0.25–0.55. • In-floor variation of vibrations is higher in lower floors. • Floor spectra from limited nodes underestimates vibrations by a factor of 1.5–1.75. - Abstract: Floor vibration of a reactor building subjected to seismic loads was investigated, with the aim of quantifying the variability of vibrations on each floor. A detailed 3D building model founded on the bedrock was excited simultaneously in three directions by artificial accelerograms compatible with Finnish ground response spectra. Dynamic simulation for 21 s was carried out using explicit time integration. The extracted results of the simulation were acceleration in several floor locations, transformed to pseudo-acceleration (PSA) spectra in the next stage. At first, the monitored locations on the floors were estimated by engineering judgement in order to arrive at a feasible number of floor nodes for post processing of the data. It became apparent that engineering judgment was insufficient to depict the key locations with high floor vibrations, which resulted in un-conservative vibration estimates. For this reason, a more systematic approach was later considered, in which nodes of the floors were selected with a more refined grid of 2 m. With this method, in addition to the highest PSA peaks in all directions, the full vibration distribution in each floor can be determined. A statistical evaluation of the floor responses was also carried out in order to define floor accelerations and PSAs with high confidence of non-exceedance. The conclusion was that in-floor variability can be as high as 50–60% and models with sufficiently dense node grids should be used in order to achieve a realistic estimate of floor vibration under seismic action. The effects of the shape of the input spectra, damping, and flexibility of the
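
    As a generic illustration of turning a single floor acceleration record into a pseudo-acceleration (PSA) spectrum (a standard damped-SDOF sweep, not the study's 3D building model), the following sketch integrates each oscillator with the Newmark average-acceleration method and uses a synthetic input record:

      # Generic PSA spectrum sketch: Newmark average-acceleration integration of
      # damped SDOF oscillators driven by a (synthetic) acceleration record.
      import numpy as np

      def psa_spectrum(accel, dt, freqs, damping=0.05):
          psa = np.zeros(len(freqs))
          for i, f in enumerate(freqs):
              w = 2.0 * np.pi * f
              m, c, k = 1.0, 2.0 * damping * w, w * w
              beta, gamma = 0.25, 0.5                  # average acceleration method
              kh = k + gamma * c / (beta * dt) + m / (beta * dt * dt)
              u = v = 0.0
              a = -accel[0]                            # p0 = -m*ag0 with u0 = v0 = 0
              u_max = 0.0
              for j in range(len(accel) - 1):
                  dp = -(accel[j + 1] - accel[j])
                  dph = dp + (m / (beta * dt) + gamma * c / beta) * v \
                           + (m / (2.0 * beta) + dt * c * (gamma / (2.0 * beta) - 1.0)) * a
                  du = dph / kh
                  dv = gamma * du / (beta * dt) - gamma * v / beta \
                       + dt * (1.0 - gamma / (2.0 * beta)) * a
                  da = du / (beta * dt * dt) - v / (beta * dt) - a / (2.0 * beta)
                  u, v, a = u + du, v + dv, a + da
                  u_max = max(u_max, abs(u))
              psa[i] = w * w * u_max                   # pseudo-acceleration
          return psa

      dt = 0.01
      t = np.arange(0.0, 21.0, dt)
      rng = np.random.default_rng(7)
      ag = 0.1 * 9.81 * rng.standard_normal(t.size) * np.exp(-0.1 * t)  # synthetic record
      freqs = np.linspace(0.5, 30.0, 60)
      spectrum = psa_spectrum(ag, dt, freqs)
      print(f"peak PSA = {spectrum.max():.2f} m/s^2 "
            f"at {freqs[spectrum.argmax()]:.1f} Hz")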

  12. Statistical analysis of the variation of floor vibrations in nuclear power plants subject to seismic loads

    International Nuclear Information System (INIS)

    Jussila, Vilho; Li, Yue; Fülöp, Ludovic

    2016-01-01

    Highlights: • Floor flexibility plays a non-negligible role in amplifying horizontal vibrations. • COV of in-floor horizontal and vertical acceleration are 0.15–0.25 and 0.25–0.55. • In-floor variation of vibrations is higher in lower floors. • Floor spectra from limited nodes underestimates vibrations by a factor of 1.5–1.75. - Abstract: Floor vibration of a reactor building subjected to seismic loads was investigated, with the aim of quantifying the variability of vibrations on each floor. A detailed 3D building model founded on the bedrock was excited simultaneously in three directions by artificial accelerograms compatible with Finnish ground response spectra. Dynamic simulation for 21 s was carried out using explicit time integration. The extracted results of the simulation were acceleration in several floor locations, transformed to pseudo-acceleration (PSA) spectra in the next stage. At first, the monitored locations on the floors were estimated by engineering judgement in order to arrive at a feasible number of floor nodes for post processing of the data. It became apparent that engineering judgment was insufficient to depict the key locations with high floor vibrations, which resulted in un-conservative vibration estimates. For this reason, a more systematic approach was later considered, in which nodes of the floors were selected with a more refined grid of 2 m. With this method, in addition to the highest PSA peaks in all directions, the full vibration distribution in each floor can be determined. A statistical evaluation of the floor responses was also carried out in order to define floor accelerations and PSAs with high confidence of non-exceedance. The conclusion was that in-floor variability can be as high as 50–60% and models with sufficiently dense node grids should be used in order to achieve a realistic estimate of floor vibration under seismic action. The effects of the shape of the input spectra, damping, and flexibility of the

  13. Automatic Assessment of Pathological Voice Quality Using Higher-Order Statistics in the LPC Residual Domain

    Directory of Open Access Journals (Sweden)

    JiYeoun Lee

    2009-01-01

    A preprocessing scheme based on the linear prediction coefficient (LPC) residual is applied to higher-order statistics (HOSs) for automatic assessment of overall pathological voice quality. The normalized skewness and kurtosis are estimated from the LPC residual and show statistically meaningful distributions for characterizing pathological voice quality. 83 voice samples of sustained vowel /a/ phonation are used in this study and are independently assessed by a speech and language therapist (SALT) according to the grade of severity of dysphonia on the GRBAS scale. These are used to train and test a classification and regression tree (CART). The best result is obtained using an optimal decision tree implemented by a combination of the normalized skewness and kurtosis, with an accuracy of 92.9%. It is concluded that the method can be used as an assessment tool, providing a valuable aid to the SALT during clinical evaluation of overall pathological voice quality.
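
    A minimal sketch of the preprocessing idea, computing an LPC residual by the autocorrelation method and summarizing it with normalized skewness and kurtosis; the signal is synthetic, and frame-based processing and the CART step are omitted:

      # Sketch: LPC residual via the autocorrelation (Yule-Walker) method,
      # summarized by normalized skewness and kurtosis. Synthetic signal only.
      import numpy as np
      from scipy.linalg import solve_toeplitz
      from scipy.signal import lfilter
      from scipy.stats import skew, kurtosis

      def lpc_residual(x, order=12):
          x = x - np.mean(x)
          r = np.correlate(x, x, mode="full")[len(x) - 1:]   # autocorrelation r[0..]
          a = solve_toeplitz(r[:order], r[1:order + 1])      # predictor coefficients
          pred_error_filter = np.concatenate(([1.0], -a))    # A(z) = 1 - sum a_k z^-k
          return lfilter(pred_error_filter, [1.0], x)

      fs = 16000
      t = np.arange(0, 1.0, 1.0 / fs)
      rng = np.random.default_rng(5)
      voiced = np.sin(2 * np.pi * 120 * t) + 0.3 * np.sin(2 * np.pi * 240 * t)
      signal = voiced + 0.05 * rng.standard_normal(t.size)   # crude /a/-like stand-in

      res = lpc_residual(signal, order=12)
      print(f"residual skewness = {skew(res):.3f}, "
            f"residual excess kurtosis = {kurtosis(res):.3f}")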

  14. Statistical approaches to assessing single and multiple outcome measures in dry eye therapy and diagnosis.

    Science.gov (United States)

    Tomlinson, Alan; Hair, Mario; McFadyen, Angus

    2013-10-01

    Dry eye is a multifactorial disease which would require a broad spectrum of test measures in the monitoring of its treatment and diagnosis. However, studies have typically reported improvements in individual measures with treatment. Alternative approaches involve multiple, combined outcomes being assessed by different statistical analyses. In order to assess the effect of various statistical approaches to the use of single and combined test measures in dry eye, this review reanalyzed measures from two previous studies (osmolarity, evaporation, tear turnover rate, and lipid film quality). These analyses assessed the measures as single variables within groups, pre- and post-intervention with a lubricant supplement, by creating combinations of these variables and by validating these combinations with the combined sample of data from all groups of dry eye subjects. The effectiveness of single measures and combinations in diagnosis of dry eye was also considered. Copyright © 2013. Published by Elsevier Inc.

  15. Transient Stability Assessment of Power System with Large Amount of Wind Power Penetration

    DEFF Research Database (Denmark)

    Liu, Leo; Chen, Zhe; Bak, Claus Leth

    2012-01-01

    Recently, the security and stability of power systems with a large amount of wind power have become issues of concern, especially transient stability. In Denmark, the onshore and offshore wind farms are connected to the distribution system and the transmission system, respectively. The control and protection...... methodologies of onshore and offshore wind farms definitely affect the transient stability of the power system. In this paper, the onshore and offshore wind farms are modeled in detail in order to assess the transient stability of the western Danish power system. Further, the computation of critical clearing time (CCT...... plants, load consumption level and high voltage direct current (HVDC) transmission links are taken into account. The results presented in this paper are able to provide an early awareness of the security condition of the western Danish power system....

  16. Probabilistic safety assessments of nuclear power plants for low power and shutdown modes

    International Nuclear Information System (INIS)

    2000-03-01

    Within the past several years the results of nuclear power plant operating experience and performance of probabilistic safety assessments (PSAs) for low power and shutdown operating modes have revealed that the risk from operating modes other than full power may contribute significantly to the overall risk from plant operations. These early results have led to an increased focus on safety during low power and shutdown operating modes and to an increased interest of many plant operators in performing shutdown and low power PSAs. This publication was developed to provide guidance and insights on the performance of PSA for shutdown and low power operating modes. The preparation of this publication was initiated in 1994. Two technical consultants meetings were conducted in 1994 and one in February 1999 in support of the development of this report

  17. Design of durability test protocol for vehicular fuel cell systems operated in power-follow mode based on statistical results of on-road data

    Science.gov (United States)

    Xu, Liangfei; Reimer, Uwe; Li, Jianqiu; Huang, Haiyan; Hu, Zunyan; Jiang, Hongliang; Janßen, Holger; Ouyang, Minggao; Lehnert, Werner

    2018-02-01

    City buses using polymer electrolyte membrane (PEM) fuel cells are considered to be the most likely fuel cell vehicles to be commercialized in China. The technical specifications of the fuel cell systems (FCSs) these buses are equipped with will differ based on the powertrain configurations and vehicle control strategies, but can generally be classified into the power-follow and soft-run modes. Each mode imposes different levels of electrochemical stress on the fuel cells. Evaluating the aging behavior of fuel cell stacks under the conditions encountered in fuel cell buses requires new durability test protocols based on statistical results obtained during actual driving tests. In this study, we propose a systematic design method for fuel cell durability test protocols that correspond to the power-follow mode based on three parameters for different fuel cell load ranges. The powertrain configurations and control strategy are described herein, followed by a presentation of the statistical data for the duty cycles of FCSs in one city bus in the demonstration project. Assessment protocols are presented based on the statistical results using mathematical optimization methods, and are compared to existing protocols with respect to common factors, such as time at open circuit voltage and root-mean-square power.

  18. Data base of accident and agricultural statistics for transportation risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Saricks, C.L.; Williams, R.G.; Hopf, M.R.

    1989-11-01

    A state-level data base of accident and agricultural statistics has been developed to support risk assessment for transportation of spent nuclear fuels and high-level radioactive wastes. This data base will enhance the modeling capabilities for more route-specific analyses of potential risks associated with transportation of these wastes to a disposal site. The data base and methodology used to develop state-specific accident and agricultural data bases are described, and summaries of accident and agricultural statistics are provided. 27 refs., 9 tabs.

  19. Data base of accident and agricultural statistics for transportation risk assessment

    International Nuclear Information System (INIS)

    Saricks, C.L.; Williams, R.G.; Hopf, M.R.

    1989-11-01

    A state-level data base of accident and agricultural statistics has been developed to support risk assessment for transportation of spent nuclear fuels and high-level radioactive wastes. This data base will enhance the modeling capabilities for more route-specific analyses of potential risks associated with transportation of these wastes to a disposal site. The data base and methodology used to develop state-specific accident and agricultural data bases are described, and summaries of accident and agricultural statistics are provided. 27 refs., 9 tabs

  20. What influences the choice of assessment methods in health technology assessments? Statistical analysis of international health technology assessments from 1989 to 2002.

    Science.gov (United States)

    Draborg, Eva; Andersen, Christian Kronborg

    2006-01-01

    Health technology assessment (HTA) has been used as input in decision making worldwide for more than 25 years. However, no uniform definition of HTA or agreement on assessment methods exists, leaving open the question of what influences the choice of assessment methods in HTAs. The objective of this study is to analyze statistically a possible relationship between the methods of assessment used in practical HTAs, the type of assessed technology, the type of assessors, and the year of publication. A sample of 433 HTAs published by eleven leading institutions or agencies in nine countries was reviewed and analyzed by multiple logistic regression. The study shows that outsourcing of HTA reports to external partners is associated with a higher likelihood of using assessment methods such as meta-analysis, surveys, economic evaluations, and randomized controlled trials, and with a lower likelihood of using assessment methods such as literature reviews and "other methods". The year of publication was statistically related to the inclusion of economic evaluations, with a decreasing likelihood over the period studied. The type of assessed technology was related to the use of economic evaluations (with a decreasing likelihood), to surveys, and to "other methods" (with a decreasing likelihood) when pharmaceuticals were the assessed type of technology. During the period from 1989 to 2002, no major developments in the assessment methods used in practical HTAs were shown statistically in a sample of 433 HTAs worldwide. Outsourcing to external assessors has a statistically significant influence on the choice of assessment methods.
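
    The analysis described above is a multiple logistic regression of whether a given assessment method was used on assessor type, technology type, and publication year. A minimal sketch of that model form is given below, assuming hypothetical column names and synthetic data; it is not the authors' dataset or code.

```python
# Sketch: logistic regression of "economic evaluation used (yes/no)" on assessor type,
# technology type, and publication year. Data frame and column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 433
df = pd.DataFrame({
    "used_economic_eval": rng.integers(0, 2, n),
    "outsourced": rng.integers(0, 2, n),        # 1 = report produced by external assessors
    "pharmaceutical": rng.integers(0, 2, n),    # 1 = assessed technology is a pharmaceutical
    "year": rng.integers(1989, 2003, n),
})

model = smf.logit("used_economic_eval ~ outsourced + pharmaceutical + year", data=df).fit(disp=0)
print(np.exp(model.params))   # odds ratios for each predictor
```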

  1. Distributed power generation, a market assessment; Marktaspekte der verteilten Energieerzeugung

    Energy Technology Data Exchange (ETDEWEB)

    Weller, T.

    2001-03-01

    The article assesses, in the light of current energy policy, the future development of distributed power generation and the resulting impacts on the structure of the deregulated power industry in Germany. The author defines the essential characteristics of distributed power generation as opposed to centralized power generation, explains the various existing and emerging power generation technologies, and discusses market penetration scenarios and marketing opportunities in the context of technological developments, environmental and energy efficiency aspects, and consumer attitudes. (orig./CB) [German original: The article provides important definitions for a productive discussion of the entire field of distributed and decentralized power generation. It attempts to trace partly emotionally charged topics back to factually defensible basic assumptions and draws initial conclusions about the interplay of renewable energies and distributed power generation. (orig./CB)]

  2. Seismic design of nuclear power plants - an assessment

    International Nuclear Information System (INIS)

    Howard, G.E.; Ibanez, P.; Smith, C.B.

    1976-01-01

    This paper presents a review and evaluation of the design standards and the analytical and experimental methods used in the seismic design of nuclear power plants with emphasis on United States practice. Three major areas were investigated: (a) soils, siting, and seismic ground motion specification; (b) soil-structure interaction; and (c) the response of major nuclear power plant structures and components. The purpose of this review and evaluation program was to prepare an independent assessment of the state-of-the-art of the seismic design of nuclear power plants and to identify seismic analysis and design research areas meriting support by the various organizations comprising the 'nuclear power industry'. Criteria used for evaluating the relative importance of alternative research areas included the potential research impact on nuclear power plant siting, design, construction, cost, safety, licensing, and regulation. (Auth.)

  3. Transient stability risk assessment of power systems incorporating wind farms

    DEFF Research Database (Denmark)

    Miao, Lu; Fang, Jiakun; Wen, Jinyu

    2013-01-01

    Large-scale wind farm integration has brought several challenges to the transient stability of power systems. This paper focuses on the transient stability of power systems incorporating wind farms by utilizing risk assessment methods. A detailed model of the doubly fed induction generator has been established, and wind penetration variation and multiple stochastic factors of power systems have been considered. The process of transient stability risk assessment based on the Monte Carlo method is described, and a comprehensive risk indicator is proposed. An investigation has been conducted into an improved 10-generator 39-bus system with a wind farm incorporated, to verify the validity and feasibility of the proposed risk assessment method.

  4. Economic assessment group on power transmission and distribution networks tariffs

    International Nuclear Information System (INIS)

    2000-06-01

    Facing the new law on the electric power market liberalization, the french government created an experts group to analyze solutions and assessment methods of the electrical networks costs and tariffs and to control their efficiency. This report presents the analysis and the conclusions of the group. It concerns the three main subjects: the regulation context, the tariffing of the electric power transmission and distribution (the cost and efficiency of the various options) and the tariffing of the electric power supply to the eligible consumers. The authors provide a guideline for a tariffing policy. (A.L.B.)

  5. Employment of kernel methods on wind turbine power performance assessment

    DEFF Research Database (Denmark)

    Skrimpas, Georgios Alexandros; Sweeney, Christian Walsted; Marhadi, Kun S.

    2015-01-01

    A power performance assessment technique is developed for the detection of power production discrepancies in wind turbines. The method employs a widely used nonparametric pattern recognition technique, the kernel methods. The evaluation is based on the trending of a feature extracted from the kernel matrix, called the similarity index, which is introduced by the authors for the first time. The operation of the turbine, and consequently the computation of the similarity indexes, is classified into five power bins, offering better resolution and thus more consistent root cause analysis. The accurate...

  6. Assessing survivability to support power grid investment decisions

    International Nuclear Information System (INIS)

    Koziolek, Anne; Avritzer, Alberto; Suresh, Sindhu; Menasché, Daniel S.; Diniz, Morganna; Souza e Silva, Edmundo de; Leão, Rosa M.; Trivedi, Kishor; Happe, Lucia

    2016-01-01

    The reliability of power grids has been subject of study for the past few decades. Traditionally, detailed models are used to assess how the system behaves after failures. Such models, based on power flow analysis and detailed simulations, yield accurate characterizations of the system under study. However, they fall short on scalability. In this paper, we propose an efficient and scalable approach to assess the survivability of power systems. Our approach takes into account the phased-recovery of the system after a failure occurs. The proposed phased-recovery model yields metrics such as the expected accumulated energy not supplied between failure and full recovery. Leveraging the predictive power of the model, we use it as part of an optimization framework to assist in investment decisions. Given a budget and an initial circuit to be upgraded, we propose heuristics to sample the solution space in a principled way accounting for survivability-related metrics. We have evaluated the feasibility of this approach by applying it to the design of a benchmark distribution automation circuit. Our empirical results indicate that the combination of survivability and power flow analysis can provide meaningful investment decision support for power systems engineers. - Highlights: • We propose metrics and models for scalable survivability analysis of power systems. • The survivability model captures the system phased-recovery, from failure to repair. • The survivability model is used as a building block of an optimization framework. • Heuristics assist in investment options accounting for survivability-related metrics.

  7. A Tsunami Fragility Assessment for Nuclear Power Plants in Korea

    International Nuclear Information System (INIS)

    Kim, Min Kyu; Choi, In Kil; Kang, Keum Seok

    2009-01-01

    Although tsunami events were defined as an external event in the 'PRA Procedure Guide' (NUREG/CR-2300) after 1982, tsunamis were not considered in the design and construction of NPPs before the Sumatra earthquake in 2004. However, the Madras Atomic Power Station, a commercial nuclear power plant owned and operated by the Nuclear Power Corporation of India Limited (NPCIL) and located near Chennai, India, was affected by the tsunami generated by the 2004 Sumatra earthquake (USNRC 2008). The condenser cooling pumps of Unit 2 of the installation were affected due to flooding of the pump house and subsequent submergence of the seawater pumps by tsunami waves. The turbine was tripped and the reactor shut down. The unit was brought to a cold-shutdown state, and the shutdown-cooling systems were reported as operating safely. After this event, tsunami hazards came to be considered one of the major natural disasters that can affect the safety of nuclear power plants. The IAEA carried out an extrabudgetary project on tsunami hazard assessment, and an International Seismic Safety Center (ISSC) was subsequently established in the IAEA for protection from natural disasters such as earthquakes and tsunamis. For these reasons, a tsunami hazard assessment method was developed in this study. First, a procedure for tsunami hazard assessment was established; second, target equipment and structures for the tsunami hazard assessment were selected. Finally, a sample fragility calculation was performed for one piece of equipment in a nuclear power plant

  8. [An investigation of the statistical power of the effect size in randomized controlled trials for the treatment of patients with type 2 diabetes mellitus using Chinese medicine].

    Science.gov (United States)

    Ma, Li-Xin; Liu, Jian-Ping

    2012-01-01

    To investigate whether the power of the effect size was based on an adequate sample size in randomized controlled trials (RCTs) for the treatment of patients with type 2 diabetes mellitus (T2DM) using Chinese medicine. The China Knowledge Resource Integrated Database (CNKI), VIP Database for Chinese Technical Periodicals (VIP), Chinese Biomedical Database (CBM), and Wanfang Data were systematically searched using terms like "Xiaoke" or diabetes, Chinese herbal medicine, patent medicine, traditional Chinese medicine, randomized, controlled, blinded, and placebo-controlled. A limitation was set on the intervention course (> or = 3 months) in order to identify the information on outcome assessment and the sample size. Data collection forms were made according to the checklists found in the CONSORT statement. Independent double data extraction was performed on all included trials. The statistical power of the effect size for each RCT was assessed using sample size calculation equations. (1) A total of 207 RCTs were included, comprising 111 superiority trials and 96 non-inferiority trials. (2) Among the 111 superiority trials, the fasting plasma glucose (FPG) and glycosylated hemoglobin (HbA1c) outcome measures were reported in 9% and 12% of the RCTs, respectively, with a sample size > 150 in each trial. For the outcome of HbA1c, only 10% of the RCTs had more than 80% power. For FPG, 23% of the RCTs had more than 80% power. (3) In the 96 non-inferiority trials, the outcomes FPG and HbA1c were reported in 31% and 36%, respectively; these RCTs had a sample size > 150. For HbA1c only 36% of the RCTs had more than 80% power. For FPG, only 27% of the studies had more than 80% power. The sample size for statistical analysis was distressingly low and most RCTs did not achieve 80% power. In order to obtain sufficient statistical power, it is recommended that clinical trials should first establish a clear research objective and hypothesis, and choose scientific and evidence
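
    The sample size calculations referred to above can be reproduced in outline with a standard two-sample power routine. The sketch below assumes a continuous outcome such as HbA1c and an illustrative standardized effect size (Cohen's d = 0.3); neither value is taken from the review.

```python
# Sketch: post-hoc power of a two-arm trial for a continuous outcome, and the per-arm
# sample size needed for 80% power. The effect size d = 0.3 is an assumed illustration.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
for n_per_arm in (30, 75, 150, 300):
    power = analysis.power(effect_size=0.3, nobs1=n_per_arm, alpha=0.05, ratio=1.0)
    print(f"n per arm = {n_per_arm:4d} -> power = {power:.2f}")

# Required n per arm to reach 80% power at the same effect size:
n_needed = analysis.solve_power(effect_size=0.3, power=0.8, alpha=0.05, ratio=1.0)
print(f"n per arm for 80% power: {n_needed:.0f}")
```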

  9. Blind image quality assessment based on aesthetic and statistical quality-aware features

    Science.gov (United States)

    Jenadeleh, Mohsen; Masaeli, Mohammad Masood; Moghaddam, Mohsen Ebrahimi

    2017-07-01

    The main goal of image quality assessment (IQA) methods is the emulation of human perceptual image quality judgments. Therefore, the correlation between objective scores of these methods with human perceptual scores is considered as their performance metric. Human judgment of the image quality implicitly includes many factors when assessing perceptual image qualities such as aesthetics, semantics, context, and various types of visual distortions. The main idea of this paper is to use a host of features that are commonly employed in image aesthetics assessment in order to improve blind image quality assessment (BIQA) methods accuracy. We propose an approach that enriches the features of BIQA methods by integrating a host of aesthetics image features with the features of natural image statistics derived from multiple domains. The proposed features have been used for augmenting five different state-of-the-art BIQA methods, which use natural scene statistics features. Experiments were performed on seven benchmark image quality databases. The experimental results showed significant improvement of the accuracy of the methods.

  10. Impact of statistical learning methods on the predictive power of multivariate normal tissue complication probability models

    NARCIS (Netherlands)

    Xu, Cheng-Jian; van der Schaaf, Arjen; Schilstra, Cornelis; Langendijk, Johannes A.; van t Veld, Aart A.

    2012-01-01

    PURPOSE: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. METHODS AND MATERIALS: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator

  11. Sex differences in discriminative power of volleyball game-related statistics.

    Science.gov (United States)

    João, Paulo Vicente; Leite, Nuno; Mesquita, Isabel; Sampaio, Jaime

    2010-12-01

    To identify sex differences in volleyball game-related statistics, the game-related statistics of several World Championships in 2007 (N=132) were analyzed using the software VIS from the International Volleyball Federation. Discriminant analysis was used to identify the game-related statistics which better discriminated performances by sex. Analysis yielded an emphasis on fault serves (SC = -.40), shot spikes (SC = .40), and reception digs (SC = .31). These robust coefficients indicate that considerable variability was evident in the game-related statistics profiles, as men's volleyball games were better associated with terminal actions (errors of service), and women's volleyball games were characterized by continuous actions (in defense and attack). These differences may be related to the anthropometric and physiological differences between women and men and their influence on performance profiles.
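
    A minimal sketch of the analysis pattern (discriminant analysis followed by structure coefficients, here computed as correlations between each variable and the discriminant scores) is shown below on synthetic data; the variable names echo the abstract but the numbers are placeholders.

```python
# Sketch: discriminant analysis by sex with structure coefficients (SC) computed as
# variable-discriminant score correlations. Data are synthetic placeholders.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(2)
X = rng.normal(size=(132, 3))              # columns: fault serves, shot spikes, reception digs
X[:66] += [0.6, -0.6, -0.5]                # shift one group to create separation
y = np.repeat([0, 1], 66)                  # 0 = men's matches, 1 = women's matches

lda = LinearDiscriminantAnalysis(n_components=1).fit(X, y)
scores = lda.transform(X).ravel()

for name, column in zip(["fault serves", "shot spikes", "reception digs"], X.T):
    sc = np.corrcoef(column, scores)[0, 1]
    print(f"SC({name}) = {sc:+.2f}")
```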

  12. Statistical annual report 2008 - Furnas Electric Power Plants Inc. - Calendar year 2007

    International Nuclear Information System (INIS)

    2008-01-01

    This 30th edition of the statistical annual report of Furnas describes the performance of the company in 2007 and recent years, providing a general view of: the Furnas system; production and supply; financial and economic data; personnel; and indicators

  13. Building damage assessment from PolSAR data using texture parameters of statistical model

    Science.gov (United States)

    Li, Linlin; Liu, Xiuguo; Chen, Qihao; Yang, Shuai

    2018-04-01

    Accurate building damage assessment is essential in providing decision support for disaster relief and reconstruction. Polarimetric synthetic aperture radar (PolSAR) has become one of the most effective means of building damage assessment, due to its all-day/all-weather ability and richer backscatter information of targets. However, intact buildings that are not parallel to the SAR flight pass (termed oriented buildings) and collapsed buildings share similar scattering mechanisms, both of which are dominated by volume scattering. This characteristic always leads to misjudgments between assessments of collapsed buildings and oriented buildings from PolSAR data. Because the collapsed buildings and the intact buildings (whether oriented or parallel buildings) have different textures, a novel building damage assessment method is proposed in this study to address this problem by introducing texture parameters of statistical models. First, the logarithms of the estimated texture parameters of different statistical models are taken as a new texture feature to describe the collapse of the buildings. Second, the collapsed buildings and intact buildings are distinguished using an appropriate threshold. Then, the building blocks are classified into three levels based on the building block collapse rate. Moreover, this paper also discusses the capability for performing damage assessment using texture parameters from different statistical models or using different estimators. The RADARSAT-2 and ALOS-1 PolSAR images are used to present and analyze the performance of the proposed method. The results show that using the texture parameters avoids the problem of confusing collapsed and oriented buildings and improves the assessment accuracy. The results assessed by using the K/G0 distribution texture parameters estimated based on the second moment obtain the highest extraction accuracies. For the RADARSAT-2 and ALOS-1 data, the overall accuracy (OA) for these three types of

  14. Mathematics authentic assessment on statistics learning: the case for student mini projects

    Science.gov (United States)

    Fauziah, D.; Mardiyana; Saputro, D. R. S.

    2018-03-01

    Mathematics authentic assessment is a form of meaningful measurement of student learning outcomes in the spheres of attitude, skill, and knowledge in mathematics. The construction of attitude, skill, and knowledge is achieved through the fulfilment of tasks which involve the active and creative role of the students. One type of authentic assessment is the student mini project, which starts from planning and proceeds through data collecting, organizing, processing, analysing, and presenting the data. The purpose of this research is to study the process of using authentic assessment in statistics learning as conducted by teachers, and to discuss specifically the use of mini projects to improve students' learning in schools in Surakarta. This research is an action research, where the data were collected through the results of the assessment rubric for the student mini projects. The analysis shows that the average rubric score for the student mini projects is 82, with 96% classical completeness. This study shows that the application of authentic assessment can improve students' mathematics learning outcomes. Findings showed that teachers and students participate actively during the teaching and learning process, both inside and outside of the school. Student mini projects also provide opportunities to interact with other people in a real context while collecting information and giving presentations to the community. Additionally, students are able to achieve more in the process of statistics learning using authentic assessment.

  15. Stochastic assessment of investment efficiency in a power system

    International Nuclear Information System (INIS)

    Davidov, Sreten; Pantoš, Miloš

    2017-01-01

    The assessment of investment efficiency plays a critical role in investment prioritization in the context of electrical network expansion planning. Hence, this paper proposes new criteria for cost-efficient investment, applied in the investment ranking process in electrical network planning and based on the assessment of the impact of new investment candidates on active-power losses, bus voltages and line loadings in the network. These three general criteria are chosen because of their strong economic influence, in the case of active-power losses and line loadings, and because of their significant impact on quality of supply, in the case of the voltage profile. Electrical network reliability of supply is not addressed, since this criterion has already been extensively applied in other solutions regarding investment efficiency assessment. The proposed ranking procedure involves a stochastic approach applying the Monte Carlo method in the scenario preparation. The number of scenarios is further reduced by the K-MEANS procedure in order to speed up the investment efficiency assessment. The proposed ranking procedure is tested using the standard New England test system. The results show that, based on the newly introduced investment assessment criteria indices, system operators will obtain a prioritized list of investments that will prevent excessive and economically wasteful spending. - Highlights: • Active-Power Loss Investment Efficiency Index LEI. • Voltage Profile Investment Efficiency Index VEI. • Active-Power Flow Loading Mitigation Investment Efficiency Index PEI. • Optimization model for network expansion planning with new indices.
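
    The Monte Carlo scenario preparation followed by K-MEANS reduction mentioned above follows a common pattern; a minimal sketch of that step (with assumed dimensions and distributions, not the paper's data) is given below.

```python
# Sketch: Monte Carlo scenario sampling followed by K-means scenario reduction.
# Scenario dimensions, the sampling distribution, and the reduced set size are assumptions.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
n_scenarios, n_buses = 5000, 39
scenarios = rng.lognormal(mean=0.0, sigma=0.25, size=(n_scenarios, n_buses))  # bus load multipliers

k = 20                                                        # reduced scenario set size (assumed)
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(scenarios)
reduced = km.cluster_centers_                                 # representative scenarios
weights = np.bincount(km.labels_, minlength=k) / n_scenarios  # probability of each representative
print(reduced.shape, weights.round(3))
```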

  16. Wind power planning: assessing long-term costs and benefits

    International Nuclear Information System (INIS)

    Kennedy, Scott

    2005-01-01

    In the following paper, a new and straightforward technique for estimating the social benefit of large-scale wind power production is presented. The social benefit is based upon wind power's energy and capacity services and the avoidance of environmental damages. The approach uses probabilistic load duration curves to account for the stochastic interaction between wind power availability, electricity demand, and conventional generator dispatch. The model is applied to potential offshore wind power development to the south of Long Island, NY. If natural gas combined cycle and integrated gasifier combined cycle (IGCC) are the alternative generation sources, wind power exhibits a negative social benefit due to its high capacity cost and the relatively low emissions of these advanced fossil-fuel technologies. Environmental benefits increase significantly if charges for CO2 emissions are included. Results also reveal a diminishing social benefit as wind power penetration increases. The dependence of wind power benefits on CO2 charges and capital costs for wind turbines and IGCC plant is also discussed. The methodology is intended for use by energy planners in assessing the social benefit of future investments in wind power.

  17. The mediation of environmental assessment's influence: What role for power?

    International Nuclear Information System (INIS)

    Cashmore, Matthew; Axelsson, Anna

    2013-01-01

    Considerable empirical research has been conducted on why policy tools such as environmental assessment (EA) often appear to have ‘little effect’ (after Weiss) on policy decisions. This article revisits this debate but looks at a mediating factor that has received limited attention to-date in the context of EA — political power. Using a tripartite analytical framework, a comparative analysis of the influence and significance of power in mediating environmental policy integration is undertaken. Power is analysed, albeit partially, through an exploration of institutions that underpin social order. Empirically, the research examines the case of a new approach to policy-level EA (essentially a form of Strategic Environmental Assessment) developed by the World Bank and its trial application to urban environmental governance and planning in Dhaka mega-city, Bangladesh. The research results demonstrate that power was intimately involved in mediating the influence of the policy EA approach, in both positive (enabling) and negative (constraining) ways. It is suggested that the policy EA approach was ultimately a manifestation of a corporate strategy to maintain the powerful position of the World Bank as a leading authority on international development which focuses on knowledge generation. Furthermore, as constitutive of an institution and reflecting the worldviews of its proponents, the development of a new approach to EA also represents a significant power play. This leads us to, firstly, emphasise the concepts of strategy and intentionality in theorising how and why EA tools are employed, succeed and fail; and secondly, reflect on the reasons why power has received such limited attention to-date in EA scholarship. - Highlights: ► Conducts empirical research on the neglected issue of power. ► Employs an interpretation of power in which it is viewed as a productive phenomenon. ► Analyses the influence of power in the trial application of a new approach to

  18. Statistical assessment of crosstalk enrichment between gene groups in biological networks.

    Science.gov (United States)

    McCormack, Theodore; Frings, Oliver; Alexeyenko, Andrey; Sonnhammer, Erik L L

    2013-01-01

    Analyzing groups of functionally coupled genes or proteins in the context of global interaction networks has become an important aspect of bioinformatic investigations. Assessing the statistical significance of crosstalk enrichment between or within groups of genes can be a valuable tool for functional annotation of experimental gene sets. Here we present CrossTalkZ, a statistical method and software to assess the significance of crosstalk enrichment between pairs of gene or protein groups in large biological networks. We demonstrate that the standard z-score is generally an appropriate and unbiased statistic. We further evaluate the ability of four different methods to reliably recover crosstalk within known biological pathways. We conclude that the methods preserving the second-order topological network properties perform best. Finally, we show how CrossTalkZ can be used to annotate experimental gene sets using known pathway annotations and that its performance at this task is superior to gene enrichment analysis (GEA). CrossTalkZ (available at http://sonnhammer.sbc.su.se/download/software/CrossTalkZ/) is implemented in C++, easy to use, fast, accepts various input file formats, and produces a number of statistics. These include z-score, p-value, false discovery rate, and a test of normality for the null distributions.
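
    The core z-score idea can be sketched as follows: count the links between two gene groups, build a null distribution by permuting node labels, and report z = (observed - mean) / std. The sketch below uses simple label permutation for brevity; CrossTalkZ itself favours permutation schemes that preserve second-order (degree) properties, so this is an illustration of the statistic, not the tool's algorithm.

```python
# Sketch of a crosstalk z-score: observed inter-group link count versus a null built
# from random node relabelling (CrossTalkZ uses more refined, degree-aware schemes).
import numpy as np
import networkx as nx

def crosstalk_z(graph, group_a, group_b, n_perm=1000, seed=0):
    rng = np.random.default_rng(seed)
    nodes = list(graph.nodes())

    def count_links(a, b):
        a, b = set(a), set(b)
        return sum(1 for u, v in graph.edges() if (u in a and v in b) or (u in b and v in a))

    observed = count_links(group_a, group_b)
    null = np.empty(n_perm)
    for i in range(n_perm):
        relabel = dict(zip(nodes, rng.permutation(nodes)))
        null[i] = count_links([relabel[n] for n in group_a], [relabel[n] for n in group_b])
    return (observed - null.mean()) / null.std()

G = nx.erdos_renyi_graph(200, 0.05, seed=1)
print(crosstalk_z(G, group_a=range(0, 20), group_b=range(20, 40)))
```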

  19. A power comparison of generalized additive models and the spatial scan statistic in a case-control setting

    Directory of Open Access Journals (Sweden)

    Ozonoff Al

    2010-07-01

    Background: A common, important problem in spatial epidemiology is measuring and identifying variation in disease risk across a study region. In application of statistical methods, the problem has two parts. First, spatial variation in risk must be detected across the study region and, second, areas of increased or decreased risk must be correctly identified. The location of such areas may give clues to environmental sources of exposure and disease etiology. One statistical method applicable in spatial epidemiologic settings is a generalized additive model (GAM) which can be applied with a bivariate LOESS smoother to account for geographic location as a possible predictor of disease status. A natural hypothesis when applying this method is whether residential location of subjects is associated with the outcome, i.e. is the smoothing term necessary? Permutation tests are a reasonable hypothesis testing method and provide adequate power under a simple alternative hypothesis. These tests have yet to be compared to other spatial statistics. Results: This research uses simulated point data generated under three alternative hypotheses to evaluate the properties of the permutation methods and compare them to the popular spatial scan statistic in a case-control setting. Case 1 was a single circular cluster centered in a circular study region. The spatial scan statistic had the highest power though the GAM method estimates did not fall far behind. Case 2 was a single point source located at the center of a circular cluster and Case 3 was a line source at the center of the horizontal axis of a square study region. Each had linearly decreasing log odds with distance from the point. The GAM methods outperformed the scan statistic in Cases 2 and 3. Comparing sensitivity, measured as the proportion of the exposure source correctly identified as high or low risk, the GAM methods outperformed the scan statistic in all three Cases. Conclusions: The GAM

  20. A power comparison of generalized additive models and the spatial scan statistic in a case-control setting.

    Science.gov (United States)

    Young, Robin L; Weinberg, Janice; Vieira, Verónica; Ozonoff, Al; Webster, Thomas F

    2010-07-19

    A common, important problem in spatial epidemiology is measuring and identifying variation in disease risk across a study region. In application of statistical methods, the problem has two parts. First, spatial variation in risk must be detected across the study region and, second, areas of increased or decreased risk must be correctly identified. The location of such areas may give clues to environmental sources of exposure and disease etiology. One statistical method applicable in spatial epidemiologic settings is a generalized additive model (GAM) which can be applied with a bivariate LOESS smoother to account for geographic location as a possible predictor of disease status. A natural hypothesis when applying this method is whether residential location of subjects is associated with the outcome, i.e. is the smoothing term necessary? Permutation tests are a reasonable hypothesis testing method and provide adequate power under a simple alternative hypothesis. These tests have yet to be compared to other spatial statistics. This research uses simulated point data generated under three alternative hypotheses to evaluate the properties of the permutation methods and compare them to the popular spatial scan statistic in a case-control setting. Case 1 was a single circular cluster centered in a circular study region. The spatial scan statistic had the highest power though the GAM method estimates did not fall far behind. Case 2 was a single point source located at the center of a circular cluster and Case 3 was a line source at the center of the horizontal axis of a square study region. Each had linearly decreasing log odds with distance from the point. The GAM methods outperformed the scan statistic in Cases 2 and 3. Comparing sensitivity, measured as the proportion of the exposure source correctly identified as high or low risk, the GAM methods outperformed the scan statistic in all three Cases. The GAM permutation testing methods provide a regression
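
    The permutation-testing logic referred to in both records is easy to illustrate in a simplified form: permute case-control labels, recompute a spatial statistic, and compare the observed value to the permutation distribution. The sketch below uses mean nearest-neighbour distance among cases as a stand-in statistic; the study itself permutes labels around a GAM/LOESS deviance statistic, which is not reproduced here.

```python
# Sketch: case-control spatial permutation test with a stand-in clustering statistic
# (mean nearest-neighbour distance among cases); synthetic coordinates and labels.
import numpy as np
from scipy.spatial import cKDTree

def perm_test(coords, is_case, n_perm=999, seed=0):
    rng = np.random.default_rng(seed)

    def stat(labels):
        cases = coords[labels]
        d, _ = cKDTree(cases).query(cases, k=2)     # k=2: nearest neighbour other than self
        return d[:, 1].mean()

    observed = stat(is_case)
    null = np.array([stat(rng.permutation(is_case)) for _ in range(n_perm)])
    # Small observed values indicate tighter clustering of cases than expected by chance.
    p_value = (np.sum(null <= observed) + 1) / (n_perm + 1)
    return observed, p_value

rng = np.random.default_rng(4)
coords = rng.uniform(0, 10, size=(500, 2))
labels = np.zeros(500, dtype=bool)
labels[:100] = True
print(perm_test(coords, labels))
```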

  1. No-Reference Video Quality Assessment Based on Statistical Analysis in 3D-DCT Domain.

    Science.gov (United States)

    Li, Xuelong; Guo, Qun; Lu, Xiaoqiang

    2016-05-13

    It is an important task to design models for universal no-reference video quality assessment (NR-VQA) in multiple video processing and computer vision applications. However, most existing NR-VQA metrics are designed for specific distortion types which are often not known in practical applications. A further deficiency is that the spatial and temporal information of videos is hardly considered simultaneously. In this paper, we propose a new NR-VQA metric based on the spatiotemporal natural video statistics (NVS) in the 3D discrete cosine transform (3D-DCT) domain. In the proposed method, a set of features is first extracted based on the statistical analysis of 3D-DCT coefficients to characterize the spatiotemporal statistics of videos in different views. These features are then used to predict the perceived video quality via an efficient linear support vector regression (SVR) model. The contributions of this paper are: 1) we explore the spatiotemporal statistics of videos in the 3D-DCT domain, which has an inherent spatiotemporal encoding advantage over other widely used 2D transformations; 2) we extract a small set of simple but effective statistical features for video visual quality prediction; 3) the proposed method is universal for multiple types of distortions and robust to different databases. The proposed method is tested on four widely used video databases. Extensive experimental results demonstrate that the proposed method is competitive with the state-of-the-art NR-VQA metrics and the top-performing FR-VQA and RR-VQA metrics.
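
    The general pipeline described above (block-wise 3D-DCT, summary statistics of the coefficients, then linear support vector regression onto quality scores) can be sketched as follows. The block size and the particular summary statistics are assumptions for illustration and are not the authors' feature set.

```python
# Sketch of an NR-VQA-style pipeline: 3D-DCT coefficient statistics per spatiotemporal
# block, averaged per video, regressed onto quality scores with a linear SVR.
import numpy as np
from scipy.fft import dctn
from scipy.stats import kurtosis
from sklearn.svm import SVR

def video_features(video, block=(8, 8, 8)):
    t, h, w = (s - s % b for s, b in zip(video.shape, block))
    feats = []
    for i in range(0, t, block[0]):
        for j in range(0, h, block[1]):
            for k in range(0, w, block[2]):
                cube = video[i:i + block[0], j:j + block[1], k:k + block[2]]
                c = dctn(cube, norm="ortho").ravel()[1:]          # drop the DC coefficient
                feats.append([np.mean(np.abs(c)), np.std(c), kurtosis(c)])
    return np.mean(feats, axis=0)                                 # one feature vector per video

rng = np.random.default_rng(5)
X = np.array([video_features(rng.normal(size=(16, 32, 32))) for _ in range(40)])
y = rng.uniform(0, 100, size=40)            # placeholder subjective quality scores
model = SVR(kernel="linear").fit(X, y)      # linear SVR, as in the abstract above
print(model.predict(X[:3]))
```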

  2. A framework for evaluating innovative statistical and risk assessment tools to solve environment restoration problems

    International Nuclear Information System (INIS)

    Hassig, N.L.; Gilbert, R.O.; Pulsipher, B.A.

    1991-09-01

    Environmental restoration activities at the US Department of Energy (DOE) Hanford site face complex issues due to a history of varied past contaminant disposal practices. Data collection and analysis required for site characterization, pathway modeling, and remediation selection decisions must deal with inherent uncertainties and unique problems associated with the restoration. A framework for working through the statistical aspects of the site characterization and remediation selection problems is needed. This framework would facilitate the selection of appropriate statistical tools for solving unique aspects of the environmental restoration problem. This paper presents a framework for selecting appropriate statistical and risk assessment methods. The following points will be made: (1) pathway modelers and risk assessors often recognize that "some type" of statistical methods are required but do not work with statisticians on tools development in the early planning phases of the project; (2) statistical tools selection and development are problem-specific and often site-specific, further indicating a need for up-front involvement of statisticians; and (3) the right tool, applied in the right way, can minimize sampling costs, get as much information as possible out of the data that do exist, provide consistency and defensibility for the results, and give structure and quantitative measures to decision risks and uncertainties.

  3. A multivariate statistical methodology for detection of degradation and failure trends using nuclear power plant operational data

    International Nuclear Information System (INIS)

    Samanta, P.K.; Teichmann, T.

    1990-01-01

    In this paper, a multivariate statistical method is presented and demonstrated as a means for analyzing nuclear power plant transients (or events) and safety system performance for detection of malfunctions and degradations within the course of the event based on operational data. The study provides the methodology and illustrative examples based on data gathered from simulation of nuclear power plant transients (due to lack of easily accessible operational data). Such an approach, once fully developed, can be used to detect failure trends and patterns and so can lead to prevention of conditions with serious safety implications

  4. Blinking in quantum dots: The origin of the grey state and power law statistics

    Science.gov (United States)

    Ye, Mao; Searson, Peter C.

    2011-09-01

    Quantum dot (QD) blinking is characterized by switching between an “on” state and an “off” state, and a power-law distribution of on and off times with exponents from 1.0 to 2.0. The origin of blinking behavior in QDs, however, has remained a mystery. Here we describe an energy-band model for QDs that captures the full range of blinking behavior reported in the literature and provides new insight into features such as the gray state, the power-law distribution of on and off times, and the power-law exponents.
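
    The power-law exponents quoted above are typically estimated from the measured on/off durations by maximum likelihood. The sketch below uses the standard continuous power-law estimator, alpha_hat = 1 + n / sum(ln(t / t_min)), on synthetic data; it is a generic estimator, not a procedure from the paper.

```python
# Sketch: maximum-likelihood fit of a continuous power-law exponent to blinking
# on/off durations; the data here are synthetic draws from a known exponent.
import numpy as np

def powerlaw_mle(durations, t_min):
    t = np.asarray(durations, dtype=float)
    t = t[t >= t_min]
    return 1.0 + t.size / np.sum(np.log(t / t_min))

rng = np.random.default_rng(6)
alpha_true, t_min = 1.6, 1e-3
samples = t_min * rng.uniform(size=50_000) ** (-1.0 / (alpha_true - 1.0))  # inverse-CDF sampling
print(f"estimated exponent: {powerlaw_mle(samples, t_min):.3f}")           # close to 1.6
```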

  5. Statistical power to detect change in a mangrove shoreline fish community adjacent to a nuclear power plant.

    Science.gov (United States)

    Dolan, T E; Lynch, P D; Karazsia, J L; Serafy, J E

    2016-03-01

    An expansion is underway of a nuclear power plant on the shoreline of Biscayne Bay, Florida, USA. While the precise effects of its construction and operation are unknown, impacts on surrounding marine habitats and biota are considered by experts to be likely. The objective of the present study was to determine the adequacy of an ongoing monitoring survey of fish communities associated with mangrove habitats directly adjacent to the power plant to detect fish community changes, should they occur, at three spatial scales. Using seasonally resolved data recorded during 532 fish surveys over an 8-year period, power analyses were performed for four mangrove fish metrics (fish diversity, fish density, and the occurrence of two ecologically important fish species: gray snapper (Lutjanus griseus) and goldspotted killifish (Floridichthys carpio)). Results indicated that the monitoring program at current sampling intensity allows for detection of <33% changes in fish density and diversity metrics in both the wet and the dry season in the two larger study areas. Sampling effort was found to be insufficient in either season to detect changes at this level (<33%) in species-specific occurrence metrics for the two fish species examined. The option of supplementing ongoing, biological monitoring programs for improved, focused change detection deserves consideration from both ecological and cost-benefit perspectives.
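
    A back-of-the-envelope version of this kind of power analysis asks what percentage change in mean density is detectable at 80% power for a given survey count. The sketch below assumes a simple two-sample t-test framing and an illustrative coefficient of variation; the programme's actual metrics and variance structure are not reproduced.

```python
# Sketch: minimum detectable change (as % of the mean) versus survey effort for a
# two-sample comparison. The CV of the density metric is an assumed value.
import numpy as np
from statsmodels.stats.power import TTestIndPower

cv = 0.8                                    # assumed coefficient of variation of fish density
analysis = TTestIndPower()
effects = np.linspace(0.05, 2.0, 400)       # candidate standardized effects (Cohen's d)
for n_surveys in (30, 70, 130):
    power = np.array([analysis.power(effect_size=d, nobs1=n_surveys, alpha=0.05) for d in effects])
    d_min = effects[np.argmax(power >= 0.8)]            # smallest d reaching 80% power
    print(f"n = {n_surveys:3d} surveys/period -> detectable change ~ {100 * d_min * cv:.0f}% of the mean")
```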

  6. Safety assessment of emergency electric power systems for nuclear power plants

    International Nuclear Information System (INIS)

    1986-09-01

    This paper is intended to assist the safety assessor within a regulatory body, or one working as a consultant, in assessing a given design of the Emergency Electrical Power System. Those non-electric power systems which may be used in a plant design to serve as emergency energy sources are addressed only in their general safety aspects. The paper thus relates closely to Safety Series 50-SG-D7 ''Emergency Power Systems at Nuclear Power Plants'' (1982), as far as it addresses emergency electric power systems. Several aspects are dealt with: the information the assessor may expect from the applicant to fulfill his task of safety review; the main questions the reviewer has to answer in order to determine the compliance with requirements of the NUSS documents; the national or international standards which give further guidance on a certain system or piece of equipment; comments and suggestions which may help to judge a variety of possible solutions

  7. The N-Pact Factor: Evaluating the Quality of Empirical Journals with Respect to Sample Size and Statistical Power

    Science.gov (United States)

    Fraley, R. Chris; Vazire, Simine

    2014-01-01

    The authors evaluate the quality of research reported in major journals in social-personality psychology by ranking those journals with respect to their N-pact Factors (NF), the statistical power of the empirical studies they publish to detect typical effect sizes. Power is a particularly important attribute for evaluating research quality because, relative to studies that have low power, studies that have high power are more likely (a) to provide accurate estimates of effects, (b) to produce literatures with low false positive rates, and (c) to lead to replicable findings. The authors show that the average sample size in social-personality research is 104 and that the power to detect the typical effect size in the field is approximately 50%. Moreover, they show that there is considerable variation among journals in the sample sizes and power of the studies they publish, with some journals consistently publishing higher power studies than others. The authors hope that these rankings will be of use to authors who are choosing where to submit their best work, provide hiring and promotion committees with a superior way of quantifying journal quality, and encourage competition among journals to improve their NF rankings. PMID:25296159
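
    The roughly 50% power figure can be reproduced with a Fisher z approximation for a correlation of about r = 0.20 at n = 104; the r value is an assumption used to illustrate the calculation, not a number taken from the article.

```python
# Sketch: two-sided power to detect a correlation r with sample size n, via the
# Fisher z approximation. r = 0.20 is an assumed "typical" effect size.
import numpy as np
from scipy.stats import norm

def corr_power(r, n, alpha=0.05):
    z_r = np.arctanh(r)                   # Fisher z transform of r
    se = 1.0 / np.sqrt(n - 3)
    z_crit = norm.ppf(1 - alpha / 2)
    ncp = z_r / se                        # noncentrality of the test statistic
    return norm.sf(z_crit - ncp) + norm.cdf(-z_crit - ncp)

print(f"power at n = 104, r = 0.20: {corr_power(0.20, 104):.2f}")   # roughly 0.5
```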

  8. Statistical Assessment of the Effectiveness of Transformation Change (by Case of Singapore)

    Directory of Open Access Journals (Sweden)

    Zhuravlyov

    2017-02-01

    In studies of economic transformations and their statistical assessment, the causality of processes specific to economic relations and the development of institutions is often overlooked. The article is devoted to the important topic of statistical assessment of the effectiveness of transformations. The case of Singapore is taken because it is an Asian country demonstrating the essential role of the institutional environment in the transformation of the national economy. A regression analysis of the impact of institutional factors on economic growth in Singapore is performed using 17 indicators: civil freedoms, corruption, economic freedom, economic globalization, spending on education, use of energy, share of women in the labor market, fiscal freedom, price of fuel, PPP, effectiveness of public administration, level of consumption, Human Development Index, Internet users, life expectancy, unemployment, and openness of trade. The economic interpretation of the statistical assessment of economic transformations in Singapore is as follows: the quality of the institutional environment (control of corruption, economic freedom, supremacy of law, etc.) is of critical importance for economic development in Singapore; increasing spending on education has positive effects on economic growth in Singapore; and economic growth in Singapore has a high positive correlation with energy consumption.

  9. Power-law Statistics of Driven Reconnection in the Magnetically Closed Corona

    Science.gov (United States)

    Knizhnik, K. J.; Uritsky, V. M.; Klimchuk, J. A.; DeVore, C. R.

    2018-01-01

    Numerous observations have revealed that power-law distributions are ubiquitous in energetic solar processes. Hard X-rays, soft X-rays, extreme ultraviolet radiation, and radio waves all display power-law frequency distributions. Since magnetic reconnection is the driving mechanism for many energetic solar phenomena, it is likely that reconnection events themselves display such power-law distributions. In this work, we perform numerical simulations of the solar corona driven by simple convective motions at the photospheric level. Using temperature changes, current distributions, and Poynting fluxes as proxies for heating, we demonstrate that energetic events occurring in our simulation display power-law frequency distributions, with slopes in good agreement with observations. We suggest that the braiding-associated reconnection in the corona can be understood in terms of a self-organized criticality model driven by convective rotational motions similar to those observed at the photosphere.

  10. Power-Law Statistics of Driven Reconnection in the Magnetically Closed Corona

    Science.gov (United States)

    Klimchuk, J. A.; DeVore, C. R.; Knizhnik, K. J.; Uritskiy, V. M.

    2018-01-01

    Numerous observations have revealed that power-law distributions are ubiquitous in energetic solar processes. Hard X-rays, soft X-rays, extreme ultraviolet radiation, and radio waves all display power-law frequency distributions. Since magnetic reconnection is the driving mechanism for many energetic solar phenomena, it is likely that reconnection events themselves display such power-law distributions. In this work, we perform numerical simulations of the solar corona driven by simple convective motions at the photospheric level. Using temperature changes, current distributions, and Poynting fluxes as proxies for heating, we demonstrate that energetic events occurring in our simulation display power-law frequency distributions, with slopes in good agreement with observations. We suggest that the braiding-associated reconnection in the corona can be understood in terms of a self-organized criticality model driven by convective rotational motions similar to those observed at the photosphere.

  11. A statistical assessment of differences and equivalences between genetically modified and reference plant varieties

    Directory of Open Access Journals (Sweden)

    Amzal Billy

    2011-02-01

    Background: Safety assessment of genetically modified organisms is currently often performed by comparative evaluation. However, natural variation of plant characteristics between commercial varieties is usually not considered explicitly in the statistical computations underlying the assessment. Results: Statistical methods are described for the assessment of the difference between a genetically modified (GM) plant variety and a conventional non-GM counterpart, and for the assessment of the equivalence between the GM variety and a group of reference plant varieties which have a history of safe use. It is proposed to present the results of both difference and equivalence testing for all relevant plant characteristics simultaneously in one or a few graphs, as an aid for further interpretation in safety assessment. A procedure is suggested to derive equivalence limits from the observed results for the reference plant varieties using a specific implementation of the linear mixed model. Three different equivalence tests are defined to classify any result in one of four equivalence classes. The performance of the proposed methods is investigated by a simulation study, and the methods are illustrated on compositional data from a field study on maize grain. Conclusions: A clear distinction of practical relevance is shown between difference and equivalence testing. The proposed tests are shown to have appropriate performance characteristics by simulation, and the proposed simultaneous graphical representation of results was found to be helpful for the interpretation of results from a practical field trial data set.
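
    The difference-versus-equivalence logic above can be sketched with a simple two one-sided tests (TOST) procedure in which the equivalence limits are taken from the spread of the reference varieties. The paper derives its limits from a linear mixed model, so the version below is a deliberately simplified stand-in with synthetic data.

```python
# Sketch: difference test (GM vs. conventional) plus TOST equivalence test against
# limits derived from reference-variety means. Data and limit rule are illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
gm = rng.normal(10.2, 0.5, 20)                  # GM variety measurements (synthetic)
conv = rng.normal(10.0, 0.5, 20)                # conventional counterpart
refs = rng.normal(10.0, 0.5, (8, 20))           # 8 reference varieties with a history of safe use

# Difference test: GM vs. conventional counterpart.
t_diff, p_diff = stats.ttest_ind(gm, conv)

# Equivalence limits (assumed rule): mean of reference-variety means +/- 2 SD.
ref_means = refs.mean(axis=1)
low = ref_means.mean() - 2 * ref_means.std(ddof=1)
high = ref_means.mean() + 2 * ref_means.std(ddof=1)

# TOST: the GM mean must be significantly above 'low' AND below 'high'.
se = gm.std(ddof=1) / np.sqrt(gm.size)
p_lower = stats.t.sf((gm.mean() - low) / se, df=gm.size - 1)
p_upper = stats.t.cdf((gm.mean() - high) / se, df=gm.size - 1)
print(f"difference p = {p_diff:.3f}; TOST p = {max(p_lower, p_upper):.3f} (equivalence if < 0.05)")
```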

  12. Preliminary regulatory assessment of nuclear power plants vulnerabilities

    International Nuclear Information System (INIS)

    Kostadinov, V.; Petelin, S.

    2004-01-01

    Preliminary attempts to develop models for nuclear regulatory vulnerability assessment of nuclear power plants are presented. The development of this philosophy and the associated computer tools could provide new and important insights for the management of nuclear operators and for nuclear regulatory bodies, who face difficult questions about how to assess the vulnerability of nuclear power plants and other nuclear facilities to external and internal threats. In a situation where different and hidden threat sources are dispersed throughout the world, the assessment of the security and safe operation of nuclear power plants is very important. The capability to evaluate plant vulnerability to different kinds of threats, such as human and natural occurrences and terrorist attacks, together with the preparation of emergency response plans and the estimation of costs, is of vital importance for the assurance of national security. On the basis of such insights, nuclear operators and nuclear regulatory bodies could plan and optimise changes in oversight procedures, organisations, equipment, hardware and software to reduce risks, taking into account the security and safety of nuclear power plant operation, budget, manpower, and other limitations. Initial qualitative estimations of adapted assessments for nuclear applications are briefly presented. (author)

  13. Risk assessment and the social response to nuclear power

    International Nuclear Information System (INIS)

    Otway, H.J.

    1977-01-01

    A theoretical framework for risk assessment studies is presented. Methodologies from various disciplines can be used within this framework to allow a scientific approach to the understanding of complex interactions between technological and social systems. A pilot application of an attitude-formation model to examine the underlying determinants of groups for and against nuclear power is summarized. (author)

  14. Muscle blood volume assessment during exercise with Power Doppler Ultrasound

    NARCIS (Netherlands)

    Heres, H.M.; Tchang, B.C.Y.; Schoots, T.; Rutten, M.C.M.; van de Vosse, F.N.; Lopata, R.G.P.

    2016-01-01

    Assessment of perfusion adaptation in muscle during exercise can provide diagnostic information on cardiac and endothelial diseases. Power Doppler Ultrasound (PDUS) is known for its feasibility in the non-invasive measurement of moving blood volume (MBV), a perfusion related parameter. In this

  15. Shielding assessment of the ETRR-1 Reactor Under power upgrading

    Energy Technology Data Exchange (ETDEWEB)

    Ahmad, E E [Reactor Department, Nuclear Research Center, Atomic Energy Authority, Cairo (Egypt)

    1997-12-31

    The assessment of the existing shielding of the ETRR-1 reactor in case of power upgrading is presented and discussed. It was carried out using both the present EK-10 type fuel elements and some other types of fuel elements with different enrichments. The shielding requirements for the ETRR-1 when its power is upgraded are also discussed. The optimization curves relating the upgraded reactor power and the shield thickness are presented. The calculations have been made using the ANISN code with the DLC-75 data library. The results showed that the present shield requires an additional layer of steel with a thickness of 10, 20 and 25 cm when the power is upgraded to 3, 6 and 10 MWt, respectively, in order to cut off all neutron energy groups and remain adequately safe under normal operating conditions. 4 figs.

  16. Life Cycle Assessment of Coal-fired Power Production

    Energy Technology Data Exchange (ETDEWEB)

    Spath, P. L.; Mann, M. K.; Kerr, D. R.

    1999-09-01

    Coal has the largest share of utility power generation in the US, accounting for approximately 56% of all utility-produced electricity (US DOE, 1998). Therefore, understanding the environmental implications of producing electricity from coal is an important component of any plan to reduce total emissions and resource consumption. A life cycle assessment (LCA) on the production of electricity from coal was performed in order to examine the environmental aspects of current and future pulverized coal boiler systems. Three systems were examined: (1) a plant that represents the average emissions and efficiency of currently operating coal-fired power plants in the US (this tells us about the status quo), (2) a new coal-fired power plant that meets the New Source Performance Standards (NSPS), and (3) a highly advanced coal-fired power plant utilizing a low emission boiler system (LEBS).

  17. The significance of structural power in Strategic Environmental Assessment

    DEFF Research Database (Denmark)

    Hansen, Anne Merrild; Kørnøv, Lone; Cashmore, Matthew Asa

    2013-01-01

    This article presents a study of how power dynamics enable and constrain the influence of actors upon decision-making and Strategic Environmental Assessment (SEA). Based on Anthony Giddens' structuration theory (ST), a model for studying power dynamics in strategic decision-making processes is developed. The study shows that actors influence both the outcome and the frames for strategic decision making, and that attention needs to be paid not only to the formal interactions between the SEA process and the strategic decision-making process but also to the informal interaction and communication between actors. The informal structures prove crucial to the outcome of the decision-making process. The article is meant as a supplement to the understanding of how power dynamics influence IA processes, emphasising the capacity of agents to mobilise and create change. Despite the epistemological challenges of using ST as an approach to power analysis, this meta...

  18. Craig's XY distribution and the statistics of Lagrangian power in two-dimensional turbulence

    Science.gov (United States)

    Bandi, Mahesh M.; Connaughton, Colm

    2008-03-01

    We examine the probability distribution function (PDF) of the energy injection rate (power) in numerical simulations of stationary two-dimensional (2D) turbulence in the Lagrangian frame. The simulation is designed to mimic an electromagnetically driven fluid layer, a well-documented system for generating 2D turbulence in the laboratory. In our simulations, the forcing and velocity fields are close to Gaussian. On the other hand, the measured PDF of injected power is very sharply peaked at zero, suggestive of a singularity there, with tails which are exponential but asymmetric. Large positive fluctuations are more probable than large negative fluctuations. It is this asymmetry of the tails which leads to a net positive mean value for the energy input despite the most probable value being zero. The main features of the power distribution are well described by Craig’s XY distribution for the PDF of the product of two correlated normal variables. We show that the power distribution should exhibit a logarithmic singularity at zero and decay exponentially for large absolute values of the power. We calculate the asymptotic behavior and express the asymmetry of the tails in terms of the correlation coefficient of the force and velocity. We compare the measured PDFs with the theoretical calculations and briefly discuss how the power PDF might change with other forcing mechanisms.
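
    For two zero-mean, unit-variance correlated normal variables, the product density usually quoted for Craig's XY distribution involves the modified Bessel function K0, which produces the logarithmic peak at zero and the asymmetric exponential tails described above. The sketch below compares a Monte Carlo histogram of such a product with that closed form; the zero-mean, unit-variance setting is an assumption made for simplicity.

```python
# Sketch: product of two correlated standard normals versus the K0-based density often
# quoted for Craig's XY distribution (zero-mean, unit-variance case assumed).
import numpy as np
from scipy.special import k0

rho = 0.4
rng = np.random.default_rng(8)
x = rng.standard_normal(1_000_000)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(1_000_000)
z = x * y                                   # "power" as the product of force and velocity

def product_pdf(z, rho):
    s = 1.0 - rho**2
    return np.exp(rho * z / s) * k0(np.abs(z) / s) / (np.pi * np.sqrt(s))

hist, edges = np.histogram(z, bins=200, range=(-5, 5), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
# Discrepancies concentrate in the bins around the logarithmic peak at z = 0.
print(np.max(np.abs(hist - product_pdf(centers, rho))))
```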

  19. Power and environmental assessment: Introduction to the special issue

    International Nuclear Information System (INIS)

    Cashmore, Matthew; Richardson, Tim

    2013-01-01

    The significance of politics and power dynamics has long been recognised in environmental assessment (EA) research, but there has not been sustained attention to power, either theoretically or empirically. The aim of this special issue is to encourage the EA community to engage more consistently with the issue of power. The introduction represents a ground-clearing exercise intended to clarify the terms of the debate about power in the EA field, and to contribute to the development of a research agenda. Research trends in the field are outlined, and potential analytic and normative lines of inquiry are identified. The contributions to this special issue represent contrasting conceptual and methodological approaches that navigate the analytical and normative terrain of power dynamics in EA. Together, they demonstrate that power cannot be removed from EA policy or practices, and is a necessary research focus for the development of the field. - Highlights: ► Introduces the themed section on power ► Provides an overview of the papers in the themed section ► Identifies research trends and directions for future research

  20. Improved statistical power with a sparse shape model in detecting an aging effect in the hippocampus and amygdala

    Science.gov (United States)

    Chung, Moo K.; Kim, Seung-Goo; Schaefer, Stacey M.; van Reekum, Carien M.; Peschke-Schmitz, Lara; Sutterer, Matthew J.; Davidson, Richard J.

    2014-03-01

    The sparse regression framework has been widely used in medical image processing and analysis. However, it has been rarely used in anatomical studies. We present a sparse shape modeling framework using the Laplace-Beltrami (LB) eigenfunctions of the underlying shape and show its improvement of statistical power. Traditionally, the LB-eigenfunctions are used as a basis for intrinsically representing surface shapes as a form of Fourier descriptors. To reduce high frequency noise, only the first few terms are used in the expansion and higher frequency terms are simply thrown away. However, some lower frequency terms may not necessarily contribute significantly in reconstructing the surfaces. Motivated by this idea, we present a LB-based method to filter out only the significant eigenfunctions by imposing a sparse penalty. For dense anatomical data such as deformation fields on a surface mesh, the sparse regression behaves like a smoothing process, which will reduce the error of incorrectly detecting false negatives. Hence the statistical power improves. The sparse shape model is then applied in investigating the influence of age on amygdala and hippocampus shapes in the normal population. The advantage of the LB sparse framework is demonstrated by showing the increased statistical power.
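
    The core idea, selecting basis terms with an L1 penalty instead of truncating at a fixed frequency, can be illustrated with an ordinary Lasso fit on a one-dimensional cosine basis standing in for the Laplace-Beltrami eigenfunctions of a surface mesh; the basis, signal, and penalty weight below are assumptions for illustration.

```python
# Sketch: sparse (Lasso) selection of basis coefficients on a cosine basis that stands
# in for LB eigenfunctions; only a few terms carry signal and the L1 penalty keeps them.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(9)
t = np.linspace(0, 1, 400)
basis = np.column_stack([np.cos(np.pi * k * t) for k in range(50)])   # surrogate "eigenfunctions"
true_coef = np.zeros(50)
true_coef[[1, 3, 12]] = [2.0, -1.0, 0.5]                              # only a few active terms
signal = basis @ true_coef + rng.normal(0, 0.2, t.size)               # noisy "deformation" signal

fit = Lasso(alpha=0.02).fit(basis, signal)
kept = np.flatnonzero(np.abs(fit.coef_) > 1e-3)
print("basis terms retained by the sparse fit:", kept)
```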

  1. Statistical analysis of fuel failures in large break loss-of-coolant accident (LBLOCA) in EPR type nuclear power plant

    International Nuclear Information System (INIS)

    Arkoma, Asko; Hänninen, Markku; Rantamäki, Karin; Kurki, Joona; Hämäläinen, Anitta

    2015-01-01

    Highlights: • The number of failing fuel rods in a LB-LOCA in an EPR is evaluated. • 59 scenarios are simulated with the system code APROS. • 1000 rods per scenario are simulated with the fuel performance code FRAPTRAN-GENFLO. • All the rods in the reactor are simulated in the worst scenario. • Results suggest that the regulations set by the Finnish safety authority are met. - Abstract: In this paper, the number of failing fuel rods in a large break loss-of-coolant accident (LB-LOCA) in EPR-type nuclear power plant is evaluated using statistical methods. For this purpose, a statistical fuel failure analysis procedure has been developed. The developed method utilizes the results of nonparametric statistics, the Wilks’ formula in particular, and is based on the selection and variation of parameters that are important in accident conditions. The accident scenario is simulated with the coupled fuel performance – thermal hydraulics code FRAPTRAN-GENFLO using various parameter values and thermal hydraulic and power history boundary conditions between the simulations. The number of global scenarios is 59 (given by the Wilks’ formula), and 1000 rods are simulated in each scenario. The boundary conditions are obtained from a new statistical version of the system code APROS. As a result, in the worst global scenario, 1.2% of the simulated rods failed, and it can be concluded that the Finnish safety regulations are hereby met (max. 10% of the rods allowed to fail)
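
    The figure of 59 scenarios follows from the first-order, one-sided Wilks formula: the number of runs N must satisfy 1 - 0.95^N >= 0.95, which gives N = 59 for a 95%/95% tolerance limit. A small sketch of the standard formula (an illustration, not code from the study) is shown below.

```python
import math

def wilks_one_sided(coverage: float, confidence: float, order: int = 1) -> int:
    """Smallest number of runs N such that the `order`-th largest result bounds
    the `coverage` quantile of the output with the given one-sided `confidence`."""
    n = order
    while True:
        # K ~ Binomial(n, coverage) counts runs falling below the coverage quantile;
        # the order-th largest run bounds that quantile iff K <= n - order.
        conf = sum(math.comb(n, j) * coverage**j * (1 - coverage)**(n - j)
                   for j in range(n - order + 1))
        if conf >= confidence:
            return n
        n += 1

print(wilks_one_sided(0.95, 0.95))           # 59, matching the 59 global scenarios
print(wilks_one_sided(0.95, 0.95, order=2))  # 93 for the second-order variant
```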

  2. Statistical analysis of fuel failures in large break loss-of-coolant accident (LBLOCA) in EPR type nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Arkoma, Asko, E-mail: asko.arkoma@vtt.fi; Hänninen, Markku; Rantamäki, Karin; Kurki, Joona; Hämäläinen, Anitta

    2015-04-15

    Highlights: • The number of failing fuel rods in a LB-LOCA in an EPR is evaluated. • 59 scenarios are simulated with the system code APROS. • 1000 rods per scenario are simulated with the fuel performance code FRAPTRAN-GENFLO. • All the rods in the reactor are simulated in the worst scenario. • Results suggest that the regulations set by the Finnish safety authority are met. - Abstract: In this paper, the number of failing fuel rods in a large break loss-of-coolant accident (LB-LOCA) in EPR-type nuclear power plant is evaluated using statistical methods. For this purpose, a statistical fuel failure analysis procedure has been developed. The developed method utilizes the results of nonparametric statistics, the Wilks’ formula in particular, and is based on the selection and variation of parameters that are important in accident conditions. The accident scenario is simulated with the coupled fuel performance – thermal hydraulics code FRAPTRAN-GENFLO using various parameter values and thermal hydraulic and power history boundary conditions between the simulations. The number of global scenarios is 59 (given by the Wilks’ formula), and 1000 rods are simulated in each scenario. The boundary conditions are obtained from a new statistical version of the system code APROS. As a result, in the worst global scenario, 1.2% of the simulated rods failed, and it can be concluded that the Finnish safety regulations are hereby met (max. 10% of the rods allowed to fail)

  3. The intermediates take it all: asymptotics of higher criticism statistics and a powerful alternative based on equal local levels.

    Science.gov (United States)

    Gontscharuk, Veronika; Landwehr, Sandra; Finner, Helmut

    2015-01-01

    The higher criticism (HC) statistic, which can be seen as a normalized version of the famous Kolmogorov-Smirnov statistic, has a long history, dating back to the mid-seventies. Originally, HC statistics were used in connection with goodness of fit (GOF) tests but they recently gained some attention in the context of testing the global null hypothesis in high dimensional data. The continuing interest in HC seems to be inspired by a series of nice asymptotic properties related to this statistic. For example, unlike Kolmogorov-Smirnov tests, GOF tests based on the HC statistic are known to be asymptotically sensitive in the moderate tails, and hence it is favorably applied for detecting the presence of signals in sparse mixture models. However, some questions around the asymptotic behavior of the HC statistic are still open. We focus on two of them, namely, why a specific intermediate range is crucial for GOF tests based on the HC statistic and why the convergence of the HC distribution to the limiting one is extremely slow. Moreover, the inconsistency between the asymptotic and finite-sample behavior of the HC statistic prompts us to provide a new HC test that has better finite-sample properties than the original HC test while showing the same asymptotics. This test is motivated by the asymptotic behavior of the so-called local levels related to the original HC test. By means of numerical calculations and simulations we show that the new HC test is typically more powerful than the original HC test in normal mixture models. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
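
    For readers unfamiliar with the statistic, one common variant of higher criticism standardizes the gap between the ordered p-values and the uniform distribution and takes the maximum over the smallest fraction of them. The sketch below is a generic illustration; the restriction parameter alpha0 and the example data are assumptions, not taken from the paper.

```python
import numpy as np

def higher_criticism(pvalues, alpha0=0.5):
    """One common HC variant: the maximum standardized deviation between the
    empirical distribution of the p-values and the uniform, taken over the
    smallest alpha0-fraction of the ordered p-values."""
    p = np.sort(np.asarray(pvalues, dtype=float))
    p = np.clip(p, 1e-12, 1.0 - 1e-12)       # guard against division by zero
    n = p.size
    i = np.arange(1, n + 1)
    hc = np.sqrt(n) * (i / n - p) / np.sqrt(p * (1.0 - p))
    imax = max(1, int(alpha0 * n))
    return float(hc[:imax].max())

# Under the global null the p-values are uniform; a sparse signal inflates HC.
rng = np.random.default_rng(2)
null_p = rng.uniform(size=10_000)
signal_p = np.concatenate([rng.uniform(size=9_900), rng.uniform(0, 1e-4, size=100)])
print("HC under null  :", higher_criticism(null_p))
print("HC with signal :", higher_criticism(signal_p))
```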

  4. Steady state security assessment in deregulated power systems

    Science.gov (United States)

    Manjure, Durgesh Padmakar

    Power system operations are undergoing changes, brought about primarily due to deregulation and subsequent restructuring of the power industry. The primary intention of the introduction of deregulation in power systems was to bring about competition and improved customer focus. The underlying motive was increased economic benefit. Present day power system analysis is much different than what it was earlier, essentially due to the transformation of the power industry from being cost-based to one that is price-based and due to open access of transmission networks to the various market participants. Power is now treated as a commodity and is traded in an open market. The resultant interdependence of the technical criteria and the economic considerations has only accentuated the need for accurate analysis in power systems. The main impetus in security analysis studies is on efficient assessment of the post-contingency status of the system, accuracy being of secondary consideration. In most cases, given the time frame involved, it is not feasible to run a complete AC load flow for determining the post-contingency state of the system. Quite often it is not warranted either, since an indication of the state of the system is desired rather than an exact quantification of the various state variables. With the inception of deregulation, transmission networks are subjected to a host of multilateral transactions, which would influence physical system quantities like real power flows, security margins and voltage levels. For efficient asset utilization and maximization of revenue, more often than not, transmission networks are operated under stressed conditions, close to security limits. Therefore, a quantitative assessment of the extent to which each transaction adversely affects the transmission network is required. This needs to be done accurately, as the feasibility of the power transactions and subsequent decisions (execution, curtailment, pricing) would depend upon the
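
    The point that a full AC load flow is often unnecessary for post-contingency screening is commonly addressed with linearized (DC) power flow or sensitivity factors. The sketch below is a generic DC power-flow calculation on a hypothetical three-bus network, not material from the thesis itself.

```python
import numpy as np

# Toy 3-bus network; line data: (from_bus, to_bus, susceptance in p.u.)
lines = [(0, 1, 10.0), (1, 2, 10.0), (0, 2, 5.0)]
p_inj = np.array([1.0, -0.4, -0.6])   # net injections (generation - load), p.u.

# Build the bus susceptance matrix B.
n = 3
B = np.zeros((n, n))
for f, t, b in lines:
    B[f, f] += b; B[t, t] += b
    B[f, t] -= b; B[t, f] -= b

# Solve B * theta = P with bus 0 as the slack (theta_0 = 0).
theta = np.zeros(n)
theta[1:] = np.linalg.solve(B[1:, 1:], p_inj[1:])

# Line flows follow from angle differences: F_ft = b * (theta_f - theta_t).
for f, t, b in lines:
    print(f"flow {f}->{t}: {b * (theta[f] - theta[t]):+.3f} p.u.")
```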

  5. Combination of statistical and physically based methods to assess shallow slide susceptibility at the basin scale

    Science.gov (United States)

    Oliveira, Sérgio C.; Zêzere, José L.; Lajas, Sara; Melo, Raquel

    2017-07-01

    Approaches used to assess shallow slide susceptibility at the basin scale are conceptually different depending on the use of statistical or physically based methods. The former are based on the assumption that the same causes are more likely to produce the same effects, whereas the latter are based on the comparison between forces which tend to promote movement along the slope and the counteracting forces that are resistant to motion. Within this general framework, this work tests two hypotheses: (i) although conceptually and methodologically distinct, the statistical and deterministic methods generate similar shallow slide susceptibility results regarding the model's predictive capacity and spatial agreement; and (ii) the combination of shallow slide susceptibility maps obtained with statistical and physically based methods, for the same study area, generate a more reliable susceptibility model for shallow slide occurrence. These hypotheses were tested at a small test site (13.9 km2) located north of Lisbon (Portugal), using a statistical method (the information value method, IV) and a physically based method (the infinite slope method, IS). The landslide susceptibility maps produced with the statistical and deterministic methods were combined into a new landslide susceptibility map. The latter was based on a set of integration rules defined by the cross tabulation of the susceptibility classes of both maps and analysis of the corresponding contingency tables. The results demonstrate a higher predictive capacity of the new shallow slide susceptibility map, which combines the independent results obtained with statistical and physically based models. Moreover, the combination of the two models allowed the identification of areas where the results of the information value and the infinite slope methods are contradictory. Thus, these areas were classified as uncertain and deserve additional investigation at a more detailed scale.
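
    As a concrete illustration of the bivariate statistical component, the information value of a factor class is the log-ratio of the landslide density inside the class to the landslide density over the whole area. The sketch below uses synthetic raster data; the class labels and probabilities are assumptions for illustration only.

```python
import numpy as np

def information_value(class_ids, landslide, classes):
    """Bivariate information value per class:
    IV = ln( landslide density within the class / landslide density overall )."""
    class_ids = np.asarray(class_ids)
    landslide = np.asarray(landslide, dtype=bool)
    dens_total = landslide.sum() / landslide.size
    iv = {}
    for c in classes:
        dens_class = landslide[class_ids == c].mean()
        iv[c] = np.log(dens_class / dens_total) if dens_class > 0 else -np.inf
    return iv

# Hypothetical raster with three slope classes and a landslide inventory mask;
# a pixel's susceptibility score is the sum of IVs over all its factor classes.
rng = np.random.default_rng(3)
slope_class = rng.integers(0, 3, size=10_000)
slides = rng.random(10_000) < np.where(slope_class == 2, 0.05, 0.01)
print(information_value(slope_class, slides, classes=[0, 1, 2]))
```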

  6. Safety/security interface assessments at commercial nuclear power plants

    International Nuclear Information System (INIS)

    Byers, K.R.; Brown, P.J.; Norderhaug, L.R.

    1985-01-01

    The findings of the Haynes Task Force Committee (NUREG-0992) are used as the basis for defining safety/security assessment team activities at commercial nuclear power plants in NRC Region V. A safety/security interface assessment outline and the approach used for making the assessments are presented along with the composition of team members. As a result of observing simulated plant emergency conditions during scheduled emergency preparedness exercises, examining security and operational response procedures, and interviewing plant personnel, the team has identified instances where safety/security conflicts can occur

  7. Safety/security interface assessments at commercial nuclear power plants

    International Nuclear Information System (INIS)

    Byers, K.R.; Brown, P.J.; Norderhaug, L.R.

    1985-07-01

    The findings of the Haynes Task Force Committee (NUREG-0992) are used as the basis for defining safety/security assessment team activities at commercial nuclear power plants in NRC Region V. A safety/security interface assessment outline and the approach used for making the assessments are presented along with the composition of team members. As a result of observing simulated plant emergency conditions during scheduled emergency preparedness exercises, examining security and operational response procedures, and interviewing plant personnel, the team has identified instances where safety/security conflicts can occur. 2 refs

  8. Statistical Power in Evaluations That Investigate Effects on Multiple Outcomes: A Guide for Researchers

    Science.gov (United States)

    Porter, Kristin E.

    2016-01-01

    In education research and in many other fields, researchers are often interested in testing the effectiveness of an intervention on multiple outcomes, for multiple subgroups, at multiple points in time, or across multiple treatment groups. The resulting multiplicity of statistical hypothesis tests can lead to spurious findings of effects. Multiple…

  9. Statistical Power in Evaluations That Investigate Effects on Multiple Outcomes: A Guide for Researchers

    Science.gov (United States)

    Porter, Kristin E.

    2018-01-01

    Researchers are often interested in testing the effectiveness of an intervention on multiple outcomes, for multiple subgroups, at multiple points in time, or across multiple treatment groups. The resulting multiplicity of statistical hypothesis tests can lead to spurious findings of effects. Multiple testing procedures (MTPs) are statistical…
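
    Multiple testing procedures of the kind discussed here adjust p-values (or significance thresholds) so that the chance of spurious findings across many tests stays controlled. The snippet below is a generic illustration using statsmodels' adjustment routines; the p-values and the choice of methods are assumptions, not Porter's data or recommendations.

```python
import numpy as np
from statsmodels.stats.multitest import multipletests

# Hypothetical p-values from testing one intervention on several outcomes.
pvals = np.array([0.003, 0.012, 0.041, 0.049, 0.210, 0.370])

# Without adjustment, four of the six tests look "significant" at alpha = 0.05.
for method in ("bonferroni", "holm", "fdr_bh"):
    reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method=method)
    print(f"{method:10s} rejections: {reject.sum()}  adjusted p: {np.round(p_adj, 3)}")
```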

  10. A statistical power analysis of woody carbon flux from forest inventory data

    Science.gov (United States)

    James A. Westfall; Christopher W. Woodall; Mark A. Hatfield

    2013-01-01

    At a national scale, the carbon (C) balance of numerous forest ecosystem C pools can be monitored using a stock change approach based on national forest inventory data. Given the potential influence of disturbance events and/or climate change processes, the statistical detection of changes in forest C stocks is paramount to maintaining the net sequestration status of...

  11. Statistical analysis on the fluence factor of surveillance test data of Korean nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Gyeong Geun; Kim, Min Chul; Yoon, Ji Hyun; Lee, Bong Sang; Lim, Sang Yeob; Kwon, Jun Hyun [Nuclear Materials Safety Research Division, Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2017-06-15

    The transition temperature shift (TTS) of the reactor pressure vessel materials is an important factor that determines the lifetime of a nuclear power plant. The prediction of the TTS at the end of a plant’s lifespan is calculated based on the equation of Regulatory Guide 1.99 revision 2 (RG1.99/2) from the US. The fluence factor in the equation was expressed as a power function, and the exponent value was determined by the early surveillance data in the US. Recently, advanced approaches to estimate the TTS have been proposed in various countries, and Korea is considering the development of a new TTS model. In this study, the TTS trend of the Korean surveillance test results was analyzed using a nonlinear regression model and a mixed-effect model based on the power function. The nonlinear regression model yielded a fluence exponent similar to that of the power function in RG1.99/2. The mixed-effect model had a higher value of the exponent and showed superior goodness of fit compared with the nonlinear regression model. Compared with RG1.99/2 and RG1.99/3, the mixed-effect model provided a more accurate prediction of the TTS.
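
    A minimal version of the nonlinear-regression step, fitting TTS = coef * fluence^exponent to surveillance points with SciPy, is sketched below. The data values are invented for illustration and are not the Korean surveillance results; a mixed-effect variant would additionally include plant-level random effects.

```python
import numpy as np
from scipy.optimize import curve_fit

def tts_model(fluence, coef, exponent):
    """Simple power-function fluence dependence: TTS = coef * fluence**exponent."""
    return coef * fluence**exponent

# Hypothetical surveillance data: fluence in 1e19 n/cm^2, TTS in deg C (illustrative).
fluence = np.array([0.3, 0.7, 1.2, 2.0, 3.5, 5.0])
tts = np.array([12.0, 18.0, 24.0, 30.0, 39.0, 45.0])

popt, pcov = curve_fit(tts_model, fluence, tts, p0=(20.0, 0.3))
coef, exponent = popt
print(f"fitted coefficient: {coef:.1f}, fitted fluence exponent: {exponent:.2f}")
```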

  12. Complementary assessment of the safety of French nuclear power plants

    International Nuclear Information System (INIS)

    Camarcat, N.; Pouget-Abadie, X.

    2011-01-01

    As an immediate consequence of the Fukushima accident the French nuclear safety Authority (ASN) asked EDF to perform a complementary safety assessment for each nuclear power plant dealing with 3 points: 1) the consequences of exceptional natural disasters, 2) the consequences of total loss of electrical power, and 3) the management of emergency situations. The safety margin has to be assessed considering 3 main points: first a review of the conformity to the initial safety requirements, secondly the resistance to events overdoing what the facility was designed to stand for, and the feasibility of any modification susceptible to improve the safety of the facility. This article details the specifications of such assessment, the methodology followed by EDF, the task organization and the time schedule. (A.C.)

  13. Rainfall Downscaling Conditional on Upper-air Variables: Assessing Rainfall Statistics in a Changing Climate

    Science.gov (United States)

    Langousis, Andreas; Deidda, Roberto; Marrocu, Marino; Kaleris, Vassilios

    2014-05-01

    Due to its intermittent and highly variable character, and the modeling parameterizations used, precipitation is one of the least well reproduced hydrologic variables by both Global Climate Models (GCMs) and Regional Climate Models (RCMs). This is especially the case at a regional level (where hydrologic risks are assessed) and at small temporal scales (e.g. daily) used to run hydrologic models. In an effort to remedy those shortcomings and assess the effect of climate change on rainfall statistics at hydrologically relevant scales, Langousis and Kaleris (2013) developed a statistical framework for simulation of daily rainfall intensities conditional on upper air variables. The developed downscaling scheme was tested using atmospheric data from the ERA-Interim archive (http://www.ecmwf.int/research/era/do/get/index), and daily rainfall measurements from western Greece, and was proved capable of reproducing several statistical properties of actual rainfall records, at both annual and seasonal levels. This was done solely by conditioning rainfall simulation on a vector of atmospheric predictors, properly selected to reflect the relative influence of upper-air variables on ground-level rainfall statistics. In this study, we apply the developed framework for conditional rainfall simulation using atmospheric data from different GCM/RCM combinations. This is done using atmospheric data from the ENSEMBLES project (http://ensembleseu.metoffice.com), and daily rainfall measurements for an intermediate-sized catchment in Italy; i.e. the Flumendosa catchment. Since GCM/RCM products are suited to reproduce the local climatology in a statistical sense (i.e. in terms of relative frequencies), rather than ensuring a one-to-one temporal correspondence between observed and simulated fields (i.e. as is the case for ERA-interim reanalysis data), we proceed in three steps: a) we use statistical tools to establish a linkage between ERA-Interim upper-air atmospheric forecasts and

  14. The significance of structural power in Strategic Environmental Assessment

    International Nuclear Information System (INIS)

    Hansen, Anne Merrild; Kørnøv, Lone; Cashmore, Matthew; Richardson, Tim

    2013-01-01

    This article presents a study of how power dynamics enable and constrain the influence of actors upon decision-making and Strategic Environmental Assessment (SEA). Based on structuration theory, a model for studying power dynamics in strategic decision-making processes is developed. The model is used to map and analyse key decision arenas in the decision process of aluminium production in Greenland. The analysis shows that communication lines are an important resource through which actors exercise power and influence decision-making on the location of the aluminium production. The SEA process involved not only reproduction of formal communication and decision competence but also production of alternative informal communication structures through which the SEA had the capability to influence. It is concluded that actors influence strategic decision-making, and attention needs to be paid not only to the formal interactions between the SEA process and the strategic decision-making process but also to informal interaction and communication between actors, as these informal structures can be crucial to the outcome of the decision-making process. This article is meant to supplement the understanding of the influence of power dynamics in IA processes and to contribute to the IA research field a method for analysing power dynamics in strategic decision-making processes. The article also offers reflections on the strengths and weaknesses of using structuration theory as an approach to power analysis. - Highlights: ► Informal interaction influenced the process despite the presence of formalised rules. ► Interdependence of actors influenced SEA effectiveness. ► SEA practitioners successfully exercised power to influence decision-making. ► Power dynamics are properties of actors' interactions in decision-making. ► Power structures can be enabling and not solely limiting.

  15. Students’ perception of frequent assessments and its relation to motivation and grades in a statistics course: a pilot study

    NARCIS (Netherlands)

    Vaessen, B.E.; van den Beemt, A.A.J.; van de Watering, G.A.; van Meeuwen, L.W.; Lemmens, A.M.C.; den Brok, P.J.

    2017-01-01

    This pilot study measures university students’ perceptions of graded frequent assessments in an obligatory statistics course using a novel questionnaire. Relations between perceptions of frequent assessments, intrinsic motivation and grades were also investigated. A factor analysis of the

  16. Simulating European wind power generation applying statistical downscaling to reanalysis data

    DEFF Research Database (Denmark)

    Gonzalez-Aparicio, I.; Monforti, F.; Volker, Patrick

    2017-01-01

    …generation time series dataset for the EU-28 and neighbouring countries at hourly intervals and at different geographical aggregation levels (country, bidding zone and administrative territorial unit), for a 30-year period taking into account the wind generating fleet at the end of 2015. (C) 2017 The Authors … and characteristics of the wind resource, which is related to the accuracy of the approach in converting wind speed data into power values. One of the main factors contributing to the uncertainty in these conversion methods is the selection of the spatial resolution. Although numerical weather prediction models can … could not be captured by the use of a reanalysis technique and could be translated into misinterpretations of the wind power peaks, ramping capacities, the behaviour of power prices, as well as bidding strategies for the electricity market. This study contributes to the understanding of what is captured …
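
    The conversion from wind speed to power values mentioned in the abstract is commonly done by interpolating a turbine (or fleet-aggregated) power curve. The sketch below uses a generic, made-up power curve and is not the methodology of the cited dataset.

```python
import numpy as np

# Generic turbine power curve: wind speed (m/s) -> normalized power output.
curve_ws = np.array([0.0, 3.0, 5.0, 8.0, 11.0, 13.0, 25.0, 25.01])
curve_p  = np.array([0.0, 0.0, 0.10, 0.45, 0.95, 1.0, 1.0, 0.0])  # cut-out above 25 m/s

def wind_to_power(wind_speed_mps):
    """Convert hourly wind speeds to capacity factors by interpolating the power curve."""
    return np.interp(wind_speed_mps, curve_ws, curve_p, left=0.0, right=0.0)

hourly_ws = np.array([2.0, 6.5, 9.0, 12.0, 26.0])
print(wind_to_power(hourly_ws))   # 0 below cut-in, near 1.0 around rated, 0 beyond cut-out
```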

  17. New statistical potential for quality assessment of protein models and a survey of energy functions

    Directory of Open Access Journals (Sweden)

    Rykunov Dmitry

    2010-03-01

    Background: Scoring functions, such as molecular mechanics force fields and statistical potentials, are fundamentally important tools in protein structure modeling and quality assessment. Results: The performances of a number of publicly available scoring functions are compared with statistical rigor, with an emphasis on knowledge-based potentials. We explored the effect on accuracy of alternative choices for representing interaction center types and other features of scoring functions, such as using information on solvent accessibility, on torsion angles, accounting for secondary structure preferences and side chain orientation. Partially based on the observations made, we present a novel residue-based statistical potential, which employs a shuffled reference state definition and takes into account the mutual orientation of residue side chains. Atom- and residue-level statistical potentials and Linux executables to calculate the energy of a given protein proposed in this work can be downloaded from http://www.fiserlab.org/potentials. Conclusions: Among the most influential terms we observed a critical role of a proper reference state definition and the benefits of including information about the microenvironment of interaction centers. Molecular mechanical potentials were also tested and found to be over-sensitive to small local imperfections in a structure, requiring unfeasibly long energy relaxation before energy scores started to correlate with model quality.
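
    A toy version of a knowledge-based (statistical) potential helps make the role of the reference state concrete: the energy of a contact type is the negative log-ratio of its observed frequency to the frequency expected under a reference model. The counts, residue classes and the simple composition-based reference below are illustrative assumptions, not the potential published by the authors.

```python
import numpy as np
from collections import Counter

def contact_potential(contacts, residue_counts):
    """Toy knowledge-based potential: E(a, b) = -ln( f_obs(a, b) / f_ref(a, b) ),
    where f_ref comes from residue composition alone (a simple reference state)."""
    total_contacts = sum(contacts.values())
    total_res = sum(residue_counts.values())
    freq = {a: c / total_res for a, c in residue_counts.items()}
    energy = {}
    for (a, b), n_ab in contacts.items():
        f_obs = n_ab / total_contacts
        f_ref = freq[a] * freq[b] * (2.0 if a != b else 1.0)  # unordered pairs
        energy[(a, b)] = -np.log(f_obs / f_ref)
    return energy

# Hypothetical counts pooled over a structure database (illustrative only).
contacts = Counter({("HYD", "HYD"): 500, ("HYD", "POL"): 300, ("POL", "POL"): 200})
residues = Counter({"HYD": 600, "POL": 500})
print(contact_potential(contacts, residues))
```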

  18. Revised assessments of the economics of fusion power

    International Nuclear Information System (INIS)

    Han, W.E.; Ward, D.J.

    2009-01-01

    Although fusion power is being developed because of its large resource base, low environmental impact and high levels of intrinsic safety, it is also important to investigate the economics of a future fusion power plant in order to assess the potential market for the electricity produced. As part of the PPCS (Power Plant Conceptual Study) in Europe, published in 2005, an assessment was made of the likely economic performance of the range of fusion power plant concepts studied. Since that time, new work has been carried out, within the fusion programme, and particularly in the EU DEMO study, that changes a number of the important assumptions made in the PPCS. These changes allow either reduced cost versions of the PPCS plant models or, alternatively, plants with less ambitious technical assumptions at constant cost. The impact of the new results, emerging from the EU DEMO studies, on the role of fusion in the future energy market is described. A new energy economics model is employed to analyse the potential market performance of fusion power in a range of future energy scenarios and this shows that there can be a significant role for fusion in a future energy market.

  19. Statistical assessment of fish behavior from split-beam hydro-acoustic sampling

    International Nuclear Information System (INIS)

    McKinstry, Craig A.; Simmons, Mary Ann; Simmons, Carver S.; Johnson, Robert L.

    2005-01-01

    Statistical methods are presented for using echo-traces from split-beam hydro-acoustic sampling to assess fish behavior in response to a stimulus. The data presented are from a study designed to assess the response of free-ranging, lake-resident fish, primarily kokanee (Oncorhynchus nerka) and rainbow trout (Oncorhynchus mykiss), to high intensity strobe lights; the study was conducted at Grand Coulee Dam on the Columbia River in Northern Washington State. The lights were deployed immediately upstream from the turbine intakes, in a region exposed to daily alternating periods of high and low flows. The study design included five down-looking split-beam transducers positioned in a line at incremental distances upstream from the strobe lights, and treatments applied in randomized pseudo-replicate blocks. Statistical methods included the use of odds-ratios from fitted loglinear models. Fish-track velocity vectors were modeled using circular probability distributions. Both analyses are depicted graphically. Study results suggest large increases of fish activity in the presence of the strobe lights, most notably at night and during periods of low flow. The lights also induced notable bimodality in the angular distributions of the fish track velocity vectors. Statistical summaries are presented along with interpretations of fish behavior

  20. Mathematical Safety Assessment Approaches for Thermal Power Plants

    Directory of Open Access Journals (Sweden)

    Zong-Xiao Yang

    2014-01-01

    How to use system analysis methods to identify the hazards in the industrialized process, working environment, and production management for complex industrial processes, such as thermal power plants, is one of the challenges in systems engineering. A mathematical system safety assessment model is proposed for thermal power plants in this paper by integrating fuzzy analytical hierarchy process, set pair analysis, and system functionality analysis. On this basis, the key factors influencing thermal power plant safety are analyzed. The influence factors are determined based on fuzzy analytical hierarchy process. The connection degree among the factors is obtained by set pair analysis. The system safety preponderant function is constructed through system functionality analysis for inherence properties and nonlinear influence. The decision analysis system is developed by using active server page technology, web resource integration, and cross-platform capabilities for applications to the industrialized process. The availability of the proposed safety assessment approach is verified by using an actual thermal power plant, which has improved the enforceability and predictability in enterprise safety assessment.

  1. Power of mental health nursing research: a statistical analysis of studies in the International Journal of Mental Health Nursing.

    Science.gov (United States)

    Gaskin, Cadeyrn J; Happell, Brenda

    2013-02-01

    Having sufficient power to detect effect sizes of an expected magnitude is a core consideration when designing studies in which inferential statistics will be used. The main aim of this study was to investigate the statistical power in studies published in the International Journal of Mental Health Nursing. From volumes 19 (2010) and 20 (2011) of the journal, studies were analysed for their power to detect small, medium, and large effect sizes, according to Cohen's guidelines. The power of the 23 studies included in this review to detect small, medium, and large effects was 0.34, 0.79, and 0.94, respectively. In 90% of papers, no adjustments for experiment-wise error were reported. With a median of nine inferential tests per paper, the mean experiment-wise error rate was 0.51. A priori power analyses were only reported in 17% of studies. Although effect sizes for correlations and regressions were routinely reported, effect sizes for other tests (χ²-tests, t-tests, ANOVA/MANOVA) were largely absent from the papers. All types of effect sizes were infrequently interpreted. Researchers are strongly encouraged to conduct power analyses when designing studies, and to avoid scattergun approaches to data analysis (i.e. undertaking large numbers of tests in the hope of finding 'significant' results). Because reviewing effect sizes is essential for determining the clinical significance of study findings, researchers would better serve the field of mental health nursing if they reported and interpreted effect sizes. © 2012 The Authors. International Journal of Mental Health Nursing © 2012 Australian College of Mental Health Nurses Inc.
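
    Two of the quantities discussed, the power of a two-sample t-test at Cohen's benchmark effect sizes and the experiment-wise error rate implied by running many tests, can be reproduced with a few lines of statsmodels and arithmetic. The sample size and the independence assumption behind 1 - (1 - alpha)^m are illustrative; the paper's own 0.51 figure reflects its particular mix of tests.

```python
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Power of a two-sample t-test with 30 participants per group (Cohen's d benchmarks).
for label, d in (("small", 0.2), ("medium", 0.5), ("large", 0.8)):
    pw = analysis.power(effect_size=d, nobs1=30, alpha=0.05)
    print(f"{label:6s} effect (d={d}): power = {pw:.2f}")

# Experiment-wise (family-wise) error rate for m independent tests at alpha = 0.05.
m = 9
print("family-wise error rate:", 1 - (1 - 0.05) ** m)   # roughly 0.37 for 9 tests
```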

  2. Sustainability assessment of renewable power and heat generation technologies

    International Nuclear Information System (INIS)

    Dombi, Mihály; Kuti, István; Balogh, Péter

    2014-01-01

    Rationalisation of consumption, more efficient energy usage and a new energy structure need to be achieved in order to shift the structure of the energy system towards sustainability. The required energy system is characterised, among other things, by intensive utilisation of renewable energy sources (RES). RES technologies have their own advantages and disadvantages. Nevertheless, for strategic planning there is great demand for comparing RES technologies. Furthermore, additional functions of RES utilisation are expected beyond climate change mitigation, e.g. increased employment, economic growth and rural development. The aim of the study was to reveal the most beneficial RES technologies with special respect to sustainability. Ten technologies of power generation and seven technologies of heat supply were examined in a multi-criteria sustainability assessment frame of seven attributes which were evaluated based on a choice experiment (CE) survey. According to experts the most important characteristics of RES utilisation technologies are land demand and social impacts, i.e. increase in employment and local income generation. Concentrated solar power (CSP), hydropower and geothermal power plants are favourable technologies for power generation, while geothermal district heating, pellet-based non-grid heating and solar thermal heating can offer significant advantages in case of heat supply. - Highlights: • We used a choice experiment to estimate the weights of criteria for the sustainability assessment of RES technologies. • The most important attributes of RES technologies according to experts are land demand and social impacts. • Concentrated solar power (CSP), hydropower and geothermal power plants are advantageous technologies for power generation. • Geothermal district heating, pellet-based non-grid heating and solar thermal heating are favourable in case of heat supply

  3. Probabilistic assessment of power system transient stability incorporating SMES

    Energy Technology Data Exchange (ETDEWEB)

    Fang, Jiakun, E-mail: Jiakun.Fang@gmail.com [State Key Lab of Advanced Electromagnetic Engineering and Technology, Huazhong University of Science and Technology, No. 1037, Luoyu Road, Wuhan 430074 (China); Yao, Wei [State Key Lab of Advanced Electromagnetic Engineering and Technology, Huazhong University of Science and Technology, No. 1037, Luoyu Road, Wuhan 430074 (China); Wen, Jinyu, E-mail: jinyu.wen@hust.edu.cn [State Key Lab of Advanced Electromagnetic Engineering and Technology, Huazhong University of Science and Technology, No. 1037, Luoyu Road, Wuhan 430074 (China); Cheng, Shijie; Tang, Yuejin; Cheng, Zhuo [State Key Lab of Advanced Electromagnetic Engineering and Technology, Huazhong University of Science and Technology, No. 1037, Luoyu Road, Wuhan 430074 (China)

    2013-01-15

    Highlights: ► Probabilistic study of power system with wind farm and SMES is proposed. ► Quantitative relationship between system stability and SMES capacity is given. ► System stability increases with the capacity of the SMES. ► System stability decreases with the penetration of wind power. ► Together with the cost function, the coil size is optimized. -- Abstract: This paper presents a stochastic-based approach to evaluate the probabilistic transient stability index of a power system incorporating a wind farm and SMES. Uncertain factors include both the sequence of disturbances in the power grid and the stochastic generation of the wind farm. The spectrum of grid disturbances (the fault type, the fault location, the fault clearing time and the automatic reclosing process), with their probabilities of occurrence, is used to calculate the probability indices, while the wind speed statistics and parameters of the wind generator are used in a Monte Carlo simulation to generate samples for the studies. With the proposed method, system stability is "measured". A quantitative relationship between penetration level, SMES coil size and system stability is established. Considering the stability versus coil size to be the production curve, together with the cost function, the coil size is optimized economically.

  4. Probabilistic assessment of power system transient stability incorporating SMES

    International Nuclear Information System (INIS)

    Fang, Jiakun; Yao, Wei; Wen, Jinyu; Cheng, Shijie; Tang, Yuejin; Cheng, Zhuo

    2013-01-01

    Highlights: ► Probabilistic study of power system with wind farm and SMES is proposed. ► Quantitative relationship between system stability and SMES capacity is given. ► System stability increases with the capacity of the SMES. ► System stability decreases with the penetration of wind power. ► Together with the cost function, the coil size is optimized. -- Abstract: This paper presents a stochastic-based approach to evaluate the probabilistic transient stability index of a power system incorporating a wind farm and SMES. Uncertain factors include both the sequence of disturbances in the power grid and the stochastic generation of the wind farm. The spectrum of grid disturbances (the fault type, the fault location, the fault clearing time and the automatic reclosing process), with their probabilities of occurrence, is used to calculate the probability indices, while the wind speed statistics and parameters of the wind generator are used in a Monte Carlo simulation to generate samples for the studies. With the proposed method, system stability is "measured". A quantitative relationship between penetration level, SMES coil size and system stability is established. Considering the stability versus coil size to be the production curve, together with the cost function, the coil size is optimized economically.
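
    A heavily simplified sketch of the Monte Carlo idea, sampling a Weibull-distributed wind speed and a discrete spectrum of fault clearing times and counting the fraction of stable trials, is given below. The stability criterion here is a placeholder standing in for the time-domain simulation used in the paper, and all numbers are assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
n_trials = 50_000

# Sample uncertain factors: Weibull wind speed and fault clearing time (illustrative).
wind_speed = 8.0 * rng.weibull(2.0, n_trials)          # scale 8 m/s, shape 2
clearing_time = rng.choice([0.10, 0.15, 0.25], n_trials, p=[0.6, 0.3, 0.1])

# Placeholder stability criterion standing in for a time-domain simulation:
# faster clearing and moderate wind output are assumed to keep the system stable.
critical_time = 0.22 - 0.004 * np.clip(wind_speed - 10.0, 0.0, None)
stable = clearing_time < critical_time

print("probabilistic transient stability index:", stable.mean())
```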

  5. On-line Dynamic Security Assessment in Power Systems

    DEFF Research Database (Denmark)

    Weckesser, Johannes Tilman Gabriel

    …and solar radiation. Moreover, ongoing research suggests that demand response will be introduced to maintain power balance between generation and consumption at all times. Due to these changes the operating point of the power system will be less predictable and today’s stability and security assessment … for early prediction of critical voltage sags is described. The method’s performance is compared to other prediction approaches. The results show that the proposed method succeeds in predicting critically low voltage sags early, accurately and consistently. An efficient on-line DSA not only identifies …

  6. Fire models for assessment of nuclear power plant fires

    International Nuclear Information System (INIS)

    Nicolette, V.F.; Nowlen, S.P.

    1989-01-01

    This paper reviews the state-of-the-art in available fire models for the assessment of nuclear power plant fires. The advantages and disadvantages of three basic types of fire models (zone, field, and control volume) and Sandia's experience with these models are discussed. It is shown that the type of fire model selected to solve a particular problem should be based on the information that is required. Areas of concern which relate to all nuclear power plant fire models are identified. 17 refs., 6 figs

  7. Observer variability in the assessment of type and dysplasia of colorectal adenomas, analyzed using kappa statistics

    DEFF Research Database (Denmark)

    Jensen, P; Krogsgaard, M R; Christiansen, J

    1995-01-01

    … of adenomas were assessed twice by three experienced pathologists, with an interval of two months. Results were analyzed using kappa statistics. RESULTS: For agreement between first and second assessment (both type and grade of dysplasia), kappa values for the three specialists were 0.5345, 0.9022, and 0… The kappa values for Observer A vs. B and Observer C vs. B were 0.3480 and 0.3770, respectively (both type and dysplasia). Values for type were better than for dysplasia, but agreement was only fair to moderate. CONCLUSION: The intraobserver agreement was moderate to almost perfect, but the interobserver agreement was only fair to moderate. A simpler classification system or a centralization of assessments would probably increase kappa values.
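
    Cohen's kappa, the chance-corrected agreement measure used throughout this record, can be computed directly with scikit-learn. The ratings below are hypothetical and serve only to show the call; they are not the study's data.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical adenoma type ratings by two pathologists (not the study data).
observer_a = ["tubular", "villous", "tubulovillous", "tubular", "villous", "tubular"]
observer_b = ["tubular", "tubulovillous", "tubulovillous", "tubular", "villous", "villous"]

kappa = cohen_kappa_score(observer_a, observer_b)
print(f"inter-observer kappa: {kappa:.3f}")   # chance-corrected agreement
```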

  8. Planck 2013 results. XXI. All-sky Compton parameter power spectrum and high-order statistics

    CERN Document Server

    Ade, P.A.R.; Armitage-Caplan, C.; Arnaud, M.; Ashdown, M.; Atrio-Barandela, F.; Aumont, J.; Baccigalupi, C.; Banday, A.J.; Barreiro, R.B.; Bartlett, J.G.; Battaner, E.; Benabed, K.; Benoit, A.; Benoit-Levy, A.; Bernard, J.P.; Bersanelli, M.; Bielewicz, P.; Bobin, J.; Bock, J.J.; Bonaldi, A.; Bond, J.R.; Borrill, J.; Bouchet, F.R.; Bridges, M.; Bucher, M.; Burigana, C.; Butler, R.C.; Cardoso, J.F.; Carvalho, P.; Catalano, A.; Challinor, A.; Chamballu, A.; Chiang, L.Y.; Chiang, H.C.; Christensen, P.R.; Church, S.; Clements, D.L.; Colombi, S.; Colombo, L.P.L.; Comis, B.; Couchot, F.; Coulais, A.; Crill, B.P.; Curto, A.; Cuttaia, F.; Da Silva, A.; Danese, L.; Davies, R.D.; Davis, R.J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Delouis, J.M.; Desert, F.X.; Dickinson, C.; Diego, J.M.; Dolag, K.; Dole, H.; Donzelli, S.; Dore, O.; Douspis, M.; Dupac, X.; Efstathiou, G.; Ensslin, T.A.; Eriksen, H.K.; Finelli, F.; Flores-Cacho, I.; Forni, O.; Frailis, M.; Franceschi, E.; Galeotta, S.; Ganga, K.; Genova-Santos, R.T.; Giard, M.; Giardino, G.; Giraud-Heraud, Y.; Gonzalez-Nuevo, J.; Gorski, K.M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Hansen, F.K.; Hanson, D.; Harrison, D.; Henrot-Versille, S.; Hernandez-Monteagudo, C.; Herranz, D.; Hildebrandt, S.R.; Hivon, E.; Hobson, M.; Holmes, W.A.; Hornstrup, A.; Hovest, W.; Huffenberger, K.M.; Hurier, G.; Jaffe, T.R.; Jaffe, A.H.; Jones, W.C.; Juvela, M.; Keihanen, E.; Keskitalo, R.; Kisner, T.S.; Kneissl, R.; Knoche, J.; Knox, L.; Kunz, M.; Kurki-Suonio, H.; Lacasa, F.; Lagache, G.; Lahteenmaki, A.; Lamarre, J.M.; Lasenby, A.; Laureijs, R.J.; Lawrence, C.R.; Leahy, J.P.; Leonardi, R.; Leon-Tavares, J.; Lesgourgues, J.; Liguori, M.; Lilje, P.B.; Linden-Vornle, M.; Lopez-Caniego, M.; Lubin, P.M.; Macias-Perez, J.F.; Maffei, B.; Maino, D.; Mandolesi, N.; Marcos-Caballero, A.; Maris, M.; Marshall, D.J.; Martin, P.G.; Martinez-Gonzalez, E.; Masi, S.; Matarrese, S.; Matthai, F.; Mazzotta, P.; Melchiorri, A.; Melin, J.B.; Mendes, L.; Mennella, A.; Migliaccio, M.; Mitra, S.; Miville-Deschenes, M.A.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C.B.; Norgaard-Nielsen, H.U.; Noviello, F.; Novikov, D.; Novikov, I.; Osborne, S.; Oxborrow, C.A.; Paci, F.; Pagano, L.; Pajot, F.; Paoletti, D.; Partridge, B.; Pasian, F.; Patanchon, G.; Perdereau, O.; Perotto, L.; Perrotta, F.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Popa, L.; Poutanen, T.; Pratt, G.W.; Prezeau, G.; Prunet, S.; Puget, J.L.; Rachen, J.P.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Ricciardi, S.; Riller, T.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Rossetti, M.; Roudier, G.; Rubino-Martin, J.A.; Rusholme, B.; Sandri, M.; Santos, D.; Savini, G.; Scott, D.; Seiffert, M.D.; Shellard, E.P.S.; Spencer, L.D.; Starck, J.L.; Stolyarov, V.; Stompor, R.; Sudiwala, R.; Sunyaev, R.; Sureau, F.; Sutton, D.; Suur-Uski, A.S.; Sygnet, J.F.; Tauber, J.A.; Tavagnacco, D.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Tuovinen, J.; Umana, G.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Varis, J.; Vielva, P.; Villa, F.; Vittorio, N.; Wade, L.A.; Wandelt, B.D.; White, S.D.M.; Yvon, D.; Zacchei, A.; Zonca, A.

    2014-01-01

    We have constructed the first all-sky map of the thermal Sunyaev-Zeldovich (tSZ) effect by applying specifically tailored component separation algorithms to the 100 to 857 GHz frequency channel maps from the Planck survey. These maps show an obvious galaxy cluster tSZ signal that is well matched with blindly detected clusters in the Planck SZ catalogue. To characterize the signal in the tSZ map we have computed its angular power spectrum. At large angular scales, Galactic thermal dust emission dominates the contamination, while at small angular scales ($\\ell \\gtrsim 500$) the clustered Cosmic Infrared Background (CIB) and residual point sources are the major contaminants. These foregrounds are carefully modelled and subtracted. We measure the tSZ power spectrum over angular scales, $0.17^{\\circ} \\lesssim \\theta \\lesssim 3.0^{\\circ}$, that were previously unexplored. The measured tSZ power spectrum is consistent with that expected from the Planck catalogue of SZ sources, with additional clear evidence of signal from unresolved clusters and, potentially, diffuse warm baryons. We use the tSZ power spectrum to ...
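
    Computing an angular power spectrum from a full-sky HEALPix map is a short operation in healpy, as the generic sketch below shows. It omits the component separation, masking and foreground subtraction that the Planck analysis requires, and the input spectrum is arbitrary.

```python
import numpy as np
import healpy as hp

# Toy input spectrum and a random full-sky map drawn from it (not Planck data).
lmax = 256
ell = np.arange(lmax + 1)
cl_in = 1.0 / (ell + 10.0) ** 2          # arbitrary smooth spectrum
sky_map = hp.synfast(cl_in, nside=128, lmax=lmax)

# Measured angular power spectrum of the map.
cl_out = hp.anafast(sky_map, lmax=lmax)
print("input vs recovered C_ell at ell=100:", cl_in[100], cl_out[100])
```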

  9. Inferential statistics, power estimates, and study design formalities continue to suppress biomedical innovation

    OpenAIRE

    Kern, Scott E.

    2014-01-01

    Innovation is the direct intended product of certain styles in research, but not of others. Fundamental conflicts between descriptive vs inferential statistics, deductive vs inductive hypothesis testing, and exploratory vs pre-planned confirmatory research designs have been played out over decades, with winners and losers and consequences. Longstanding warnings from both academics and research-funding interests have failed to influence effectively the course of these battles. The NIH publicly...

  10. Discriminatory power of water polo game-related statistics at the 2008 Olympic Games.

    Science.gov (United States)

    Escalante, Yolanda; Saavedra, Jose M; Mansilla, Mirella; Tella, Victor

    2011-02-01

    The aims of this study were (1) to compare water polo game-related statistics by context (winning and losing teams) and sex (men and women), and (2) to identify characteristics discriminating the performances for each sex. The game-related statistics of the 64 matches (44 men's and 20 women's) played in the final phase of the Olympic Games held in Beijing in 2008 were analysed. Unpaired t-tests compared winners and losers and men and women, and confidence intervals and effect sizes of the differences were calculated. The results were subjected to a discriminant analysis to identify the differentiating game-related statistics of the winning and losing teams. The results showed the differences between winning and losing men's teams to be in both defence and offence, whereas in women's teams they were only in offence. In men's games, passing (assists), aggressive play (exclusions), centre position effectiveness (centre shots), and goalkeeper defence (goalkeeper-blocked 5-m shots) predominated, whereas in women's games the play was more dynamic (possessions). The variable that most discriminated performance in men was goalkeeper-blocked shots, and in women shooting effectiveness (shots). These results should help coaches when planning training and competition.

  11. The power of 41%: A glimpse into the life of a statistic.

    Science.gov (United States)

    Tanis, Justin

    2016-01-01

    "Forty-one percent?" the man said with anguish on his face as he addressed the author, clutching my handout. "We're talking about my granddaughter here." He was referring to the finding from the National Transgender Discrimination Survey (NTDS) that 41% of 6,450 respondents said they had attempted suicide at some point in their lives. The author had passed out the executive summary of the survey's findings during a panel discussion at a family conference to illustrate the critical importance of acceptance of transgender people. During the question and answer period, this gentleman rose to talk about his beloved 8-year-old granddaughter who was in the process of transitioning socially from male to female in her elementary school. The statistics that the author was citing were not just numbers to him; and he wanted strategies-effective ones-to keep his granddaughter alive and thriving. The author has observed that the statistic about suicide attempts has, in essence, developed a life of its own. It has had several key audiences-academics and researchers, public policymakers, and members of the community, particularly transgender people and our families. This article explores some of the key takeaways from the survey and the ways in which the 41% statistic has affected conversations about the injustices transgender people face and the importance of family and societal acceptance. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  12. Integrating Expert Knowledge with Statistical Analysis for Landslide Susceptibility Assessment at Regional Scale

    Directory of Open Access Journals (Sweden)

    Christos Chalkias

    2016-03-01

    In this paper, an integrated landslide susceptibility model combining expert-based and bivariate statistical analysis (Landslide Susceptibility Index, LSI) approaches is presented. Factors related to the occurrence of landslides, such as elevation, slope angle, slope aspect, lithology, land cover, Mean Annual Precipitation (MAP) and Peak Ground Acceleration (PGA), were analyzed within a GIS environment. This integrated model produced a landslide susceptibility map which categorized the study area according to the probability level of landslide occurrence. The accuracy of the final map was evaluated by Receiver Operating Characteristics (ROC) analysis based on an independent (validation) dataset of landslide events. The prediction ability was found to be 76%, revealing that the integration of statistical analysis with human expertise can provide an acceptable landslide susceptibility assessment at regional scale.

  13. Assessment of the GPC Control Quality Using Non–Gaussian Statistical Measures

    Directory of Open Access Journals (Sweden)

    Domański Paweł D.

    2017-06-01

    This paper presents an alternative approach to the task of control performance assessment. Various statistical measures based on Gaussian and non-Gaussian distribution functions are evaluated. The analysis starts with the review of control error histograms followed by their statistical analysis using probability distribution functions. Simulation results obtained for a control system with the generalized predictive controller algorithm are considered. The proposed approach using Cauchy and Lévy α-stable distributions shows robustness against disturbances and enables effective control loop quality evaluation. Tests of the predictive algorithm prove its ability to detect the impact of the main controller parameters, such as the model gain, the dynamics or the prediction horizon.
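
    The contrast between Gaussian and heavy-tailed descriptions of the control error can be illustrated by fitting both to the same error sequence; here a Cauchy fit stands in for the paper's Cauchy and Lévy α-stable measures (scipy.stats.levy_stable also offers a fit method, but it is considerably slower). The error data below are synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Hypothetical control-error sequence: mostly Gaussian noise plus occasional
# disturbance-driven outliers, which inflate Gaussian-based quality indices.
error = np.concatenate([rng.normal(0.0, 1.0, 9_500),
                        rng.standard_cauchy(500) * 0.5])

mu, sigma = stats.norm.fit(error)        # Gaussian benchmark (outlier-sensitive)
loc, scale = stats.cauchy.fit(error)     # heavy-tailed alternative

print(f"Gaussian fit : mean={mu:.2f}, std={sigma:.2f}")
print(f"Cauchy fit   : loc={loc:.2f}, scale={scale:.2f}")
```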

  14. Micro and mini hydroelectric power assessment in Uruguay

    International Nuclear Information System (INIS)

    Nunes, V.; Genta, J.L.

    1996-01-01

    The School of Engineering in Montevideo, Uruguay, within the framework of Agreements made with the National Utility, has carried out an assessment of the potential and studies of the feasibility of the use of renewable energy for the generation of electrical power, both at the industrial level and the autonomous level for rural electrification. Original assessment methodologies were developed, including calculation tools which allow, for example, the analysis of historical meteorological data, the calculation of the available energy in different kinds of energy generators, and also the simulation of the operation and design of autonomous systems with established load requirements and service quality. In the micro and mini hydropower assessment, the main emphasis was placed on the census of potential users and the preliminary analysis of representative places for the different technical solutions adequate to the variety of topographic conditions and load requirements. For power above 1 MW and up to 5 MW, the generating potential was assessed all over the country. If power lower than 1 MW or lower than 100 kW (mini and micro) is considered, the information available in maps with contour lines, including those of a 1:50,000 scale, is not enough to identify the most adequate places. Instead, knowledge of the place is indispensable in these cases. A preliminary plan of several installations was worked out. (Author)

  15. Efforts to utilize risk assessment at nuclear power plants

    International Nuclear Information System (INIS)

    Narumiya, Yoshiyuki

    2015-01-01

    Risk assessment means using the outputs obtained through risk identification and risk analysis (risk information) and then determining the response policy by comparing these outputs with risk judgement criteria. This paper discusses the multifaceted use of risk information and its significance, and the challenges to its wider adoption. As lessons learnt from past accidents, this paper takes up the severe accidents at the Three Mile Island, Chernobyl, and Fukushima Daiichi power stations, and discusses their causes and the factors behind their escalation. In particular, at the Fukushima Daiichi Nuclear Power Station, important lessons were the shortage of measures against the combination of earthquake and tsunami, and the insufficient use of risk assessment. This paper classifies risk assessment from the viewpoint of risk information, and presents the contents and indices for risk reduction trends, risk increase trends, and measures graded by risk importance. Regarding the benefits of risk assessment activities, this paper refers to application cases of probabilistic risk assessment (PRA) reported by the IAEA, and summarizes the application activities of 10 risk indices by classifying them into safety benefits and operational benefits. For example, for flexible Allowed Outage Time (AOT), the avoidance of plant shutdown and the improved flexibility of maintenance scheduling correspond to these benefits, respectively. (A.O.)

  16. Statistical Approaches Used to Assess the Equity of Access to Food Outlets: A Systematic Review.

    Science.gov (United States)

    Lamb, Karen E; Thornton, Lukar E; Cerin, Ester; Ball, Kylie

    2015-01-01

    Inequalities in eating behaviours are often linked to the types of food retailers accessible in neighbourhood environments. Numerous studies have aimed to identify if access to healthy and unhealthy food retailers is socioeconomically patterned across neighbourhoods, and thus a potential risk factor for dietary inequalities. Existing reviews have examined differences between methodologies, particularly focussing on neighbourhood and food outlet access measure definitions. However, no review has informatively discussed the suitability of the statistical methodologies employed; a key issue determining the validity of study findings. Our aim was to examine the suitability of statistical approaches adopted in these analyses. Searches were conducted for articles published from 2000-2014. Eligible studies included objective measures of the neighbourhood food environment and neighbourhood-level socio-economic status, with a statistical analysis of the association between food outlet access and socio-economic status. Fifty-four papers were included. Outlet accessibility was typically defined as the distance to the nearest outlet from the neighbourhood centroid, or as the number of food outlets within a neighbourhood (or buffer). To assess if these measures were linked to neighbourhood disadvantage, common statistical methods included ANOVA, correlation, and Poisson or negative binomial regression. Although all studies involved spatial data, few considered spatial analysis techniques or spatial autocorrelation. With advances in GIS software, sophisticated measures of neighbourhood outlet accessibility can be considered. However, approaches to statistical analysis often appear less sophisticated. Care should be taken to consider assumptions underlying the analysis and the possibility of spatially correlated residuals which could affect the results.

  17. Statistical Approaches Used to Assess the Equity of Access to Food Outlets: A Systematic Review

    Directory of Open Access Journals (Sweden)

    Karen E. Lamb

    2015-07-01

    Background: Inequalities in eating behaviours are often linked to the types of food retailers accessible in neighbourhood environments. Numerous studies have aimed to identify if access to healthy and unhealthy food retailers is socioeconomically patterned across neighbourhoods, and thus a potential risk factor for dietary inequalities. Existing reviews have examined differences between methodologies, particularly focussing on neighbourhood and food outlet access measure definitions. However, no review has informatively discussed the suitability of the statistical methodologies employed; a key issue determining the validity of study findings. Our aim was to examine the suitability of statistical approaches adopted in these analyses. Methods: Searches were conducted for articles published from 2000-2014. Eligible studies included objective measures of the neighbourhood food environment and neighbourhood-level socio-economic status, with a statistical analysis of the association between food outlet access and socio-economic status. Results: Fifty-four papers were included. Outlet accessibility was typically defined as the distance to the nearest outlet from the neighbourhood centroid, or as the number of food outlets within a neighbourhood (or buffer). To assess if these measures were linked to neighbourhood disadvantage, common statistical methods included ANOVA, correlation, and Poisson or negative binomial regression. Although all studies involved spatial data, few considered spatial analysis techniques or spatial autocorrelation. Conclusions: With advances in GIS software, sophisticated measures of neighbourhood outlet accessibility can be considered. However, approaches to statistical analysis often appear less sophisticated. Care should be taken to consider assumptions underlying the analysis and the possibility of spatially correlated residuals which could affect the results.

  18. Safety assessment and verification for nuclear power plants. Safety guide

    International Nuclear Information System (INIS)

    2001-01-01

    This publication supports the Safety Requirements on the Safety of Nuclear Power Plants: Design. This Safety Guide was prepared on the basis of a systematic review of all the relevant publications including the Safety Fundamentals, Safety of Nuclear Power Plants: Design, current and ongoing revisions of other Safety Guides, INSAG reports and other publications that have addressed the safety of nuclear power plants. This Safety Guide also provides guidance for Contracting Parties to the Convention on Nuclear Safety in meeting their obligations under Article 14 on Assessment and Verification of Safety. The Safety Requirements publication entitled Safety of Nuclear Power Plants: Design states that a comprehensive safety assessment and an independent verification of the safety assessment shall be carried out before the design is submitted to the regulatory body. This publication provides guidance on how this requirement should be met. This Safety Guide provides recommendations to designers for carrying out a safety assessment during the initial design process and design modifications, as well as to the operating organization in carrying out independent verification of the safety assessment of new nuclear power plants with a new or already existing design. The recommendations for performing a safety assessment are suitable also as guidance for the safety review of an existing plant. The objective of reviewing existing plants against current standards and practices is to determine whether there are any deviations which would have an impact on plant safety. The methods and the recommendations of this Safety Guide can also be used by regulatory bodies for the conduct of the regulatory review and assessment. Although most recommendations of this Safety Guide are general and applicable to all types of nuclear reactors, some specific recommendations and examples apply mostly to water cooled reactors. Terms such as 'safety assessment', 'safety analysis' and 'independent

  19. Statistical issues in biological radiation dosimetry for risk assessment using stable chromosome aberrations

    International Nuclear Information System (INIS)

    Cologne, J.B.; Preston, D.L.

    1998-01-01

    Biological dosimeters are useful for epidemiologic risk assessment in populations exposed to catastrophic nuclear events and as a means of validating physical dosimetry in radiation workers. Application requires knowledge of the magnitude of uncertainty in the biological dose estimates and an understanding of potential statistical pitfalls arising from their use. This paper describes the statistical aspects of biological dosimetry in general and presents a detailed analysis in the specific case of dosimetry for risk assessment using stable chromosome aberration frequency. Biological dose estimates may be obtained from a dose-response curve, but negative estimates can result and adjustment must be made for regression bias due to imprecise estimation when the estimates are used in regression analyses. Posterior-mean estimates, derived as the mean of the distribution of true doses compatible with a given value of the biological endpoint, have several desirable properties: they are nonnegative, less sensitive to extreme skewness in the true dose distribution, and implicitly adjusted to avoid regression bias. The methods necessitate approximating the true-dose distribution in the population in which biological dosimetry is being applied, which calls for careful consideration of this distribution through other information. An important question addressed here is to what extent the methods are robust to misspecification of this distribution, because in many applications of biological dosimetry it cannot be characterized well. The findings suggest that dosimetry based solely on stable chromosome aberration frequency may be useful for population-based risk assessment
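
    The posterior-mean idea described above can be sketched numerically. The snippet below assumes a hypothetical linear-quadratic dose response for stable aberration frequency, a Poisson likelihood for the scored aberration count, and a lognormal-shaped approximation of the population true-dose distribution; all parameter values are illustrative, not those of the study.

```python
# Minimal sketch of a posterior-mean dose estimate from an aberration count,
# assuming (hypothetically) a linear-quadratic dose response and a lognormal
# population distribution of true doses. Numbers are illustrative only.
import numpy as np

def expected_aberrations(dose_gy, cells=500, a=0.02, b=0.06, c=0.005):
    """Expected stable-aberration count in `cells` scored cells at a given dose."""
    return cells * (c + a * dose_gy + b * dose_gy**2)

def posterior_mean_dose(observed_count, cells=500, n_grid=2000):
    """Mean of the true-dose distribution compatible with the observed count."""
    dose_grid = np.linspace(0.0, 6.0, n_grid)            # Gy
    # Prior: lognormal-shaped approximation to the population true-dose distribution
    prior = np.exp(-(np.log(dose_grid + 1e-6) - np.log(0.5))**2 / (2 * 0.8**2))
    prior /= prior.sum()
    # Likelihood: Poisson count of aberrations given dose
    lam = expected_aberrations(dose_grid, cells)
    log_like = observed_count * np.log(lam) - lam
    post = prior * np.exp(log_like - log_like.max())
    post /= post.sum()
    return float(np.sum(dose_grid * post))               # nonnegative by construction

print(posterior_mean_dose(observed_count=40))
```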

  20. Self-assessment of operational safety for nuclear power plants

    International Nuclear Information System (INIS)

    1999-12-01

    Self-assessment processes have been continuously developed by nuclear organizations, including nuclear power plants. Currently, the nuclear industry and governmental organizations are showing an increasing interest in the implementation of this process as an effective way for improving safety performance. Self-assessment involves the use of different types of tools and mechanisms to assist the organizations in assessing their own safety performance against given standards. This helps to enhance the understanding of the need for improvements, the feeling of ownership in achieving them and the safety culture as a whole. Although the primary beneficiaries of the self-assessment process are the plant and operating organization, the results of the self-assessments are also used, for example, to increase the confidence of the regulator in the safe operation of an installation, and could be used to assist in meeting obligations under the Convention on Nuclear Safety. Such considerations influence the form of assessment, as well as the type and detail of the results. The concepts developed in this report present the basic approach to self-assessment, taking into consideration experience gained during Operational Safety Review Team (OSART) missions, from organizations and utilities which have successfully implemented parts of a self-assessment programme and from meetings organized to discuss the subject. This report will be used in IAEA sponsored workshops and seminars on operational safety that include the topic of self-assessment

  1. Power station impacts: socio-economic impact assessment

    International Nuclear Information System (INIS)

    Glasson, John; Elson, Martin; Barrett, Brendan; Wee, D. Van der

    1987-01-01

    The aim of this study is to assess the local social and economic impacts of a proposed nuclear power station development at Hinkley Point in Somerset. The proposed development, Hinkley Point C, would be an addition to the existing Hinkley Point A Magnox station, commissioned in 1965, and the Hinkley Point B Advanced Gas Cooled Reactor (AGR) station, commissioned in 1976. It is hoped that the study will be of assistance to the CEGB, the Somerset County and District Councils and other agencies in their studies of the proposed development. In addition, the study seeks to apply and further develop the methodology and results from previous studies by the Power Station Impacts (PSI) team for predicting the social and economic effects of proposed power station developments on their localities. (author)

  2. Situation Aware Assessment of Regulating Power Need and Resource

    DEFF Research Database (Denmark)

    Heussen, Kai

    2009-01-01

    Distributed generation and renewable energy sources are both a new disturbance and a new regulation resource. Which it is depends to a large extent on the facilitation of control capabilities that, for example, modern wind turbines can provide. Most renewable energy sources are quite unlike classical power plants, but have the capability to provide a number of ancillary services. It is envisioned that wind power may at times provide a certain share of system stabilization, but it must also be seen that this contribution is limited to only a part of the required functions and that it fluctuates with the available wind. The approach proposed in this paper uses a functional classification to sort out the control requirements of a power system with a high share of fluctuating renewable and distributed energy sources and aims to combine it with a structured quantitative assessment.

  3. Assessment of water quality of a river-dominated estuary with hydrochemical parameters: A statistical approach.

    Digital Repository Service at National Institute of Oceanography (India)

    Padma, P.; Sheela, V.S.; Suryakumari, S.; Jayalakshmy, K.V.; Nair, S.M.; Kumar, N.C.

    Published in Water Qual Expo Health, DOI 10.1007/s12403-014-0115-9.

  4. Assessment of defence in depth for nuclear power plants

    International Nuclear Information System (INIS)

    2005-01-01

    Defence in depth is a comprehensive approach to safety that has been developed by nuclear power experts to ensure with high confidence that the public and the environment are protected from any hazards posed by the use of nuclear power for the generation of electricity. The concepts of defence in depth and safety culture have served the nuclear power industry well as a basic philosophy for the safe design and operation of nuclear power plants. Properly applied, defence in depth ensures that no single human error or equipment failure at one level of defence, nor even a combination of failures at more than one level of defence, propagates to jeopardize defence in depth at the subsequent level or leads to harm to the public or the environment. The importance of the concept of defence in depth is underlined in IAEA Safety Standards, in particular in the requirements set forth in the Safety Standards: Safety of Nuclear Power Plants: Design (NS-R-1) and Safety Assessment and Verification for Nuclear Power Plants (NS-G-1.2). A specific report, Defence in Depth in Nuclear Safety (INSAG-10), describes the objectives, strategy, implementation and future development in the area of defence in depth in nuclear and radiation safety. In the report Basic Safety Principles for Nuclear Power Plants (INSAG-12), defence in depth is recognized as one of the fundamental safety principles that underlie the safety of nuclear power plants. In consonance with those high level publications, this Safety Report provides more specific technical information on the implementation of this concept in the siting, design, construction and operation of nuclear power plants. It describes a method for comprehensive and balanced review of the provisions required for implementing defence in depth in existing plants. This publication is intended to provide guidance primarily for the self-assessment by plant operators of the comprehensiveness and quality of defence in depth provisions. It can be used

  5. Thermal impact assessment of multi power plant operations on estuaries

    International Nuclear Information System (INIS)

    Eraslan, A.H.; Kim, K.H.; Harris, J.L.

    1977-01-01

    The assessment of the thermal impact of multi power plant operations on large estuaries requires careful consideration of the problems associated with: re-entrainment, re-circulation, thermal interaction, delay in the attainment of thermal equilibrium state, and uncertainty in specifying open boundaries and open boundary conditions of the regions, which are critically important in the analysis of the thermal conditions in receiving water bodies with tidal dominated, periodically reversing flow conditions. The results of an extensive study in the Hudson River at Indian Point, 42 miles upstream of the ocean end at the Battery, concluded that the tidal-transient, multi-dimensional discrete-element (UTA) thermal transport models (ESTONE, FLOTWO, TMPTWO computer codes) and the near-field far-field zone-matching methodology can be employed with a high degree of reliability in the assessment of the thermal impact of multi power plant operations on tidal dominated estuaries

  6. Pooling of cross-cultural PRO data in multinational clinical trials: how much can poor measurement affect statistical power?

    Science.gov (United States)

    Regnault, Antoine; Hamel, Jean-François; Patrick, Donald L

    2015-02-01

    Cultural differences and/or poor linguistic validation of patient-reported outcome (PRO) instruments may result in differences in the assessment of the targeted concept across languages. In the context of multinational clinical trials, these measurement differences may add noise and potentially measurement bias to treatment effect estimation. Our objective was to explore the potential effect on treatment effect estimation of the "contamination" of a cultural subgroup by a flawed PRO measurement. We ran a simulation exercise in which the distribution of the score in the overall sample was considered a mixture of two normal distributions: a standard normal distribution was assumed in a "main" subgroup and a normal distribution which differed either in mean (bias) or in variance (noise) in a "contaminated" subgroup (the subgroup with potential flaws in the PRO measurement). The observed power was compared to the expected power (i.e., the power that would have been observed if the subgroup had not been contaminated). Even if differences between the expected and observed power were small, some substantial differences were obtained (up to a 0.375 point drop in power). No situation was systematically protected against loss of power. The impact of poor PRO measurement in a cultural subgroup may induce a notable drop in the study power and consequently reduce the chance of showing an actual treatment effect. These results illustrate the importance of the efforts to optimize conceptual and linguistic equivalence of PRO measures when pooling data in international clinical trials.
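
    A minimal version of the simulation exercise can be written as follows; the effect size, subgroup fraction and contamination settings are illustrative choices, not the values used by the authors.

```python
# Sketch of the simulation idea: power of a two-arm comparison when one
# cultural subgroup's PRO scores are "contaminated" by bias or extra noise.
import numpy as np
from scipy import stats

def simulated_power(delta=0.3, n_per_arm=150, contam_frac=0.25,
                    contam_bias=0.0, contam_sd=2.0, n_sim=5000, seed=1):
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(n_sim):
        def draw(shift):
            # Mixture: main subgroup N(shift, 1), contaminated subgroup
            # N(shift + bias, contam_sd)
            contaminated = rng.random(n_per_arm) < contam_frac
            x = rng.normal(shift, 1.0, n_per_arm)
            x[contaminated] = rng.normal(shift + contam_bias, contam_sd,
                                         contaminated.sum())
            return x
        control, treated = draw(0.0), draw(delta)
        _, p = stats.ttest_ind(treated, control)
        rejections += p < 0.05
    return rejections / n_sim

print("expected (no contamination):", simulated_power(contam_frac=0.0))
print("observed  (noisy subgroup): ", simulated_power(contam_sd=2.0))
```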

  7. Assessment of ELF magnetic fields produced by independent power lines

    International Nuclear Information System (INIS)

    Lucca, G.

    2008-01-01

    This paper addresses the problem of assessing the ELF (extremely low frequency) magnetic fields produced in an area characterised by the presence of more than one independent power line. The use of the incoherent summation of the single contributions as an advantageous estimator of the total magnetic field is proposed and justified by means of a heuristic procedure. This kind of approach can be seen as a useful and practical tool for environmental impact analysis and for assessing long-term human exposure to ELF magnetic fields. (authors)
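
    The proposed estimator is simply the root-sum-of-squares of the single-line contributions; a minimal sketch, assuming illustrative RMS field values for each independent line, is given below.

```python
# Incoherent summation of ELF magnetic-field contributions from independent
# power lines: the total is estimated as the root-sum-of-squares of the
# single-line RMS values (values below are illustrative).
import math

def incoherent_total(rms_fields_uT):
    """RSS estimator of the total field from statistically independent lines."""
    return math.sqrt(sum(b * b for b in rms_fields_uT))

# e.g. three independent lines contributing 0.8, 0.5 and 0.3 microtesla
print(incoherent_total([0.8, 0.5, 0.3]))   # ~0.99 uT, below the 1.6 uT coherent sum
```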

  8. Independent assessment to continue improvement: Implementing statistical process control at the Hanford Site

    International Nuclear Information System (INIS)

    Hu, T.A.; Lo, J.C.

    1994-11-01

    A Quality Assurance independent assessment has brought about continued improvement in the PUREX Plant surveillance program at the Department of Energy's Hanford Site. After the independent assessment, Quality Assurance personnel were closely involved in improving the surveillance program, specifically regarding storage tank monitoring. The independent assessment activities included reviewing procedures, analyzing surveillance data, conducting personnel interviews, and communicating with management. Process improvement efforts included: (1) designing data collection methods; (2) gaining concurrence between engineering and management, (3) revising procedures; and (4) interfacing with shift surveillance crews. Through this process, Statistical Process Control (SPC) was successfully implemented and surveillance management was improved. The independent assessment identified several deficiencies within the surveillance system. These deficiencies can be grouped into two areas: (1) data recording and analysis and (2) handling off-normal conditions. By using several independent assessment techniques, Quality Assurance was able to point out program weakness to senior management and present suggestions for improvements. SPC charting, as implemented by Quality Assurance, is an excellent tool for diagnosing the process, improving communication between the team members, and providing a scientific database for management decisions. In addition, the surveillance procedure was substantially revised. The goals of this revision were to (1) strengthen the role of surveillance management, engineering and operators and (2) emphasize the importance of teamwork for each individual who performs a task. In this instance we believe that the value independent assessment adds to the system is the continuous improvement activities that follow the independent assessment. Excellence in teamwork between the independent assessment organization and the auditee is the key to continuing improvement

  9. Quantitative assessment of aquatic impacts of power plants

    Energy Technology Data Exchange (ETDEWEB)

    McKenzie, D.H.; Arnold, E.M.; Skalski, J.R.; Fickeisen, D.H.; Baker, K.S.

    1979-08-01

    Progress is reported in a continuing study of the design and analysis of aquatic environmental monitoring programs for assessing the impacts of nuclear power plants. Analysis of data from Calvert Cliffs, Pilgrim, and San Onofre nuclear power plants confirmed the generic applicability of the control-treatment pairing design suggested by McKenzie et al. (1977). Substantial progress was made on the simulation model evaluation task. A process notebook was compiled in which each model equation was translated into a standardized notation. Individual model testing and evaluating was started. The Aquatic Generalized Environmental Impact Simulator (AGEIS) was developed and will be tested using data from Lake Keowee, South Carolina. Further work is required to test the various models and perfect AGEIS for impact analyses at actual power plant sites. Efforts on the hydrologic modeling task resulted in a compendium of models commonly applied to nuclear power plants and the application of two well-received hydrodynamic models to data from the Surry Nuclear Power Plant in Virginia. Conclusions from the study of these models indicate that slight inaccuracies of boundary data have little influence on mass conservation and accurate bathymetry data are necessary for conservation of mass through the model calculations. The hydrologic modeling task provides valuable reference information for model users and monitoring program designers.

  10. Quantitative assessment of aquatic impacts of power plants

    International Nuclear Information System (INIS)

    McKenzie, D.H.; Arnold, E.M.; Skalski, J.R.; Fickeisen, D.H.; Baker, K.S.

    1979-08-01

    Progress is reported in a continuing study of the design and analysis of aquatic environmental monitoring programs for assessing the impacts of nuclear power plants. Analysis of data from Calvert Cliffs, Pilgrim, and San Onofre nuclear power plants confirmed the generic applicability of the control-treatment pairing design suggested by McKenzie et al. (1977). Substantial progress was made on the simulation model evaluation task. A process notebook was compiled in which each model equation was translated into a standardized notation. Individual model testing and evaluating was started. The Aquatic Generalized Environmental Impact Simulator (AGEIS) was developed and will be tested using data from Lake Keowee, South Carolina. Further work is required to test the various models and perfect AGEIS for impact analyses at actual power plant sites. Efforts on the hydrologic modeling task resulted in a compendium of models commonly applied to nuclear power plants and the application of two well-received hydrodynamic models to data from the Surry Nuclear Power Plant in Virginia. Conclusions from the study of these models indicate that slight inaccuracies of boundary data have little influence on mass conservation and accurate bathymetry data are necessary for conservation of mass through the model calculations. The hydrologic modeling task provides valuable reference information for model users and monitoring program designers

  11. Reliability assessment of distribution power systems including distributed generations

    International Nuclear Information System (INIS)

    Megdiche, M.

    2004-12-01

    Power systems have now reached a good level of reliability. Nevertheless, considering the modifications induced by the connection of small independent producers to distribution networks, there is a need to assess the reliability of these new systems. Distribution networks present several functional characteristics, highlighted by a qualitative study of failures, such as loads dispersed at several places, variable topology and electrotechnical phenomena which must be taken into account to model the events that can occur. The adopted reliability calculation method is Monte Carlo simulation, the most powerful and flexible probabilistic method for modelling the complex operation of the distribution system. The first part is devoted to the case of a 20 kV feeder to which a cogeneration unit is connected; here the method was implemented with stochastic Petri net simulation software. The second part relates to the study of a low voltage power system supplied by dispersed generation. Here, the complexity of the events required coding the method in a programming environment allowing the use of power system calculations (load flow, short circuit, load shedding, management of unit powers) in order to analyse the system state after each new event. (author)

  12. Sites and social assessment of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Nemoto, K [Central Research Inst. of Electric Power Industry, Tokyo (Japan)

    1977-09-01

    Nuclear power plant sites in Japan have two features: first, strong expectations for regional development, because depopulated districts are selected for most locations; and second, apprehension among local people for two reasons, namely that nuclear power generation technology has not yet taken root in society and that it involves the handling of radioactive materials. To cope with these problems, the development plan for the regions around reactor sites must be compiled systematically. Its premise is a ''social assessment'' which estimates the economic and social influences and evaluates the merits and demerits of nuclear power plants prior to construction; this is indispensable. The objects of the assessment may be divided as follows: the human effect on individuals, the institutional effect on the local community, the economic effect on the region, and the influence on the country as a whole. The developmental action at a location includes the stages of examination, planning, construction and operation, and three location patterns are recognized according to the emphasized function: improvement of the national economy, upgrading of environmental quality, and highest priority to local welfare. In the process of the assessment, the following points should be noted: each item sometimes requires weighting; a pattern of abandoning the location may exist; and positive and negative effects should be distributed evenly within a triangle whose apexes each represent one of the above three patterns.

  13. The Novel Quantitative Technique for Assessment of Gait Symmetry Using Advanced Statistical Learning Algorithm

    Directory of Open Access Journals (Sweden)

    Jianning Wu

    2015-01-01

    Full Text Available The accurate identification of gait asymmetry is very beneficial to the assessment of at-risk gait in the clinical applications. This paper investigated the application of classification method based on statistical learning algorithm to quantify gait symmetry based on the assumption that the degree of intrinsic change in dynamical system of gait is associated with the different statistical distributions between gait variables from left-right side of lower limbs; that is, the discrimination of small difference of similarity between lower limbs is considered the reorganization of their different probability distribution. The kinetic gait data of 60 participants were recorded using a strain gauge force platform during normal walking. The classification method is designed based on advanced statistical learning algorithm such as support vector machine algorithm for binary classification and is adopted to quantitatively evaluate gait symmetry. The experiment results showed that the proposed method could capture more intrinsic dynamic information hidden in gait variables and recognize the right-left gait patterns with superior generalization performance. Moreover, our proposed techniques could identify the small significant difference between lower limbs when compared to the traditional symmetry index method for gait. The proposed algorithm would become an effective tool for early identification of the elderly gait asymmetry in the clinical diagnosis.

  14. The novel quantitative technique for assessment of gait symmetry using advanced statistical learning algorithm.

    Science.gov (United States)

    Wu, Jianning; Wu, Bin

    2015-01-01

    The accurate identification of gait asymmetry is very beneficial to the assessment of at-risk gait in the clinical applications. This paper investigated the application of classification method based on statistical learning algorithm to quantify gait symmetry based on the assumption that the degree of intrinsic change in dynamical system of gait is associated with the different statistical distributions between gait variables from left-right side of lower limbs; that is, the discrimination of small difference of similarity between lower limbs is considered the reorganization of their different probability distribution. The kinetic gait data of 60 participants were recorded using a strain gauge force platform during normal walking. The classification method is designed based on advanced statistical learning algorithm such as support vector machine algorithm for binary classification and is adopted to quantitatively evaluate gait symmetry. The experiment results showed that the proposed method could capture more intrinsic dynamic information hidden in gait variables and recognize the right-left gait patterns with superior generalization performance. Moreover, our proposed techniques could identify the small significant difference between lower limbs when compared to the traditional symmetry index method for gait. The proposed algorithm would become an effective tool for early identification of the elderly gait asymmetry in the clinical diagnosis.
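
    A minimal sketch of the classification idea, using scikit-learn's support vector machine on synthetic left/right kinetic features (the real study used force-platform recordings from 60 participants), is shown below; cross-validated left/right accuracy near chance indicates symmetric gait, while high accuracy indicates asymmetry.

```python
# Sketch of the classification idea: an SVM trained to separate left- vs
# right-side gait variables; high cross-validated accuracy suggests asymmetry,
# chance-level accuracy suggests symmetric gait. Features are synthetic.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_steps = 120
# Hypothetical kinetic features per step (e.g. peak force, loading rate, impulse)
left = rng.normal(loc=[1.00, 5.0, 0.50], scale=[0.05, 0.4, 0.03], size=(n_steps, 3))
right = rng.normal(loc=[1.03, 5.3, 0.52], scale=[0.05, 0.4, 0.03], size=(n_steps, 3))

X = np.vstack([left, right])
y = np.array([0] * n_steps + [1] * n_steps)   # 0 = left, 1 = right

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
acc = cross_val_score(clf, X, y, cv=5).mean()
print(f"cross-validated left/right accuracy: {acc:.2f}")  # ~0.5 means symmetric
```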

  15. The Bayesian New Statistics: Hypothesis testing, estimation, meta-analysis, and power analysis from a Bayesian perspective.

    Science.gov (United States)

    Kruschke, John K; Liddell, Torrin M

    2018-02-01

    In the practice of data analysis, there is a conceptual distinction between hypothesis testing, on the one hand, and estimation with quantified uncertainty on the other. Among frequentists in psychology, a shift of emphasis from hypothesis testing to estimation has been dubbed "the New Statistics" (Cumming 2014). A second conceptual distinction is between frequentist methods and Bayesian methods. Our main goal in this article is to explain how Bayesian methods achieve the goals of the New Statistics better than frequentist methods. The article reviews frequentist and Bayesian approaches to hypothesis testing and to estimation with confidence or credible intervals. The article also describes Bayesian approaches to meta-analysis, randomized controlled trials, and power analysis.
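
    As a small illustration of estimation with quantified uncertainty (not the authors' own software or examples), the following conjugate Beta-Binomial sketch reports a posterior mean and a 95% credible interval for a response rate.

```python
# Minimal example of Bayesian estimation with a credible interval, in the
# spirit of "estimation with quantified uncertainty": a Beta-Binomial model
# for a response rate (data and prior are illustrative).
from scipy import stats

successes, trials = 27, 60
prior_a, prior_b = 1.0, 1.0                      # uniform Beta(1, 1) prior

posterior = stats.beta(prior_a + successes, prior_b + trials - successes)
lo, hi = posterior.ppf([0.025, 0.975])           # 95% credible interval
print(f"posterior mean = {posterior.mean():.3f}, 95% CrI = [{lo:.3f}, {hi:.3f}]")
```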

  16. Assessment of the beryllium lymphocyte proliferation test using statistical process control.

    Science.gov (United States)

    Cher, Daniel J; Deubner, David C; Kelsh, Michael A; Chapman, Pamela S; Ray, Rose M

    2006-10-01

    Despite more than 20 years of surveillance and epidemiologic studies using the beryllium blood lymphocyte proliferation test (BeBLPT) as a measure of beryllium sensitization (BeS) and as an aid for diagnosing subclinical chronic beryllium disease (CBD), improvements in specific understanding of the inhalation toxicology of CBD have been limited. Although epidemiologic data suggest that BeS and CBD risks vary by process/work activity, it has proven difficult to reach specific conclusions regarding the dose-response relationship between workplace beryllium exposure and BeS or subclinical CBD. One possible reason for this uncertainty could be misclassification of BeS resulting from variation in BeBLPT testing performance. The reliability of the BeBLPT, a biological assay that measures beryllium sensitization, is unknown. To assess the performance of four laboratories that conducted this test, we used data from a medical surveillance program that offered testing for beryllium sensitization with the BeBLPT. The study population was workers exposed to beryllium at various facilities over a 10-year period (1992-2001). Workers with abnormal results were offered diagnostic workups for CBD. Our analyses used a standard statistical technique, statistical process control (SPC), to evaluate test reliability. The study design involved a repeated measures analysis of BeBLPT results generated from the company-wide, longitudinal testing. Analytical methods included use of (1) statistical process control charts that examined temporal patterns of variation for the stimulation index, a measure of cell reactivity to beryllium; (2) correlation analysis that compared prior perceptions of BeBLPT instability to the statistical measures of test variation; and (3) assessment of the variation in the proportion of missing test results and how time periods with more missing data influenced SPC findings. During the period of this study, all laboratories displayed variation in test results that
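
    A generic individuals-type control chart of the kind used in such an assessment can be sketched as follows; the stimulation-index series, centre line and limits below are simulated and illustrative only.

```python
# Generic individuals (Shewhart) control chart of the kind used to examine
# temporal variation in a laboratory's stimulation-index results. The series
# is simulated; limits use the usual moving-range estimate of sigma.
import numpy as np

rng = np.random.default_rng(2)
stim_index = rng.normal(2.0, 0.4, size=60)
stim_index[45:] += 1.0                      # simulate a shift in lab performance

center = stim_index.mean()
moving_range = np.abs(np.diff(stim_index))
sigma_hat = moving_range.mean() / 1.128     # d2 constant for subgroups of size 2
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

out_of_control = np.flatnonzero((stim_index > ucl) | (stim_index < lcl))
print(f"CL={center:.2f}  LCL={lcl:.2f}  UCL={ucl:.2f}")
print("out-of-control points at indices:", out_of_control)
```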

  17. Statistical aspects of autoregressive-moving average models in the assessment of radon mitigation

    International Nuclear Information System (INIS)

    Dunn, J.E.; Henschel, D.B.

    1989-01-01

    Radon values, as reflected by hourly scintillation counts, seem dominated by major, pseudo-periodic, random fluctuations. This methodological paper reports a moderate degree of success in modeling these data using relatively simple autoregressive-moving average models to assess the effectiveness of radon mitigation techniques in existing housing. While accounting for the natural correlation of successive observations, familiar summary statistics such as steady state estimates, standard errors, confidence limits, and tests of hypothesis are produced. The Box-Jenkins approach is used throughout. In particular, intervention analysis provides an objective means of assessing the effectiveness of an active mitigation measure, such as a fan off/on cycle. Occasionally, failure to declare a significant intervention has suggested a means of remedial action in the data collection procedure
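
    A minimal sketch of intervention analysis in this spirit, assuming a simulated hourly radon series and a step regressor marking fan switch-on, can be written with statsmodels' SARIMAX (regression with ARMA errors); the model order and all numbers are illustrative, not those of the study.

```python
# Sketch of intervention analysis on an hourly radon series: an ARMA(1,1)
# error model with a step regressor marking when a mitigation fan is switched
# on. The series is simulated; in practice the fitted step coefficient and its
# confidence interval quantify the mitigation effect.
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(3)
n, t_on = 400, 200
step = (np.arange(n) >= t_on).astype(float)        # 0 before fan on, 1 after

# Simulated AR(1) counts with a level drop after the intervention
y = np.empty(n)
y[0] = 50.0
for t in range(1, n):
    y[t] = 10 + 0.8 * y[t - 1] - 5 * step[t] + rng.normal(0, 3)

model = SARIMAX(y, exog=step, order=(1, 0, 1), trend="c")
result = model.fit(disp=False)
print(result.params)            # exog coefficient ~ change in radon level
print(result.conf_int())
```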

  18. A Hold-down Margin Assessment using Statistical Method for the PWR Fuel Assembly

    International Nuclear Information System (INIS)

    Jeon, S. Y.; Park, N. K.; Lee, K. S.; Kim, H. K.

    2007-01-01

    The hold-down springs provide an acceptable hold down force against hydraulic uplift force absorbing the length change of the fuel assembly relative to the space between the upper and lower core plates in PWR. These length changes are mainly due to the thermal expansion, irradiation growth and creep down of the fuel assemblies. There are two kinds of hold-down springs depending on the different design concept of the reactor internals of the PWR in Korea, one is a leaf-type hold down spring for Westinghouse type plants and the other is a coil-type hold-down spring for OPR1000 (Optimized Power Reactor 1000). There are four sets of hold-down springs in each fuel assembly for leaf type hold-down spring and each set of the hold-down springs consists of multiple tapered leaves to form a cantilever leaf spring set. The length, width and thickness of the spring leaves are selected to provide the desired spring constant, deflection range, and hold down force. There are four coil springs in each fuel assembly for coil-type hold-down spring. In this study, the hold-down forces and margins were calculated for the leaf-type and coil-type hold-down springs considering geometrical data of the fuel assembly and its components, length changes of the fuel assembly due to thermal expansion, irradiation growth, creep, and irradiation relaxation. The hold-down spring forces were calculated deterministically and statistically to investigate the benefit of the statistical calculation method in view of hold-down margin. The Monte-Carlo simulation method was used for the statistical hold down force calculation
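
    The statistical calculation can be illustrated with a short Monte Carlo sketch in which the spring constant, net deflection and hydraulic uplift force are sampled from assumed tolerance distributions; all values are hypothetical and not actual fuel-assembly design data.

```python
# Illustrative Monte Carlo calculation of a hold-down margin: spring constant,
# net spring deflection and hydraulic uplift force are sampled from assumed
# tolerance distributions and the margin distribution is examined.
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

k = rng.normal(250.0, 10.0, n)            # spring constant [N/mm]
deflection = rng.normal(8.0, 0.5, n)      # net deflection after growth/creep [mm]
uplift = rng.normal(1500.0, 75.0, n)      # hydraulic uplift force [N]

hold_down = k * deflection                # spring hold-down force [N]
margin = hold_down - uplift               # positive => assembly stays seated

print(f"deterministic margin   : {250.0 * 8.0 - 1500.0:.0f} N")
print(f"mean statistical margin: {margin.mean():.0f} N")
print(f"5th-percentile margin  : {np.percentile(margin, 5):.0f} N")
print(f"P(margin < 0)          : {(margin < 0).mean():.4f}")
```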

  19. Empirical assessment of published effect sizes and power in the recent cognitive neuroscience and psychology literature.

    Science.gov (United States)

    Szucs, Denes; Ioannidis, John P A

    2017-03-01

    We have empirically assessed the distribution of published effect sizes and estimated power by analyzing 26,841 statistical records from 3,801 cognitive neuroscience and psychology papers published recently. The reported median effect size was D = 0.93 (interquartile range: 0.64-1.46) for nominally statistically significant results and D = 0.24 (0.11-0.42) for nonsignificant results. Median power to detect small, medium, and large effects was 0.12, 0.44, and 0.73, reflecting no improvement through the past half-century. This is so because sample sizes have remained small. Assuming similar true effect sizes in both disciplines, power was lower in cognitive neuroscience than in psychology. Journal impact factors negatively correlated with power. Assuming a realistic range of prior probabilities for null hypotheses, false report probability is likely to exceed 50% for the whole literature. In light of our findings, the recently reported low replication success in psychology is realistic, and worse performance may be expected for cognitive neuroscience.
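
    The kind of power calculation behind these estimates can be reproduced from the noncentral t distribution; the sketch below uses the reported median effect sizes as inputs and an illustrative group size of 25 per arm, not the sample sizes of the surveyed papers.

```python
# Power of a two-sample t test computed from the noncentral t distribution,
# the kind of calculation behind the reported power estimates.
from scipy import stats

def two_sample_power(effect_size, n_per_group, alpha=0.05):
    df = 2 * n_per_group - 2
    ncp = effect_size * (n_per_group / 2) ** 0.5     # noncentrality parameter
    t_crit = stats.t.ppf(1 - alpha / 2, df)
    nct = stats.nct(df, ncp)
    return 1 - nct.cdf(t_crit) + nct.cdf(-t_crit)    # two-sided power

for d in (0.2, 0.24, 0.5, 0.8):                      # small / median / medium / large
    print(f"D = {d:>4}: power = {two_sample_power(d, 25):.2f}")
```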

  20. A escolha do teste estatístico - um tutorial em forma de apresentação em PowerPoint [A PowerPoint®-based guide to assist in choosing the suitable statistical test]

    Directory of Open Access Journals (Sweden)

    David Normando

    2010-02-01

    Full Text Available Selecting appropriate methods for statistical analysis may seem complex, especially for graduate students and researchers at the beginning of their scientific careers. On the other hand, the PowerPoint presentation is a familiar tool for students and researchers, so a biostatistics tutorial developed as a PowerPoint presentation could narrow the gap between orthodontists and biostatistics. This guide provides useful and objective information about several statistical methods, using examples related to dentistry and, more specifically, to orthodontics. The tutorial is intended mainly to help the user find answers to common questions, such as the most appropriate test for comparisons between groups, for examining correlations and regressions, or for analysing method error. It also assists in checking the distribution of the data (normal or non-normal) and in choosing the most suitable graph for presenting the results. The guide can also be very useful for journal reviewers to quickly examine the adequacy of the statistical methods presented in a manuscript submitted for publication.

  1. Probabilistic risk assessment framework for structural systems under multiple hazards using Bayesian statistics

    International Nuclear Information System (INIS)

    Kwag, Shinyoung; Gupta, Abhinav

    2017-01-01

    Highlights: • This study presents the development of a Bayesian framework for probabilistic risk assessment (PRA) of structural systems under multiple hazards. • The concepts of Bayesian network and Bayesian inference are combined by mapping the traditionally used fault trees into a Bayesian network. • The proposed mapping allows for consideration of dependencies as well as correlations between events. • Incorporation of Bayesian inference permits a novel way for exploration of a scenario that is likely to result in a system level “vulnerability.” - Abstract: Conventional probabilistic risk assessment (PRA) methodologies (USNRC, 1983; IAEA, 1992; EPRI, 1994; Ellingwood, 2001) conduct risk assessment for different external hazards by considering each hazard separately and independently of each other. The risk metric for a specific hazard is evaluated by a convolution of the fragility and the hazard curves. The fragility curve for a basic event is obtained by using empirical, experimental, and/or numerical simulation data for a particular hazard. Treating each hazard independently can be inappropriate in some cases, as certain hazards are statistically correlated or dependent. Examples of such correlated events include but are not limited to flooding induced fire, seismically induced internal or external flooding, or even seismically induced fire. In the current practice, system level risk and consequence sequences are typically calculated using logic trees to express the causative relationship between events. In this paper, we present the results from a study on multi-hazard risk assessment that is conducted using a Bayesian network (BN) with Bayesian inference. The framework can consider statistical dependencies among risks from multiple hazards, allows updating by considering the newly available data/information at any level, and provides a novel way to explore alternative failure scenarios that may exist due to vulnerabilities.

  2. Probabilistic risk assessment framework for structural systems under multiple hazards using Bayesian statistics

    Energy Technology Data Exchange (ETDEWEB)

    Kwag, Shinyoung [North Carolina State University, Raleigh, NC 27695 (United States); Korea Atomic Energy Research Institute, Daejeon 305-353 (Korea, Republic of); Gupta, Abhinav, E-mail: agupta1@ncsu.edu [North Carolina State University, Raleigh, NC 27695 (United States)

    2017-04-15

    Highlights: • This study presents the development of a Bayesian framework for probabilistic risk assessment (PRA) of structural systems under multiple hazards. • The concepts of Bayesian network and Bayesian inference are combined by mapping the traditionally used fault trees into a Bayesian network. • The proposed mapping allows for consideration of dependencies as well as correlations between events. • Incorporation of Bayesian inference permits a novel way for exploration of a scenario that is likely to result in a system level “vulnerability.” - Abstract: Conventional probabilistic risk assessment (PRA) methodologies (USNRC, 1983; IAEA, 1992; EPRI, 1994; Ellingwood, 2001) conduct risk assessment for different external hazards by considering each hazard separately and independently of each other. The risk metric for a specific hazard is evaluated by a convolution of the fragility and the hazard curves. The fragility curve for a basic event is obtained by using empirical, experimental, and/or numerical simulation data for a particular hazard. Treating each hazard independently can be inappropriate in some cases, as certain hazards are statistically correlated or dependent. Examples of such correlated events include but are not limited to flooding induced fire, seismically induced internal or external flooding, or even seismically induced fire. In the current practice, system level risk and consequence sequences are typically calculated using logic trees to express the causative relationship between events. In this paper, we present the results from a study on multi-hazard risk assessment that is conducted using a Bayesian network (BN) with Bayesian inference. The framework can consider statistical dependencies among risks from multiple hazards, allows updating by considering the newly available data/information at any level, and provides a novel way to explore alternative failure scenarios that may exist due to vulnerabilities.
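
    The value of modelling hazard dependence can be illustrated with a toy discrete network evaluated by enumeration (not the authors' framework or data): a seismic event may also induce internal flooding, and the two hazards load different safety trains.

```python
# Toy discrete Bayesian-network calculation showing why hazard dependence
# matters: a seismic event can also induce internal flooding, and the system
# fails if either the seismically fragile or the flood-sensitive train fails.
# All probabilities are illustrative.
from itertools import product

P_seismic = 1e-3
P_flood_given_seismic = {True: 0.20, False: 1e-3}     # seismically induced flooding
P_failA_given_seismic = {True: 0.10, False: 1e-4}     # seismically fragile train
P_failB_given_flood = {True: 0.30, False: 1e-4}       # flood-sensitive train

def system_failure_probability():
    total = 0.0
    for seismic, flood in product([True, False], repeat=2):
        p_scenario = ((P_seismic if seismic else 1 - P_seismic) *
                      (P_flood_given_seismic[seismic] if flood
                       else 1 - P_flood_given_seismic[seismic]))
        p_a = P_failA_given_seismic[seismic]
        p_b = P_failB_given_flood[flood]
        p_system = p_a + p_b - p_a * p_b              # fails if A or B fails
        total += p_scenario * p_system
    return total

print(f"system failure probability (dependent hazards): {system_failure_probability():.2e}")
```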

  3. Statistical analysis of the behaviour of the mechanical equipment of EDFs power plants - evaluation of the availability and safety of thermal and nuclear units

    International Nuclear Information System (INIS)

    Procaccia, H.; Brillon, A.; Cravero, M.; Lucenet, G.

    1975-01-01

    The investigation and research directorate of EDF has undertaken a statistical analysis of the behaviour of large mechanical equipment at conventional power stations over a ten-year period, based on the operating reports of these stations. It has thus been possible to determine the intrinsic reliability, the failure rate, the mean repair time, and the mean good operating time of the feed water heaters, power turbines, pumps and boilers of the various EDF plants (125 and 250 MW), leading to a consideration of the feasibility of an extrapolation to present and future plants. Based on these elementary investigations, two methods of calculation have been developed. One is used to assess the overall availability of a thermal or nuclear power station from the failure rates of the equipment, each piece of equipment being associated with an indication of its technical importance in the functioning of the plant. A numerical application is given for 125 and 250 MW conventional plants. The purpose of the other method is to estimate the operational safety of the safety equipment of nuclear power stations, based on the development of fault tree diagrams for the basic equipment. A numerical example is given for the cooling systems of Phenix and of one of the Super Phenix versions. (author)
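
    The first method's availability bookkeeping can be sketched as follows, assuming illustrative failure rates and mean repair times (not EDF data) and a simple series arrangement of the equipment.

```python
# Availability bookkeeping of the kind used in the first method: each item's
# steady-state availability follows from its failure rate and mean repair
# time, and a series arrangement multiplies availabilities. Numbers are
# illustrative.
def availability(failure_rate_per_h, mean_repair_h):
    mtbf = 1.0 / failure_rate_per_h
    return mtbf / (mtbf + mean_repair_h)

equipment = {                      # (failure rate [1/h], mean repair time [h])
    "feed water heater": (2e-5, 48.0),
    "turbine":           (5e-5, 120.0),
    "feed pump":         (8e-5, 24.0),
    "boiler":            (1e-4, 72.0),
}

plant_availability = 1.0
for name, (lam, mttr) in equipment.items():
    a = availability(lam, mttr)
    plant_availability *= a        # series assumption: all items needed
    print(f"{name:18s} A = {a:.4f}")
print(f"overall availability  = {plant_availability:.4f}")
```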

  4. Development and statistical assessment of a paper-based immunoassay for detection of tumor markers

    Energy Technology Data Exchange (ETDEWEB)

    Mazzu-Nascimento, Thiago [Instituto de Química de São Carlos, Universidade de São Paulo, 13566-590, São Carlos, SP (Brazil); Instituto Nacional de Ciência e Tecnologia de Bioanalítica, Campinas, SP (Brazil); Morbioli, Giorgio Gianini [Instituto de Química de São Carlos, Universidade de São Paulo, 13566-590, São Carlos, SP (Brazil); Instituto Nacional de Ciência e Tecnologia de Bioanalítica, Campinas, SP (Brazil); School of Chemistry and Biochemistry, Georgia Institute of Technology, Atlanta, GA 30332 (United States); Milan, Luis Aparecido [Departamento de Estatística, Universidade Federal de São Carlos, São Carlos, SP (Brazil); Donofrio, Fabiana Cristina [Instituto de Ciências da Saúde, Universidade Federal de Mato Grosso, 78557-267, Sinop, MT (Brazil); Mestriner, Carlos Alberto [Wama Produtos para Laboratório Ltda, 13560-971, São Carlos, SP (Brazil); Carrilho, Emanuel, E-mail: emanuel@iqsc.usp.br [Instituto de Química de São Carlos, Universidade de São Paulo, 13566-590, São Carlos, SP (Brazil); Instituto Nacional de Ciência e Tecnologia de Bioanalítica, Campinas, SP (Brazil)

    2017-01-15

    Paper-based assays are an attractive low-cost option for clinical chemistry testing, due to characteristics such as short time of analysis, low consumption of samples and reagents, and high portability of assays. However, little attention has been given to the evaluation of the performance of these simple tests, which should include the use of a statistical approach to define the choice of the best cut-off value for the test. The choice of the cut-off value impacts on the sensitivity and specificity of the bioassay. Here, we developed a paper-based immunoassay for the detection of the carcinoembryonic antigen (CEA) and performed a statistical assessment to establish the assay's cut-off value using Youden's J index (68.28 A.U.), which allowed for a gain in sensitivity (0.86) and specificity (1.0). We also discuss the importance of defining a gray zone as a safety margin for the test (±12% around the cut-off value), eliminating all false-positive and false-negative outcomes and avoiding misleading results. The test accuracy was calculated as the area under the curve (AUC) of the receiver operating characteristic (ROC) curve, presenting a value of 0.97, which classifies this test as highly accurate. We propose here a low-cost method capable of detecting carcinoembryonic antigen (CEA) in human serum samples, highlighting the importance of statistical tools to evaluate a new low-cost diagnostic method. - Highlights: • A paper-based sandwich immunoassay protocol for detection of tumor markers. • A statistical approach to define cut-off values and measure the test's sensitivity, specificity and accuracy. • A simple way to create a gray zone, avoiding false-positive and false-negative outcomes.
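
    The cut-off selection and accuracy reporting described above follow a standard ROC workflow; the sketch below reproduces it on simulated signal intensities, with the ±12% gray zone applied around whatever cut-off maximizes Youden's J.

```python
# Choosing a cut-off with Youden's J and reporting accuracy as the ROC AUC,
# as done for the paper-based CEA assay. The intensity readings below are
# simulated stand-ins for the colorimetric signal (A.U.).
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(5)
neg = rng.normal(50, 10, 80)              # healthy sera
pos = rng.normal(85, 12, 40)              # CEA-positive sera
signal = np.concatenate([neg, pos])
label = np.concatenate([np.zeros(neg.size), np.ones(pos.size)])

fpr, tpr, thresholds = roc_curve(label, signal)
youden_j = tpr - fpr
cutoff = thresholds[np.argmax(youden_j)]
gray_zone = (0.88 * cutoff, 1.12 * cutoff)   # +/- 12% safety margin around cut-off

print(f"AUC       = {roc_auc_score(label, signal):.2f}")
print(f"cut-off   = {cutoff:.1f} A.U. (max Youden's J)")
print(f"gray zone = {gray_zone[0]:.1f} - {gray_zone[1]:.1f} A.U.")
```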

  5. Descriptive statistics of occupational employment in nuclear power utilities. Final working paper

    International Nuclear Information System (INIS)

    Little, J.R.; Johnson, R.C.

    1982-10-01

    The Institute of Nuclear Power Operations conducted a survey of its 58 member utilities during the Spring of 1982. This was the second such survey performed to identify employment trends and to project needs for trained personnel in the industry to 1991. The first was performed in 1981. The 1982 employment survey consisted of four questionnaires, asking for information on: (1) on-site employment; (2) on-site turnover; (3) off-site employment; and (4) off-site turnover. The survey instruments were designed to reflect approaches used by the utilities to meet the labor requirements for operation of nuclear power plants through off-site support personnel, contractors, and holding company personnel, as well as utility employees working at the plant site. On-site information was received from all 83 plants at the 58 utilities. However, employment information from Surry of VEPCO arrived too late to be included in the analysis. Therefore, their numbers are reflected in the adjusted totals. Responses to requests for off-site employment information were received from 55 of the 58 utilities

  6. Occupational radiation dose statistics from light-water power reactors operating in Western Europe

    International Nuclear Information System (INIS)

    Brookes, I.R.; Eng, T.

    1987-01-01

    Since the early days of nuclear power, collective and individual doses for people engaged in the maintenance and operation of nuclear power plants have been published by regulatory authorities. In 1979 a small working party whose members were drawn from Member States operating light-water reactors (LWRs) in the European Community was convened. The working party decided that only by collection of data under a unified scheme would it ever be possible to properly compare plant performance and for this reason a questionnaire was drawn up which attempted to elicit the maximum of information with the minimum inconvenience to the plant staff. Another decision made by the working party was to broaden the data base from 'European Community LWRs' to 'West European LWRs' to try to take advantage of the considerable experience being built up in Sweden, in Finland and in Switzerland. All the data available to the Commission up to the end of 1984 are presented and commented on. The deductions are not exhaustive but are believed to represent the limits of what could sensibly be done with the data available. Results are presented separately for BWR and PWR but no other subdivision, say by country or maker, is made. Where interpretation can be enhanced by graphical presentation, this is done. In general, doses for each job category are expressed in various ways to reveal and afford comparisons

  7. Assessing compositional variability through graphical analysis and Bayesian statistical approaches: case studies on transgenic crops.

    Science.gov (United States)

    Harrigan, George G; Harrison, Jay M

    2012-01-01

    New transgenic (GM) crops are subjected to extensive safety assessments that include compositional comparisons with conventional counterparts as a cornerstone of the process. The influence of germplasm, location, environment, and agronomic treatments on compositional variability is, however, often obscured in these pair-wise comparisons. Furthermore, classical statistical significance testing can often provide an incomplete and over-simplified summary of highly responsive variables such as crop composition. In order to more clearly describe the influence of the numerous sources of compositional variation we present an introduction to two alternative but complementary approaches to data analysis and interpretation. These include i) exploratory data analysis (EDA) with its emphasis on visualization and graphics-based approaches and ii) Bayesian statistical methodology that provides easily interpretable and meaningful evaluations of data in terms of probability distributions. The EDA case-studies include analyses of herbicide-tolerant GM soybean and insect-protected GM maize and soybean. Bayesian approaches are presented in an analysis of herbicide-tolerant GM soybean. Advantages of these approaches over classical frequentist significance testing include the more direct interpretation of results in terms of probabilities pertaining to quantities of interest and no confusion over the application of corrections for multiple comparisons. It is concluded that a standardized framework for these methodologies could provide specific advantages through enhanced clarity of presentation and interpretation in comparative assessments of crop composition.

  8. A new method to assess the statistical convergence of monte carlo solutions

    International Nuclear Information System (INIS)

    Forster, R.A.

    1991-01-01

    Accurate Monte Carlo confidence intervals (CIs), which are formed with an estimated mean and an estimated standard deviation, can only be created when the number of particle histories N becomes large enough so that the central limit theorem can be applied. The Monte Carlo user has a limited number of marginal methods to assess the fulfillment of this condition, such as statistical error reduction proportional to 1/√N with error magnitude guidelines and third and fourth moment estimators. A new method is presented here to assess the statistical convergence of Monte Carlo solutions by analyzing the shape of the empirical probability density function (PDF) of history scores. Related work in this area includes the derivation of analytic score distributions for a two-state Monte Carlo problem. Score distribution histograms have been generated to determine when a small number of histories accounts for a large fraction of the result. This summary describes initial studies of empirical Monte Carlo history score PDFs created from score histograms of particle transport simulations. 7 refs., 1 fig
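
    The idea of inspecting the empirical PDF of history scores can be sketched with a simple heavy-tailed example (illustrative, not a transport calculation): a long unconverged upper tail, or a tally dominated by a handful of histories, warns that the CLT-based interval may not yet be trustworthy.

```python
# Sketch of the idea: examine the empirical PDF of Monte Carlo history scores
# to judge whether the confidence interval can be trusted. A heavy-tailed
# scoring distribution (illustrative) lets a few histories dominate the tally.
import numpy as np

rng = np.random.default_rng(11)
scores = rng.lognormal(mean=0.0, sigma=2.0, size=100_000)   # history scores

mean, sem = scores.mean(), scores.std(ddof=1) / np.sqrt(scores.size)
top_fraction = np.sort(scores)[-100:].sum() / scores.sum()

hist, edges = np.histogram(np.log10(scores), bins=40)
print(f"estimate = {mean:.3f} +/- {sem:.3f} (1 sigma)")
print(f"fraction of tally from the top 100 histories: {top_fraction:.2%}")
# A long, sparsely sampled upper tail in `hist` (scores many decades above the
# mean) warns that the CLT-based confidence interval may not yet be reliable.
```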

  9. Does bisphenol A induce superfeminization in Marisa cornuarietis? Part II: toxicity test results and requirements for statistical power analyses.

    Science.gov (United States)

    Forbes, Valery E; Aufderheide, John; Warbritton, Ryan; van der Hoeven, Nelly; Caspers, Norbert

    2007-03-01

    This study presents results of the effects of bisphenol A (BPA) on adult egg production, egg hatchability, egg development rates and juvenile growth rates in the freshwater gastropod, Marisa cornuarietis. We observed no adult mortality, substantial inter-snail variability in reproductive output, and no effects of BPA on reproduction during 12 weeks of exposure to 0, 0.1, 1.0, 16, 160 or 640 microg/L BPA. We observed no effects of BPA on egg hatchability or timing of egg hatching. Juveniles showed good growth in the control and all treatments, and there were no significant effects of BPA on this endpoint. Our results do not support previous claims of enhanced reproduction in Marisa cornuarietis in response to exposure to BPA. Statistical power analysis indicated high levels of inter-snail variability in the measured endpoints and highlighted the need for sufficient replication when testing treatment effects on reproduction in M. cornuarietis with adequate power.

  10. Overview of environmental assessment for China nuclear power industry and coal-fired power industry

    International Nuclear Information System (INIS)

    Zhang Shaodong; Pan Ziqiang; Zhang Yongxing

    1994-01-01

    A quantitative environmental assessment method and the corresponding computer code are introduced. By considering all fuel cycle steps, it is found that the public health risk of the China nuclear power industry is 5.2 x 10⁻¹ man/(GW·a), the occupational health risk is 2.5 man/(GW·a), and the total health risk is 3.0 man/(GW·a). After the health risk calculation for coal mining, transport, burning and ash disposal, it is found that the public health risk of the China coal-fired power industry is 3.6 man/(GW·a), the occupational health risk is 50 man/(GW·a), and the total is 54 man/(GW·a). Accordingly, the conclusion that the China nuclear power industry is one of high safety and cleanness is derived at the end

  11. Sustainability indicators for the assessment of nuclear power

    International Nuclear Information System (INIS)

    Stamford, Laurence; Azapagic, Adisa

    2011-01-01

    Electricity supplies an increasing share of the world's total energy demand and that contribution is set to increase. At the same time, there is increasing socio-political will to mitigate impacts of climate change as well as to improve energy security. This, in combination with the desire to ensure social and economic prosperity, creates a pressing need to consider the sustainability implications of future electricity generation. However, approaches to sustainability assessment differ greatly in their scope and methodology as currently there is no standardised approach. With this in mind, this paper reviews sustainability indicators that have previously been used to assess energy options and proposes a new sustainability assessment methodology based on a life cycle approach. In total, 43 indicators are proposed, addressing the techno-economic, environmental and social sustainability issues associated with energy systems. The framework has been developed primarily to address concerns associated with nuclear power in the UK, but is applicable to other energy technologies as well as to other countries. -- Highlights: → New framework for life cycle sustainability assessment of nuclear power developed. → The framework comprises 43 indicators addressing techno-economic, environmental and social sustainability. → Completely new indicators developed to address different sustainability issues, including nuclear proliferation, energy supply diversity and intergenerational equity. → The framework enables sustainability comparisons of nuclear and other electricity technologies. → Indicators can be used by various stakeholders, including industry, policy makers and NGOs to help identify more sustainable electricity options.

  12. Impact assessment of tornado against nuclear power plant

    International Nuclear Information System (INIS)

    Sato, Daisuke

    2015-01-01

    The impact assessment of tornadoes against nuclear power plants conforms to the 'Assessment guide for tornado effect on nuclear power plants' stipulated by the Nuclear Regulation Authority. In carrying out the assessment, the important items are the setting of the maximum wind speed considered in the design and the setting of a flying-object evaluation model, both on the basis of observation results. The Japan Society of Maintenology has summarized the verification results for the concept of the design tornado setting and the flying-object evaluation model, the contents of which are explained here. The following are explained: (1) validity of the setting of the design tornado in the Assessment Guide, (2) analysis of the synoptic field, (3) study on the regional characteristics of the environmental field for tornado occurrence by means of the analysis of the synoptic field and a gust-associated index, and (4) setting of the design tornado based on the above (1)-(3). Next, for the flying-object evaluation model, the authors picked up the Rankine vortex model and the Fujita model, and verified the reproducibility of the models using the features of each and the actual state of tornado damage. (A.O.)

  13. Statistical power and utility of meta-analysis methods for cross-phenotype genome-wide association studies.

    Science.gov (United States)

    Zhu, Zhaozhong; Anttila, Verneri; Smoller, Jordan W; Lee, Phil H

    2018-01-01

    Advances in recent genome-wide association studies (GWAS) suggest that pleiotropic effects on human complex traits are widespread. A number of classic and recent meta-analysis methods have been used to identify genetic loci with pleiotropic effects, but the overall performance of these methods is not well understood. In this work, we use extensive simulations and case studies of GWAS datasets to investigate the power and type-I error rates of ten meta-analysis methods. We specifically focus on three conditions commonly encountered in the studies of multiple traits: (1) extensive heterogeneity of genetic effects; (2) characterization of trait-specific association; and (3) inflated correlation of GWAS due to overlapping samples. Although the statistical power is highly variable under distinct study conditions, we found that several methods showed superior power under diverse heterogeneity. In particular, the classic fixed-effects model showed surprisingly good performance when a variant is associated with more than half of the study traits. As the number of traits with null effects increases, ASSET performed the best along with competitive specificity and sensitivity. With opposite directional effects, CPASSOC featured first-rate power. However, caution is advised when using CPASSOC for studying genetically correlated traits with overlapping samples. We conclude with a discussion of unresolved issues and directions for future research.
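
    As a reference point for the methods compared, the classic inverse-variance fixed-effects combination of per-trait GWAS results for a single variant looks like this; the effect estimates and standard errors are illustrative.

```python
# Classic inverse-variance fixed-effects combination across trait-specific
# GWAS results for one variant, one of the meta-analysis approaches compared.
# Betas and standard errors below are illustrative.
import numpy as np
from scipy import stats

beta = np.array([0.12, 0.10, 0.15, 0.02])    # per-trait effect estimates
se = np.array([0.04, 0.05, 0.06, 0.04])      # their standard errors

w = 1.0 / se**2                              # inverse-variance weights
beta_fe = np.sum(w * beta) / np.sum(w)
se_fe = np.sqrt(1.0 / np.sum(w))
z = beta_fe / se_fe
p = 2 * stats.norm.sf(abs(z))

print(f"fixed-effects beta = {beta_fe:.3f} (SE {se_fe:.3f}), z = {z:.2f}, p = {p:.1e}")
# Cross-trait heterogeneity or overlapping samples (correlated z-scores) call
# for the heterogeneity-aware alternatives (e.g. ASSET, CPASSOC) compared here.
```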

  14. Statistical Analysis of Power Production from OWC Type Wave Energy Converters

    DEFF Research Database (Denmark)

    Martinelli, L.; Zanuttigh, B.; Kofoed, Jens Peter

    2009-01-01

    Oscillating Water Column based wave energy plants built so far have experienced a low efficiency in the conversion of the bidirectional oscillating flow. A new concept is considered here, the LeanCon Wave Energy Converter (WEC), that unifies the flow direction by use of non-return valves into a unidirectional flow, making the use of more efficient air turbines possible. Hereby, a more steady flow is also obtained. The general objective of this note is to examine the power take off (PTO) efficiency under irregular wave conditions for WECs with flow redirection. The final practical aim is to identify ... (wave period, wave height). Average performance and stochastic variability is thus obtained for any sea state and therefore also for the annual wave climate of interest. An example application of a LeanCon unit is carried out for a location off-shore Cagliari (Italy). Conclusions provide economic ...

  15. A statistical assessment of population trends for data deficient Mexican amphibians

    Directory of Open Access Journals (Sweden)

    Esther Quintero

    2014-12-01

    Full Text Available Background. Mexico has the world’s fifth largest population of amphibians and is the country with the second highest number of threatened amphibian species. About 10% of Mexican amphibians lack enough data to be assigned to a risk category by the IUCN, so in this paper we test a statistical tool that, in the absence of specific demographic data, can assess a species’ risk of extinction and population trend, and help identify which variables increase its vulnerability. Recent studies have demonstrated that the risk of species decline depends on both extrinsic and intrinsic traits; thus, including both when assessing extinction risk should render a more accurate assessment of threats. Methods. We harvested data from the Encyclopedia of Life (EOL) and the published literature for Mexican amphibians, and used these data to assess the population trend of some of the Mexican species that have been assigned to the Data Deficient category of the IUCN using Random Forests, a machine learning method that predicts complex processes and identifies the most important variables that account for the predictions. Results. Our results show that most of the data deficient Mexican amphibians that we analysed have decreasing population trends. We found that Random Forests is a solid way to identify species with decreasing population trends when no demographic data are available. Moreover, we point to the most important variables that make species more vulnerable to extinction. This exercise is a very valuable first step in assigning conservation priorities for poorly known species.

  16. A statistical assessment of population trends for data deficient Mexican amphibians.

    Science.gov (United States)

    Quintero, Esther; Thessen, Anne E; Arias-Caballero, Paulina; Ayala-Orozco, Bárbara

    2014-01-01

    Background. Mexico has the world's fifth largest population of amphibians and is the country with the second highest number of threatened amphibian species. About 10% of Mexican amphibians lack enough data to be assigned to a risk category by the IUCN, so in this paper we test a statistical tool that, in the absence of specific demographic data, can assess a species' risk of extinction and population trend, and help identify which variables increase its vulnerability. Recent studies have demonstrated that the risk of species decline depends on both extrinsic and intrinsic traits; thus, including both when assessing extinction risk should render a more accurate assessment of threats. Methods. We harvested data from the Encyclopedia of Life (EOL) and the published literature for Mexican amphibians, and used these data to assess the population trend of some of the Mexican species that have been assigned to the Data Deficient category of the IUCN using Random Forests, a machine learning method that predicts complex processes and identifies the most important variables that account for the predictions. Results. Our results show that most of the data deficient Mexican amphibians that we analysed have decreasing population trends. We found that Random Forests is a solid way to identify species with decreasing population trends when no demographic data are available. Moreover, we point to the most important variables that make species more vulnerable to extinction. This exercise is a very valuable first step in assigning conservation priorities for poorly known species.
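
    A minimal sketch, not the authors' pipeline, of the general technique described in the two records above: a Random Forest classifier predicting population trend from species traits and ranking variable importance. The feature names and simulated data are illustrative only.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 4))        # e.g. body size, range area, elevation, habitat breadth
        y = (X[:, 1] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=200) < 0).astype(int)  # 1 = decreasing

        model = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=0)
        model.fit(X, y)

        print("Out-of-bag accuracy:", round(model.oob_score_, 3))
        for name, imp in zip(["body_size", "range_area", "elevation", "habitat_breadth"],
                             model.feature_importances_):
            print(f"{name}: importance = {imp:.3f}")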

  17. Preliminary nuclear power reactor technology qualitative assessment for Malaysia

    International Nuclear Information System (INIS)

    Shamsul Amri Sulaiman

    2011-01-01

    Since the world's first nuclear reactor achieved its major breakthrough on December 2, 1942, the nuclear power industry has undergone tremendous development and evolution for more than half a century. After the moratorium on nuclear power plant construction that followed the catastrophic accidents at Three Mile Island (1979) and Chernobyl (1986), nuclear energy is today back on the policy agendas of many countries, both developed and developing, signalling a nuclear revival or 'nuclear renaissance'. The selection of a suitable nuclear power technology has thus received primary attention. This short paper attempts to draw a preliminary technology assessment for the first nuclear power reactor technology for Malaysia. The methodology employed is a qualitative analysis collating recent findings of the TNB-KEPCO preliminary feasibility study for a nuclear power programme in Peninsular Malaysia and other published presentations and/or papers by multiple experts. The results suggest that the pressurized water reactor (PWR) is the prevailing technology in terms of numbers and plant performance, and that while the commercialization of Generation IV reactors is remote (i.e. not expected until about 2030), Generation III/III+ NPP models are commercially available on the market today. Five major steps involved in reactor technology selection were introduced, with a focus on the important aspects of the selection criteria. Three categories of reactor technology selection criteria were used for the cursory evaluation. The outcome of these analyses shall lead to deeper and fuller analyses of the recommended reactor technologies for a comprehensive feasibility study in the near future. Both strategic and technical recommendations for the reactor technology options were also provided. The paper also explores how to systematically select the first civilian nuclear power reactor. (Author)

  18. Probabilistic risk assessment in nuclear power plant regulation

    Energy Technology Data Exchange (ETDEWEB)

    Wall, J B

    1980-09-01

    A specific program is recommended to utilize probabilistic risk assessment more effectively in nuclear power plant regulation. It is based upon the engineering insights from the Reactor Safety Study (WASH-1400) and some follow-on risk assessment research by the USNRC. The Three Mile Island accident is briefly discussed from a risk viewpoint to illustrate a weakness in current practice. The development of a probabilistic safety goal is recommended, with some suggestions on underlying principles. Some ongoing work on risk perception, and the draft probabilistic safety goal being reviewed in Canada, are described. Some suggestions are offered on further risk assessment research. Finally, some recent U.S. Nuclear Regulatory Commission actions are described.

  19. Preliminary environmental assessment for the satellite power system (SPS)

    Energy Technology Data Exchange (ETDEWEB)

    1978-10-01

    A preliminary assessment of the impact of the Satellite Power System (SPS) on the environment is presented. Information that has appeared in documents referenced herein is integrated and assimilated. The state-of-knowledge as perceived from recently completed DOE-sponsored studies is disclosed, and prospective research and study programs that can advance the state-of-knowledge and provide an expanded data base for use in an assessment planned for 1980 are defined. Alternatives for research that may be implemented in order to achieve this advancement are also discussed in order that a plan can be selected which will be consistent with the fiscal and time constraints on the SPS Environmental Assessment Program. Health and ecological effects of microwave radiation, nonmicrowave effects on health and the environment (terrestrial operations and space operations), effects on the atmosphere, and effects on communications systems are examined in detail. (WHK)

  20. Wind power prognosis statistical system; Sistema estadistico de pronostico de la energia eoloelectrica

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez Garcia, Alfredo; De la Torre Vega, Eli [Instituto de Investigaciones Electricas, Cuernavaca, Morelos (Mexico)

    2009-07-01

    The integration of the first large-scale wind farm (La Venta II) into the National Interconnected System requires taking into account the random and intermittent nature of wind energy. An important tool for this task is a system for short-term wind power forecasting. For this reason, the Instituto de Investigaciones Electricas (IIE) developed a statistical model to produce this forecast. The prediction is made through an adaptive linear combination of alternative competing models, where the weight given to each model is based on its most recent forecast quality. The results of applying the forecasting system are also presented and analyzed.
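
    A minimal sketch (assumed details, not the IIE system) of an adaptive linear combination of competing forecasts, where each model's weight is derived from its recent accuracy, here taken as the inverse of its recent mean squared error.

        import numpy as np

        def combine_forecasts(forecasts, recent_errors, eps=1e-6):
            """forecasts: the competing models' current predictions (e.g. MW).
            recent_errors: each model's recent mean squared error."""
            w = 1.0 / (np.asarray(recent_errors) + eps)   # more accurate -> larger weight
            w /= w.sum()                                  # normalize weights to sum to 1
            return float(np.dot(w, forecasts)), w

        # Toy example with three competing models
        pred, weights = combine_forecasts(forecasts=[42.0, 55.0, 48.0],
                                          recent_errors=[9.0, 25.0, 16.0])
        print(pred, weights)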

  1. LWR safety studies. Analyses and further assessments relating to the German Risk Assessment Study on Nuclear Power Plants. Vol. 1

    International Nuclear Information System (INIS)

    1983-01-01

    This documentation of the activities of the Oeko-Institut is intended to show the errors made and the limits encountered in the experimental approaches and in the results obtained by the work performed under Phase A of the German Risk Assessment Study on Nuclear Power Plants (DRS). Concerns are expressed and explained relating to the risk definition used in the Study and to the results of other studies relied on; specific problems of methodology are discussed with regard to the value of fault-tree/accident analyses for describing the course of safety-related events, and with regard to the evaluations presented in the DRS. The Markov model is explained as an approach offering alternative solutions. The identification and quantification of common-mode failures is discussed. The origin, quality and methods of assessing the reliability characteristics used in the DRS, as well as the statistical models for describing failure scenarios of reactor components and systems, are critically reviewed. (RF)

  2. Groundwater vulnerability assessment: from overlay methods to statistical methods in the Lombardy Plain area

    Directory of Open Access Journals (Sweden)

    Stefania Stevenazzi

    2017-06-01

    Full Text Available Groundwater is among the most important freshwater resources. Worldwide, aquifers are experiencing an increasing threat of pollution from urbanization, industrial development, agricultural activities and mining enterprises. Thus, practical actions, strategies and solutions to protect groundwater from these anthropogenic sources are widely required. The most effective tool for supporting land-use planning while protecting groundwater from contamination is groundwater vulnerability assessment. Over the years, several methods for assessing groundwater vulnerability have been developed: overlay and index methods, and statistical and process-based methods. All methods are means to synthesize complex hydrogeological information into a single document, a groundwater vulnerability map, usable by planners, decision and policy makers, geoscientists and the public. Although it is not possible to identify an approach that is best for all situations, the final product should always be scientifically defensible, meaningful and reliable. Nevertheless, various methods may produce very different results at any given site; thus, the reasons for similarities and differences need to be investigated in depth. This study demonstrates the reliability and flexibility of a spatial statistical method for assessing groundwater vulnerability to contamination at a regional scale. The Lombardy Plain case study is particularly interesting for its long history of groundwater monitoring (quality and quantity), the availability of hydrogeological data, and the combined presence of various anthropogenic sources of contamination. Recent updates of the regional water protection plan have raised the necessity of producing more flexible, reliable and accurate groundwater vulnerability maps. A comparison of groundwater vulnerability maps obtained through different approaches and developed over a time span of several years has demonstrated the relevance of the

  3. Assessment of metals bioavailability to vegetables under field conditions using DGT, single extractions and multivariate statistics

    Science.gov (United States)

    2012-01-01

    Background: The bioavailability of metals in soils is commonly assessed by chemical extractions; however, a generally accepted method is not yet established. In this study, the effectiveness of the Diffusive Gradients in Thin-films (DGT) technique and single extractions in the assessment of metal bioaccumulation in vegetables, and the influence of soil parameters on phytoavailability, were evaluated using multivariate statistics. Soil and plants grown in vegetable gardens from mining-affected rural areas of NW Romania were collected and analysed. Results: The pseudo-total metal content of Cu, Zn and Cd in soil ranged between 17.3-146 mg kg-1, 141-833 mg kg-1 and 0.15-2.05 mg kg-1, respectively, showing enriched contents of these elements. High degrees of metal extractability in 1M HCl and even in 1M NH4Cl were observed. Despite the relatively high total metal concentrations in soil, those found in vegetables were comparable to values typically reported for agricultural crops, probably due to the low concentrations of metals in soil solution (Csoln) and the low effective concentrations (CE) assessed by the DGT technique. Among the analysed vegetables, the highest metal concentrations were found in carrot roots. By applying multivariate statistics, it was found that CE, Csoln and the extraction in 1M NH4Cl were better predictors of metal bioavailability than the acid extractions applied in this study. Copper transfer to vegetables was strongly influenced by soil organic carbon (OC) and cation exchange capacity (CEC), while pH had a higher influence on Cd transfer from soil to plants. Conclusions: The results showed that DGT can be used for a general evaluation of the risks associated with soil contamination by Cu, Zn and Cd under field conditions, although quantitative information on metal transfer from soil to vegetables was not obtained. PMID:23079133

  4. Market assessment of photovoltaic power systems for agricultural applications worldwide

    Science.gov (United States)

    Cabraal, A.; Delasanta, D.; Rosen, J.; Nolfi, J.; Ulmer, R.

    1981-11-01

    Agricultural-sector PV market assessments conducted in the Philippines, Nigeria, Mexico, Morocco, and Colombia are extrapolated worldwide. The types of applications evaluated are those requiring less than 15 kW of power and operating in a stand-alone mode. The major conclusions were as follows: PV will be competitive in applications requiring 2 to 3 kW of power prior to 1983; by 1986 PV system competitiveness will extend to applications requiring 4 to 6 kW of power; due to capital constraints, the private sector market may be restricted to applications requiring less than about 2 kW of power; the ultimate purchasers of larger systems will be governments, either through direct purchase or loans from development banks. Though fragmented, a significant agriculture-sector market for PV exists; however, the market for PV in telecommunications, signalling, rural services, and TV will be larger. Major market-related factors influencing the potential for U.S. PV sales are: lack of awareness; high first costs; shortage of long-term capital; competition from German, French and Japanese companies that have government support; and low fuel prices in capital-surplus countries. Strategies that may aid in overcoming some of these problems are: setting up a trade association aimed at overcoming the lack of awareness, innovative financing schemes such as lease arrangements, and designing products to match current user needs as opposed to attempting to change consumer behavior.

  5. Environmental impact assessment. Ajka Mining and Power Company

    International Nuclear Information System (INIS)

    Sipkema, Arjan; De Visser, Petra

    1994-01-01

    An Environmental Impact Assessment (EIA) is a public document that evaluates the impact of a new company or a new project on the environment and also lays out the possible alternatives. The present EIA was worked out to gain insight into the polluting effects of the Ajka Mining and Power Company in Ajka, Hungary, and to understand what hinders the abatement of the pollution. The Ajka coal has a high sulphur content and is slightly radioactive. The power plant is situated in the neighborhood of the town of Ajka, and the wind usually blows the releases in the direction of the town. The radioactive sludge is also stored at the border of the town, and its radioactivity exceeds the limit set for the Paks Power Plant (in Hungary). Alternatives to the present technology are explored; nil-condensation and/or energy conservation seem to be the best alternatives. In principle, the Regional Environmental Inspectorate is responsible for all pollution surveys, which it carries out with its own equipment, with data obtained from the company, or with data from other monitoring organizations. However, the pollution of the Ajka Mining and Power Company is not completely monitored. (authors)

  6. Assessment of environmental external effects in power generation

    International Nuclear Information System (INIS)

    Meyer, H.; Morthorst, P.E.; Schleisner, L.; Meyer, N.I.; Nielsen, P.S.; Nielsen, V.

    1996-12-01

    This report summarises some of the results achieved in a project carried out in Denmark in 1994 concerning externalities. The main objective was to identify, quantify and - if possible - monetize the external effects in the production of energy, especially in relation to renewable technologies. The report compares environmental externalities in the production of energy using renewable and non-renewable energy sources, respectively. The comparison is demonstrated on two specific case studies. The first case is the production of electricity based on wind power plants compared to the production of electricity based on a coal-fired conventional plant. In the second case heat/power generation by means of a combined heat and power plant based on biomass-generated gas is compared to that of a combined heat and power plant fuelled by natural gas. In the report the individual externalities from the different ways of producing energy are identified, the stress caused by the effect is assessed, and finally the monetary value of the damage is estimated. The method is applied to the local as well as the regional and global externalities. (au) 8 tabs., 7 ills., 4 refs

  7. Quadrennial Technology Review 2015: Technology Assessments--Wind Power

    Energy Technology Data Exchange (ETDEWEB)

    none,

    2015-10-07

    Wind power has become a mainstream power source in the U.S. electricity portfolio, supplying 4.9% of the nation’s electricity demand in 2014. With more than 65 GW installed across 39 states at the end of 2014, utility-scale wind power is a cost-effective source of low-emissions power generation throughout much of the nation. The United States has significant sustainable land-based and offshore wind resource potential, greater than 10 times current total U.S. electricity consumption. A technical wind resource assessment conducted by the Department of Energy (DOE) in 2009 estimated that the land-based wind energy potential for the contiguous United States is equivalent to 10,500 GW of capacity at an 80-meter (m) hub height and 12,000 GW of capacity at a 100 m hub height, assuming a capacity factor of at least 30%. A subsequent 2010 DOE report estimated the technical offshore wind energy potential to be 4,150 GW. The estimate was calculated from the total offshore area within 50 nautical miles of shore, in areas where average annual wind speeds are at least 7 m per second at a hub height of 90 m.

  8. Assessment of environmental external effects in power generation

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, H.; Morthorst, P.E.; Schleisner, L. [Risoe National Lab. (Denmark); Meyer, N.I.; Nielsen, P.S.; Nielsen, V. [The Technical Univ. of Denmark (Denmark)

    1996-12-01

    This report summarises some of the results achieved in a project carried out in Denmark in 1994 concerning externalities. The main objective was to identify, quantify and - if possible - monetize the external effects in the production of energy, especially in relation to renewable technologies. The report compares environmental externalities in the production of energy using renewable and non-renewable energy sources, respectively. The comparison is demonstrated on two specific case studies. The first case is the production of electricity based on wind power plants compared to the production of electricity based on a coal-fired conventional plant. In the second case heat/power generation by means of a combined heat and power plant based on biomass-generated gas is compared to that of a combined heat and power plant fuelled by natural gas. In the report the individual externalities from the different ways of producing energy are identified, the stress caused by the effect is assessed, and finally the monetary value of the damage is estimated. The method is applied to the local as well as the regional and global externalities. (au) 8 tabs., 7 ills., 4 refs.

  9. Wide Area Measurement Based Security Assessment & Monitoring of Modern Power System: A Danish Power System Case Study

    DEFF Research Database (Denmark)

    Rather, Zakir Hussain; Chen, Zhe; Thøgersen, Paul

    2013-01-01

    Power system security has become a major concern across the global power system community. This paper presents wide area measurement system (WAMS) based security assessment and monitoring of a modern power system. A new three-dimensional security index (TDSI) has been proposed for online security monitoring of a modern power system with large-scale renewable energy penetration. Phasor measurement unit (PMU) based WAMS has been implemented in the western Danish power system to realize online security monitoring and assessment in the power system control center. The proposed security monitoring system has been ...

  10. Statistical assessment on a combined analysis of GRYN-ROMN-UCBN upland vegetation vital signs

    Science.gov (United States)

    Irvine, Kathryn M.; Rodhouse, Thomas J.

    2014-01-01

    As of 2013, the Rocky Mountain and Upper Columbia Basin Inventory and Monitoring Networks have multiple years of vegetation data, the Greater Yellowstone Network has three years of vegetation data, and monitoring is ongoing in all three networks. Our primary objective is to assess whether a combined analysis of these data, aimed at exploring correlations with climate and weather data, is feasible. We summarize the core survey design elements across protocols and point out the major statistical challenges for a combined analysis at present. The dissimilarity in response designs between the ROMN and the UCBN-GRYN network protocols presents a statistical challenge that has not yet been resolved. However, the UCBN and GRYN data are compatible, as they implement a similar response design; therefore, a combined analysis is feasible and will be pursued in future. When data collected by different networks are combined, the survey design describing the merged dataset is likely a complex survey design, that is, the result of combining datasets from different sampling designs, characterized by unequal probability sampling, varying stratification, and clustering (see Lohr 2010, Chapter 7, for a general overview). Statistical analysis of complex survey data requires modifications to standard methods, one of which is to include survey design weights within a statistical model. We focus on this issue for a combined analysis of upland vegetation from these networks, leaving other topics for future research. We conduct a simulation study on the possible effects of equal versus unequal probability selection of points on parameter estimates of temporal trend, using available packages within the R statistical computing environment. We find that, as written, using lmer or lm for trend detection in a continuous response, and clm and clmm for visually estimated cover classes, with “raw” GRTS design weights specified for the weight argument leads to substantially
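
    A minimal Python analogue of the weighted trend model discussed above (an assumption for illustration: the study itself worked in R with lmer/clm and GRTS weights, which is not reproduced here): a linear trend fitted to a simulated continuous vegetation response with unequal design weights supplied to the model.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        years = np.repeat(np.arange(2010, 2016), 30)          # 6 survey years, 30 plots each
        cover = 40 - 0.8 * (years - 2010) + rng.normal(0, 5, years.size)  # simulated % cover
        design_wts = rng.uniform(0.5, 2.0, years.size)        # unequal inclusion-probability weights

        X = sm.add_constant(years - years.min())              # intercept + year term
        fit = sm.WLS(cover, X, weights=design_wts).fit()
        print(fit.params)          # estimated intercept and annual trend
        print(fit.bse)             # naive standard errors (clustering ignored in this sketch)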

  11. Assessing socioeconomic vulnerability to dengue fever in Cali, Colombia: statistical vs expert-based modeling.

    Science.gov (United States)

    Hagenlocher, Michael; Delmelle, Eric; Casas, Irene; Kienberger, Stefan

    2013-08-14

    As a result of changes in climatic conditions and greater resistance to insecticides, many regions across the globe, including Colombia, have been facing a resurgence of vector-borne diseases, and dengue fever in particular. Timely information on both (1) the spatial distribution of the disease, and (2) prevailing vulnerabilities of the population is needed to adequately plan targeted preventive interventions. We propose a methodology for the spatial assessment of current socioeconomic vulnerability to dengue fever in Cali, a tropical urban environment of Colombia. Based on a set of socioeconomic and demographic indicators derived from census data and ancillary geospatial datasets, we develop a spatial approach for both expert-based and purely statistical modeling of current vulnerability levels across 340 neighborhoods of the city using a Geographic Information System (GIS). The results of both approaches are comparatively evaluated by means of spatial statistics. A web-based approach is proposed to facilitate the visualization and the dissemination of the output vulnerability index to the community. The statistical and the expert-based modeling approaches exhibit a high concordance, both globally and spatially. The expert-based approach indicates a slightly higher vulnerability mean (0.53) and median (0.56) across all neighborhoods, compared to the purely statistical approach (mean = 0.48; median = 0.49). Both approaches reveal that high values of vulnerability tend to cluster in the eastern, north-eastern, and western parts of the city. These are poor neighborhoods with high percentages of young residents. In the absence of local expertise, statistical approaches could be used, with caution. By decomposing identified vulnerability "hotspots" into their underlying factors, our approach provides valuable information on both (1) the location of neighborhoods, and (2) vulnerability factors that should be given priority in the context of targeted intervention.
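
    A minimal sketch (illustrative indicator names and synthetic data, not the Cali dataset) of the two index-building routes contrasted above: expert-assigned weights versus purely statistical weights taken from the first principal component of the indicators.

        import numpy as np

        rng = np.random.default_rng(2)
        # rows = neighborhoods, columns = normalized indicators in [0, 1]
        # e.g. [% young population, % without sewage connection, population density]
        indicators = rng.random((340, 3))

        # (1) expert-based: weights elicited from local experts (assumed values)
        expert_w = np.array([0.5, 0.3, 0.2])
        expert_index = indicators @ expert_w

        # (2) statistical: weights from the loadings of the first principal component
        X = indicators - indicators.mean(axis=0)
        _, _, vt = np.linalg.svd(X, full_matrices=False)
        pc1 = np.abs(vt[0]) / np.abs(vt[0]).sum()
        stat_index = indicators @ pc1

        print("Correlation between the two indices:",
              round(np.corrcoef(expert_index, stat_index)[0, 1], 2))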

  12. Security assessment for intentional island operation in modern power system

    DEFF Research Database (Denmark)

    Chen, Yu; Xu, Zhao; Østergaard, Jacob

    2011-01-01

    There has been a high penetration level of Distributed Generations (DGs) in distribution systems in Denmark. Even more DGs are expected to be installed in the coming years. With that, how to utilize them in maintaining the security of power supply is of great concern for Danish utilities. During an emergency in the power system, some distribution networks may be intentionally separated from the main grid to avoid complete system collapse. If DGs in those networks could continue to run instead of immediately being shut down, the blackout could be avoided and the reliability of supply could be increased. However, when to island, or how to ensure that the islanded systems can survive the islanding transition, is uncertain. This article proposes an Islanding Security Region (ISR) concept to provide security assessment of island operation. By comparing the system operating state with the ISR, the system ...

  13. POWER LOSSES ASSESSMENT IN TRANSFORMERS AFTER THE NORMATIVE OPERATING PERIOD

    Directory of Open Access Journals (Sweden)

    M. I. Fursanov

    2015-01-01

    Full Text Available The load and no-load power losses are the key parameters characterizing the operating effectiveness of distribution-network customers' transformers. Precise determination of these values facilitates a substantiated choice of optimization measures. The relevance of this topic increases because the modern electric grid contains many oil transformers whose time in service considerably exceeds the statutory 25 years. Under conditions of continued operation, measuring the power losses according to the standard operating guidelines is not always possible. The authors present an improved power-loss assessment technique based on the currently accepted thermal model of the oil transformer. They indicate the deficiencies of the existing technique and substantiate some changes in the practical application of the mathematical model. The article emphasizes the peculiarities of temperature changes in the oil transformer and offers a prototype open-architecture device for realizing the improved power-loss measurement technique. The paper describes the device's design features and functionality and presents its outline schematic. The authors note the potential, in addition to assessing the power losses, of transmitting the obtained information to the dispatcher via GSM connection to simplify transformer status monitoring, as well as the capability of integrating the device into the transformer's thermal protection system. The practical merit and application scope of the obtained results lie in the development and choice of optimization measures in distribution electrical grids, e.g. transformer replacement.

  14. Online Sensor Calibration Assessment in Nuclear Power Systems

    International Nuclear Information System (INIS)

    Coble, Jamie B.; Ramuhalli, Pradeep; Meyer, Ryan M.; Hashemian, Hash

    2013-01-01

    Safe, efficient, and economic operation of nuclear systems (nuclear power plants, fuel fabrication and storage, used fuel processing, etc.) relies on the transmission of accurate and reliable measurements. During operation, sensors degrade due to age, environmental exposure, and maintenance interventions. Sensor degradation can affect the measured and transmitted signals, manifesting as sensor failure, signal drift, changes in sensor response time, etc. Currently, periodic sensor recalibration is performed to avoid these problems. Sensor recalibration activities include both calibration assessment and adjustment (if necessary). In nuclear power plants, periodic recalibration of safety-related sensors is required by the plant technical specifications. Recalibration typically occurs during refueling outages (about every 18 to 24 months). Non-safety-related sensors also undergo recalibration, though not as frequently. However, this approach to maintaining sensor calibration and performance is time-consuming and expensive, leading to unnecessary maintenance, increased radiation exposure to maintenance personnel, and potential damage to sensors. Online monitoring (OLM) of sensor performance is a non-invasive approach to assessing instrument calibration. OLM can mitigate many of the limitations of the current periodic recalibration practice by providing more frequent assessment of calibration and identifying those sensors that are operating outside of calibration tolerance limits, without removing sensors or interrupting operation. This can support extended operating intervals for unfaulted sensors and target recalibration efforts to only the degraded sensors.
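
    A minimal sketch (not an actual plant OLM system) of one common online-monitoring idea: compare each redundant channel against a robust estimate of the process value and flag channels whose average deviation exceeds a calibration-acceptance band. Channel values and the tolerance are illustrative assumptions.

        import numpy as np

        readings = np.array([                 # rows = time samples, columns = redundant sensors
            [530.1, 529.8, 531.6],
            [530.4, 530.0, 532.1],
            [530.2, 529.9, 532.4],
        ])                                    # e.g. steam temperature readings
        best_estimate = np.median(readings, axis=1, keepdims=True)
        deviation = (readings - best_estimate).mean(axis=0)   # average bias per channel

        acceptance_band = 1.5                 # assumed calibration tolerance (same units)
        for ch, dev in enumerate(deviation):
            status = "OK" if abs(dev) <= acceptance_band else "recalibration recommended"
            print(f"channel {ch}: bias = {dev:+.2f} -> {status}")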

  15. Assessment of hi-resolution multi-ensemble statistical downscaling regional climate scenarios over Japan

    Science.gov (United States)

    Dairaku, K.

    2017-12-01

    The Asia-Pacific region is increasingly threatened by large-scale natural disasters. There are growing concerns that losses and damages from natural disasters will be further exacerbated by climate change and socio-economic change. Climate information and services for risk assessments are therefore of great concern. Fundamental regional climate information is indispensable for understanding the changing climate and for making decisions on when and how to act. To meet the needs of stakeholders such as national and local governments, spatio-temporally comprehensive and consistent information is necessary and useful for decision making. Multi-model ensemble regional climate scenarios with 1 km horizontal grid spacing over Japan are developed using 37 CMIP5 GCMs (RCP8.5) and a statistical downscaling method (Bias Corrected Spatial Disaggregation, BCSD) to investigate the uncertainty of projected change associated with structural differences of the GCMs, for the historical climate (1950-2005) and the near-future climate (2026-2050). The statistically downscaled regional climate scenarios show good performance for annual and seasonal averages of precipitation and temperature. The regional climate scenarios show a systematic underestimate of extreme events, such as hot days over 35°C and annual maximum daily precipitation, because of the interpolation processes in the BCSD method. Each model projects a different response in the near-future climate because of structural differences. Most of the 37 CMIP5 models show a qualitatively consistent increase of average and extreme temperature and precipitation. The added values of statistical/dynamical downscaling methods are also investigated for locally forced nonlinear phenomena and extreme events.
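
    A minimal sketch of the bias-correction step at the heart of BCSD-type statistical downscaling: empirical quantile mapping of a GCM variable onto the observed distribution for a common historical period. The data here are synthetic and the spatial-disaggregation step is omitted.

        import numpy as np

        rng = np.random.default_rng(3)
        obs_hist = rng.gamma(shape=2.0, scale=3.0, size=5000)    # observed daily precipitation
        gcm_hist = rng.gamma(shape=2.0, scale=4.0, size=5000)    # biased GCM, historical period
        gcm_future = rng.gamma(shape=2.0, scale=4.5, size=5000)  # GCM, future period

        def quantile_map(x, model_hist, obs_hist):
            """Map each model value to the observed value at the same empirical quantile."""
            quantiles = np.searchsorted(np.sort(model_hist), x) / len(model_hist)
            quantiles = np.clip(quantiles, 0.0, 1.0)
            return np.quantile(obs_hist, quantiles)

        corrected_future = quantile_map(gcm_future, gcm_hist, obs_hist)
        print(round(obs_hist.mean(), 2), round(gcm_future.mean(), 2),
              round(corrected_future.mean(), 2))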

  16. Statistical power to detect genetic (co)variance of complex traits using SNP data in unrelated samples.

    Directory of Open Access Journals (Sweden)

    Peter M Visscher

    2014-04-01

    Full Text Available We have recently developed analysis methods (GREML) to estimate the genetic variance of a complex trait/disease and the genetic correlation between two complex traits/diseases using genome-wide single nucleotide polymorphism (SNP) data in unrelated individuals. Here we use analytical derivations and simulations to quantify the sampling variance of the estimate of the proportion of phenotypic variance captured by all SNPs for quantitative traits and case-control studies. We also derive the approximate sampling variance of the estimate of a genetic correlation in a bivariate analysis, when two complex traits are either measured on the same or different individuals. We show that the sampling variance is inversely proportional to the number of pairwise contrasts in the analysis and to the variance in SNP-derived genetic relationships. For bivariate analysis, the sampling variance of the genetic correlation additionally depends on the harmonic mean of the proportion of variance explained by the SNPs for the two traits and the genetic correlation between the traits, and depends on the phenotypic correlation when the traits are measured on the same individuals. We provide an online tool for calculating the power of detecting genetic (co)variation using genome-wide SNP data. The new theory and online tool will be helpful to plan experimental designs to estimate the missing heritability that has not yet been fully revealed through genome-wide association studies, and to estimate the genetic overlap between complex traits (diseases), in particular when the traits (diseases) are not measured on the same samples.
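
    For the univariate case, the record's central result can be written compactly: with N unrelated individuals there are roughly N²/2 pairwise contrasts, and the sampling variance of the SNP-heritability estimate scales inversely with N² and with the variance of the off-diagonal SNP-derived relationships, var(A_ij). An approximate worked form is:

        \operatorname{var}\!\left(\hat{h}^{2}_{\mathrm{SNP}}\right) \approx \frac{2}{N^{2}\,\operatorname{var}(A_{ij})}
        \quad\Longrightarrow\quad
        \mathrm{s.e.}\!\left(\hat{h}^{2}_{\mathrm{SNP}}\right) \approx \frac{316}{N}
        \quad\text{when}\quad \operatorname{var}(A_{ij}) \approx 2\times 10^{-5}.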

  17. Developing new methodology for nuclear power plants vulnerability assessment

    International Nuclear Information System (INIS)

    Kostadinov, Venceslav

    2011-01-01

    Research highlights: → Paper presents a new methodology for vulnerability assessment of nuclear power plants. → First universal quantitative risk assessment model for terrorist attacks on NPPs. → New model enhances the security, reliability and safe operation of energy infrastructure. → Significant research benefits: increased NPP security, reliability and availability. → Useful new tool for PRA application to the evaluation of terrorist threats on NPPs. - Abstract: The fundamental aim of an efficient regulatory emergency preparedness and response system is to provide sustained emergency readiness and to prevent emergency situations and accidents. But when an event occurs, the regulatory mission is to mitigate consequences and to protect people and the environment against nuclear and radiological damage. The regulatory emergency response system, which would be activated in the case of a nuclear and/or radiological emergency and release of radioactivity to the environment, is an important element of a comprehensive national regulatory system of nuclear and radiation safety. In the past, national emergency systems did not explicitly include vulnerability assessments of the critical nuclear infrastructure as an important part of a comprehensive preparedness framework. But after the terrorist attacks of 11 September 2001, decision-makers became aware that critical nuclear infrastructure could also be an attractive target for terrorism, with the purpose of using the physical and radioactive properties of the nuclear material to cause mass casualties, property damage, and detrimental economic and/or environmental impacts. The necessity to evaluate critical nuclear infrastructure vulnerability to threats like human errors, terrorist attacks and natural disasters, as well as to prepare emergency response plans with estimation of optimized costs, is of vital importance for the assurance of safe nuclear facility operation and national security. In this paper presented

  18. Assessment of control rooms of nuclear power plants

    International Nuclear Information System (INIS)

    Norros, L.; Ranta, J.; Wahlstroem, B.

    1983-05-01

    To identify and correct deficiencies in the control rooms of operating power plants and plants under construction, an extensive program has been started in the USA. In Finland, as in other countries using nuclear power, the developments in the USA, particularly with regard to the requirements imposed on nuclear power plants, are carefully followed. Changes in these requirements are sooner or later also reflected in the guidelines given by the Finnish authorities. It is therefore important to be able to form a view of how the new requirements apply to Finnish conditions. In particular, it is important to review the latest assessment guidelines for control room implementation (NUREG-0700); in this way possible over-hasty conclusions can be avoided. The aim of analysing the method and experience presented in the NUREG-0700 report was to create a basis for assessing the suitability of the method for Finnish control room implementation. The task group has made a general methodological analysis of the method and has partly tried it in an assessment of the TVO2 control room. It is obvious that direct conclusions from the American situation would be misleading. Following the American requirements as such must be considered unfeasible, because it could lead to unwanted results. If the review is limited to control room details, the NRC program (checklist) can be considered successful. It can also be used during planning for the observation of small discrepancies. However, the applicability of some requirements can be questioned. More essential is that neither this nor several other programs has captured or standardized the control room as an entity. In spite of the difficulties, we should try to reach this most important goal. (author)

  19. OPLS statistical model versus linear regression to assess sonographic predictors of stroke prognosis.

    Science.gov (United States)

    Vajargah, Kianoush Fathi; Sadeghi-Bazargani, Homayoun; Mehdizadeh-Esfanjani, Robab; Savadi-Oskouei, Daryoush; Farhoudi, Mehdi

    2012-01-01

    The objective of the present study was to assess the comparative applicability of the orthogonal projections to latent structures (OPLS) statistical model versus traditional linear regression in investigating the role of transcranial Doppler (TCD) sonography in predicting ischemic stroke prognosis. The study was conducted on 116 ischemic stroke patients admitted to a specialty neurology ward. The Unified Neurological Stroke Scale was used once for clinical evaluation in the first week of admission and again six months later. All data were primarily analyzed using simple linear regression and later considered for multivariate analysis using PLS/OPLS models through the SIMCA P+12 statistical software package. The linear regression results used for the identification of TCD predictors of stroke prognosis were confirmed through the OPLS modeling technique. Moreover, in comparison to linear regression, the OPLS model appeared to have higher sensitivity in detecting the predictors of ischemic stroke prognosis and detected several more predictors. Applying the OPLS model made it possible to use both single TCD measures/indicators and arbitrarily dichotomized measures of TCD single-vessel involvement, as well as the overall TCD result. In conclusion, the authors recommend PLS/OPLS methods as complementary rather than alternative to the available classical regression models such as linear regression.
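
    scikit-learn has no OPLS implementation, so the sketch below (synthetic data, assumed feature meanings) uses ordinary PLS regression to illustrate the general latent-structure approach; the SIMCA P+12 workflow used in the study, and the orthogonal signal correction that distinguishes OPLS from PLS, are not reproduced here.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(4)
        X = rng.normal(size=(116, 8))          # e.g. TCD measures for 116 patients
        y = X[:, 0] - 0.5 * X[:, 3] + rng.normal(scale=0.8, size=116)  # 6-month outcome score

        pls = PLSRegression(n_components=2)
        pls.fit(X, y)
        print("R^2 on training data:", round(pls.score(X, y), 2))
        print("Predictor weights (first component):", np.round(pls.x_weights_[:, 0], 2))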

  20. Targeting change: Assessing a faculty learning community focused on increasing statistics content in life science curricula.

    Science.gov (United States)

    Parker, Loran Carleton; Gleichsner, Alyssa M; Adedokun, Omolola A; Forney, James

    2016-11-12

    Transformation of research in all biological fields necessitates the design, analysis, and interpretation of large data sets. Preparing students with the requisite skills in experimental design, statistical analysis and interpretation, and mathematical reasoning will require both curricular reform and faculty who are willing and able to integrate mathematical and statistical concepts into their life science courses. A new Faculty Learning Community (FLC) was constituted each year for four years to assist in the transformation of the life sciences curriculum and faculty at a large, Midwestern research university. Participants were interviewed after participation and surveyed before and after participation to assess the impact of the FLC on their attitudes toward teaching, perceived pedagogical skills, and planned teaching practice. Overall, the FLC had a meaningful positive impact on participants' attitudes toward teaching, knowledge about teaching, and perceived pedagogical skills. Interestingly, confidence in viewing the classroom as a site for research about teaching declined. Implications for the creation and development of FLCs for science faculty are discussed. © 2016 by The International Union of Biochemistry and Molecular Biology, 44(6):517-525, 2016.

  1. No-reference image quality assessment based on statistics of convolution feature maps

    Science.gov (United States)

    Lv, Xiaoxin; Qin, Min; Chen, Xiaohui; Wei, Guo

    2018-04-01

    We propose a Convolutional Feature Maps (CFM) driven approach to accurately predict image quality. Our motivation is based on the finding that natural scene statistics (NSS) features computed on convolution feature maps are significantly sensitive to the distortion degree of an image. In our method, a Convolutional Neural Network (CNN) is trained to obtain kernels for generating the CFM. We design a forward NSS layer which operates on the CFM to better extract NSS features. The quality-aware features derived from the output of the NSS layer effectively describe the distortion type and degree an image has suffered. Finally, a Support Vector Regression (SVR) is employed in our No-Reference Image Quality Assessment (NR-IQA) model to predict the subjective quality score of a distorted image. Experiments conducted on two public databases demonstrate that the performance of the proposed method is competitive with state-of-the-art NR-IQA methods.
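
    A minimal sketch of the general pipeline described above (convolution feature maps → statistical features → SVR), using a fixed random filter bank in place of a trained CNN and synthetic quality scores; it is not the authors' NSS layer or training procedure.

        import numpy as np
        from scipy.signal import convolve2d
        from scipy.stats import skew, kurtosis
        from sklearn.svm import SVR

        rng = np.random.default_rng(5)
        filters = rng.normal(size=(8, 5, 5))                  # stand-in for learned CNN kernels

        def feature_vector(image):
            """Simple statistics of each convolution feature map, concatenated."""
            feats = []
            for k in filters:
                fmap = convolve2d(image, k, mode="valid").ravel()
                feats += [fmap.mean(), fmap.std(), skew(fmap), kurtosis(fmap)]
            return np.array(feats)

        # Synthetic training set: noisier images get lower quality scores.
        images, scores = [], []
        for _ in range(60):
            noise_level = rng.uniform(0.0, 1.0)
            images.append(rng.normal(scale=0.2 + noise_level, size=(32, 32)))
            scores.append(1.0 - noise_level)

        X = np.array([feature_vector(im) for im in images])
        model = SVR(kernel="rbf", C=10.0).fit(X, scores)
        print("Predicted quality of the first training image:",
              round(float(model.predict(X[:1])[0]), 2))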

  2. Assessment of tritium breeding requirements for fusion power reactors

    International Nuclear Information System (INIS)

    Jung, J.

    1983-12-01

    This report presents an assessment of tritium-breeding requirements for fusion power reactors. The analysis is based on an evaluation of time-dependent tritium inventories in the reactor system. The method presented can be applied to any fusion system operating in a steady-state mode as well as in a pulsed mode. As an example, the UWMAK-I design was analyzed, and it was found that the startup inventory requirement calculated by the present method differs significantly from those previously calculated. The effect of reactor-parameter changes on the required tritium breeding ratio is also analyzed for a variety of reactor operation scenarios.
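
    As a rough, hedge-labeled illustration only (a textbook-style simplification, not the report's actual model), the kind of time-dependent balance such an assessment evaluates can be written with tritium burn rate N_b, tritium breeding ratio Λ and decay constant λ as:

        \frac{dI}{dt} \;\approx\; (\Lambda - 1)\,\dot{N}_{b} \;-\; \lambda\, I(t).

    The required startup inventory is then whatever keeps I(t) positive until bred tritium is recovered; extraction efficiency, processing holdup and reserve requirements add further terms.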

  3. Selection, competency development and assessment of nuclear power plant managers

    International Nuclear Information System (INIS)

    1998-06-01

    This publication provides information on proven methods and good practices with respect to the selection, development and assessment of nuclear power plant (NPP) managers. The report is organized into four sections, a glossary, two appendices, and several annexes. The Introduction (Section 1) provides the framework for the report. Section 2 describes how appropriate management competencies can be used for the selection, development and assessment of NPP managers, including: -Selection, which includes recruitment, promotion and succession management. -Management development programmes, including formal training, job rotation, on-the-job training, mentoring, and outside assignments. -Assessment of individual performance. Section 3 describes a systematic process for identifying the competencies needed by NPP managers. This section culminates in a set of suggested core competencies for NPP managers, which are further expanded in Appendix A. The annexes provide specific examples of competency-based management selection, development, and assessment programmes in several Member States. -Annex A is one method to organize and display competencies. -Annex B is an example of using competencies for the selection of first-line managers. -Annex C is an example of using management competencies for succession management. -Annexes D-H are examples of management development programmes. -Annexes I and J are examples of management assessment programmes. A glossary of terms is provided at the end of the report to explain the use of some key terms

  4. Statistical modeling of complex health outcomes and air pollution data: Application of air quality health indexing for asthma risk assessment

    Directory of Open Access Journals (Sweden)

    Swarna Weerasinghe

    2017-03-01

    Conclusion: This study demonstrated the importance of using complex statistical models that account for data structures in public health risk assessments, and the consequences of failing to do so.

  5. Statistical and Measurement Properties of Features Used in Essay Assessment. Research Report. ETS RR-04-21

    Science.gov (United States)

    Haberman, Shelby J.

    2004-01-01

    Statistical and measurement properties are examined for features used in essay assessment to determine the generalizability of the features across populations, prompts, and individuals. Data are employed from TOEFL® and GMAT® examinations and from writing for the Criterion service.

  6. A Framework for Assessing the Commercialization of Photovoltaic Power Generation

    Science.gov (United States)

    Yaqub, Mahdi

    An effective framework does not currently exist with which to assess the viability of commercializing photovoltaic (PV) power generation in the US energy market. Adopting a new technology, such as utility-scale PV power generation, requires a commercialization assessment framework. The framework developed here assesses the economic viability of a set of alternatives of identified factors. Economic viability focuses on simulating the levelized cost of electricity (LCOE) as a key performance measure to realize `grid parity', or the equivalence between the PV electricity prices and grid electricity prices for established energy technologies. Simulation results confirm that `grid parity' could be achieved without the current federal 30% investment tax credit (ITC) via a combination of three strategies: 1) using economies of scale to reduce the LCOE by 30% from its current value of 3.6 cents/kWh to 2.5 cents/kWh, 2) employing a longer power purchase agreement (PPA) over 30 years at a 4% interest rate, and 3) improving by 15% the "capacity factor", which is the ratio of the total annual generated energy to the full potential annual generation when the utility is continuously operating at its rated output. The lower than commercial-market interest rate of 4% that is needed to realize `grid parity' is intended to replace the current federal 30% ITC subsidy, which does not have a cash inflow to offset the outflow of subsidy payments. The 4% interest rate can be realized through two proposed finance plans: The first plan involves the implementation of carbon fees on polluting power plants to produce the capital needed to lower the utility PPA loan term interest rate from its current 7% to the necessary 4% rate. The second plan entails a proposed public debt finance plan. Under this plan, the US Government leverages its guarantee power to issue bonds and uses the proceeds to finance the construction and operation of PV power plants with PPA loan with a 4% interest rate for a
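
    A minimal sketch of the levelized cost of electricity calculation used as the key performance measure above, in its generic form (discounted lifetime costs divided by discounted lifetime energy); the cost inputs below are illustrative placeholders, not the study's values.

        # Generic LCOE: discounted lifetime costs divided by discounted lifetime energy.
        def lcoe(capital_cost, annual_om, annual_energy_kwh, discount_rate, years):
            disc = [(1 + discount_rate) ** -t for t in range(1, years + 1)]
            costs = capital_cost + annual_om * sum(disc)
            energy = annual_energy_kwh * sum(disc)
            return costs / energy          # $ per kWh

        # Illustrative numbers only: a 100 MW plant at 25% capacity factor, 30-year PPA at 4%.
        annual_kwh = 100_000 * 8760 * 0.25
        print(round(lcoe(capital_cost=150e6, annual_om=2e6,
                         annual_energy_kwh=annual_kwh,
                         discount_rate=0.04, years=30) * 100, 2), "cents/kWh")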

  7. Multivariate statistical process control in product quality review assessment - A case study.

    Science.gov (United States)

    Kharbach, M; Cherrah, Y; Vander Heyden, Y; Bouklouze, A

    2017-11-01

    According to the Food and Drug Administration and the European Good Manufacturing Practices (GMP) guidelines, the Annual Product Review (APR) is a mandatory GMP requirement. It consists of evaluating a large collection of qualitative or quantitative data in order to verify the consistency of an existing process. According to the Code of Federal Regulations (21 CFR 211.180), all finished products should be reviewed annually against their quality standards to determine the need for any change in the specification or manufacturing of drug products. Conventional Statistical Process Control (SPC) evaluates the pharmaceutical production process by examining the effect of only a single factor at a time using a Shewhart chart; it neglects the interactions between variables. In order to overcome this issue, Multivariate Statistical Process Control (MSPC) can be used. Our case study concerns an APR assessment in which 164 historical batches containing six active ingredients, manufactured in Morocco, were collected during one year. Each batch was checked by assaying the six active ingredients by high-performance liquid chromatography according to European Pharmacopoeia monographs. The data matrix was evaluated by both SPC and MSPC. The SPC indicated that all batches are under control, while the MSPC, based on Principal Component Analysis (PCA) with the data either autoscaled or robustly scaled, showed four and seven batches, respectively, outside the Hotelling T² 95% ellipse. An improvement of the process capability is also observed when the most extreme batches are excluded. MSPC can be used for monitoring subtle changes in the manufacturing process during an APR assessment. Copyright © 2017 Académie Nationale de Pharmacie. Published by Elsevier Masson SAS. All rights reserved.
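
    A minimal sketch (synthetic assay data, not the 164-batch dataset) of the MSPC idea above: autoscale the batches, project them onto a few principal components, and flag batches whose Hotelling T² exceeds the 95% limit.

        import numpy as np
        from scipy.stats import f

        rng = np.random.default_rng(6)
        X = rng.normal(100, 2, size=(164, 6))          # % label claim of 6 active ingredients
        X[10] += [4, -4, 3, -3, 2, -2]                 # one deliberately atypical batch

        Xc = (X - X.mean(axis=0)) / X.std(axis=0)      # autoscaling
        U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
        k = 2                                          # retained principal components
        scores = Xc @ Vt[:k].T
        t2 = np.sum((scores / scores.std(axis=0)) ** 2, axis=1)

        n = X.shape[0]
        limit = k * (n - 1) / (n - k) * f.ppf(0.95, k, n - k)   # Hotelling T2 95% limit
        print("Batches outside the 95% limit:", np.where(t2 > limit)[0])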

  8. Performance assessment of topologically diverse power systems subjected to hurricane events

    International Nuclear Information System (INIS)

    Winkler, James; Duenas-Osorio, Leonardo; Stein, Robert; Subramanian, Devika

    2010-01-01

    Large tropical cyclones cause severe damage to major cities along the United States Gulf Coast annually. A diverse collection of engineering and statistical models is currently used to estimate the geographical distribution of power outage probabilities stemming from these hurricanes, to aid in storm preparedness and recovery efforts. Graph-theoretic studies of power networks have separately attempted to link abstract network topology to transmission and distribution system reliability. However, few works have employed both techniques to unravel the intimate connection between network damage arising from storms, topology, and system reliability. This investigation presents a new methodology combining hurricane damage predictions and topological assessment to characterize the impact of hurricanes upon power system reliability. Component fragility models are applied to predict the failure probability of individual transmission and distribution network elements simultaneously. The damage model is calibrated using power network component failure data for Harris County, TX, USA caused by Hurricane Ike in September 2008, resulting in a mean outage prediction error of 15.59% with low standard deviation. Simulated hurricane events are then applied to measure the hurricane reliability of three topologically distinct transmission networks. The rate of system performance decline is shown to depend on their topological structure. Reliability is found to correlate directly with topological features, such as network meshedness, centrality, and clustering, and the compact irregular ring mesh topology is identified as particularly favorable, which can inform regional lifeline policy for retrofit and hardening activities to withstand hurricane events.
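
    A minimal sketch (a toy five-edge test graph, not the study's transmission models) of computing the topological measures named above with networkx; the meshedness coefficient formula shown assumes a planar graph.

        import networkx as nx

        G = nx.Graph()
        G.add_edges_from([(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)])   # small meshed test network

        n, m = G.number_of_nodes(), G.number_of_edges()
        meshedness = (m - n + 1) / (2 * n - 5)        # independent loops / max loops (planar graph)
        clustering = nx.average_clustering(G)
        centrality = nx.betweenness_centrality(G)

        print(f"meshedness = {meshedness:.2f}, average clustering = {clustering:.2f}")
        print("betweenness centrality:", {k: round(v, 2) for k, v in centrality.items()})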

  9. Electric loading of power transmission lines - statistical methodology and real time management; Carregamento eletrico de linhas de transmissao - metodologia estatistica e gerenciamento em tempo real

    Energy Technology Data Exchange (ETDEWEB)

    Diniz, Jose H.; Leao, Sergio L.C.; Paim, Oswaldo; Martins, Jose A.; Costa, Jonas A. da [Companhia Energetica de Minas Gerais (CEMIG), Belo Horizonte, MG (Brazil)

    1991-12-31

    This paper presents a computerized system that adopts a statistical methodology together with simultaneous measurements of wind velocity and ambient temperature, based on experience with power transmission lines equipped with environmental monitoring systems. 9 figs., 16 refs.

  10. Assessment of the Lillgrund Windfarm, Power Performance and Wake Effects. Lillgrund Pilot Project

    Energy Technology Data Exchange (ETDEWEB)

    Dahlberg, Jan-Aake (Vattenfall Vindkraft AB, Stockholm (Sweden))

    2009-06-15

    In this report, an assessment of the power performance of the individual turbines, as well as of the whole wind farm, Lillgrund, is presented. Using the nearby meteorological mast, a power performance assessment in line with international standards has been carried out for three turbines located close to the met mast, as well as for the whole farm. The derived power curves for the single turbines are almost identical to, and slightly better than, the power curves given in WindPro. The assessment of the power performance of the whole wind farm resulted in an average power curve that is significantly lower than the power curve for an undisturbed turbine. The overall energy efficiency of the farm, calculated from the measured wind farm power curve and assuming a Rayleigh-distributed wind speed with an annual average value of 8.0 m/s, gives an efficiency of about 77%. The losses are relatively high, which is not surprising for such a dense wind farm configuration. The Lillgrund wind farm has a very dense configuration, and it is therefore of great interest to investigate how shading (wake) effects influence the production. The main objective of the project has been to analyze the power output of the whole wind farm for different wind directions and wind speeds and thus to identify and quantify the wake effects. Shading effects are defined as the power ratio between the power output of one or more selected object turbines and the power levels of one or more reference turbines located upstream. Shading effects have been identified for a number of cases and can be clearly demonstrated. Significant wake effects occur when the wind is blowing along a row of turbines. The maximum peak loss occurs for the second turbine in the row and is typically 70% for an inter-row spacing of 4.4xD and typically 80% for a row spacing of 3.3xD. One assumption that has been adopted is that power reduction only occurs for production below rated wind speed.
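
    A minimal sketch of the kind of calculation behind the quoted efficiency figure: weight a power curve by a Rayleigh wind-speed distribution with an 8.0 m/s annual mean and compare farm-average with free-stream annual energy. The power-curve shape and the assumed wake-loss factor below are illustrative, not Lillgrund data.

        import numpy as np

        v = np.linspace(0.0, 25.0, 251)                       # wind speed bins (m/s)
        mean_v = 8.0
        rayleigh = (np.pi * v / (2 * mean_v**2)) * np.exp(-np.pi * v**2 / (4 * mean_v**2))

        def toy_power_curve(v, rated_kw=2300.0, v_in=3.5, v_rated=12.5, v_out=25.0):
            p = np.zeros_like(v)
            ramp = (v >= v_in) & (v < v_rated)
            p[ramp] = rated_kw * ((v[ramp] - v_in) / (v_rated - v_in)) ** 3
            p[(v >= v_rated) & (v <= v_out)] = rated_kw
            return p

        p_free = toy_power_curve(v)                           # undisturbed turbine
        p_farm = np.where(v < 12.5, 0.77 * p_free, p_free)    # assumed wake losses below rated

        aep_free = np.trapz(p_free * rayleigh, v) * 8760 / 1e6    # GWh per turbine per year
        aep_farm = np.trapz(p_farm * rayleigh, v) * 8760 / 1e6
        print(f"free-stream AEP = {aep_free:.1f} GWh, farm-average AEP = {aep_farm:.1f} GWh, "
              f"efficiency = {aep_farm / aep_free:.0%}")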

  11. Developing and assessing accident management plans for nuclear power plants

    International Nuclear Information System (INIS)

    Hanson, D.J.; Johnson, S.P.; Blackman, H.S.; Stewart, M.A.

    1992-07-01

    This document is the second of a two-volume NUREG/CR that discusses development of accident management plans for nuclear power plants. The first volume (a) describes a four-phase approach for developing criteria that could be used for assessing the adequacy of accident management plans, (b) identifies the general attributes of accident management plans (Phase 1), (c) presents a prototype process for developing and implementing severe accident management plans (Phase 2), and (d) presents criteria that can be used to assess the adequacy of accident management plans. This volume (a) describes results from an evaluation of the capabilities of the prototype process to produce an accident management plan (Phase 3) and (b), based on these results and preliminary criteria included in NUREG/CR-5543, presents modifications to the criteria where appropriate

  12. Power plant system assessment. Final report. SP-100 Program

    International Nuclear Information System (INIS)

    Anderson, R.V.; Atkins, D.F.; Bost, D.S.

    1983-01-01

    The purpose of this assessment was to provide system-level insights into 100-kWe-class space reactor electric systems. Using these insights, Rockwell was to select and perform conceptual design studies on a "most attractive" system that met the preliminary design goals and requirements of the SP-100 Program. About 4 of the 6 months were used in the selection process. The remaining 2 months were used for the system conceptual design studies. Rockwell completed these studies at the end of FY 1983. This report summarizes the results of the power plant system assessment and describes our choice for the most attractive system: the Rockwell SR-100G System (Space Reactor, 100 kWe, Growth), a lithium-cooled, UN-fueled fast reactor/Brayton turboelectric converter system.

  13. Application of statistical parametric mapping to SPET in the assessment of intractable childhood epilepsy

    International Nuclear Information System (INIS)

    Bruggemann, Jason M.; Lawson, John A.; Cunningham, Anne M.; Som, Seu S.; Haindl, Walter; Bye, Ann M.E.

    2004-01-01

    Statistical parametric mapping (SPM) quantification and analysis has been successfully applied to functional imaging studies of partial epilepsy syndromes in adults. The present study evaluated whether localisation of the epileptogenic zone (determined by SPM) improves upon visually examined single-photon emission tomography (SPET) imaging in presurgical assessment of children with temporal lobe epilepsy (TLE) and frontal lobe epilepsy (FLE). The patient sample consisted of 24 children (15 males) aged 2.1-17.8 years (9.8±4.3 years; mean±SD) with intractable TLE or FLE. SPET imaging was acquired routinely in presurgical evaluation. All patient images were transformed into the standard stereotactic space of the adult SPM SPET template prior to SPM statistical analysis. Individual patient images were contrasted with an adult control group of 22 healthy adult females. Resultant statistical parametric maps were rendered over the SPM canonical magnetic resonance imaging (MRI). Two corresponding sets of ictal and interictal SPM and SPET images were then generated for each patient. Experienced clinicians independently reviewed the image sets, blinded to clinical details. Concordance of the reports between SPM and SPET images, syndrome classification and MRI abnormality was studied. A fair level of inter-rater reliability (kappa=0.73) was evident for SPM localisation. SPM was concordant with SPET in 71% of all patients, the majority of the discordance being from the FLE group. SPM and SPET localisation were concordant with epilepsy syndrome in 80% of the TLE cases. Concordant localisation to syndrome was worse for both SPM (33%) and SPET (44%) in the FLE group. Data from a small sample of patients with varied focal structural pathologies suggested that SPM performed poorly relative to SPET in these cases. Concordance of SPM and SPET with syndrome was lower in patients younger than 6 years than in those aged 6 years and above. SPM is effective in localising the potential

  14. Application of statistical parametric mapping to SPET in the assessment of intractable childhood epilepsy

    Energy Technology Data Exchange (ETDEWEB)

    Bruggemann, Jason M.; Lawson, John A.; Cunningham, Anne M. [Department of Neurology, Sydney Children's Hospital and School of Women's and Children's Health, Faculty of Medicine, University of New South Wales, Randwick, New South Wales (Australia); Som, Seu S.; Haindl, Walter [Department of Nuclear Medicine, Prince of Wales Hospital, Randwick, New South Wales (Australia); Bye, Ann M.E. [Department of Neurology, Sydney Children's Hospital and School of Women's and Children's Health, Faculty of Medicine, University of New South Wales, Randwick, New South Wales (Australia); Department of Neurology, Sydney Children's Hospital, High Street, 2031, Randwick, NSW (Australia)

    2004-03-01

    Statistical parametric mapping (SPM) quantification and analysis has been successfully applied to functional imaging studies of partial epilepsy syndromes in adults. The present study evaluated whether localisation of the epileptogenic zone (determined by SPM) improves upon visually examined single-photon emission tomography (SPET) imaging in presurgical assessment of children with temporal lobe epilepsy (TLE) and frontal lobe epilepsy (FLE). The patient sample consisted of 24 children (15 males) aged 2.1-17.8 years (9.8±4.3 years; mean±SD) with intractable TLE or FLE. SPET imaging was acquired routinely in presurgical evaluation. All patient images were transformed into the standard stereotactic space of the adult SPM SPET template prior to SPM statistical analysis. Individual patient images were contrasted with an adult control group of 22 healthy adult females. Resultant statistical parametric maps were rendered over the SPM canonical magnetic resonance imaging (MRI). Two corresponding sets of ictal and interictal SPM and SPET images were then generated for each patient. Experienced clinicians independently reviewed the image sets, blinded to clinical details. Concordance of the reports between SPM and SPET images, syndrome classification and MRI abnormality was studied. A fair level of inter-rater reliability (kappa=0.73) was evident for SPM localisation. SPM was concordant with SPET in 71% of all patients, the majority of the discordance being from the FLE group. SPM and SPET localisation were concordant with epilepsy syndrome in 80% of the TLE cases. Concordant localisation to syndrome was worse for both SPM (33%) and SPET (44%) in the FLE group. Data from a small sample of patients with varied focal structural pathologies suggested that SPM performed poorly relative to SPET in these cases. Concordance of SPM and SPET with syndrome was lower in patients younger than 6 years than in those aged 6 years and above. SPM is effective in localising the

  15. A statistical analysis to assess the maturity and stability of six composts.

    Science.gov (United States)

    Komilis, Dimitrios P; Tziouvaras, Ioannis S

    2009-05-01

    Despite the long-standing application of organic-waste-derived composts to crops, there is still no universally accepted index to assess compost maturity and stability. The research presented in this article investigated the suitability of seven types of seeds for use in germination bioassays to assess the maturity and phytotoxicity of six composts. The composts used in the study were derived from cow manure, seaweeds, olive pulp, poultry manure and municipal solid waste. The seeds used in the germination bioassays were radish, pepper, spinach, tomato, cress, cucumber and lettuce. Data were analyzed with an analysis of variance at two levels and with pair-wise comparisons. The analysis revealed that composts deemed phytotoxic to one type of seed could enhance the growth of another type of seed. Therefore, germination indices, which ranged from 0% to 262%, were highly dependent on the type of seed used in the germination bioassay. The poultry manure compost was highly phytotoxic to all seeds. At the 99% confidence level, the type of seed and the interaction between the seeds and the composts were found to significantly affect germination. In addition, the stability of the composts was assessed by their microbial respiration, which ranged from approximately 4 to 16 g O2/kg organic matter and from 2.6 to approximately 11 g CO2-C/kg C after seven days. Initial average oxygen uptake rates were all less than approximately 0.35 g O2/kg organic matter/h for all six composts. A high and statistically significant correlation coefficient was calculated between the cumulative carbon dioxide production over a 7-day period and the radish seed germination index. It appears that a germination bioassay with radish can be a valid test to assess both compost stability and compost phytotoxicity.
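
    The record does not give the exact germination-index formula; a commonly used definition (relative seed germination multiplied by relative root elongation) is assumed in the sketch below, with invented example values, to illustrate how such an index can be correlated with cumulative CO2 production.

```python
# Sketch: a commonly used germination index (GI) and its correlation with
# cumulative CO2 production. Formula and numbers are illustrative assumptions,
# not the authors' exact protocol or data.
import numpy as np

def germination_index(germ_sample, root_len_sample, germ_control, root_len_control):
    """GI (%) = relative seed germination x relative root elongation x 100."""
    return (germ_sample / germ_control) * (root_len_sample / root_len_control) * 100.0

# Hypothetical radish results for six composts (placeholder values)
gi = np.array([germination_index(g, r, 20, 35.0)
               for g, r in [(18, 40.0), (19, 33.0), (15, 25.0),
                            (5, 8.0), (20, 52.0), (17, 30.0)]])
co2 = np.array([4.1, 5.0, 7.2, 10.8, 2.6, 6.3])   # g CO2-C/kg C over 7 days

r = np.corrcoef(co2, gi)[0, 1]
print("GI (%):", np.round(gi, 1))
print(f"Pearson r between CO2 evolution and radish GI: {r:.2f}")
```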

  16. Testing University Rankings Statistically: Why this Perhaps is not such a Good Idea after All. Some Reflections on Statistical Power, Effect Size, Random Sampling and Imaginary Populations

    DEFF Research Database (Denmark)

    Schneider, Jesper Wiborg

    2012-01-01

    In this paper we discuss and question the use of statistical significance tests in relation to university rankings, as recently suggested. We outline the assumptions behind and interpretations of statistical significance tests and relate this to examples from the recent SCImago Institutions Rankings…

  17. Aging assessment of surge protective devices in nuclear power plants

    International Nuclear Information System (INIS)

    Davis, J.F.; Subudhi, M.; Carroll, D.P.

    1996-01-01

    An assessment was performed to determine the effects of aging on the performance and availability of surge protective devices (SPDs) used in electrical power and control systems in nuclear power plants. Although SPDs have not been classified as safety-related, they are risk-important because they can minimize the initiating event frequencies associated with loss of offsite power and reactor trips. Conversely, their failure due to age might cause some of those initiating events (e.g., through short-circuit failure modes) or, through an open-circuit failure mode, allow deterioration of the safety-related component(s) they protect from overvoltages, perhaps preventing a reactor trip. From the data evaluated during 1980-1994, it was found that failures of surge arresters and suppressors by short circuits were neither a significant risk nor a safety concern, and there were no failures of surge suppressors that prevented a reactor trip. Simulations using the ElectroMagnetic Transients Program (EMTP) were performed to determine the adequacy of high-voltage surge arresters.

  18. Cost/benefit assessment in electric power systems

    International Nuclear Information System (INIS)

    Oteng-Adjei, J.

    1990-01-01

    The basic function of a modern power system is to satisfy the system load requirements as economically as possible and with a reasonable assurance of continuity and quality. The question of what is reasonable can be examined in terms of the costs, and the worth to the consumer, associated with providing an adequate supply. The process of preparing reliability worth estimates based on customer cost-of-interruption data is presented. These data can be derived for a particular utility service area and are used to determine appropriate customer damage functions. These damage functions can be used with the basic loss of energy expectation (LOEE) index to obtain a factor that relates customer losses to the worth of electric service reliability. This factor is designated the interrupted energy assessment rate (IEAR). The developed IEAR values can be utilized in both generating capacity assessment and composite generation and transmission system assessment. Methods for using these estimates in power system optimization at the planning stage are described, and examples are used to illustrate the procedures. 106 refs., 77 figs., 64 tabs
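
    A minimal sketch of how an interrupted energy assessment rate (IEAR) can be formed from expected interruption costs and the LOEE index, and then used to value a reliability improvement; all figures are invented placeholders, not values from the record above.

```python
# Sketch: interrupted energy assessment rate (IEAR) and reliability worth.
# All numbers are placeholders for illustration, not values from the thesis.

# Expected customer interruption cost from customer damage functions [$ per year]
expected_interruption_cost = 2.4e6

# Loss of energy expectation (LOEE), i.e. expected energy not supplied [MWh per year]
loee = 600.0

iear = expected_interruption_cost / loee          # $/MWh of unserved energy
print(f"IEAR = {iear:.0f} $/MWh")

# Using IEAR to compare two planning alternatives that change LOEE:
loee_with_new_unit = 250.0                        # MWh/year after reinforcement
annual_worth = iear * (loee - loee_with_new_unit)
print(f"annual reliability worth of the reinforcement ≈ ${annual_worth:,.0f}")
```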

  19. Assessment of electrical equipment aging for nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-08-15

    Electrical and instrumentation equipment, especially equipment whose parts are made of polymer materials, is gradually degraded by the thermal and radiation environment during normal operation, and the degradation is thought to progress rapidly when the equipment is exposed to the environment of a design basis event (DBE). The integrity of the equipment is evaluated by environmental qualification (EQ) testing that simulates the environments of normal operation and the DBE. The project 'Assessment of Cable Aging for Nuclear Power Plants' (ACA, 2002-2008) indicated the importance of applying simultaneous thermal and radiation aging to simulate aging in normal operation. The project 'Assessment of Electrical Equipment Aging for Nuclear Power Plants' (AEA) was initiated in FY2008 to apply the outcome of ACA to other electrical and instrumentation equipment and to establish an advanced EQ test method that can appropriately simulate the environment in actual plants. In FY2012, aging characteristics under thermal aging and simultaneous aging were obtained for the epoxy resin of electrical penetrations and the O-rings of connectors. Physical property measurements were carried out for the epoxy resin of the electrical penetrations subjected to type testing in FY2010. (author)
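
    Thermal pre-aging for EQ testing is commonly based on Arrhenius time-temperature equivalence. The sketch below illustrates that calculation; the activation energy and temperatures are placeholder assumptions, not parameters from the AEA project.

```python
# Sketch: Arrhenius time-temperature equivalence, the usual basis for choosing
# accelerated thermal-aging conditions in EQ testing. The activation energy and
# temperatures below are placeholders, not values from the AEA project.
import math

K_B = 8.617e-5          # Boltzmann constant [eV/K]

def accelerated_hours(service_years, t_service_c, t_aging_c, ea_ev):
    """Oven hours at t_aging_c equivalent to service_years at t_service_c."""
    t_s = t_service_c + 273.15
    t_a = t_aging_c + 273.15
    accel = math.exp((ea_ev / K_B) * (1.0 / t_s - 1.0 / t_a))
    return service_years * 8760.0 / accel

hours = accelerated_hours(service_years=40, t_service_c=50, t_aging_c=120, ea_ev=1.0)
print(f"equivalent oven time ≈ {hours:.0f} h")
```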

  20. Assessment of electrical equipment aging for nuclear power plant

    International Nuclear Information System (INIS)

    2013-01-01

    Electrical and instrumentation equipment, especially equipment whose parts are made of polymer materials, is gradually degraded by the thermal and radiation environment during normal operation, and the degradation is thought to progress rapidly when the equipment is exposed to the environment of a design basis event (DBE). The integrity of the equipment is evaluated by environmental qualification (EQ) testing that simulates the environments of normal operation and the DBE. The project 'Assessment of Cable Aging for Nuclear Power Plants' (ACA, 2002-2008) indicated the importance of applying simultaneous thermal and radiation aging to simulate aging in normal operation. The project 'Assessment of Electrical Equipment Aging for Nuclear Power Plants' (AEA) was initiated in FY2008 to apply the outcome of ACA to other electrical and instrumentation equipment and to establish an advanced EQ test method that can appropriately simulate the environment in actual plants. In FY2012, aging characteristics under thermal aging and simultaneous aging were obtained for the epoxy resin of electrical penetrations and the O-rings of connectors. Physical property measurements were carried out for the epoxy resin of the electrical penetrations subjected to type testing in FY2010. (author)

  1. Force-Velocity-Power Assessment in Semiprofessional Rugby Union Players.

    Science.gov (United States)

    McMaster, Daniel T; Gill, Nicholas D; Cronin, John B; McGuigan, Michael R

    2016-04-01

    There is a constant and necessary evolution of training and assessment methods in the elite contact sports, required to continually improve the physical qualities of these athletes to match the growing sport- and position-specific performance demands. Our aim was to examine the differences between ballistic upper body performance profiles and maximum upper body strength of elite rugby union forwards and backs. Twenty semiprofessional male rugby union players (age = 21.1 ± 3.0 years; mass = 94.9 ± 9.7 kg) were assessed for maximum bench press strength (1RM bench press = 121.3 ± 21.8 kg) and maximum throw power (Pmax), force (Fmax), and velocity (V̇max) from an incremental relative load testing protocol (15, 30, 45, 60, and 75% 1RM). Player rankings were also included to identify individual strengths and weaknesses. The forwards were moderately stronger (effect size [ES] = 0.96; p = 0.01), produced significantly greater Fmax (ES = 1.17-1.41; p = 0.01) and were more powerful (ES = 0.57-0.64; p 0.15). There were inherent differences in strength and Fmax between the forwards and backs, most likely because of the physical demands of these respective positions. Improvements in upper body strength may in turn improve ballistic force and power production, but not necessarily velocity capabilities. From the Fmax and V̇max observations, the forwards seem to be more force dominant and the backs more velocity dominant. Pmax, Fmax, and V̇max may be used to highlight proficient and deficient areas in ballistic upper body performance; the individual rankings could be further used to identify and possibly rectify individual deficiencies.
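
    One common way to summarise such incremental-load throw data is a linear force-velocity profile, with Pmax estimated as F0·V0/4. The sketch below assumes that approach with invented data points; it is not the study's analysis.

```python
# Sketch: a simple force-velocity profile from incremental-load throw data.
# Assumes an approximately linear F-v relationship, with Pmax = F0 * V0 / 4.
# The data points are invented placeholders, not the study's measurements.
import numpy as np

force = np.array([380.0, 520.0, 650.0, 780.0, 900.0])       # mean throw force [N]
velocity = np.array([2.60, 2.10, 1.65, 1.15, 0.70])          # mean throw velocity [m/s]

slope, intercept = np.polyfit(velocity, force, 1)            # F = intercept + slope * v
f0 = intercept                                               # theoretical max force [N]
v0 = -intercept / slope                                      # theoretical max velocity [m/s]
p_max = f0 * v0 / 4.0                                        # apex of the P-v parabola [W]

print(f"F0 ≈ {f0:.0f} N, V0 ≈ {v0:.2f} m/s, Pmax ≈ {p_max:.0f} W")
```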

  2. Use assessment of electronic power sources for SMAW

    Directory of Open Access Journals (Sweden)

    Scotti, A.

    1999-04-01

    Full Text Available The aim of the present work was to assess the efficacy of using modern power supply technologies in Shielded Metal Arc Welding (SMAW). Test coupons were welded using a series of five different classes of commercial electrodes, covering their current ranges. Both a conventional electromagnetic power source and an electronic (inverter) power source were employed. Fusion rate, deposition efficiency, bead finish and weld geometry were measured in each experiment. Current and voltage signals were acquired at a high sampling rate to evaluate the dynamic behavior of the power sources. The static performance of both power sources was also determined. The results showed that, despite the remarkable differences between the power supplies revealed by the static and dynamic characterizations, no significant difference was noticed in the operational behavior of the electrodes under the given conditions, apart from a better anti-stick performance obtained with the electronic power source.

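
    The quantities named above (fusion rate, deposition efficiency) reduce to simple mass and time ratios. A minimal sketch with invented measurement values follows.

```python
# Sketch: fusion rate and deposition efficiency from coupon-test measurements.
# The measurement values are invented placeholders for illustration.

electrode_mass_consumed_g = 28.0     # electrode (core + coating) melted during the pass
deposited_mass_g = 18.5              # weld-metal mass gained by the coupon
arc_time_s = 65.0

fusion_rate_kg_per_h = (electrode_mass_consumed_g / 1000.0) / (arc_time_s / 3600.0)
deposition_efficiency = deposited_mass_g / electrode_mass_consumed_g

print(f"fusion rate ≈ {fusion_rate_kg_per_h:.2f} kg/h")
print(f"deposition efficiency ≈ {deposition_efficiency:.0%}")
```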

  3. Using a statistical process control chart during the quality assessment of cancer registry data.

    Science.gov (United States)

    Myles, Zachary M; German, Robert R; Wilson, Reda J; Wu, Manxia

    2011-01-01

    Statistical process control (SPC) charts may be used to detect acute variations in the data while simultaneously evaluating unforeseen aberrations that may warrant further investigation by the data user. Using cancer stage data captured by the Summary Stage 2000 (SS2000) variable, we sought to present a brief report highlighting the utility of the SPC chart during the quality assessment of cancer registry data. Using a county-level caseload for the diagnosis period 2001-2004 (n=25,648), we found the overall variation of the SS2000 variable to be in control during the 2001 and 2002 diagnosis years, to exceed the lower control limit (LCL) in 2003, and to exceed the upper control limit (UCL) in 2004; in situ/localized stages were in control throughout the diagnosis period, regional stage exceeded the UCL in 2004, and distant stage exceeded the LCL in 2001 and the UCL in 2004. Our application of the SPC chart to cancer registry data illustrates that the SPC chart may serve as a readily available and timely tool for identifying areas of concern during the data collection and quality assessment of central cancer registry data.
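
    A minimal sketch of a proportion (p) control chart with 3-sigma limits, of the kind described above; the yearly counts are invented placeholders, not SS2000 registry data.

```python
# Sketch: a p-chart (proportion control chart) with 3-sigma limits, of the kind
# used above for the SS2000 stage variable. Counts are invented placeholders.
import numpy as np

years = [2001, 2002, 2003, 2004]
n = np.array([6200, 6350, 6450, 6648])        # cases abstracted per year
x = np.array([496, 508, 452, 635])            # cases flagged (e.g. a given stage)

p_hat = x / n
p_bar = x.sum() / n.sum()                     # center line
sigma = np.sqrt(p_bar * (1 - p_bar) / n)      # per-year standard error
ucl = p_bar + 3 * sigma
lcl = np.clip(p_bar - 3 * sigma, 0, None)

for yr, p, lo, hi in zip(years, p_hat, lcl, ucl):
    flag = "out of control" if (p > hi or p < lo) else "in control"
    print(f"{yr}: p={p:.3f}  LCL={lo:.3f}  UCL={hi:.3f}  -> {flag}")
```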

  4. The use of test scores from large-scale assessment surveys: psychometric and statistical considerations

    Directory of Open Access Journals (Sweden)

    Henry Braun

    2017-11-01

    Full Text Available Background: Economists are making increasing use of measures of student achievement obtained through large-scale survey assessments such as NAEP, TIMSS, and PISA. The construction of these measures, employing plausible value (PV) methodology, is quite different from that of the more familiar test scores associated with assessments such as the SAT or ACT. These differences have important implications both for utilization and interpretation. Although much has been written about PVs, it appears that there are still misconceptions about whether and how to employ them in secondary analyses. Methods: We address a range of technical issues, including those raised in a recent article that was written to inform economists using these databases. First, an extensive review of the relevant literature was conducted, with particular attention to key publications that describe the derivation and psychometric characteristics of such achievement measures. Second, a simulation study was carried out to compare the statistical properties of estimates based on the use of PVs with those based on other, commonly used methods. Results: It is shown, through both theoretical analysis and simulation, that under fairly general conditions appropriate use of PVs yields approximately unbiased estimates of model parameters in regression analyses of large-scale survey data. The superiority of the PV methodology is particularly evident when measures of student achievement are employed as explanatory variables. Conclusions: The PV methodology used to report student test performance in large-scale surveys remains the state-of-the-art for secondary analyses of these databases.
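
    Secondary analyses with plausible values are usually combined with Rubin's rules: average the per-PV estimates and add the between-PV variance to the within-PV variance. A minimal sketch with invented estimates follows; it is not the authors' simulation code.

```python
# Sketch: combining a regression coefficient across M plausible values (PVs)
# with Rubin's rules. The per-PV estimates below are invented placeholders.
import numpy as np

# Suppose the same regression was run once per plausible value:
beta_m = np.array([12.4, 11.8, 12.9, 12.1, 12.6])     # point estimates, one per PV
var_m = np.array([1.10, 1.05, 1.15, 1.08, 1.12])      # sampling variances, one per PV

M = len(beta_m)
beta_bar = beta_m.mean()                   # combined point estimate
W = var_m.mean()                           # within-imputation variance
B = beta_m.var(ddof=1)                     # between-imputation variance
total_var = W + (1.0 + 1.0 / M) * B        # Rubin's total variance

print(f"beta = {beta_bar:.2f}, SE = {np.sqrt(total_var):.2f}")
```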

  5. Communicating Conservation Status: How Different Statistical Assessment Criteria Affect Perceptions of Extinction Risk.

    Science.gov (United States)

    Song, Hwanseok; Schuldt, Jonathon P

    2017-09-01

    Although alternative forms of statistical and verbal information are routinely used to convey species' extinction risk to policymakers and the public, little is known about their effects on audience information processing and risk perceptions. To address this gap in literature, we report on an experiment that was designed to explore how perceptions of extinction risk differ as a function of five different assessment benchmarks (Criteria A-E) used by scientists to classify species within IUCN Red List risk levels (e.g., Critically Endangered, Vulnerable), as well as the role of key individual differences in these effects (e.g., rational and experiential thinking styles, environmental concern). Despite their normative equivalence within the IUCN classification system, results revealed divergent effects of specific assessment criteria: on average, describing extinction risk in terms of proportional population decline over time (Criterion A) and number of remaining individuals (Criterion D) evoked the highest level of perceived risk, whereas the single-event probability of a species becoming extinct (Criterion E) engendered the least perceived risk. Furthermore, participants scoring high in rationality (analytic thinking) were less prone to exhibit these biases compared to those low in rationality. Our findings suggest that despite their equivalence in the eyes of scientific experts, IUCN criteria are indeed capable of engendering different levels of risk perception among lay audiences, effects that carry direct and important implications for those tasked with communicating about conservation status to diverse publics. © 2016 Society for Risk Analysis.

  6. Assessing the economic wind power potential in Austria

    International Nuclear Information System (INIS)

    Gass, Viktoria; Schmidt, Johannes; Strauss, Franziska; Schmid, Erwin

    2013-01-01

    In the European Union, electricity production from wind energy is projected to increase by approximately 16% until 2020. The Austrian energy plan aims at increasing the currently installed wind power capacity from approximately 1 GW to 3 GW by 2020, including an additional capacity of 700 MW by 2015. The aim of this analysis is to assess economically viable wind turbine sites under current feed-in tariffs, considering constraints imposed by infrastructure, the natural environment and ecological preservation zones in Austria. We analyze whether the policy target of installing an additional wind power capacity of 700 MW by 2015 is attainable under current legislation and develop a GIS-based decision system for wind turbine site selection. Results show that the current feed-in tariff of 9.7 ct/kWh may trigger an additional installation of 3544 MW. The current feed-in tariff can therefore be considered too high, as wind power deployment would exceed the target by far. Our results indicate that the targets may be attained more cost-effectively by applying a lower feed-in tariff of 9.1 ct/kWh. Thus, windfall profits at favorable sites and deadweight losses of policy intervention can be minimized while still guaranteeing the deployment of additional wind power capacity. Highlights: wind supply curves with high spatial resolution are derived for the whole of Austria; the current feed-in tariff is higher than necessary to attain the targets; previous feed-in tariffs were too low to achieve the targets; the current support scheme leads to high social welfare losses; policy makers face high information asymmetry when setting feed-in tariffs.
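
    Site-level economic viability under a feed-in tariff can be screened with a simple discounted cash-flow comparison. The sketch below assumes illustrative cost and yield figures; it is not the GIS decision system described above.

```python
# Sketch: screening the economic viability of one candidate turbine site under a
# fixed feed-in tariff. All cost and yield figures are invented placeholders.

def site_npv(aep_mwh, tariff_eur_per_kwh, capex_eur, opex_eur_per_yr,
             lifetime_yr=20, discount_rate=0.06):
    """Net present value of a site: discounted tariff revenue minus costs."""
    annual_net = aep_mwh * 1000.0 * tariff_eur_per_kwh - opex_eur_per_yr
    npv = -capex_eur
    for t in range(1, lifetime_yr + 1):
        npv += annual_net / (1.0 + discount_rate) ** t
    return npv

npv_97 = site_npv(aep_mwh=6500, tariff_eur_per_kwh=0.097,
                  capex_eur=4.5e6, opex_eur_per_yr=120e3)
npv_91 = site_npv(aep_mwh=6500, tariff_eur_per_kwh=0.091,
                  capex_eur=4.5e6, opex_eur_per_yr=120e3)
print(f"NPV at 9.7 ct/kWh: {npv_97:,.0f} EUR; at 9.1 ct/kWh: {npv_91:,.0f} EUR")
```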

  7. Empirical assessment of published effect sizes and power in the recent cognitive neuroscience and psychology literature

    OpenAIRE

    Szucs, Denes; Ioannidis, JPA

    2017-01-01

    Author summary Biomedical science, psychology, and many other fields may be suffering from a serious replication crisis. In order to gain insight into some factors behind this crisis, we have analyzed statistical information extracted from thousands of cognitive neuroscience and psychology research papers. We established that the statistical power to discover existing relationships has not improved during the past half century. A consequence of low statistical power is that research studies a...
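
    A minimal sketch of the kind of quantity surveyed above: the approximate power of a two-sample comparison for a given standardized effect size and sample size, using a normal approximation; this is illustrative, not the authors' analysis code.

```python
# Sketch: approximate statistical power of a two-sample comparison, given a
# standardized effect size (Cohen's d) and a per-group sample size. Uses a
# normal approximation; illustrative only, not the authors' analysis code.
from scipy.stats import norm

def approx_power(d, n_per_group, alpha=0.05):
    """Two-sided two-sample test, equal group sizes, normal approximation."""
    z_crit = norm.ppf(1.0 - alpha / 2.0)
    noncentrality = d * (n_per_group / 2.0) ** 0.5
    return norm.cdf(noncentrality - z_crit) + norm.cdf(-noncentrality - z_crit)

for d in (0.2, 0.5, 0.8):          # small, medium, large effects
    print(f"d={d}: power ≈ {approx_power(d, n_per_group=30):.2f}")
```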

  8. Assessing Regional Scale Variability in Extreme Value Statistics Under Altered Climate Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Brunsell, Nathaniel [Univ. of Kansas, Lawrence, KS (United States); Mechem, David [Univ. of Kansas, Lawrence, KS (United States); Ma, Chunsheng [Wichita State Univ., KS (United States)

    2015-02-20

    Recent studies have suggested that low-frequency modes of climate variability can significantly influence regional climate. The climatology associated with extreme events has been shown to be particularly sensitive. This has profound implications for droughts, heat waves, and food production. We propose to examine regional climate simulations conducted over the continental United States by applying a recently developed technique which combines wavelet multi-resolution analysis with information theory metrics. This research is motivated by two fundamental questions concerning the spatial and temporal structure of extreme events. These questions are 1) what temporal scales of the extreme value distributions are most sensitive to alteration by low-frequency climate forcings and 2) what is the nature of the spatial structure of variation in these timescales? The primary objective is to assess to what extent information theory metrics can be useful in characterizing the nature of extreme weather phenomena. Specifically, we hypothesize that (1) changes in the nature of extreme events will impact the temporal probability density functions and that information theory metrics will be sensitive to these changes and (2) via a wavelet multi-resolution analysis, we will be able to characterize the relative contribution of different timescales to the stochastic nature of extreme events. In order to address these hypotheses, we propose a unique combination of an established regional climate modeling approach and advanced statistical techniques to assess the effects of low-frequency modes on climate extremes over North America. The behavior of climate extremes in RCM simulations for the 20th century will be compared with statistics calculated from the United States Historical Climatology Network (USHCN) and simulations from the North American Regional Climate Change Assessment Program (NARCCAP). This effort will serve to establish the baseline behavior of climate extremes, the
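
    One way to combine a wavelet multiresolution decomposition with an information-theory metric, as proposed above, is to compute the Shannon entropy of how variance is distributed across dyadic timescales. The sketch below uses a plain Haar transform on synthetic data and is only illustrative of the idea, not the project's code.

```python
# Sketch: Haar multiresolution decomposition plus a Shannon-entropy summary of
# how variance is spread across timescales. Synthetic data; illustrative only.
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1024) + 0.5 * np.sin(np.arange(1024) * 2 * np.pi / 64)

def haar_level_energies(x):
    """Return the energy of Haar detail coefficients at each dyadic level."""
    energies = []
    approx = np.asarray(x, dtype=float)
    while len(approx) >= 2:
        even, odd = approx[0::2], approx[1::2]
        detail = (even - odd) / np.sqrt(2.0)
        approx = (even + odd) / np.sqrt(2.0)
        energies.append(np.sum(detail**2))
    return np.array(energies)

e = haar_level_energies(x)
p = e / e.sum()                                   # distribution of energy over scales
mask = p > 0
entropy = -np.sum(p[mask] * np.log2(p[mask]))     # Shannon entropy in bits

print("energy fraction per level:", np.round(p, 3))
print(f"scale entropy = {entropy:.2f} bits (max {np.log2(len(p)):.2f})")
```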

  9. A Fractional Lower Order Statistics-Based MIMO Detection Method in Impulse Noise for Power Line Channel

    Directory of Open Access Journals (Sweden)

    CHEN, Z.

    2014-11-01

    Full Text Available Impulse noise in the power line communication (PLC) channel seriously degrades the performance of Multiple-Input Multiple-Output (MIMO) systems. To remedy this problem, a MIMO detection method based on fractional lower order statistics (FLOS) is proposed in this paper for the PLC channel with impulse noise. The alpha-stable distribution is used to model the impulse noise, and FLOS is applied to construct the MIMO detection criterion. The optimal detection solution is then obtained by a recursive least squares algorithm. Finally, the transmitted signals in the PLC MIMO system are restored with the obtained detection matrix. The proposed method does not require channel estimation and has low computational complexity. The simulation results show that the proposed method achieves better PLC MIMO detection performance than existing methods under an impulsive noise environment.
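
    The sketch below illustrates the spirit of FLOS-based detection with a brute-force minimum l_p-distance detector (p < 2 de-emphasises impulsive samples) for a 2x2 QPSK link; it is not the paper's recursive least squares algorithm, and heavy-tailed Student-t noise stands in for the alpha-stable model.

```python
# Sketch: minimum l_p-distance detection for a 2x2 MIMO link with QPSK, using
# p < 2 so that impulsive noise samples are de-emphasised, in the spirit of
# FLOS-based criteria. Brute force over the constellation; illustrative only.
import itertools
import numpy as np

rng = np.random.default_rng(1)
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)

H = (rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))) / np.sqrt(2)
s = qpsk[rng.integers(0, 4, size=2)]               # transmitted symbols

# Impulsive noise stand-in: heavy-tailed Student-t samples (placeholder for
# the alpha-stable model used in the paper).
noise = 0.1 * (rng.standard_t(df=1.5, size=2) + 1j * rng.standard_t(df=1.5, size=2))
y = H @ s + noise

def lp_detect(y, H, constellation, p=1.2):
    best, best_cost = None, np.inf
    for cand in itertools.product(constellation, repeat=H.shape[1]):
        cand = np.array(cand)
        cost = np.sum(np.abs(y - H @ cand) ** p)
        if cost < best_cost:
            best, best_cost = cand, cost
    return best

s_hat = lp_detect(y, H, qpsk)
print("sent:    ", np.round(s, 3))
print("detected:", np.round(s_hat, 3))
```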

  10. Reporting characteristics of meta-analyses in orthodontics: methodological assessment and statistical recommendations.

    Science.gov (United States)

    Papageorgiou, Spyridon N; Papadopoulos, Moschos A; Athanasiou, Athanasios E

    2014-02-01

    Ideally, meta-analyses (MAs) should consolidate the characteristics of orthodontic research in order to produce an evidence-based answer. However, severe flaws are frequently observed in most of them. The aim of this study was to evaluate the statistical methods, the methodology, and the quality characteristics of orthodontic MAs and to assess their reporting quality during the last years. Electronic databases were searched for MAs (with or without a proper systematic review) in the field of orthodontics, indexed up to 2011. The AMSTAR tool was used for quality assessment of the included articles. Data were analyzed with Student's t-test, one-way ANOVA, and generalized linear modelling. Risk ratios with 95% confidence intervals were calculated to represent changes during the years in the reporting of key items associated with quality. A total of 80 MAs with 1086 primary studies were included in this evaluation. Using the AMSTAR tool, 25 (27.3%) of the MAs were found to be of low quality, 37 (46.3%) of medium quality, and 18 (22.5%) of high quality. Specific characteristics like explicit protocol definition, extensive searches, and quality assessment of the included trials were associated with a higher AMSTAR score. Model selection and dealing with heterogeneity or publication bias were often problematic in the identified reviews. The number of published orthodontic MAs is constantly increasing, while their overall quality ranges from low to medium. Although the number of MAs of medium and high quality seems lately to be rising, several other aspects need improvement to increase their overall quality.
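
    A minimal sketch of a risk ratio with a 95% confidence interval for the reporting of a key item in two publication periods; the counts are invented placeholders.

```python
# Sketch: risk ratio with a 95% confidence interval for reporting of a key item
# in two publication periods. The counts are invented placeholders.
import math

def risk_ratio_ci(a, n1, b, n2, z=1.96):
    """RR of 'item reported' in period 1 vs period 2, with a log-normal CI."""
    rr = (a / n1) / (b / n2)
    se_log = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, lo, hi

rr, lo, hi = risk_ratio_ci(a=28, n1=40, b=14, n2=40)   # e.g. later vs earlier MAs
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```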

  11. Combining Multiple Hypothesis Testing with Machine Learning Increases the Statistical Power of Genome-wide Association Studies

    Science.gov (United States)

    Mieth, Bettina; Kloft, Marius; Rodríguez, Juan Antonio; Sonnenburg, Sören; Vobruba, Robin; Morcillo-Suárez, Carlos; Farré, Xavier; Marigorta, Urko M.; Fehr, Ernst; Dickhaus, Thorsten; Blanchard, Gilles; Schunk, Daniel; Navarro, Arcadi; Müller, Klaus-Robert

    2016-01-01

    The standard approach to the analysis of genome-wide association studies (GWAS) is based on testing each position in the genome individually for statistical significance of its association with the phenotype under investigation. To improve the analysis of GWAS, we propose a combination of machine learning and statistical testing that takes correlation structures within the set of SNPs under investigation in a mathematically well-controlled manner into account. The novel two-step algorithm, COMBI, first trains a support vector machine to determine a subset of candidate SNPs and then performs hypothesis tests for these SNPs together with an adequate threshold correction. Applying COMBI to data from a WTCCC study (2007) and measuring performance as replication by independent GWAS published within the 2008–2015 period, we show that our method outperforms ordinary raw p-value thresholding as well as other state-of-the-art methods. COMBI presents higher power and precision than the examined alternatives while yielding fewer false (i.e. non-replicated) and more true (i.e. replicated) discoveries when its results are validated on later GWAS studies. More than 80% of the discoveries made by COMBI upon WTCCC data have been validated by independent studies. Implementations of the COMBI method are available as a part of the GWASpi toolbox 2.0. PMID:27892471
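
    A toy two-step "screen then test" pipeline in the spirit of COMBI is sketched below: rank SNPs with a linear SVM, then run association tests only on the top-ranked subset with a correspondingly smaller multiple-testing correction. The data are synthetic and the code is not the released GWASpi 2.0 implementation.

```python
# Sketch: a toy two-step "screen then test" pipeline in the spirit of COMBI:
# rank SNPs with a linear SVM, then run association tests only on the top-k,
# correcting the threshold for k tests. Synthetic data; NOT the GWASpi code.
import numpy as np
from scipy.stats import chi2_contingency
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
n, p, k = 400, 200, 10
X = rng.integers(0, 3, size=(n, p))              # genotypes coded 0/1/2
causal = [7, 42]
logits = 0.9 * (X[:, causal].sum(axis=1) - 2.0)
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)

# Step 1: SVM screening
svm = LinearSVC(C=0.1, max_iter=20000).fit(X, y)
top_k = np.argsort(np.abs(svm.coef_[0]))[::-1][:k]

# Step 2: chi-square association tests on the selected SNPs only,
# with a Bonferroni threshold over k instead of p tests.
alpha = 0.05 / k
for j in sorted(top_k):
    table = np.array([[np.sum((X[:, j] == g) & (y == c)) for g in range(3)]
                      for c in range(2)]) + 1     # +1 avoids empty cells
    _, pval, _, _ = chi2_contingency(table)
    if pval < alpha:
        print(f"SNP {j}: p = {pval:.2e} (significant at corrected alpha)")
```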

  12. Surveys Assessing Students' Attitudes toward Statistics: A Systematic Review of Validity and Reliability

    Science.gov (United States)

    Nolan, Meaghan M.; Beran, Tanya; Hecker, Kent G.

    2012-01-01

    Students with positive attitudes toward statistics are likely to show strong academic performance in statistics courses. Multiple surveys measuring students' attitudes toward statistics exist; however, a comparison of the validity and reliability of interpretations based on their scores is needed. A systematic review of relevant electronic…

  13. Assessment of Problem-Based Learning in the Undergraduate Statistics Course

    Science.gov (United States)

    Karpiak, Christie P.

    2011-01-01

    Undergraduate psychology majors (N = 51) at a mid-sized private university took a statistics examination on the first day of the research methods course, a course for which a grade of "C" or higher in statistics is a prerequisite. Students who had taken a problem-based learning (PBL) section of the statistics course (n = 15) were compared to those…

  14. A Meta-Meta-Analysis: Empirical Review of Statistical Power, Type I Error Rates, Effect Sizes, and Model Selection of Meta-Analyses Published in Psychology

    Science.gov (United States)

    Cafri, Guy; Kromrey, Jeffrey D.; Brannick, Michael T.

    2010-01-01

    This article uses meta-analyses published in "Psychological Bulletin" from 1995 to 2005 to describe meta-analyses in psychology, including examination of statistical power, Type I errors resulting from multiple comparisons, and model choice. Retrospective power estimates indicated that univariate categorical and continuous moderators, individual…

  15. Application of statistical and dynamics models for snow avalanche hazard assessment in mountain regions of Russia

    Science.gov (United States)

    Turchaninova, A.

    2012-04-01

    The estimation of extreme avalanche runout distances, flow velocities, impact pressures and volumes is an essential part of snow engineering in mountain regions of Russia, and it underpins avalanche hazard assessment and mapping. Russian guidelines accept the application of different avalanche models as well as different approaches for estimating model input parameters. Consequently, different teams of engineers in Russia apply various dynamics and statistical models in engineering practice. This gives more freedom to avalanche practitioners and experts, but it also causes considerable uncertainty given the serious limitations of avalanche models. We discuss these problems by presenting the results of applying several well-known and widely used statistical models (developed in Russia) and avalanche dynamics models to several avalanche test sites in the Khibini Mountains (Kola Peninsula) and the Caucasus. The most accurate and well-documented data on powder and wet, large infrequent and small frequent snow avalanche events have been collected from the 1960s to the present in the Khibini Mountains by the Avalanche Safety Center of "Apatit". These data were digitized and are available for use and analysis. A detailed digital avalanche database (GIS) was then created for the first time. It contains contours of observed avalanches (ESRI shapes, more than 50 years of observations), DEMs, remote sensing data, descriptions of snow pits, photos, etc. Thus, the Russian avalanche data are a unique source of information for understanding avalanche flow rheology and for the future development and calibration of avalanche dynamics models. The GIS database was used to analyze model input parameters and to calibrate and verify avalanche models. Regarding extreme dynamic parameters, the outputs of different models can differ significantly. This is unacceptable for engineering purposes in the absence of well-defined guidelines in Russia. The frequency curves for the runout distance
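
    Model calibration and verification against such a database ultimately reduces to comparing predicted and observed quantities. A minimal sketch of bias and RMSE for runout distance follows, with invented values.

```python
# Sketch: verifying an avalanche runout model against observed events by
# comparing predicted and observed runout distances. Values are invented.
import numpy as np

observed = np.array([820.0, 640.0, 1010.0, 450.0, 770.0])     # m
predicted = np.array([870.0, 600.0, 1100.0, 430.0, 820.0])    # m, from one model

error = predicted - observed
bias = error.mean()
rmse = np.sqrt(np.mean(error**2))

print(f"bias = {bias:+.0f} m, RMSE = {rmse:.0f} m")
```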

  16. Preliminary environmental assessment for the Satellite Power System (SPS). Revision 1. Volume 2. Detailed assessment

    Energy Technology Data Exchange (ETDEWEB)

    1980-01-01

    The Department of Energy (DOE) is considering several options for generating electrical power to meet future energy needs. The satellite power system (SPS), one of these options, would collect solar energy through a system of satellites in space and transfer this energy to earth. A reference system has been described that would convert the energy to microwaves and transmit the microwave energy via directive antennas to large receiving/rectifying antennas (rectennas) located on the earth. At the rectennas, the microwave energy would be converted into electricity. The potential environmental impacts of constructing and operating the satellite power system are being assessed as a part of the Department of Energy's SPS Concept Development and Evaluation Program. This report is Revision 1 of the Preliminary Environmental Assessment for the Satellite Power System published in October 1978. It refines and extends the 1978 assessment and provides a basis for a 1980 revision that will guide and support DOE recommendations regarding future SPS development. This is Volume 2 of two volumes. It contains the technical detail suitable for peer review and integrates information appearing in documents referenced herein. The key environmental issues associated with the SPS concern human health and safety, ecosystems, climate, and electromagnetic systems interactions. In order to address these issues in an organized manner, five tasks are reported: (I) microwave-radiation health and ecological effects; (II) nonmicrowave health and ecological effects; (III) atmospheric effects; (IV) effects on communication systems due to ionospheric disturbance; and (V) electromagnetic compatibility. (WHK)

  17. Direct integration of intensity-level data from Affymetrix and Illumina microarrays improves statistical power for robust reanalysis

    Directory of Open Access Journals (Sweden)

    Turnbull Arran K

    2012-08-01

    Full Text Available Background: Affymetrix GeneChips and Illumina BeadArrays are the most widely used commercial single-channel gene expression microarrays. Public data repositories are an extremely valuable resource, providing array-derived gene expression measurements from many thousands of experiments. Unfortunately many of these studies are underpowered, and it is desirable to improve power by combining data from more than one study; we sought to determine whether platform-specific bias precludes direct integration of probe intensity signals for combined reanalysis. Results: Using Affymetrix and Illumina data from the microarray quality control project, from our own clinical samples, and from additional publicly available datasets, we evaluated several approaches to directly integrate intensity-level expression data from the two platforms. After mapping probe sequences to Ensembl genes, we demonstrate that ComBat and cross-platform normalisation (XPN) significantly outperform mean-centering and distance-weighted discrimination (DWD) in terms of minimising inter-platform variance. In particular we observed that DWD, a popular method used in a number of previous studies, removed systematic bias at the expense of genuine biological variability, potentially reducing legitimate biological differences from integrated datasets. Conclusion: Normalised and batch-corrected intensity-level data from Affymetrix and Illumina microarrays can be directly combined to generate biologically meaningful results with improved statistical power for robust, integrated reanalysis.
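
    A per-gene, per-platform standardization baseline, of the kind the study compares against ComBat and XPN (which model batch effects more fully), is sketched below on synthetic matrices; it is not the study's pipeline.

```python
# Sketch: a per-gene, per-platform standardization baseline of the kind the
# study compares against ComBat and XPN (which model batch effects more fully).
# Matrices are synthetic; rows = genes mapped to the same Ensembl IDs.
import numpy as np

rng = np.random.default_rng(3)
genes, n_affy, n_illu = 500, 30, 25

affy = rng.normal(8.0, 1.0, (genes, n_affy)) + rng.normal(0.5, 0.2, (genes, 1))
illu = rng.normal(6.5, 0.8, (genes, n_illu))           # different scale/offset

def standardize_per_gene(m):
    mu = m.mean(axis=1, keepdims=True)
    sd = m.std(axis=1, ddof=1, keepdims=True)
    return (m - mu) / sd

combined = np.hstack([standardize_per_gene(affy), standardize_per_gene(illu)])
print("combined matrix:", combined.shape)
print("per-platform means after correction:",
      combined[:, :n_affy].mean().round(3), combined[:, n_affy:].mean().round(3))
```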

  18. Power quality assessment ... (Journal of EEA, Vol. 31, 2014)

    African Journals Online (AJOL)

    ABSTRACT. In this paper, electric power quality (PQ) in Walya- ... Commercial AC electric power systems are to operate at a sinusoidal ... Electric Power System in the two Factories. EPSC is ... factor and reliability for industries. The design ...

  19. Network Theory Integrated Life Cycle Assessment for an Electric Power System

    Directory of Open Access Journals (Sweden)

    Heetae Kim

    2015-08-01

    Full Text Available In this study, we allocate the greenhouse gas (GHG) emissions of electricity transmission to consumers. As an allocation basis, we introduce energy distance. Energy distance takes the transmission load on the electricity system into account in addition to the amount of electricity consumption. As a case study, we estimate regional GHG emissions of electricity transmission loss in Chile. Life cycle assessment (LCA) is used to estimate the total GHG emissions of the Chilean electric power system, and the regional GHG emissions of transmission loss are calculated from this total. We construct a network model of the Chilean electric power grid as an undirected network with 466 nodes and 543 edges, preserving the topology of the power grid based on statistical records. The total annual GHG emissions of the Chilean electricity system amount to 23.07 Mt CO2-eq., of which 1.61 Mt CO2-eq. is attributable to transmission loss. The total energy distance for electricity transmission accounts for 12,842.10 TWh km based on the network analysis. We argue that when the GHG emissions of electricity transmission loss are estimated, the electricity transmission load should be considered separately. We propose network theory as a useful complement to LCA for this complex allocation problem. Energy distance is especially useful for very large-scale electric power grids such as intercontinental transmission networks.
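
    A minimal sketch of the allocation idea described above: distribute the transmission-loss emissions to regions in proportion to their energy distance (consumption multiplied by transmission distance). The regions and values are invented placeholders, not the Chilean case-study data.

```python
# Sketch: allocating the GHG emissions of transmission loss to consuming regions
# in proportion to their "energy distance" (consumption x transmission distance).
# Regions and values are invented placeholders, not the Chilean case-study data.

total_loss_emissions_mt = 1.61   # Mt CO2-eq attributed to transmission loss

regions = {
    # name      (consumption TWh, mean transmission distance km)
    "north":    (8.0, 600.0),
    "central":  (30.0, 150.0),
    "south":    (6.0, 400.0),
}

energy_distance = {r: c * d for r, (c, d) in regions.items()}   # TWh·km
total_ed = sum(energy_distance.values())

for r, ed in energy_distance.items():
    share = ed / total_ed
    print(f"{r:8s}: energy distance {ed:7.0f} TWh·km -> "
          f"{share:5.1%} of loss emissions = {share * total_loss_emissions_mt:.3f} Mt CO2-eq")
```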

  20. Application of statistical methods (SPC) for an optimized control of the irradiation process of high-power semiconductors

    International Nuclear Information System (INIS)

    Mittendorfer, J.; Zwanziger, P.

    2000-01-01

    High-power bipolar semiconductor devices (thyristors and diodes) in a disc-type shape are key components (semiconductor switches) for high-power electronic systems. These systems are important for the economic design of energy transmission systems, i.e. high-power drive systems, static compensation and high-voltage DC transmission lines. In its factory located in Pretzfeld, Germany, the company eupec GmbH+Co.KG (eupec) produces disc-type devices with ceramic encapsulation in the high-end range for the world market. These elements have to fulfill special customer requirements and therefore deliver tailor-made trade-offs between their on-state voltage and dynamic switching behaviour. This trade-off can be tuned by applying a dedicated electron irradiation to the semiconductor pellets. In this paper, the requirements placed on the irradiation company Mediscan GmbH are described from the point of view of the semiconductor manufacturer. The current strategy for controlling the irradiation results to fulfill these requirements is presented, together with the choice of relevant parameters from the viewpoint of the irradiation company. The set of process parameters monitored using statistical process control (SPC) techniques includes beam current and energy, conveyor speed and irradiation geometry. The results are highlighted and demonstrate the successful co-operation in this business. Looking at the process from the opposite direction, an idea is presented and discussed for developing a highly sensitive dose detection device based on modified diodes, which could serve as accurate yet cheap and easy-to-use routine dosimeters for irradiation institutes. (author)
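
    A minimal sketch of an individuals (I-MR) control chart for one monitored parameter such as beam current; the measurement values and limits below are invented placeholders, not Mediscan process data.

```python
# Sketch: an individuals (I-MR) control chart for one monitored irradiation
# parameter, e.g. beam current. Measurement values are invented placeholders.
import numpy as np

beam_current_ma = np.array([24.8, 25.1, 24.9, 25.3, 25.0, 24.7, 25.2, 27.5, 25.0, 24.9])

moving_range = np.abs(np.diff(beam_current_ma))
mr_bar = moving_range.mean()
center = beam_current_ma.mean()

# 2.66 = 3 / d2 with d2 = 1.128 for moving-range subgroups of size 2
ucl = center + 2.66 * mr_bar
lcl = center - 2.66 * mr_bar

for i, v in enumerate(beam_current_ma, start=1):
    flag = " <-- investigate" if (v > ucl or v < lcl) else ""
    print(f"run {i:2d}: {v:.1f} mA{flag}")
print(f"CL={center:.2f}, LCL={lcl:.2f}, UCL={ucl:.2f} mA")
```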