WorldWideScience

Sample records for assessment statistical power

  1. Automated FMV image quality assessment based on power spectrum statistics

    Science.gov (United States)

    Kalukin, Andrew

    2015-05-01

    Factors that degrade image quality in video and other sensor collections, such as noise, blurring, and poor resolution, also affect the spatial power spectrum of imagery. Prior research in human vision and image science from the last few decades has shown that the image power spectrum can be useful for assessing the quality of static images. The research in this article explores the possibility of using the image power spectrum to automatically evaluate full-motion video (FMV) imagery frame by frame. This procedure makes it possible to identify anomalous images and scene changes, and to keep track of gradual changes in quality as collection progresses. This article will describe a method to apply power spectral image quality metrics for images subjected to simulated blurring, blocking, and noise. As a preliminary test on videos from multiple sources, image quality measurements for image frames from 185 videos are compared to analyst ratings based on ground sampling distance. The goal of the research is to develop an automated system for tracking image quality during real-time collection, and to assign ratings to video clips for long-term storage, calibrated to standards such as the National Imagery Interpretability Rating System (NIIRS).
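
    A minimal sketch of the kind of frame-level metric this record describes, assuming a grayscale frame as a 2-D numpy array: the radially averaged power spectrum is computed and its log-log slope is used as a hypothetical quality indicator (blurring steepens the slope, additive noise flattens it). The specific metric is an illustration, not the paper's exact formulation.

```python
import numpy as np

def radial_power_spectrum(frame):
    """Return radii and the radially averaged power spectrum of a 2-D image."""
    f = np.fft.fftshift(np.fft.fft2(frame - frame.mean()))
    power = np.abs(f) ** 2
    h, w = frame.shape
    y, x = np.indices((h, w))
    r = np.hypot(y - h // 2, x - w // 2).astype(int)
    # Average power over annuli of equal integer radius.
    radial = np.bincount(r.ravel(), weights=power.ravel()) / np.bincount(r.ravel())
    return np.arange(1, radial.size), radial[1:]   # drop the DC bin

def spectral_slope(frame):
    """Log-log slope of the spectrum; a crude, illustrative quality score."""
    radii, radial = radial_power_spectrum(frame)
    keep = radial > 0
    slope, _ = np.polyfit(np.log(radii[keep]), np.log(radial[keep]), 1)
    return slope

frame = np.random.default_rng(0).normal(size=(256, 256))   # synthetic test frame
print(spectral_slope(frame))
```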

  2. Longitudinal Assessment Design and Statistical Power for Detecting an Intervention Impact.

    Science.gov (United States)

    Petras, Hanno

    2016-10-01

    In evaluating randomized control trials (RCTs), statistical power analyses are necessary to choose a sample size which strikes the balance between an insufficient and an excessive design, with the latter leading to misspent resources. With the growing popularity of using longitudinal data to evaluate RCTs, statistical power calculations have become more complex. Specifically, with repeated measures, the number and frequency of measurements per person additionally influence statistical power by determining the precision with which intra-individual change can be measured as well as the reliability with which inter-individual differences in change can be assessed. The application of growth mixture models has shown that the impact of universal interventions is often concentrated among a small group of individuals at the highest level of risk. General sample size calculations are consequently not sufficient to determine whether statistical power is adequate to detect the desired effect. Currently, little guidance exists to recommend a sufficient assessment design for evaluating intervention impact. To this end, Monte Carlo simulations are conducted to assess the statistical power and precision when manipulating study duration and assessment frequency. Estimates were extracted from a published evaluation of the proximal impact of the Good Behavior Game (GBG) on the developmental course of aggressive behavior. Results indicated that the number of time points and the frequency of assessments influence statistical power and precision. Recommendations for the assessment design of longitudinal studies are discussed.
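
    The following sketch illustrates the kind of Monte Carlo power analysis the record describes, assuming a simple linear growth model in which the intervention shifts the mean slope; the effect size and variance components are illustrative placeholders, not estimates from the GBG evaluation.

```python
import numpy as np
from scipy import stats

def simulate_power(n_per_arm=100, n_waves=4, duration=3.0, slope_effect=-0.15,
                   sd_slope=0.3, sd_noise=0.8, alpha=0.05, n_sim=2000, seed=1):
    """Estimate power to detect a slope difference for a given assessment design."""
    rng = np.random.default_rng(seed)
    times = np.linspace(0.0, duration, n_waves)
    hits = 0
    for _ in range(n_sim):
        slopes = np.concatenate([
            rng.normal(0.0, sd_slope, n_per_arm),              # control arm
            rng.normal(slope_effect, sd_slope, n_per_arm)])    # intervention arm
        y = slopes[:, None] * times + rng.normal(0, sd_noise, (2 * n_per_arm, n_waves))
        est = np.polyfit(times, y.T, 1)[0]          # OLS slope per person
        t = stats.ttest_ind(est[n_per_arm:], est[:n_per_arm])
        hits += t.pvalue < alpha
    return hits / n_sim

for waves in (3, 5, 7):                             # vary assessment frequency
    print(waves, simulate_power(n_waves=waves))
```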

  3. Statistical Diagnosis of the Best Weibull Methods for Wind Power Assessment for Agricultural Applications

    Directory of Open Access Journals (Sweden)

    Abul Kalam Azad

    2014-05-01

    The best Weibull distribution methods for the assessment of wind energy potential at different altitudes in desired locations are statistically diagnosed in this study. Seven different methods, namely the graphical method (GM), method of moments (MOM), standard deviation method (STDM), maximum likelihood method (MLM), power density method (PDM), modified maximum likelihood method (MMLM) and equivalent energy method (EEM), were used to estimate the Weibull parameters, and six statistical tools, namely relative percentage of error, root mean square error (RMSE), mean percentage of error, mean absolute percentage of error, chi-square error and analysis of variance, were used to precisely rank the methods. The statistical fittings of the measured and calculated wind speed data are assessed to justify the performance of the methods. The capacity factor and total energy generated by a small model wind turbine are calculated by numerical integration using trapezoidal sums and Simpson's rule. The results show that MOM and MLM are the most efficient methods for determining the values of k and c to fit Weibull distribution curves.
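
    The sketch below illustrates two of the seven estimators named in the record, the method of moments (MOM) and maximum likelihood (MLM), and ranks them with one of the six tools (RMSE); the synthetic wind speeds and the histogram-based RMSE are illustrative assumptions.

```python
import numpy as np
from scipy import stats, optimize, special

def weibull_mom(v):
    """Method of moments: solve for shape k from the coefficient of variation."""
    cv2 = v.var() / v.mean() ** 2
    f = lambda k: special.gamma(1 + 2 / k) / special.gamma(1 + 1 / k) ** 2 - 1 - cv2
    k = optimize.brentq(f, 0.2, 20.0)
    c = v.mean() / special.gamma(1 + 1 / k)
    return k, c

def weibull_mlm(v):
    """Maximum likelihood via scipy, with location fixed at zero."""
    k, _, c = stats.weibull_min.fit(v, floc=0)
    return k, c

def rmse(v, k, c, bins=20):
    """RMSE between the empirical speed histogram and the fitted density."""
    freq, edges = np.histogram(v, bins=bins, density=True)
    mid = 0.5 * (edges[:-1] + edges[1:])
    return np.sqrt(np.mean((freq - stats.weibull_min.pdf(mid, k, scale=c)) ** 2))

v = stats.weibull_min.rvs(2.1, scale=6.5, size=5000, random_state=0)  # synthetic speeds
for name, est in [("MOM", weibull_mom), ("MLM", weibull_mlm)]:
    k, c = est(v)
    print(name, round(k, 3), round(c, 3), round(rmse(v, k, c), 5))
```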

  4. Distance matters. Assessing socioeconomic impacts of the Dukovany nuclear power plant in the Czech Republic: Local perceptions and statistical evidence

    Directory of Open Access Journals (Sweden)

    Frantál Bohumil

    2016-03-01

    The effect of geographical distance on the extent of socioeconomic impacts of the Dukovany nuclear power plant in the Czech Republic is assessed by combining two different research approaches. First, we survey how people living in municipalities in the vicinity of the power plant perceive impacts on their personal quality of life. Second, we explore the effects of the power plant on regional development by analysing long-term statistical data about the unemployment rate, the share of workers in the energy sector and overall job opportunities in the respective municipalities. The results indicate that the power plant has had significant positive impacts on surrounding communities, both as perceived by residents and as evidenced by the statistical data. The level of impacts is, however, significantly influenced by the spatial and social distances of communities and individuals from the power plant. The perception of positive impacts correlates with geographical proximity to the power plant, while the hypothetical distance at which positive effects on the quality of life are no longer perceived was estimated at about 15 km. Positive effects are also more likely to be reported by highly educated, young and middle-aged, and economically active persons whose work is connected to the power plant.

  5. DISTRIBUTED GRID-CONNECTED PHOTOVOLTAIC POWER SYSTEM EMISSION OFFSET ASSESSMENT: STATISTICAL TEST OF SIMULATED- AND MEASURED-BASED DATA

    Science.gov (United States)

    This study assessed the pollutant emission offset potential of distributed grid-connected photovoltaic (PV) power systems. Computer-simulated performance results were utilized for 211 PV systems located across the U.S. The PV systems' monthly electrical energy outputs were based ...

  6. Statistical Power in Meta-Analysis

    Science.gov (United States)

    Liu, Jin

    2015-01-01

    Statistical power is important in a meta-analysis study, although few studies have examined the performance of simulated power in meta-analysis. The purpose of this study is to inform researchers about statistical power estimation for the two-sample mean difference test under different situations: (1) the discrepancy between the analytical power and…

  7. NONSTRUCTURAL AND STATISTICAL NONPARAMETRIC MARKET POWER TESTS: AN EMPIRICAL INVESTIGATION

    OpenAIRE

    Noelke, Corinna M.; Raper, Kellie Curry

    1999-01-01

    We use Monte Carlo experiments to assess the accuracy of two nonstructural and two statistical nonparametric market power tests. We implement these monopoly and monopsony market power tests using data from ten known market structures. The objective is to determine which test is most able to distinguish between market structures. The statistical nonparametric market power tests appear to be promising.

  8. Statistical modeling to support power system planning

    Science.gov (United States)

    Staid, Andrea

    This dissertation focuses on data-analytic approaches that improve our understanding of power system applications to promote better decision-making. It tackles issues of risk analysis, uncertainty management, resource estimation, and the impacts of climate change. Tools of data mining and statistical modeling are used to bring new insight to a variety of complex problems facing today's power system. The overarching goal of this research is to improve the understanding of the power system risk environment for improved operation, investment, and planning decisions. The first chapter introduces some challenges faced in planning for a sustainable power system. Chapter 2 analyzes the driving factors behind the disparity in wind energy investments among states with a goal of determining the impact that state-level policies have on incentivizing wind energy. Findings show that policy differences do not explain the disparities; physical and geographical factors are more important. Chapter 3 extends conventional wind forecasting to a risk-based focus of predicting maximum wind speeds, which are dangerous for offshore operations. Statistical models are presented that issue probabilistic predictions for the highest wind speed expected in a three-hour interval. These models achieve a high degree of accuracy and their use can improve safety and reliability in practice. Chapter 4 examines the challenges of wind power estimation for onshore wind farms. Several methods for wind power resource assessment are compared, and the weaknesses of the Jensen model are demonstrated. For two onshore farms, statistical models outperform other methods, even when very little information is known about the wind farm. Lastly, chapter 5 focuses on the power system more broadly in the context of the risks expected from tropical cyclones in a changing climate. Risks to U.S. power system infrastructure are simulated under different scenarios of tropical cyclone behavior that may result from climate

  9. Distance matters. Assessing socioeconomic impacts of the Dukovany nuclear power plant in the Czech Republic: Local perceptions and statistical evidence

    Czech Academy of Sciences Publication Activity Database

    Frantál, Bohumil; Malý, Jiří; Ouředníček, M.; Nemeškal, J.

    2016-01-01

    Vol. 24, No. 1 (2016), pp. 2-13 ISSN 1210-8812 R&D Projects: GA MŠk(CZ) EE2.3.20.0025 Institutional support: RVO:68145535 Keywords: nuclear power plant impacts * spatial analysis * risk perceptions Subject RIV: DE - Earth Magnetism, Geodesy, Geography Impact factor: 2.149, year: 2016 http://www.degruyter.com/view/j/mgr.2016.24.issue-1/mgr-2016-0001/mgr-2016-0001.xml?format=INT

  10. The power of statistical tests using field trial count data of non-target organisms in environmental risk assessment of genetically modified plants

    NARCIS (Netherlands)

    Voet, van der H.; Goedhart, P.W.

    2015-01-01

    Publications on power analyses for field trial count data comparing transgenic and conventional crops have reported widely varying requirements for the replication needed to obtain statistical tests with adequate power. These studies are critically reviewed and complemented with a new simulation

  11. Statistical Power in Plant Pathology Research.

    Science.gov (United States)

    Gent, David H; Esker, Paul D; Kriss, Alissa B

    2018-01-01

    In null hypothesis testing, failure to reject a null hypothesis may have two potential interpretations. One interpretation is that the treatments being evaluated do not have a significant effect, and a correct conclusion was reached in the analysis. Alternatively, a treatment effect may have existed but the conclusion of the study was that there was none. This is termed a Type II error, which is most likely to occur when studies lack sufficient statistical power to detect a treatment effect. In basic terms, the power of a study is the ability to identify a true effect through a statistical test. The power of a statistical test is 1 - (the probability of Type II errors), and depends on the size of treatment effect (termed the effect size), variance, sample size, and significance criterion (the probability of a Type I error, α). Low statistical power is prevalent in scientific literature in general, including plant pathology. However, power is rarely reported, creating uncertainty in the interpretation of nonsignificant results and potentially underestimating small, yet biologically significant relationships. The appropriate level of power for a study depends on the impact of Type I versus Type II errors and no single level of power is acceptable for all purposes. Nonetheless, by convention 0.8 is often considered an acceptable threshold and studies with power less than 0.5 generally should not be conducted if the results are to be conclusive. The emphasis on power analysis should be in the planning stages of an experiment. Commonly employed strategies to increase power include increasing sample sizes, selecting a less stringent threshold probability for Type I errors, increasing the hypothesized or detectable effect size, including as few treatment groups as possible, reducing measurement variability, and including relevant covariates in analyses. Power analysis will lead to more efficient use of resources and more precisely structured hypotheses, and may even
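
    As a concrete illustration of these quantities, the snippet below computes the power of a two-sample t-test and the per-group sample size needed to reach the conventional 0.8 threshold; it uses statsmodels' power module, with a medium effect size (Cohen's d = 0.5) chosen purely for illustration.

```python
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
# Power achieved with n = 20 per group for a medium effect (d = 0.5):
print(analysis.power(effect_size=0.5, nobs1=20, alpha=0.05))         # ~0.34
# Per-group sample size needed to reach the conventional 0.8 threshold:
print(analysis.solve_power(effect_size=0.5, power=0.8, alpha=0.05))  # ~64
```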

  12. Relative linear power contribution with estimation statistics

    NARCIS (Netherlands)

    Lohnberg, P.

    1983-01-01

    The relative contribution by a noiselessly observed input signal to the power of a possibly disturbed observed stationary output signal from a linear system is expressed in terms of signal spectral densities. Approximations of estimator statistics and derived confidence limits agree fairly well with

  13. Statistical baseline assessment in cardiotocography.

    Science.gov (United States)

    Agostinelli, Angela; Braccili, Eleonora; Marchegiani, Enrico; Rosati, Riccardo; Sbrollini, Agnese; Burattini, Luca; Morettini, Micaela; Di Nardo, Francesco; Fioretti, Sandro; Burattini, Laura

    2017-07-01

    Cardiotocography (CTG) is the most common non-invasive diagnostic technique to evaluate fetal well-being. It consists of the recording of fetal heart rate (FHR; bpm) and maternal uterine contractions. Among the main parameters characterizing FHR, the baseline (BL) is fundamental to determine fetal hypoxia and distress. In computerized applications, BL is typically computed as mean FHR±ΔFHR, with ΔFHR=8 bpm or ΔFHR=10 bpm, both values being experimentally fixed. In this context, the present work aims: to propose a statistical procedure for ΔFHR assessment; to quantitatively determine the ΔFHR value by applying such procedure to clinical data; and to compare the statistically-determined ΔFHR value against the experimentally-determined ΔFHR values. To these aims, the 552 recordings of the "CTU-UHB intrapartum CTG database" from Physionet were submitted to an automatic procedure, which consisted of a FHR preprocessing phase and a statistical BL assessment. During preprocessing, FHR time series were divided into 20-min sliding windows, in which missing data were removed by linear interpolation. Only windows with a correction rate lower than 10% were further processed for BL assessment, according to which ΔFHR was computed as the FHR standard deviation. The total number of accepted windows was 1192 (38.5%) over 383 recordings (69.4%) with at least one accepted window. The statistically-determined ΔFHR value was 9.7 bpm. This value was statistically different from 8 bpm but not from 10 bpm (P=0.16). Thus, ΔFHR=10 bpm is preferable over 8 bpm because it is both experimentally and statistically validated.
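
    A minimal sketch of the windowed procedure described above, assuming FHR sampled at 4 Hz with missing samples marked as NaN; the 20-min windows, the 10% correction threshold and ΔFHR as the window standard deviation follow the record, while the non-overlapping stepping is a simplification of the sliding windows.

```python
import numpy as np

FS = 4                       # samples per second (assumed CTG sampling rate)
WIN = 20 * 60 * FS           # 20-minute window length in samples

def window_baseline(fhr):
    """Yield (mean, std) for each accepted 20-min window of an FHR series."""
    for start in range(0, len(fhr) - WIN + 1, WIN):
        w = fhr[start:start + WIN].astype(float)
        missing = np.isnan(w)
        if missing.mean() >= 0.10:      # reject windows needing >=10% correction
            continue
        idx = np.arange(WIN)
        # Linear interpolation over missing samples, as in the record.
        w[missing] = np.interp(idx[missing], idx[~missing], w[~missing])
        yield w.mean(), w.std()         # BL band: mean +/- Delta, Delta ~ SD
```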

  14. Power and environmental assessment

    DEFF Research Database (Denmark)

    Cashmore, Matthew Asa; Richardson, Tim

    2013-01-01

    The significance of politics and power dynamics has long been recognised in environmental assessment (EA) research, but there has not been sustained attention to power, either theoretically or empirically. The aim of this special issue is to encourage the EA community to engage more consistently...

  15. How Should We Assess the Fit of Rasch-Type Models? Approximating the Power of Goodness-of-Fit Statistics in Categorical Data Analysis

    Science.gov (United States)

    Maydeu-Olivares, Alberto; Montano, Rosa

    2013-01-01

    We investigate the performance of three statistics, R1, R2 (Glas in "Psychometrika" 53:525-546, 1988), and M2 (Maydeu-Olivares & Joe in "J. Am. Stat. Assoc." 100:1009-1020, 2005, "Psychometrika" 71:713-732, 2006), to assess the overall fit of a one-parameter logistic model…

  16. EEI (Edison Electric Institute) power statistics sourcebook

    Energy Technology Data Exchange (ETDEWEB)

    1988-04-01

    Generated from the Edison Electric Institute's POWER STATISTICS Data Base, the Sourcebook is a comprehensive "encyclopedia" of US steam-electric plants containing design and selected operating data for more than 2500 units in the US and its territories. The Sourcebook includes eight looseleaf volumes of site and unit design data, one volume of customized indices (including boiler data, stack data, ash data, water data, sulfur dioxide control data, total suspended particulate data, company index, plant index, and geographic index), and three volumes covering one year (1986) of annual operating data. Operating data covering past years are available at an additional cost per operating year, beginning with 1980 and running through 1985, to purchasers of the Sourcebook. 1987 operating data will be available in mid-1988. The Sourcebook is priced on an annual subscription basis.

  17. Assessment and statistics of surgically induced astigmatism.

    Science.gov (United States)

    Naeser, Kristian

    2008-05-01

    The aim of the thesis was to develop methods for assessment of surgically induced astigmatism (SIA) in individual eyes and in groups of eyes. The thesis is based on 12 peer-reviewed publications, published over a period of 16 years. In these publications older and contemporary literature was reviewed(1). A new method (the polar system) for analysis of SIA was developed. Multivariate statistical analysis of refractive data was described(2-4). Clinical validation studies were performed. Descriptions of a cylinder surface with polar values and with differential geometry were compared. The main results were: refractive data in the form of sphere, cylinder and axis may define an individual patient or data set, but are unsuited for mathematical and statistical analyses(1). The polar value system converts net astigmatisms to orthonormal components in dioptric space. A polar value is the difference in meridional power between two orthogonal meridians(5,6). Any pair of polar values, separated by an arc of 45 degrees, characterizes a net astigmatism completely(7). The two polar values represent the net curvital and net torsional power over the chosen meridian(8). The spherical component is described by the spherical equivalent power. Several clinical studies demonstrated the efficiency of multivariate statistical analysis of refractive data(4,9-11). Polar values and formal differential geometry describe astigmatic surfaces with similar concepts and mathematical functions(8). Other contemporary methods, such as Long's power matrix, Holladay's and Alpins' methods, and Zernike(12) and Fourier analyses(8), are correlated to the polar value system. In conclusion, analysis of SIA should be performed with polar values or other contemporary component systems. The study was supported by Statens Sundhedsvidenskabeligt Forskningsråd, Cykelhandler P. Th. Rasmussen og Hustrus Mindelegat, Hotelejer Carl Larsen og Hustru Nicoline Larsens Mindelegat, Landsforeningen til Vaern om Synet
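
    A small sketch of the double-angle decomposition underlying polar values, assuming a net cylinder of power C at axis α; sign and naming conventions vary across the cited publications, so this is an illustration of the idea rather than Naeser's exact formulation.

```python
import numpy as np

def polar_values(C, axis_deg):
    """Convert a net cylinder (power C at a given axis) to two components 45 deg apart."""
    a = np.radians(axis_deg)
    kp0 = C * np.cos(2 * a)      # net curvital power over the 0/90 meridians
    kp45 = C * np.sin(2 * a)     # net torsional component over the 45/135 meridians
    return kp0, kp45

def surgically_induced(pre, post):
    """SIA as the componentwise difference of (cylinder, axis) pairs."""
    k0_pre, k45_pre = polar_values(*pre)
    k0_post, k45_post = polar_values(*post)
    return k0_post - k0_pre, k45_post - k45_pre

# Illustrative pre- and post-operative refractions (cylinder in D, axis in degrees):
print(surgically_induced(pre=(1.00, 90), post=(0.50, 100)))
```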

  18. STATISTICS IN SERVICE QUALITY ASSESSMENT

    Directory of Open Access Journals (Sweden)

    Dragana Gardašević

    2012-09-01

    For any quality evaluation in sports, science, education and so on, it is useful to collect data in order to construct a strategy to improve the quality of services offered to the user. For this purpose, statistical software packages are used to process the collected data in order to increase customer satisfaction. The principle is demonstrated with an example in which students, as users, rate their satisfaction with the quality of their institution, Belgrade Polytechnic. The emphasis here is on statistical analysis as a tool for quality control aimed at improvement, not on the interpretation of results. The same approach can therefore be used as a model for improving overall results in sport.

  19. Statistical assessment of biosimilar products.

    Science.gov (United States)

    Chow, Shein-Chung; Liu, Jen-Pei

    2010-01-01

    Biological products or medicines are therapeutic agents that are produced using a living system or organism. Access to these life-saving biological products is limited because of their expensive costs. Patents on the early biological products will expire in the next few years. This allows other biopharmaceutical/biotech companies to manufacture the generic versions of the biological products, which are referred to as follow-on biological products by the U.S. Food and Drug Administration (FDA) or as biosimilar medicinal products by the European Medicine Agency (EMEA) of the European Union (EU). Competition of cost-effective follow-on biological products with equivalent efficacy and safety can cut down the costs and hence increase patients' access to the much-needed biological pharmaceuticals. Unlike for the conventional pharmaceuticals of small molecules, the complexity and heterogeneity of the molecular structure, complicated manufacturing process, different analytical methods, and possibility of severe immunogenicity reactions make evaluation of equivalence (similarity) between the biosimilar products and their corresponding innovator product a great challenge for both the scientific community and regulatory agencies. In this paper, we provide an overview of the current regulatory requirements for approval of biosimilar products. A review of current criteria for evaluation of bioequivalence for the traditional chemical generic products is provided. A detailed description of the differences between the biosimilar and chemical generic products is given with respect to size and structure, immunogenicity, product quality attributes, and manufacturing processes. In addition, statistical considerations including design criteria, fundamental biosimilar assumptions, and statistical methods are proposed. The possibility of using genomic data in evaluation of biosimilar products is also explored.

  20. Statistical power analysis for the behavioral sciences

    National Research Council Canada - National Science Library

    Cohen, Jacob

    1988-01-01

    .... A chapter has been added for power analysis in set correlation and multivariate methods (Chapter 10). Set correlation is a realization of the multivariate general linear model, and incorporates the standard multivariate methods...

  1. Statistical power analysis for the behavioral sciences

    National Research Council Canada - National Science Library

    Cohen, Jacob

    1988-01-01

    ... offers a unifying framework and some new data-analytic possibilities. 2. A new chapter (Chapter 11) considers some general topics in power analysis in more integrated form than is possible in the earlier...

  2. Statistical Analysis of Loss of Offsite Power Events

    Directory of Open Access Journals (Sweden)

    Andrija Volkanovski

    2016-01-01

    This paper presents the results of the statistical analysis of loss of offsite power (LOOP) events registered in four reviewed databases. The reviewed databases include the IRSN (Institut de Radioprotection et de Sûreté Nucléaire) SAPIDE database and the GRS (Gesellschaft für Anlagen- und Reaktorsicherheit mbH) VERA database, reviewed over the period from 1992 to 2011. The US NRC (Nuclear Regulatory Commission) Licensee Event Reports (LERs) database and the IAEA International Reporting System (IRS) database were screened for relevant events registered over the period from 1990 to 2013. The number of LOOP events in each year of the analysed period and the mode of operation are assessed during the screening. The LOOP frequencies obtained for the French and German nuclear power plants (NPPs) during critical operation are of the same order of magnitude, with plant-related events as a dominant contributor. A frequency of one LOOP event per shutdown year is obtained for German NPPs in shutdown mode of operation. For the US NPPs, the obtained LOOP frequency for critical and shutdown modes is comparable to the one assessed in NUREG/CR-6890. A decreasing trend is obtained for the LOOP events registered in three databases (IRSN, GRS, and NRC).
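
    A sketch of the frequency estimate described above: events per reactor-year with an exact (chi-square based) Poisson confidence interval. The counts are illustrative placeholders, not values from the reviewed databases.

```python
from scipy.stats import chi2

def loop_frequency(n_events, reactor_years, conf=0.95):
    """Point estimate and exact Poisson CI for an event frequency."""
    a = 1.0 - conf
    lower = chi2.ppf(a / 2, 2 * n_events) / (2 * reactor_years) if n_events else 0.0
    upper = chi2.ppf(1 - a / 2, 2 * (n_events + 1)) / (2 * reactor_years)
    return n_events / reactor_years, (lower, upper)

print(loop_frequency(n_events=24, reactor_years=800.0))   # illustrative counts
```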

  3. Statistical power to detect violation of the proportional hazards assumption when using the Cox regression model.

    Science.gov (United States)

    Austin, Peter C

    2018-01-01

    The use of the Cox proportional hazards regression model is widespread. A key assumption of the model is that of proportional hazards. Analysts frequently test the validity of this assumption using statistical significance testing. However, the statistical power of such assessments is frequently unknown. We used Monte Carlo simulations to estimate the statistical power of two different methods for detecting violations of this assumption. When the covariate was binary, we found that a model-based method had greater power than a method based on cumulative sums of martingale residuals. Furthermore, the parametric nature of the distribution of event times had an impact on power when the covariate was binary. Statistical power to detect a strong violation of the proportional hazards assumption was low to moderate even when the number of observed events was high. In many data sets, power to detect a violation of this assumption is likely to be low to modest.

  4. The power and statistical behaviour of allele-sharing statistics when ...

    Indian Academy of Sciences (India)

    Unknown

    …using seven statistics, of which five are implemented in the computer program SimWalk2 and two are implemented in GENEHUNTER. Unlike most previous reports, which involve evaluations of the power of allele-sharing statistics for a single …

  5. Sample Size and Statistical Power Calculation in Genetic Association Studies

    Directory of Open Access Journals (Sweden)

    Eun Pyo Hong

    2012-06-01

    A sample size with sufficient statistical power is critical to the success of genetic association studies to detect causal genes of human complex diseases. Genome-wide association studies require much larger sample sizes to achieve an adequate statistical power. We estimated the statistical power with increasing numbers of markers analyzed and compared the sample sizes that were required in case-control studies and case-parent studies. We computed the effective sample size and statistical power using the Genetic Power Calculator. An analysis using a larger number of markers requires a larger sample size. Testing a single-nucleotide polymorphism (SNP) marker requires 248 cases, while testing 500,000 SNPs and 1 million markers requires 1,206 cases and 1,255 cases, respectively, under the assumption of an odds ratio of 2, 5% disease prevalence, 5% minor allele frequency, complete linkage disequilibrium (LD), 1:1 case/control ratio, and a 5% error rate in an allelic test. Under a dominant model, a smaller sample size is required to achieve 80% power than under other genetic models. We found that a much lower sample size was required with a strong effect size, common SNP, and increased LD. In addition, studying a common disease in a case-control study with a 1:4 case-control ratio is one way to achieve higher statistical power. We also found that case-parent studies require more samples than case-control studies. Although we have not covered all plausible cases in study design, the estimates of sample size and statistical power computed under various assumptions in this study may be useful to determine the sample size in designing a population-based genetic association study.
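
    The record's sample-size logic can be sketched as a two-proportion allelic test with Bonferroni correction for the number of markers tested; the allele frequencies below are illustrative (control MAF 5%, case frequency implied by an odds ratio near 2), so the output approximates, but is not taken from, the paper's figures.

```python
import numpy as np
from scipy.stats import norm

def cases_needed(p_case, p_ctrl, n_markers=1, power=0.8, alpha=0.05, ratio=1.0):
    """Cases needed for an allelic two-proportion test; ratio = controls per case."""
    a = alpha / n_markers                       # Bonferroni-corrected alpha
    za, zb = norm.ppf(1 - a / 2), norm.ppf(power)
    pbar = (p_case + ratio * p_ctrl) / (1 + ratio)
    num = (za * np.sqrt((1 + 1 / ratio) * pbar * (1 - pbar))
           + zb * np.sqrt(p_case * (1 - p_case) + p_ctrl * (1 - p_ctrl) / ratio)) ** 2
    alleles = num / (p_case - p_ctrl) ** 2      # alleles needed in the case group
    return int(np.ceil(alleles / 2))            # two alleles per person

for m in (1, 500_000, 1_000_000):
    print(m, cases_needed(p_case=0.095, p_ctrl=0.05, n_markers=m))
```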

  6. Statistical Assessment Of Critical Factors Affecting Computer ...

    African Journals Online (AJOL)

    This paper reports the findings of a statistical assessment of some factors affecting computer application in information resource sharing. Secondary and primary data were employed in this study. Quota and random sampling were used to administer questionnaires using an interviewer-administered methodology of data ...

  7. New Dynamical-Statistical Techniques for Wind Power Prediction

    Science.gov (United States)

    Stathopoulos, C.; Kaperoni, A.; Galanis, G.; Kallos, G.

    2012-04-01

    The increased use of renewable energy sources, and especially of wind power, has revealed the significance of accurate environmental and wind power predictions over wind farms, which critically affect the integration of the produced power into the general grid. This issue is studied in the present paper by means of high resolution physical and statistical models. Two numerical weather prediction (NWP) systems, namely SKIRON and RAMS, are used to simulate the flow characteristics in selected wind farms in Greece. The NWP model output is post-processed by utilizing Kalman and Kolmogorov statistics in order to remove systematic errors. Modeled wind predictions in combination with available on-site observations are used for estimation of the wind power potential by utilizing a variety of statistical power prediction models based on non-linear and hyperbolic functions. The obtained results reveal the strong dependence of the forecast uncertainty on the wind variation, the limited influence of previously recorded power values and the advantages that nonlinear, non-polynomial functions could have in the successful control of power curve characteristics. This methodology is developed within the framework of the FP7 projects WAUDIT and MARINA PLATFORM.

  8. Wind power error estimation in resource assessments.

    Directory of Open Access Journals (Sweden)

    Osvaldo Rodríguez

    Estimating the power output is one of the elements that determine the techno-economic feasibility of a renewable project. At present, there is a need to develop reliable methods that achieve this goal, thereby contributing to wind power penetration. In this study, we propose a method for wind power error estimation based on the wind speed measurement error, the probability density function, and wind turbine power curves. This method uses the actual wind speed data without prior statistical treatment, based on 28 wind turbine power curves fitted by Lagrange's method, to calculate the estimated wind power output and the corresponding error propagation. We found that wind speed percentage errors of 10% were propagated into the power output estimates, yielding an error of 5%. The proposed error propagation complements traditional power resource assessments. The wind power estimation error also allows us to estimate intervals for the levelized cost of power production or the investment return time. The implementation of this method increases the reliability of techno-economic resource assessment studies.
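
    A sketch of the error propagation idea, assuming a simple cubic-below-rated power curve as a stand-in for the 28 Lagrange-fitted curves: a fractional wind speed error is mapped to a power error through the local slope dP/du, so the relative power error depends strongly on where on the curve the speed lies.

```python
import numpy as np

def power_curve(u, rated_power=2000.0, cut_in=3.0, rated_speed=12.0, cut_out=25.0):
    """Illustrative kW power curve: cubic between cut-in and rated speed."""
    u = np.asarray(u, float)
    p = rated_power * ((u - cut_in) / (rated_speed - cut_in)) ** 3
    return np.where((u < cut_in) | (u > cut_out), 0.0, np.minimum(p, rated_power))

def power_error(u, speed_error_frac=0.10, h=1e-3):
    """Propagate a fractional wind speed error via a finite-difference slope."""
    dPdu = (power_curve(u + h) - power_curve(u - h)) / (2 * h)
    return np.abs(dPdu) * speed_error_frac * u

u = np.array([5.0, 8.0, 11.0])
print(power_error(u) / power_curve(u))   # relative power error per speed bin
```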

  9. Wind power error estimation in resource assessments.

    Science.gov (United States)

    Rodríguez, Osvaldo; Del Río, Jesús A; Jaramillo, Oscar A; Martínez, Manuel

    2015-01-01

    Estimating the power output is one of the elements that determine the techno-economic feasibility of a renewable project. At present, there is a need to develop reliable methods that achieve this goal, thereby contributing to wind power penetration. In this study, we propose a method for wind power error estimation based on the wind speed measurement error, the probability density function, and wind turbine power curves. This method uses the actual wind speed data without prior statistical treatment, based on 28 wind turbine power curves fitted by Lagrange's method, to calculate the estimated wind power output and the corresponding error propagation. We found that wind speed percentage errors of 10% were propagated into the power output estimates, yielding an error of 5%. The proposed error propagation complements traditional power resource assessments. The wind power estimation error also allows us to estimate intervals for the levelized cost of power production or the investment return time. The implementation of this method increases the reliability of techno-economic resource assessment studies.

  10. The power and robustness of maximum LOD score statistics.

    Science.gov (United States)

    Yoo, Y J; Mendell, N R

    2008-07-01

    The maximum LOD score statistic is extremely powerful for gene mapping when calculated using the correct genetic parameter value. When the mode of genetic transmission is unknown, the maximum of the LOD scores obtained using several genetic parameter values is reported. This latter statistic requires a higher critical value than the maximum LOD score statistic calculated from a single genetic parameter value. In this paper, we compare the power of maximum LOD scores based on three fixed sets of genetic parameter values with the power of the LOD score obtained after maximizing over the entire range of genetic parameter values. We simulate family data under nine generating models. For generating models with non-zero phenocopy rates, LOD scores maximized over the entire range of genetic parameters yielded greater power than maximum LOD scores for fixed sets of parameter values with zero phenocopy rates. No maximum LOD score was consistently more powerful than the others for generating models with a zero phenocopy rate. The power loss of the LOD score maximized over the entire range of genetic parameters, relative to the maximum LOD score calculated using the correct genetic parameter value, appeared to be robust to the generating models.

  11. Statistical Modeling of Soi Devices for Low-Power Electronics.

    Science.gov (United States)

    Phelps, Mark Joseph

    1995-01-01

    This dissertation addresses the needs of low-power, large-scale integrated circuit device design, advanced materials technology, and computer simulation for statistical modeling. The main body of work comprises the creation and implementation of a software shell (STADIUM-SOI) that automates the application of statistics to commercial technology computer-aided design tools. The objective is to demonstrate that statistical design of experiments methodology can be employed for the advanced material technology of Silicon-On-Insulator (SOI) devices. The culmination of this effort was the successful modeling of the effect of manufacturing process variation on SOI device characteristics and the automation of this procedure.

  12. Harnessing the power of civil registration and vital statistics systems ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2018-02-27

    Harnessing the power of civil registration and vital statistics systems. Event date: February 27, 2018 to February 28, 2018. Location: IDRC. 150 Kent Street, 8th floor. Ottawa, ON. Canada. Time: 9:00am - 5:45pm. This conference will ...

  13. Robust Statistical Detection of Power-Law Cross-Correlation

    Science.gov (United States)

    Blythe, Duncan A. J.; Nikulin, Vadim V.; Müller, Klaus-Robert

    2016-06-01

    We show that widely used approaches in statistical physics incorrectly indicate the existence of power-law cross-correlations between financial stock market fluctuations measured over several years and the neuronal activity of the human brain lasting for only a few minutes. While such cross-correlations are nonsensical, no current methodology allows them to be reliably discarded, leaving researchers at greater risk when the spurious nature of cross-correlations is not clear from the unrelated origin of the time series and rather requires careful statistical estimation. Here we propose a theory and method (PLCC-test) which allows us to rigorously and robustly test for power-law cross-correlations, correctly detecting genuine and discarding spurious cross-correlations, thus establishing meaningful relationships between processes in complex physical systems. Our method reveals for the first time the presence of power-law cross-correlations between amplitudes of the alpha and beta frequency ranges of the human electroencephalogram.

  14. Statistical analyses support power law distributions found in neuronal avalanches.

    Directory of Open Access Journals (Sweden)

    Andreas Klaus

    The size distribution of neuronal avalanches in cortical networks has been reported to follow a power law distribution with exponent close to -1.5, which is a reflection of long-range spatial correlations in spontaneous neuronal activity. However, identifying power law scaling in empirical data can be difficult and sometimes controversial. In the present study, we tested the power law hypothesis for neuronal avalanches by using more stringent statistical analyses. In particular, we performed the following steps: (i) analysis of finite-size scaling to identify scale-free dynamics in neuronal avalanches, (ii) model parameter estimation to determine the specific exponent of the power law, and (iii) comparison of the power law to alternative model distributions. Consistent with critical state dynamics, avalanche size distributions exhibited robust scaling behavior in which the maximum avalanche size was limited only by the spatial extent of sampling ("finite size" effect). This scale-free dynamics suggests the power law as a model for the distribution of avalanche sizes. Using both the Kolmogorov-Smirnov statistic and a maximum likelihood approach, we found the slope to be close to -1.5, which is in line with previous reports. Finally, the power law model for neuronal avalanches was compared to the exponential and to various heavy-tail distributions based on the Kolmogorov-Smirnov distance and by using a log-likelihood ratio test. Both the power law distribution without and with exponential cut-off provided significantly better fits to the cluster size distributions in neuronal avalanches than the exponential, the lognormal and the gamma distribution. In summary, our findings strongly support the power law scaling in neuronal avalanches, providing further evidence for critical state dynamics in superficial layers of cortex.
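
    Steps (ii) and (iii) can be sketched for continuous data as follows, using the standard maximum likelihood estimator of the power-law exponent and the Kolmogorov-Smirnov distance of the fitted tail; this is a generic illustration, not the authors' exact pipeline.

```python
import numpy as np

def fit_power_law(x, xmin):
    """MLE exponent alpha for a continuous power law p(x) ~ x^-alpha, x >= xmin,
    plus the KS distance between the empirical and fitted tail CDFs."""
    tail = x[x >= xmin]
    alpha = 1.0 + tail.size / np.sum(np.log(tail / xmin))
    s = np.sort(tail)
    emp = np.arange(1, s.size + 1) / s.size
    fit = 1.0 - (s / xmin) ** (1.0 - alpha)
    return alpha, np.max(np.abs(emp - fit))

rng = np.random.default_rng(0)
x = rng.pareto(0.5, 10_000) + 1.0      # exact power law with alpha = 1.5
print(fit_power_law(x, xmin=1.0))      # alpha near 1.5, small KS distance
```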

  15. Maximizing Statistical Power When Verifying Probabilistic Forecasts of Hydrometeorological Events

    Science.gov (United States)

    DeChant, C. M.; Moradkhani, H.

    2014-12-01

    Hydrometeorological events (i.e. floods, droughts, precipitation) are increasingly being forecasted probabilistically, owing to the uncertainties in the underlying causes of the phenomenon. In these forecasts, the probability of the event, over some lead time, is estimated based on some model simulations or predictive indicators. By issuing probabilistic forecasts, agencies may communicate the uncertainty in the event occurring. Assuming that the assigned probability of the event is correct, which is referred to as a reliable forecast, the end user may perform some risk management based on the potential damages resulting from the event. Alternatively, an unreliable forecast may give false impressions of the actual risk, leading to improper decision making when protecting resources from extreme events. Due to this requisite for reliable forecasts to perform effective risk management, this study takes a renewed look at reliability assessment in event forecasts. Illustrative experiments will be presented, showing deficiencies in the commonly available approaches (Brier Score, Reliability Diagram). Overall, it is shown that the conventional reliability assessment techniques do not maximize the ability to distinguish between a reliable and unreliable forecast. In this regard, a theoretical formulation of the probabilistic event forecast verification framework will be presented. From this analysis, hypothesis testing with the Poisson-Binomial distribution is the most exact model available for the verification framework, and therefore maximizes one's ability to distinguish between a reliable and unreliable forecast. Application of this verification system was also examined within a real forecasting case study, highlighting the additional statistical power provided with the use of the Poisson-Binomial distribution.
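
    A minimal sketch of the verification idea, assuming a forecaster issues a probability p_i for each event occasion: under reliability, the number of observed events follows a Poisson-Binomial distribution, whose pmf can be built by dynamic programming and used for an exact test.

```python
import numpy as np

def poisson_binomial_pmf(probs):
    """pmf of the number of successes among independent Bernoulli(p_i) trials."""
    pmf = np.array([1.0])
    for p in probs:
        pmf = np.convolve(pmf, [1.0 - p, p])    # add one Bernoulli trial
    return pmf

def reliability_pvalue(probs, observed):
    """Exact two-sided p-value: total probability of outcomes no more likely
    than the observed event count, under the reliability hypothesis."""
    pmf = poisson_binomial_pmf(np.asarray(probs, float))
    return pmf[pmf <= pmf[observed] + 1e-12].sum()

forecast_probs = [0.1, 0.4, 0.7, 0.2, 0.9, 0.5]   # illustrative forecasts
print(reliability_pvalue(forecast_probs, observed=5))
```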

  16. Design sensitivity and statistical power in acceptability judgment experiments

    Directory of Open Access Journals (Sweden)

    Jon Sprouse

    2017-02-01

    Previous investigations into the validity of acceptability judgment data have focused almost exclusively on type I errors (or false positives) because of the consequences of such errors for syntactic theories (Sprouse & Almeida 2012; Sprouse et al. 2013). The current study complements these previous studies by systematically investigating the type II error rate (false negatives, or equivalently, the statistical power) of a wide cross-section of possible acceptability judgment experiments. Though type II errors have historically been assumed to be less costly than type I errors, the dynamics of scientific publishing mean that high type II error rates (i.e., studies with low statistical power) can lead to increases in type I error rates in a given field of study. We present a set of experiments and resampling simulations to estimate statistical power for four tasks (forced-choice, Likert scale, magnitude estimation, and yes-no), 50 effect sizes instantiated by real phenomena, sample sizes from 5 to 100 participants, and two approaches to statistical analysis (null hypothesis and Bayesian). Our goals are twofold: (i) to provide a fuller picture of the status of acceptability judgment data in syntax, and (ii) to provide detailed information that syntacticians can use to design and evaluate the sensitivity of acceptability judgment experiments in their own research.

  17. When Mathematics and Statistics Collide in Assessment Tasks

    Science.gov (United States)

    Bargagliotti, Anna; Groth, Randall

    2016-01-01

    Because the disciplines of mathematics and statistics are naturally intertwined, designing assessment questions that disentangle mathematical and statistical reasoning can be challenging. We explore the writing of statistics assessment tasks that take into consideration the potential mathematical reasoning they may inadvertently activate.

  18. Effect size, confidence intervals and statistical power in psychological research.

    Directory of Open Access Journals (Sweden)

    Téllez A.

    2015-07-01

    Quantitative psychological research is focused on detecting the occurrence of certain population phenomena by analyzing data from a sample, and statistics is a particularly helpful mathematical tool that is used by researchers to evaluate hypotheses and make decisions to accept or reject such hypotheses. In this paper, the various statistical tools in psychological research are reviewed. The limitations of null hypothesis significance testing (NHST) and the advantages of using effect size and its respective confidence intervals are explained, as the latter two measurements can provide important information about the results of a study. These measurements also can facilitate data interpretation and easily detect trivial effects, enabling researchers to make decisions in a more clinically relevant fashion. Moreover, it is recommended to establish an appropriate sample size by calculating the optimum statistical power at the moment that the research is designed. Psychological journal editors are encouraged to follow APA recommendations strictly and ask authors of original research studies to report the effect size, its confidence intervals, statistical power and, when required, any measure of clinical significance. Additionally, we must account for the teaching of statistics at the graduate level. At that level, students do not receive sufficient information concerning the importance of using different types of effect sizes and their confidence intervals according to the different types of research designs; instead, most of the information is focused on the various tools of NHST.

  19. Statistical methods for assessment of blend homogeneity

    DEFF Research Database (Denmark)

    Madsen, Camilla

    2002-01-01

    In this thesis the use of various statistical methods to address some of the problems related to assessment of the homogeneity of powder blends in tablet production is discussed. It is not straightforward to assess the homogeneity of a powder blend. The reason is partly that in bulk materials… of internal factors to the blend, e.g. the particle size distribution. The relation between particle size distribution and the variation in drug content in blend and tablet samples is discussed. A central problem is to develop acceptance criteria for blends and tablet batches to decide whether the blend… Some methods have a focus on exploratory analysis where the aim is to investigate the spatial distribution of drug content in the batch. Other methods presented focus on describing the overall (total) (in)homogeneity of the blend. The overall (in)homogeneity of the blend is relevant as it is closely…

  20. Statistical aspects of fish stock assessment

    DEFF Research Database (Denmark)

    Berg, Casper Willestofte

    Fish stock assessments are conducted for two main purposes: 1) To estimate past and present fish abundances and their commercial exploitation rates. 2) To predict the consequences of different management strategies in order to ensure a sustainable fishery in the future. This thesis concerns statistical aspects of fish stock assessment, which includes topics such as time series analysis, generalized additive models (GAMs), and non-linear state-space/mixed models capable of handling missing data and a high number of latent states and parameters. The aim is to improve the existing methods… on stochastic differential equations is presented. This work extends the classical approaches to biomass modelling by incorporating observation errors on the catches, and allowing for missing and non-equidistant samples in time.

  1. Development and testing of improved statistical wind power forecasting methods.

    Energy Technology Data Exchange (ETDEWEB)

    Mendes, J.; Bessa, R.J.; Keko, H.; Sumaili, J.; Miranda, V.; Ferreira, C.; Gama, J.; Botterud, A.; Zhou, Z.; Wang, J. (Decision and Information Sciences); (INESC Porto)

    2011-12-06

    Wind power forecasting (WPF) provides important inputs to power system operators and electricity market participants. It is therefore not surprising that WPF has attracted increasing interest within the electric power industry. In this report, we document our research on improving statistical WPF algorithms for point, uncertainty, and ramp forecasting. Below, we provide a brief introduction to the research presented in the following chapters. For a detailed overview of the state-of-the-art in wind power forecasting, we refer to [1]. Our related work on the application of WPF in operational decisions is documented in [2]. Point forecasts of wind power are highly dependent on the training criteria used in the statistical algorithms that are used to convert weather forecasts and observational data to a power forecast. In Chapter 2, we explore the application of information theoretic learning (ITL) as opposed to the classical minimum square error (MSE) criterion for point forecasting. In contrast to the MSE criterion, ITL criteria do not assume a Gaussian distribution of the forecasting errors. We investigate to what extent ITL criteria yield better results. In addition, we analyze time-adaptive training algorithms and how they enable WPF algorithms to cope with non-stationary data and, thus, to adapt to new situations without requiring additional offline training of the model. We test the new point forecasting algorithms on two wind farms located in the U.S. Midwest. Although there have been advancements in deterministic WPF, a single-valued forecast cannot provide information on the dispersion of observations around the predicted value. We argue that it is essential to generate, together with (or as an alternative to) point forecasts, a representation of the wind power uncertainty. Wind power uncertainty representation can take the form of probabilistic forecasts (e.g., probability density function, quantiles), risk indices (e.g., prediction risk index) or scenarios

  2. Assessment Methods in Statistical Education An International Perspective

    CERN Document Server

    Bidgood, Penelope; Jolliffe, Flavia

    2010-01-01

    This book is a collaboration of leading figures in statistical education and is designed primarily for academic audiences involved in teaching statistics and mathematics. The book is divided into four sections: (1) Assessment using real-world problems, (2) Assessing statistical thinking, (3) Individual assessment, and (4) Successful assessment strategies.

  3. Assessing Statistical Model Assumptions under Climate Change

    Science.gov (United States)

    Varotsos, Konstantinos V.; Giannakopoulos, Christos; Tombrou, Maria

    2016-04-01

    The majority of studies assess climate change impacts on air quality using chemical transport models coupled to climate models in an off-line mode, for various horizontal resolutions and different present and future time slices. A complementary approach is based on present-day empirical relations between air pollutants and various meteorological variables, which are then extrapolated to the future. However, the extrapolation relies on various assumptions, such as that these relationships will retain their main characteristics in the future. In this study we focus on the ozone-temperature relationship. It is well known that among a number of meteorological variables, temperature is found to exhibit the highest correlation with ozone concentrations. This has led, in the past years, to the development and application of statistical models with which the potential impact of increasing future temperatures on various ozone statistical targets was examined. To examine whether the ozone-temperature relationship retains its main characteristics under warmer temperatures, we analyze the relationship during the heatwave events of 2003 and 2006 in Europe. More specifically, we use available gridded daily maximum temperatures (E-OBS) and hourly ozone observations from different non-urban stations (EMEP) within the areas that were impacted by the two heatwave events. In addition, we compare the temperature distributions of the two events with temperatures from two different future time periods, 2021-2050 and 2071-2100, from a number of regional climate models developed under the framework of the Cordex initiative (http://www.cordex.org) with a horizontal resolution of 12 x 12 km, based on different IPCC RCP emissions scenarios. A statistical analysis is performed on the ozone-temperature relationship for each station and for the two aforementioned years, which are then compared against the ozone-temperature relationships obtained from the rest of the available data series.

  4. GNSS Spoofing Detection Based on Signal Power Measurements: Statistical Analysis

    Directory of Open Access Journals (Sweden)

    V. Dehghanian

    2012-01-01

    A threat to GNSS receivers is posed by a spoofing transmitter that emulates authentic signals but with randomized code phase and Doppler values over a small range. Such spoofing signals can result in large navigational solution errors that are passed onto the unsuspecting user with potentially dire consequences. An effective spoofing detection technique is developed in this paper, based on signal power measurements and that can be readily applied to present consumer grade GNSS receivers with minimal firmware changes. An extensive statistical analysis is carried out based on formulating a multihypothesis detection problem. Expressions are developed to devise a set of thresholds required for signal detection and identification. The detection processing methods developed are further manipulated to exploit incidental antenna motion arising from user interaction with a GNSS handheld receiver to further enhance the detection performance of the proposed algorithm. The statistical analysis supports the effectiveness of the proposed spoofing detection technique under various multipath conditions.
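
    A heavily simplified sketch of a power-measurement detector in the spirit of the record, assuming the received C/N0 is approximately Gaussian under the authentic-signal hypothesis; the nominal mean, spread and false-alarm rate are illustrative assumptions, and the multi-hypothesis identification stage is omitted.

```python
import numpy as np
from scipy.stats import norm

def detection_threshold(mu0, sigma0, pfa=1e-3):
    """Power threshold giving false-alarm probability pfa under the no-spoofing hypothesis."""
    return norm.ppf(1.0 - pfa, loc=mu0, scale=sigma0)

def detect(cn0_measurements, mu0=45.0, sigma0=1.5, pfa=1e-3):
    """Flag measurements whose power exceeds the threshold (possible spoofing)."""
    thr = detection_threshold(mu0, sigma0, pfa)
    return np.asarray(cn0_measurements) > thr

print(detect([44.8, 45.2, 49.7]))   # illustrative C/N0 values in dB-Hz
```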

  5. Statistical Analysis of the Impact of Wind Power on Market Quantities and Power Flows

    DEFF Research Database (Denmark)

    Pinson, Pierre; Jónsson, Tryggvi; Zugno, Marco

    2012-01-01

    In view of the increasing penetration of wind power in a number of power systems and markets worldwide, we discuss some of the impacts that wind energy may have on market quantities and cross-border power flows. These impacts are uncovered through statistical analyses of actual market and flow data in Europe. Due to the dimensionality and nonlinearity of these effects, the necessary concepts of dimension reduction using Principal Component Analysis (PCA), as well as nonlinear regression, are described. Example application results are given for European cross-border flows, as well as for the impact of load and wind power forecasts on Danish and German electricity markets.

  6. In vivo Comet assay – statistical analysis and power calculations of mice testicular cells

    DEFF Research Database (Denmark)

    Hansen, Merete Kjær; Sharma, Anoop Kumar; Dybdahl, Marianne

    2014-01-01

    …statistic to use has yet to be reached. Another important consideration concerns the assessment of proper sample sizes in the design of Comet assay studies. This study aims to identify a statistic suitably summarizing the % tail DNA of mice testicular samples in Comet assay studies. A second aim… A linear mixed-effects model was fitted to the summarized data and the estimated variance components were used to generate power curves as a function of sample size. The statistic that most appropriately summarized the within-sample distributions was the median of the log-transformed data, as it most consistently conformed to the assumptions of the statistical model. Power curves for 1.5-, 2-, and 2.5-fold changes of the highest dose group compared to the control group, when 50 and 100 cells were scored per gel, are provided to aid in the design of future Comet assay studies on testicular cells.
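
    A sketch of how such power curves can be generated from variance components, assuming the per-gel summary is the median of log-transformed % tail DNA and illustrative between-animal and between-gel standard deviations; the values are placeholders, not the study's estimates.

```python
import numpy as np
from scipy import stats

def power_curve(fold_change, n_per_group, sd_animal=0.25, sd_gel=0.15,
                gels_per_animal=2, alpha=0.05, n_sim=2000, seed=3):
    """Simulated power to detect a fold change on the log scale, testing animal means."""
    rng = np.random.default_rng(seed)
    delta = np.log(fold_change)
    hits = 0
    for _ in range(n_sim):
        def group(mean):
            animal = rng.normal(mean, sd_animal, n_per_group)       # animal effects
            gels = animal[:, None] + rng.normal(0, sd_gel, (n_per_group, gels_per_animal))
            return gels.mean(axis=1)                                # per-animal summary
        t = stats.ttest_ind(group(0.0), group(delta))
        hits += t.pvalue < alpha
    return hits / n_sim

for n in (4, 6, 8, 10):                 # animals per group
    print(n, power_curve(fold_change=2.0, n_per_group=n))
```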

  7. Model output statistics applied to wind power prediction

    Energy Technology Data Exchange (ETDEWEB)

    Joensen, A.; Giebel, G.; Landberg, L. [Risoe National Lab., Roskilde (Denmark); Madsen, H.; Nielsen, H.A. [The Technical Univ. of Denmark, Dept. of Mathematical Modelling, Lyngby (Denmark)

    1999-03-01

    Being able to predict the output of a wind farm online for a day or two in advance has significant advantages for utilities, such as better possibility to schedule fossil fuelled power plants and a better position on electricity spot markets. In this paper prediction methods based on Numerical Weather Prediction (NWP) models are considered. The spatial resolution used in NWP models implies that these predictions are not valid locally at a specific wind farm. Furthermore, due to the non-stationary nature and complexity of the processes in the atmosphere, and occasional changes of NWP models, the deviation between the predicted and the measured wind will be time dependent. If observational data is available, and if the deviation between the predictions and the observations exhibits systematic behavior, this should be corrected for; if statistical methods are used, this approach is usually referred to as MOS (Model Output Statistics). The influence of atmospheric turbulence intensity, topography, prediction horizon length and auto-correlation of wind speed and power is considered, and to take the time-variations into account, adaptive estimation methods are applied. Three estimation techniques are considered and compared: Extended Kalman Filtering, recursive least squares and a new modified recursive least squares algorithm. (au) EU-JOULE-3. 11 refs.
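
    A minimal sketch of an adaptive MOS correction, assuming NWP wind speed is the single regressor: recursive least squares with a forgetting factor tracks the time-varying bias between predicted and observed wind. The forgetting factor and initialization are illustrative choices, not the paper's settings.

```python
import numpy as np

class RLS:
    """Recursive least squares with exponential forgetting."""
    def __init__(self, n_params, forgetting=0.99, delta=100.0):
        self.theta = np.zeros(n_params)        # regression coefficients
        self.P = delta * np.eye(n_params)      # inverse information matrix
        self.lam = forgetting

    def update(self, x, y):
        x = np.asarray(x, float)
        k = self.P @ x / (self.lam + x @ self.P @ x)   # gain vector
        self.theta += k * (y - x @ self.theta)         # correct by the residual
        self.P = (self.P - np.outer(k, x @ self.P)) / self.lam

mos = RLS(n_params=2)
for v_nwp, v_obs in [(8.0, 6.9), (7.5, 6.6), (9.0, 7.7), (6.0, 5.3)]:
    mos.update([1.0, v_nwp], v_obs)            # intercept + NWP speed
print(mos.theta)                               # adaptive bias and slope correction
```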

  8. Prediction of lacking control power in power plants using statistical models

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Mataji, B.; Stoustrup, Jakob

    2007-01-01

    Prediction of the performance of plants like power plants is of interest, since the plant operator can use these predictions to optimize the plant production. In this paper the focus is on a special case where a combination of high coal moisture content and a high load limits the possible plant load, meaning that the requested plant load cannot be met. The available models are in this case uncertain. Instead, statistical methods are used to predict upper and lower uncertainty bounds on the prediction. Two different methods are used: the first relies on statistics of recent prediction errors; the second uses operating-point-dependent statistics of prediction errors. Using these methods on the previously mentioned case, it can be concluded that the second method can be used to predict the power plant performance, while the first method has problems predicting the uncertain performance.
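
    The first method, bounds from statistics of recent prediction errors, amounts to attaching empirical error quantiles to the model output. A minimal sketch under assumed window length and coverage level (both illustrative, not values from the paper):

```python
import numpy as np

def error_quantile_bounds(predictions, observations, window=200, alpha=0.1):
    """Upper/lower prediction bounds from the empirical quantiles of the
    most recent prediction errors (sketch of the first method above)."""
    errors = observations - predictions
    recent = errors[-window:]                     # statistics of recent errors only
    lo, hi = np.quantile(recent, [alpha / 2, 1 - alpha / 2])
    next_pred = predictions[-1]                   # stand-in for the next model output
    return next_pred + lo, next_pred + hi

preds = np.linspace(300, 320, 250)                # hypothetical plant-load predictions (MW)
obs = preds + np.random.default_rng(1).normal(0, 5, size=250)
print(error_quantile_bounds(preds, obs))
```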

  9. Power-law tailed statistical distributions and Lorentz transformations

    Energy Technology Data Exchange (ETDEWEB)

    Kaniadakis, G., E-mail: giorgio.kaniadakis@polito.i [Dipartimento di Fisica, Politecnico di Torino, Corso Duca degli Abruzzi 24, 10129 Torino (Italy)

    2011-01-17

    The present Letter deals with the statistical theory [G. Kaniadakis, Phys. Rev. E 66 (2002) 056125; G. Kaniadakis, Phys. Rev. E 72 (2005) 036108], which predicts the probability distribution p(E) ∝ exp_κ(−I), where I ∝ βE − βμ is the collision invariant, and exp_κ(x) = (√(1 + κ²x²) + κx)^(1/κ), with κ² < 1. This experimentally observed distribution behaves at low energies as the Maxwell-Boltzmann exponential distribution, while at high energies it presents power-law tails. Here we show that the function exp_κ(x) and its inverse ln_κ(x) can be obtained within the one-particle relativistic dynamics, in a very simple and transparent way, without invoking any extra principle or assumption, starting directly from the Lorentz transformations. The achievements support the idea that the power-law tailed distributions are enforced by the Lorentz relativistic microscopic dynamics, like in the case of the exponential distribution, which follows from the Newton classical microscopic dynamics.
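
    With the formula restored above, the two limiting behaviors the Letter describes are easy to verify numerically; the following is a plain transcription of the definition, not code from the Letter:

```python
# Kaniadakis kappa-exponential and its two limits: exp(x) as kappa -> 0,
# and power-law growth ~ (2*kappa*x)**(1/kappa) for large x.
import numpy as np

def exp_kappa(x, kappa):
    """exp_k(x) = (sqrt(1 + k^2 x^2) + k x)**(1/k), reducing to exp(x) at k = 0."""
    x = np.asarray(x, dtype=float)
    if kappa == 0.0:
        return np.exp(x)                          # classical (Maxwell-Boltzmann) limit
    return (np.sqrt(1.0 + (kappa * x) ** 2) + kappa * x) ** (1.0 / kappa)

x = np.linspace(-2.0, 5.0, 8)
print(np.allclose(exp_kappa(x, 1e-6), np.exp(x), rtol=1e-4))   # True: recovers e^x
print(exp_kappa(50.0, 0.5), (2 * 0.5 * 50.0) ** (1 / 0.5))     # near-power-law tail
```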

  10. Power-law tailed statistical distributions and Lorentz transformations

    Science.gov (United States)

    Kaniadakis, G.

    2011-01-01

    The present Letter deals with the statistical theory [G. Kaniadakis, Phys. Rev. E 66 (2002) 056125; G. Kaniadakis, Phys. Rev. E 72 (2005) 036108], which predicts the probability distribution p(E) ∝ exp_κ(−I), where I ∝ βE − βμ is the collision invariant, and exp_κ(x) = (√(1 + κ²x²) + κx)^(1/κ), with κ² < 1. This experimentally observed distribution behaves at low energies as the Maxwell-Boltzmann exponential distribution, while at high energies it presents power-law tails. Here we show that the function exp_κ(x) and its inverse ln_κ(x) can be obtained within the one-particle relativistic dynamics, in a very simple and transparent way, without invoking any extra principle or assumption, starting directly from the Lorentz transformations. The achievements support the idea that the power-law tailed distributions are enforced by the Lorentz relativistic microscopic dynamics, like in the case of the exponential distribution, which follows from the Newton classical microscopic dynamics.

  11. HVDC power transmission technology assessment

    Energy Technology Data Exchange (ETDEWEB)

    Hauth, R.L.; Tatro, P.J.; Railing, B.D. [New England Power Service Co., Westborough, MA (United States); Johnson, B.K.; Stewart, J.R. [Power Technologies, Inc., Schenectady, NY (United States); Fink, J.L.

    1997-04-01

    The purpose of this study was to develop an assessment of the national utility system's needs for electric transmission during the period 1995-2020 that could be met by future reduced-cost HVDC systems. The assessment was to include an economic evaluation of HVDC as a means for meeting those needs as well as a comparison with competing technologies such as ac transmission with and without Flexible AC Transmission System (FACTS) controllers. The role of force-commutated dc converters was to be assumed where appropriate. The assessment begins by identifying the general needs for transmission in the U.S. in the context of a future deregulated power industry. The possible roles for direct current transmission are then postulated in terms of representative scenarios. A few of the scenarios are illustrated with the help of actual U.S. system examples. Non-traditional applications as well as traditional applications such as long lines and asynchronous interconnections are discussed. The classical "break-even distance" concept for comparing HVDC and ac lines is used to assess the selected scenarios. The impact of reduced-cost converters is reflected in terms of the break-even distance. This report presents a comprehensive review of the functional benefits of HVDC transmission and updated cost data for both ac and dc system components. It also provides some provocative thoughts on how direct current transmission might be applied to better utilize and expand our nation's increasingly stressed transmission assets.

  12. Wind power statistics for Germany. Power from wind; Leistungsstatistik der Windkraftanlagen in Deutschland. Leistung aus Wind

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-07-01

    This is the German wind power statistics for January through April 2003. In all, 4,168 plants with a total capacity of 3,661.35 MW are covered, i.e. about one third of all German wind power plants. The statistics are incomplete as they are based on data submitted voluntarily by wind power plant operators. Next to Windstats, it is the world's biggest wind power statistics. [Translated from German] The following statistics present the outputs for the months of January, February, March and April 2003. Output data were reported by manufacturers and operators for 4,168 plants with a total capacity of 3,661.35 MW. Only those plants that report their monthly yields to the Ingenieur-Werkstatt Energietechnik are included; plants without output reports are not listed. These statistics are therefore incomplete, as only about every third wind turbine in Germany is covered. After 'Windstats', however, this is the most comprehensive collection of data on the output of wind power plants worldwide. (orig.)

  13. A Statistical Procedure for Assessing Test Dimensionality.

    Science.gov (United States)

    1984-03-09

    ...imagine applications in other fields. As an illustration, suppose that medical subjects (the units) undergo allergy sensitivity tests to various... Inferences about the dimensionality of θ become meaningful in attempting to develop a classification scheme for allergies. Description of the Statistic...

  14. Improved power performance assessment methods

    Energy Technology Data Exchange (ETDEWEB)

    Frandsen, S.; Antoniou, I.; Dahlberg, J.A. [and others]

    1999-03-01

    The uncertainty of presently-used methods for retrospective assessment of the productive capacity of wind farms is unacceptably large. The possibilities of improving the accuracy have been investigated and are reported. A method is presented that includes an extended power curve and site calibration. In addition, blockage effects with respect to reference wind speed measurements are analysed. It is found that significant accuracy improvements are possible by the introduction of more input variables such as turbulence and wind shear, in addition to mean wind speed and air density. Also, the testing of several or all machines in the wind farm - instead of only one or two - may provide a better estimate of the average performance. (au)

  15. Statistical Thinking Activities: Some Simple Exercises with Powerful Lessons

    Science.gov (United States)

    Melton, Kim I.

    2004-01-01

    Statistical thinking is required for good statistical analysis. Among other things, statistical thinking involves identifying sources of variation. Students in introductory statistics courses seldom recognize that one of the largest sources of variation may come in the collection and recording of the data. This paper presents some simple exercises…

  16. A statistical model of uplink inter-cell interference with slow and fast power control mechanisms

    KAUST Repository

    Tabassum, Hina

    2013-09-01

    Uplink power control is in essence an interference mitigation technique that aims at minimizing the inter-cell interference (ICI) in cellular networks by reducing the transmit power levels of the mobile users while maintaining their target received signal quality levels at base stations. Power control mechanisms directly impact the interference dynamics and, thus, affect the overall achievable capacity and consumed power in cellular networks. Due to the stochastic nature of wireless channels and mobile users' locations, it is important to derive theoretical models for ICI that can capture the impact of design alternatives related to power control mechanisms. To this end, we derive and verify a novel statistical model for uplink ICI in Generalized-K composite fading environments as a function of various slow and fast power control mechanisms. The derived expressions are then utilized to quantify numerically key network performance metrics that include average resource fairness, average reduction in power consumption, and ergodic capacity. The accuracy of the derived expressions is validated via Monte-Carlo simulations. Results are generated for multiple network scenarios, and insights are extracted to assess various power control mechanisms as a function of system parameters.

  17. Testing statistical hypotheses based on the density power divergence

    National Research Council Canada - National Science Library

    Basu, A; Mandal, A; Martin, N; Pardo, L

    2013-01-01

    .... It is shown that the alternative test statistics proposed herein have asymptotic limits which are described by linear combinations of Chi-square statistics. Extensive simulation results are presented to substantiate the theory developed.

  18. The effect of cluster size variability on statistical power in cluster-randomized trials.

    Directory of Open Access Journals (Sweden)

    Stephen A Lauer

    The frequency of cluster-randomized trials (CRTs) in peer-reviewed literature has increased exponentially over the past two decades. CRTs are a valuable tool for studying interventions that cannot be effectively implemented or randomized at the individual level. However, some aspects of the design and analysis of data from CRTs are more complex than those for individually randomized controlled trials. One of the key components to designing a successful CRT is calculating the proper sample size (i.e., the number of clusters) needed to attain an acceptable level of statistical power. In order to do this, a researcher must make assumptions about the value of several variables, including a fixed mean cluster size. In practice, cluster size can often vary dramatically. Few studies account for the effect of cluster size variation when assessing the statistical power for a given trial. We conducted a simulation study to investigate how the statistical power of CRTs changes with variable cluster sizes. In general, we observed that increases in cluster size variability lead to a decrease in power.
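
    A stripped-down version of such a simulation, with a random-intercept data-generating model, gamma-distributed cluster sizes, and a t-test on cluster means (all parameter values are illustrative, and this is not the authors' code):

```python
import numpy as np
from scipy import stats

def crt_power(n_clusters=15, mean_size=20, size_cv=0.6, icc=0.05,
              effect=0.3, n_sims=2000, seed=0):
    """Monte Carlo power of a two-arm cluster-randomized trial with
    variable cluster sizes, analyzed by a t-test on cluster means."""
    rng = np.random.default_rng(seed)
    sig_b, sig_w = np.sqrt(icc), np.sqrt(1.0 - icc)          # between/within SDs (total var 1)
    shape, scale = 1.0 / size_cv**2, mean_size * size_cv**2  # gamma sizes with the given CV
    hits = 0
    for _ in range(n_sims):
        arm_means = []
        for delta in (0.0, effect):                          # control arm, treatment arm
            sizes = np.maximum(2, rng.gamma(shape, scale, n_clusters).astype(int))
            means = [delta + rng.normal(0.0, sig_b) +
                     rng.normal(0.0, sig_w, m).mean() for m in sizes]
            arm_means.append(means)
        if stats.ttest_ind(*arm_means).pvalue < 0.05:
            hits += 1
    return hits / n_sims

print(crt_power(size_cv=0.1), crt_power(size_cv=1.0))
```

    Raising `size_cv` while holding the mean cluster size fixed reproduces the qualitative finding: power falls as cluster sizes become more variable.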

  19. Environmental Assessment for power marketing policy for Southwestern Power Administration

    Energy Technology Data Exchange (ETDEWEB)

    1993-12-01

    Southwestern Power Administration (Southwestern) needs to renew expiring power sales contracts with new term (10 year) sales contracts. The existing contracts have been in place for several years and many will expire over the next ten years. Southwestern completed an Environmental Assessment on the existing power allocation in June 1979 (a copy of the EA is attached), and there are no proposed additions of any major new generation resources, service to discrete major new loads, or major changes in operating parameters beyond those included in the existing power allocation. Impacts from a no-action plan, the proposed alternative, and marketing power for terms of less than 10 years are described.

  20. Power law statistics and stellar rotational velocities in the Pleiades

    Science.gov (United States)

    Carvalho, J. C.; Silva, R.; do Nascimento, J. D., Jr.; DeMedeiros, J. R.

    2008-12-01

    In this paper we show that the non-Gaussian statistics framework based on the Kaniadakis statistics is more appropriate for fitting the observed distributions of projected rotational velocity measurements of stars in the Pleiades open cluster. To this end, we compare the results from the κ- and q-distributions with the Maxwellian.

  1. A Framework for Assessing High School Students' Statistical Reasoning.

    Science.gov (United States)

    Chan, Shiau Wei; Ismail, Zaleha; Sumintono, Bambang

    2016-01-01

    Based on a synthesis of literature, earlier studies, analyses and observations on high school students, this study developed an initial framework for assessing students' statistical reasoning about descriptive statistics. Framework descriptors were established across five levels of statistical reasoning and four key constructs. The former consisted of idiosyncratic reasoning, verbal reasoning, transitional reasoning, procedural reasoning, and integrated process reasoning. The latter included describing data, organizing and reducing data, representing data, and analyzing and interpreting data. In contrast to earlier studies, this initial framework formulated a complete and coherent statistical reasoning framework. A statistical reasoning assessment tool was then constructed from this initial framework. The tool was administered to 10 tenth-grade students in a task-based interview. The initial framework was refined, and the statistical reasoning assessment tool was revised. The ten students then participated in the second task-based interview, and the data obtained were used to validate the framework. The findings showed that the students' statistical reasoning levels were consistent across the four constructs, and this result confirmed the framework's cohesion. Developed to contribute to statistics education, this newly developed statistical reasoning framework provides a guide for planning learning goals and designing instruction and assessments.

  2. Cross-Cultural Instrument Translation: Assessment, Translation, and Statistical Applications

    Science.gov (United States)

    Mason, Teresa Crowe

    2005-01-01

    This article has four major sections: (a) general issues of assessment; (b) assessment of ethnic-group members, including those who are deaf; (c) translation of assessment tools, emphasizing translation into American Sign Language (ASL); and (d) statistical applications for translated instruments. The purpose of the article is to provide insight…

  3. Statistical Power Analysis with Missing Data A Structural Equation Modeling Approach

    CERN Document Server

    Davey, Adam

    2009-01-01

    Statistical power analysis has revolutionized the ways in which we conduct and evaluate research. Similar developments in the statistical analysis of incomplete (missing) data are gaining more widespread applications. This volume brings statistical power and incomplete data together under a common framework, in a way that is readily accessible to those with only an introductory familiarity with structural equation modeling. It answers many practical questions, such as: How does missing data affect the statistical power of a study? How much power is likely with different amounts and types...

  4. An Assessment Model for Improving Student Learning of Statistics

    Science.gov (United States)

    Kasonga, R. A.; Corbett, A. D.

    2008-01-01

    Statistical reasoning, thinking and literacy have repeatedly been mentioned in the literature as important goals of statistics education. Many suggestions have been made on how to achieve these goals, with the focus on various aspects of the teaching and learning environment. In this article we propose an assessment model that targets student…

  5. Evaluating the statistical power of DNA-based identification, exemplified by 'The missing grandchildren of Argentina'.

    Science.gov (United States)

    Kling, Daniel; Egeland, Thore; Piñero, Mariana Herrera; Vigeland, Magnus Dehli

    2017-08-12

    Methods and implementations of DNA-based identification are well established in several forensic contexts. However, assessing the statistical power of these methods has been largely overlooked, except in the simplest cases. In this paper we outline general methods for such power evaluation, and apply them to a large set of family reunification cases, where the objective is to decide whether a person of interest (POI) is identical to the missing person (MP) in a family, based on the DNA profile of the POI and available family members. As such, this application closely resembles database searching and disaster victim identification (DVI). If parents or children of the MP are available, they will typically provide sufficient statistical evidence to settle the case. However, if one must resort to more distant relatives, it is not a priori obvious that a reliable conclusion is likely to be reached. In these cases power evaluation can be highly valuable, for instance in the recruitment of additional family members. To assess the power in an identification case, we advocate the combined use of two statistics: the Probability of Exclusion, and the Probability of Exceedance. The former is the probability that the genotypes of a random, unrelated person are incompatible with the available family data. If this is close to 1, it is likely that a conclusion will be achieved regarding general relatedness, but not necessarily the specific relationship. To evaluate the ability to recognize a true match, we use simulations to estimate exceedance probabilities, i.e. the probability that the likelihood ratio will exceed a given threshold, assuming that the POI is indeed the MP. All simulations are done conditionally on available family data. Such conditional simulations have a long history in medical linkage analysis, but to our knowledge this is the first systematic forensic genetics application. Also, for forensic markers mutations cannot be ignored and therefore current models and

  6. Caveats for using statistical significance tests in research assessments

    CERN Document Server

    Schneider, Jesper W

    2011-01-01

    This paper raises concerns about the advantages of using statistical significance tests in research assessments as has recently been suggested in the debate about proper normalization procedures for citation indicators. Statistical significance tests are highly controversial and numerous criticisms have been leveled against their use. Based on examples from articles by proponents of the use of statistical significance tests in research assessments, we address some of the numerous problems with such tests. The issues specifically discussed are the ritual practice of such tests, their dichotomous application in decision making, the difference between statistical and substantive significance, the implausibility of most null hypotheses, the crucial assumption of randomness, as well as the utility of standard errors and confidence intervals for inferential purposes. We argue that applying statistical significance tests and mechanically adhering to their results is highly problematic and detrimental to critical thinking...

  7. Statistical analysis of wind speed for electrical power generation in ...

    African Journals Online (AJOL)

    HOD

    are employed to fit wind speed data of some selected sites in Northern Nigeria. This is because the design of wind energy conversion systems depends on the correct analysis of the site renewable energy resources. [13]. In addition, the statistical judgements are based on the accuracy in fitting the available data at the sites.

  8. An Examination of Statistical Power in Multigroup Dynamic Structural Equation Models

    Science.gov (United States)

    Prindle, John J.; McArdle, John J.

    2012-01-01

    This study used statistical simulation to calculate differential statistical power in dynamic structural equation models with groups (as in McArdle & Prindle, 2008). Patterns of between-group differences were simulated to provide insight into how model parameters influence power approximations. Chi-square and root mean square error of…

  9. Wind power assessment in Uruguay

    Energy Technology Data Exchange (ETDEWEB)

    Cataldo, J. [Universidad de la Republica, Montevideo (Uruguay). Instituto de Mecanica de los Fluidos e Ingenieria Ambiental; Nunes, V. [Universidad de la Republica, Montevideo (Uruguay). Instituto de Ingeneria Electrica

    1996-09-01

    Wind power appears as a large alternative energy source in Uruguay. A nested method was developed to obtain the mean wind velocity time series at complex terrain sites and to describe the turbulence. Sites with a mean velocity over 9 m/s and a capacity factor over 40% were found. The aerodynamic interference losses between wind generators were evaluated using a numerical model, and a numerical model was developed to design an optimal cluster wind farm. As a bulk result, an installed capacity of 300 MW with a production cost of less than 0.065 US$/kWh can be estimated over the whole studied region. (author)

  10. Alternative Assessment in Higher Education: An Experience in Descriptive Statistics

    Science.gov (United States)

    Libman, Zipora

    2010-01-01

    Assessment-led reform is now one of the most widely favored strategies to promote higher standards of teaching, more powerful learning and more credible forms of public accountability. Within this context of change, higher education in many countries is increasingly subjected to demands to implement alternative assessment strategies that provide…

  11. Comparative environmental assessment of unconventional power installations

    Science.gov (United States)

    Sosnina, E. N.; Masleeva, O. V.; Kryukov, E. V.

    2015-08-01

    A procedure for the strategic environmental assessment of power installations operating on renewable energy sources (RES) was developed and is described. This procedure takes into account not only the operational phase of a power installation but its whole life cycle: from the production and distribution of the power resources needed to manufacture the installation to its end-of-life recovery. Such an approach gives an opportunity to make a more comprehensive assessment of the influence of power installations on the environment and may be used when adapting current regulations and developing new regulations for different types of unconventional power installations with due account of the ecological factor. Application of the procedure of integrated environmental assessment is considered for a mini-HPP (hydro power plant); wind, solar, and biogas power installations; and a traditional power installation operating on natural gas. Comparison of the environmental influence revealed advantages of the new energy technologies over the traditional ones. It is shown that solar energy installations hardly pollute the environment during operation, but the negative influence of the mining operations and of manufacturing and utilizing the materials used for solar modules is the largest. Biogas power installations are in second place as concerns environmental impact, owing to the considerable mass of the biogas installation and its gas reciprocating engine. The minimum impact on the environment is exerted by the mini-HPP. Consumption of material and energy resources for the production of a traditional power installation is lower than for power installations on RES; however, this factor increases incomparably once fuel extraction and transport are taken into account. The greatest impact on the environment is exerted by the operation of the traditional power installations.

  12. In vivo Comet assay--statistical analysis and power calculations of mice testicular cells.

    Science.gov (United States)

    Hansen, Merete Kjær; Sharma, Anoop Kumar; Dybdahl, Marianne; Boberg, Julie; Kulahci, Murat

    2014-11-01

    The in vivo Comet assay is a sensitive method for evaluating DNA damage. A recurrent concern is how to analyze the data appropriately and efficiently. A popular approach is to summarize the raw data into a summary statistic prior to the statistical analysis. However, consensus on which summary statistic to use has yet to be reached. Another important consideration concerns the assessment of proper sample sizes in the design of Comet assay studies. This study aims to identify a statistic suitably summarizing the % tail DNA of mice testicular samples in Comet assay studies. A second aim is to provide curves for this statistic outlining the number of animals and gels to use. The current study was based on 11 compounds administered via oral gavage in three doses to male mice: CAS no. 110-26-9, CAS no. 512-56-1, CAS no. 111873-33-7, CAS no. 79-94-7, CAS no. 115-96-8, CAS no. 598-55-0, CAS no. 636-97-5, CAS no. 85-28-9, CAS no. 13674-87-8, CAS no. 43100-38-5 and CAS no. 60965-26-6. Testicular cells were examined using the alkaline version of the Comet assay and the DNA damage was quantified as % tail DNA using a fully automatic scoring system. From the raw data 23 summary statistics were examined. A linear mixed-effects model was fitted to the summarized data and the estimated variance components were used to generate power curves as a function of sample size. The statistic that most appropriately summarized the within-sample distributions was the median of the log-transformed data, as it most consistently conformed to the assumptions of the statistical model. Power curves for 1.5-, 2-, and 2.5-fold changes of the highest dose group compared to the control group when 50 and 100 cells were scored per gel are provided to aid in the design of future Comet assay studies on testicular cells.
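
    The power-curve construction described here, variance components from a mixed model feeding a power formula on the log scale, can be sketched with a normal approximation as follows; the variance-component values are placeholders, not the estimates from this study:

```python
import numpy as np
from scipy.stats import norm

def comet_power(fold_change, n_animals, n_gels, var_animal=0.15,
                var_gel=0.05, alpha=0.05):
    """Approximate power for detecting a fold change between a dose group
    and a control group on the log scale, given variance components.
    The variance components here are illustrative placeholders."""
    effect = np.log(fold_change)                       # difference of group means (log scale)
    var_mean = var_animal / n_animals + var_gel / (n_animals * n_gels)
    se = np.sqrt(2 * var_mean)                         # two independent, equal-sized groups
    z = norm.ppf(1 - alpha / 2)
    return norm.cdf(effect / se - z)

# power as a function of the number of animals, in the spirit of the record above
for n in (4, 6, 8, 10):
    print(n, round(comet_power(2.0, n_animals=n, n_gels=2), 3))
```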

  13. Assessing Capacity Value of Wind Power

    Energy Technology Data Exchange (ETDEWEB)

    Frew, Bethany A.

    2017-04-18

    This presentation provides a high-level overview of assessing the capacity value of wind power, including impacts of multiple-year data sets, impacts of transmission assumptions, and future research needs.

  14. Statistical Power of Psychological Research: What Have We Gained in 20 Years?

    Science.gov (United States)

    Rossi, Joseph S.

    1990-01-01

    Calculated power for 6,155 statistical tests in 221 journal articles published in 1982 volumes of "Journal of Abnormal Psychology," "Journal of Consulting and Clinical Psychology," and "Journal of Personality and Social Psychology." Power to detect small, medium, and large effects was .17, .57, and .83, respectively. Concluded that power of…

  15. Caveats for using statistical significance tests in research assessments

    DEFF Research Database (Denmark)

    Schneider, Jesper Wiborg

    2013-01-01

    This article raises concerns about the advantages of using statistical significance tests in research assessments as has recently been suggested in the debate about proper normalization procedures for citation indicators by Opthof and Leydesdorff (2010). Statistical significance tests are highly controversial and numerous criticisms have been leveled against their use. Based on examples from articles by proponents of the use of statistical significance tests in research assessments, we address some of the numerous problems with such tests. We argue that applying statistical significance tests and mechanically adhering to their results does not help in deciding whether differences between citation indicators are important or not. On the contrary, their use may be harmful. Like many other critics, we generally believe that statistical significance tests are over- and misused in the empirical sciences including scientometrics, and we encourage a reform on these matters.

  16. Causality in Statistical Power: Isomorphic Properties of Measurement, Research Design, Effect Size, and Sample Size

    Directory of Open Access Journals (Sweden)

    R. Eric Heidel

    2016-01-01

    Statistical power is the ability to detect a significant effect, given that the effect actually exists in a population. Like most statistical concepts, statistical power tends to induce cognitive dissonance in hepatology researchers. However, planning for statistical power by an a priori sample size calculation is of paramount importance when designing a research study. There are five specific empirical components that make up an a priori sample size calculation: the scale of measurement of the outcome, the research design, the magnitude of the effect size, the variance of the effect size, and the sample size. A framework grounded in the phenomenon of isomorphism, or interdependencies amongst different constructs with similar forms, will be presented to understand the isomorphic effects of decisions made on each of the five aforementioned components of statistical power.
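
    Bringing those five components together in code is a one-liner with standard tools; for instance, a two-group design with a continuous outcome, an assumed standardized effect of 0.5, alpha of 0.05 and target power of 0.80 (all illustrative values, not from the article):

```python
# Minimal a priori sample-size calculation tying together the components
# named above: scale of measurement -> t-test, design -> two independent
# groups, effect size and variance folded into Cohen's d, alpha, power.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05,
                                   power=0.80, alternative='two-sided')
print(round(n_per_group))   # ~64 participants per group
```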

  17. The experimental design of postmortem studies: the effect size and statistical power.

    Science.gov (United States)

    Meurs, Joris

    2016-09-01

    The aim of this study was to show the poor statistical power of postmortem studies. Further, this study aimed to find an estimate of the effect size for postmortem studies in order to show the importance of this parameter. This can be an aid in performing power analysis to determine a minimal sample size. GPower was used to perform calculations on sample size, effect size, and statistical power. The minimal significance (α) and statistical power (1 - β) were set at 0.05 and 0.80 respectively. Calculations were performed for two groups (Student's t-distribution) and multiple groups (one-way ANOVA; F-distribution). In this study, an average effect size of 0.46 was found (n = 22; SD = 0.30). Using this value to calculate the statistical power of another group of postmortem studies (n = 5) revealed that the average statistical power of these studies was poor (1 - β < 0.80); hence, the risk of missing true effects in postmortem studies is considerable. In order to enhance the statistical power of postmortem studies, power analysis should be performed, in which the effect size found in this study can be used as a guideline.

  18. Are the Nonparametric Person-Fit Statistics More Powerful than Their Parametric Counterparts? Revisiting the Simulations in Karabatsos (2003)

    Science.gov (United States)

    Sinharay, Sandip

    2017-01-01

    Karabatsos compared the power of 36 person-fit statistics using receiver operating characteristic curves and found the "H[superscript T]" statistic to be the most powerful in identifying aberrant examinees. He found three statistics, "C", "MCI", and "U3", to be the next most powerful. These four statistics,…

  19. GWAPower: a statistical power calculation software for genome-wide association studies with quantitative traits

    Directory of Open Access Journals (Sweden)

    Chen Chia-Cheng

    2011-01-01

    Background: In designing genome-wide association (GWA) studies it is important to calculate statistical power. General statistical power calculation procedures for quantitative measures often require information concerning summary statistics of distributions such as mean and variance. However, with genetic studies, the effect size of quantitative traits is traditionally expressed as heritability, a quantity defined as the amount of phenotypic variation in the population that can be ascribed to the genetic variants among individuals. Heritability is hard to transform into summary statistics. Therefore, general power calculation procedures cannot be used directly in GWA studies. The development of appropriate statistical methods and a user-friendly software package to address this problem would be welcomed. Results: This paper presents GWAPower, a statistical software package of power calculation designed for GWA studies with quantitative traits, where genetic effect is defined as heritability. Based on several popular one-degree-of-freedom genetic models, this method avoids the need to specify the non-centrality parameter of the F-distribution under the alternative hypothesis. Therefore, it can use heritability information directly without approximation. In GWAPower, the power calculation can be easily adjusted for adding covariates and linkage disequilibrium information. An example is provided to illustrate GWAPower, followed by discussions. Conclusions: GWAPower is a user-friendly free software package for calculating statistical power based on heritability in GWA studies with quantitative traits. The software is freely available at: http://dl.dropbox.com/u/10502931/GWAPower.zip
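
    The heritability-based parameterization the record describes corresponds, for a one-degree-of-freedom test, to a noncentrality parameter of roughly N·h²/(1−h²). A textbook-style sketch of that calculation (an approximation in the same spirit, not the GWAPower implementation itself):

```python
from scipy.stats import chi2, ncx2

def gwa_power(n, h2_snp, alpha=5e-8):
    """Approximate power of a 1-df association test for a quantitative trait,
    parameterized directly by the heritability h2_snp explained by the variant."""
    ncp = n * h2_snp / (1.0 - h2_snp)         # noncentrality from heritability
    crit = chi2.ppf(1.0 - alpha, df=1)        # genome-wide significance threshold
    return ncx2.sf(crit, df=1, nc=ncp)        # P(noncentral chi2 exceeds threshold)

for n in (1000, 5000, 20000):
    print(n, round(gwa_power(n, h2_snp=0.005), 3))
```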

  20. Empirical Statistical Power for Testing Multilocus Genotypic Effects under Unbalanced Designs Using a Gibbs Sampler

    Directory of Open Access Journals (Sweden)

    Chaeyoung Lee

    2012-11-01

    Epistasis, which may explain a large portion of the phenotypic variation for complex economic traits of animals, has been ignored in many genetic association studies. A Bayesian method was introduced to draw inferences about multilocus genotypic effects based on their marginal posterior distributions obtained by a Gibbs sampler. A simulation study was conducted to provide statistical powers under various unbalanced designs by using this method. Data were simulated by combined designs of number of loci, within-genotype variance, and sample size in unbalanced designs with or without null combined-genotype cells. Mean empirical statistical power was estimated for testing the posterior mean estimate of the combined genotype effect. A practical example of obtaining empirical statistical power estimates with a given sample size is provided under unbalanced designs. The empirical statistical powers would be useful for determining an optimal design when interactive associations of multiple loci with complex phenotypes are examined.

  1. The application of statistical methods to assess economic assets

    Directory of Open Access Journals (Sweden)

    D. V. Dianov

    2017-01-01

    The article considers the valuation of machinery, equipment and special equipment, methodological aspects of applying standards for the assessment of buildings and structures in current prices, the valuation of residential and specialized houses and office premises, the assessment and reassessment of active and inactive military assets, and the application of statistical methods to obtain the relevant cost estimates. The objective of the article is to consider the possible application of statistical tools in the valuation of the assets composing the core group of elements of national wealth: the fixed assets. Capital tangible assets constitute the material base for the creation of new value, products and non-financial services. The gain accumulated on tangible assets of a capital nature is part of the gross domestic product, and from its volume and share in GDP one can judge the scope of reproductive processes in the country. Based on the methodological materials of the state statistics bodies of the Russian Federation and on the theory of statistics, which describes statistical methods such as indexes, averages and regression, a methodical approach is structured for applying statistical tools to obtain value estimates of property, plant and equipment with significant accumulated depreciation. Until now, the use of statistical methodology in the practice of economic assessment of assets has been only fragmentary. This applies both to Federal Legislation (Federal law № 135 «On valuation activities in the Russian Federation» dated 16.07.1998, in the edition of 05.07.2016) and to the methodological documents and regulations of valuation activities, in particular the valuation standards. A particular problem is the use of the digital database of Rosstat (Federal State Statistics Service), as for specific fixed assets the comparison should be carried

  2. Designing image segmentation studies: Statistical power, sample size and reference standard quality.

    Science.gov (United States)

    Gibson, Eli; Hu, Yipeng; Huisman, Henkjan J; Barratt, Dean C

    2017-12-01

    Segmentation algorithms are typically evaluated by comparison to an accepted reference standard. The cost of generating accurate reference standards for medical image segmentation can be substantial. Since the study cost and the likelihood of detecting a clinically meaningful difference in accuracy both depend on the size and on the quality of the study reference standard, balancing these trade-offs supports the efficient use of research resources. In this work, we derive a statistical power calculation that enables researchers to estimate the appropriate sample size to detect clinically meaningful differences in segmentation accuracy (i.e. the proportion of voxels matching the reference standard) between two algorithms. Furthermore, we derive a formula to relate reference standard errors to their effect on the sample sizes of studies using lower-quality (but potentially more affordable and practically available) reference standards. The accuracy of the derived sample size formula was estimated through Monte Carlo simulation, demonstrating, with 95% confidence, a predicted statistical power within 4% of simulated values across a range of model parameters. This corresponds to sample size errors of less than 4 subjects and errors in the detectable accuracy difference less than 0.6%. The applicability of the formula to real-world data was assessed using bootstrap resampling simulations for pairs of algorithms from the PROMISE12 prostate MR segmentation challenge data set. The model predicted the simulated power for the majority of algorithm pairs within 4% for simulated experiments using a high-quality reference standard and within 6% for simulated experiments using a low-quality reference standard. A case study, also based on the PROMISE12 data, illustrates using the formulae to evaluate whether to use a lower-quality reference standard in a prostate segmentation study.

  3. Low statistical power in biomedical science: a review of three human research domains

    Science.gov (United States)

    Dumas-Mallet, Estelle; Button, Katherine S.; Boraud, Thomas; Gonon, Francois

    2017-01-01

    Studies with low statistical power increase the likelihood that a statistically significant finding represents a false positive result. We conducted a review of meta-analyses of studies investigating the association of biological, environmental or cognitive parameters with neurological, psychiatric and somatic diseases, excluding treatment studies, in order to estimate the average statistical power across these domains. Taking the effect size indicated by a meta-analysis as the best estimate of the likely true effect size, and assuming a threshold for declaring statistical significance of 5%, we found that approximately 50% of studies have statistical power in the 0–10% or 11–20% range, well below the minimum of 80% that is often considered conventional. Studies with low statistical power appear to be common in the biomedical sciences, at least in the specific subject areas captured by our search strategy. However, we also observe evidence that this depends in part on research methodology, with candidate gene studies showing very low average power and studies using cognitive/behavioural measures showing high average power. This warrants further investigation. PMID:28386409

  4. The Statistical Power of the Cluster Randomized Block Design with Matched Pairs--A Simulation Study

    Science.gov (United States)

    Dong, Nianbo; Lipsey, Mark

    2010-01-01

    This study uses simulation techniques to examine the statistical power of the group-randomized design and the matched-pair (MP) randomized block design under various parameter combinations. Both nearest neighbor matching and random matching are used for the MP design. The power of each design for any parameter combination was calculated from…

  5. The Statistics Concept Inventory: Development and analysis of a cognitive assessment instrument in statistics

    Science.gov (United States)

    Allen, Kirk

    The Statistics Concept Inventory (SCI) is a multiple choice test designed to assess students' conceptual understanding of topics typically encountered in an introductory statistics course. This dissertation documents the development of the SCI from Fall 2002 up to Spring 2006. The first phase of the project essentially sought to answer the question: "Can you write a test to assess topics typically encountered in introductory statistics?" Book One presents the results utilized in answering this question in the affirmative. The bulk of the results present the development and evolution of the items, primarily relying on objective metrics to gauge effectiveness but also incorporating student feedback. The second phase boils down to: "Now that you have the test, what else can you do with it?" This includes an exploration of Cronbach's alpha, the most commonly-used measure of test reliability in the literature. An online version of the SCI was designed, and its equivalency to the paper version is assessed. Adding an extra wrinkle to the online SCI, subjects rated their answer confidence. These results show a general positive trend between confidence and correct responses. However, some items buck this trend, revealing potential sources of misunderstandings, with comparisons offered to the extant statistics and probability educational research. The third phase is a re-assessment of the SCI: "Are you sure?" A factor analytic study favored a uni-dimensional structure for the SCI, although maintaining the likelihood of a deeper structure if more items can be written to tap similar topics. A shortened version of the instrument is proposed, demonstrated to be able to maintain a reliability nearly identical to that of the full instrument. Incorporating student feedback and a faculty topics survey, improvements to the items and recommendations for further research are proposed. The state of the concept inventory movement is assessed, to offer a comparison to the work presented

  6. Statistical methods for assessing agreement between continuous measurements

    DEFF Research Database (Denmark)

    Sokolowski, Ineta; Hansen, Rikke Pilegaard; Vedsted, Peter

    Background: Clinical research often involves study of agreement amongst observers. Agreement can be measured in different ways, and one can obtain quite different values depending on which method one uses. Objective: We review the approaches that have been discussed to assess the agreement between continuous measures and discuss their strengths and weaknesses. Different methods are illustrated using actual data from the `Delay in diagnosis of cancer in general practice´ project in Aarhus, Denmark. Subjects and Methods: We use the weighted kappa-statistic, intraclass correlation coefficient (ICC), concordance coefficient, Bland-Altman limits of agreement and percentage of agreement to assess the agreement between patient-reported delay and doctor-reported delay in diagnosis of cancer in general practice. Key messages: The correct statistical approach is not obvious. Many studies give the product...
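
    Of the methods listed, the Bland-Altman limits of agreement are the simplest to compute; a minimal sketch with hypothetical paired delay data (the real study used the Aarhus project data):

```python
import numpy as np

def bland_altman_limits(x, y):
    """Bland-Altman 95% limits of agreement between two raters' continuous
    measurements (e.g. patient- vs doctor-reported delay, in days)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    diff = x - y
    bias = diff.mean()                       # mean difference between raters
    sd = diff.std(ddof=1)                    # SD of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

patient = [30, 14, 60, 21, 45, 10, 90]       # hypothetical delays (days)
doctor = [28, 10, 75, 14, 40, 12, 80]
print(bland_altman_limits(patient, doctor))
```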

  7. A critical discussion of null hypothesis significance testing and statistical power analysis within psychological research

    DEFF Research Database (Denmark)

    Jones, Allan; Sommerlund, Bo

    2007-01-01

    The uses of null hypothesis significance testing (NHST) and statistical power analysis within psychological research are critically discussed. The article looks at the problems of relying solely on NHST when dealing with small and large sample sizes. The use of power analysis in estimating the potential error introduced by small and large samples is advocated. Power analysis is not recommended as a replacement for NHST but as an additional source of information about the phenomena under investigation. Moreover, the importance of conceptual analysis in relation to statistical analysis of hypothesis...

  8. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses

    Science.gov (United States)

    Dinov, Ivo D.; Sanchez, Juana; Christou, Nicolas

    2009-01-01

    Technology-based instruction represents a new recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools of communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present the results of the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We have used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment per individual…

  9. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses.

    Science.gov (United States)

    Dinov, Ivo D; Sanchez, Juana; Christou, Nicolas

    2008-01-01

    Technology-based instruction represents a new recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools of communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present the results of the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We have used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment per individual…

  10. Statistical modelling of space-time processes with application to wind power

    DEFF Research Database (Denmark)

    Lenzi, Amanda

    This thesis aims at contributing to the wind power literature by building and evaluating new statistical techniques for producing forecasts at multiple locations and lead times using spatio-temporal information. By exploring the features of a rich portfolio of wind farms in western Denmark, we investigate different types of models and provide several forms of predictions. Starting with spatial prediction, we then extend the methodology to spatio-temporal prediction of individual wind farms and aggregated wind power at monitored locations as well as at locations where recent observations are not available. We propose spatial models for predicting wind power generation at two different time scales: for annual average wind power generation and for a high temporal resolution (typically wind power averages over 15-min time steps). In both cases, we use a spatial hierarchical statistical model in which spatial...

  11. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the

  12. Empirical assessment of published effect sizes and power in the recent cognitive neuroscience and psychology literature

    National Research Council Canada - National Science Library

    Denes Szucs; John P A Ioannidis

    2017-01-01

    We have empirically assessed the distribution of published effect sizes and estimated power by analyzing 26,841 statistical records from 3,801 cognitive neuroscience and psychology papers published recently...

  13. Statistical Pattern-Based Assessment of Structural Health Monitoring Data

    Directory of Open Access Journals (Sweden)

    Mohammad S. Islam

    2014-01-01

    In structural health monitoring (SHM), various sensors are installed at critical locations of a structure. The signals from sensors are either continuously or periodically analyzed to determine the state and performance of the structure. An objective comparison of the sensor data at different time ranges is essential for assessing the structural condition or the excessive load experienced by the structure, which leads to potential damage in the structure. The objectives of the current study are to establish a relationship between the data from various sensors to estimate the reliability of the data and potential damage using statistical pattern matching techniques. In order to achieve these goals, new methodologies based on statistical pattern recognition techniques have been developed. The proposed methodologies have been developed and validated using sensor data obtained from an instrumented bridge and road test data from heavy vehicles. The application of statistical pattern matching techniques is relatively new in SHM data interpretation, and the current research demonstrates that it has high potential in assessing structural conditions, especially when the data are noisy and susceptible to environmental disturbances.

  14. Statistical power analysis for hemodynamic cardiovascular safety pharmacology studies in beagle dogs.

    Science.gov (United States)

    Chiang, Alan Y; Smith, Wendell C; Main, Bradley W; Sarazan, R Dustan

    2004-01-01

    We studied the statistical power of a replicated Latin square design where eight animals each receive a vehicle control and three dose levels of a drug on four separate dosing days. Cardiovascular parameters evaluated in the study were systolic arterial pressure, diastolic arterial pressure, left ventricular heart rate, and dP/dt(max). Observations were simulated based on historical data and drug response profiles from cardiovascular safety pharmacology studies conducted at Lilly Research Laboratories. Statistical analysis for treatment effects was performed using a linear mixed model. Monotonicity of dose response was examined using sequential linear trend tests based on ordinal spacing of dose levels. The replicated Latin square design for cardiovascular safety pharmacology studies is shown to have at least an 80% power of detecting changes from control of at least a 10% increment in systolic and diastolic pressure and a 15% increment in heart rate and dP/dt(max). The power is not sensitive to the shape of dose response profile over time. Several unique features of our statistical power evaluation include the comparison of different covariance structures and drug response profiles. The procedure can also be applied to future power evaluations of other cardiovascular parameters, such as the QT interval, and the loss of statistical power due to missing observations.

  15. Nuclear power plant security assessment technical manual.

    Energy Technology Data Exchange (ETDEWEB)

    O'Connor, Sharon L.; Whitehead, Donnie Wayne; Potter, Claude S., III

    2007-09-01

    This report (Nuclear Power Plant Security Assessment Technical Manual) is a revision to NUREG/CR-1345 (Nuclear Power Plant Design Concepts for Sabotage Protection) that was published in January 1981. It provides conceptual and specific technical guidance for U.S. Nuclear Regulatory Commission nuclear power plant design certification and combined operating license applicants as they: (1) develop the layout of a facility (i.e., how buildings are arranged on the site property and how they are arranged internally) to enhance protection against sabotage and facilitate the use of physical security features; (2) design the physical protection system to be used at the facility; and (3) analyze the effectiveness of the PPS against the design basis threat. It should be used as a technical manual in conjunction with the 'Nuclear Power Plant Security Assessment Format and Content Guide'. The opportunity to optimize physical protection in the design of a nuclear power plant is obtained when an applicant utilizes both documents when performing a security assessment. This document provides a set of best practices that incorporates knowledge gained from more than 30 years of physical protection system design and evaluation activities at Sandia National Laboratories and insights derived from U.S. Nuclear Regulatory Commission technical staff into a manual that describes a development and analysis process of physical protection systems suitable for future nuclear power plants. In addition, selected security system technologies that may be used in a physical protection system are discussed. The scope of this document is limited to the identification of a set of best practices associated with the design and evaluation of physical security at future nuclear power plants in general. As such, it does not provide specific recommendations for the design and evaluation of physical security for any specific reactor design. These best practices should be applicable to the design and

  16. Components of Statistical Thinking and Implications for Instruction and Assessment.

    Science.gov (United States)

    Chance, Beth L.

    This paper focuses on statistical thinking as the third arm of statistical development. The paper opens with a survey of recent definitions of statistical thinking and then attempts to differentiate statistical thinking from statistical literacy and statistical reasoning. Implications for instruction are traced, emphasizing beginning courses for…

  17. Statistical power as a function of Cronbach alpha of instrument questionnaire items.

    Science.gov (United States)

    Heo, Moonseong; Kim, Namhee; Faith, Myles S

    2015-10-14

    In countless clinical trials, measurements of outcomes rely on instrument questionnaire items, which often suffer from measurement error problems that in turn affect the statistical power of study designs. The Cronbach alpha, or coefficient alpha, here denoted by C(α), can be used as a measure of internal consistency of parallel instrument items that are developed to measure a target unidimensional outcome construct. The scale score for the target construct is often represented by the sum of the item scores. However, power functions based on C(α) have been lacking for various study designs. We formulate a statistical model for parallel items to derive power functions as a function of C(α) under several study designs. To this end, we adopt a fixed true-score variance assumption, as opposed to the usual fixed total-variance assumption. That assumption is critical and practically relevant for showing that smaller measurement errors are associated with higher inter-item correlations, and thus that greater C(α) is associated with greater statistical power. We compare the derived theoretical statistical power with empirical power obtained through Monte Carlo simulations for the following comparisons: one-sample comparison of pre- and post-treatment mean differences, two-sample comparison of pre-post mean differences between groups, and two-sample comparison of mean differences between groups. It is shown that C(α) is the same as a test-retest correlation of the scale scores of parallel items, which enables testing the significance of C(α). Closed-form power functions and sample size determination formulas are derived in terms of C(α) for all of the aforementioned comparisons. Power functions are shown to be an increasing function of C(α), regardless of the comparison of interest. The derived power functions are well validated by simulation studies that show that the magnitudes of the theoretical power are virtually identical to those of the empirical power.
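
    The monotone link between reliability and power can be illustrated with the classical attenuation result, in which the observed standardized effect on a scale score shrinks by the square root of its reliability. The sketch below uses that textbook relation (not necessarily the paper's exact power functions) for a two-sample mean comparison, with reliability standing in for C(α).

```python
# Power of a two-sided two-sample z-test on a scale score whose effect
# size is attenuated by sqrt(reliability); reliability stands in for C(alpha).
import numpy as np
from scipy import stats

def power_two_sample(d_true, cronbach_alpha, n_per_group, sig_level=0.05):
    d_obs = d_true * np.sqrt(cronbach_alpha)   # attenuated effect size
    ncp = d_obs * np.sqrt(n_per_group / 2.0)   # noncentrality parameter
    z_crit = stats.norm.ppf(1 - sig_level / 2)
    return stats.norm.sf(z_crit - ncp) + stats.norm.cdf(-z_crit - ncp)

for a in (0.5, 0.7, 0.9):                      # power rises with C(alpha)
    print(f"C(alpha)={a:.1f}: power={power_two_sample(0.5, a, 64):.3f}")
```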

  18. Intelligent Techniques for Power Systems Vulnerability Assessment

    Directory of Open Access Journals (Sweden)

    Mohamed A. El-Sharkawi

    2002-06-01

    With power grids considered national security matters, the reliable operation of the system is of top priority to utilities. This concern is amplified by the utilities' deregulation, which increases the system's openness while simultaneously decreasing the applied degree of control. Vulnerability Assessment (VA) deals with the power system's ability to continue to provide service in case of an unforeseen catastrophic contingency. Such contingencies may include unauthorized tripping, breaks in communication links, sabotage or intrusion by external agents, human errors, natural calamities and faults. These contingencies could lead to a disruption of service to part or all of the system. The service disruption is known as an outage or blackout. The paper outlines an approach by which feature extraction and boundary tracking can be implemented to achieve online vulnerability assessment.

  19. Statistics

    Science.gov (United States)

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  20. A Statistical Model for Uplink Intercell Interference with Power Adaptation and Greedy Scheduling

    KAUST Repository

    Tabassum, Hina

    2012-10-03

    This paper deals with the statistical modeling of uplink inter-cell interference (ICI) considering greedy scheduling with power adaptation based on channel conditions. The derived model is implicitly generalized for any kind of shadowing and fading environments. More precisely, we develop a generic model for the distribution of ICI based on the locations of the allocated users and their transmit powers. The derived model is utilized to numerically evaluate important network performance metrics such as ergodic capacity, average fairness and average power preservation. Monte-Carlo simulation details are included to support the analysis and show the accuracy of the derived expressions. Consistent with the literature, we show that greedy scheduling with power adaptation reduces the ICI and the average power consumption of users, and enhances the average fairness among users, compared to the case without power adaptation. © 2012 IEEE.
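
    The modeling idea can be mimicked numerically. The sketch below is an illustrative Monte Carlo with assumed Rayleigh fading and an arbitrary cross-cell gain scale; it is not the paper's derivation, but it shows how greedy scheduling plus channel-inversion power control shapes the ICI distribution at a reference base station.

```python
# Monte Carlo sketch of uplink ICI with greedy scheduling and power
# adaptation (illustrative geometry and values, not the paper's model).
import numpy as np

rng = np.random.default_rng(5)
n_cells, n_users, n_trials = 6, 20, 5000
p_max, target = 1.0, 0.1            # max transmit power, received-power target

ici = np.empty(n_trials)
for t in range(n_trials):
    total = 0.0
    for _ in range(n_cells):        # interfering cells around the reference BS
        g_own = rng.exponential(1.0, n_users)   # Rayleigh fading power gains
        k = g_own.argmax()                       # greedy: best own channel
        p = min(p_max, target / g_own[k])        # channel-inversion power
        g_cross = rng.exponential(0.1)           # gain toward the reference BS
        total += p * g_cross
    ici[t] = total
print(f"mean ICI = {ici.mean():.3f}, 95th pct = {np.quantile(ici, 0.95):.3f}")
```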

  1. Statistical power analysis a simple and general model for traditional and modern hypothesis tests

    CERN Document Server

    Murphy, Kevin R; Wolach, Allen

    2014-01-01

    Noted for its accessible approach, this text applies the latest approaches of power analysis to both null hypothesis and minimum-effect testing using the same basic unified model. Through the use of a few simple procedures and examples, the authors show readers with little expertise in statistical analysis how to obtain the values needed to carry out the power analysis for their research. Illustrations of how these analyses work and how they can be used to choose the appropriate criterion for defining statistically significant outcomes are sprinkled throughout. The book presents a simple and general model…

  2. Assessing vascular endothelial function using frequency and rank order statistics

    Science.gov (United States)

    Wu, Hsien-Tsai; Hsu, Po-Chun; Sun, Cheuk-Kwan; Liu, An-Bang; Lin, Zong-Lin; Tang, Chieh-Ju; Lo, Men-Tzung

    2013-08-01

    Using frequency and rank order statistics (FROS), this study analyzed the fluctuations in arterial waveform amplitudes recorded from an air pressure sensing system before and after reactive hyperemia (RH) induction by temporary blood flow occlusion to evaluate the vascular endothelial function of aged and diabetic subjects. The modified probability-weighted distance (PWD) calculated from the FROS was compared with the dilatation index (DI) to evaluate its validity and sensitivity in the assessment of vascular endothelial function. The results showed that the PWD can provide a quantitative determination of the structural changes in the arterial pressure signals associated with regulation of vascular tone and blood pressure by intact vascular endothelium after the application of occlusion stress. Our study suggests that the use of FROS is a reliable noninvasive approach to the assessment of vascular endothelial degeneration in aging and diabetes.

  3. Statistical wave climate projections for coastal impact assessments

    Science.gov (United States)

    Camus, P.; Losada, I. J.; Izaguirre, C.; Espejo, A.; Menéndez, M.; Pérez, J.

    2017-09-01

    Global multimodel wave climate projections are obtained at 1.0° × 1.0° scale from 30 Coupled Model Intercomparison Project Phase 5 (CMIP5) global circulation model (GCM) realizations. A semi-supervised weather-typing approach based on a characterization of the ocean wave generation areas and the historical wave information from the recent GOW2 database is used to train the statistical model. This framework is also applied to obtain high resolution projections of coastal wave climate and coastal impacts such as port operability and coastal flooding. Regional projections are estimated using the collection of weather types at a spacing of 1.0°. This assumption is feasible because the predictor is defined based on the wave generation area and the classification is guided by the local wave climate. The assessment of future changes in coastal impacts is based on direct downscaling of indicators defined by empirical formulations (total water level for coastal flooding and number of hours per year with overtopping for port operability). Global multimodel projections of the significant wave height and peak period are consistent with changes obtained in previous studies. Statistical confidence in the expected changes is obtained thanks to the large number of GCMs used to construct the ensemble. The proposed methodology proves flexible for projecting wave climate at different spatial scales. Regional changes of additional variables such as wave direction or other statistics can be estimated from the future empirical distribution, with extreme values restricted to high percentiles (i.e., 95th, 99th percentiles). The statistical framework can also be applied to evaluate regional coastal impacts, integrating changes in storminess and sea level rise.
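
    A toy version of the weather-typing chain (classify predictor fields, attach local wave statistics to each type, reweight by future type frequencies) can be sketched as follows. The synthetic fields, the plain K-means classifier, and all numbers are stand-ins for the semi-supervised scheme and the GOW2/CMIP5 data used in the study.

```python
# Minimal weather-typing downscaling sketch on synthetic data.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(7)
n_days, n_grid, n_types = 3650, 50, 10
slp = rng.normal(size=(n_days, n_grid))          # stand-in predictor fields
hs = 2.0 + slp[:, :5].mean(axis=1) + rng.normal(0, 0.3, n_days)  # local Hs (m)

wt = KMeans(n_clusters=n_types, n_init=10, random_state=0).fit(slp)
hs_per_type = np.array([hs[wt.labels_ == k].mean() for k in range(n_types)])

# Project future climate: reclassify "GCM" fields, reweight by type frequency.
future = rng.normal(0.2, 1.0, size=(n_days, n_grid))   # shifted future fields
freq = np.bincount(wt.predict(future), minlength=n_types) / n_days
hs_future = (freq * hs_per_type).sum()
print(f"historical mean Hs = {hs.mean():.2f} m, projected = {hs_future:.2f} m")
```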

  4. Epidemiologic studies of adverse effects of anti-retroviral drugs: how well is statistical power reported.

    Science.gov (United States)

    Halpern, Scott D; Barton, Todd D; Gross, Robert; Hennessy, Sean; Berlin, Jesse A; Strom, Brian L

    2005-03-01

    To determine whether there is a difference in average statistical power between pharmacoepidemiologic studies of anti-retroviral adverse drug effects (ADEs) sponsored by for-profit versus non-profit organizations. We studied all published pharmacoepidemiologic studies of ADEs associated with the 15 anti-retroviral drugs approved through the end of 1999. A priori, the primary outcome was the power of each study to detect a clinically important difference in the risk for an adverse effect among patients exposed to the study drug(s). We could not evaluate this outcome because of the infrequent reporting of power calculations. We instead report the distribution of studies across a 5-tiered measure of adequacy of reporting of statistical power, as well as the sponsorship of these studies. Of 48 studies meeting our inclusion criteria, only 1 (2%) reported either a completed, a priori power calculation or sufficient details for readers to calculate the power to detect a pre-defined, clinically important effect. Thirty-five studies (73%) reported the minimum information required for sophisticated readers to determine the power to detect an event rate of interest to them; 6 additional studies (13%) reported confidence intervals around at least one summary effect measure and 6 (13%) provided no indication of power or uncertainty. Of the 41 studies for which sponsorship was determined, only 3 (7%) were sponsored by for-profit organizations. The poor reporting of statistical power in this sample suggests a need for guidelines to improve the reporting of pharmacoepidemiologic studies of ADEs. Future research is needed to determine whether the observed paucity of industry-sponsored observational studies of anti-retroviral ADEs extends to other clinical areas, and if so, to identify the causes of this phenomenon. Copyright 2004 John Wiley & Sons, Ltd.

  5. Statistical tests, P values, confidence intervals, and power: a guide to misinterpretations.

    Science.gov (United States)

    Greenland, Sander; Senn, Stephen J; Rothman, Kenneth J; Carlin, John B; Poole, Charles; Goodman, Steven N; Altman, Douglas G

    2016-04-01

    Misinterpretation and abuse of statistical tests, confidence intervals, and statistical power have been decried for decades, yet remain rampant. A key problem is that there are no interpretations of these concepts that are at once simple, intuitive, correct, and foolproof. Instead, correct use and interpretation of these statistics requires an attention to detail which seems to tax the patience of working scientists. This high cognitive demand has led to an epidemic of shortcut definitions and interpretations that are simply wrong, sometimes disastrously so, and yet these misinterpretations dominate much of the scientific literature. In light of this problem, we provide definitions and a discussion of basic statistics that are more general and critical than typically found in traditional introductory expositions. Our goal is to provide a resource for instructors, researchers, and consumers of statistics whose knowledge of statistical theory and technique may be limited but who wish to avoid and spot misinterpretations. We emphasize how violation of often unstated analysis protocols (such as selecting analyses for presentation based on the P values they produce) can lead to small P values even if the declared test hypothesis is correct, and can lead to large P values even if that hypothesis is incorrect. We then provide an explanatory list of 25 misinterpretations of P values, confidence intervals, and power. We conclude with guidelines for improving statistical interpretation and reporting.

  6. Geotechnical assessments of upgrading power transmission lines

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Andrew [Coffey Geotechnics Ltd., Harrogate (United Kingdom)]

    2012-11-01

    One of the consequences of increasing demand for energy is a corresponding requirement for increased energy distribution. This trend is likely to be magnified by the current tendency to generate power in locations remote from centres of population. New power transmission routes are expensive and awkward to develop, and there are therefore benefits to be gained by upgrading existing routes. However, this in turn raises problems of a different nature. The re-use of any structure must necessarily imply the acceptance of unknowns. The upgrading of transmission lines is no exception to this, particularly when assessing foundations, which in their nature are not visible. A risk-based approach is therefore used. This paper describes some of the geotechnical aspects of the assessment of electric power transmission lines for upgrading. It briefly describes the background, then discusses some of the problems encountered and the methods used to address them. These methods are based mainly on information obtained from desk studies and walkover surveys, with a limited amount of intrusive investigation. (orig.)

  7. A computational framework for estimating statistical power and planning hypothesis-driven experiments involving one-dimensional biomechanical continua.

    Science.gov (United States)

    Pataky, Todd C; Robinson, Mark A; Vanrenterghem, Jos

    2018-01-03

    Statistical power assessment is an important component of hypothesis-driven research but until relatively recently (mid-1990s) no methods were available for assessing power in experiments involving continuum data and in particular those involving one-dimensional (1D) time series. The purpose of this study was to describe how continuum-level power analyses can be used to plan hypothesis-driven biomechanics experiments involving 1D data. In particular, we demonstrate how theory- and pilot-driven 1D effect modeling can be used for sample-size calculations for both single- and multi-subject experiments. For theory-driven power analysis we use the minimum jerk hypothesis and single-subject experiments involving straight-line, planar reaching. For pilot-driven power analysis we use a previously published knee kinematics dataset. Results show that powers on the order of 0.8 can be achieved with relatively small sample sizes, five and ten for within-subject minimum jerk analysis and between-subject knee kinematics, respectively. However, the appropriate sample size depends on a priori justifications of biomechanical meaning and effect size. The main advantage of the proposed technique is that it encourages a priori justification regarding the clinical and/or scientific meaning of particular 1D effects, thereby robustly structuring subsequent experimental inquiry. In short, it shifts focus from a search for significance to a search for non-rejectable hypotheses. Copyright © 2017 Elsevier Ltd. All rights reserved.
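
    One generic way to approach 1D continuum power, shown below under assumed smoothness and effect parameters, is to simulate smooth Gaussian noise, set the family-wise critical value from the null distribution of the maximum t-statistic along the continuum, and count threshold crossings once an effect is added. The paper's analyses rely on random field theory via dedicated tools rather than this brute-force scheme, so the sketch is illustrative only.

```python
# Brute-force Monte Carlo sketch of power for a 1D (continuum) one-sample test.
import numpy as np
from scipy.ndimage import gaussian_filter1d

rng = np.random.default_rng(0)
Q, FWHM = 101, 20.0                    # continuum nodes, noise smoothness
sigma = FWHM / (2 * np.sqrt(2 * np.log(2)))

def smooth_noise(n):
    e = gaussian_filter1d(rng.standard_normal((n, Q)), sigma, axis=1)
    return e / e.std(axis=1, keepdims=True)      # unit-variance smooth noise

def max_t(y):
    return np.abs(y.mean(0) / (y.std(0, ddof=1) / np.sqrt(len(y)))).max()

n, n_sim = 10, 500
null_max = np.array([max_t(smooth_noise(n)) for _ in range(n_sim)])
t_crit = np.quantile(null_max, 0.95)             # 5% family-wise threshold

effect = 0.8 * np.exp(-0.5 * ((np.arange(Q) - 50) / 10) ** 2)  # Gaussian pulse
power = np.mean([max_t(smooth_noise(n) + effect) > t_crit
                 for _ in range(n_sim)])
print(f"critical max-t = {t_crit:.2f}, estimated power = {power:.2f}")
```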

  8. An investigation of the statistical power of neutrality tests based on comparative and population genetic data

    DEFF Research Database (Denmark)

    Zhai, Weiwei; Nielsen, Rasmus; Slatkin, Montgomery

    2009-01-01

    In this report, we investigate the statistical power of several tests of selective neutrality based on patterns of genetic diversity within and between species. The goal is to compare tests based solely on population genetic data with tests using comparative data or a combination of comparative...

  9. Statistical power of likelihood ratio and Wald tests in latent class models with covariates

    NARCIS (Netherlands)

    Gudicha, D.W.; Schmittmann, V.D.; Vermunt, J.K.

    2017-01-01

    This paper discusses power and sample-size computation for likelihood ratio and Wald testing of the significance of covariate effects in latent class models. For both tests, asymptotic distributions can be used; that is, the test statistic can be assumed to follow a central Chi-square under the null hypothesis…

  10. Statistical modeling of the power grid from a wind farm standpoint

    DEFF Research Database (Denmark)

    Farajzadehbibalan, Saber; Ramezani, Mohammad H.; Nielsen, Peter

    2017-01-01

    In this study, we derive a statistical model of a power grid from the wind farm's standpoint based on dynamic principal component analysis. The main advantages of our model compared to the previously developed models are twofold. Firstly, our proposed model benefits from logged data of an offshore…

  11. A testing procedure for wind turbine generators based on the power grid statistical model

    DEFF Research Database (Denmark)

    Farajzadehbibalan, Saber; Ramezani, Mohammad Hossein; Nielsen, Peter

    2017-01-01

    In this study, a comprehensive test procedure is developed to test wind turbine generators with a hardware-in-loop setup. The procedure employs the statistical model of the power grid considering the restrictions of the test facility and system dynamics. Given the model in the latent space, the j...

  12. Statistical assessment – as a part of security assessment applied to a block cipher

    Directory of Open Access Journals (Sweden)

    Ioana Roxana Dragomir

    2016-09-01

    The security provided by block cipher algorithms is a topical issue in current research. The security assessment of a block algorithm involves the assessment and testing of components, the statistical testing of the algorithm and, last but not least, cryptanalysis. This paper approaches a part of this issue by presenting one of the security assessment stages, namely the component analysis, together with the statistical testing of the block cipher. Statistical testing may be considered a first step in the security assessment of a block algorithm. Statistical testing of the randomness of a block cipher represents an essential phase, both for the research-development process and for the assessment process applied to cryptographic primitives, given that block algorithms are used on a large scale in cryptographic applications. Assessing them from a cryptographic point of view is a highly complex task, which cannot be efficiently accomplished by formal methods. This paper presents several statistical methods for carrying out a security assessment of a block algorithm.
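
    The simplest of the randomness tests alluded to above is the NIST-style monobit (frequency) test. The sketch below applies it to os.urandom output as a stand-in for block-cipher output; the pass/fail cutoff of 0.01 is the conventional NIST significance level.

```python
# NIST SP 800-22 style monobit (frequency) test on a byte stream.
import math
import os

def monobit_p_value(data: bytes) -> float:
    """Two-sided p-value that the bit stream is balanced (H0: random)."""
    n = len(data) * 8
    ones = sum(bin(b).count("1") for b in data)
    s = abs(2 * ones - n) / math.sqrt(n)
    return math.erfc(s / math.sqrt(2))

block = os.urandom(1 << 16)        # stand-in for block-cipher output
p = monobit_p_value(block)
print(f"monobit p-value = {p:.4f} -> {'pass' if p >= 0.01 else 'fail'}")
```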

  13. Condition Assessment for Power Transformer Using Health Index

    Science.gov (United States)

    Wang, Jian; Wu, Kuihua; Zhu, Wenbing; Gu, Chao

    2017-05-01

    To improve forecasting accuracy and ensure reliable and stable operation of transformers, a condition assessment model based on a health index is proposed. The transformer data are divided into different levels and parts, and multi-parameter statistical analysis is carried out. The indicator system is scored and weighted by computing history data (inspection and maintenance, family defects, basic information and loading history) and condition data (such as routine test data). The condition parameters, which are classified at the component level, are scored and weighted using statistical calculations. Using the statistical package SPSS (Statistical Product and Service Solutions), multivariate statistical analysis was carried out. On the basis of studying the relationships between the various parameters, a health evaluation model based on contribution analysis is presented. A condition-based evaluation tool that quantifies power transformer degradation and clarifies the relationship between the health index components is put forward. Results are presented to verify the validity and feasibility of the evaluation model and assessment algorithm. This paper provides a scientific method for the power transmission and transformation field.
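
    The aggregation step reduces to a weighted sum of component scores. A minimal sketch with hypothetical sub-scores and weights is given below; the paper derives its actual weights from multivariate analysis in SPSS.

```python
# Hedged sketch of a weighted health-index aggregation for a transformer.
scores = {                 # component scores on a 0 (bad) .. 100 (good) scale
    "routine_tests": 78.0,
    "inspection_maintenance": 85.0,
    "family_defects": 90.0,
    "loading_history": 70.0,
}
weights = {                # hypothetical contribution weights, summing to 1
    "routine_tests": 0.40,
    "inspection_maintenance": 0.25,
    "family_defects": 0.15,
    "loading_history": 0.20,
}
health_index = sum(weights[k] * scores[k] for k in scores)
print(f"Transformer health index: {health_index:.1f}/100")
```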

  14. Assessment of Statistical Methodologies and Pitfalls of Dissertations Carried Out at National Cancer Institute, Cairo University

    Science.gov (United States)

    Allam, Rasha M; Noaman, Maissa K; Moneer, Manar M; Elattar, Inas A

    2017-01-01

    Purpose: To identify statistical errors and pitfalls in dissertations performed as part of the requirements for the Medical Doctorate (MD) degree at the National Cancer Institute (NCI), Cairo University (CU), with the goal of improving the quality of medical research. Methods: A critical assessment of 62 MD dissertations conducted in 3 departments at NCI, CU, between 2009 and 2013 was carried out regarding statistical methodology and presentation of the results. To detect differences in study characteristics over time, the dissertations were grouped into two periods: 2009-2010 and 2011-2013. Results: Statistical methods were appropriate in only 13 studies (24.5%). The most common statistical tests applied were the chi-square, log-rank, and Mann-Whitney tests. Four studies estimated sample size and/or power. Only 37.1% and 38.7% of dissertation results supported the aims and answered the research questions, respectively. Most results were misinterpreted (82.3%), with misuse of statistical terminology (77.4%). Tabular and graphical data display was independently informative in only 36 dissertations (58.1%), with accurate titles and labels in only 17 (27.4%). Statistical tests fulfilled their assumptions in only 29 studies, with evident misuse in 33. Ten dissertations reported non-significance regarding their primary outcome measure; the median power of the test was 35.5% (range: 6-60%). There was no significant change in these characteristics between the two time periods. Conclusion: MD dissertations at NCI have many epidemiological and statistical defects that may compromise the external validity of the results. It is recommended to involve a biostatistician from the very start to improve study design, sample size calculation, and the estimation of end points and measures.

  15. Comparative Assessment of Statistical Downscaling Methods for Precipitation in Florida

    Science.gov (United States)

    Goly, A.; Teegavarapu, R. S.

    2012-12-01

    Several statistical downscaling models have been developed in the past couple of decades to assess the hydrologic impacts of climate change by projecting station-scale hydrological variables from large-scale atmospheric variables simulated by General Circulation Models (GCMs). GCMs in general are capable of capturing the large-scale circulation patterns and correctly model smoothly varying fields such as surface pressure, but it is extremely unlikely that these models properly reproduce non-smooth fields such as precipitation. This paper presents and compares different statistical downscaling methods involving Multiple Linear Regression (MLR), Positive Coefficient Regression (PCR), Stepwise Regression (SWR) and Support Vector Machine (SVM) for estimation of rainfall in the state of Florida, USA, which is considered to be a climatically sensitive region. The explanatory variables/predictors used in the current study are mean sea-level pressure, air temperature, geo-potential height, specific humidity, U-wind and V-wind. Principal Component Analysis (PCA) and Fuzzy C-Means (FCM) clustering techniques are used to reduce the dimensionality of the dataset and identify the circulation patterns affecting precipitation in different clusters. Downscaled precipitation data obtained from the widely used Bias-Correction Spatial Disaggregation (BCSD) downscaling technique are compared along with the other downscaling methods. The performance of the models is evaluated using various performance measures, and it was found that the SVM model performed better than all the other models in reproducing most monthly rainfall statistics at 18 locations. Output from the third generation Canadian Global Climate Model (CGCM3) GCM for the A1B scenario was used for future precipitation projection. For the projection period 2001-2010, MLR was used and evaluated as a substitute for the traditional spatial interpolation linking the variables at the GCM grid to the NCEP grid scale. It has been found that the…
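
    A stripped-down version of the model comparison, using synthetic predictors in place of the NCEP/GCM fields and cross-validated R^2 as a stand-in performance measure, might look like this.

```python
# Hedged comparison sketch of downscaling regressions (synthetic data;
# the study uses pressure, temperature, humidity and wind predictors).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

rng = np.random.default_rng(11)
X = rng.normal(size=(500, 6))                  # six large-scale predictors
rain = np.maximum(0, 5 + X @ rng.normal(size=6) + 2 * np.sin(X[:, 0])
                  + rng.normal(0, 1, 500))     # nonlinear station rainfall

for name, model in [("MLR", LinearRegression()),
                    ("SVM", SVR(C=10.0, gamma="scale"))]:
    r2 = cross_val_score(model, X, rain, cv=5, scoring="r2").mean()
    print(f"{name}: mean cross-validated R^2 = {r2:.3f}")
```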

  16. A comparative assessment of statistical methods for extreme weather analysis

    Science.gov (United States)

    Schlögl, Matthias; Laaha, Gregor

    2017-04-01

    Extreme weather exposure assessment is of major importance for scientists and practitioners alike. We compare different extreme value approaches and fitting methods with respect to their value for assessing extreme precipitation and temperature impacts. Based on an Austrian data set from 25 meteorological stations representing diverse meteorological conditions, we assess the added value of partial duration series over the standardly used annual maxima series in order to give recommendations for performing extreme value statistics of meteorological hazards. Results show the merits of the robust L-moment estimation, which yielded better results than maximum likelihood estimation in 62 % of all cases. At the same time, the results question the general assumption that the threshold excess approach (employing partial duration series, PDS) is superior to the block maxima approach (employing annual maxima series, AMS) due to information gain. For low return periods (non-extreme events) the PDS approach tends to overestimate return levels as compared to the AMS approach, whereas the opposite behavior was found for high return levels (extreme events). In extreme cases, an inappropriate threshold was shown to lead to considerable biases that may outweigh the possible gain of information from including additional extreme events by far. This effect was visible neither from the square-root criterion nor from standardly used graphical diagnostics (mean residual life plot), but only from a direct comparison of AMS and PDS in synoptic quantile plots. We therefore recommend performing the AMS and PDS approaches simultaneously in order to select the best suited approach. This will make the analyses more robust, in cases where threshold selection and dependency introduce biases to the PDS approach, but also in cases where the AMS contains non-extreme events that may introduce similar biases. For assessing the performance of extreme events we recommend conditional performance measures that focus…
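
    The AMS/PDS contrast is easy to reproduce on synthetic data: fit a GEV to annual maxima and a GPD to threshold excesses, then compare return levels. The sketch below uses maximum likelihood fits from SciPy (the study also uses L-moments) and invented Gumbel-distributed "daily" data, so the numbers are illustrative only.

```python
# Block-maxima (AMS) vs threshold-excess (PDS) 100-year return levels.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
years, per_year = 50, 365
daily = rng.gumbel(10, 3, size=(years, per_year))   # synthetic daily series

# AMS: fit a GEV to annual maxima.
ams = daily.max(axis=1)
c, loc, scale = stats.genextreme.fit(ams)
rl_ams = stats.genextreme.ppf(1 - 1 / 100, c, loc, scale)

# PDS: fit a GPD to excesses over a high threshold.
u = np.quantile(daily, 0.98)
exc = daily[daily > u] - u
cp, _, scalep = stats.genpareto.fit(exc, floc=0)
lam = exc.size / years                 # mean exceedances per year
rl_pds = u + stats.genpareto.ppf(1 - 1 / (100 * lam), cp, 0, scalep)

print(f"100-year return level: AMS={rl_ams:.1f}, PDS={rl_pds:.1f}")
```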

  17. Environmental assessment of submarine power cables

    Energy Technology Data Exchange (ETDEWEB)

    Isus, Daniel; Martinez, Juan D. [Grupo General Cable Sistemas, S.A., 08560-Manlleu, Barcelona (Spain)]; Arteche, Amaya; Del Rio, Carmen; Madina, Virginia [Tecnalia Research and Innovation, 20009 San Sebastian (Spain)]

    2011-03-15

    Extensive analyses conducted by the European Community revealed that offshore wind energy has relatively benign effects on the marine environment by comparison with other forms of electric power generation [1]. However, the materials employed in offshore wind power farms undergo major changes when confined to the marine environment at extreme conditions (saline medium, hydrostatic pressure, etc.), which can produce an important corrosion effect. This phenomenon can affect, on the one hand, the material from a structural viewpoint and, on the other hand, the marine environment. In this sense, to better understand the environmental impacts of generating electricity from offshore wind energy, this study evaluated the life cycle assessment for some new designs of submarine power cables developed by General Cable. To achieve this goal, three approaches have been carried out: leaching tests, eco-toxicity tests and Life Cycle Assessment (LCA) methodologies. All of them are aimed at obtaining quantitative data for the environmental assessment of the selected submarine cables. LCA is a method used to assess environmental aspects and potential impacts of a product or activity. LCA does not include financial and social factors, which means that the results of an LCA cannot exclusively form the basis for an assessment of a product's sustainability. The leaching test results allowed the conclusion that the pH of seawater was not significantly changed by the presence of submarine three-core cables. Although it was slightly higher in the case of the broken cable, pH values were nearly equal. Concerning the heavy metals that could migrate to the aquatic medium, there were significant differences between the two scenarios. The leaching of zinc is the major environmental concern during undersea operation of undamaged cables, whereas the fully sectioned three-core cable produced the migration of significant quantities of copper and iron apart from the zinc migrated from the galvanized steel. Thus, the tar…

  18. Statistical power of model selection strategies for genome-wide association studies.

    Directory of Open Access Journals (Sweden)

    Zheyang Wu

    2009-07-01

    Genome-wide association studies (GWAS) aim to identify genetic variants related to diseases by examining the associations between phenotypes and hundreds of thousands of genotyped markers. Because many genes are potentially involved in common diseases and a large number of markers are analyzed, it is crucial to devise an effective strategy to identify truly associated variants that have individual and/or interactive effects, while controlling false positives at the desired level. Although a number of model selection methods have been proposed in the literature, including marginal search, exhaustive search, and forward search, their relative performance has only been evaluated through limited simulations due to the lack of an analytical approach to calculating the power of these methods. This article develops a novel statistical approach for power calculation, derives accurate formulas for the power of different model selection strategies, and then uses the formulas to evaluate and compare these strategies in genetic model spaces. In contrast to previous studies, our theoretical framework allows for random genotypes, correlations among test statistics, and a false-positive control based on GWAS practice. After the accuracy of our analytical results is validated through simulations, they are utilized to systematically evaluate and compare the performance of these strategies in a wide class of genetic models. For a specific genetic model, our results clearly reveal how different factors, such as effect size, allele frequency, and interaction, jointly affect the statistical power of each strategy. An example is provided for the application of our approach to empirical research. The statistical approach used in our derivations is general and can be employed to address the model selection problems in other random predictor settings. We have developed an R package markerSearchPower to implement our formulas, which can be downloaded from the…

  19. Water Polo Game-Related Statistics in Women’s International Championships: Differences and Discriminatory Power

    Science.gov (United States)

    Escalante, Yolanda; Saavedra, Jose M.; Tella, Victor; Mansilla, Mirella; García-Hermoso, Antonio; Dominguez, Ana M.

    2012-01-01

    The aims of this study were (i) to compare women's water polo game-related statistics by match outcome (winning and losing teams) and phase (preliminary, classificatory, and semi-final/bronze medal/gold medal), and (ii) to identify characteristics that discriminate performance in each phase. The game-related statistics of the 124 women's matches played in five International Championships (World and European Championships) were analyzed. Differences between winning and losing teams in each phase were determined using the chi-squared test. A discriminant analysis was then performed according to context in each of the three phases. It was found that the game-related statistics differentiate the winning from the losing teams in each phase of an international championship. The differentiating variables were both offensive (centre goals, power-play goals, counterattack goals, assists, offensive fouls, steals, blocked shots, and won sprints) and defensive (goalkeeper-blocked shots, goalkeeper-blocked inferiority shots, and goalkeeper-blocked 5-m shots). The discriminant analysis showed the game-related statistics to discriminate performance in all phases: preliminary, classificatory, and final (92%, 90%, and 83%, respectively). Two variables were discriminatory by match outcome (winning or losing teams) in all three phases: goals and goalkeeper-blocked shots. Key points: (i) it was in the preliminary phase that more than one variable was involved in this differentiation, including both offensive and defensive aspects of the game; (ii) the game-related statistics were found to have a high discriminatory power in predicting the result of matches, with shots and goalkeeper-blocked shots being discriminatory variables in all three phases; (iii) knowledge of the characteristics of winning teams' game-related statistics in women's water polo, and of their power to predict match outcomes, will allow coaches to take these characteristics into account when planning training and match preparation.

  20. Statistical evaluation of malfunctions in wind power plants; Statistische Fehlerauswertungen beim Windkraftwerksbetrieb zur Optimierung der Verfuegbarkeit

    Energy Technology Data Exchange (ETDEWEB)

    Fleischer, C.; Sucrow, W. [E.ON Energy Projects GmbH, Muenchen (Germany)]

    2007-07-01

    Wind energy poses new challenges: the availability of wind power plants has to be increased and breakdowns minimised, so that, ultimately, a reduction of operational management costs can be achieved. The article reviews the efforts undertaken by operational management to adjust manufacturers' frequently inadequate documentation and to provide operations, after laborious classification, with statistical evaluations of incoming error messages. These statistical evaluations lead to the identification of breakdown times as well as idle times. Finally, operating costs can be monitored in cents per kilowatt-hour. (orig.)

  1. Automated Data Collection for Determining Statistical Distributions of Module Power Undergoing Potential-Induced Degradation

    DEFF Research Database (Denmark)

    Hacke, Peter; Spataru, Sergiu

    We propose a method for increasing the frequency of data collection and reducing the time and cost of accelerated lifetime testing of photovoltaic modules undergoing potential-induced degradation (PID). This consists of in-situ measurements of dark current-voltage curves of the modules at elevated stress temperature, their use to determine the maximum power at 25°C standard test conditions (STC), and distribution statistics for determining degradation rates as a function of stress level. The semi-continuous data obtained by this method clearly show degradation curves of the maximum power…

  2. Nordel - Availability statistics for thermal power plants 1995. (Denmark, Finland, Sweden); Nordel - Tillgaenglighetsstatistik foer vaermekraft 1995

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-31

    The power companies of Denmark, Finland and Sweden have agreed on almost identical procedures for the recording and analysing of data describing the availability of power producing units over a certain capacity. Since 1975 the data for all three countries have been summarized and published in a joint report. The purpose of this report is to present some basic information about the operation of power producing units in the three countries. Referring to the report, companies or bodies will be able to exchange more detailed information with other companies or bodies in any of the countries. The report includes power producing units using fossil fuels, nuclear power plants and gas turbines. The information is presented separately for each country, with joint NORDEL statistics for units using fossil fuels, arranged in separate groups according to the type of fossil fuel used. The grouping of power producing units into classes of capacity has been made in accordance with the classification adopted by UNIPEDE/WEC. The definitions in NORDEL's 'Tillgaenglighetsbegrepp foer vaermekraft' ('The Concept of Availability for Thermal Power'), September 1977, are used in this report. The basic data for the availability are in accordance with the recommendations of UNIPEDE/WEC. (author).

  3. On Improving the Quality and Interpretation of Environmental Assessments using Statistical Analysis and Geographic Information Systems

    Science.gov (United States)

    Karuppiah, R.; Faldi, A.; Laurenzi, I.; Usadi, A.; Venkatesh, A.

    2014-12-01

    An increasing number of studies are focused on assessing the environmental footprint of different products and processes, especially using life cycle assessment (LCA). This work shows how combining statistical methods and Geographic Information Systems (GIS) with environmental analyses can help improve the quality of results and their interpretation. Most environmental assessments in literature yield single numbers that characterize the environmental impact of a process/product - typically global or country averages, often unchanging in time. In this work, we show how statistical analysis and GIS can help address these limitations. For example, we demonstrate a method to separately quantify uncertainty and variability in the result of LCA models using a power generation case study. This is important for rigorous comparisons between the impacts of different processes. Another challenge is lack of data that can affect the rigor of LCAs. We have developed an approach to estimate environmental impacts of incompletely characterized processes using predictive statistical models. This method is applied to estimate unreported coal power plant emissions in several world regions. There is also a general lack of spatio-temporal characterization of the results in environmental analyses. For instance, studies that focus on water usage do not put in context where and when water is withdrawn. Through the use of hydrological modeling combined with GIS, we quantify water stress on a regional and seasonal basis to understand water supply and demand risks for multiple users. Another example where it is important to consider regional dependency of impacts is when characterizing how agricultural land occupation affects biodiversity in a region. We developed a data-driven methodology used in conjunction with GIS to determine if there is a statistically significant difference between the impacts of growing different crops on different species in various biomes of the world.

  4. Power Systems Development Facility. Environmental Assessment

    Energy Technology Data Exchange (ETDEWEB)

    1993-06-01

    The objective of the PSDF would be to provide a modular facility which would support the development of advanced, pilot-scale, coal-based power systems and hot gas clean-up components. These pilot-scale components would be designed to be large enough so that the results can be related and projected to commercial systems. The facility would use a modular approach to enhance the flexibility and capability for testing; consequently, overall capital and operating costs when compared with stand-alone facilities would be reduced by sharing resources common to different modules. The facility would identify and resolve technical barriers, as well as provide a structure for long-term testing and performance assessment. It is also intended that the facility would evaluate the operational and performance characteristics of the advanced power systems with both bituminous and subbituminous coals. Five technology-based experimental modules are proposed for the PSDF: (1) an advanced gasifier module, (2) a fuel cell test module, (3) a PFBC module, (4) a combustion gas turbine module, and (5) a module comprised of five hot gas cleanup particulate control devices (PCDs). The final module, the PCD, would capture coal-derived ash and particles from both the PFBC and advanced gasifier gas streams to provide for overall particulate emission control, as well as to protect the combustion turbine and the fuel cell.

  5. Waste Heat to Power Market Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Elson, Amelia [ICF International, Fairfax, VA (United States)]; Tidball, Rick [ICF International, Fairfax, VA (United States)]; Hampson, Anne [ICF International, Fairfax, VA (United States)]

    2015-03-01

    Waste heat to power (WHP) is the process of capturing heat discarded by an existing process and using that heat to generate electricity. In the industrial sector, waste heat streams are generated by kilns, furnaces, ovens, turbines, engines, and other equipment. In addition to processes at industrial plants, waste heat streams suitable for WHP are generated at field locations, including landfills, compressor stations, and mining sites. Waste heat streams are also produced in the residential and commercial sectors, but compared to industrial sites these waste heat streams typically have lower temperatures and much lower volumetric flow rates. The economic feasibility for WHP declines as the temperature and flow rate decline, and most WHP technologies are therefore applied in industrial markets where waste heat stream characteristics are more favorable. This report provides an assessment of the potential market for WHP in the industrial sector in the United States.

  6. Statistical modeling of an integrated boiler for coal fired thermal power plant.

    Science.gov (United States)

    Chandrasekharan, Sreepradha; Panda, Rames Chandra; Swaminathan, Bhuvaneswari Natrajan

    2017-06-01

    Coal-fired thermal power plants play a major role in power production worldwide, as coal is available in abundance. Many of the existing power plants are based on subcritical technology, which can produce power with an efficiency of around 33%. Newer plants, however, are built on either supercritical or ultra-supercritical technology, whose efficiency can be up to 50%. The main objective of the work is to enhance the efficiency of the existing subcritical power plants to compensate for the increasing demand. To achieve this objective, statistical modeling of the boiler units, namely the economizer, drum and superheater, is initially carried out. The effectiveness of the developed models is tested using analysis methods such as R2 analysis and ANOVA (analysis of variance). The dependence of the process variable (temperature) on different manipulated variables is analyzed in the paper. Validation of the models is provided along with error analysis. Response surface methodology (RSM) supported by DOE (design of experiments) is implemented to optimize the operating parameters. Individual models along with the integrated model are used to study and design the predictive control of the coal-fired thermal power plant.
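
    The response-surface step can be illustrated on synthetic data: fit a quadratic surface by least squares and check the R^2, as below. The variables and coefficients are invented stand-ins for the plant's economizer/drum/superheater models.

```python
# Hedged sketch of a quadratic response-surface fit on synthetic boiler data.
import numpy as np

rng = np.random.default_rng(3)
fuel = rng.uniform(0.5, 1.5, 200)              # manipulated variable 1
feedwater = rng.uniform(0.5, 1.5, 200)         # manipulated variable 2
temp = (500 + 40 * fuel - 15 * feedwater
        - 12 * (fuel - 1.1) ** 2 + rng.normal(0, 2, 200))  # response

# Quadratic response surface: temp ~ 1, f, w, f^2, w^2, f*w
X = np.column_stack([np.ones_like(fuel), fuel, feedwater,
                     fuel ** 2, feedwater ** 2, fuel * feedwater])
beta, *_ = np.linalg.lstsq(X, temp, rcond=None)
pred = X @ beta
r2 = 1 - ((temp - pred) ** 2).sum() / ((temp - temp.mean()) ** 2).sum()
print(f"R^2 of fitted surface: {r2:.3f}")
```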

  7. Comparison and validation of statistical methods for predicting power outage durations in the event of hurricanes.

    Science.gov (United States)

    Nateghi, Roshanak; Guikema, Seth D; Quiring, Steven M

    2011-12-01

    This article compares statistical methods for modeling power outage durations during hurricanes and examines the predictive accuracy of these methods. Being able to make accurate predictions of power outage durations is valuable because the information can be used by utility companies to plan their restoration efforts more efficiently. This information can also help inform customers and public agencies of the expected outage times, enabling better collective response planning, and coordination of restoration efforts for other critical infrastructures that depend on electricity. In the long run, outage duration estimates for future storm scenarios may help utilities and public agencies better allocate risk management resources to balance the disruption from hurricanes with the cost of hardening power systems. We compare the out-of-sample predictive accuracy of five distinct statistical models for estimating power outage duration times caused by Hurricane Ivan in 2004. The methods compared include both regression models (accelerated failure time (AFT) and Cox proportional hazard models (Cox PH)) and data mining techniques (regression trees, Bayesian additive regression trees (BART), and multivariate additive regression splines). We then validate our models against two other hurricanes. Our results indicate that BART yields the best prediction accuracy and that it is possible to predict outage durations with reasonable accuracy. © 2011 Society for Risk Analysis.
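
    The spirit of the out-of-sample comparison can be sketched with generic scikit-learn regressors on synthetic outage data. Note that the paper's actual candidates (AFT, Cox PH, BART, MARS) mostly live outside scikit-learn, so the models, features and coefficients below are stand-ins for the validation workflow, not the study's models.

```python
# Cross-validated out-of-sample comparison on synthetic outage durations.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(13)
X = np.column_stack([rng.uniform(20, 70, 1000),    # max wind speed (m/s)
                     rng.uniform(0, 300, 1000),    # distance to landfall (km)
                     rng.integers(0, 2, 1000)])    # urban/rural flag
dur = np.exp(0.03 * X[:, 0] - 0.002 * X[:, 1] + 0.3 * X[:, 2]
             + rng.normal(0, 0.4, 1000))           # outage duration (hours)

for name, m in [("regression tree", DecisionTreeRegressor(max_depth=4)),
                ("boosted ensemble", GradientBoostingRegressor())]:
    mae = -cross_val_score(m, X, dur, cv=5,
                           scoring="neg_mean_absolute_error").mean()
    print(f"{name}: out-of-sample MAE = {mae:.2f} h")
```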

  8. Determinants of Judgments of Explanatory Power: Credibility, Generality, and Statistical Relevance

    Directory of Open Access Journals (Sweden)

    Matteo Colombo

    2017-09-01

    Explanation is a central concept in human psychology. Drawing upon philosophical theories of explanation, psychologists have recently begun to examine the relationship between explanation, probability and causality. Our study advances this growing literature at the intersection of psychology and philosophy of science by systematically investigating how judgments of explanatory power are affected by (i) the prior credibility of an explanatory hypothesis, (ii) the causal framing of the hypothesis, (iii) the perceived generalizability of the explanation, and (iv) the relation of statistical relevance between hypothesis and evidence. Collectively, the results of our five experiments support the hypothesis that the prior credibility of a causal explanation plays a central role in explanatory reasoning: first, because of the presence of strong main effects on judgments of explanatory power, and second, because of the gate-keeping role it has for other factors. Highly credible explanations are not susceptible to causal framing effects, but they are sensitive to the effects of normatively relevant factors: the generalizability of an explanation, and its statistical relevance for the evidence. These results advance current literature in the philosophy and psychology of explanation in three ways. First, they yield a more nuanced understanding of the determinants of judgments of explanatory power, and the interaction between these factors. Second, they show the close relationship between prior beliefs and explanatory power. Third, they elucidate the nature of abductive reasoning.

  9. Statistical interpretation of transient current power-law decay in colloidal quantum dot arrays

    Energy Technology Data Exchange (ETDEWEB)

    Sibatov, R T, E-mail: ren_sib@bk.ru [Ulyanovsk State University, 432000, 42 Leo Tolstoy Street, Ulyanovsk (Russian Federation)]

    2011-08-01

    A new statistical model of charge transport in colloidal quantum dot arrays is proposed. It takes into account the Coulomb blockade forbidding multiple occupancy of nanocrystals and the influence of the energetic disorder of interdot space. The model explains power-law current transients and the presence of the memory effect. The fractional differential analogue of Ohm's law is found phenomenologically for nanocrystal arrays. The model combines ideas that were considered as conflicting by other authors: the Scher-Montroll idea about the power-law distribution of waiting times in localized states for disordered semiconductors is applied taking into account the Coulomb blockade; Novikov's condition about the asymptotic power-law distribution of time intervals between successive current pulses in conduction channels is fulfilled; and the carrier injection blocking predicted by Ginger and Greenham (2000 J. Appl. Phys. 87 1361) takes place.
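
    The empirical signature of such transport, a power-law transient current, can be recovered from data by a straight-line fit in log-log coordinates, as in the sketch below. This is illustrative only; the paper obtains the exponent from the transport model itself, not from curve fitting.

```python
# Recovering a power-law decay exponent from a simulated transient current.
import numpy as np

rng = np.random.default_rng(17)
t = np.logspace(-3, 2, 200)                 # time, arbitrary units
alpha = 0.7                                 # dispersion parameter (assumed)
current = t ** (-alpha) * np.exp(rng.normal(0, 0.05, t.size))  # noisy I(t)

slope, intercept = np.polyfit(np.log(t), np.log(current), 1)
print(f"fitted exponent = {-slope:.3f} (true alpha = {alpha})")
```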

  10. Environmental impact assessment of coal power plants in operation

    Science.gov (United States)

    Bartan, Ayfer; Kucukali, Serhat; Ar, Irfan

    2017-11-01

    Coal power plants constitute an important component of the energy mix in many countries. However, coal power plants can cause several environmental risks such as: climate change and biodiversity loss. In this study, a tool has been proposed to calculate the environmental impact of a coal-fired thermal power plant in operation by using multi-criteria scoring and fuzzy logic method. We take into account the following environmental parameters in our tool: CO, SO2, NOx, particulate matter, fly ash, bottom ash, the cooling water intake impact on aquatic biota, and the thermal pollution. In the proposed tool, the boundaries of the fuzzy logic membership functions were established taking into account the threshold values of the environmental parameters which were defined in the environmental legislation. Scoring of these environmental parameters were done with the statistical analysis of the environmental monitoring data of the power plant and by using the documented evidences that were obtained during the site visits. The proposed method estimates each environmental impact factor level separately and then aggregates them by calculating the Environmental Impact Score (EIS). The proposed method uses environmental monitoring data and documented evidence instead of using simulation models. The proposed method has been applied to the 4 coal-fired power plants that have been in operation in Turkey. The Environmental Impact Score was obtained for each power plant and their environmental performances were compared. It is expected that those environmental impact assessments will contribute to the decision-making process for environmental investments to those plants. The main advantage of the proposed method is its flexibility and ease of use.
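
    A toy version of the fuzzy scoring and aggregation is sketched below with invented triangular membership functions, thresholds, and weights; the paper calibrates these from environmental legislation and monitoring data, so nothing here reflects its actual values.

```python
# Hedged sketch of multi-criteria fuzzy scoring toward an impact score.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Example: fuzzify a measured SO2 level (mg/Nm3) into low/medium/high impact.
so2 = 150.0
memberships = np.array([tri(so2, 0, 50, 150),     # low impact
                        tri(so2, 100, 200, 300),  # medium impact
                        tri(so2, 250, 350, 450)]) # high impact
impact_levels = np.array([1.0, 2.0, 3.0])
so2_score = (memberships * impact_levels).sum() / memberships.sum()

weights = {"SO2": 0.4, "NOx": 0.3, "thermal": 0.3}   # hypothetical weights
eis = (weights["SO2"] * so2_score
       + weights["NOx"] * 2.0 + weights["thermal"] * 1.5)  # stand-in scores
print(f"SO2 score = {so2_score:.2f}, Environmental Impact Score = {eis:.2f}")
```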

  12. Statistical assessment of biosimilarity based on relative distance between follow-on biologics.

    Science.gov (United States)

    Kang, Seung-Ho; Chow, Shein-Chung

    2013-02-10

    In this paper, we propose a new three-arm parallel design to investigate biosimilarity between a biosimilar product and an innovator biological product by using a relative distance based on absolute mean differences. In the proposed design, one arm is for the biosimilar product and the other two arms are for the innovator biological product. The distance between the biosimilar product and the innovator biological product is defined as the absolute mean difference between the two products. Similarly, the distance between the innovator biological products from two different batches is defined. The relative distance is defined as the ratio of the two distances, whose denominator is the distance between the innovator biological products from two different batches. In the proposed design, if the relative distance is less than a prespecified margin, the two products are claimed to be biosimilar. Statistical tests based on the ratio estimator and on the linearization method are developed to assess biosimilarity. The power functions of the two tests are derived in large samples and compared numerically. Because the statistical test based on the ratio estimator is more powerful than the linearization method, we recommend the statistical test based on the ratio estimator. Copyright © 2012 John Wiley & Sons, Ltd.
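
    The decision rule is easy to prototype by simulation: estimate the two distances from sample means and claim biosimilarity when their ratio falls below the margin. The sketch below uses naive plug-in estimates and invented parameters, whereas the paper works with a formal ratio-estimator test and derived power functions.

```python
# Monte Carlo sketch of the relative-distance criterion in a three-arm design.
import numpy as np

rng = np.random.default_rng(19)
n, sigma, margin, n_sim = 100, 1.0, 1.5, 10000
mu_test, mu_ref = 10.1, 10.0            # biosimilar vs innovator means (assumed)

claims = 0
for _ in range(n_sim):
    test = rng.normal(mu_test, sigma, n)        # biosimilar arm
    ref1 = rng.normal(mu_ref, sigma, n)         # innovator, batch 1
    ref2 = rng.normal(mu_ref + 0.1, sigma, n)   # innovator, batch 2
    d_tr = abs(test.mean() - ref1.mean())       # test-to-reference distance
    d_rr = abs(ref1.mean() - ref2.mean())       # reference-to-reference distance
    if d_tr / d_rr < margin:                    # relative-distance criterion
        claims += 1
print(f"proportion claiming biosimilarity: {claims / n_sim:.3f}")
```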

  13. Structural vulnerability assessment of electric power grids

    NARCIS (Netherlands)

    Koç, Y.; Warnier, M.; Kooij, R.E.; Brazier, F.

    2014-01-01

    Cascading failures are the typical reasons for blackouts in power grids. The grid topology plays an important role in determining the dynamics of cascading failures in power grids. Measures for vulnerability analysis are crucial to assure a higher level of robustness of power grids. Metrics from…

  14. Increased statistical power with combined independent randomization tests used with multiple-baseline design.

    Science.gov (United States)

    Tyrrell, Pascal N; Corey, Paul N; Feldman, Brian M; Silverman, Earl D

    2013-06-01

    Physicians often assess the effectiveness of treatments on a small number of patients. Multiple-baseline designs (MBDs), based on the Wampold-Worsham (WW) method of randomization and applied to four subjects, have relatively low power. Our objective was to propose another approach with greater power that does not suffer from the time requirements of the WW method applied to a greater number of subjects. The power of a design that involves the combination of two four-subject MBDs was estimated using computer simulation and compared with the four- and eight-subject designs. The effect of a delayed linear response to treatment on the power of the test was also investigated. Power was found to be adequate (>80%) for a standardized mean difference (SMD) greater than 0.8. The effect size associated with 80% power from combined tests was smaller than that of the single four-subject MBD (SMD=1.3) and comparable with the eight-subject MBD (SMD=0.6). A delayed linear response to the treatment resulted in important reductions in power (20-35%). By combining two four-subject MBD tests, an investigator can detect better effect sizes (SMD=0.8) and be able to complete a comparatively timelier and feasible study. Copyright © 2013 Elsevier Inc. All rights reserved.

  15. A statistical framework for power calculations in ChIP-seq experiments

    Science.gov (United States)

    Zuo, Chandler; Keleş, Sündüz

    2014-01-01

    Motivation: ChIP-seq technology enables investigators to study genome-wide binding of transcription factors and mapping of epigenomic marks. Although the availability of basic analysis tools for ChIP-seq data is rapidly increasing, there has not been much progress on the related design issues. A challenging question for designing a ChIP-seq experiment is how deeply should the ChIP and the control samples be sequenced? The answer depends on multiple factors, some of which can be set by the experimenter based on pilot/preliminary data. The sequencing depth of a ChIP-seq experiment is one of the key factors that determine whether all the underlying targets (e.g. binding locations or epigenomic profiles) can be identified with a targeted power. Results: We developed a statistical framework named CSSP (ChIP-seq Statistical Power) for power calculations in ChIP-seq experiments by considering a local Poisson model, which is commonly adopted by many peak callers. Evaluations with simulations and data-driven computational experiments demonstrate that this framework can reliably estimate the power of a ChIP-seq experiment at different sequencing depths based on pilot data. Furthermore, it provides an analytical approach for calculating the required depth for a targeted power while controlling the false discovery rate at a user-specified level. Hence, our results enable researchers to use their own or publicly available data for determining required sequencing depths of their ChIP-seq experiments and potentially make better use of the multiplexing functionality of the sequencers. Evaluation of power for multiple public ChIP-seq datasets indicates that, currently, typical ChIP-seq studies are powered well for detecting large fold changes of ChIP enrichment over the control sample, but they have considerably less power for detecting smaller fold changes. Availability: Available at www.stat.wisc.edu/~zuo/CSSP. Contact: keles@stat.wisc.edu. Supplementary information: Supplementary data are available at Bioinformatics online.
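
    The core depth-power tradeoff under a local Poisson model can be illustrated as follows, with an assumed background rate, fold change and significance level standing in for the quantities CSSP estimates from pilot data.

```python
# Detection power for one bound site under a local Poisson model,
# as a function of sequencing depth (all rates are assumptions).
import numpy as np
from scipy import stats

def site_power(depth_m_reads, bg_rate_per_m=0.5, fold=5.0, alpha=1e-6):
    """Power to call one bound site significant at a given sequencing depth."""
    lam0 = bg_rate_per_m * depth_m_reads          # expected background count
    c = stats.poisson.ppf(1 - alpha, lam0) + 1    # smallest significant count
    return stats.poisson.sf(c - 1, fold * lam0)   # P(count >= c | bound site)

for depth in (5, 10, 20, 40):                     # depth in millions of reads
    print(f"{depth:>3} M reads: power = {site_power(depth):.3f}")
```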

  16. Power flow as a complement to statistical energy analysis and finite element analysis

    Science.gov (United States)

    Cuschieri, J. M.

    1987-01-01

    Present methods of analysis of the structural response and the structure-borne transmission of vibrational energy use either finite element (FE) techniques or statistical energy analysis (SEA) methods. The FE methods are a very useful tool at low frequencies, where the number of resonances involved in the analysis is rather small. On the other hand, SEA methods can predict with acceptable accuracy the response and energy transmission between coupled structures at relatively high frequencies, where the structural modal density is high and a statistical approach is the appropriate solution. In the mid-frequency range, a relatively large number of resonances exist, which makes the finite element method too costly; SEA methods, on the other hand, can only predict an average response level. In this mid-frequency range a possible alternative is to use power flow techniques, where the input and flow of vibrational energy to excited and coupled structural components can be expressed in terms of input and transfer mobilities. This power flow technique can be extended from low to high frequencies and integrated with established FE models at low frequencies and SEA models at high frequencies, providing a means of verifying the method. This method of structural analysis using power flow and mobility methods, and its integration with SEA and FE analysis, is applied to the case of two thin beams joined together at right angles.

  17. On detection and assessment of statistical significance of Genomic Islands

    Directory of Open Access Journals (Sweden)

    Chaudhuri Probal

    2008-04-01

    Full Text Available Abstract Background Many of the available methods for detecting Genomic Islands (GIs) in prokaryotic genomes use markers such as transposons, proximal tRNAs, flanking repeats etc., or they use other supervised techniques requiring training datasets. Most of these methods are primarily based on the biases in GC content or codon and amino acid usage of the islands. However, these methods either do not use any formal statistical test of significance or use statistical tests for which the critical values and the P-values are not adequately justified. We propose a method, which is unsupervised in nature and uses Monte-Carlo statistical tests based on randomly selected segments of a chromosome. Such tests are supported by precise statistical distribution theory, and consequently, the resulting P-values are quite reliable for making the decision. Results Our algorithm (named Design-Island, an acronym for Detection of Statistically Significant Genomic Island) runs in two phases. Some 'putative GIs' are identified in the first phase, and those are refined into smaller segments containing horizontally acquired genes in the refinement phase. This method is applied to the Salmonella typhi CT18 genome, leading to the discovery of several new pathogenicity, antibiotic resistance and metabolic islands that were missed by earlier methods. Many of these islands contain mobile genetic elements like phage-mediated genes, transposons, integrase and IS elements, confirming their horizontal acquisition. Conclusion The proposed method is based on statistical tests supported by precise distribution theory and reliable P-values, along with a technique for visualizing statistically significant islands. The performance of our method is better than many other well-known methods in terms of sensitivity and accuracy, and in terms of specificity, it is comparable to other methods.
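
    Design-Island's full test statistic is defined in the paper; the underlying Monte Carlo idea, comparing a candidate segment's composition against randomly selected segments of the chromosome to obtain an empirical P-value, can be sketched as follows (GC content as the test statistic and the window length are illustrative assumptions, not the authors' choices):

        import random

        def gc_content(seq):
            seq = seq.upper()
            return (seq.count("G") + seq.count("C")) / max(len(seq), 1)

        def monte_carlo_p(genome, start, length, n_draws=999, seed=7):
            """Empirical P-value for the GC content of genome[start:start+length],
            compared against randomly placed segments of the same length."""
            rng = random.Random(seed)
            obs = gc_content(genome[start:start + length])
            mean_gc = gc_content(genome)
            extreme = 0
            for _ in range(n_draws):
                s = rng.randrange(0, len(genome) - length)
                samp = gc_content(genome[s:s + length])
                if abs(samp - mean_gc) >= abs(obs - mean_gc):
                    extreme += 1
            return (extreme + 1) / (n_draws + 1)

        # toy usage with a synthetic chromosome
        rng0 = random.Random(0)
        genome = "".join(rng0.choice("ACGT") for _ in range(100000))
        print(monte_carlo_p(genome, start=500, length=2000))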

  18. Theoretical Foundations and Mathematical Formalism of the Power-Law Tailed Statistical Distributions

    Directory of Open Access Journals (Sweden)

    Giorgio Kaniadakis

    2013-09-01

    Full Text Available We present the main features of the mathematical theory generated by the κ-deformed exponential function $\exp_\kappa(x) = (\sqrt{1 + \kappa^2 x^2} + \kappa x)^{1/\kappa}$, with 0 ≤ κ < 1, developed in the last twelve years, which turns out to be a continuous one-parameter deformation of the ordinary mathematics generated by the Euler exponential function. The κ-mathematics has its roots in special relativity and furnishes the theoretical foundations of the κ-statistical mechanics predicting power-law tailed statistical distributions, which have been observed experimentally in many physical, natural and artificial systems. After introducing the κ-algebra, we present the associated κ-differential and κ-integral calculus. Then, we obtain the corresponding κ-exponential and κ-logarithm functions and give the κ-version of the main functions of the ordinary mathematics.
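
    The quoted definition is easy to check numerically. A minimal sketch (using the standard κ-logarithm $\ln_\kappa(x) = (x^\kappa - x^{-\kappa})/(2\kappa)$ as the inverse) verifies the inversion property and the κ → 0 limit to the ordinary exponential:

        import numpy as np

        def exp_kappa(x, kappa):
            """Kaniadakis kappa-exponential: (sqrt(1 + k^2 x^2) + k x)^(1/k)."""
            if kappa == 0.0:
                return np.exp(x)
            return (np.sqrt(1.0 + kappa**2 * x**2) + kappa * x) ** (1.0 / kappa)

        def log_kappa(x, kappa):
            """Kaniadakis kappa-logarithm, the inverse of exp_kappa."""
            if kappa == 0.0:
                return np.log(x)
            return (x**kappa - x**(-kappa)) / (2.0 * kappa)

        x = np.linspace(-2, 5, 8)
        print(np.allclose(log_kappa(exp_kappa(x, 0.3), 0.3), x))   # True: inverses
        print(np.allclose(exp_kappa(x, 1e-8), np.exp(x)))          # kappa -> 0 limit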

  19. WATER POLO GAME-RELATED STATISTICS IN WOMEN'S INTERNATIONAL CHAMPIONSHIPS: DIFFERENCES AND DISCRIMINATORY POWER

    Directory of Open Access Journals (Sweden)

    Yolanda Escalante

    2012-09-01

    Full Text Available The aims of this study were (i) to compare women's water polo game-related statistics by match outcome (winning and losing teams) and phase (preliminary, classificatory, and semi-final/bronze medal/gold medal), and (ii) to identify characteristics that discriminate performances for each phase. The game-related statistics of the 124 women's matches played in five International Championships (World and European Championships) were analyzed. Differences between winning and losing teams in each phase were determined using the chi-squared statistic. A discriminant analysis was then performed according to context in each of the three phases. It was found that the game-related statistics differentiate the winning from the losing teams in each phase of an international championship. The differentiating variables were both offensive (centre goals, power-play goals, counterattack goals, assists, offensive fouls, steals, blocked shots, and won sprints) and defensive (goalkeeper-blocked shots, goalkeeper-blocked inferiority shots, and goalkeeper-blocked 5-m shots). The discriminant analysis showed the game-related statistics to discriminate performance in all phases: preliminary, classificatory, and final phases (92%, 90%, and 83%, respectively). Two variables were discriminatory by match outcome (winning or losing teams) in all three phases: goals and goalkeeper-blocked shots.

  20. Statistical power considerations in genotype-based recall randomized controlled trials

    DEFF Research Database (Denmark)

    Atabaki-Pasdar, Naeimeh; Ohlsson, Mattias; Shungin, Dmitry

    2016-01-01

    -metformin interactions (vs. placebo) using incidence rates, gene-drug interaction effect estimates and allele frequencies reported in the DPP for the rs8065082 SLC47A1 variant, a metformin transporter-encoding locus. We then calculated statistical power for interactions between genetic risk scores (GRS), metformin...... treatment and intensive lifestyle intervention (ILI) given a range of sampling frames, clinical trial sample sizes, interaction effect estimates, and allele frequencies; outcomes were type 2 diabetes incidence (time-to-event) and change in small LDL particles (continuous outcome). Thereafter, we compared...... simulated scenarios, GBR trials have substantially higher power to observe gene-drug and gene-lifestyle interactions than same-sized conventional RCTs. GBR trials are becoming popular for validation of gene-treatment interactions; our analyses illustrate the strengths and weaknesses of this design....

  1. From probabilistic forecasts to statistical scenarios of short-term wind power production

    DEFF Research Database (Denmark)

    Pinson, Pierre; Papaefthymiou, George; Klockl, Bernd

    2009-01-01

    Short-term (up to 2-3 days ahead) probabilistic forecasts of wind power provide forecast users with highly valuable information on the uncertainty of expected wind generation. Whatever the type of these probabilistic forecasts, they are produced on a per horizon basis, and hence do not inform...... on the development of the forecast uncertainty through forecast series. However, this additional information may be paramount for a large class of time-dependent and multistage decision-making problems, e.g. optimal operation of combined wind-storage systems or multiple-market trading with different gate closures....... This issue is addressed here by describing a method that permits the generation of statistical scenarios of short-term wind generation that accounts for both the interdependence structure of prediction errors and the predictive distributions of wind power production. The method is based on the conversion...
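
    In the published method the conversion relies on a Gaussian copula linking the per-horizon predictive distributions; a heavily simplified sketch of generating interdependent scenarios (synthetic Beta marginals and an assumed exponential correlation structure, not the authors' fitted model) is:

        import numpy as np
        from scipy.stats import norm, beta

        def generate_scenarios(marginal_cdfs_inv, corr_range, n_scen, seed=42):
            """Draw scenarios respecting per-horizon predictive distributions
            and an assumed exponential inter-horizon correlation structure.
            marginal_cdfs_inv = list of quantile functions, one per lead time."""
            h = len(marginal_cdfs_inv)
            lags = np.abs(np.subtract.outer(np.arange(h), np.arange(h)))
            cov = np.exp(-lags / corr_range)          # assumed covariance model
            rng = np.random.default_rng(seed)
            z = rng.multivariate_normal(np.zeros(h), cov, size=n_scen)
            u = norm.cdf(z)                            # uniform marginals
            return np.column_stack(
                [marginal_cdfs_inv[t](u[:, t]) for t in range(h)]
            )

        # toy usage: Beta-distributed predictive marginals for 6 lead times
        inv_cdfs = [lambda q, a=2 + t: beta.ppf(q, a, 5) for t in range(6)]
        scenarios = generate_scenarios(inv_cdfs, corr_range=3.0, n_scen=5)
        print(scenarios.round(2))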

  2. Statistical Analysis of Meteorological Disasters in the Jiangsu Power Grid and Prevention Methods

    Science.gov (United States)

    Lu, Y. L.; Tao, F. B.; Hu, C. B.; Zhou, Z. C.; Sa, L.

    2017-10-01

    With increasingly evident global climate change and the rapid development of Jiangsu's industrial economy, the number of failures in power equipment caused by meteorological disasters has been increasing every year, especially contamination flashover faults and wind deflection faults. Based on extensive investigation and analysis, this paper presents a statistical analysis of the temporal and spatial distribution of the typical meteorological conditions in the Jiangsu area. It also analyses the effects of the main meteorological disasters and the defensive measures taken against them in recent years, and summarizes the fault characteristics of a typical case. Finally, the paper gives some overall advice on preventative measures from the perspective of design planning, applications of new technology, disaster warnings and so on. These recommendations may provide an important reference for the design, maintenance and accident prevention of power grids that experience similar meteorological disasters.

  3. The rat bone marrow micronucleus test--study design and statistical power.

    Science.gov (United States)

    Hayes, Julie; Doherty, Ann T; Adkins, Deborah J; Oldman, Karen; O'Donovan, Michael R

    2009-09-01

    Although the rodent bone marrow micronucleus test has been in routine use for over 20 years, little work has been published to support its experimental design, and all of it has used the mouse rather than the rat. When it was decided to change the strain of rat routinely used in this laboratory to the Han Wistar, a preliminary study was performed to investigate the possible factors influencing experimental variability and to use statistical tools to examine possible study designs. Subsequently, a historical database comprising vehicle controls accumulated from 65 studies was used to establish test acceptance criteria and a strategy for analysing equivocal results. The following conclusions were made: (i) no statistically significant differences were observed in experimental variability within or between control animals; although not statistically significant, the majority of experimental variability seen was found to be between separate counts on the same slide, with minimal differences found between duplicate slides from the same rat or between individual rats; (ii) power analyses showed that, if an equivocal result is obtained after scoring 2000 immature erythrocytes (IE), it is appropriate to re-code the slides and score an additional 4000 IE, i.e. analysing a total of 6000 IE; no meaningful increase in statistical power is gained by scoring >6000 IE; this is consistent with the variability observed between separate counts on the same slide; (iii) there was no significant difference between the control micronucleated immature erythrocyte (MIE) values at 24 and 48 h after dosing or between males and females; therefore, if an unusually low control value at either time point results in apparent small increases in MIE in a treated group, it is valid to pool control values from both time points for clarification; and (iv) similar statistical power can be achieved by scoring 2000 IE from seven rats or 4000 IE from five rats, respectively. However, this is based only

  4. A Web Site that Provides Resources for Assessing Students' Statistical Literacy, Reasoning and Thinking

    Science.gov (United States)

    Garfield, Joan; delMas, Robert

    2010-01-01

    The Assessment Resource Tools for Improving Statistical Thinking (ARTIST) Web site was developed to provide high-quality assessment resources for faculty who teach statistics at the tertiary level but resources are also useful to statistics teachers at the secondary level. This article describes some of the numerous ARTIST resources and suggests…

  5. Statistical Power in Two-Level Hierarchical Linear Models with Arbitrary Number of Factor Levels.

    Science.gov (United States)

    Shin, Yongyun; Lafata, Jennifer Elston; Cao, Yu

    2018-03-01

    As the US health care system undergoes unprecedented changes, the need for adequately powered studies to understand the multiple levels of main and interaction factors that influence patient and other care outcomes in hierarchical settings has taken center stage. We consider two-level models where n lower-level units are nested within each of J higher-level clusters (e.g. patients within practices and practices within networks) and where two factors may have arbitrary a and b factor levels, respectively. Both factors may represent a × b treatment combinations, or one of them may be a pretreatment covariate. Consideration of both factors at the same higher or lower hierarchical level, or one factor per hierarchical level, yields a cluster (C), multisite (M) or split-plot randomized design (S). We express statistical power to detect main, interaction, or any treatment effects as a function of sample sizes (n, J), a and b factor levels, intraclass correlation ρ and effect sizes δ given each design d ∈ {C, M, S}. The power function given a, b, ρ, δ and d determines adequate sample sizes to achieve a minimum power requirement. Next, we compare the impact of the designs on power to facilitate selection of optimal design and sample sizes in a way that minimizes the total cost given budget and logistic constraints. Our approach enables accurate and conservative power computation with a priori knowledge of only three effect size differences regardless of how large a × b is, simplifying previously available computation methods for health services and other research.
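
    For intuition, the familiar two-arm cluster randomized special case (a = 2, b = 1, clusters split evenly between arms) expresses power through the design effect 1 + (n - 1)ρ; this is a minimal sketch of that textbook case, not the paper's general a × b formulas:

        from math import sqrt
        from scipy.stats import norm

        def cluster_power(delta, n, J, rho, alpha=0.05):
            """Approximate power of a two-arm cluster randomized trial:
            J clusters of size n split evenly between arms, standardized
            effect size delta, intraclass correlation rho."""
            deff = 1.0 + (n - 1.0) * rho              # design effect
            se = sqrt(4.0 * deff / (n * J))           # SE of the mean difference
            z_crit = norm.ppf(1.0 - alpha / 2.0)
            return norm.cdf(delta / se - z_crit)

        print(round(cluster_power(delta=0.3, n=20, J=30, rho=0.05), 3))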

  6. Statistical distribution of pioneer vegetation: the role of local stream power

    Science.gov (United States)

    Crouzy, B.; Edmaier, K.; Pasquale, N.; Perona, P.

    2012-12-01

    We discuss results of a flume experiment on the colonization of river bars by pioneer vegetation and focus on the role of a non-constant local stream power in determining the statistics of riverbed and uprooted biomass characteristics (root length, number of roots and stem height). We verify the conjecture that the statistical distribution of riverbed vegetation subject to the action of flood disturbances can be obtained from the distribution before the flooding events combined to the relative resilience to floods of plants with given traits. By using fast growing vegetation (Avena sativa) we can access the competition between growth-associated root stabilization and uprooting by floods. We fix the hydrological timescale (in our experiment the arrival time between periodic flooding events) to be comparable with the biological timescales (plant germination and development rates). The sequence of flooding events is repeated until the surviving riverbed vegetation has grown out of scale with the uprooting capacity of the flood and the competition has stopped. We present and compare laboratory results obtained using converging and parallel channel walls to highlight the role of the local stream power in the process. The convergent geometry can be seen as the laboratory analog of different field conditions. At the scale of the bar it represents regions with flow concentration while at a larger scale it is an analog for a river with convergent banks, for an example see the work on the Tagliamento River by Gurnell and Petts (2006). As expected, we observe that for the convergent geometry the variability in the local stream power results in a longer tail of the distribution of root length for uprooted material compared to parallel geometries with an equal flow rate. More surprisingly, the presence of regions with increased stream power in the convergent experiments allows us to access two fundamentally different regimes. We observe that depending on the development stage

  7. CAPABILITY ASSESSMENT OF MEASURING EQUIPMENT USING STATISTIC METHOD

    Directory of Open Access Journals (Sweden)

    Pavel POLÁK

    2014-10-01

    Full Text Available Capability assessment of the measuring device is one of the methods of process quality control. Only if the measuring device is capable can the capability of the measurement process, and consequently of the production process, be assessed. This paper deals with assessment of the capability of the measuring device using the indices Cg and Cgk.
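
    The abstract does not state the formulas; under one common convention for a type-1 gauge study (20% of the tolerance as the reference spread and a 4s study variation, both convention-dependent assumptions that vary between company standards), Cg and Cgk can be computed as:

        import statistics

        def cg_cgk(measurements, reference, tol, k=0.2, spread=4):
            """Type-1 gauge capability indices under one common convention:
            Cg  = (k * tol) / (spread * s)
            Cgk = (k/2 * tol - |mean - reference|) / (spread/2 * s)
            k=0.2 and spread=4 are convention-dependent assumptions."""
            s = statistics.stdev(measurements)
            bias = abs(statistics.fmean(measurements) - reference)
            cg = (k * tol) / (spread * s)
            cgk = (k / 2.0 * tol - bias) / (spread / 2.0 * s)
            return cg, cgk

        data = [10.02, 10.01, 10.03, 10.00, 10.02, 10.01, 10.02, 10.03]
        print(cg_cgk(data, reference=10.00, tol=0.5))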

  8. How Teachers Understand and Use Power in Alternative Assessment

    OpenAIRE

    Kelvin H. K. Tan

    2012-01-01

    “Alternative assessment” is an increasingly common and popular discourse in education. The potential benefit of alternative assessment practices is premised on significant changes in assessment practices. However, assessment practices embody power relations between institutions, teachers and students, and these power relationships determine the possibility and the extent of actual changes in assessment practices. Labelling a practice as “alternative assessment” does not guarantee meaningful departure from existing practice.

  9. Electronic Marking of Statistics Assessments for Bioscience Students

    Science.gov (United States)

    Ayres, Karen L.; Underwood, Fiona M.

    2010-01-01

    We describe the main features of a program written to perform electronic marking of quantitative or simple text questions. One of the main benefits is that it can check answers for being consistent with earlier errors, so can cope with a range of numerical questions. We summarise our experience of using it in a statistics course taught to 200…

  10. Using multivariate statistical analysis to assess changes in water ...

    African Journals Online (AJOL)

    Abstract. Multivariate statistical analysis was used to investigate changes in water chemistry at 5 river sites in the Vaal Dam catchment. ... analysis (CCA) showed that the environmental variables used in the analysis, discharge and month of sampling, explained ...

  11. Using multivariate statistical analysis to assess changes in water ...

    African Journals Online (AJOL)

    Multivariate statistical analysis was used to investigate changes in water chemistry at 5 river sites in the Vaal Dam catchment, draining the Highveld grasslands. These grasslands receive more than 8 kg sulphur (S) ha-1·year-1 and 6 kg nitrogen (N) ha-1·year-1 via atmospheric deposition. It was hypothesised that between ...

  12. Assessing Statistical Aspects of Test Fairness with Structural Equation Modelling

    Science.gov (United States)

    Kline, Rex B.

    2013-01-01

    Test fairness and test bias are not synonymous concepts. Test bias refers to statistical evidence that the psychometrics or interpretation of test scores depend on group membership, such as gender or race, when such differences are not expected. A test that is grossly biased may be judged to be unfair, but test fairness concerns the broader, more…

  13. Assessing Knowledge Structures in a Constructive Statistical Learning Environment

    NARCIS (Netherlands)

    P.P.J.L. Verkoeijen (Peter); Tj. Imbos; M.W.J. van de Wiel (Margje); M.P.F. Berger; H.G. Schmidt (Henk)

    2002-01-01

    In this report, the method of free recall is put forward as a tool to evaluate a prototypical statistical learning environment. A number of students from the faculty of Health Sciences, Maastricht University, the Netherlands, were required to write down whatever they could remember of a

  14. Classification of Underlying Causes of Power Quality Disturbances: Deterministic versus Statistical Methods

    Directory of Open Access Journals (Sweden)

    Emmanouil Styvaktakis

    2007-01-01

    Full Text Available This paper presents the two main types of classification methods for power quality disturbances based on underlying causes: deterministic classification, giving an expert system as an example, and statistical classification, with support vector machines (a novel method) as an example. An expert system is suitable when one has a limited amount of data and sufficient power system expert knowledge; however, its application requires a set of threshold values. Statistical methods are suitable when a large amount of data is available for training. Two important issues that determine the effectiveness of a classifier, data segmentation and feature extraction, are discussed. Segmentation of a recorded data sequence is a preprocessing step that partitions the data into segments, each representing a duration containing either an event or a transition between two events. Extraction of features is applied to each segment individually. Some useful features and their effectiveness are then discussed. Some experimental results are included to demonstrate the effectiveness of both systems. Finally, conclusions are given together with a discussion of some future research directions.
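
    As a toy illustration of the statistical branch only (a scikit-learn SVM trained on synthetic per-segment features; the feature definitions and disturbance classes are invented for the example, not taken from the paper):

        import numpy as np
        from sklearn.svm import SVC
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)

        # synthetic per-segment features: [rms voltage (pu), harmonic ratio]
        # two hypothetical classes: 0 = voltage dip, 1 = harmonic distortion
        dips = np.column_stack([rng.normal(0.7, 0.05, 200),
                                rng.normal(0.05, 0.02, 200)])
        harm = np.column_stack([rng.normal(1.0, 0.05, 200),
                                rng.normal(0.25, 0.05, 200)])
        X = np.vstack([dips, harm])
        y = np.array([0] * 200 + [1] * 200)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        clf = SVC(kernel="rbf").fit(X_tr, y_tr)
        print("held-out accuracy:", clf.score(X_te, y_te))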

  15. How Teachers Understand and Use Power in Alternative Assessment

    Directory of Open Access Journals (Sweden)

    Kelvin H. K. Tan

    2012-01-01

    Full Text Available “Alternative assessment” is an increasingly common and popular discourse in education. The potential benefit of alternative assessment practices is premised on significant changes in assessment practices. However, assessment practices embody power relations between institutions, teachers and students, and these power relationships determine the possibility and the extent of actual changes in assessment practices. Labelling a practice as “alternative assessment” does not guarantee meaningful departure from existing practice. Recent research has warned that assessment practices in education cannot be presumed to empower students in ways that enhance their learning. This is partly due to a tendency to speak of power in assessment in undefined terms. Hence, it would be useful to identify the types of power present in assessment practices and the contexts which give rise to them. This paper seeks to examine power in the context of different ways that alternative assessment is practiced and understood by teachers. Research on teachers’ conceptions of alternative assessment is presented, and each of the conceptions is then analysed for insights into teachers’ meanings and practices of power. In particular, instances of sovereign, epistemological and disciplinary power in alternative assessment are identified to illuminate new ways of understanding and using alternative assessment.

  16. Air-chemistry "turbulence": power-law scaling and statistical regularity

    Directory of Open Access Journals (Sweden)

    H.-m. Hsu

    2011-08-01

    Full Text Available With the intent to gain further knowledge on the spectral structures and statistical regularities of surface atmospheric chemistry, the chemical gases (NO, NO2, NOx, CO, SO2, and O3) and aerosol (PM10) measured at 74 air quality monitoring stations over the island of Taiwan are analyzed for the year of 2004 at hourly resolution. They represent a range of surface air quality with a mixed combination of geographic settings, and include urban/rural, coastal/inland, plain/hill, and industrial/agricultural locations. In addition to the well-known semi-diurnal and diurnal oscillations, weekly and intermediate (20~30 days) peaks are also identified with the continuous wavelet transform (CWT). The spectra indicate power-law scaling regions for the frequencies higher than the diurnal and those lower than the diurnal, with average exponents of −5/3 and −1, respectively. These dual exponents are corroborated with those from the detrended fluctuation analysis in the corresponding time-lag regions. These exponents are mostly independent of the averages and standard deviations of time series measured at various geographic settings, i.e., the spatial inhomogeneities. In other words, they possess dominant universal structures. After spectral coefficients from the CWT decomposition are grouped according to the spectral bands and inverted separately, the PDFs of the reconstructed time series for the high-frequency band consistently demonstrate an interesting statistical regularity: −3 power-law scaling for the heavy tails. Such spectral peaks, dual-exponent structures, and power-law scaling in heavy tails are important structural information, but their relations to turbulence and mesoscale variability require further investigations. This could lead to a better understanding of the processes controlling air quality.
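
    Detrended fluctuation analysis, used here to corroborate the spectral exponents, is compact enough to sketch (the standard first-order DFA algorithm, not the authors' code):

        import numpy as np

        def dfa(x, scales):
            """Standard first-order detrended fluctuation analysis.
            Returns the fluctuation F(s) for each window size s in scales."""
            y = np.cumsum(x - np.mean(x))          # integrated profile
            F = []
            for s in scales:
                n_win = len(y) // s
                ms = []
                for i in range(n_win):
                    seg = y[i * s:(i + 1) * s]
                    t = np.arange(s)
                    trend = np.polyval(np.polyfit(t, seg, 1), t)
                    ms.append(np.mean((seg - trend) ** 2))
                F.append(np.sqrt(np.mean(ms)))
            return np.array(F)

        x = np.random.default_rng(1).normal(size=4096)   # white-noise test case
        scales = np.array([16, 32, 64, 128, 256])
        F = dfa(x, scales)
        alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
        print("DFA exponent:", round(alpha, 2))          # ~0.5 for white noise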

  17. Assessment of nuclear power plant siting methods

    Energy Technology Data Exchange (ETDEWEB)

    Rowe, M.D.; Hobbs, B.F.; Pierce, B.L.; Meier, P.M.

    1979-11-01

    Several different methods have been developed for selecting sites for nuclear power plants. This report summarizes the basic assumptions and formal requirements of each method and evaluates conditions under which each is correctly applied to power plant siting problems. It also describes conditions under which different siting methods can produce different results. Included are criteria for evaluating the skill with which site-selection methods have been applied.

  18. A statistical survey of ultralow-frequency wave power and polarization in the Hermean magnetosphere.

    Science.gov (United States)

    James, Matthew K; Bunce, Emma J; Yeoman, Timothy K; Imber, Suzanne M; Korth, Haje

    2016-09-01

    We present a statistical survey of ultralow-frequency wave activity within the Hermean magnetosphere using the entire MErcury Surface, Space ENvironment, GEochemistry, and Ranging magnetometer data set. Wave activity is mapped to the magnetic equatorial plane of the magnetosphere and to magnetic latitude and local times on Mercury using the KT14 magnetic field model. Wave power mapped to the planetary surface indicates the average location of the polar cap boundary. Compressional wave power is dominant throughout most of the magnetosphere, while azimuthal wave power close to the dayside magnetopause provides evidence that interactions between the magnetosheath and the magnetopause, such as the Kelvin-Helmholtz instability, may be driving wave activity. Further evidence of this is found in the average wave polarization: left-handed polarized waves dominate the dawnside magnetosphere, while right-handed polarized waves dominate the duskside. A possible field line resonance event is also presented, where a time-of-flight calculation is used to provide an estimated local plasma mass density of ∼240 amu cm-3.

  19. Statistical analysis of regional capital and operating costs for electric power generation

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez, L.R.; Myers, M.G.; Herrman, J.A.; Provanizano, A.J.

    1977-10-01

    This report presents the results of a three and one-half-month study conducted for Brookhaven National Lab. to develop capital and operating cost relationships for seven electric power generating technologies: oil-, coal-, gas-, and nuclear-fired steam-electric plants, hydroelectric plants, and gas-turbine plants. The methodology is based primarily on statistical analysis of Federal Power Commission data for plant construction and annual operating costs. The development of cost-output relationships for electric power generation is emphasized, considering the effects of scale, technology, and location on each of the generating processes investigated. The regional effects on cost are measured at the Census Region level to be consistent with the Brookhaven Multi-Regional Energy and Interindustry Regional Model of the United States. Preliminary cost relationships for system-wide costs - transmission, distribution, and general expenses - were also derived. These preliminary results cover the demand for transmission and distribution capacity and operating and maintenance costs in terms of system-service characteristics. 15 references, 6 figures, 23 tables.

  20. Case Studies for the Statistical Design of Experiments Applied to Powered Rotor Wind Tunnel Tests

    Science.gov (United States)

    Overmeyer, Austin D.; Tanner, Philip E.; Martin, Preston B.; Commo, Sean A.

    2015-01-01

    The application of statistical Design of Experiments (DOE) to helicopter wind tunnel testing was explored during two powered rotor wind tunnel entries during the summers of 2012 and 2013. These tests were performed jointly by the U.S. Army Aviation Development Directorate Joint Research Program Office and NASA Rotary Wing Project Office, currently the Revolutionary Vertical Lift Project, at NASA Langley Research Center located in Hampton, Virginia. Both entries were conducted in the 14- by 22-Foot Subsonic Tunnel with a small portion of the overall tests devoted to developing case studies of the DOE approach as it applies to powered rotor testing. A 16-47 times reduction in the number of data points required was estimated by comparing the DOE approach to conventional testing methods. The average error for the DOE surface response model for the OH-58F test was 0.95 percent and 4.06 percent for drag and download, respectively. The DOE surface response model of the Active Flow Control test captured the drag within 4.1 percent of measured data. The operational differences between the two testing approaches are identified, but did not prevent the safe operation of the powered rotor model throughout the DOE test matrices.
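
    A response surface model of the kind fitted in these tests is, at its core, a least-squares fit of a full quadratic in the coded factors; a minimal sketch with synthetic data and invented factor names (not the actual test matrices):

        import numpy as np

        # synthetic DOE data: two coded factors and one response
        rng = np.random.default_rng(3)
        pitch = rng.uniform(-1, 1, 30)       # e.g. collective pitch, coded units
        angle = rng.uniform(-1, 1, 30)       # e.g. shaft angle, coded units
        drag = (2 + 1.5 * pitch - 0.8 * angle + 0.6 * pitch * angle
                + 0.4 * pitch**2 + rng.normal(0, 0.05, 30))

        # full quadratic model matrix: 1, x1, x2, x1*x2, x1^2, x2^2
        X = np.column_stack([np.ones_like(pitch), pitch, angle,
                             pitch * angle, pitch**2, angle**2])
        coef, *_ = np.linalg.lstsq(X, drag, rcond=None)
        print(coef.round(2))   # recovers the generating coefficients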

  1. Assessment - A Powerful Lever for Learning

    Directory of Open Access Journals (Sweden)

    Lorna Earl

    2010-05-01

    Full Text Available Classroom assessment practices have been part of schooling for hundreds of years. There are, however, new findings about the nature of learning and about the roles that assessment can play in enhancing learning for all students. This essay provides a brief history of the changing role of assessment in schooling, describes three different purposes for assessment and foreshadows some implications that shifting to a more differentiated view of assessment can have for policy, practice and research.

  2. Integrated Assessment of National Power Sources Using AHP Technique

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Seong Ho; Kim, Tae Woon; Ha, Jae Joo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Chang, Soon H. [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2005-07-01

    Here, various national power sources including conventional as well as renewable energy systems are comparatively assessed in view of multicriteria decisionmaking (MCDM) spaces. The main objectives of this work are to understand priority of power sources and to figure out nuclear power's synergetic role in the national energy sector.

  3. Theoretical assessment of image analysis: statistical vs structural approaches

    Science.gov (United States)

    Lei, Tianhu; Udupa, Jayaram K.

    2003-05-01

    Statistical and structural methods are two major approaches commonly used in image analysis and have demonstrated considerable success. The former is based on statistical properties and stochastic models of the image, and the latter utilizes geometric and topological models. In this study, Markov random field (MRF) theory/model based image segmentation and Fuzzy Connectedness (FC) theory/fuzzy connected object delineation are chosen as the representatives of these two approaches, respectively. The comparative study is focused on their theoretical foundations and main operative procedures. The MRF is defined on a lattice and the associated neighborhood system and is based on the Markov property. The FC method is defined on a fuzzy digital space and is based on fuzzy relations. Locally, MRF is characterized by potentials of cliques, and FC is described by fuzzy adjacency and affinity relations. Globally, MRF is characterized by the Gibbs distribution, and FC is described by fuzzy connectedness. The task of MRF model based image segmentation is to seek a realization of the embedded MRF through a two-level operation: partitioning and labeling. The task of FC object delineation is to extract a fuzzy object from a given scene, through a two-step operation: recognition and delineation. The theoretical foundations which underlie the statistical and structural approaches, and the principles of the main operative procedures in image segmentation by these two approaches, demonstrate more similarities than differences between them. The two approaches can also complement each other, particularly in seed selection, scale formation, affinity and object membership function design for FC, and neighbor set selection and clique potential design for MRF.

  4. A generalized model to estimate the statistical power in mitochondrial disease studies involving 2×k tables.

    Directory of Open Access Journals (Sweden)

    Jacobo Pardo-Seco

    Full Text Available BACKGROUND: Mitochondrial DNA (mtDNA) variation (i.e. haplogroups) has been analyzed in regards to a number of multifactorial diseases. The statistical power of a case-control study determines the a priori probability to reject the null hypothesis of homogeneity between cases and controls. METHODS/PRINCIPAL FINDINGS: We critically review previous approaches to the estimation of statistical power based on the restricted scenario where the number of cases equals the number of controls, and propose a methodology that broadens procedures to more general situations. We developed statistical procedures that consider different disease scenarios, variable sample sizes in cases and controls, and variable numbers of haplogroups and effect sizes. The results indicate that the statistical power of a particular study can improve substantially by increasing the number of controls with respect to cases. In the opposite direction, the power decreases substantially when testing a growing number of haplogroups. We developed mitPower (http://bioinformatics.cesga.es/mitpower/), a web-based interface that implements the new statistical procedures and allows for the computation of a priori statistical power in variable scenarios of case-control study designs, or e.g. the number of controls needed to reach fixed effect sizes. CONCLUSIONS/SIGNIFICANCE: The present study provides statistical procedures for the computation of statistical power in common as well as complex case-control study designs involving 2×k tables, with special application (but not exclusively) to mtDNA studies. In order to reach a wide range of researchers, we also provide a friendly web-based tool, mitPower, that can be used in both retrospective and prospective case-control disease studies.
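
    The exact procedures are implemented in mitPower; the textbook large-sample approximation for a 2×k table uses the noncentral chi-square distribution and handles unequal numbers of cases and controls directly, e.g.:

        import numpy as np
        from scipy.stats import chi2, ncx2

        def power_2xk(p_case, p_ctrl, n_case, n_ctrl, alpha=0.05):
            """Approximate power of the 2 x k chi-square homogeneity test,
            given haplogroup frequencies in cases and controls."""
            p_case, p_ctrl = np.asarray(p_case), np.asarray(p_ctrl)
            N = n_case + n_ctrl
            wc, wt = n_case / N, n_ctrl / N
            pooled = wc * p_case + wt * p_ctrl
            p1 = np.concatenate([wc * p_case, wt * p_ctrl])   # cells under H1
            p0 = np.concatenate([wc * pooled, wt * pooled])   # cells under H0
            w2 = np.sum((p1 - p0) ** 2 / p0)                  # Cohen's w squared
            df = len(p_case) - 1
            crit = chi2.ppf(1 - alpha, df)
            return 1 - ncx2.cdf(crit, df, N * w2)             # noncentrality N*w2

        # illustrative frequencies; note controls can outnumber cases
        print(round(power_2xk([0.45, 0.35, 0.20], [0.40, 0.35, 0.25], 500, 1000), 3))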

  5. Statistical Techniques for Assessing water‐quality effects of BMPs

    Science.gov (United States)

    Walker, John F.

    1994-01-01

    Little has been published on the effectiveness of various management practices in small rural lakes and streams at the watershed scale. In this study, statistical techniques were used to test for changes in water-quality data from watersheds where best management practices (BMPs) were implemented. Reductions in data variability due to climate and seasonality were accomplished through the use of regression methods. This study discusses the merits of using storm-mass-transport data as a means of improving the ability to detect BMP effects on stream-water quality. Statistical techniques were applied to suspended-sediment records from three rural watersheds in Illinois for the period 1981-84. None of the techniques identified changes in suspended sediment, primarily because of the small degree of BMP implementation and because of potential errors introduced through the estimation of storm-mass transport. A Monte Carlo sensitivity analysis was used to determine the level of discrete change that could be detected for each watershed. In all cases, the use of regressions improved the ability to detect trends.

  6. Computer-aided assessment in statistics: the CAMPUS project

    Directory of Open Access Journals (Sweden)

    Neville Hunt

    1998-12-01

    Full Text Available The relentless drive for 'efficiency' in higher education, and the consequent increase in workloads, has given university teachers a compelling incentive to investigate alternative forms of assessment. Some forms of assessment with a clear educational value can no longer be entertained because of the burden placed on the teacher. An added concern is plagiarism, which anecdotal evidence would suggest is on the increase, yet which is difficult to detect in large modules with more than one assessor. While computer-aided assessment (CAA) has an enthusiastic following, it is not clear to many teachers that it either reduces workloads or reduces the risk of cheating. In an ideal world, most teachers would prefer to give individual attention and personal feedback to each student when marking their work. In this sense CAA must be seen as second best and will therefore be used only if it is seen to offer significant benefits in terms of reduced workloads or increased validity.

  7. Hysteresis and Power-Law Statistics during temperature induced martensitic transformation

    Energy Technology Data Exchange (ETDEWEB)

    Paul, Arya [S.N.Bose National Centre for Basic Sciences, JD-Block, Sector-III, Salt Lake, Kolkata 700098 (India); Sengupta, Surajit [Indian Association for the Cultivation of Science, 2A and 2B Raja S.C.Mullick Rd, Jadavpur, Kolkata 700032 (India); Rao, Madan, E-mail: aryapaul@gmail.com [National Center for Biological Sciences, GKVK, Bellary Road, Bangalore 560065 (India)

    2011-09-15

    We study hysteresis in temperature-induced martensitic transformation using a 2D model solid exhibiting a square to rhombic structural transition. We find that, upon quenching the high-temperature square phase, martensites are nucleated at sites having large non-affineness and ultimately invade the whole of the high-temperature square phase. On heating the martensite, the high-temperature square phase is restored. The transformation proceeds through avalanches. The amplitude and the time duration of these avalanches exhibit power-law statistics during both heating and cooling of the system. The exponents corresponding to heating and cooling are different, thereby indicating that the nucleation and dissolution of the product phase follow different transformation mechanisms.

  8. Impact of Statistical Learning Methods on the Predictive Power of Multivariate Normal Tissue Complication Probability Models

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Schilstra, Cornelis; Langendijk, Johannes A.; Veld, Aart A. van' t [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands)

    2012-03-15

    Purpose: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. Methods and Materials: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. Results: It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. Conclusions: The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended.
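
    A minimal scikit-learn analogue of the LASSO step (synthetic predictors standing in for dosimetric and clinical variables, not the study data) might look like:

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(5)
        n, p = 300, 20
        X = rng.normal(size=(n, p))        # candidate predictors (synthetic)
        true_beta = np.zeros(p)
        true_beta[:3] = [1.2, -0.8, 0.6]   # only 3 truly informative variables
        prob = 1 / (1 + np.exp(-(X @ true_beta - 0.5)))
        y = rng.binomial(1, prob)          # binary complication outcome

        # L1-penalized logistic regression (LASSO); C controls shrinkage
        lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
        print("CV AUC:", cross_val_score(lasso, X, y, cv=5,
                                         scoring="roc_auc").mean())
        print("nonzero coefficients:", np.sum(lasso.fit(X, y).coef_ != 0))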

  9. Generation of statistical scenarios of short-term wind power production

    DEFF Research Database (Denmark)

    Pinson, Pierre; Papaefthymiou, George; Klockl, Bernd

    2007-01-01

    Short-term (up to 2-3 days ahead) probabilistic forecasts of wind power provide forecast users with paramount information on the uncertainty of expected wind generation. Whatever the type of these probabilistic forecasts, they are produced on a per horizon basis, and hence do not inform...... on the development of the forecast uncertainty through forecast series. This issue is addressed here by describing a method that permits the generation of statistical scenarios of wind generation that account for the interdependence structure of prediction errors, in addition to respecting the predictive distributions of wind...... generation. The approach is evaluated on the test case of a multi-MW wind farm over a period of more than two years. Its interest for a large range of applications is discussed....

  10. Analytical probability density function for the statistics of the ENSO phenomenon: Asymmetry and power law tail

    Science.gov (United States)

    Bianucci, M.

    2016-01-01

    This letter has two main goals. The first one is to give a physically reasonable explanation for the use of stochastic models for mimicking the apparent random features of the El Niño-Southern Oscillation (ENSO) phenomenon. The second one is to obtain, from the theory, an analytical expression for the equilibrium density function of the anomaly sea surface temperature, an expression that fits the data from observations well, reproducing the asymmetry and the power law tail of the histograms of the NIÑO3 index. We succeed in these tasks by exploiting some recent theoretical results of the author in the field of the dynamical origin of stochastic processes. More precisely, we apply this approach to the celebrated recharge oscillator model (ROM), weakly interacting by a multiplicative term with a general deterministic complex forcing (Madden-Julian Oscillations, westerly wind bursts, etc.), and we obtain a Fokker-Planck equation that describes the statistical behavior of the ROM.

  11. A model of the statistical power of comparative genome sequence analysis.

    Directory of Open Access Journals (Sweden)

    Sean R Eddy

    2005-01-01

    Full Text Available Comparative genome sequence analysis is powerful, but sequencing genomes is expensive. It is desirable to be able to predict how many genomes are needed for comparative genomics, and at what evolutionary distances. Here I describe a simple mathematical model for the common problem of identifying conserved sequences. The model leads to some useful rules of thumb. For a given evolutionary distance, the number of comparative genomes needed for a constant level of statistical stringency in identifying conserved regions scales inversely with the size of the conserved feature to be detected. At short evolutionary distances, the number of comparative genomes required also scales inversely with distance. These scaling behaviors provide some intuition for future comparative genome sequencing needs, such as the proposed use of "phylogenetic shadowing" methods using closely related comparative genomes, and the feasibility of high-resolution detection of small conserved features.

  12. Monitoring birds of prey in Finland: a summary of methods, trends, and statistical power.

    Science.gov (United States)

    Saurola, Pertti

    2008-09-01

    In Finland, Comprehensive Surveys to monitor numbers and productivity of four endangered species of birds of prey were started in the early 1970s. In 1982, the Ringing Center launched the Raptor Grid, a nationwide monitoring program for all other bird-of-prey species based on 10 x 10 km study plots of the Finnish National Grid. The annual total of study plots surveyed by voluntary raptor ringers has averaged 120. Since 1986, additional information on breeding performance has been collected using the Raptor Questionnaire. In 2006, more than 44 262 potential nest sites of birds of prey were inspected, and 12 963 occupied territories, including 8149 active nests, were found and reported by ringers. The population trend during 1982-2006 has been significantly negative in six species and positive or neutral in 18 species. Statistical power of the time series of numbers and productivity has been adequate for all species except the microtine specialists.

  13. How to evaluate an Early Warning System? Towards a United Statistical Framework for Assessing Financial Crises Forecasting Methods

    OpenAIRE

    Candelon, B.; Dumitrescu, E-I.; Hurlin, C.

    2010-01-01

    This paper proposes a new statistical framework, originating from the traditional credit-scoring literature, to evaluate currency crises Early Warning Systems (EWS). Based on an assessment of the predictive power of panel logit and Markov frameworks, the panel logit model outperforms the Markov switching specifications. Furthermore, the introduction of forward-looking variables clearly improves the forecasting properties of the EWS. This improvement confirms the adequacy of the second gene...

  14. Wind power in Germany: Performance statistics; Leistungsstatistik der Windkraftanlagen in Deutschland

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-07-01

    Statistics are presented for May, June, July and August 2002 for 3,976 plants with a total capacity of 3,264.56 MW. Only those plants were considered for which data were provided to Ingenieur-Werkstatt Energietechnik, i.e. about one third of Germany's wind power plants. This is one of the most comprehensive statistics world-wide, second only to 'Windstats'. Germany is the world's leading market for wind power. At the end of August 2002, a total of 12,425 systems were operated in Germany with a total capacity of 10,195.11 MW. [German] The following statistics present the output for the months of May, June, July and August 2002. Manufacturers and operators reported performance data for 3,976 plants with a total capacity of 3,264.56 MW. Only plants that report their monthly yields to Ingenieur-Werkstatt Energietechnik appear in the statistics below; plants without performance reports are not printed. The statistics are therefore incomplete: only about every third wind turbine in Germany is covered. Nevertheless, after 'Windstats' it is the most extensive collection of data on the performance of wind turbines that exists worldwide. In the first eight months of 2002, 1,113 new plants with 1,507.72 MW were installed (months 1-8/2001: 906 plants with 1,126 MW). A new record is in prospect: Germany is the strongest market in the world. As of 31 August 2002, a total of 12,425 plants with 10,195.11 MW of capacity are operating in the Federal Republic. (orig.)

  15. Empirical assessment of published effect sizes and power in the recent cognitive neuroscience and psychology literature

    OpenAIRE

    Szucs, Denes; Ioannidis, JPA

    2017-01-01

    We have empirically assessed the distribution of published effect sizes and estimated power by analyzing 26,841 statistical records from 3,801 cognitive neuroscience and psychology papers published recently. The reported median effect size was D = 0.93 (interquartile range: 0.64-1.46) for nominally statistically significant results and D = 0.24 (0.11-0.42) for nonsignificant results. Median power to detect small, medium, and large effects was 0.12, 0.44, and 0.73, reflecting no improvement th...

  16. Development and application of a statistical quality assessment method for dense-graded mixes.

    Science.gov (United States)

    2004-08-01

    This report describes the development of the statistical quality assessment method and the procedure for mapping the measures obtained from the quality assessment method to a composite pay factor. The application to dense-graded mixes is demonstrated...

  17. Statistical analysis of wind power in the region of Veracruz (Mexico)

    Energy Technology Data Exchange (ETDEWEB)

    Cancino-Solorzano, Yoreley [Departamento de Ing Electrica-Electronica, Instituto Tecnologico de Veracruz, Calzada Miguel A. de Quevedo 2779, 91860 Veracruz (Mexico); Xiberta-Bernat, Jorge [Departamento de Energia, Escuela Tecnica Superior de Ingenieros de Minas, Universidad de Oviedo, C/Independencia, 13, 2a Planta, 33004 Oviedo (Spain)

    2009-06-15

    The capacity of the Mexican electricity sector faces the challenge of satisfying the 80 GW demand forecast for 2016. This value implies a steady yearly average increase of some 4.9%. Capacity additions over the next eight years will mainly consist of combined cycle power plants, which could be a threat to the country's energy supply given that the country is not self-sufficient in natural gas. As an alternative, wind energy could be a more suitable option than combined cycle power plants. This option is backed by market trends indicating that wind technology costs will continue to decrease in the near future, as has happened in recent years. The wind energy potential in different areas of the country must be evaluated in order to make the best possible use of this option. This paper gives a statistical analysis of the wind characteristics in the region of Veracruz. The daily, monthly and annual wind speed values have been studied together with their prevailing direction. The data analyzed correspond to five meteorological stations and two anemometric stations located in the aforementioned area. (author)

  18. Assessing Research Data Deposits and Usage Statistics within IDEALS

    Directory of Open Access Journals (Sweden)

    Christie A. Wiley

    2017-12-01

    Full Text Available Objectives: This study follows up on previous work that began examining data deposited in an institutional repository. The work here extends the earlier study by answering the following lines of research questions: (1) What is the file composition of datasets ingested into the University of Illinois at Urbana-Champaign (UIUC) campus repository? Are datasets more likely to be single-file or multiple-file items? (2) What is the usage data associated with these datasets? Which items are most popular? Methods: The dataset records collected in this study were identified by filtering item types categorized as “data” or “dataset” using the advanced search function in IDEALS. Returned search results were collected in an Excel spreadsheet to include data such as the Handle identifier, date ingested, file formats, composition code, and the download count from the item's statistics report. The Handle identifier represents the dataset record's persistent identifier. Composition represents codes that categorize items as single or multiple file deposits. Date available represents the date the dataset record was published in the campus repository. Download statistics were collected via a website link for each dataset record and indicate the number of times the dataset record has been downloaded. Once the data were collected, they were used to evaluate datasets deposited into IDEALS. Results: A total of 522 datasets were identified for analysis, covering the period between January 2007 and August 2016. This study revealed two influxes, occurring during the period of 2008-2009 and in 2014. During the first timeframe a large number of PDFs were deposited by the Illinois Department of Agriculture, whereas Microsoft Excel files were deposited in 2014 by the Rare Books and Manuscript Library. Single-file datasets clearly dominate the deposits in the campus repository. The total download count for all datasets was 139,663 and the average downloads per month per

  19. Evaluation of a Regional Monitoring Program's Statistical Power to Detect Temporal Trends in Forest Health Indicators

    Science.gov (United States)

    Perles, Stephanie J.; Wagner, Tyler; Irwin, Brian J.; Manning, Douglas R.; Callahan, Kristina K.; Marshall, Matthew R.

    2014-09-01

    Forests are socioeconomically and ecologically important ecosystems that are exposed to a variety of natural and anthropogenic stressors. As such, monitoring forest condition and detecting temporal changes therein remain critical to sound public and private forestland management. The National Park Service's Vital Signs monitoring program collects information on many forest health indicators, including species richness, cover by exotics, browse pressure, and forest regeneration. We applied a mixed-model approach to partition variability in data for 30 forest health indicators collected from several national parks in the eastern United States. We then used the estimated variance components in a simulation model to evaluate trend detection capabilities for each indicator. We investigated the extent to which the following factors affected ability to detect trends: (a) sample design: using simple panel versus connected panel design, (b) effect size: increasing trend magnitude, (c) sample size: varying the number of plots sampled each year, and (d) stratified sampling: post-stratifying plots into vegetation domains. Statistical power varied among indicators; however, indicators that measured the proportion of a total yielded higher power when compared to indicators that measured absolute or average values. In addition, the total variability for an indicator appeared to influence power to detect temporal trends more than how total variance was partitioned among spatial and temporal sources. Based on these analyses and the monitoring objectives of the Vital Signs program, the current sampling design is likely overly intensive for detecting a 5 % trend·year-1 for all indicators and is appropriate for detecting a 1 % trend·year-1 in most indicators.

  20. Evaluation of a regional monitoring program's statistical power to detect temporal trends in forest health indicators

    Science.gov (United States)

    Perles, Stephanie J.; Wagner, Tyler; Irwin, Brian J.; Manning, Douglas R.; Callahan, Kristina K.; Marshall, Matthew R.

    2014-01-01

    Forests are socioeconomically and ecologically important ecosystems that are exposed to a variety of natural and anthropogenic stressors. As such, monitoring forest condition and detecting temporal changes therein remain critical to sound public and private forestland management. The National Park Service’s Vital Signs monitoring program collects information on many forest health indicators, including species richness, cover by exotics, browse pressure, and forest regeneration. We applied a mixed-model approach to partition variability in data for 30 forest health indicators collected from several national parks in the eastern United States. We then used the estimated variance components in a simulation model to evaluate trend detection capabilities for each indicator. We investigated the extent to which the following factors affected ability to detect trends: (a) sample design: using simple panel versus connected panel design, (b) effect size: increasing trend magnitude, (c) sample size: varying the number of plots sampled each year, and (d) stratified sampling: post-stratifying plots into vegetation domains. Statistical power varied among indicators; however, indicators that measured the proportion of a total yielded higher power when compared to indicators that measured absolute or average values. In addition, the total variability for an indicator appeared to influence power to detect temporal trends more than how total variance was partitioned among spatial and temporal sources. Based on these analyses and the monitoring objectives of the Vital Signs program, the current sampling design is likely overly intensive for detecting a 5 % trend·year−1 for all indicators and is appropriate for detecting a 1 % trend·year−1 in most indicators.

  1. Development of a statistical oil spill model for risk assessment.

    Science.gov (United States)

    Guo, Weijun

    2017-11-01

    To gain a better understanding of the impacts of potential risk sources, we developed an oil spill model using a probabilistic method that simulates numerous oil spill trajectories under varying environmental conditions. Statistical results were quantified from hypothetical oil spills under multiple scenarios, including the probability of an area being affected, mean oil slick thickness, and the duration of water-surface exposure to floating oil. These three sub-indices, together with marine area vulnerability, are merged to compute a composite index characterizing the spatial distribution of risk degree. The integral of the index can be used to identify the overall risk posed by an emission source. The developed model has been successfully applied to the comparison and selection of an appropriate oil port construction location adjacent to a marine protected area for Phoca largha in China. The results highlight the importance of screening candidate sites before project construction, since risk estimates for two adjacent potential sources may turn out to be significantly different given the local hydrodynamic conditions and eco-environmental sensitivity. Copyright © 2017. Published by Elsevier Ltd.

  2. Assessing heart rate variability through wavelet-based statistical measures.

    Science.gov (United States)

    Wachowiak, Mark P; Hay, Dean C; Johnson, Michel J

    2016-10-01

    Because of its utility in the investigation and diagnosis of clinical abnormalities, heart rate variability (HRV) has been quantified with both time- and frequency-domain analysis tools. Recently, time-frequency methods, especially wavelet transforms, have been applied to HRV. In the current study, a complementary computational approach is proposed wherein continuous wavelet transforms are applied directly to ECG signals to quantify time-varying frequency changes in the lower bands. Such variations are compared for resting and lower body negative pressure (LBNP) conditions using statistical and information-theoretic measures, and benchmarked against standard HRV metrics. The latter confirm the expected lower variability in the LBNP condition due to sympathetic nerve activity (e.g. RMSSD: p=0.023; SDSD: p=0.023; LF/HF: p=0.018). Conversely, using the standard Morlet wavelet and a new transform based on windowed complex sinusoids, wavelet analysis of the ECG within the observed range of heart rate (0.5–1.25 Hz) exhibits significantly higher variability, as measured by frequency band roughness (Morlet CWT: p=0.041), entropy (Morlet CWT: p=0.001), and approximate entropy (Morlet CWT: p=0.004). Consequently, this paper proposes that, when used with well-established HRV approaches, time-frequency analysis of ECG can provide additional insights into the complex phenomenon of heart rate variability. Copyright © 2016. Published by Elsevier Ltd.
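
    For readers unfamiliar with the core operation, the sketch below implements a bare-bones complex Morlet continuous wavelet transform in Python; the sampling rate, test signal, and frequency grid are illustrative only, and normalization constants are omitted for brevity.

        import numpy as np

        def morlet_cwt(x, fs, freqs, w0=6.0):
            # Complex Morlet CWT of signal x (sampled at fs Hz), evaluated at
            # each analysis frequency; returns an array (len(freqs), len(x)).
            n = len(x)
            t = (np.arange(n) - n // 2) / fs
            out = np.empty((len(freqs), n), dtype=complex)
            for i, f in enumerate(freqs):
                s = w0 / (2.0 * np.pi * f)      # wavelet scale for frequency f
                psi = (np.exp(1j * w0 * t / s)
                       * np.exp(-(t / s) ** 2 / 2) / np.sqrt(s))
                out[i] = np.convolve(x, np.conj(psi[::-1]), mode="same")
            return out

        fs = 100.0                               # Hz, hypothetical sampling rate
        sig = np.sin(2 * np.pi * 0.9 * np.arange(0, 10, 1 / fs))
        power = np.abs(morlet_cwt(sig, fs, np.linspace(0.5, 1.25, 16))) ** 2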

  3. Climate change assessment for Mediterranean agricultural areas by statistical downscaling

    Directory of Open Access Journals (Sweden)

    L. Palatella

    2010-07-01

    In this paper we produce projections of seasonal precipitation for four Mediterranean areas: the Apulia region (Italy), the Ebro river basin (Spain), the Po valley (Italy), and Antalya province (Turkey). We performed the statistical downscaling using Canonical Correlation Analysis (CCA) in two versions: in one case a Principal Component Analysis (PCA) filter is applied only to the predictor, and in the other to both predictor and predictand. After performing a validation test, CCA with the PCA filter on both predictor and predictand was chosen. Sea level pressure (SLP) is used as predictor. Downscaling has been carried out for the scenarios A2 and B2 on the basis of three GCMs: CCCma-GCM2, Csiro-MK2, and HadCM3. Three consecutive 30-year periods have been considered. For summer precipitation in the Apulia region we also use the 500 hPa temperature (T500) as predictor, obtaining comparable results. Results show different climate change signals in the four areas and confirm the need for an analysis capable of resolving internal differences within the Mediterranean region. The most robust signal is the reduction of summer precipitation in the Ebro river basin. Other significant results are the increase of precipitation over Apulia in summer, the reduction over the Po valley in spring and autumn, and the increase over Antalya province in summer and autumn.

  4. An Evaluation of Computer-Based Interactive Simulations in the Assessment of Statistical Concepts

    Science.gov (United States)

    Neumann, David L.; Hood, Michelle; Neumann, Michelle M.

    2012-01-01

    In a previous report, Neumann (2010) described the use of interactive computer-based simulations in the assessment of statistical concepts. This assessment approach combined declarative knowledge of statistics with experiences in interacting with computer-based simulations. The aim of the present study was to conduct a systematic evaluation of the…

  5. Racialized customer service in restaurants: a quantitative assessment of the statistical discrimination explanatory framework.

    Science.gov (United States)

    Brewster, Zachary W

    2012-01-01

    Despite popular claims that racism and discrimination are no longer salient issues in contemporary society, racial minorities continue to experience disparate treatment in everyday public interactions. The context of full-service restaurants is one such public setting wherein racial minority patrons, African Americans in particular, encounter racial prejudice and discriminatory treatment. To further understand the causes of such treatment within the restaurant context, this article analyzes primary survey data derived from a community sample of servers (N = 200) to assess the explanatory power of one posited explanation: statistical discrimination. Taken as a whole, findings suggest that while a statistical discrimination framework for understanding variability in servers' discriminatory behaviors should not be disregarded, the framework's explanatory utility is limited. Servers' inferences about the potential profitability of waiting on customers across racial groups explain little of the overall variation in subjects' self-reported discriminatory behaviors, suggesting that other factors not explored in this research are clearly operating and should be the focus of future inquiries.

  6. Type I error and statistical power of the Mantel-Haenszel procedure for detecting DIF: a meta-analysis.

    Science.gov (United States)

    Guilera, Georgina; Gómez-Benito, Juana; Hidalgo, Maria Dolores; Sánchez-Meca, Julio

    2013-12-01

    This article presents a meta-analysis of studies investigating the effectiveness of the Mantel-Haenszel (MH) procedure when used to detect differential item functioning (DIF). Studies were located electronically in the main databases, representing the codification of 3,774 different simulation conditions, 1,865 related to Type I error and 1,909 to statistical power. The homogeneity of effect-size distributions was assessed by the Q statistic. The extremely high heterogeneity in both error rates (I² = 94.70) and power (I² = 99.29), due to the fact that numerous studies test the procedure in extreme conditions, means that the main interest of the results lies in explaining the variability in detection rates. One-way analysis of variance was used to determine the effects of each variable on detection rates, showing that the MH test was more effective when purification procedures were used, when the data fitted the Rasch model, when test contamination was below 20%, and with sample sizes above 500. The results imply a series of recommendations for practitioners who wish to study DIF with the MH test. A limitation, one inherent to all meta-analyses, is that not all the possible moderator variables, or the levels of variables, have been explored. This serves to remind us of certain gaps in the scientific literature (i.e., regarding the direction of DIF or variances in ability distribution) and is an aspect that methodologists should consider in future simulation studies. PsycINFO Database Record (c) 2014 APA, all rights reserved.
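
    The MH statistic at the heart of this literature is straightforward to compute. The following Python sketch evaluates the continuity-corrected MH chi-square over K matched 2x2 tables (one per ability stratum); the table layout and example values are hypothetical.

        import numpy as np
        from scipy.stats import chi2

        def mantel_haenszel(tables, correction=True):
            # tables: iterable of 2x2 arrays [[a, b], [c, d]] per stratum,
            # rows = reference/focal group, cols = item correct/incorrect.
            dev, var = 0.0, 0.0
            for t in tables:
                (a, b), (c, d) = np.asarray(t, dtype=float)
                n = a + b + c + d
                dev += a - (a + b) * (a + c) / n           # observed - expected
                var += (a + b) * (c + d) * (a + c) * (b + d) / (n**2 * (n - 1))
            stat = (abs(dev) - (0.5 if correction else 0.0)) ** 2 / var
            return stat, chi2.sf(stat, df=1)               # statistic, p-value

        strata = [[[40, 10], [35, 15]], [[25, 25], [15, 35]]]   # toy data
        print(mantel_haenszel(strata))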

  7. A statistical simulation model for field testing of non-target organisms in environmental risk assessment of genetically modified plants.

    Science.gov (United States)

    Goedhart, Paul W; van der Voet, Hilko; Baldacchino, Ferdinando; Arpaia, Salvatore

    2014-04-01

    Genetic modification of plants may result in unintended effects causing potentially adverse effects on the environment. A comparative safety assessment is therefore required by authorities, such as the European Food Safety Authority, in which the genetically modified plant is compared with its conventional counterpart. Part of the environmental risk assessment is a comparative field experiment in which the effect on non-target organisms is compared. Statistical analysis of such trials comes in two flavors: difference testing and equivalence testing. It is important to know the statistical properties of these, for example, the power to detect environmental change of a given magnitude, before the start of an experiment. Such prospective power analysis can best be studied by means of a statistical simulation model. This paper describes a general framework for simulating data typically encountered in environmental risk assessment of genetically modified plants. The simulation model, available as Supplementary Material, can be used to generate count data having different statistical distributions, possibly with excess zeros. In addition, the model employs completely randomized or randomized block experiments, can be used to simulate single or multiple trials across environments, enables genotype-by-environment interaction by adding random variety effects, and finally includes repeated measures in time following a constant, linear, or quadratic pattern, possibly with some form of autocorrelation. The model also allows a set of reference varieties to be added to the GM plant and its comparator to assess the natural variation, which can then be used to set limits of concern for equivalence testing. The different count distributions are described in some detail, and some examples of how to use the simulation model to study various aspects, including a prospective power analysis, are provided.
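
    A minimal version of the kind of count simulation described here can be written in a few lines. The sketch below draws zero-inflated Poisson abundances for a comparator and a GM treatment and estimates prospective power with a rank-based test; all means, zero-inflation rates, and plot counts are invented for illustration, and the test differs from the model-based analyses in the paper.

        import numpy as np
        from scipy.stats import mannwhitneyu

        rng = np.random.default_rng(3)

        def zip_counts(n, mean, p_zero):
            # Zero-inflated Poisson draws: structural zeros with probability
            # p_zero, otherwise Poisson(mean) abundance counts.
            y = rng.poisson(mean, n)
            y[rng.random(n) < p_zero] = 0
            return y

        def prospective_power(effect=0.5, n_plots=24, n_sim=1000, alpha=0.05):
            # Chance of flagging a difference when the GM treatment multiplies
            # mean non-target abundance by `effect`.
            hits = 0
            for _ in range(n_sim):
                comparator = zip_counts(n_plots, 6.0, 0.2)
                gm_variety = zip_counts(n_plots, 6.0 * effect, 0.2)
                hits += mannwhitneyu(comparator, gm_variety).pvalue < alpha
            return hits / n_sim

        print(prospective_power())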

  9. Composite power system adequacy assessment based on postoptimal analysis

    OpenAIRE

    SAFDARIAN, Amir; FIRUZABAD, Mahmud FOTUHI; AMINIFAR, Farrokh

    2014-01-01

    The modeling and evaluation of enormous numbers of contingencies are the most challenging impediments associated with composite power system adequacy assessment, particularly for large-scale power systems. Optimal power flow (OPF) solution, as a widely common approach, is normally employed to model and analyze each individual contingency as an independent problem. However, mathematical representations associated with diverse states are slightly different in one or a few generating un...

  10. Statistics of 150-km echoes over Jicamarca based on low-power VHF observations

    Directory of Open Access Journals (Sweden)

    J. L. Chau

    2006-07-01

    In this work we summarize the statistics of the so-called 150-km echoes obtained with a low-power VHF radar operating at the Jicamarca Radio Observatory (11.97° S, 76.87° W, 1.3° dip angle at 150-km altitude) in Peru. Our results are based on almost four years of observations between August 2001 and July 2005 (approximately 150 days per year). The majority of the observations were conducted between 08:00 and 17:00 LT. We present the statistics of occurrence of the echoes for each of the four seasons as a function of time of day and altitude. The occurrence frequency of the echoes is ~75% around noon; it starts decreasing after 15:00 LT, and the echoes disappear after 17:00 LT in all seasons. As shown in previous campaign observations, the 150-km echoes appear at a higher altitude (>150 km) in narrow layers in the morning, reach lower altitudes (~135 km) around noon, and disappear at higher altitudes (>150 km) after 17:00 LT. We show that although 150-km echoes are observed all year long, they exhibit a clear seasonal variability in altitudinal coverage and in the percentage of occurrence around noon and early in the morning. We also show that there is a strong day-to-day variability and no correlation with magnetic activity. Although our results do not solve the 150-km riddle, they should be taken into account when a reasonable theory is proposed.

  11. Power quality assessment and mitigation at Walya-Steel Industries and Ethio-Plastic Share Company

    African Journals Online (AJOL)

    POWER QUALITY ASSESSMENT AND MITIGATION AT WALYA-STEEL INDUSTRIES AND ETHIO-PLASTIC SHARE COMPANY. Mengesha Mamo and Adefris Merid, School of Electrical and Computer Engineering, Addis Ababa Institute of Technology, Addis Ababa University. ABSTRACT: In this paper, electric power ...

  12. Environmental and exergetic sustainability assessment of power generation from biomass

    NARCIS (Netherlands)

    Stougie, L.; Tsalidis, G.A.; van der Kooi, H.J.; Korevaar, G.

    2017-01-01

    Power generation from biomass is mentioned as a means to make our society more sustainable as it decreases greenhouse gas emissions of fossil origin and reduces the dependency on finite energy carriers, such as coal, oil and natural gas. When assessing the sustainability of power generation from

  13. Statistical power of studies examining the cognitive effects of subthalamic nucleus deep brain stimulation in Parkinson's disease.

    Science.gov (United States)

    Woods, Steven Paul; Rippeth, Julie D; Conover, Emily; Carey, Catherine L; Parsons, Thomas D; Tröster, Alexander I

    2006-02-01

    It has been argued that neuropsychological studies generally possess adequate statistical power to detect large effect sizes. However, low statistical power is problematic in neuropsychological research involving clinical populations and novel interventions for which available sample sizes are often limited. One notable example of this problem is evident in the literature regarding the cognitive sequelae of deep brain stimulation (DBS) of the subthalamic nucleus (STN) in persons with Parkinson's disease (PD). In the current review, a post hoc estimate of the statistical power of 30 studies examining cognitive effects of STN DBS in PD revealed adequate power to detect substantial cognitive declines (i.e., very large effect sizes), but surprisingly low estimated power to detect cognitive changes associated with conventionally small, medium, and large effect sizes. Such widespread Type II error risk in the STN DBS cognitive outcomes literature may affect the clinical decision-making process concerning the possible risk of postsurgical cognitive morbidity, as well as the conceptual inferences to be drawn regarding the role of the STN in higher-level cognitive functions. Statistical and methodological recommendations (e.g., meta-analysis) are offered to enhance the power of current and future studies examining the neuropsychological sequelae of STN DBS in PD.
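
    Post hoc power estimates of this kind are easy to reproduce. The snippet below uses statsmodels to compute the power of a one-sample/paired t-test at Cohen's conventional effect sizes; the sample size of 25 is a hypothetical stand-in for the small cohorts typical of this literature.

        from statsmodels.stats.power import TTestPower

        analysis = TTestPower()
        for label, d in [("small", 0.2), ("medium", 0.5), ("large", 0.8)]:
            p = analysis.power(effect_size=d, nobs=25, alpha=0.05)
            print(f"{label:6s} d = {d}: power = {p:.2f}")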

  14. Performance statistics of wind power systems in Germany; Leistungsstatistik der Windkraftanlagen in Deutschland

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-07-01

    Data are presented for the first four months of this year. 3,579 plants with a total capacity of 2,780.24 MW are considered, i.e. only those whose data were passed on to Ingenieur-Werkstatt Energietechnik. This covers about one third of Germany's wind power systems. The statistics are among the world's largest, second only to 'Windstats'. 399 new plants with a total of 431.24 MW were constructed in Germany in the first months of 2002. This makes 11,715 plants with a total capacity of 9,222.53 MW as of 30 April 2002. [Translated from German] The following statistics present the output for the months of January, February, March, and April. Manufacturers and operators reported output data for 3,579 plants with a total capacity of 2,780.24 MW. Only plants whose monthly yields are reported to Ingenieur-Werkstatt Energietechnik appear in the statistics; plants without output reports are not printed. The statistics are therefore incomplete, covering only about every third wind turbine in Germany; nevertheless, after 'Windstats' they are the most extensive collection of data on wind turbine output worldwide. In the first four months of 2002, 399 plants with 531.24 MW were newly erected. As of 30 April 2002, a total of 11,715 plants with 9,222.53 MW of capacity are operating in the Federal Republic. (orig.)

  15. Launch Vehicle Assessment for Space Solar Power

    Science.gov (United States)

    Olds, John R.

    1998-01-01

    A recently completed study at Georgia Tech examined various launch vehicle options for deploying a future constellation of Space Solar Power satellites of the Suntower configuration. One of the motivations of the study was to determine whether the aggressive $400/kg launch price goal established for SSP package delivery would result in an attractive economic scenario for a future RLV developer. That is, would the potential revenue and traffic to be derived from a large-scale SSP project be enough of an economic "carrot" to attract an RLV company into developing a new, low-cost launch vehicle to address this market? Preliminary results presented in the attached charts show that there is enough economic reward for RLV developers, specifically in the case of the latest large GEO-based Suntower constellations (over 15,500 MT per year delivered for 30 years). For that SSP model, internal rates of return for the 30-year economic scenario exceed 22%. However, up-front government assistance to the RLV developer in terms of ground facilities, operations technologies, guaranteed low-interest-rate loans, and partial offsets of some vehicle development expenses is necessary to achieve these positive results. This white paper is meant to serve as a companion to the data supplied in the accompanying charts. Its purpose is to provide more detail on the vehicles and design processes used, to highlight key decisions and issues, and to emphasize key results from each phase of the Georgia Tech study.

  16. Assessment of a satellite power system and six alternative technologies

    Science.gov (United States)

    Wolsko, T.; Whitfield, R.; Samsa, M.; Habegger, L. S.; Levine, E.; Tanzman, E.

    1981-01-01

    The satellite power system is assessed in comparison to six alternative technologies. The alternatives are: central-station terrestrial photovoltaic systems, conventional coal-fired power plants, coal-gasification/combined-cycle power plants, light water reactor power plants, liquid-metal fast-breeder reactors, and fusion. The comparison is made regarding issues of cost and performance, health and safety, environmental effects, resources, socio-economic factors, and institutional issues. The criteria for selecting the issues and the alternative technologies are given, and the methodology of the comparison is discussed. Brief descriptions of each of the technologies considered are included.

  18. Employment of kernel methods on wind turbine power performance assessment

    DEFF Research Database (Denmark)

    Skrimpas, Georgios Alexandros; Sweeney, Christian Walsted; Marhadi, Kun S.

    2015-01-01

    A power performance assessment technique is developed for the detection of power production discrepancies in wind turbines. The method employs a widely used nonparametric pattern recognition technique, the kernel methods. The evaluation is based on the trending of an extracted feature from...... and proper detection of power production changes is demonstrated in cases of icing, power derating, operation under noise reduction mode, and incorrect controller input signal. Finally, overviews are illustrated for parks subjected to icing and operating under limited rotational speed. The comparison between...... multiple adjacent turbines contributes further to the correct evaluation of the park overall performance....

  19. Hybrid algorithm for rotor angle security assessment in power systems

    Directory of Open Access Journals (Sweden)

    D. Prasad Wadduwage

    2015-08-01

    Transient rotor angle stability assessment and oscillatory rotor angle stability assessment subsequent to a contingency are integral components of dynamic security assessment (DSA) in power systems. This study proposes a hybrid algorithm to determine whether the post-fault power system is secure with respect to both transient rotor angle stability and oscillatory rotor angle stability subsequent to a set of known contingencies. The hybrid algorithm first uses a new security measure developed based on the concept of Lyapunov exponents (LEs) to determine the transient security of the post-fault power system. The transiently secure power swing curves are then analysed using an improved Prony algorithm, which extracts the dominant oscillatory modes and estimates their damping ratios. The damping ratio serves as a measure of the oscillatory security of the post-fault power system subsequent to the contingency. The suitability of the proposed hybrid algorithm for DSA in power systems is illustrated using different contingencies of a 16-generator 68-bus test system and a 50-generator 470-bus test system. The accuracy of the stability conclusions and the acceptable computational burden indicate that the proposed hybrid algorithm is suitable for real-time security assessment with respect to both transient rotor angle stability and oscillatory rotor angle stability under multiple contingencies of the power system.

  20. Fast voltage stability assessment for large-scale power systems

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Y. [Shandong Univ., Jinan (China). School of Electrical Engineering; Wang, L.; Yu, Z. [Shandong Electric Power Co., Jinan (China). Electric Power Control Center

    2007-07-01

    A new method of assessing online voltage stability in large-scale power systems was presented. A local voltage stability index was used to determine weak buses in the system. A case study of the Shandong power system in China was used to demonstrate the accuracy and speed of the method for online applications. The local method was based on the fact that the Thevenin equivalent impedance seen from the load bus and the apparent load impedance are equal in magnitude at the point of voltage collapse. Participant buses and key power sources of both the reactive and active power transmission paths were determined using electrical distance measurements. The case study demonstrated that the reactive power reserve of key generators has a significant impact on voltage stability. The study also demonstrated that the voltage stability of the weakest power transmission path can decline or shift when some generators reach their limits. It was concluded that combining voltage stability indices and reactive power reserves increases the accuracy of voltage stability assessments. 11 refs., 3 tabs., 6 figs.

  1. Assessment of alternative power sources for mobile mining machinery

    Science.gov (United States)

    Cairelli, J. E.; Tomazic, W. A.; Evans, D. G.; Klann, J. L.

    1981-01-01

    Alternative mobile power sources for mining applications were assessed. A wide variety of heat engines and energy systems was examined as potential alternatives to presently used power systems. The present mobile power systems are electrical trailing cable, electrical battery, and diesel, with diesel use in the United States largely limited to noncoal mines. Each candidate power source was evaluated against the following requirements: (1) ability to achieve the duty cycle; (2) ability to meet Government regulations; (3) availability (production readiness); (4) market availability; and (5) packaging capability. Screening reduced the list of candidates to the following power sources: diesel, Stirling, gas turbine, Rankine (steam), advanced electric (batteries), mechanical energy storage (flywheel), and the use of hydrogen evolved from metal hydrides. This list of candidates is divided into two classes of alternative power sources for mining applications: heat engines and energy storage systems.

  2. Use of Statistical Information for Damage Assessment of Civil Engineering Structures

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Andersen, P.

    This paper considers the problem of damage assessment of civil engineering structures using statistical information. The aim of the paper is to review how researchers recently have tried to solve the problem. It is pointed out that the problem consists of not only how to use the statistical...

  3. Using Interactive Simulations in Assessment: The Use of Computer-Based Interactive Simulations in the Assessment of Statistical Concepts

    Science.gov (United States)

    Neumann, David L.

    2010-01-01

    Interactive computer-based simulations have been applied in several contexts to teach statistical concepts in university level courses. In this report, the use of interactive simulations as part of summative assessment in a statistics course is described. Students accessed the simulations via the web and completed questions relating to the…

  4. Agreement between Pentacam and videokeratography in corneal power assessment.

    Science.gov (United States)

    Savini, Giacomo; Barboni, Piero; Carbonelli, Michele; Hoffer, Kenneth J

    2009-06-01

    To investigate agreement between a rotating Scheimpflug camera (Pentacam, Oculus Optikgeräte GmbH) and two corneal topographers (TMS-2 Topography System, Tomey; and Keratron Scout, Optikon 2000 SpA) in measuring the corneal power of normal eyes. The mean corneal powers calculated by simulated keratometry (SimK) with each topographer were compared to those provided by the Pentacam in 71 patients. Specifically, the corneal power values of the Pentacam included in this analysis were the SimK (calculated using the measured anterior corneal radius and standard keratometric index of 1.3375) and the True net power (calculated using the anterior and posterior corneal curvatures and Gaussian optics formula for thick lenses, where the actual refractive index of the air, cornea, and aqueous humor are entered). Bland-Altman plots were used to investigate agreement and analysis of variance (ANOVA) was performed to detect statistical differences. Although ANOVA did not disclose a statistically significant difference among the mean SimK values (TMS-2: 43.20 +/- 1.51 diopters [D], Keratron Scout: 43.29 +/-1.48 D, Pentacam: 43.25 +/- 1.53 D), the 95% limits of agreement between the TMS-2 and Pentacam and between the Keratron Scout and Pentacam were wide (-1.05 to +0.94 D and -0.95 to +1.02 D, respectively). Agreement was even poorer when considering the mean True net power (42.00 +/- 1.54 D), which was significantly lower than the mean Pentacam SimK (P < .001). Although corneal topography and the Pentacam provide similar SimK values, their data should not be used interchangeably as only moderate agreement exists between them. Corneal power values calculated by the True net power are significantly lower than any SimK and cannot be entered into intraocular lens power formulas.
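
    Bland-Altman agreement analysis of the kind reported here reduces to a short computation: the bias is the mean of the paired differences and the 95% limits of agreement are the bias ± 1.96 standard deviations. A sketch in Python, with purely hypothetical corneal-power readings in dioptres:

        import numpy as np

        def limits_of_agreement(x, y):
            # Bland-Altman bias and 95% limits of agreement between two methods.
            d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
            bias = d.mean()
            half_width = 1.96 * d.std(ddof=1)
            return bias, (bias - half_width, bias + half_width)

        topographer = np.array([43.1, 44.0, 42.6, 43.8, 43.3])  # invented SimK values
        pentacam    = np.array([43.4, 43.7, 42.9, 44.2, 43.0])
        print(limits_of_agreement(topographer, pentacam))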

  5. Evaluation and assessment of nuclear power plant seismic methodology

    Energy Technology Data Exchange (ETDEWEB)

    Bernreuter, D.; Tokarz, F.; Wight, L.; Smith, P.; Wells, J.; Barlow, R.

    1977-03-01

    The major emphasis of this study is to develop a methodology that can be used to assess the current methods used for assuring the seismic safety of nuclear power plants. The proposed methodology makes use of system-analysis techniques and Monte Carlo schemes. Also, in this study, we evaluate previous assessments of the current seismic-design methodology.

  6. Statistical Power Analysis with Microsoft Excel: Normal Tests for One or Two Means as a Prelude to Using Non-Central Distributions to Calculate Power

    Science.gov (United States)

    Texeira, Antonio; Rosa, Alvaro; Calapez, Teresa

    2009-01-01

    This article presents statistical power analysis (SPA) based on the normal distribution using Excel, adopting textbook and SPA approaches. The objective is to present the latter in a comparative way within a framework that is familiar to textbook level readers, as a first step to understand SPA with other distributions. The analysis focuses on the…
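
    The textbook normal-theory calculation that the article implements in Excel reduces to a single formula: for a one-sided z-test of the mean, power = Phi(delta*sqrt(n)/sigma - z_crit). A Python equivalent, with invented example numbers:

        from scipy.stats import norm

        def power_one_sample_z(mu0, mu1, sigma, n, alpha=0.05):
            # Power of a one-sided z-test of H0: mu = mu0 vs H1: mu = mu1 > mu0.
            z_crit = norm.ppf(1 - alpha)
            shift = (mu1 - mu0) * n**0.5 / sigma   # noncentrality of the test
            return norm.sf(z_crit - shift)

        print(power_one_sample_z(mu0=100, mu1=104, sigma=10, n=25))  # ~0.64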

  7. Transient Stability Assessment of Power System with Large Amount of Wind Power Penetration

    DEFF Research Database (Denmark)

    Liu, Leo; Chen, Zhe; Bak, Claus Leth

    2012-01-01

    Recently, the security and stability of power systems with a large amount of wind power have become issues of concern, especially transient stability. In Denmark, onshore and offshore wind farms are connected to the distribution system and the transmission system, respectively. The control and protection methodologies of onshore and offshore wind farms definitely affect the transient stability of the power system. In this paper, the onshore and offshore wind farms are modeled in detail in order to assess the transient stability of the western Danish power system. Further, in the computation of critical clearing time (CCT), … plants, load consumption level, and high voltage direct current (HVDC) transmission links are taken into account. The results presented in this paper are able to provide an early awareness of the power system security condition of the western Danish power system.

  8. Evaluating the statistical power of detecting changes in the abundance of seabirds at sea

    Energy Technology Data Exchange (ETDEWEB)

    Burton, Niall; Maclean, Ilya; Rehfisch, Mark; Skov, Henrik; Thaxter, Chris

    2011-07-01

    Offshore wind farms may potentially affect bird populations through the displacement of birds due to the disturbance associated with developments, the barrier they present for migrating birds and birds commuting between breeding and feeding areas, habitat change/loss, and collision mortality. In current impact assessments it is often assumed that all birds that use the area of a proposed offshore wind farm would be displaced following construction, with some birds also displaced from a surrounding buffer zone. However, the extent to which current monitoring schemes are capable of detecting changes in abundance, and options for improving survey protocols, have received little attention. We investigated the likelihood of detecting changes in seabird numbers in UK offshore waters. Using aerial survey data, we simulated 50%, 25% and 10% declines and conducted power analyses to determine the probability that such changes could be detected. Additionally, increases in the duration and frequency of surveying were simulated, and the influence of spatial scale and variability in bird numbers was also investigated. Current monitoring schemes do not provide adequate means of detecting changes in numbers even when declines are in excess of 50% and assumptions regarding certainty are relaxed to less than 80%. Extending the duration and frequency of surveys would increase the probability of detecting changes, but not to a desirable level. The primary reason for the low probability of detecting consistent changes is that seabirds are inherently prone to fluctuations in numbers. Explaining some of the variability in bird numbers using environmental and hydrodynamic covariates would increase the power to detect changes. (Author)
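
    The core of such a power analysis can be illustrated with a simulation in which overdispersed survey counts mask a known decline. In the Python sketch below, all counts, dispersion values, and survey numbers are hypothetical; the point it illustrates is that highly variable counts yield low detection probability even for large declines.

        import numpy as np
        from scipy.stats import ttest_ind

        rng = np.random.default_rng(7)

        def detection_probability(n_surveys=8, mean=50, k=1.0, decline=0.5,
                                  alpha=0.05, n_sim=2000):
            # Negative-binomial survey counts (dispersion k) before and after
            # a `decline` fraction drop in mean abundance; returns the share
            # of simulations in which the drop is detected (p < alpha).
            p0 = k / (k + mean)
            p1 = k / (k + mean * (1 - decline))
            hits = 0
            for _ in range(n_sim):
                before = rng.negative_binomial(k, p0, n_surveys)
                after = rng.negative_binomial(k, p1, n_surveys)
                hits += ttest_ind(before, after).pvalue < alpha
            return hits / n_sim

        print(detection_probability(decline=0.5))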

  9. Risk assessment of power systems models, methods, and applications

    CERN Document Server

    Li, Wenyuan

    2014-01-01

    Risk Assessment of Power Systems addresses the regulations and functions of risk assessment with regard to its relevance in system planning, maintenance, and asset management. Brimming with practical examples, this edition introduces the latest risk information on renewable resources, the smart grid, voltage stability assessment, and fuzzy risk evaluation. It is a comprehensive reference of a highly pertinent topic for engineers, managers, and upper-level students who seek examples of risk theory applications in the workplace.

  10. National-Scale Wind Resource Assessment for Power Generation (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Baring-Gould, E. I.

    2013-08-01

    This presentation describes the current standards for conducting a national-scale wind resource assessment for power generation, along with the risk/benefit considerations to be considered when beginning a wind resource assessment. The presentation describes changes in turbine technology and viable wind deployment due to more modern turbine technology and taller towers and shows how the Philippines national wind resource assessment evolved over time to reflect changes that arise from updated technologies and taller towers.

  11. Probabilistic safety assessment for optimum nuclear power plant life management (PLiM) theory and application of reliability analysis methods for major power plant components

    CERN Document Server

    Arkadov, G V; Rodionov, A N

    2012-01-01

    Probabilistic safety assessment methods are used to calculate nuclear power plant durability and resource lifetime. Successful calculation of the reliability and ageing of components is critical for forecasting safety and directing preventative maintenance, and Probabilistic safety assessment for optimum nuclear power plant life management provides a comprehensive review of the theory and application of these methods. Part one reviews probabilistic methods for predicting the reliability of equipment. Following an introduction to key terminology, concepts and definitions, formal-statistical and various physico-statistical approaches are discussed. Approaches based on the use of defect-free models are considered, along with those using binomial distribution and models bas...

  12. Power Fingerprinting for Integrity Assessment of Embedded Systems

    OpenAIRE

    Aguayo Gonzalez, Carlos Roberto

    2011-01-01

    This dissertation introduces Power Fingerprinting (PFP), a novel technique for assessing the execution integrity of embedded devices. A PFP monitor is an external device that captures the dynamic power consumption of a processor using fine-grained measurements at the clock-cycle level and applies anomaly detection techniques to determine whether the integrity of the system has been compromised. PFP uses a set of trusted signatures from the target code that are extracted during a pre-character...

  13. Implementation of a Model Output Statistics based on meteorological variable screening for short‐term wind power forecast

    DEFF Research Database (Denmark)

    Ranaboldo, Matteo; Giebel, Gregor; Codina, Bernat

    2013-01-01

    A combination of physical and statistical treatments to post‐process numerical weather predictions (NWP) outputs is needed for successful short‐term wind power forecasts. One of the most promising and effective approaches for statistical treatment is the Model Output Statistics (MOS) technique....... In this study, a MOS based on multiple linear regression is proposed: the model screens the most relevant NWP forecast variables and selects the best predictors to fit a regression equation that minimizes the forecast errors, utilizing wind farm power output measurements as input. The performance of the method...... is evaluated in two wind farms, located in different topographical areas and with different NWP grid spacing. Because of the high seasonal variability of NWP forecasts, it was considered appropriate to implement monthly stratified MOS. In both wind farms, the first predictors were always wind speeds (at...
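
    A MOS of the kind described is essentially a screened multiple linear regression. The following Python sketch performs greedy forward screening of NWP predictor columns against measured wind farm output; the selection criterion and variable names are illustrative, not those of the study.

        import numpy as np

        def mos_screen(X, y, max_preds=3):
            # Greedy forward selection for a MOS regression: at each step add
            # the NWP predictor column that most reduces in-sample RMSE.
            n, m = X.shape
            chosen = []
            for _ in range(max_preds):
                best = None
                for j in (set(range(m)) - set(chosen)):
                    cols = chosen + [j]
                    A = np.column_stack([np.ones(n), X[:, cols]])
                    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
                    rmse = np.sqrt(((y - A @ beta) ** 2).mean())
                    if best is None or rmse < best[1]:
                        best = (j, rmse)
                chosen.append(best[0])
            return chosen

        rng = np.random.default_rng(0)
        X = rng.random((200, 6))                    # mock NWP variables
        y = 2.0 * X[:, 1] + 0.5 * X[:, 4] + 0.1 * rng.standard_normal(200)
        print(mos_screen(X, y))                     # should pick columns 1 and 4 first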

  14. Statistical study of undulator radiated power by a classical detection system in the mm-wave regime

    Directory of Open Access Journals (Sweden)

    A. Eliran

    2009-05-01

    The statistics of FEL spontaneous emission power detected with a detector integration time much larger than the slippage time have been measured in many previous works at high frequencies. In such cases the quantum (shot) noise generated in the detection process is dominant. We have measured spontaneous emission in the Israeli electrostatic accelerator FEL (EA-FEL) operating at mm wavelengths. In this regime the detector is based on a diode rectifier, for which the detector quantum noise is negligible. The measurements were repeated numerous times in order to create a sample space with sufficient data for evaluating the statistical features of the radiated power. The probability density function of the radiated power was found and its moments were calculated. The results of analytical and numerical models are compared to those obtained in experimental measurements.

  15. Transient stability risk assessment of power systems incorporating wind farms

    DEFF Research Database (Denmark)

    Miao, Lu; Fang, Jiakun; Wen, Jinyu

    2013-01-01

    Large-scale wind farm integration has brought several challenges to the transient stability of power systems. This paper focuses on the transient stability of power systems incorporating wind farms by utilizing risk assessment methods. A detailed model of the doubly fed induction generator has been established. Wind penetration variation and multiple stochastic factors of power systems have been considered. The process of transient stability risk assessment based on the Monte Carlo method has been described and a comprehensive risk indicator has been proposed. An investigation has been conducted into an improved 10-generator 39-bus system with a wind farm incorporated to verify the validity and feasibility of the proposed risk assessment method.

  16. Power system cascading risk assessment based on complex network theory

    Science.gov (United States)

    Wang, Zhuoyang; Hill, David J.; Chen, Guo; Dong, Zhao Yang

    2017-09-01

    When a single failure occurs in a vulnerable part of a power system, it may cause a large-area cascading event. Therefore, an advanced method that can assess the risks during cascading events is needed. In this paper, an improved complex network model for power system risk assessment is proposed. Risk is defined in this model by the consequence and probability of failures, which are affected by both power factors and network structure. Compared with existing risk assessment models, the proposed one can comprehensively evaluate the risk of the system during a cascading event by combining topological and electrical information. A new cascading event simulation module is adopted to identify the power grid cascading chain from a system-level view. In addition, simulations are carried out on the IEEE 14-bus and IEEE 39-bus systems to illustrate the performance of the proposed module. The simulation results demonstrate that the proposed method is effective for power grid risk assessment during cascading events.
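
    As a toy illustration of combining topology with failure statistics, the sketch below scores lines of a small graph by edge betweenness (a purely structural proxy for consequence) weighted by an assumed outage probability; a real implementation would replace both ingredients with power-flow-based quantities.

        import networkx as nx

        G = nx.Graph()
        G.add_weighted_edges_from([(1, 2, 0.1), (2, 3, 0.2), (1, 3, 0.3),
                                   (3, 4, 0.1), (2, 4, 0.4)])
        consequence = nx.edge_betweenness_centrality(G, weight="weight")
        p_fail = {e: 0.01 for e in G.edges}        # assumed outage probabilities
        risk = {e: p_fail[e] * consequence[e] for e in G.edges}
        print(max(risk, key=risk.get))             # structurally riskiest line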

  17. Statistical analysis of the variation of floor vibrations in nuclear power plants subject to seismic loads

    Energy Technology Data Exchange (ETDEWEB)

    Jussila, Vilho [VTT Technical Research Centre of Finland Ltd, Kemistintie 3, 02230 Espoo (Finland); Li, Yue [Dept. of Civil Engineering, Case Western Reserve University, Cleveland, OH 44106 (United States); Fülöp, Ludovic, E-mail: ludovic.fulop@vtt.fi [VTT Technical Research Centre of Finland Ltd, Kemistintie 3, 02230 Espoo (Finland)

    2016-12-01

    Highlights: • Floor flexibility plays a non-negligible role in amplifying horizontal vibrations. • COV of in-floor horizontal and vertical acceleration are 0.15–0.25 and 0.25–0.55. • In-floor variation of vibrations is higher in lower floors. • Floor spectra from limited nodes underestimate vibrations by a factor of 1.5–1.75. Abstract: Floor vibration of a reactor building subjected to seismic loads was investigated, with the aim of quantifying the variability of vibrations on each floor. A detailed 3D building model founded on the bedrock was excited simultaneously in three directions by artificial accelerograms compatible with Finnish ground response spectra. Dynamic simulation for 21 s was carried out using explicit time integration. The extracted results of the simulation were accelerations at several floor locations, transformed to pseudo-acceleration (PSA) spectra in the next stage. At first, the monitored locations on the floors were chosen by engineering judgement in order to arrive at a feasible number of floor nodes for post-processing of the data. It became apparent that engineering judgement was insufficient to pinpoint the key locations with high floor vibrations, which resulted in un-conservative vibration estimates. For this reason, a more systematic approach was later adopted, in which nodes of the floors were selected on a more refined grid of 2 m. With this method, in addition to the highest PSA peaks in all directions, the full vibration distribution on each floor can be determined. A statistical evaluation of the floor responses was also carried out in order to define floor accelerations and PSAs with high confidence of non-exceedance. The conclusion was that in-floor variability can be as high as 50–60% and models with sufficiently dense node grids should be used in order to achieve a realistic estimate of floor vibration under seismic action. The effects of the shape of the input spectra, damping, and flexibility of the

  18. A review on reliability assessment for wind power

    Energy Technology Data Exchange (ETDEWEB)

    Wen, Jiang; Zheng, Yan; Donghan, Feng [Department of Electrical Engineering, Shanghai Jiaotong University, Dongchuan Road 800, Shanghai 200240 (China)

    2009-12-15

    The application of wind energy in electric power systems is growing rapidly due to heightened public concern over adverse environmental impacts and the escalation in energy costs associated with the use of conventional energy sources. Electric power from wind energy is quite different from that of conventional resources. The fundamental difference is that wind power is intermittent and uncertain; therefore, it affects the reliability of the power system in a different manner from that of conventional generators. Drawing on the available literature, this paper presents models of wind farms and methods for assessing wind speed parameters. Two main categories of methods for evaluating the reliability contribution of wind power, i.e., the analytical method and the Monte Carlo simulation method, are reviewed. This paper also summarizes factors affecting the reliability of wind power systems, such as the wake effect, correlation of output power for different wind turbines, the effect of wind turbine parameters, penetration, and environment. An example is used to illustrate how these factors affect the reliability of a wind power system. Finally, mainstream reliability indices for evaluating reliability are introduced. Among these indices, some have been developed recently, such as wind generation interrupted energy benefit (WGIEB), wind generation interruption cost benefit (WGICB), equivalent capacity rate (ECR), and load carrying capacity benefit ratio (LCCBR). (author)

  19. Method of statistical data processing safety ecological monitoring combined heat and power station in the megalopolis territory

    Directory of Open Access Journals (Sweden)

    Telichenko Valeriy

    2016-01-01

    The method developed here makes it possible to determine the combined effect of pollutant emissions into the atmospheric air of a city from a group of combined heat and power stations, as well as to assess the combined impact of these stations on the child population in different districts of Moscow. It was shown that the interfering pollutant emissions of the combined heat and power stations are distributed across the territory of the city, affect the disease incidence of children in the administrative districts differently from the emissions of each station taken individually, and do not exceed maximum allowable concentrations.

  20. Windfarm Generation Assessment for Reliability Analysis of Power Systems

    DEFF Research Database (Denmark)

    Negra, Nicola Barberis; Holmstrøm, Ole; Bak-Jensen, Birgitte

    2007-01-01

    Due to the fast development of wind generation in the past ten years, increasing interest has been paid to techniques for assessing different aspects of power systems with a large amount of installed wind generation. One of these aspects concerns power system reliability. Windfarm modelling plays...... in a reliability model and the generation of a windfarm is evaluated by means of sequential Monte Carlo simulation. Results are used to analyse how each mentioned Factor influences the assessment, and why and when they should be included in the model....

  3. An instrument to assess the statistical intensity of medical research papers.

    Science.gov (United States)

    Nieminen, Pentti; Virtanen, Jorma I; Vähänikkilä, Hannu

    2017-01-01

    There is widespread evidence that statistical methods play an important role in original research articles, especially in medical research. The evaluation of statistical methods and reporting in journals suffers from a lack of standardized methods for assessing the use of statistics. The objective of this study was to develop and evaluate an instrument to assess the statistical intensity in research articles in a standardized way. A checklist-type measure scale was developed by selecting and refining items from previous reports about the statistical contents of medical journal articles and from published guidelines for statistical reporting. A total of 840 original medical research articles that were published between 2007-2015 in 16 journals were evaluated to test the scoring instrument. The total sum of all items was used to assess the intensity between sub-fields and journals. Inter-rater agreement was examined using a random sample of 40 articles. Four raters read and evaluated the selected articles using the developed instrument. The scale consisted of 66 items. The total summary score adequately discriminated between research articles according to their study design characteristics. The new instrument could also discriminate between journals according to their statistical intensity. The inter-observer agreement measured by the ICC was 0.88 between all four raters. Individual item analysis showed very high agreement between the rater pairs, the percentage agreement ranged from 91.7% to 95.2%. A reliable and applicable instrument for evaluating the statistical intensity in research papers was developed. It is a helpful tool for comparing the statistical intensity between sub-fields and journals. The novel instrument may be applied in manuscript peer review to identify papers in need of additional statistical review.

  4. Power-law statistics and stellar rotational velocities in the Pleiades

    OpenAIRE

    Carvalho, J. C.; Silva, R.; Nascimento, J. D. Jr. do; De Medeiros, J. R.

    2009-01-01

    In this paper we show that a non-Gaussian statistical framework based on Kaniadakis statistics is more appropriate for fitting the observed distributions of projected rotational velocity measurements of stars in the Pleiades open cluster. To this end, we compare the results from the $\kappa$- and $q$-distributions with the Maxwellian.
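
    For context, the Kaniadakis framework deforms the exponential that underlies the Maxwellian; the standard form of the κ-exponential (stated here from the general literature, not from this abstract) is

        \exp_\kappa(x) = \left( \sqrt{1 + \kappa^2 x^2} + \kappa x \right)^{1/\kappa},
        \qquad
        \lim_{\kappa \to 0} \exp_\kappa(x) = e^x,

    so the κ-deformed velocity distribution reduces to the Maxwellian in the limit κ → 0, while finite κ produces the heavier power-law tails used to fit the data.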

  5. Developing a PQ monitoring system for assessing power quality and critical areas detection

    Directory of Open Access Journals (Sweden)

    Miguel Romero

    2011-10-01

    This paper outlines the development of a power quality monitoring system aimed at assessing power quality and detecting critical areas throughout a distribution system. The system integrates hardware and a software processing tool developed in four main stages. Power quality disturbances are registered by PQ meters and the data are transmitted over a 3G wireless network. The data are processed and filtered in an open-source database, from which statistical indices related to voltage sags, swells, flicker, and voltage unbalance are obtained. The last stage displays the indices geo-referenced on power quality maps, allowing the identification of critical areas according to different criteria. The results can be analyzed using clustering tools to identify differentiated quality groups in a city. The proposed system is an open-source tool useful to electricity utilities for analyzing and managing large amounts of data.

  6. Design of durability test protocol for vehicular fuel cell systems operated in power-follow mode based on statistical results of on-road data

    Science.gov (United States)

    Xu, Liangfei; Reimer, Uwe; Li, Jianqiu; Huang, Haiyan; Hu, Zunyan; Jiang, Hongliang; Janßen, Holger; Ouyang, Minggao; Lehnert, Werner

    2018-02-01

    City buses using polymer electrolyte membrane (PEM) fuel cells are considered to be the most likely fuel cell vehicles to be commercialized in China. The technical specifications of the fuel cell systems (FCSs) these buses are equipped with will differ based on the powertrain configurations and vehicle control strategies, but can generally be classified into the power-follow and soft-run modes. Each mode imposes different levels of electrochemical stress on the fuel cells. Evaluating the aging behavior of fuel cell stacks under the conditions encountered in fuel cell buses requires new durability test protocols based on statistical results obtained during actual driving tests. In this study, we propose a systematic design method for fuel cell durability test protocols that correspond to the power-follow mode based on three parameters for different fuel cell load ranges. The powertrain configurations and control strategy are described herein, followed by a presentation of the statistical data for the duty cycles of FCSs in one city bus in the demonstration project. Assessment protocols are presented based on the statistical results using mathematical optimization methods, and are compared to existing protocols with respect to common factors, such as time at open circuit voltage and root-mean-square power.

  7. Simulating European wind power generation applying statistical downscaling to reanalysis data

    DEFF Research Database (Denmark)

    Gonzalez-Aparicio, I.; Monforti, F.; Volker, Patrick

    2017-01-01

    The growing share of electricity production from solar and, mainly, wind resources constantly increases the stochastic nature of the power system. Modelling a high share of renewable energy sources, and in particular wind power, crucially depends on the adequate representation of the intermittency...

  8. Understanding Statistical Power in Cluster Randomized Trials: Challenges Posed by Differences in Notation and Terminology

    Science.gov (United States)

    Spybrook, Jessaca; Hedges, Larry; Borenstein, Michael

    2014-01-01

    Research designs in which clusters are the unit of randomization are quite common in the social sciences. Given the multilevel nature of these studies, the power analyses for these studies are more complex than in a simple individually randomized trial. Tools are now available to help researchers conduct power analyses for cluster randomized…

  9. Statistical power to detect change in a mangrove shoreline fish community adjacent to a nuclear power plant.

    Science.gov (United States)

    Dolan, T E; Lynch, P D; Karazsia, J L; Serafy, J E

    2016-03-01

    An expansion is underway of a nuclear power plant on the shoreline of Biscayne Bay, Florida, USA. While the precise effects of its construction and operation are unknown, impacts on surrounding marine habitats and biota are considered by experts to be likely. The objective of the present study was to determine the adequacy of an ongoing monitoring survey of fish communities associated with mangrove habitats directly adjacent to the power plant to detect fish community changes, should they occur, at three spatial scales. Using seasonally resolved data recorded during 532 fish surveys over an 8-year period, power analyses were performed for four mangrove fish metrics (fish diversity, fish density, and the occurrence of two ecologically important fish species: gray snapper (Lutjanus griseus) and goldspotted killifish (Floridichthys carpio)). Results indicated that the monitoring program at current sampling intensity allows for detection of <33% changes in fish density and diversity metrics in both the wet and the dry season in the two larger study areas. Sampling effort was found to be insufficient in either season to detect changes at this level (<33%) in species-specific occurrence metrics for the two fish species examined. The option of supplementing ongoing, biological monitoring programs for improved, focused change detection deserves consideration from both ecological and cost-benefit perspectives.
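
    As a rough illustration of the kind of calculation underlying such a power analysis, the sketch below asks what sample size a two-sample t-test would need to detect a 33% shift in a density-type metric. The baseline mean, standard deviation, and per-group survey count are hypothetical, not values from the study; statsmodels is assumed to be available.

```python
# Illustrative power analysis for detecting a proportional change in a
# monitoring metric (e.g., fish density). All values are hypothetical.
from statsmodels.stats.power import TTestIndPower

baseline_mean = 10.0   # hypothetical mean density (fish per survey)
baseline_sd = 6.0      # hypothetical standard deviation
change = 0.33          # proportional change to detect (33%)

# Standardized effect size (Cohen's d) implied by a 33% shift in the mean.
effect_size = (baseline_mean * change) / baseline_sd

analysis = TTestIndPower()
# Sample size per group needed for 80% power at alpha = 0.05.
n_required = analysis.solve_power(effect_size=effect_size, power=0.80, alpha=0.05)
# Power achieved with, say, 66 surveys per season/area (hypothetical).
power = analysis.power(effect_size=effect_size, nobs1=66, alpha=0.05)
print(f"d = {effect_size:.2f}, n per group for 80% power = {n_required:.0f}, "
      f"power at n=66: {power:.2f}")
```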

  10. Entropy and Divergence Associated with Power Function and the Statistical Application

    Directory of Open Access Journals (Sweden)

    Shogo Kato

    2010-02-01

    In statistical physics, Boltzmann-Shannon entropy provides good understanding of the equilibrium states of a number of phenomena. In statistics, the entropy corresponds to the maximum likelihood method, in which Kullback-Leibler divergence connects Boltzmann-Shannon entropy and the expected log-likelihood function. Maximum likelihood estimation is favoured for its optimal performance, which is known to break down easily in the presence of even a small degree of model uncertainty. To deal with this problem, a new statistical method, closely related to Tsallis entropy, is proposed and shown to be robust to outliers, and we discuss a local learning property associated with the method.

  11. The significance of structural power in Strategic Environmental Assessment

    DEFF Research Database (Denmark)

    Hansen, Anne Merrild; Kørnøv, Lone; Cashmore, Matthew Asa

    2013-01-01

    This article presents a study of how power dynamics enable and constrain the influence of actors upon decision-making and Strategic Environmental Assessment (SEA). Based on Anthony Giddens' structuration theory (ST), a model for studying power dynamics in strategic decision-making processes......, that actors influence both outcome and frames for strategic decision-making, and attention needs to be on not only the formal interactions between the SEA process and the strategic decision-making process but also on informal interaction and communication between actors. The informal structures show crucial...... to the outcome of the decision-making process. The article is meant as a supplement to the understanding of the influence of power dynamics in IA processes, emphasising the capacity of agents to mobilise and create change. Despite epistemological challenges of using ST as an approach to power analysis, this meta...

  12. Life Cycle Assessment of Coal-fired Power Production

    Energy Technology Data Exchange (ETDEWEB)

    Spath, P. L.; Mann, M. K.; Kerr, D. R.

    1999-09-01

    Coal has the largest share of utility power generation in the US, accounting for approximately 56% of all utility-produced electricity (US DOE, 1998). Therefore, understanding the environmental implications of producing electricity from coal is an important component of any plan to reduce total emissions and resource consumption. A life cycle assessment (LCA) on the production of electricity from coal was performed in order to examine the environmental aspects of current and future pulverized coal boiler systems. Three systems were examined: (1) a plant that represents the average emissions and efficiency of currently operating coal-fired power plants in the US (this tells us about the status quo), (2) a new coal-fired power plant that meets the New Source Performance Standards (NSPS), and (3) a highly advanced coal-fired power plant utilizing a low emission boiler system (LEBS).

  13. Voltage stability margins assessment for Muscat power system

    Energy Technology Data Exchange (ETDEWEB)

    Ellithy, K.A.; Gastli, A. [Sultan Qaboos Univ., Dept. of Electrical Engineering and Electronics, Muscat (Oman); Al-Khusaibi, T. [Ministry of Housing and Electricity and Water, Muscat (Oman); Irving, M. [Brunel Univ., Dept. of Electrical Engineering and Electronics, Uxbridge (United Kingdom)

    2002-10-01

    Voltage instability problems in power systems today are, in many countries, one of the major concerns in power system planning and operation. This paper presents the assessment of voltage stability margins for Muscat power system under normal operating condition and under contingencies. The modal analysis method is applied to identify the weak buses in the system, which could lead to voltage instability. These weak buses are selected as the best locations for applying remedial actions to enhance the stability margins. The results show that the buses at South Batna load area are the weakest buses in the system. The results also show that an increase in load demand on that area without an adequate increase of reactive power could lead to voltage collapse. Shunt VAR compensations (remedial action) are installed at the weakest buses to enhance the system stability margins. The results presented in this paper are obtained using a MATLAB computer program developed by the authors. (Author)
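
    The modal analysis method referred to here works by eigen-decomposing a reduced power-flow Jacobian: the eigenvalue closest to zero marks the critical mode, and bus participation factors for that mode point to the weakest buses. The sketch below illustrates the mechanics on a made-up 3x3 reduced Jacobian; it is not the authors' MATLAB program, and the matrix values are hypothetical.

```python
# Minimal sketch of modal analysis for voltage stability: eigen-decompose a
# reduced Jacobian J_R (dQ/dV). The matrix below is invented for illustration;
# a real study would build J_R from the system power-flow equations.
import numpy as np

J_R = np.array([[12.1, -4.0, -2.5],
                [-4.0,  9.8, -3.1],
                [-2.5, -3.1,  7.4]])  # hypothetical reduced Jacobian

eigvals, right = np.linalg.eig(J_R)
eigvals = eigvals.real               # eigenvalues are real for this example
left = np.linalg.inv(right)          # rows are left eigenvectors

# The smallest eigenvalue is the critical mode; a value near zero (or
# negative) signals proximity to voltage instability.
k = int(np.argmin(eigvals))
# Bus participation factors for the critical mode: right[i, k] * left[k, i].
participation = right[:, k] * left[k, :]
weakest_bus = int(np.argmax(np.abs(participation)))
print(f"critical eigenvalue: {eigvals[k]:.3f}, weakest bus index: {weakest_bus}")
```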

  14. Sex differences in discriminative power of volleyball game-related statistics.

    Science.gov (United States)

    João, Paulo Vicente; Leite, Nuno; Mesquita, Isabel; Sampaio, Jaime

    2010-12-01

    To identify sex differences in volleyball game-related statistics, the game-related statistics of several World Championships in 2007 (N=132) were analyzed using the software VIS from the International Volleyball Federation. Discriminant analysis was used to identify the game-related statistics which better discriminated performances by sex. The analysis emphasized fault serves (SC = -.40), shot spikes (SC = .40), and reception digs (SC = .31). Considerable variability was evident in the game-related statistics profile: men's volleyball games were better associated with terminal actions (errors of service), whereas women's volleyball games were characterized by continuous actions (in defense and attack). These differences may be related to the anthropometric and physiological differences between women and men and their influence on performance profiles.

  15. Impact of statistical learning methods on the predictive power of multivariate normal tissue complication probability models

    NARCIS (Netherlands)

    Xu, Cheng-Jian; van der Schaaf, Arjen; Schilstra, Cornelis; Langendijk, Johannes A.; van t Veld, Aart A.

    2012-01-01

    PURPOSE: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. METHODS AND MATERIALS: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator

  16. Integrated safety assessment of Indian nuclear power plants for ...

    Indian Academy of Sciences (India)

    Classically, nuclear power reactor design and system evolution have been based on the logic of minimizing risk to an acceptable level, with quantification based on a deterministic approach backed up by a further assessment based on probabilistic methodology. However, in spite of minimization of risk, the ...

  17. Optimal Power Allocation for CC-HARQ-based Cognitive Radio with Statistical CSI in Nakagami Slow Fading Channels

    Science.gov (United States)

    Xu, Ding; Li, Qun

    2017-01-01

    This paper addresses the power allocation problem for cognitive radio (CR) based on hybrid automatic repeat request (HARQ) with chase combining (CC) in Nakagami-m slow fading channels. We assume that, instead of perfect instantaneous channel state information (CSI), only statistical CSI is available at the secondary user (SU) transmitter. The aim is to minimize the SU outage probability under the primary user (PU) interference outage constraint. Using the Lagrange multiplier method, an iterative and recursive algorithm is derived to obtain the optimal power allocation for each transmission round. Extensive numerical results are presented to illustrate the performance of the proposed algorithm.

  18. The N-Pact Factor: Evaluating the Quality of Empirical Journals with Respect to Sample Size and Statistical Power

    Science.gov (United States)

    Fraley, R. Chris; Vazire, Simine

    2014-01-01

    The authors evaluate the quality of research reported in major journals in social-personality psychology by ranking those journals with respect to their N-pact Factors (NF): the statistical power of the empirical studies they publish to detect typical effect sizes. Power is a particularly important attribute for evaluating research quality because, relative to studies that have low power, studies that have high power are more likely to (a) provide accurate estimates of effects, (b) produce literatures with low false-positive rates, and (c) lead to replicable findings. The authors show that the average sample size in social-personality research is 104 and that the power to detect the typical effect size in the field is approximately 50%. Moreover, they show that there is considerable variation among journals in the sample sizes and power of the studies they publish, with some journals consistently publishing higher-power studies than others. The authors hope that these rankings will be of use to authors who are choosing where to submit their best work, provide hiring and promotion committees with a superior way of quantifying journal quality, and encourage competition among journals to improve their NF rankings. PMID:25296159
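
    The reported ~50% power figure can be sanity-checked with a standard power calculation. The sketch below assumes a two-group design with the reported average N of 104 (52 per group) and a typical effect of r = .21 converted to Cohen's d; these modelling choices are ours, not necessarily the paper's exact computation, so the result only lands in the same ballpark.

```python
# Back-of-the-envelope check of the ~50% power figure, under our own
# assumptions: two groups of 52 and a "typical" field effect of r = .21.
from statsmodels.stats.power import TTestIndPower
import numpy as np

r = 0.21                          # assumed typical effect size
d = 2 * r / np.sqrt(1 - r**2)     # convert correlation r to Cohen's d
power = TTestIndPower().power(effect_size=d, nobs1=52, alpha=0.05)
print(f"d = {d:.2f}, power with 52 per group: {power:.2f}")
```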

  19. Power-law Statistics of Driven Reconnection in the Magnetically Closed Corona

    Science.gov (United States)

    Knizhnik, K. J.; Uritsky, V. M.; Klimchuk, J. A.; DeVore, C. R.

    2018-01-01

    Numerous observations have revealed that power-law distributions are ubiquitous in energetic solar processes. Hard X-rays, soft X-rays, extreme ultraviolet radiation, and radio waves all display power-law frequency distributions. Since magnetic reconnection is the driving mechanism for many energetic solar phenomena, it is likely that reconnection events themselves display such power-law distributions. In this work, we perform numerical simulations of the solar corona driven by simple convective motions at the photospheric level. Using temperature changes, current distributions, and Poynting fluxes as proxies for heating, we demonstrate that energetic events occurring in our simulation display power-law frequency distributions, with slopes in good agreement with observations. We suggest that the braiding-associated reconnection in the corona can be understood in terms of a self-organized criticality model driven by convective rotational motions similar to those observed at the photosphere.

  20. Limit distributions for the terms of central order statistics under power normalization

    OpenAIRE

    El Sayed M. Nigm

    2007-01-01

    In this paper the limiting distributions for sequences of central terms under power nonrandom normalization are obtained. The classes of the limit types having domain of L-attraction are investigated.

  2. Statistical Analysis of Power Production from OWC Type Wave Energy Converters

    DEFF Research Database (Denmark)

    Martinelli, L.; Zanuttigh, B.; Kofoed, Jens Peter

    2009-01-01

    Oscillating Water Column based wave energy plants built so far have experienced a low efficiency in the conversion of the bidirectional oscillating flow. A new concept is considered here, the LeanCon Wave Energy Converter (WEC), which unifies the flow direction by use of non-return valves......, into a unidirectional flow, making the use of more efficient air turbines possible. A steadier flow is thereby also obtained. The general objective of this note is to examine the power take-off (PTO) efficiency under irregular wave conditions for WECs with flow redirection. The final practical aim is to identify...... The power measured at the modelled PTO is compared with the available incident wave power in order to examine the overall system response in a scale-independent manner. Then, the power production density function is fitted to a simplified shape, whose parameters are related to the tested sea state conditions...

  3. A power comparison of generalized additive models and the spatial scan statistic in a case-control setting

    Directory of Open Access Journals (Sweden)

    Ozonoff Al

    2010-07-01

    Background: A common, important problem in spatial epidemiology is measuring and identifying variation in disease risk across a study region. In application of statistical methods, the problem has two parts. First, spatial variation in risk must be detected across the study region and, second, areas of increased or decreased risk must be correctly identified. The location of such areas may give clues to environmental sources of exposure and disease etiology. One statistical method applicable in spatial epidemiologic settings is a generalized additive model (GAM), which can be applied with a bivariate LOESS smoother to account for geographic location as a possible predictor of disease status. A natural hypothesis when applying this method is whether residential location of subjects is associated with the outcome, i.e. is the smoothing term necessary? Permutation tests are a reasonable hypothesis testing method and provide adequate power under a simple alternative hypothesis. These tests have yet to be compared to other spatial statistics. Results: This research uses simulated point data generated under three alternative hypotheses to evaluate the properties of the permutation methods and compare them to the popular spatial scan statistic in a case-control setting. Case 1 was a single circular cluster centered in a circular study region. The spatial scan statistic had the highest power, though the GAM method estimates did not fall far behind. Case 2 was a single point source located at the center of a circular cluster and Case 3 was a line source at the center of the horizontal axis of a square study region. Each had linearly decreasing log-odds with distance from the point. The GAM methods outperformed the scan statistic in Cases 2 and 3. Comparing sensitivity, measured as the proportion of the exposure source correctly identified as high or low risk, the GAM methods outperformed the scan statistic in all three cases. Conclusions: The GAM

  4. Can we successfully monitor a population density decline of elusive invertebrates? A statistical power analysis on Lucanus cervus

    OpenAIRE

    Thomaes, Arno; Verschelde, Pieter; Mader, Detlef; Sprecher-Uebersax, Eva; Fremlin, Maria; Onkelinx, Thierry; Méndez, Marcos

    2017-01-01

    Monitoring global biodiversity is essential for understanding and countering its current loss. However, monitoring of many species is hindered by their difficult detection due to crepuscular activity, hidden phases of the life cycle, short activity period and low population density. Few statistical power analyses of declining trends have been published for terrestrial invertebrates. Consequently, no knowledge exists of the success rate of monitoring elusive invertebrates. Here data from monit...

  5. Study of creep cavity growth for power plant lifetime assessment

    Energy Technology Data Exchange (ETDEWEB)

    Wu Rui; Sandstroem, Rolf

    2001-01-01

    This report contributes to the subproject on lifetime assessment by creep (Swedish: livslaengdspredikteringar vid kryp), part of the project package on strength in high-temperature power plants, KME 708. Physical creep damage consists mainly of cavities and their development. Wu and Sandstroem have observed that cavity size increases linearly with increasing creep strain in a 12%Cr steel. Sandstroem has shown that, based on the relations between the nucleation and growth of creep cavities and creep strain, physical creep damage can be modelled as a function of creep strain. In the present paper, the growth of the creep cavity radius R in relation to time t and strain ε has been studied in low-alloy and 12%Cr steels as well as a Type 347 steel. The results show that power-law relations of cavity radius with creep time (R-t) and with creep strain (R-ε) hold for these materials at various testing conditions. The power-law R-t and R-ε relations are in most cases dependent on and independent of testing conditions, respectively. The empirical power-law R-ε relations give a description of cavity evolution that can be used for lifetime assessment. Experimental data have also been compared to estimates from the classical models for cavity growth, including the power-law growth due to Hancock, the diffusion growth due to Speight and Harris, the constrained diffusion growths due to Dyson and due to Rice, and the enhanced diffusion growth due to Beere. It appears that the constrained diffusion growth models give a reasonable estimate of the R-ε relation in many cases. The diffusion growth model is applicable only in limited cases where the exponent on t in the R-t relation is about 1/3. The power-law and enhanced diffusion models are found in most cases to overestimate cavity growth.
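
    An empirical power-law R-ε relation of the kind described can be fitted by linear regression in log-log space. The sketch below does this on synthetic cavity-radius data; the prefactor, exponent, and noise level are all invented for illustration.

```python
# Illustrative log-log fit of a power-law cavity growth relation
# R = A * eps**p (cavity radius vs. creep strain). Data are synthetic.
import numpy as np

rng = np.random.default_rng(5)
eps = np.linspace(0.01, 0.10, 20)                  # creep strain
R = 2.0 * eps**0.8 * rng.lognormal(0, 0.05, 20)    # cavity radius, um

# Linear regression in log-log space recovers exponent p and prefactor A.
p, logA = np.polyfit(np.log(eps), np.log(R), 1)
print(f"exponent p = {p:.2f}, prefactor A = {np.exp(logA):.2f} um")
```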

  6. Mathematical Safety Assessment Approaches for Thermal Power Plants

    Directory of Open Access Journals (Sweden)

    Zong-Xiao Yang

    2014-01-01

    How to use system analysis methods to identify hazards in the industrial process, working environment, and production management of complex industrial processes such as thermal power plants is one of the challenges in systems engineering. A mathematical system safety assessment model is proposed for thermal power plants in this paper by integrating the fuzzy analytic hierarchy process, set pair analysis, and system functionality analysis. On this basis, the key factors influencing thermal power plant safety are analyzed. The influence factors are determined using the fuzzy analytic hierarchy process, the connection degree among the factors is obtained by set pair analysis, and the system safety preponderant function is constructed through system functionality analysis of inherent properties and nonlinear influences. A decision analysis system is developed using active server page technology, web resource integration, and cross-platform capabilities for application to the industrial process. The availability of the proposed safety assessment approach is verified using an actual thermal power plant, which has improved the enforceability and predictability of enterprise safety assessment.
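
    The paper uses a fuzzy analytic hierarchy process; as a simplified stand-in, the sketch below shows the crisp AHP step of deriving factor weights from a pairwise comparison matrix via the principal eigenvector, together with Saaty's consistency check. The comparison values are hypothetical.

```python
# Sketch of the AHP weighting step in its crisp (non-fuzzy) form: priority
# weights from the principal eigenvector of a pairwise comparison matrix.
import numpy as np

A = np.array([[1,   3,   5],
              [1/3, 1,   2],
              [1/5, 1/2, 1]])  # hypothetical pairwise comparisons of 3 factors

eigvals, eigvecs = np.linalg.eig(A)
k = int(np.argmax(eigvals.real))
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                        # priority weights

n = A.shape[0]
lambda_max = eigvals[k].real
CI = (lambda_max - n) / (n - 1)     # consistency index
RI = 0.58                           # Saaty's random index for n = 3
CR = CI / RI                        # consistency ratio; < 0.1 is acceptable
print(f"weights: {np.round(w, 3)}, CR = {CR:.3f}")
```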

  7. Improved statistical power with a sparse shape model in detecting an aging effect in the hippocampus and amygdala

    Science.gov (United States)

    Chung, Moo K.; Kim, Seung-Goo; Schaefer, Stacey M.; van Reekum, Carien M.; Peschke-Schmitz, Lara; Sutterer, Matthew J.; Davidson, Richard J.

    2014-03-01

    The sparse regression framework has been widely used in medical image processing and analysis. However, it has rarely been used in anatomical studies. We present a sparse shape modeling framework using the Laplace-Beltrami (LB) eigenfunctions of the underlying shape and show that it improves statistical power. Traditionally, the LB eigenfunctions are used as a basis for intrinsically representing surface shapes as a form of Fourier descriptors. To reduce high-frequency noise, only the first few terms are used in the expansion and higher-frequency terms are simply thrown away. However, some lower-frequency terms may not necessarily contribute significantly to reconstructing the surfaces. Motivated by this idea, we present an LB-based method to filter out only the significant eigenfunctions by imposing a sparse penalty. For dense anatomical data such as deformation fields on a surface mesh, the sparse regression behaves like a smoothing process, which reduces the error of incorrectly detecting false negatives. Hence the statistical power improves. The sparse shape model is then applied to investigating the influence of age on amygdala and hippocampus shapes in the normal population. The advantage of the LB sparse framework is demonstrated by showing the increased statistical power.

  8. Statistical analysis on the fluence factor of surveillance test data of Korean nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Gyeong Geun; Kim, Min Chul; Yoon, Ji Hyun; Lee, Bong Sang; Lim, Sang Yeob; Kwon, Jun Hyun [Nuclear Materials Safety Research Division, Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2017-06-15

    The transition temperature shift (TTS) of reactor pressure vessel materials is an important factor that determines the lifetime of a nuclear power plant. The prediction of the TTS at the end of a plant's lifespan is calculated using the equation of US Regulatory Guide 1.99 revision 2 (RG1.99/2). The fluence factor in that equation is expressed as a power function, with an exponent determined from early surveillance data in the US. Recently, advanced approaches to estimating the TTS have been proposed in various countries, and Korea is considering the development of a new TTS model. In this study, the TTS trend of the Korean surveillance test results was analyzed using a nonlinear regression model and a mixed-effect model, both based on the power function. The nonlinear regression model yielded a fluence exponent similar to that of the power function in RG1.99/2. The mixed-effect model had a higher exponent and showed superior goodness of fit compared with the nonlinear regression model. Compared with RG1.99/2 and RG1.99/3, the mixed-effect model provided a more accurate prediction of the TTS.
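
    A fluence factor of the general power-function form TTS = CF * f^n can be fitted by nonlinear least squares. The sketch below does so on synthetic surveillance-like data; it ignores the mixed-effect structure discussed in the abstract, and all parameter values are invented.

```python
# Sketch of fitting a simplified power-function TTS model: TTS = CF * f**n,
# where f is neutron fluence (e.g., in units of 10^19 n/cm^2). Synthetic data.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
fluence = rng.uniform(0.1, 6.0, 40)
tts = 35.0 * fluence**0.28 + rng.normal(0, 3, fluence.size)  # synthetic TTS

def power_model(f, cf, n):
    return cf * f**n

(cf_hat, n_hat), cov = curve_fit(power_model, fluence, tts, p0=[30.0, 0.3])
print(f"estimated CF = {cf_hat:.1f}, exponent n = {n_hat:.2f}")
```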

  9. Statistical Characterization of Solar Photovoltaic Power Variability at Small Timescales: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Shedd, S.; Hodge, B.-M.; Florita, A.; Orwig, K.

    2012-08-01

    Integrating large amounts of variable and uncertain solar photovoltaic power into the electricity grid is a growing concern for power system operators in a number of different regions. Power system operators typically accommodate variability, whether from load, wind, or solar, by carrying reserves that can quickly change their output to match the changes in the solar resource. At timescales in the seconds-to-minutes range, this is known as regulation reserve. Previous studies have shown that increasing the geographic diversity of solar resources can reduce the short-term variability of the power output. As the price of solar has decreased, the emergence of very large PV plants (greater than 10 MW) has become more common. These plants present an interesting case because they are large enough to exhibit some spatial smoothing by themselves. This work examines the variability of solar PV output among different arrays in a large (approximately 50 MW) PV plant in the western United States, including the correlation in power output changes between different arrays, as well as the aggregated plant output, at timescales ranging from one second to five minutes.
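
    The variability analysis described boils down to computing power changes (ramps) at multiple timescales and comparing single-array statistics with the aggregated plant. The sketch below reproduces that logic on synthetic one-second data for a handful of hypothetical arrays; the smoothing effect appears because the arrays' noise is only partially correlated.

```python
# Sketch of ramp-rate statistics and the aggregation smoothing effect for a
# multi-array PV plant. Data are synthetic one-second power series.
import numpy as np

rng = np.random.default_rng(1)
n_arrays, n_seconds = 4, 3600
# Common cloud-driven signal plus array-specific noise.
common = rng.normal(0, 1, n_seconds).cumsum() * 0.01
arrays = np.array([10 + common + rng.normal(0, 0.2, n_seconds)
                   for _ in range(n_arrays)])  # MW per array

def ramp_std(p, dt):
    """Standard deviation of power changes over a dt-second timescale."""
    return np.std(p[dt:] - p[:-dt])

plant = arrays.sum(axis=0)
for dt in (1, 60, 300):  # 1 s, 1 min, 5 min
    per_array = np.mean([ramp_std(a, dt) for a in arrays])
    # The plant ramp is less than n_arrays times the single-array ramp
    # whenever the arrays are imperfectly correlated.
    print(f"dt={dt:4d}s  mean array ramp std={per_array:.3f} MW  "
          f"plant ramp std={ramp_std(plant, dt):.3f} MW")
```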

  10. Assessing statistical competencies in clinical and translational science education: one size does not fit all.

    Science.gov (United States)

    Oster, Robert A; Lindsell, Christopher J; Welty, Leah J; Mazumdar, Madhu; Thurston, Sally W; Rahbar, Mohammad H; Carter, Rickey E; Pollock, Bradley H; Cucchiara, Andrew J; Kopras, Elizabeth J; Jovanovic, Borko D; Enders, Felicity T

    2015-02-01

    Statistics is an essential training component for a career in clinical and translational science (CTS). Given the increasing complexity of statistics, learners may have difficulty selecting appropriate courses. Our question was: what depth of statistical knowledge do different CTS learners require? For three types of CTS learners (principal investigator, co-investigator, informed reader of the literature), each with different backgrounds in research (no previous research experience, reader of the research literature, previous research experience), 18 experts in biostatistics, epidemiology, and research design proposed levels for 21 statistical competencies. Statistical competencies were categorized as fundamental, intermediate, or specialized. CTS learners who intend to become independent principal investigators require more specialized training, while those intending to become informed consumers of the medical literature require more fundamental education. For most competencies, less training was proposed for those with more research background. When selecting statistical coursework, the learner's research background and career goal should guide the decision. Some statistical competencies are considered to be more important than others. Baseline knowledge assessments may help learners identify appropriate coursework. Rather than one size fits all, tailoring education to baseline knowledge, learner background, and future goals increases learning potential while minimizing classroom time.

  11. A statistical power analysis of woody carbon flux from forest inventory data

    Science.gov (United States)

    James A. Westfall; Christopher W. Woodall; Mark A. Hatfield

    2013-01-01

    At a national scale, the carbon (C) balance of numerous forest ecosystem C pools can be monitored using a stock change approach based on national forest inventory data. Given the potential influence of disturbance events and/or climate change processes, the statistical detection of changes in forest C stocks is paramount to maintaining the net sequestration status of...

  12. Statistical Analysis and Quality Assessment of ChIP-seq Data with DROMPA.

    Science.gov (United States)

    Nakato, Ryuichiro; Shirahige, Katsuhiko

    2018-01-01

    Chromatin immunoprecipitation followed by sequencing (ChIP-seq) analysis can detect protein/DNA-binding and histone-modification sites across an entire genome. As there are various factors during sample preparation that affect the obtained results, multilateral quality assessments are essential. Here, we describe a step-by-step protocol using DROMPA, a program for user-friendly ChIP-seq pipelining. DROMPA can be used for quality assessment, data normalization, visualization, peak calling, and multiple statistical analyses.

  13. Wide Area Measurement Based Security Assessment & Monitoring of Modern Power System: A Danish Power System Case Study

    DEFF Research Database (Denmark)

    Rather, Zakir Hussain; Chen, Zhe; Thøgersen, Paul

    2013-01-01

    Power system security has become a major concern across the global power system community. This paper presents wide area measurement system (WAMS) based security assessment and monitoring of a modern power system. A new three-dimensional security index (TDSI) has been proposed for online security monitoring...... demonstrated in the DigSILENT PowerFactory environment.

  14. Assessing and Measuring Statistics Cognition in Higher Education Online Environments: Emerging Research and Opportunities

    Science.gov (United States)

    Chase, Justin P.; Yan, Zheng

    2017-01-01

    The ability to effectively learn, process, and retain new information is critical to the success of any student. Since mathematics is becoming increasingly important in our educational systems, it is imperative that we devise an efficient system to measure these types of information recall. "Assessing and Measuring Statistics Cognition in…

  15. QQ-plots for assessing distributions of biomarker measurements and generating defensible summary statistics

    Science.gov (United States)

    One of the main uses of biomarker measurements is to compare different populations to each other and to assess risk in comparison to established parameters. This is most often done using summary statistics such as central tendency, variance components, confidence intervals, excee...

  16. Using Critical Thinking Drills to Teach and Assess Proficiency in Methodological and Statistical Thinking

    Science.gov (United States)

    Cascio, Ted V.

    2017-01-01

    This study assesses the effectiveness of critical thinking drills (CTDs), a repetitious classroom activity designed to improve methodological and statistical thinking in relation to psychological claims embedded in popular press articles. In each of four separate CTDs, students critically analyzed a brief article reporting a recent psychological…

  17. Life-cycle assessment of high-voltage assets using statistical tool

    NARCIS (Netherlands)

    Chmura, L.A.

    2014-01-01

    Nowadays, utilities are confronted with assets reaching or even exceeding their design life. This, in turn, implies the occurrence of upcoming replacements to assure reliable network operation. In this thesis, the application of statistical tools for life-time and residual life assessment of

  18. Some Statistics for Assessing Person-Fit Based on Continuous-Response Models

    Science.gov (United States)

    Ferrando, Pere Joan

    2010-01-01

    This article proposes several statistics for assessing individual fit based on two unidimensional models for continuous responses: linear factor analysis and Samejima's continuous response model. Both models are approached using a common framework based on underlying response variables and are formulated at the individual level as fixed regression…

  19. Charles E. Land, Ph.D., acclaimed statistical expert on radiation risk assessment, died January 2018

    Science.gov (United States)

    Charles E. Land, Ph.D., an internationally acclaimed statistical expert on radiation risk assessment, died January 25, 2018. He retired in 2009 from the NCI Division of Cancer Epidemiology and Genetics. Dr. Land performed pioneering work in modern radiation dose-response analysis and modeling of low-dose cancer risk.

  20. Assessing Statistical Change Indices in Selected Social Work Intervention Research Studies

    Science.gov (United States)

    Ham, Amanda D.; Huggins-Hoyt, Kimberly Y.; Pettus, Joelle

    2016-01-01

    Objectives: This study examined how evaluation and intervention research (IR) studies assessed statistical change to ascertain effectiveness. Methods: Studies from six core social work journals (2009-2013) were reviewed (N = 1,380). Fifty-two evaluation (n = 27) and intervention (n = 25) studies met the inclusion criteria. These studies were…

  1. Business Statistics and Management Science Online: Teaching Strategies and Assessment of Student Learning

    Science.gov (United States)

    Sebastianelli, Rose; Tamimi, Nabil

    2011-01-01

    Given the expected rise in the number of online business degrees, issues regarding quality and assessment in online courses will become increasingly important. The authors focus on the suitability of online delivery for quantitative business courses, specifically business statistics and management science. They use multiple approaches to assess…

  2. Planck 2013 results. XXI. All-sky Compton parameter power spectrum and high-order statistics

    CERN Document Server

    Ade, P.A.R.; Armitage-Caplan, C.; Arnaud, M.; Ashdown, M.; Atrio-Barandela, F.; Aumont, J.; Baccigalupi, C.; Banday, A.J.; Barreiro, R.B.; Bartlett, J.G.; Battaner, E.; Benabed, K.; Benoit, A.; Benoit-Levy, A.; Bernard, J.P.; Bersanelli, M.; Bielewicz, P.; Bobin, J.; Bock, J.J.; Bonaldi, A.; Bond, J.R.; Borrill, J.; Bouchet, F.R.; Bridges, M.; Bucher, M.; Burigana, C.; Butler, R.C.; Cardoso, J.F.; Carvalho, P.; Catalano, A.; Challinor, A.; Chamballu, A.; Chiang, L.Y.; Chiang, H.C.; Christensen, P.R.; Church, S.; Clements, D.L.; Colombi, S.; Colombo, L.P.L.; Comis, B.; Couchot, F.; Coulais, A.; Crill, B.P.; Curto, A.; Cuttaia, F.; Da Silva, A.; Danese, L.; Davies, R.D.; Davis, R.J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Delouis, J.M.; Desert, F.X.; Dickinson, C.; Diego, J.M.; Dolag, K.; Dole, H.; Donzelli, S.; Dore, O.; Douspis, M.; Dupac, X.; Efstathiou, G.; Ensslin, T.A.; Eriksen, H.K.; Finelli, F.; Flores-Cacho, I.; Forni, O.; Frailis, M.; Franceschi, E.; Galeotta, S.; Ganga, K.; Genova-Santos, R.T.; Giard, M.; Giardino, G.; Giraud-Heraud, Y.; Gonzalez-Nuevo, J.; Gorski, K.M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Hansen, F.K.; Hanson, D.; Harrison, D.; Henrot-Versille, S.; Hernandez-Monteagudo, C.; Herranz, D.; Hildebrandt, S.R.; Hivon, E.; Hobson, M.; Holmes, W.A.; Hornstrup, A.; Hovest, W.; Huffenberger, K.M.; Hurier, G.; Jaffe, T.R.; Jaffe, A.H.; Jones, W.C.; Juvela, M.; Keihanen, E.; Keskitalo, R.; Kisner, T.S.; Kneissl, R.; Knoche, J.; Knox, L.; Kunz, M.; Kurki-Suonio, H.; Lacasa, F.; Lagache, G.; Lahteenmaki, A.; Lamarre, J.M.; Lasenby, A.; Laureijs, R.J.; Lawrence, C.R.; Leahy, J.P.; Leonardi, R.; Leon-Tavares, J.; Lesgourgues, J.; Liguori, M.; Lilje, P.B.; Linden-Vornle, M.; Lopez-Caniego, M.; Lubin, P.M.; Macias-Perez, J.F.; Maffei, B.; Maino, D.; Mandolesi, N.; Marcos-Caballero, A.; Maris, M.; Marshall, D.J.; Martin, P.G.; Martinez-Gonzalez, E.; Masi, S.; Matarrese, S.; Matthai, F.; Mazzotta, P.; Melchiorri, A.; Melin, J.B.; Mendes, L.; Mennella, A.; Migliaccio, M.; Mitra, S.; Miville-Deschenes, M.A.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C.B.; Norgaard-Nielsen, H.U.; Noviello, F.; Novikov, D.; Novikov, I.; Osborne, S.; Oxborrow, C.A.; Paci, F.; Pagano, L.; Pajot, F.; Paoletti, D.; Partridge, B.; Pasian, F.; Patanchon, G.; Perdereau, O.; Perotto, L.; Perrotta, F.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Popa, L.; Poutanen, T.; Pratt, G.W.; Prezeau, G.; Prunet, S.; Puget, J.L.; Rachen, J.P.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Ricciardi, S.; Riller, T.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Rossetti, M.; Roudier, G.; Rubino-Martin, J.A.; Rusholme, B.; Sandri, M.; Santos, D.; Savini, G.; Scott, D.; Seiffert, M.D.; Shellard, E.P.S.; Spencer, L.D.; Starck, J.L.; Stolyarov, V.; Stompor, R.; Sudiwala, R.; Sunyaev, R.; Sureau, F.; Sutton, D.; Suur-Uski, A.S.; Sygnet, J.F.; Tauber, J.A.; Tavagnacco, D.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Tuovinen, J.; Umana, G.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Varis, J.; Vielva, P.; Villa, F.; Vittorio, N.; Wade, L.A.; Wandelt, B.D.; White, S.D.M.; Yvon, D.; Zacchei, A.; Zonca, A.

    2014-01-01

    We have constructed the first all-sky map of the thermal Sunyaev-Zeldovich (tSZ) effect by applying specifically tailored component separation algorithms to the 100 to 857 GHz frequency channel maps from the Planck survey. These maps show an obvious galaxy cluster tSZ signal that is well matched with blindly detected clusters in the Planck SZ catalogue. To characterize the signal in the tSZ map we have computed its angular power spectrum. At small angular scales ($\ell \gtrsim 500$) the clustered Cosmic Infrared Background (CIB) and residual point sources are the major contaminants. These foregrounds are carefully modelled and subtracted. We measure the tSZ power spectrum over angular scales $0.17^{\circ} \lesssim \theta \lesssim 3.0^{\circ}$ that were previously unexplored. The measured tSZ power spectrum is consistent with that expected from the Planck catalogue of SZ sources, with additional clear evidence of signal from unresolved clusters and, potentially, diffuse warm baryons. We use the tSZ power spectrum to ...

  3. STATISTICAL ASSESSMENT OF QUANTITATIVE QUALITY PERFORMANCE OF BUILDING PRODUCTS DURING CONFORMITY ASSESSMENT AND CERTIFICATION IN THE FIRE PROTECTION FIELD

    Directory of Open Access Journals (Sweden)

    Otto Dvořák

    2015-10-01

    This paper proposes statistical methods for evaluating the quantitative quality performance of building products, as determined by tests, to support objective decisions during conformity assessment and certification in the fire protection field. The procedure is applicable to other types of products as well, and to the research and development of new material products.

  4. Power quality assessment via MATLAB/SIMULINK-based tool

    Energy Technology Data Exchange (ETDEWEB)

    Popescu, M.; Bitoleanu, A.; Dobriceanu, M.; Linca, M. [Craiova Univ., Craiova (Romania). Faculty of Electromechanical Engineering

    2007-07-01

    Variable speed drives (VSD) are a source of harmonics in electrical systems. A graphical interface for assessing power quality in induction motor and static converter drive systems was presented. The simulation and energetic analysis tool for electrical drives (SEATED) was used to evaluate power quantity and quality indices from voltage and current signals at steady-state operating points. The graphical interface allowed users to configure the structure of the drive system using drop-down option menus. A dialogue window was used to configure the parameters of the motor, the DC link circuit, and the transformer and line reactor. A modulation menu was designed to aid in the simulation of the system using sinusoidal, pulse-train, and harmonic cancellation techniques. The steady-state analysis was designed to analyze both the network and the motor using the Budeanu and Czarnecki definitions and phasorial theory. A case study of an electrical drive with an induction motor and a harmonics cancellation inverter fed by a 3-phase rectifier was used to evaluate the interface design. The study showed that the interface-based power quality assessment method can be used to determine the influence of system parameters on power quality performance. 14 refs., 16 figs.
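
    One of the steady-state quality indices such a tool evaluates is total harmonic distortion (THD). As a language-neutral illustration (the original tool is MATLAB/SIMULINK-based), the sketch below computes THD of a synthetic rectifier-like current from its FFT.

```python
# Sketch of one steady-state power quality index: total harmonic distortion
# (THD) of a current waveform, computed from its FFT. The waveform is
# synthetic (fundamental plus 5th and 7th harmonics, typical of rectifiers).
import numpy as np

fs, f1 = 10_000, 50                    # sampling rate (Hz), fundamental (Hz)
t = np.arange(0, 0.2, 1 / fs)          # ten fundamental cycles
i = (np.sin(2 * np.pi * f1 * t)
     + 0.20 * np.sin(2 * np.pi * 5 * f1 * t)
     + 0.14 * np.sin(2 * np.pi * 7 * f1 * t))

spectrum = np.abs(np.fft.rfft(i)) / (len(i) / 2)   # single-sided amplitudes
freqs = np.fft.rfftfreq(len(i), 1 / fs)

def mag_at(f):
    return spectrum[np.argmin(np.abs(freqs - f))]

fund = mag_at(f1)
harmonics = [mag_at(k * f1) for k in range(2, 26)]
thd = np.sqrt(sum(h**2 for h in harmonics)) / fund
print(f"THD = {100 * thd:.1f}%")  # expect ~sqrt(0.20^2 + 0.14^2) = 24.4%
```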

  5. Quantitative assessment of aquatic impacts of power plants

    Energy Technology Data Exchange (ETDEWEB)

    McKenzie, D.H.; Arnold, E.M.; Skalski, J.R.; Fickeisen, D.H.; Baker, K.S.

    1979-08-01

    Progress is reported in a continuing study of the design and analysis of aquatic environmental monitoring programs for assessing the impacts of nuclear power plants. Analysis of data from Calvert Cliffs, Pilgrim, and San Onofre nuclear power plants confirmed the generic applicability of the control-treatment pairing design suggested by McKenzie et al. (1977). Substantial progress was made on the simulation model evaluation task. A process notebook was compiled in which each model equation was translated into a standardized notation. Individual model testing and evaluating was started. The Aquatic Generalized Environmental Impact Simulator (AGEIS) was developed and will be tested using data from Lake Keowee, South Carolina. Further work is required to test the various models and perfect AGEIS for impact analyses at actual power plant sites. Efforts on the hydrologic modeling task resulted in a compendium of models commonly applied to nuclear power plants and the application of two well-received hydrodynamic models to data from the Surry Nuclear Power Plant in Virginia. Conclusions from the study of these models indicate that slight inaccuracies of boundary data have little influence on mass conservation and accurate bathymetry data are necessary for conservation of mass through the model calculations. The hydrologic modeling task provides valuable reference information for model users and monitoring program designers.

  6. Discriminatory power of water polo game-related statistics at the 2008 Olympic Games.

    Science.gov (United States)

    Escalante, Yolanda; Saavedra, Jose M; Mansilla, Mirella; Tella, Victor

    2011-02-01

    The aims of this study were (1) to compare water polo game-related statistics by context (winning and losing teams) and sex (men and women), and (2) to identify characteristics discriminating the performances for each sex. The game-related statistics of the 64 matches (44 men's and 20 women's) played in the final phase of the Olympic Games held in Beijing in 2008 were analysed. Unpaired t-tests compared winners and losers and men and women, and confidence intervals and effect sizes of the differences were calculated. The results were subjected to a discriminant analysis to identify the differentiating game-related statistics of the winning and losing teams. The results showed the differences between winning and losing men's teams to be in both defence and offence, whereas in women's teams they were only in offence. In men's games, passing (assists), aggressive play (exclusions), centre position effectiveness (centre shots), and goalkeeper defence (goalkeeper-blocked 5-m shots) predominated, whereas in women's games the play was more dynamic (possessions). The variable that most discriminated performance in men was goalkeeper-blocked shots, and in women shooting effectiveness (shots). These results should help coaches when planning training and competition.

  7. Soviet Military Power: An Assessment of the Threat

    Science.gov (United States)

    1988-01-01

    station module, and regular crew rotations with the SOYUZ-TM capsule, the Soviets have probably begun their permanent manned presence in space.

  8. Global Assessment of High-Altitude Wind Power

    OpenAIRE

    Archer, Cristina L.; Ken Caldeira

    2009-01-01

    The available wind power resource worldwide at altitudes between 500 and 12,000 m above ground is assessed for the first time. Twenty-eight years of wind data from the reanalyses by the National Centers for Environmental Prediction and the Department of Energy are analyzed and interpolated to study geographical distributions and persistency of winds at all altitudes. Furthermore, intermittency issues and global climate effects of large-scale extraction of energy from high-altitude winds are i...

  9. Determinants of Judgments of Explanatory Power: Credibility, Generality, and Statistical Relevance

    OpenAIRE

    Colombo, Matteo; Bucher, Leandra; Sprenger, Jan

    2017-01-01

    Explanation is a central concept in human psychology. Drawing upon philosophical theories of explanation, psychologists have recently begun to examine the relationship between explanation, probability and causality. Our study advances this growing literature at the intersection of psychology and philosophy of science by systematically investigating how judgments of explanatory power are affected by (i) the prior credibility of an explanatory hypothesis, (ii) the causal framing of the hypothes...

  10. Statistical power of latent growth curve models to detect quadratic growth.

    Science.gov (United States)

    Diallo, Thierno M O; Morin, Alexandre J S; Parker, Philip D

    2014-06-01

    Latent curve models (LCMs) have been used extensively to analyze longitudinal data. However, little is known about the power of LCMs to detect nonlinear trends when they are present in the data. For this study, we utilized simulated data to investigate the power of LCMs to detect the mean of the quadratic slope, Type I error rates, and rates of nonconvergence during the estimation of quadratic LCMs. Five factors were examined: the number of time points, growth magnitude, interindividual variability, sample size, and the R² values of the measured variables. The results showed that the empirical Type I error rates were close to the nominal value of 5%. The empirical power to detect the mean of the quadratic slope was affected by the simulation factors. Finally, a substantial proportion of samples failed to converge under conditions of no to small variation in the quadratic factor, small sample sizes, and small R² values of the repeated measures. In general, we recommended that quadratic LCMs be based on samples of (a) at least 250 but ideally 400, when four measurement points are available; (b) at least 100 but ideally 150, when six measurement points are available; (c) at least 50 but ideally 100, when ten measurement points are available.
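
    The simulation design can be illustrated with a compact Monte Carlo: generate data with a known quadratic mean trend and individual variability, fit a model with a quadratic term, and count how often the term is detected. The sketch below uses a random-intercept mixed model as a lightweight stand-in for a full quadratic LCM; the sample size, effect sizes, and replication count are all hypothetical.

```python
# Monte Carlo sketch of power to detect a quadratic growth term, using a
# random-intercept mixed model as a simplified stand-in for a quadratic LCM.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n_subj, times, n_reps = 100, np.arange(6), 100

def one_replicate():
    rows = []
    for i in range(n_subj):
        intercept = 2.0 + rng.normal(0, 1.0)           # random intercept
        y = intercept + 0.5 * times + 0.05 * times**2 \
            + rng.normal(0, 1.0, times.size)           # quadratic mean trend
        rows.append(pd.DataFrame({"id": i, "t": times, "t2": times**2, "y": y}))
    df = pd.concat(rows, ignore_index=True)
    fit = smf.mixedlm("y ~ t + t2", df, groups="id").fit(reml=False)
    return fit.pvalues["t2"] < 0.05                    # quadratic term detected?

power = np.mean([one_replicate() for _ in range(n_reps)])
print(f"estimated power to detect the quadratic term: {power:.2f}")
```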

  11. Automatic Assessment of Pathological Voice Quality Using Higher-Order Statistics in the LPC Residual Domain

    Directory of Open Access Journals (Sweden)

    JiYeoun Lee

    2009-01-01

    A preprocessing scheme based on the linear prediction coefficient (LPC) residual is applied to higher-order statistics (HOS) for automatic assessment of overall pathological voice quality. The normalized skewness and kurtosis are estimated from the LPC residual and show statistically meaningful distributions for characterizing pathological voice quality. 83 voice samples of sustained vowel /a/ phonation are used in this study and are independently assessed by a speech and language therapist (SALT) according to the grade of severity of dysphonia on the GRBAS scale. These are used to train and test a classification and regression tree (CART). The best result is obtained using an optimal decision tree implemented with a combination of the normalized skewness and kurtosis, with an accuracy of 92.9%. It is concluded that the method can be used as an assessment tool, providing a valuable aid to the SALT during clinical evaluation of overall pathological voice quality.
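
    The preprocessing chain described (LPC inverse filtering followed by higher-order statistics) can be sketched in a few lines. Below, LPC coefficients are obtained by the autocorrelation (Toeplitz) method, the residual by inverse filtering, and the normalized skewness and kurtosis from the residual; the synthetic vowel-like signal and the model order of 12 are our assumptions, not the paper's data.

```python
# Sketch: LPC residual via the autocorrelation method, then its skewness
# and kurtosis as quality-related features. Signal and order are assumed.
import numpy as np
from scipy.linalg import solve_toeplitz
from scipy.signal import lfilter
from scipy.stats import skew, kurtosis

fs = 16_000
t = np.arange(0, 0.5, 1 / fs)
# Synthetic vowel-like signal: a few harmonics of a 120 Hz fundamental.
y = sum(np.sin(2 * np.pi * f * t) / k
        for k, f in enumerate((120, 240, 360, 480), start=1))
y += 0.01 * np.random.default_rng(0).normal(size=t.size)

order = 12
r = np.correlate(y, y, mode="full")[y.size - 1:][: order + 1]  # autocorrelation
a = solve_toeplitz(r[:order], r[1 : order + 1])  # LPC coefficients
residual = lfilter(np.concatenate(([1.0], -a)), [1.0], y)  # inverse filter

print(f"skewness = {skew(residual):.3f}, kurtosis = {kurtosis(residual):.3f}")
```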

  12. Data base of accident and agricultural statistics for transportation risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Saricks, C.L.; Williams, R.G.; Hopf, M.R.

    1989-11-01

    A state-level data base of accident and agricultural statistics has been developed to support risk assessment for transportation of spent nuclear fuels and high-level radioactive wastes. This data base will enhance the modeling capabilities for more route-specific analyses of potential risks associated with transportation of these wastes to a disposal site. The data base and methodology used to develop state-specific accident and agricultural data bases are described, and summaries of accident and agricultural statistics are provided. 27 refs., 9 tabs.

  13. Application and interpretation of multiple statistical tests to evaluate validity of dietary intake assessment methods.

    Science.gov (United States)

    Lombard, Martani J; Steyn, Nelia P; Charlton, Karen E; Senekal, Marjanne

    2015-04-22

    Several statistical tests are currently applied to evaluate the validity of dietary intake assessment methods. However, they provide information on different facets of validity. There is also no consensus on the types and combinations of tests that should be applied to reflect acceptable validity for intakes. We aimed to 1) conduct a review to identify the tests and interpretation criteria used in studies where dietary assessment methods were validated against a reference method and 2) illustrate the value of, and challenges that arise in, interpretation of outcomes of multiple statistical tests in assessment of validity using a test data set. An in-depth literature review was undertaken to identify the range of statistical tests used in the validation of quantitative food frequency questionnaires (QFFQs). Four databases were accessed to search for statistical methods and interpretation criteria used in papers focusing on relative validity. The identified tests and interpretation criteria were applied to a data set obtained using a QFFQ and four repeated 24-hour recalls from 47 adults (18-65 years) residing in rural Eastern Cape, South Africa. 102 studies were screened and 60 were included. Six statistical tests were identified; five with one set of interpretation criteria and one with two sets of criteria, resulting in seven possible validity interpretation outcomes. Twenty-one different combinations of these tests were identified, with the majority including three or fewer tests. The correlation coefficient was the most commonly used (as a single test or in combination with one or more tests). Results of our application and interpretation of multiple statistical tests to assess the validity of energy, macronutrient, and selected micronutrient estimates illustrate that for most of the nutrients considered, some outcomes support validity while others do not. One to three statistical tests may not be sufficient to provide comprehensive insights into the various facets of validity. Results of our
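
    Several of the tests such a review identifies can be run on a single pair of intake estimates. The sketch below applies three of them (correlation, a paired test of mean difference, and quartile cross-classification) to synthetic QFFQ and recall data for one nutrient; the data and the choice of tests are illustrative only.

```python
# Sketch of a small validity-test battery for paired intake estimates
# (QFFQ vs. mean of repeated 24-h recalls). Data are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
recall = rng.lognormal(mean=7.6, sigma=0.3, size=47)   # e.g., energy, kJ
qffq = recall * rng.lognormal(mean=0.05, sigma=0.25, size=47)

# 1. Correlation (the most commonly used single test).
r, p = stats.spearmanr(qffq, recall)

# 2. Paired test of mean difference.
t_stat, p_t = stats.ttest_rel(qffq, recall)

# 3. Cross-classification: proportion classified into the same quartile.
q1 = np.digitize(qffq, np.quantile(qffq, [0.25, 0.5, 0.75]))
q2 = np.digitize(recall, np.quantile(recall, [0.25, 0.5, 0.75]))
same_quartile = np.mean(q1 == q2)

print(f"Spearman r = {r:.2f} (p = {p:.3f}); paired t p = {p_t:.3f}; "
      f"same-quartile agreement = {100 * same_quartile:.0f}%")
```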

  14. Selection for Environmental Variation: a Statistical Analysis and Power Calculations to Detect Response

    DEFF Research Database (Denmark)

    Ibáñez-Escriche, Noelia; Sorensen, Daniel; Waagepetersen, Rasmus

    2008-01-01

    Data from uterine capacity in rabbits (litter size) were analysed to determine whether the environmental variance is partly genetically determined. The fit of a classical homogeneous variance mixed linear model (HOM model) and of a genetically structured heterogeneous variance mixed linear model...... affecting mean and variance was -0.74 (-0.90; -0.52). It is argued that stronger support for the HET model than that derived from statistical analysis of data would be provided by a successful selection experiment designed to modify the environmental variance. A simple selection criterion is suggested...

  15. Assessing the power and quality of epidemiologic studies of asbestos-exposed populations

    Energy Technology Data Exchange (ETDEWEB)

    Davis, D.L.; Mandula, B.; Van Ryzin, J.V.

    1985-12-01

    This paper briefly discusses criteria for evaluating epidemiologic studies for risk assessment purposes, using asbestos as an example. Asbestos is one of the few carcinogens for which substantial data exist on exposures to humans. However, there are major difficulties in using these data for conducting risk assessments. In particular, exposure data are often incomplete, and risk assessments usually involve extrapolating from the higher exposures of the occupational environments to the lower levels typically encountered in the nonoccupational environment. The term asbestos refers to the fibrous form of several minerals, and levels of exposures to these fibers are not easily assessed. Criteria for evaluating epidemiologic studies used in an Ontario Royal Commission report on asbestos are discussed. The importance of considering the statistical power of studies to detect an excess risk is examined using as examples major cohort studies of asbestos-exposed workers, as summarized in a report by the U.S. National Research Council.

  17. Empirical assessment of published effect sizes and power in the recent cognitive neuroscience and psychology literature.

    Science.gov (United States)

    Szucs, Denes; Ioannidis, John P A

    2017-03-01

    We have empirically assessed the distribution of published effect sizes and estimated power by analyzing 26,841 statistical records from 3,801 cognitive neuroscience and psychology papers published recently. The reported median effect size was D = 0.93 (interquartile range: 0.64-1.46) for nominally statistically significant results and D = 0.24 (0.11-0.42) for nonsignificant results. Median power to detect small, medium, and large effects was 0.12, 0.44, and 0.73, reflecting no improvement through the past half-century. This is so because sample sizes have remained small. Assuming similar true effect sizes in both disciplines, power was lower in cognitive neuroscience than in psychology. Journal impact factors negatively correlated with power. Assuming a realistic range of prior probabilities for null hypotheses, false report probability is likely to exceed 50% for the whole literature. In light of our findings, the recently reported low replication success in psychology is realistic, and worse performance may be expected for cognitive neuroscience.

  18. Blind image quality assessment based on aesthetic and statistical quality-aware features

    Science.gov (United States)

    Jenadeleh, Mohsen; Masaeli, Mohammad Masood; Moghaddam, Mohsen Ebrahimi

    2017-07-01

    The main goal of image quality assessment (IQA) methods is the emulation of human perceptual image quality judgments; therefore, the correlation between the objective scores of these methods and human perceptual scores is considered their performance metric. Human judgment of image quality implicitly includes many factors, such as aesthetics, semantics, context, and various types of visual distortion. The main idea of this paper is to use a host of features that are commonly employed in image aesthetics assessment in order to improve the accuracy of blind image quality assessment (BIQA) methods. We propose an approach that enriches the features of BIQA methods by integrating a host of aesthetic image features with features of natural image statistics derived from multiple domains. The proposed features have been used to augment five different state-of-the-art BIQA methods, which use natural scene statistics features. Experiments were performed on seven benchmark image quality databases. The experimental results showed significant improvement in the accuracy of the methods.
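
    A representative natural-scene-statistics feature used by such BIQA methods is the set of mean-subtracted contrast-normalized (MSCN) coefficients. The sketch below computes MSCN coefficients with Gaussian local statistics and summarizes their distribution shape; the random stand-in "image", the filter scale, and the feature choice are illustrative assumptions, not the paper's exact pipeline.

```python
# Sketch of a standard NSS feature for blind IQA: MSCN coefficients and the
# moments summarizing their distribution. The "image" is synthetic.
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.stats import skew, kurtosis

img = np.random.default_rng(3).random((128, 128))  # stand-in for a real image

mu = gaussian_filter(img, sigma=7 / 6)                        # local mean
sigma = np.sqrt(np.maximum(gaussian_filter(img**2, sigma=7 / 6) - mu**2, 0))
mscn = (img - mu) / (sigma + 1.0)                             # MSCN coefficients

# Distribution-shape features; distorted images deviate from the roughly
# Gaussian MSCN statistics of natural images.
feats = [mscn.var(), skew(mscn.ravel()), kurtosis(mscn.ravel())]
print(np.round(feats, 4))
```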

  19. A PowerPoint®-based guide to assist in choosing the suitable statistical test

    Directory of Open Access Journals (Sweden)

    David Normando

    2010-02-01

    Full Text Available Selecting appropriate methods for statistical analysis may seem difficult, especially for graduate students and researchers in the early phases of their scientific careers. On the other hand, PowerPoint presentations are a familiar tool to students and researchers, so a biostatistics tutorial built as a PowerPoint presentation could narrow the gap between orthodontists and Biostatistics. This guide provides objective and useful information about several statistical methods, using examples related to dentistry and, more specifically, to orthodontics. It is intended mainly to help the user answer common questions about the most appropriate test for comparing groups, for examining correlations and regressions, or for analyzing method error. Assistance is also provided for checking the distribution of the data (normal or non-normal) and for choosing the most suitable graph to present the results. The guide can also be of considerable use to journal reviewers for quickly examining the adequacy of the statistical methods presented in a submitted manuscript.

  20. Identifying potentially induced seismicity and assessing statistical significance in Oklahoma and California

    CERN Document Server

    McClure, Mark; Chiu, Kitkwan; Ranganath, Rajesh

    2016-01-01

    In this study, we develop a statistical method for identifying induced seismicity from large datasets and apply the method to decades of wastewater disposal and seismicity data in California and Oklahoma. The method is robust against a variety of potential pitfalls. The study regions are divided into gridblocks. We use a longitudinal study design, seeking associations between seismicity and wastewater injection along time-series within each gridblock. The longitudinal design helps control for non-random application of wastewater injection. We define a statistical model that is flexible enough to describe the seismicity observations, which have temporal correlation and high kurtosis. In each gridblock, we find the maximum likelihood estimate for a model parameter that relates induced seismicity hazard to total volume of wastewater injected each year. To assess significance, we compute likelihood ratio test statistics in each gridblock and each state, California and Oklahoma. Resampling is used to empirically d...
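
    The per-gridblock test can be illustrated with a much simpler stand-in model. The sketch below (Python with statsmodels) regresses annual event counts on annual injected volume with a plain Poisson model and builds a permutation-based null; the study's actual model additionally handles temporal correlation and high kurtosis, and all data here are synthetic:

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        years = 30
        volume = rng.gamma(2.0, 1.0, size=years)      # injected volume per year (synthetic)
        counts = rng.poisson(0.2 + 0.5 * volume)      # seismicity counts per year (synthetic)

        # Full model: rate depends on volume; null model: constant rate.
        full = sm.GLM(counts, sm.add_constant(volume), family=sm.families.Poisson()).fit()
        null = sm.GLM(counts, np.ones((years, 1)), family=sm.families.Poisson()).fit()
        lr_stat = 2 * (full.llf - null.llf)           # likelihood ratio test statistic

        # Empirical null by resampling: permute volumes to break any association.
        perm = []
        for _ in range(200):
            f = sm.GLM(counts, sm.add_constant(rng.permutation(volume)),
                       family=sm.families.Poisson()).fit()
            perm.append(2 * (f.llf - null.llf))
        print("LR =", round(lr_stat, 2), "p =", np.mean(np.array(perm) >= lr_stat))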

  1. The Bayesian New Statistics: Hypothesis testing, estimation, meta-analysis, and power analysis from a Bayesian perspective.

    Science.gov (United States)

    Kruschke, John K; Liddell, Torrin M

    2017-02-07

    In the practice of data analysis, there is a conceptual distinction between hypothesis testing, on the one hand, and estimation with quantified uncertainty on the other. Among frequentists in psychology, a shift of emphasis from hypothesis testing to estimation has been dubbed "the New Statistics" (Cumming 2014). A second conceptual distinction is between frequentist methods and Bayesian methods. Our main goal in this article is to explain how Bayesian methods achieve the goals of the New Statistics better than frequentist methods. The article reviews frequentist and Bayesian approaches to hypothesis testing and to estimation with confidence or credible intervals. The article also describes Bayesian approaches to meta-analysis, randomized controlled trials, and power analysis.
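
    As a concrete illustration of estimation with a credible interval (our example, not the authors'), consider a conjugate Beta-Binomial analysis of a proportion in Python:

        from scipy import stats

        successes, trials = 18, 30                                      # made-up data
        posterior = stats.beta(1 + successes, 1 + trials - successes)   # Beta(1,1) prior

        print("posterior mean:", round(posterior.mean(), 3))
        print("95% credible interval:",
              (round(posterior.ppf(0.025), 3), round(posterior.ppf(0.975), 3)))
        # A direct probability statement about the parameter, of the kind
        # frequentist p-values cannot make:
        print("P(theta > 0.5 | data) =", round(posterior.sf(0.5), 3))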

  2. Quadrennial Technology Review 2015: Technology Assessments--Wind Power

    Energy Technology Data Exchange (ETDEWEB)

    none,

    2015-10-07

    Wind power has become a mainstream power source in the U.S. electricity portfolio, supplying 4.9% of the nation’s electricity demand in 2014. With more than 65 GW installed across 39 states at the end of 2014, utility-scale wind power is a cost-effective source of low-emissions power generation throughout much of the nation. The United States has significant sustainable land-based and offshore wind resource potential, greater than 10 times current total U.S. electricity consumption. A technical wind resource assessment conducted by the Department of Energy (DOE) in 2009 estimated that the land-based wind energy potential for the contiguous United States is equivalent to 10,500 GW of capacity at an 80-meter (m) hub height and 12,000 GW at a 100 m hub height, assuming a capacity factor of at least 30%. A subsequent 2010 DOE report estimated the technical offshore wind energy potential to be 4,150 GW. The estimate was calculated from the total offshore area within 50 nautical miles of shore, in areas where average annual wind speeds are at least 7 m per second at a hub height of 90 m.

  3. Market assessment of photovoltaic power systems for agricultural applications worldwide

    Science.gov (United States)

    Cabraal, A.; Delasanta, D.; Rosen, J.; Nolfi, J.; Ulmer, R.

    1981-01-01

    Agricultural sector PV market assessments conducted in the Philippines, Nigeria, Mexico, Morocco, and Colombia are extrapolated worldwide. The types of applications evaluated are those requiring less than 15 kW of power and operating in a stand-alone mode. The major conclusions were as follows: PV will be competitive in applications requiring 2 to 3 kW of power prior to 1983; by 1986 PV system competitiveness will extend to applications requiring 4 to 6 kW of power; due to capital constraints, the private sector market may be restricted to applications requiring less than about 2 kW of power; the ultimate purchasers of larger systems will be governments, either through direct purchase or loans from development banks. Though fragmented, a significant agricultural sector market for PV exists; however, the market for PV in telecommunications, signalling, rural services, and TV will be larger. The major market-related factors influencing the potential for U.S. PV sales are: lack of awareness; high first costs; shortage of long-term capital; competition from German, French and Japanese companies who have government support; and low fuel prices in capital-surplus countries. Strategies that may aid in overcoming some of these problems are: setting up a trade association aimed at overcoming problems due to lack of awareness, innovative financing schemes such as lease arrangements, and designing products to match current user needs as opposed to attempting to change consumer behavior.

  5. Preliminary environmental assessment for the satellite power system (SPS)

    Energy Technology Data Exchange (ETDEWEB)

    1978-10-01

    A preliminary assessment of the impact of the Satellite Power System (SPS) on the environment is presented. Information that has appeared in documents referenced herein is integrated and assimilated. The state-of-knowledge as perceived from recently completed DOE-sponsored studies is disclosed, and prospective research and study programs that can advance the state-of-knowledge and provide an expanded data base for use in an assessment planned for 1980 are defined. Alternatives for research that may be implemented in order to achieve this advancement are also discussed in order that a plan can be selected which will be consistent with the fiscal and time constraints on the SPS Environmental Assessment Program. Health and ecological effects of microwave radiation, nonmicrowave effects on health and the environment (terrestrial operations and space operations), effects on the atmosphere, and effects on communications systems are examined in detail. (WHK)

  6. Automated clarity assessment of retinal images using regionally based structural and statistical measures.

    Science.gov (United States)

    Fleming, Alan D; Philip, Sam; Goatman, Keith A; Sharp, Peter F; Olson, John A

    2012-09-01

    An automated image analysis system for application in mass medical screening must assess the clarity of the images before analysing their content. This is the case in grading for diabetic retinopathy screening where the failure to assess clarity could result in retinal images of people with retinopathy being erroneously classed as normal. This paper compares methods of clarity assessment based on the degradation of visible structures and based on the deviation of image properties outside expected norms caused by clarity loss. Vessel visibility measures and statistical measures were determined at locations in the image which have high saliency and these were used to obtain an image clarity assessment using supervised classification. The usefulness of the measures as indicators of image clarity was assessed. Tests were performed on 987 disc-centred and macula-centred retinal photographs (347 with inadequate clarity) obtained from the English National Screening Programme. Images with inadequate clarity were detected with 92.6% sensitivity at 90% specificity. In a set of 2000 macula-centred images (200 with inadequate clarity) from the Scottish Screening Programme, inadequate clarity was detected with 96.7% sensitivity at 90% specificity. This study has shown that structural and statistical measures are equally useful for retinal image clarity assessment.
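
    The headline numbers are sensitivities read off a ROC curve at a fixed specificity; a generic way to compute such a figure from classifier scores (the scores below are hypothetical stand-ins, not the authors' classifier output) is:

        import numpy as np
        from sklearn.metrics import roc_curve

        rng = np.random.default_rng(1)
        # 347 images with inadequate clarity (label 1) and 640 adequate (label 0).
        labels = np.concatenate([np.ones(347), np.zeros(640)])
        scores = np.concatenate([rng.normal(2, 1, 347), rng.normal(0, 1, 640)])

        fpr, tpr, _ = roc_curve(labels, scores)
        # Sensitivity at 90% specificity = TPR where FPR = 0.10 (fpr is ascending).
        print("sensitivity at 90% specificity:",
              round(float(np.interp(0.10, fpr, tpr)), 3))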

  7. Statistical learning for wind power: A modeling and stability study towards forecasting

    Science.gov (United States)

    Fischer, Aurélie; Montuelle, Lucie; Mougeot, Mathilde; Picard, Dominique

    2017-12-01

    We focus on wind power modeling using machine learning techniques. We show, on real data provided by the wind energy company Maïa Eolis, that parametric models, even those closely following the physical equation relating wind production to wind speed, are outperformed by intelligent learning algorithms. In particular, the CART-Bagging algorithm gives very stable and promising results. In addition, as a step towards forecasting, we quantify the impact of using degraded wind measurements on performance. We also show in this application that the default methodology for selecting a subset of predictors provided in the standard random forest package can be refined, especially when one of the predictors has a major impact.
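
    A rough sketch of the CART-Bagging idea, bagged regression trees mapping wind speed to production; the data below are synthetic, generated from an idealized cubic power curve rather than the Maïa Eolis measurements:

        import numpy as np
        from sklearn.ensemble import BaggingRegressor
        from sklearn.tree import DecisionTreeRegressor

        rng = np.random.default_rng(2)
        speed = rng.uniform(0, 25, size=(2000, 1))     # wind speed, m/s (synthetic)
        # Cubic rise up to a 12 m/s rated speed, flat above, plus noise.
        prod = np.clip(speed.ravel(), 0, 12) ** 3 / 12 ** 3 + rng.normal(0, 0.05, 2000)

        model = BaggingRegressor(DecisionTreeRegressor(),
                                 n_estimators=100, random_state=0)
        model.fit(speed, prod)
        print("predicted normalized power at 8 m/s:",
              round(float(model.predict([[8.0]])[0]), 3))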

  8. Studies with group treatments required special power calculations, allocation methods, and statistical analyses.

    Science.gov (United States)

    Faes, Miriam C; Reelick, Miriam F; Perry, Marieke; Olde Rikkert, Marcel G M; Borm, George F

    2012-02-01

    In some trials, the intervention is delivered to individuals in groups, for example, groups that exercise together. The group structure of such trials has to be taken into consideration in the analysis and has an impact on the power of the trial. Our aim was to provide optimal methods for the design and analysis of such trials. We described various treatment allocation methods and presented a new allocation algorithm: optimal batchwise minimization (OBM). We carried out a simulation study to evaluate the performance of unrestricted randomization, stratification, permuted block randomization, deterministic minimization, and OBM. Furthermore, we described appropriate analysis methods and derived a formula to calculate the study size. Stratification, deterministic minimization, and OBM had considerably less risk of imbalance than unrestricted randomization and permuted block randomization. Furthermore, OBM led to unpredictable treatment allocation. The sample size calculation and the analysis of the study must be based on a multilevel model that takes the group structure of the trial into account. Trials evaluating interventions that are carried out in subsequent groups require adapted treatment allocation, power calculation, and analysis methods. From the perspective of obtaining overall balance, we conclude that minimization is the method of choice. When the number of prognostic factors is low, stratification is an excellent alternative. OBM leads to better balance within the batches, but it is more complicated. It is probably most worthwhile in trials with many prognostic factors. From the perspective of predictability, a treatment allocation method, such as OBM, that allocates several subjects at the same time, is superior to other methods because it leads to the lowest possible predictability.
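
    The standard design-effect adjustment conveys the core idea (a textbook formula, not necessarily the exact one derived in the paper): randomizing intact groups of size m with intra-class correlation rho inflates the required sample size by 1 + (m - 1) * rho.

        import math

        def group_adjusted_n(n_individual, group_size, icc):
            """Inflate an individually randomized sample size for group delivery."""
            return math.ceil(n_individual * (1 + (group_size - 1) * icc))

        # E.g., 128 subjects per arm, delivered in groups of 8 with ICC = 0.05:
        print(group_adjusted_n(128, group_size=8, icc=0.05))   # -> 173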

  9. Security assessment for intentional island operation in modern power system

    DEFF Research Database (Denmark)

    Chen, Yu; Xu, Zhao; Østergaard, Jacob

    2011-01-01

    There has been a high penetration level of Distributed Generations (DGs) in distribution systems in Denmark, and even more DGs are expected to be installed in the coming years. With that, utilizing them to maintain the security of power supply is of great concern for Danish utilities. During an emergency in the power system, some distribution networks may be intentionally separated from the main grid to avoid complete system collapse. If the DGs in those networks could continue to run instead of being immediately shut down, a blackout could be avoided and the reliability of supply improved. With the proposed islanding security region (ISR) approach, the operator can clearly know whether it is suitable to conduct island operation at a specific moment. Besides, in order to improve computational efficiency, an Artificial Neural Network (ANN) is applied for fast ISR formation, enabling online application of ISR-based islanding security assessment.

  10. Technical assessment of an aeroelectric solar power concept

    Energy Technology Data Exchange (ETDEWEB)

    James, E C; Zukoski, E; Wormeck, J

    1981-02-01

    The aeroelectric solar power concept has been evaluated. The evaluation is based on a one-dimensional flow analysis which invokes the conservation of mass, momentum and energy of the fluid mixture (air, water vapor and water droplets) flowing through the powerplant. A performance evaluation computer code is developed which can be used to assess the concept under diverse conditions and in preliminary design. For purposes of this evaluation, the geometry of the powerplant has been specified. Aerodynamic flow losses have been estimated using a compendium of pipe flow data for each component of the power plant. These losses are utilized in the flow analysis. Flow losses have been estimated to be approximately one-third of the stream's dynamic pressure (1/2 ρu²) in the tower's cylinder section. Geometric or configuration changes can be made to reduce aerodynamic loss.
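
    A quick numerical reading of that estimate, with an assumed air density and cylinder-section velocity (both values illustrative, not from the report):

        rho = 1.2                      # air density, kg/m^3 (assumed)
        u = 15.0                       # velocity in the cylinder section, m/s (assumed)

        q = 0.5 * rho * u ** 2         # dynamic pressure, Pa
        print("q =", q, "Pa; estimated loss ~", round(q / 3, 1), "Pa")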

  11. POWER LOSSES ASSESSMENT IN TRANSFORMERS AFTER THE NORMATIVE OPERATING PERIOD

    Directory of Open Access Journals (Sweden)

    M. I. Fursanov

    2015-01-01

    Full Text Available The load and no-load power losses are the key parameters characterizing the operating effectiveness of distribution-network transformers. Precise determination of these values enables a well-founded choice of optimization measures. The topic is increasingly relevant because modern electric grids contain many oil transformers whose time in service considerably exceeds the statutory 25 years. Under continued operation, measuring power losses according to the standard operating guidelines is not always possible. The authors present an improved power-loss assessment technique based on the currently accepted thermal model of the oil transformer. They identify the deficiencies of the existing technique and justify several changes in the practical application of the mathematical model. The article emphasizes the peculiarities of temperature changes in the oil transformer and presents a prototype open-architecture device implementing the improved loss-measurement technique. The paper describes the device's design features and functionality and provides a schematic sketch. Beyond assessing the power losses themselves, the device can transmit the obtained information to the dispatcher via a GSM connection to simplify transformer status monitoring, and it can be integrated into the transformer's thermal protection system. The practical merit and scope of the results lie in the development and selection of optimization measures in distribution grids, e.g. transformer replacement.

  12. Iterative Assessment of Statistically-Oriented and Standard Algorithms for Determining Muscle Onset with Intramuscular Electromyography

    Science.gov (United States)

    2017-12-01

    Muscle onset, determined via electromyography (EMG), is a commonly applied metric in biomechanics. Intramuscular EMG is often used to examine deep musculature, and there are currently no studies ... assessment of deep musculature or small superficial muscles. The onset timing of intramuscular EMG has been used extensively to examine the "core muscles" ... In this work, no more than 3 parameters were systematically modified for each algorithm. Statistical analysis: in total, 605 standard and novel algorithm ...

  13. Statistical Assessment of the Effectiveness of Transformation Change (by Case of Singapore

    Directory of Open Access Journals (Sweden)

    Zhuravlyov

    2017-02-01

    Full Text Available In studies of economic transformations and their statistical assessment, the causality of processes specific to economic relations and the development of institutions is often overlooked. The article is devoted to the important topic of statistically assessing the effectiveness of transformations. The case of Singapore is taken because it is an Asian country demonstrating the essential role of the institutional environment in transforming the national economy. A regression analysis of the impact of institutional factors on economic growth in Singapore was performed using 17 indicators: civil freedoms, corruption, economic freedom, economic globalization, spending on education, use of energy, share of women in the labor market, fiscal freedom, fuel price, PPP, effectiveness of public administration, level of consumption, Human Development Index, Internet users, life expectancy, unemployment, and openness of trade. The economic interpretation of the statistical assessment of economic transformations in Singapore is as follows: the quality of the institutional environment (control of corruption, economic freedom, supremacy of law, etc.) is of critical importance for economic development in Singapore; increasing spending on education has positive effects on economic growth in Singapore; and economic growth in Singapore is highly positively correlated with energy consumption.

  14. Observer variability in the assessment of type and dysplasia of colorectal adenomas, analyzed using kappa statistics

    DEFF Research Database (Denmark)

    Jensen, P; Krogsgaard, M R; Christiansen, J

    1995-01-01

    Adenomas were assessed twice by three experienced pathologists, with an interval of two months, and the results were analyzed using kappa statistics. RESULTS: For agreement between the first and second assessment (both type and grade of dysplasia), kappa values for the three specialists were 0.5345, 0.9022, and 0.4100, respectively. Agreement was better for type than for dysplasia. The strength of agreement was moderate for Observers A and C and almost perfect for Observer B. Agreement between all three observers was seen in 35.2 percent for both type and dysplasia, in 61 percent for type, and in 47.8 percent for dysplasia...

  15. A statistical assessment of differences and equivalences between genetically modified and reference plant varieties

    Directory of Open Access Journals (Sweden)

    Amzal Billy

    2011-02-01

    Full Text Available Abstract Background Safety assessment of genetically modified organisms is currently often performed by comparative evaluation. However, natural variation of plant characteristics between commercial varieties is usually not considered explicitly in the statistical computations underlying the assessment. Results Statistical methods are described for the assessment of the difference between a genetically modified (GM plant variety and a conventional non-GM counterpart, and for the assessment of the equivalence between the GM variety and a group of reference plant varieties which have a history of safe use. It is proposed to present the results of both difference and equivalence testing for all relevant plant characteristics simultaneously in one or a few graphs, as an aid for further interpretation in safety assessment. A procedure is suggested to derive equivalence limits from the observed results for the reference plant varieties using a specific implementation of the linear mixed model. Three different equivalence tests are defined to classify any result in one of four equivalence classes. The performance of the proposed methods is investigated by a simulation study, and the methods are illustrated on compositional data from a field study on maize grain. Conclusions A clear distinction of practical relevance is shown between difference and equivalence testing. The proposed tests are shown to have appropriate performance characteristics by simulation, and the proposed simultaneous graphical representation of results was found to be helpful for the interpretation of results from a practical field trial data set.
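
    The difference-versus-equivalence distinction can be made concrete with two one-sided tests (TOST); the data and the +/- 0.5 equivalence limits below are invented for illustration, not taken from the maize field study:

        import numpy as np
        from scipy import stats
        from statsmodels.stats.weightstats import ttost_ind

        rng = np.random.default_rng(3)
        gm = rng.normal(10.0, 1.0, 40)            # GM variety, one trait (synthetic)
        conv = rng.normal(10.1, 1.0, 40)          # conventional counterpart (synthetic)

        _, p_diff = stats.ttest_ind(gm, conv)                    # difference test
        p_equiv, _, _ = ttost_ind(gm, conv, low=-0.5, upp=0.5)   # equivalence test

        print("difference p =", round(p_diff, 3),
              "(small p => difference demonstrated)")
        print("equivalence p =", round(p_equiv, 3),
              "(small p => equivalence demonstrated)")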

  16. Wind power prognosis statistical system; Sistema estadistico de pronostico de la energia eoloelectrica

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez Garcia, Alfredo; De la Torre Vega, Eli [Instituto de Investigaciones Electricas, Cuernavaca, Morelos (Mexico)

    2009-07-01

    The integration of the first large-scale wind farm (La Venta II) into the National Interconnected System requires taking into account the random and intermittent nature of wind energy. An important tool for this task is a system for short-term wind energy forecasting. For this reason, the Instituto de Investigaciones Electricas (IIE) developed a statistical model to produce this forecast. The prediction is made through an adaptive linear combination of alternative competing models, where the weight given to each model is based on its most recent forecast quality. The results of applying the forecasting system are also presented and analyzed.
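
    A toy version of such an adaptive combination; the exponential weighting by recent error is our assumption, since the abstract does not specify the IIE system's exact rule:

        import numpy as np

        def combine(forecasts, recent_mae, eta=1.0):
            """Weight each competing model by its recent forecast quality."""
            w = np.exp(-eta * np.asarray(recent_mae))
            w /= w.sum()
            return float(w @ np.asarray(forecasts))

        # Three competing models predicting next-hour wind energy (MWh, made up):
        print(combine([42.0, 55.0, 48.0], recent_mae=[3.0, 8.0, 4.0]))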

  17. Of Disasters and Dragon Kings: A Statistical Analysis of Nuclear Power Incidents & Accidents

    CERN Document Server

    Wheatley, Spencer; Sornette, Didier

    2015-01-01

    We provide, and perform a risk theoretic statistical analysis of, a dataset that is 75 percent larger than the previous best dataset on nuclear incidents and accidents, comparing three measures of severity: INES (International Nuclear Event Scale), radiation released, and damage dollar losses. The annual rate of nuclear accidents, with size above 20 Million US$, per plant, decreased from the 1950s until dropping significantly after Chernobyl (April, 1986). The rate is now roughly stable at 0.002 to 0.003, i.e., around 1 event per year across the current fleet. The distribution of damage values changed after Three Mile Island (TMI; March, 1979), where moderate damages were suppressed but the tail became very heavy, being described by a Pareto distribution with tail index 0.55. Further, there is a runaway disaster regime, associated with the "dragon-king" phenomenon, amplifying the risk of extreme damage. In fact, the damage of the largest event (Fukushima; March, 2011) is equal to 60 percent of the total damag...
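
    The quoted tail index can be estimated from damage data with the Hill (maximum likelihood) estimator for a Pareto tail; the sample below is synthetic, drawn with a true index of 0.55 to mirror the value reported above:

        import numpy as np

        rng = np.random.default_rng(4)
        alpha, x_min = 0.55, 20.0                             # damages above 20 MM USD
        x = x_min * rng.uniform(size=500) ** (-1 / alpha)     # Pareto(alpha) sample

        alpha_hat = len(x) / np.log(x / x_min).sum()          # Hill / ML estimator
        print("estimated tail index:", round(alpha_hat, 2))   # < 1: very heavy tail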

  18. A Framework for Assessing the Commercialization of Photovoltaic Power Generation

    Science.gov (United States)

    Yaqub, Mahdi

    An effective framework does not currently exist with which to assess the viability of commercializing photovoltaic (PV) power generation in the US energy market. Adopting a new technology, such as utility-scale PV power generation, requires a commercialization assessment framework. The framework developed here assesses the economic viability of a set of alternatives of identified factors. Economic viability focuses on simulating the levelized cost of electricity (LCOE) as a key performance measure to realize "grid parity", or the equivalence between the PV electricity prices and grid electricity prices for established energy technologies. Simulation results confirm that "grid parity" could be achieved without the current federal 30% investment tax credit (ITC) via a combination of three strategies: 1) using economies of scale to reduce the LCOE by 30% from its current value of 3.6 cents/kWh to 2.5 cents/kWh, 2) employing a longer power purchase agreement (PPA) over 30 years at a 4% interest rate, and 3) improving by 15% the "capacity factor", which is the ratio of the total annual generated energy to the full potential annual generation when the utility is continuously operating at its rated output. The lower than commercial-market interest rate of 4% that is needed to realize "grid parity" is intended to replace the current federal 30% ITC subsidy, which does not have a cash inflow to offset the outflow of subsidy payments. The 4% interest rate can be realized through two proposed finance plans: The first plan involves the implementation of carbon fees on polluting power plants to produce the capital needed to lower the utility PPA loan term interest rate from its current 7% to the necessary 4% rate. The second plan entails a proposed public debt finance plan. Under this plan, the US Government leverages its guarantee power to issue bonds and uses the proceeds to finance the construction and operation of PV power plants with PPA loan with a 4% interest rate for a
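
    LCOE itself is a discounted ratio: total discounted lifetime cost over total discounted lifetime generation. A generic calculation with assumed inputs (not the figures from this work):

        # LCOE = sum_t(cost_t / (1+r)^t) / sum_t(energy_t / (1+r)^t)
        capex = 1_200_000.0            # initial cost, USD (assumed)
        opex = 15_000.0                # annual O&M cost, USD (assumed)
        energy = 2_000.0               # annual generation, MWh (assumed)
        r, years = 0.04, 30            # discount rate and PPA term

        cost = capex + sum(opex / (1 + r) ** t for t in range(1, years + 1))
        gen = sum(energy / (1 + r) ** t for t in range(1, years + 1))
        print(f"LCOE = {cost / gen:.1f} USD/MWh")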

  19. Application of multivariate statistical analysis to superficial soils around a coal burning power plant

    Directory of Open Access Journals (Sweden)

    Godoy Maria Luiza D. P

    2004-01-01

    Full Text Available The Thermoelectric Complex Jorge Lacerda (TCJL), located in Santa Catarina State, Brazil, is the largest coal-burning thermoelectric complex in Latin America and consists of seven power plants with a total capacity of 832 MWe. In order to estimate the contribution of atmospheric releases from the TCJL to the elemental composition of surface soils around it, forty-five samples were collected at distances of up to 8 km. Forty-two elements were determined by ICP-MS and ICP-AES after total acid dissolution. The technique of principal component analysis was employed to identify the major sources that contribute to surface soil composition. Additionally, source apportioning using multiple regression on absolute principal component scores was performed in order to obtain quantitative information about the contribution of the different identified sources to the soil composition. Based on the results obtained, four sources were identified as the main contributors to the surface soil elemental composition. One of them was related to the TCJL because it comprises volatile elements that are enriched in fly ash and released from the powerhouse stacks.
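
    The source-identification step can be sketched generically (the matrix below is a synthetic stand-in, not the TCJL measurements): principal components are extracted from the standardized sample-by-element matrix, and their scores can then serve as regressors for apportionment.

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(8)
        # Stand-in data: 45 soil samples x 42 standardized element concentrations.
        X = rng.normal(size=(45, 42))

        pca = PCA(n_components=4)                 # four retained source factors
        scores = pca.fit_transform(X)             # scores usable in a regression step
        print("variance explained:", np.round(pca.explained_variance_ratio_, 2))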

  20. Communications and control for electric power systems: Power flow classification for static security assessment

    Science.gov (United States)

    Niebur, D.; Germond, A.

    1993-01-01

    This report investigates the classification of power system states using an artificial neural network model, Kohonen's self-organizing feature map. The ultimate goal of this classification is to assess power system static security in real-time. Kohonen's self-organizing feature map is an unsupervised neural network which maps N-dimensional input vectors to an array of M neurons. After learning, the synaptic weight vectors exhibit a topological organization which represents the relationship between the vectors of the training set. This learning is unsupervised, which means that the number and size of the classes are not specified beforehand. In the application developed in this report, the input vectors used as the training set are generated by off-line load-flow simulations. The learning algorithm and the results of the organization are discussed.
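
    A minimal NumPy implementation of Kohonen's update rule conveys the idea; the 4x4 map and two-dimensional toy inputs below are our choices, not the report's configuration:

        import numpy as np

        rng = np.random.default_rng(5)
        data = rng.uniform(size=(1000, 2))        # stand-in for load-flow vectors
        side = 4
        w = rng.uniform(size=(side * side, 2))    # synaptic weight vectors
        grid = np.array([(i // side, i % side) for i in range(side * side)])

        for t, x in enumerate(data):
            frac = 1 - t / len(data)
            lr, sigma = 0.5 * frac, 2.0 * frac + 0.5          # decaying schedules
            bmu = np.argmin(((w - x) ** 2).sum(axis=1))       # best-matching unit
            d2 = ((grid - grid[bmu]) ** 2).sum(axis=1)        # map distance to BMU
            h = np.exp(-d2 / (2 * sigma ** 2))                # neighborhood function
            w += lr * h[:, None] * (x - w)                    # Kohonen update

        print("one learned weight vector:", np.round(w[0], 3))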

  1. Global Assessment of High-Altitude Wind Power

    Directory of Open Access Journals (Sweden)

    Cristina L. Archer

    2009-05-01

    Full Text Available The available wind power resource worldwide at altitudes between 500 and 12,000 m above ground is assessed for the first time. Twenty-eight years of wind data from the reanalyses by the National Centers for Environmental Prediction and the Department of Energy are analyzed and interpolated to study geographical distributions and persistency of winds at all altitudes. Furthermore, intermittency issues and global climate effects of large-scale extraction of energy from high-altitude winds are investigated.

  2. Biological and statistical processes jointly drive population aggregation: using host-parasite interactions to understand Taylor's power law.

    Science.gov (United States)

    Johnson, Pieter T J; Wilber, Mark Q

    2017-09-27

    The macroecological pattern known as Taylor's power law (TPL) represents the pervasive tendency of the variance in population density to increase as a power function of the mean. Despite empirical illustrations in systems ranging from viruses to vertebrates, the biological significance of this relationship continues to be debated. Here we combined a unique dataset involving 11,987 amphibian hosts and 332,684 trematode parasites with experimental measurements of core epidemiological outcomes to explicitly test the contributions of hypothesized biological processes in driving aggregation. After using feasible set theory to account for mechanisms acting indirectly on aggregation and statistical constraints inherent to the data, we detected strongly consistent influences of host and parasite species identity over 7 years of sampling. Incorporation of field-based measurements of host body size, its variance and spatial heterogeneity in host density accounted for host identity effects, while experimental quantification of infection competence (and especially virulence from the 20 most common host-parasite combinations) revealed the role of species-by-environment interactions. By uniting constraint-based theory, controlled experiments and community-based field surveys, we illustrate the joint influences of biological and statistical processes on parasite aggregation and emphasize their importance for understanding population regulation and ecological stability across a range of systems, both infectious and free-living.
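
    Fitting TPL itself amounts to a regression of log variance on log mean across populations; the data below are simulated with exponent b = 2 rather than taken from the amphibian-trematode dataset:

        import numpy as np

        rng = np.random.default_rng(6)
        means = rng.uniform(1, 50, size=100)
        variances = 0.5 * means ** 2 * rng.lognormal(0, 0.2, size=100)  # var = a * mean^b

        b, log_a = np.polyfit(np.log(means), np.log(variances), 1)
        print("estimated exponent b =", round(b, 2), "; a =", round(np.exp(log_a), 2))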

  3. 75 FR 11205 - Entergy Nuclear Operations, Inc; Pilgrim Nuclear Power Station Environmental Assessment and...

    Science.gov (United States)

    2010-03-10

    ... COMMISSION Entergy Nuclear Operations, Inc; Pilgrim Nuclear Power Station Environmental Assessment and... Nuclear Operations, Inc. (Entergy or the licensee), for operation of Pilgrim Nuclear Power Station... Nuclear Power Station,'' NUREG-1437, Supplement 29, published in July 2007 (ADAMS Accession No...

  4. Combination of statistical and physically based methods to assess shallow slide susceptibility at the basin scale

    Science.gov (United States)

    Oliveira, Sérgio C.; Zêzere, José L.; Lajas, Sara; Melo, Raquel

    2017-07-01

    Approaches used to assess shallow slide susceptibility at the basin scale are conceptually different depending on the use of statistical or physically based methods. The former are based on the assumption that the same causes are more likely to produce the same effects, whereas the latter are based on the comparison between forces which tend to promote movement along the slope and the counteracting forces that are resistant to motion. Within this general framework, this work tests two hypotheses: (i) although conceptually and methodologically distinct, the statistical and deterministic methods generate similar shallow slide susceptibility results regarding the model's predictive capacity and spatial agreement; and (ii) the combination of shallow slide susceptibility maps obtained with statistical and physically based methods, for the same study area, generate a more reliable susceptibility model for shallow slide occurrence. These hypotheses were tested at a small test site (13.9 km2) located north of Lisbon (Portugal), using a statistical method (the information value method, IV) and a physically based method (the infinite slope method, IS). The landslide susceptibility maps produced with the statistical and deterministic methods were combined into a new landslide susceptibility map. The latter was based on a set of integration rules defined by the cross tabulation of the susceptibility classes of both maps and analysis of the corresponding contingency tables. The results demonstrate a higher predictive capacity of the new shallow slide susceptibility map, which combines the independent results obtained with statistical and physically based models. Moreover, the combination of the two models allowed the identification of areas where the results of the information value and the infinite slope methods are contradictory. Thus, these areas were classified as uncertain and deserve additional investigation at a more detailed scale.
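
    The information value score for each factor class is the log ratio of the class's landslide density to the overall density; a compact sketch with invented counts:

        import numpy as np

        n_class = np.array([50000, 30000, 15000, 5000])   # pixels per slope class
        s_class = np.array([100, 240, 300, 160])          # landslide pixels per class

        iv = np.log((s_class / n_class) / (s_class.sum() / n_class.sum()))
        print(np.round(iv, 2))   # positive values mark classes prone to failure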

  6. Safety Assessment of Nuclear Power Plants for Liquefaction Consequences

    Directory of Open Access Journals (Sweden)

    Tamás János Katona

    2015-01-01

    Full Text Available In the case of some nuclear power plants constructed on soft-soil sites, liquefaction should be analysed as a beyond-design-basis hazard. The aims of the analysis are to define the post-event condition of the plant, identify plant vulnerabilities, and identify the necessary measures for accident management. In this paper, the methodology for analysing liquefaction effects at nuclear power plants is outlined. The procedure includes identification of the scope of the safety analysis and of the acceptable limit cases for plant structures having different roles from an accident management point of view. Considerations are made for identifying the dominating effects of liquefaction. The possibility of decoupling the analysis of liquefaction effects from the analysis of vibratory ground motion is discussed. It is shown in the paper that the practicable empirical methods for determining liquefaction susceptibility provide rather controversial results. Selection of a method for assessing soil behaviour that affects the integrity of structures requires specific considerations. The case of the nuclear power plant at Paks, Hungary, is used as an example to demonstrate the practical importance of the presented results and considerations.

  7. Power plant system assessment. Final report. SP-100 Program

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, R.V.; Atkins, D.F.; Bost, D.S.; Berman, B.; Clinger, D.A.; Determan, W.R.; Drucker, G.S.; Glasgow, L.E.; Hartung, J.A.; Harty, R.B.

    1983-10-31

    The purpose of this assessment was to provide system-level insights into 100-kWe-class space reactor electric systems. Using these insights, Rockwell was to select and perform conceptual design studies on a "most attractive" system that met the preliminary design goals and requirements of the SP-100 Program. About 4 of the 6 months were used in the selection process. The remaining 2 months were used for the system conceptual design studies. Rockwell completed these studies at the end of FY 1983. This report summarizes the results of the power plant system assessment and describes our choice for the most attractive system - the Rockwell SR-100G System (Space Reactor, 100 kWe, Growth) - a lithium-cooled UN-fueled fast reactor/Brayton turboelectric converter system.

  8. Rainfall Downscaling Conditional on Upper-air Variables: Assessing Rainfall Statistics in a Changing Climate

    Science.gov (United States)

    Langousis, Andreas; Deidda, Roberto; Marrocu, Marino; Kaleris, Vassilios

    2014-05-01

    Due to its intermittent and highly variable character, and the modeling parameterizations used, precipitation is one of the least well reproduced hydrologic variables by both Global Climate Models (GCMs) and Regional Climate Models (RCMs). This is especially the case at a regional level (where hydrologic risks are assessed) and at small temporal scales (e.g. daily) used to run hydrologic models. In an effort to remedy those shortcomings and assess the effect of climate change on rainfall statistics at hydrologically relevant scales, Langousis and Kaleris (2013) developed a statistical framework for simulation of daily rainfall intensities conditional on upper air variables. The developed downscaling scheme was tested using atmospheric data from the ERA-Interim archive (http://www.ecmwf.int/research/era/do/get/index), and daily rainfall measurements from western Greece, and was proved capable of reproducing several statistical properties of actual rainfall records, at both annual and seasonal levels. This was done solely by conditioning rainfall simulation on a vector of atmospheric predictors, properly selected to reflect the relative influence of upper-air variables on ground-level rainfall statistics. In this study, we apply the developed framework for conditional rainfall simulation using atmospheric data from different GCM/RCM combinations. This is done using atmospheric data from the ENSEMBLES project (http://ensembleseu.metoffice.com), and daily rainfall measurements for an intermediate-sized catchment in Italy; i.e. the Flumendosa catchment. Since GCM/RCM products are suited to reproduce the local climatology in a statistical sense (i.e. in terms of relative frequencies), rather than ensuring a one-to-one temporal correspondence between observed and simulated fields (i.e. as is the case for ERA-interim reanalysis data), we proceed in three steps: a) we use statistical tools to establish a linkage between ERA-Interim upper-air atmospheric forecasts and

  9. Of Disasters and Dragon Kings: A Statistical Analysis of Nuclear Power Incidents and Accidents.

    Science.gov (United States)

    Wheatley, Spencer; Sovacool, Benjamin; Sornette, Didier

    2017-01-01

    We perform a statistical study of risk in nuclear energy systems. This study provides and analyzes a data set that is twice the size of the previous best data set on nuclear incidents and accidents, comparing three measures of severity: the industry standard International Nuclear Event Scale, the Nuclear Accident Magnitude Scale of radiation release, and cost in U.S. dollars. The rate of nuclear accidents with cost above 20 MM 2013 USD, per reactor per year, has decreased from the 1970s until the present time. Along the way, the rate dropped significantly after Chernobyl (April 1986) and is expected to be roughly stable around a level of 0.003, suggesting an average of just over one event per year across the current global fleet. The distribution of costs appears to have changed following the Three Mile Island major accident (March 1979). The median cost became approximately 3.5 times smaller, but an extremely heavy tail emerged, being well described by a Pareto distribution with parameter α = 0.5-0.6. For instance, the cost of the two largest events, Chernobyl and Fukushima (March 2011), is equal to nearly five times the sum of the 173 other events. We also document a significant runaway disaster regime in both radiation release and cost data, which we associate with the "dragon-king" phenomenon. Since the major accident at Fukushima (March 2011) occurred recently, we are unable to quantify an impact of the industry response to this disaster. Excluding such improvements, in terms of costs, our range of models suggests that there is presently a 50% chance that (i) a Fukushima event (or larger) occurs every 60-150 years, and (ii) that a Three Mile Island event (or larger) occurs every 10-20 years. Further, even assuming that it is no longer possible to suffer an event more costly than Chernobyl or Fukushima, the expected annual cost and its standard error bracket the cost of a new plant. This highlights the importance of improvements not only immediately following

  10. Sea cliff instability susceptibility at regional scale: a statistically based assessment in the southern Algarve, Portugal

    Science.gov (United States)

    Marques, F. M. S. F.; Matildes, R.; Redweik, P.

    2013-12-01

    Sea cliff evolution is dominated by the occurrence of slope mass movements of different types and sizes, which are a considerable source of natural hazard, making their assessment a relevant issue in terms of human loss prevention and land use regulations. To address the assessment of the spatial component of sea cliff hazards, i.e. the susceptibility, a statistically based study was made to assess the capacity of a set of conditioning factors to express the occurrence of sea cliff failures affecting areas located along their top. The study was based on the application of the bivariate information value and multivariate logistic regression statistical methods, using a set of predisposing factors for cliff failures, mainly related to geology (lithology, bedding dip, faults) and geomorphology (maximum and mean slope, height, aspect, plan curvature, toe protection), which were correlated with a photogrammetry-based inventory of cliff failures that occurred in a 60 yr period (1947-2007). The susceptibility models were validated against the inventory data using standard success rate and ROC curves, and provided encouraging results, indicating that the proposed approaches are effective for susceptibility assessment. The results obtained also stress the need for improvement of the predisposing factors to be used in this type of study and the need for detailed and systematic cliff failure inventories.
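
    A generic miniature of the multivariate step (synthetic factors, not the Algarve data): fit a logistic regression of failure occurrence on predisposing factors and check discrimination with the area under the ROC curve.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(9)
        X = rng.normal(size=(500, 3))                  # e.g. slope, height, curvature
        p = 1 / (1 + np.exp(-(0.8 * X[:, 0] + 0.5 * X[:, 1] - 1.0)))
        y = rng.binomial(1, p)                         # 1 = cliff-top failure area

        model = LogisticRegression().fit(X, y)
        print("AUC:", round(roc_auc_score(y, model.predict_proba(X)[:, 1]), 3))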

  11. New statistical potential for quality assessment of protein models and a survey of energy functions

    Directory of Open Access Journals (Sweden)

    Rykunov Dmitry

    2010-03-01

    Full Text Available Abstract Background Scoring functions, such as molecular mechanics force fields and statistical potentials, are fundamentally important tools in protein structure modeling and quality assessment. Results The performances of a number of publicly available scoring functions are compared with statistical rigor, with an emphasis on knowledge-based potentials. We explored the effect on accuracy of alternative choices for representing interaction center types and other features of scoring functions, such as using information on solvent accessibility and torsion angles, and accounting for secondary structure preferences and side chain orientation. Partially based on these observations, we present a novel residue-based statistical potential, which employs a shuffled reference state definition and takes into account the mutual orientation of residue side chains. The atom- and residue-level statistical potentials and Linux executables to calculate the energy of a given protein proposed in this work can be downloaded from http://www.fiserlab.org/potentials. Conclusions Among the most influential terms, we observed a critical role of a proper reference state definition and the benefits of including information about the microenvironment of interaction centers. Molecular mechanical potentials were also tested and found to be over-sensitive to small local imperfections in a structure, requiring unfeasibly long energy relaxation before energy scores started to correlate with model quality.
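
    At the heart of such knowledge-based potentials is Boltzmann inversion: a state's energy is minus the log of its observed frequency relative to a reference state. With placeholder frequencies (not values from the published potential):

        import numpy as np

        observed = np.array([0.10, 0.30, 0.45, 0.15])   # contact-type frequencies in structures
        reference = np.array([0.25, 0.25, 0.25, 0.25])  # reference-state frequencies (uniform here)

        energy = -np.log(observed / reference)          # in units of kT
        print(np.round(energy, 2))                      # negative => favorable interaction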

  12. Implementation of Statistics Textbook Support with ICT and Portfolio Assessment Approach to Improve Students Teacher Mathematical Connection Skills

    Science.gov (United States)

    Hendikawati, P.; Dewi, N. R.

    2017-04-01

    Statistics is needed in data analysis and is widely applied in daily life, so students must master statistical material well. The use of a Statistics textbook supported with ICT and a portfolio assessment approach was expected to help students improve their mathematical connection skills. The subjects of this research were 30 student teachers taking Statistics courses. The results show that the use of a Statistics textbook supported with ICT and a portfolio assessment approach can improve student teachers' mathematical connection skills.

  13. Use assessment of electronic power sources for SMAW

    Directory of Open Access Journals (Sweden)

    Scotti, A.

    1999-04-01

    Full Text Available The aim of the present work was to assess the efficacy of using modern power-supply technologies in Shielded Metal Arc Welding (SMAW). Test coupons were welded using a series of five different classes of commercial electrodes, covering their current ranges. Both a conventional electromagnetic power source and an electronic (inverter) power source were employed. Fusion rate, deposition efficiency, bead finish and weld geometry were measured in each experiment. Current and voltage signals were acquired at a high sampling rate to evaluate the dynamic behavior of the power sources. The static performances of both power sources were also determined. The results showed that, despite the remarkable differences between the power supplies in their static and dynamic characterizations, no significant difference was noticed in the operational behavior of the electrodes under the given conditions, apart from a better anti-stick performance obtained with the electronic power source.

  14. The Development of Statistics Textbook Supported with ICT and Portfolio-Based Assessment

    Science.gov (United States)

    Hendikawati, Putriaji; Yuni Arini, Florentina

    2016-02-01

    This was development research aimed at designing and producing a model Statistics textbook supported by information and communication technology (ICT) and portfolio-based assessment. The book was designed for mathematics students at the college level, to improve their ability in mathematical connection and communication. The research comprised three stages: define, design, and develop. The textbook consists of 10 chapters, each containing an introduction and core material, including examples and exercises. The development phase began with an initial draft of the book (draft 1), which was then validated by experts. Revision of draft 1 produced draft 2, which underwent a limited readability test. Revision of draft 2 produced draft 3, which was trialled on a small sample to produce a valid model textbook. The data were analysed with descriptive statistics. The analysis showed that the Statistics textbook model supported by ICT and portfolio-based assessment is valid and fulfils the practicality criteria.

  15. Statistical analysis for assessing shallow-landslide susceptibility in South Tyrol (south-eastern Alps, Italy)

    Science.gov (United States)

    Piacentini, Daniela; Troiani, Francesco; Soldati, Mauro; Notarnicola, Claudia; Savelli, Daniele; Schneiderbauer, Stefan; Strada, Claudia

    2012-05-01

    This paper conducts a statistical analysis to determine shallow-landslide susceptibility in an approximately 7500-km2 region of the south-eastern Alps (South Tyrol, Italy). The study applies the weight of evidence (WofE) method, which is useful in determining landslide susceptibility in large areas with complex geological and geomorphological settings. The statistical analysis and landslide susceptibility mapping are based on 882 past landslides, three geometric/topographic factors and two anthropogenic factors, which are the most relevant landslide predisposing factors. The quality of the proposed model, particularly the fitting performance, was assessed; the landslide database was divided into a training set to obtain the model and a validation set to estimate the model quality. The results show that the developed susceptibility model predicts an acceptable percentage (75%) of landslides. Therefore, the model can be useful and reliable for land planners and decision makers also due to its cost-effectiveness ratio.

  16. Statistical Approaches Used to Assess the Equity of Access to Food Outlets: A Systematic Review

    Directory of Open Access Journals (Sweden)

    Karen E. Lamb

    2015-07-01

    Full Text Available Background: Inequalities in eating behaviours are often linked to the types of food retailers accessible in neighbourhood environments. Numerous studies have aimed to identify if access to healthy and unhealthy food retailers is socioeconomically patterned across neighbourhoods, and thus a potential risk factor for dietary inequalities. Existing reviews have examined differences between methodologies, particularly focussing on neighbourhood and food outlet access measure definitions. However, no review has informatively discussed the suitability of the statistical methodologies employed; a key issue determining the validity of study findings. Our aim was to examine the suitability of statistical approaches adopted in these analyses. Methods: Searches were conducted for articles published from 2000-2014. Eligible studies included objective measures of the neighbourhood food environment and neighbourhood-level socio-economic status, with a statistical analysis of the association between food outlet access and socio-economic status. Results: Fifty-four papers were included. Outlet accessibility was typically defined as the distance to the nearest outlet from the neighbourhood centroid, or as the number of food outlets within a neighbourhood (or buffer). To assess if these measures were linked to neighbourhood disadvantage, common statistical methods included ANOVA, correlation, and Poisson or negative binomial regression. Although all studies involved spatial data, few considered spatial analysis techniques or spatial autocorrelation. Conclusions: With advances in GIS software, sophisticated measures of neighbourhood outlet accessibility can be considered. However, approaches to statistical analysis often appear less sophisticated. Care should be taken to consider assumptions underlying the analysis and the possibility of spatially correlated residuals which could affect the results.

  17. Assessing Variability of Complex Descriptive Statistics in Monte Carlo Studies using Resampling Methods.

    Science.gov (United States)

    Boos, Dennis D; Osborne, Jason A

    2015-08-01

    Good statistical practice dictates that summaries in Monte Carlo studies should always be accompanied by standard errors. Those standard errors are easy to provide for summaries that are sample means over the replications of the Monte Carlo output: for example, bias estimates, power estimates for tests, and mean squared error estimates. But often more complex summaries are of interest: medians (often displayed in boxplots), sample variances, ratios of sample variances, and non-normality measures like skewness and kurtosis. In principle standard errors for most of these latter summaries may be derived from the Delta Method, but that extra step is often a barrier for standard errors to be provided. Here we highlight the simplicity of using the jackknife and bootstrap to compute these standard errors, even when the summaries are somewhat complicated.
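
    The point about simplicity is easy to demonstrate: the bootstrap standard error of a Monte Carlo summary such as a median takes only a few lines. The sketch below uses synthetic Monte Carlo output; it is an illustration, not the authors' code.

      import numpy as np

      def bootstrap_se(x, stat=np.median, n_boot=2000, seed=0):
          """Bootstrap standard error of a summary of Monte Carlo output."""
          rng = np.random.default_rng(seed)
          x = np.asarray(x)
          reps = np.array([stat(rng.choice(x, size=x.size, replace=True))
                           for _ in range(n_boot)])
          return reps.std(ddof=1)

      rng = np.random.default_rng(1)
      mc_output = rng.standard_normal(1000) ** 2   # e.g. 1000 squared-error replications
      print(f"median = {np.median(mc_output):.3f}, "
            f"SE = {bootstrap_se(mc_output):.3f}")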

  18. Testing University Rankings Statistically: Why this Perhaps is not such a Good Idea after All. Some Reflections on Statistical Power, Effect Size, Random Sampling and Imaginary Populations

    DEFF Research Database (Denmark)

    Schneider, Jesper Wiborg

    2012-01-01

    In this paper we discuss and question the use of statistical significance tests in relation to university rankings as recently suggested. We outline the assumptions behind and interpretations of statistical significance tests and relate this to examples from the recent SCImago Institutions Rankin...

  19. Wind power potential assessment for three locations in Algeria

    Energy Technology Data Exchange (ETDEWEB)

    Himri, Y. [Electricity and Gas National Enterprise (Sonelgaz), Bechar (Algeria); Rehman, S. [Engineering Analysis Section, Center for Engineering Research, Research Institute, King Fahd University of Petroleum and Minerals, Box 767, Dhahran 31261 (Saudi Arabia); Draoui, B. [Department of Mechanical Engineering, University of Bechar (Algeria); Himri, S. [Department of Fundamental Sciences, University of Bechar (Algeria)

    2008-12-15

    This paper utilized wind speed data over a period of almost 10 years between 1977 and 1988 from three stations, namely Adrar, Timimoun and Tindouf, to assess the wind power potential at these sites. The long-term annual mean wind speed values, along with the wind turbine power curve values, were used to estimate the annual energy output for a 30 MW installed capacity wind farm at each site. A total of 30 wind turbines, each of 1000 kW rated power, were used in the analysis. The long-term mean wind speed at Adrar, Timimoun and Tindouf was 5.9, 5.1 and 4.3 m/s at 10 m above ground level (AGL), respectively. Higher wind speeds were observed in the daytime between 09:00 and 18:00 h and relatively lower ones during the rest of the period. Wind farms of 30 MW installed capacity at Adrar, Timimoun and Tindouf, if developed, could produce 98,832, 78,138 and 56,040 MWh of electricity annually, taking into consideration temperature and pressure adjustment coefficients of about 6% and all other losses of about 10%. The plant capacity factors at Adrar, Timimoun and Tindouf were found to be 38%, 30% and 21%, respectively. Finally, the cost of energy (COE) was found to be 3.1, 4.3 and 6.6 US cents/kWh at Adrar, Timimoun and Tindouf, respectively. It was noticed that such a development at these sites could result in the avoidance of 48,577, 38,406 and 27,544 tons/year of CO2-equivalent greenhouse gas (GHG) emissions entering the local atmosphere, thus creating a clean and healthy atmosphere for local inhabitants. (author)
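
    The quoted plant capacity factors follow directly from the reported annual outputs, since CF = E_annual / (P_rated x 8760 h); a quick check in Python:

      # Reproducing the capacity factors reported above
      for site, mwh in [("Adrar", 98_832), ("Timimoun", 78_138), ("Tindouf", 56_040)]:
          cf = mwh / (30 * 8760)           # 30 MW installed, 8760 h per year
          print(f"{site}: CF = {cf:.0%}")  # -> 38%, 30%, 21%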

  20. Independent assessment to continue improvement: Implementing statistical process control at the Hanford Site

    Energy Technology Data Exchange (ETDEWEB)

    Hu, T.A.; Lo, J.C.

    1994-11-01

    A Quality Assurance independent assessment has brought about continued improvement in the PUREX Plant surveillance program at the Department of Energy's Hanford Site. After the independent assessment, Quality Assurance personnel were closely involved in improving the surveillance program, specifically regarding storage tank monitoring. The independent assessment activities included reviewing procedures, analyzing surveillance data, conducting personnel interviews, and communicating with management. Process improvement efforts included: (1) designing data collection methods; (2) gaining concurrence between engineering and management; (3) revising procedures; and (4) interfacing with shift surveillance crews. Through this process, Statistical Process Control (SPC) was successfully implemented and surveillance management was improved. The independent assessment identified several deficiencies within the surveillance system. These deficiencies can be grouped into two areas: (1) data recording and analysis and (2) handling of off-normal conditions. By using several independent assessment techniques, Quality Assurance was able to point out program weaknesses to senior management and present suggestions for improvements. SPC charting, as implemented by Quality Assurance, is an excellent tool for diagnosing the process, improving communication between team members, and providing a scientific database for management decisions. In addition, the surveillance procedure was substantially revised. The goals of this revision were to (1) strengthen the role of surveillance management, engineering and operators and (2) emphasize the importance of teamwork for each individual who performs a task. In this instance we believe that the value independent assessment adds to the system is the continuous improvement activity that follows the independent assessment. Excellence in teamwork between the independent assessment organization and the auditee is the key to continuing improvement.
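
    As an illustration of the kind of SPC charting described here, a minimal Shewhart individuals-chart sketch follows; the surveillance readings are hypothetical, and the constant 1.128 is the standard d2 value for moving ranges of size two.

      import numpy as np

      def individuals_chart_limits(x):
          """Center line and 3-sigma limits for a Shewhart individuals chart."""
          x = np.asarray(x, dtype=float)
          sigma_hat = np.abs(np.diff(x)).mean() / 1.128   # sigma from average moving range
          center = x.mean()
          return center - 3 * sigma_hat, center, center + 3 * sigma_hat

      readings = [71.2, 70.8, 71.5, 70.9, 71.1, 73.9, 71.0]   # hypothetical tank readings
      lcl, cl, ucl = individuals_chart_limits(readings)
      print([r for r in readings if not lcl <= r <= ucl])     # points to investigate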

  1. Assessment technique for acne treatments based on statistical parameters of skin thermal images.

    Science.gov (United States)

    Padilla-Medina, J Alfredo; León-Ordoñez, Francisco; Prado-Olivarez, Juan; Vela-Aguirre, Noe; Ramírez-Agundis, Agustin; Díaz-Carmona, Javier

    2014-04-01

    Acne vulgaris, an inflammatory disease with an excessive production of subdermal fat, modifies the dynamics of the bloodstream, and consequently the temperature, of the affected skin zone. A high percentage of this heat interchange is manifested as electromagnetic radiation at far-infrared wavelengths, which can be captured with a thermal imaging camera. A technique based on thermal image analysis for assessing the efficacy of acne vulgaris treatments is described. The procedure is based on computing statistical parameters of thermal images captured from the affected skin zone while it undergoes an acne treatment. The proposed technique was used to determine the skin thermal behavior according to acne severity levels at different stages of treatment. Infrared images of acne skin zones on eight patients, diagnosed with acne vulgaris and receiving one specific acne treatment, were registered weekly for 11 weeks. The infrared images were captured until no further improvement in the affected zones was detected. The obtained results suggest a direct relationship between the statistical parameters used, particularly first- and second-order statistics, and the acne vulgaris severity level in the affected zones.
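
    A small sketch of the kind of statistics involved, using a synthetic temperature region of interest in place of a real thermogram; the simple adjacent-pixel contrast is an illustrative stand-in for a full co-occurrence (second-order) analysis.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(10)
      roi = rng.normal(33.5, 0.4, size=(64, 64))   # synthetic skin temperatures, deg C

      # First-order statistics of the ROI temperature histogram
      first_order = {"mean": roi.mean(), "std": roi.std(),
                     "skew": stats.skew(roi.ravel()),
                     "kurtosis": stats.kurtosis(roi.ravel())}

      # A simple second-order measure: contrast of horizontally adjacent pixels
      # after quantising the image to 8 grey levels
      q = np.digitize(roi, np.linspace(roi.min(), roi.max(), 9)[1:-1])
      contrast = np.mean((q[:, :-1] - q[:, 1:]) ** 2)
      print(first_order, contrast)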

  2. The Novel Quantitative Technique for Assessment of Gait Symmetry Using Advanced Statistical Learning Algorithm

    Directory of Open Access Journals (Sweden)

    Jianning Wu

    2015-01-01

    Full Text Available The accurate identification of gait asymmetry is very beneficial to the assessment of at-risk gait in clinical applications. This paper investigated the application of a classification method based on a statistical learning algorithm to quantify gait symmetry, based on the assumption that the degree of intrinsic change in the dynamical system of gait is associated with different statistical distributions of gait variables between the left and right lower limbs; that is, discriminating small differences in similarity between the lower limbs is treated as recognizing their different probability distributions. The kinetic gait data of 60 participants were recorded using a strain gauge force platform during normal walking. The classification method is built on an advanced statistical learning algorithm, the support vector machine for binary classification, and is adopted to quantitatively evaluate gait symmetry. The experimental results showed that the proposed method could capture more of the intrinsic dynamic information hidden in gait variables and recognize right-left gait patterns with superior generalization performance. Moreover, the proposed technique could identify small but significant differences between the lower limbs that the traditional symmetry index method for gait does not detect. The proposed algorithm could become an effective tool for early identification of gait asymmetry in the elderly in clinical diagnosis.
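
    The idea of using classifier performance as a symmetry measure can be sketched briefly: if a classifier can tell left steps from right steps above chance, the two distributions differ. The stance features below are synthetic, and the pipeline is a generic stand-in for the authors' SVM design.

      import numpy as np
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      rng = np.random.default_rng(1)
      # Hypothetical per-step features (e.g. peak force, loading rate)
      left = rng.normal([650.0, 3.1], [25.0, 0.3], size=(120, 2))
      right = rng.normal([640.0, 3.0], [25.0, 0.3], size=(120, 2))  # mild asymmetry
      X = np.vstack([left, right])
      y = np.r_[np.zeros(120), np.ones(120)]                        # side labels

      clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
      acc = cross_val_score(clf, X, y, cv=5).mean()
      print(acc)   # near 0.5 suggests symmetry; well above 0.5 suggests asymmetry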

  3. Assessment on thermoelectric power factor in silicon nanowire networks

    Energy Technology Data Exchange (ETDEWEB)

    Lohn, Andrew J.; Kobayashi, Nobuhiko P. [Baskin School of Engineering, University of California Santa Cruz, CA (United States); Nanostructured Energy Conversion Technology and Research (NECTAR), Advanced Studies Laboratories, University of California Santa Cruz, NASA Ames Research Center, Moffett Field, CA (United States); Coleman, Elane; Tompa, Gary S. [Structured Materials Industries, Inc., Piscataway, NJ (United States)

    2012-01-15

    Thermoelectric devices based on three-dimensional networks of highly interconnected silicon nanowires were fabricated, and the parameters that contribute to the power factor, namely the Seebeck coefficient and electrical conductivity, were assessed. The large-area (2 cm x 2 cm) devices were fabricated at low cost utilizing a highly scalable process involving silicon nanowires grown on steel substrates. The temperature dependence of the Seebeck coefficient was found to be weak over the range of 20-80 °C, at approximately -400 μV/K for unintentionally doped devices and ±50 μV/K for p-type and n-type devices, respectively. (Copyright 2012 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  4. On-line Dynamic Security Assessment in Power Systems

    DEFF Research Database (Denmark)

    Weckesser, Johannes Tilman Gabriel

    angular divergence of a group of generators can cause critical voltage sags at certain buses in the system. In this thesis, assessment of such voltage sags using two types of sensitivities, which are derived from the algebraic network equations, is proposed. These sensitivities are derived after an in-depth study of the mechanism causing the voltage sags. The first sensitivity type is called load voltage sensitivity and allows identifying which bus voltages are affected by a change in rotor angle of a particular generator. The second proposed type is called generator power sensitivity, which provides ... A method for early prediction of critical voltage sags is described. The method's performance is compared to other prediction approaches. The results show that the proposed method succeeds in predicting critically low voltage sags early, accurately and consistently. An efficient on-line DSA not only identifies ...

  5. Assessment of Environmental External Effects in Power Generation

    DEFF Research Database (Denmark)

    Meyer, Henrik Jacob; Morthorst, Poul Erik; Ibsen, Liselotte Schleisner

    1996-01-01

    This report summarises some of the results achieved in a project carried out in Denmark in 1994 concerning externalities. The main objective was to identify, quantify and, if possible, monetise the external effects of energy production, especially in relation to renewable energy technologies. The report compares environmental externalities in the production of energy using renewable and non-renewable energy sources, respectively. The comparison is demonstrated on two specific case studies. The first case is the production of electricity based on wind power plants compared ... The different ways of producing energy are identified, the stress caused by the effect is assessed, and finally the monetary value of the damage is estimated. The method is applied to the local as well as the regional and global externalities.

  6. Network Theory Integrated Life Cycle Assessment for an Electric Power System

    Directory of Open Access Journals (Sweden)

    Heetae Kim

    2015-08-01

    Full Text Available In this study, we allocate the greenhouse gas (GHG) emissions of electricity transmission to consumers. As an allocation basis, we introduce energy distance. Energy distance takes into account the transmission load on the electricity system in addition to the amount of electricity consumed. As a case study, we estimate regional GHG emissions of electricity transmission loss in Chile. Life cycle assessment (LCA) is used to estimate the total GHG emissions of the Chilean electric power system, and the regional GHG emissions of transmission loss are calculated from this total. We construct the network model of the Chilean electric power grid as an undirected network with 466 nodes and 543 edges, holding the topology of the power grid based on the statistical record. We estimate the total annual GHG emissions of the Chilean electricity system at 23.07 Mt CO2-eq., of which 1.61 Mt CO2-eq. is attributable to transmission loss. The total energy distance for electricity transmission accounts for 12,842.10 TWh km based on the network analysis. We argue that when the GHG emissions of electricity transmission loss are estimated, the electricity transmission load should be considered separately. We propose network theory as a useful complement to LCA for complex allocation problems. Energy distance is especially useful on very large-scale electric power grids such as intercontinental transmission networks.
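
    One plausible reading of the energy-distance allocation (demand weighted by network distance to the source) can be shown on a toy three-node grid; the topology, demands and weighting below are illustrative assumptions, not the paper's Chilean model.

      import networkx as nx

      G = nx.Graph()   # edges weighted by line length (km)
      G.add_weighted_edges_from([("plant", "hub", 120),
                                 ("hub", "cityA", 80),
                                 ("hub", "cityB", 200)])
      demand_twh = {"cityA": 5.0, "cityB": 2.0}   # hypothetical regional demand

      dist = nx.shortest_path_length(G, source="plant", weight="weight")
      energy_distance = {c: twh * dist[c] for c, twh in demand_twh.items()}  # TWh km
      total = sum(energy_distance.values())
      shares = {c: ed / total for c, ed in energy_distance.items()}  # allocation weights
      print(energy_distance, shares)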

  7. Assessment of the beryllium lymphocyte proliferation test using statistical process control.

    Science.gov (United States)

    Cher, Daniel J; Deubner, David C; Kelsh, Michael A; Chapman, Pamela S; Ray, Rose M

    2006-10-01

    Despite more than 20 years of surveillance and epidemiologic studies using the beryllium blood lymphocyte proliferation test (BeBLPT) as a measure of beryllium sensitization (BeS) and as an aid for diagnosing subclinical chronic beryllium disease (CBD), improvements in specific understanding of the inhalation toxicology of CBD have been limited. Although epidemiologic data suggest that BeS and CBD risks vary by process/work activity, it has proven difficult to reach specific conclusions regarding the dose-response relationship between workplace beryllium exposure and BeS or subclinical CBD. One possible reason for this uncertainty could be misclassification of BeS resulting from variation in BeBLPT testing performance. The reliability of the BeBLPT, a biological assay that measures beryllium sensitization, is unknown. To assess the performance of four laboratories that conducted this test, we used data from a medical surveillance program that offered testing for beryllium sensitization with the BeBLPT. The study population was workers exposed to beryllium at various facilities over a 10-year period (1992-2001). Workers with abnormal results were offered diagnostic workups for CBD. Our analyses used a standard statistical technique, statistical process control (SPC), to evaluate test reliability. The study design involved a repeated measures analysis of BeBLPT results generated from the company-wide, longitudinal testing. Analytical methods included use of (1) statistical process control charts that examined temporal patterns of variation for the stimulation index, a measure of cell reactivity to beryllium; (2) correlation analysis that compared prior perceptions of BeBLPT instability to the statistical measures of test variation; and (3) assessment of the variation in the proportion of missing test results and how time periods with more missing data influenced SPC findings. During the period of this study, all laboratories displayed variation in test results that

  8. Using statistical analysis and artificial intelligence tools for automatic assessment of video sequences

    Science.gov (United States)

    Ekobo Akoa, Brice; Simeu, Emmanuel; Lebowsky, Fritz

    2014-01-01

    This paper proposes two novel approaches to Video Quality Assessment (VQA). Both approaches attempt to develop video evaluation techniques capable of replacing human judgment when rating video quality in subjective experiments. The underlying study consists of selecting fundamental quality metrics based on Human Visual System (HVS) models and using artificial intelligence solutions as well as advanced statistical analysis. This combination enables suitable video quality ratings while taking multiple quality metrics as input. The first method uses a neural-network-based machine learning process. The second method evaluates video quality using a non-linear regression model. The efficiency of the proposed methods is demonstrated by comparing their results with those of existing work on synthetic video artifacts. The results obtained by each method are compared with scores from a database of subjective experiments.

  9. Combining Multiple Hypothesis Testing with Machine Learning Increases the Statistical Power of Genome-wide Association Studies

    Science.gov (United States)

    Mieth, Bettina; Kloft, Marius; Rodríguez, Juan Antonio; Sonnenburg, Sören; Vobruba, Robin; Morcillo-Suárez, Carlos; Farré, Xavier; Marigorta, Urko M.; Fehr, Ernst; Dickhaus, Thorsten; Blanchard, Gilles; Schunk, Daniel; Navarro, Arcadi; Müller, Klaus-Robert

    2016-01-01

    The standard approach to the analysis of genome-wide association studies (GWAS) is based on testing each position in the genome individually for statistical significance of its association with the phenotype under investigation. To improve the analysis of GWAS, we propose a combination of machine learning and statistical testing that takes correlation structures within the set of SNPs under investigation in a mathematically well-controlled manner into account. The novel two-step algorithm, COMBI, first trains a support vector machine to determine a subset of candidate SNPs and then performs hypothesis tests for these SNPs together with an adequate threshold correction. Applying COMBI to data from a WTCCC study (2007) and measuring performance as replication by independent GWAS published within the 2008–2015 period, we show that our method outperforms ordinary raw p-value thresholding as well as other state-of-the-art methods. COMBI presents higher power and precision than the examined alternatives while yielding fewer false (i.e. non-replicated) and more true (i.e. replicated) discoveries when its results are validated on later GWAS studies. More than 80% of the discoveries made by COMBI upon WTCCC data have been validated by independent studies. Implementations of the COMBI method are available as a part of the GWASpi toolbox 2.0. PMID:27892471
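
    A schematic of the two-step idea on synthetic genotypes: SVM weights screen a candidate subset of SNPs, and the multiple-testing correction is then applied only over that subset. A t-test stands in for the association test, and all sizes and effect values are illustrative.

      import numpy as np
      from scipy import stats
      from sklearn.svm import LinearSVC

      rng = np.random.default_rng(2)
      n, p, k = 500, 2000, 50                             # samples, SNPs, screened subset
      X = rng.integers(0, 3, size=(n, p)).astype(float)   # genotypes coded 0/1/2
      beta = np.zeros(p); beta[:5] = 0.8                  # five truly associated SNPs
      y = (X @ beta + rng.normal(size=n) > np.median(X @ beta)).astype(int)

      # Step 1: keep the k SNPs with the largest absolute SVM weights
      svm = LinearSVC(C=0.1, max_iter=5000).fit(X, y)
      candidates = np.argsort(-np.abs(svm.coef_.ravel()))[:k]

      # Step 2: test only the candidates, Bonferroni over k rather than p
      pvals = np.array([stats.ttest_ind(X[y == 1, j], X[y == 0, j]).pvalue
                        for j in candidates])
      print(candidates[pvals < 0.05 / k])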

  11. Preliminary environmental assessment for the Satellite Power System (SPS). Revision 1. Volume 2. Detailed assessment

    Energy Technology Data Exchange (ETDEWEB)

    1980-01-01

    The Department of Energy (DOE) is considering several options for generating electrical power to meet future energy needs. The satellite power system (SPS), one of these options, would collect solar energy through a system of satellites in space and transfer this energy to earth. A reference system has been described that would convert the energy to microwaves and transmit the microwave energy via directive antennas to large receiving/rectifying antennas (rectennas) located on the earth. At the rectennas, the microwave energy would be converted into electricity. The potential environmental impacts of constructing and operating the satellite power system are being assessed as a part of the Department of Energy's SPS Concept Development and Evaluation Program. This report is Revision 1 of the Preliminary Environmental Assessment for the Satellite Power System published in October 1978. It refines and extends the 1978 assessment and provides a basis for a 1980 revision that will guide and support DOE recommendations regarding future SPS development. This is Volume 2 of two volumes. It contains the technical detail suitable for peer review and integrates information appearing in the documents referenced herein. The key environmental issues associated with the SPS concern human health and safety, ecosystems, climate, and electromagnetic systems interactions. In order to address these issues in an organized manner, five tasks are reported: (I) microwave-radiation health and ecological effects; (II) nonmicrowave health and ecological effects; (III) atmospheric effects; (IV) effects on communication systems due to ionospheric disturbance; and (V) electromagnetic compatibility. (WHK)

  12. Direct integration of intensity-level data from Affymetrix and Illumina microarrays improves statistical power for robust reanalysis

    Directory of Open Access Journals (Sweden)

    Turnbull Arran K

    2012-08-01

    Full Text Available Abstract Background Affymetrix GeneChips and Illumina BeadArrays are the most widely used commercial single-channel gene expression microarrays. Public data repositories are an extremely valuable resource, providing array-derived gene expression measurements from many thousands of experiments. Unfortunately, many of these studies are underpowered, and it is desirable to improve power by combining data from more than one study; we sought to determine whether platform-specific bias precludes direct integration of probe intensity signals for combined reanalysis. Results Using Affymetrix and Illumina data from the microarray quality control project, from our own clinical samples, and from additional publicly available datasets, we evaluated several approaches to directly integrate intensity-level expression data from the two platforms. After mapping probe sequences to Ensembl genes, we demonstrate that ComBat and cross-platform normalisation (XPN) significantly outperform mean-centering and distance-weighted discrimination (DWD) in terms of minimising inter-platform variance. In particular, we observed that DWD, a popular method used in a number of previous studies, removed systematic bias at the expense of genuine biological variability, potentially reducing legitimate biological differences in integrated datasets. Conclusion Normalised and batch-corrected intensity-level data from Affymetrix and Illumina microarrays can be directly combined to generate biologically meaningful results with improved statistical power for robust, integrated reanalysis.

  13. Statistical Analysis of Wind Power Density Based on the Weibull and Rayleigh Models of Selected Site in Malaysia

    Directory of Open Access Journals (Sweden)

    Aliashim Albani

    2014-02-01

    Full Text Available The demand for electricity in Malaysia is growing in tandem with its Gross Domestic Product (GDP) growth. Malaysia is going to need even more energy as it strives to grow towards a high-income economy. Malaysia has taken steps towards exploring renewable energy (RE), including wind energy, as an alternative source for generating electricity. In the present study, the wind energy potential of each site is statistically analyzed based on one year of measured hourly time-series wind speed data. Wind data were obtained from the Malaysian Meteorological Department (MMD) weather stations at nine selected sites in Malaysia. The data were processed in MATLAB to determine and generate the Weibull and Rayleigh distribution functions. Both the Weibull and Rayleigh models were fitted and compared to the field data probability distributions for the year 2011. The analysis showed that the Weibull distribution fits the field data better than the Rayleigh distribution for the whole of 2011. The wind power density of every site was studied based on the Weibull and Rayleigh functions. The Weibull distribution provides a good approximation for estimating wind power density in Malaysia.
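
    A minimal sketch of the Weibull workflow: fit the shape k and scale c to hourly wind speeds, then compute the mean wind power density from WPD = 0.5 * rho * c^3 * Gamma(1 + 3/k). The wind series here is simulated rather than MMD station data.

      import numpy as np
      from scipy import stats
      from scipy.special import gamma

      v = stats.weibull_min.rvs(2.0, scale=6.5, size=8760, random_state=4)  # hourly wind, m/s

      k, _, c = stats.weibull_min.fit(v, floc=0)   # ML fit, location fixed at zero

      rho = 1.225                                  # air density, kg/m^3
      wpd = 0.5 * rho * c**3 * gamma(1 + 3 / k)    # mean wind power density, W/m^2
      print(f"k = {k:.2f}, c = {c:.2f} m/s, WPD = {wpd:.0f} W/m^2")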

  14. Probabilistic risk assessment framework for structural systems under multiple hazards using Bayesian statistics

    Energy Technology Data Exchange (ETDEWEB)

    Kwag, Shinyoung [North Carolina State University, Raleigh, NC 27695 (United States); Korea Atomic Energy Research Institute, Daejeon 305-353 (Korea, Republic of); Gupta, Abhinav, E-mail: agupta1@ncsu.edu [North Carolina State University, Raleigh, NC 27695 (United States)

    2017-04-15

    Highlights: • This study presents the development of a Bayesian framework for probabilistic risk assessment (PRA) of structural systems under multiple hazards. • The concepts of Bayesian network and Bayesian inference are combined by mapping the traditionally used fault trees into a Bayesian network. • The proposed mapping allows for consideration of dependencies as well as correlations between events. • Incorporation of Bayesian inference permits a novel way of exploring scenarios that are likely to result in system-level “vulnerability.” - Abstract: Conventional probabilistic risk assessment (PRA) methodologies (USNRC, 1983; IAEA, 1992; EPRI, 1994; Ellingwood, 2001) conduct risk assessment for different external hazards by considering each hazard separately and independently of the others. The risk metric for a specific hazard is evaluated by a convolution of the fragility and the hazard curves. The fragility curve for a basic event is obtained by using empirical, experimental, and/or numerical simulation data for a particular hazard. Treating hazards as independent can be inappropriate in some cases, as certain hazards are statistically correlated or dependent. Examples of such correlated events include, but are not limited to, flooding-induced fire, seismically induced internal or external flooding, or even seismically induced fire. In current practice, system-level risk and consequence sequences are typically calculated using logic trees to express the causative relationships between events. In this paper, we present the results from a study on multi-hazard risk assessment conducted using a Bayesian network (BN) with Bayesian inference. The framework can consider statistical dependencies among risks from multiple hazards, allows updating by considering newly available data/information at any level, and provides a novel way to explore alternative failure scenarios that may exist due to vulnerabilities.
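
    The dependency point can be made concrete with a toy two-hazard network (quake -> fire -> failure) evaluated by enumeration; all probabilities below are illustrative, not values from the study.

      p_quake = 0.01
      p_fire_given_quake = {True: 0.30, False: 0.02}
      p_fail_given = {(True, True): 0.90, (True, False): 0.40,
                      (False, True): 0.50, (False, False): 0.001}

      # Exact enumeration over the two parent events
      p_fail = sum(
          (p_quake if q else 1 - p_quake)
          * (p_fire_given_quake[q] if f else 1 - p_fire_given_quake[q])
          * p_fail_given[(q, f)]
          for q in (True, False) for f in (True, False)
      )
      print(p_fail)   # joint risk with the quake-fire dependency intact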

  15. Development and statistical assessment of a paper-based immunoassay for detection of tumor markers

    Energy Technology Data Exchange (ETDEWEB)

    Mazzu-Nascimento, Thiago [Instituto de Química de São Carlos, Universidade de São Paulo, 13566-590, São Carlos, SP (Brazil); Instituto Nacional de Ciência e Tecnologia de Bioanalítica, Campinas, SP (Brazil); Morbioli, Giorgio Gianini [Instituto de Química de São Carlos, Universidade de São Paulo, 13566-590, São Carlos, SP (Brazil); Instituto Nacional de Ciência e Tecnologia de Bioanalítica, Campinas, SP (Brazil); School of Chemistry and Biochemistry, Georgia Institute of Technology, Atlanta, GA 30332 (United States); Milan, Luis Aparecido [Departamento de Estatística, Universidade Federal de São Carlos, São Carlos, SP (Brazil); Donofrio, Fabiana Cristina [Instituto de Ciências da Saúde, Universidade Federal de Mato Grosso, 78557-267, Sinop, MT (Brazil); Mestriner, Carlos Alberto [Wama Produtos para Laboratório Ltda, 13560-971, São Carlos, SP (Brazil); Carrilho, Emanuel, E-mail: emanuel@iqsc.usp.br [Instituto de Química de São Carlos, Universidade de São Paulo, 13566-590, São Carlos, SP (Brazil); Instituto Nacional de Ciência e Tecnologia de Bioanalítica, Campinas, SP (Brazil)

    2017-01-15

    Paper-based assays are an attractive low-cost option for clinical chemistry testing, due to characteristics such as short analysis times, low consumption of samples and reagents, and high portability. However, little attention has been given to evaluating the performance of these simple tests, which should include a statistical approach to choosing the best cut-off value for the test. The choice of cut-off value affects the sensitivity and specificity of a bioassay. Here, we developed a paper-based immunoassay for the detection of the carcinoembryonic antigen (CEA) and performed a statistical assessment to establish the assay's cut-off value using Youden's J index (68.28 A.U.), which allowed for gains in sensitivity (0.86) and specificity (1.0). We also discuss the importance of defining a gray zone as a safety margin for the test (±12% around the cut-off value), eliminating all false positive and false negative outcomes and avoiding misleading results. The test accuracy was calculated as the area under the curve (AUC) of the receiver operating characteristic (ROC) curve, with a value of 0.97, which classifies this test as highly accurate. We propose here a low-cost method capable of detecting carcinoembryonic antigen (CEA) in human serum samples, highlighting the importance of statistical tools in evaluating a new low-cost diagnostic method. - Highlights: • A paper-based sandwich immunoassay protocol for detection of tumor markers. • A statistical approach to defining cut-off values and measuring the test's sensitivity, specificity and accuracy. • A simple way to create a gray zone, avoiding false positive and false negative outcomes.
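
    A sketch of the cut-off selection: Youden's J (sensitivity + specificity - 1) is maximised over the ROC thresholds, and a ±12% grey zone is placed around the chosen cut-off as described above. The assay intensities are simulated.

      import numpy as np
      from sklearn.metrics import roc_auc_score, roc_curve

      rng = np.random.default_rng(5)
      neg = rng.normal(55, 8, 60)    # hypothetical intensities, negative sera (A.U.)
      pos = rng.normal(80, 8, 40)    # hypothetical intensities, positive sera (A.U.)
      scores = np.r_[neg, pos]
      labels = np.r_[np.zeros(60), np.ones(40)]

      fpr, tpr, thr = roc_curve(labels, scores)
      cut = thr[np.argmax(tpr - fpr)]          # threshold maximising Youden's J
      grey = (0.88 * cut, 1.12 * cut)          # +/-12% grey zone
      print(roc_auc_score(labels, scores), cut, grey)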

  16. Assessment of the individual fracture risk of the proximal femur by using statistical appearance models.

    Science.gov (United States)

    Schuler, Benedikt; Fritscher, Karl D; Kuhn, Volker; Eckstein, Felix; Link, Thomas M; Schubert, Rainer

    2010-06-01

    Standard diagnostic techniques to quantify bone mineral density (BMD) include dual-energy x-ray absorptiometry (DXA) and quantitative computed tomography. However, BMD alone is not sufficient to predict the fracture risk for an individual patient. Therefore, the development of tools, which can assess the bone quality in order to predict individual biomechanics of a bone, would mean a significant improvement for the prevention of fragility fractures. In this study, a new approach to predict the fracture risk of proximal femora using a statistical appearance model will be presented. 100 CT data sets of human femur cadaver specimens are used to create statistical appearance models for the prediction of the individual fracture load (FL). Calculating these models offers the possibility to use information about the inner structure of the proximal femur, as well as geometric properties of the femoral bone for FL prediction. By applying principal component analysis, statistical models have been calculated in different regions of interest. For each of these models, the individual model parameters for each single data set were calculated and used as predictor variables in a multilinear regression model. By this means, the best working region of interest for the prediction of FL was identified. The accuracy of the FL prediction was evaluated by using a leave-one-out cross validation scheme. Performance of DXA in predicting FL was used as a standard of comparison. The results of the evaluative tests demonstrate that significantly better results for FL prediction can be achieved by using the proposed model-based approach (R = 0.91) than using DXA-BMD (R = 0.81) for the prediction of fracture load. The results of the evaluation show that the presented model-based approach is very promising and also comparable to studies that partly used higher image resolutions for bone quality assessment and fracture risk prediction.
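
    A schematic of the model-based prediction, with random features standing in for the appearance-model data: principal-component scores feed a multilinear regression, evaluated with a leave-one-out scheme as in the study.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LinearRegression
      from sklearn.model_selection import LeaveOneOut, cross_val_predict
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(6)
      X = rng.normal(size=(100, 500))     # stand-in for image-derived features per femur
      fl = X[:, :3] @ np.array([900.0, -400.0, 250.0]) + 4000 \
           + rng.normal(0, 300, 100)      # synthetic fracture loads, N

      model = make_pipeline(PCA(n_components=10), LinearRegression())
      pred = cross_val_predict(model, X, fl, cv=LeaveOneOut())
      print(np.corrcoef(pred, fl)[0, 1])  # analogue of the reported correlation R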

  17. Assessment of Zero Power Critical Experiments and Needs for a Fission Surface Power System

    Energy Technology Data Exchange (ETDEWEB)

    Jim R. Parry; John Darrell Bess; Brad T. Rearden; Gary A. Harms

    2009-06-01

    The National Aeronautics and Space Administration (NASA) is providing funding to the Department of Energy (DOE) to assess, develop, and test nuclear technologies that could provide surface power to a lunar outpost. Sufficient testing of this fission surface power (FSP) system will need to be completed to enable a decision by NASA on flight development. The near-term goal of the FSP work is to conduct the minimum amount of testing needed to validate the system performance within an acceptable risk. This report attempts to assess current modeling capabilities and quantify any bias associated with the modeling methods for designing the nuclear reactor. The baseline FSP system is a sodium-potassium (NaK) cooled, fast-spectrum reactor with 93% 235U-enriched HEU-O2 fuel, SS316 cladding, and beryllium reflectors with B4C control drums. The FSP is to produce approximately 40 kWe net power with a lifetime of at least 8 years at full power. A flight-ready FSP is to be ready for launch and deployment by 2020. Existing benchmarks from the International Criticality Safety Benchmark Evaluation Program (ICSBEP) were reviewed and modeled in MCNP. An average bias of less than 0.6% was determined using the ENDF/B-VII cross-section libraries, except in the case of subcritical experiments, which exhibited an average bias of approximately 1.5%. The bias increases with increasing reflector worth of the beryllium. The uncertainties and sensitivities in cross-section data for the FSP model and ZPPR-20 configurations were assessed using TSUNAMI-3D. The cross-section covariance uncertainty in the FSP model was calculated as 2.09%, dominated by the uncertainty in the 235U(n,?) reactions. Global integral indices were generated in TSUNAMI-IP using pre-release SCALE 6 cross-section covariance data. The ZPPR-20 benchmark models exhibit strong similarity with the FSP model. A penalty assessment was performed to determine the degree to which the FSP model could not be characterized

  18. A statistical assessment of population trends for data deficient Mexican amphibians

    Directory of Open Access Journals (Sweden)

    Esther Quintero

    2014-12-01

    Full Text Available Background. Mexico has the world's fifth largest population of amphibians and the second-highest number of threatened amphibian species of any country. About 10% of Mexican amphibians lack sufficient data to be assigned to a risk category by the IUCN, so in this paper we test a statistical tool that, in the absence of specific demographic data, can assess a species' extinction risk and population trend, and help to better understand which variables increase vulnerability. Recent studies have demonstrated that the risk of species decline depends on both extrinsic and intrinsic traits, so including both when assessing extinction risk should yield a more accurate assessment of threats. Methods. We harvested data from the Encyclopedia of Life (EOL) and the published literature for Mexican amphibians, and used these data to assess the population trends of some of the Mexican species assigned to the Data Deficient category of the IUCN using Random Forests, a machine learning method that gives predictions of complex processes and identifies the most important variables that account for the predictions. Results. Our results show that most of the data deficient Mexican amphibians that we used have decreasing population trends. We found that Random Forests is a solid way to identify species with decreasing population trends when no demographic data are available. Moreover, we point to the most important variables that make species more vulnerable to extinction. This exercise is a very valuable first step in assigning conservation priorities for poorly known species.

  19. A statistical assessment of population trends for data deficient Mexican amphibians.

    Science.gov (United States)

    Quintero, Esther; Thessen, Anne E; Arias-Caballero, Paulina; Ayala-Orozco, Bárbara

    2014-01-01

    Background. Mexico has the world's fifth largest population of amphibians and the second-highest number of threatened amphibian species of any country. About 10% of Mexican amphibians lack sufficient data to be assigned to a risk category by the IUCN, so in this paper we test a statistical tool that, in the absence of specific demographic data, can assess a species' extinction risk and population trend, and help to better understand which variables increase vulnerability. Recent studies have demonstrated that the risk of species decline depends on both extrinsic and intrinsic traits, so including both when assessing extinction risk should yield a more accurate assessment of threats. Methods. We harvested data from the Encyclopedia of Life (EOL) and the published literature for Mexican amphibians, and used these data to assess the population trends of some of the Mexican species assigned to the Data Deficient category of the IUCN using Random Forests, a machine learning method that gives predictions of complex processes and identifies the most important variables that account for the predictions. Results. Our results show that most of the data deficient Mexican amphibians that we used have decreasing population trends. We found that Random Forests is a solid way to identify species with decreasing population trends when no demographic data are available. Moreover, we point to the most important variables that make species more vulnerable to extinction. This exercise is a very valuable first step in assigning conservation priorities for poorly known species.
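
    A compact sketch of this use of Random Forests, with made-up species traits: the out-of-bag score checks the model, feature importances point to the vulnerability drivers, and species without trend data receive a predicted probability of decline.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(7)
      # Hypothetical traits: body size, log range size, habitat breadth, elevation
      X = rng.normal(size=(300, 4))
      y = (X[:, 1] + 0.5 * X[:, 2] + rng.normal(0, 1, 300) < 0).astype(int)  # 1 = declining

      rf = RandomForestClassifier(n_estimators=500, oob_score=True,
                                  random_state=0).fit(X, y)
      print(rf.oob_score_)            # internal accuracy estimate
      print(rf.feature_importances_)  # which traits drive the prediction
      print(rf.predict_proba(rng.normal(size=(5, 4)))[:, 1])  # data-deficient species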

  20. Independent Orbiter Assessment (IOA): Assessment of the electrical power generation/power reactant storage and distribution subsystem FMEA/CIL

    Science.gov (United States)

    Ames, B. E.

    1988-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA effort first completed an analysis of the Electrical Power Generation/Power Reactant Storage and Distribution (EPG/PRSD) subsystem hardware, generating draft failure modes and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The IOA results were then compared to the NASA FMEA/CIL baselines with proposed Post 51-L updates included. A resolution of each discrepancy from the comparison is provided through additional analysis as required. The results of that comparison are documented for the Orbiter EPG/PRSD hardware. The comparison produced agreement on all but 27 FMEAs and 9 CIL items. The discrepancy between the number of IOA findings and NASA FMEAs can be partially explained by the different approaches used by IOA and NASA to group failure modes together to form one FMEA. Also, several IOA items represented inner tank components and ground operations failure modes which were not in the NASA baseline.

  1. Assessment of metals bioavailability to vegetables under field conditions using DGT, single extractions and multivariate statistics

    Directory of Open Access Journals (Sweden)

    Senila Marin

    2012-10-01

    Full Text Available Abstract Background The metals bioavailability in soils is commonly assessed by chemical extractions; however, a generally accepted method is not yet established. In this study, the effectiveness of the Diffusive Gradients in Thin-films (DGT) technique and single extractions in assessing metal bioaccumulation in vegetables, and the influence of soil parameters on phytoavailability, were evaluated using multivariate statistics. Soil and plants grown in vegetable gardens from mining-affected rural areas of NW Romania were collected and analysed. Results Pseudo-total metal contents of Cu, Zn and Cd in soil ranged between 17.3-146 mg kg-1, 141-833 mg kg-1 and 0.15-2.05 mg kg-1, respectively, showing enriched contents of these elements. High degrees of metal extractability in 1M HCl, and even in 1M NH4Cl, were observed. Despite the relatively high total metal concentrations in soil, those found in vegetables were comparable to values typically reported for agricultural crops, probably due to the low concentrations of metals in soil solution (Csoln) and the low effective concentrations (CE) assessed by the DGT technique. Among the analysed vegetables, the highest metal concentrations were found in carrot roots. By applying multivariate statistics, it was found that CE, Csoln and extraction in 1M NH4Cl were better predictors of metal bioavailability than the acid extractions applied in this study. Copper transfer to vegetables was strongly influenced by soil organic carbon (OC) and cation exchange capacity (CEC), while pH had a higher influence on Cd transfer from soil to plants. Conclusions The results showed that DGT can be used for a general evaluation of the risks associated with soil contamination with Cu, Zn and Cd in field conditions, although quantitative information on metal transfer from soil to vegetables was not obtained.

  2. Groundwater vulnerability assessment: from overlay methods to statistical methods in the Lombardy Plain area

    Directory of Open Access Journals (Sweden)

    Stefania Stevenazzi

    2017-06-01

    Full Text Available Groundwater is among the most important freshwater resources. Worldwide, aquifers are experiencing an increasing threat of pollution from urbanization, industrial development, agricultural activities and mining enterprises. Practical actions, strategies and solutions to protect groundwater from these anthropogenic sources are therefore widely required. The most efficient tool for supporting land-use planning while protecting groundwater from contamination is groundwater vulnerability assessment. Over the years, several methods for assessing groundwater vulnerability have been developed: overlay and index methods, statistical methods and process-based methods. All methods are means to synthesize complex hydrogeological information into a unique document, a groundwater vulnerability map, usable by planners, decision and policy makers, geoscientists and the public. Although it is not possible to identify an approach that is best for all situations, the final product should always be scientifically defensible, meaningful and reliable. Nevertheless, various methods may produce very different results at any given site; thus, reasons for similarities and differences need to be investigated in depth. This study demonstrates the reliability and flexibility of a spatial statistical method for assessing groundwater vulnerability to contamination at a regional scale. The Lombardy Plain case study is particularly interesting for its long history of groundwater monitoring (quality and quantity), availability of hydrogeological data, and combined presence of various anthropogenic sources of contamination. Recent updates of the regional water protection plan have raised the necessity of producing more flexible, reliable and accurate groundwater vulnerability maps. A comparison of groundwater vulnerability maps obtained through different approaches and developed over a time span of several years has demonstrated the relevance of the

  3. Assessing socioeconomic vulnerability to dengue fever in Cali, Colombia: statistical vs expert-based modeling

    Science.gov (United States)

    2013-01-01

    Background As a result of changes in climatic conditions and greater resistance to insecticides, many regions across the globe, including Colombia, have been facing a resurgence of vector-borne diseases, and dengue fever in particular. Timely information on both (1) the spatial distribution of the disease and (2) prevailing vulnerabilities of the population is needed to adequately plan targeted preventive interventions. We propose a methodology for the spatial assessment of current socioeconomic vulnerabilities to dengue fever in Cali, a tropical urban environment of Colombia. Methods Based on a set of socioeconomic and demographic indicators derived from census data and ancillary geospatial datasets, we develop a spatial approach for both expert-based and purely statistical modeling of current vulnerability levels across 340 neighborhoods of the city using a Geographic Information System (GIS). The results of both approaches are comparatively evaluated by means of spatial statistics. A web-based approach is proposed to facilitate the visualization and dissemination of the output vulnerability index to the community. Results The statistical and the expert-based modeling approaches exhibit a high concordance, globally and spatially. The expert-based approach indicates a slightly higher vulnerability mean (0.53) and median (0.56) across all neighborhoods than the purely statistical approach (mean = 0.48; median = 0.49). Both approaches reveal that high vulnerability values tend to cluster in the eastern, north-eastern, and western parts of the city. These are poor neighborhoods with high percentages of young (i.e., < 15 years) and illiterate residents, as well as a high proportion of individuals who are either unemployed or doing housework. Conclusions Both modeling approaches reveal similar outputs, indicating that in the absence of local expertise, statistical approaches could be used, with caution. By decomposing identified

  4. Statistical assessment on a combined analysis of GRYN-ROMN-UCBN upland vegetation vital signs

    Science.gov (United States)

    Irvine, Kathryn M.; Rodhouse, Thomas J.

    2014-01-01

    As of 2013, Rocky Mountain and Upper Columbia Basin Inventory and Monitoring Networks have multiple years of vegetation data and Greater Yellowstone Network has three years of vegetation data and monitoring is ongoing in all three networks. Our primary objective is to assess whether a combined analysis of these data aimed at exploring correlations with climate and weather data is feasible. We summarize the core survey design elements across protocols and point out the major statistical challenges for a combined analysis at present. The dissimilarity in response designs between ROMN and UCBN-GRYN network protocols presents a statistical challenge that has not been resolved yet. However, the UCBN and GRYN data are compatible as they implement a similar response design; therefore, a combined analysis is feasible and will be pursued in future. When data collected by different networks are combined, the survey design describing the merged dataset is (likely) a complex survey design. A complex survey design is the result of combining datasets from different sampling designs. A complex survey design is characterized by unequal probability sampling, varying stratification, and clustering (see Lohr 2010 Chapter 7 for general overview). Statistical analysis of complex survey data requires modifications to standard methods, one of which is to include survey design weights within a statistical model. We focus on this issue for a combined analysis of upland vegetation from these networks, leaving other topics for future research. We conduct a simulation study on the possible effects of equal versus unequal probability selection of points on parameter estimates of temporal trend using available packages within the R statistical computing package. We find that, as written, using lmer or lm for trend detection in a continuous response and clm and clmm for visually estimated cover classes with “raw” GRTS design weights specified for the weight argument leads to substantially

  5. Assessing a learning process with functional ANOVA estimators of EEG power spectral densities.

    Science.gov (United States)

    Gutiérrez, David; Ramírez-Moreno, Mauricio A

    2016-04-01

    We propose to assess the process of learning a task using electroencephalographic (EEG) measurements. In particular, we quantify changes in brain activity associated with the progression of the learning experience through functional analysis-of-variance (FANOVA) estimators of the EEG power spectral density (PSD). Such functional estimators provide a sense of the effect of training on the EEG dynamics. For that purpose, we implemented an experiment to monitor the process of learning to type using the Colemak keyboard layout during a twelve-lesson training. Our aim is to identify statistically significant changes in the PSD of various EEG rhythms at different stages and difficulty levels of the learning process. Those changes are taken into account only when a probabilistic measure of the cognitive state ensures the high engagement of the volunteer in the training. On this basis, a series of statistical tests is performed to determine the personalized frequencies and sensors at which changes in PSD occur; the FANOVA estimates are then computed and analyzed. Our experimental results showed a significant decrease in the power of the [Formula: see text] and [Formula: see text] rhythms for ten volunteers during the learning process, and this decrease happens regardless of the difficulty of the lesson. These results are in agreement with previous reports of changes in PSD being associated with feature binding and memory encoding.
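
    The per-session PSD estimates underlying such an analysis are commonly obtained with Welch's method; the sketch below uses white noise in place of EEG and the 8-13 Hz band as an example rhythm.

      import numpy as np
      from scipy.integrate import trapezoid
      from scipy.signal import welch

      fs = 256                                    # sampling rate, Hz
      rng = np.random.default_rng(8)
      eeg = rng.normal(size=10 * fs)              # stand-in for one channel, 10 s

      f, psd = welch(eeg, fs=fs, nperseg=2 * fs)  # PSD estimate for one session

      band = (f >= 8) & (f <= 13)                 # example: 8-13 Hz
      print(trapezoid(psd[band], f[band]))        # band power to compare across sessions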

  6. The Theory of Separation of Powers in Nigeria: An Assessment ...

    African Journals Online (AJOL)

    The theory of separation of powers means that a different body of persons is to administer each of the three departments of government, and that no one of them is to have a controlling power over either of the others. For the purpose of preserving the liberty of the individual and avoiding tyranny, the separation of powers is ...

  7. Assessing power grid reliability using rare event simulation

    NARCIS (Netherlands)

    Wadman, W.S.

    2015-01-01

    Renewable energy generators such as wind turbines and solar panels supply more and more power in modern electrical grids. Although the transition to a sustainable power supply is desirable, considerable implementation of distributed and intermittent generators may strain the power grid. Since grid

  8. The Theory of Separation of Powers in Nigeria: An Assessment

    African Journals Online (AJOL)

    2012-07-26

    John Locke argued that legislative and executive powers were conceptually different, but that it was necessary to separate them in government institutions. In Locke's conception, however, judicial power played no significant role. The modern idea of the doctrine of separation of powers was vigorously ...

  9. Assessing landslide susceptibility by statistical data analysis and GIS: the case of Daunia (Apulian Apennines, Italy)

    Science.gov (United States)

    Ceppi, C.; Mancini, F.; Ritrovato, G.

    2009-04-01

    This study aims at landslide susceptibility mapping within an area of Daunia (Apulian Apennines, Italy) by means of a multivariate statistical method and data manipulation in a Geographical Information System (GIS) environment. Among the variety of existing statistical data analysis techniques, logistic regression was chosen to produce a susceptibility map over an area where small settlements are historically threatened by landslide phenomena. In logistic regression, a best fit between the presence or absence of landslides (dependent variable) and the set of independent variables is sought on the basis of a maximum likelihood criterion, leading to the estimation of regression coefficients. The reliability of such an analysis therefore rests on its ability to quantify proneness to landslides through the probability level produced by the analysis. The inventory of dependent and independent variables was managed in a GIS, where geometric properties and attributes were translated into raster cells in order to carry out the logistic regression by means of the SPSS (Statistical Package for the Social Sciences) package. A landslide inventory was used to produce the binary dependent variable, whereas the independent variables comprised slope, aspect, elevation, curvature, drained area, lithology and land use, after their reduction to dummy variables. The effect of the independent parameters on landslide occurrence was assessed through the corresponding coefficients in the logistic regression function, highlighting the major role played by the land use variable in determining the occurrence and distribution of phenomena. Once the outcomes of the logistic regression were determined, the data were re-introduced into the GIS to produce a map reporting proneness to landslides as a predicted probability level. As validation of the results and the regression model, a cell-by-cell comparison between the susceptibility map and the initial inventory of landslide events was
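
    A bare-bones version of the workflow on synthetic raster cells: dummy-coded factors enter a logistic regression, and the fitted probabilities form the susceptibility surface to be mapped back in the GIS. Factor names and coefficients are invented for illustration.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(9)
      n = 5000                                  # raster cells
      slope = rng.uniform(0, 40, n)             # degrees
      landuse = rng.integers(0, 3, n)           # 0 arable, 1 pasture, 2 forest
      X = np.column_stack([slope, landuse == 1, landuse == 2]).astype(float)
      logit = -4 + 0.08 * slope + 1.2 * (landuse == 1)
      y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)   # landslide presence

      lr = LogisticRegression().fit(X, y)
      print(lr.intercept_, lr.coef_)              # estimated regression coefficients
      susceptibility = lr.predict_proba(X)[:, 1]  # probability per cell, for mapping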

  10. Targeting change: Assessing a faculty learning community focused on increasing statistics content in life science curricula.

    Science.gov (United States)

    Parker, Loran Carleton; Gleichsner, Alyssa M; Adedokun, Omolola A; Forney, James

    2016-11-12

    Transformation of research in all biological fields necessitates the design, analysis, and interpretation of large data sets. Preparing students with the requisite skills in experimental design, statistical analysis and interpretation, and mathematical reasoning will require both curricular reform and faculty who are willing and able to integrate mathematical and statistical concepts into their life science courses. A new Faculty Learning Community (FLC) was constituted each year for four years to assist in the transformation of the life sciences curriculum and faculty at a large, Midwestern research university. Participants were interviewed after participation and surveyed before and after participation to assess the impact of the FLC on their attitudes toward teaching, perceived pedagogical skills, and planned teaching practice. Overall, the FLC had a meaningful positive impact on participants' attitudes toward teaching, knowledge about teaching, and perceived pedagogical skills. Interestingly, confidence for viewing the classroom as a site for research about teaching declined. Implications for the creation and development of FLCs for science faculty are discussed. © 2016 The International Union of Biochemistry and Molecular Biology, 44(6):517-525, 2016.

  11. Global benefit-risk assessment in designing clinical trials and some statistical considerations of the method.

    Science.gov (United States)

    Pritchett, Yili Lu; Tamura, Roy

    2008-01-01

    When characterizing a therapy, efficacy and safety are two major aspects under consideration. In prescribing a therapy to a patient, a clinician puts the two aspects together and makes a decision based on a consolidated thought process. The global benefit-risk (GBR) measures proposed by Chuang-Stein et al. (Stat. Med. 1991; 10:1349-1359) are useful in facilitating this thinking and in creating a framework for making statistical comparisons from a benefit-risk point of view. This article describes how a GBR linear score was defined and used as the primary outcome measure in a clinical trial design. The robustness of the definitions of 'benefit' and 'risk' is evaluated using different criteria. The sensitivity to the pre-specified weights is also analyzed using alternative weights, one of which was determined by the 'relative to an identified distribution' integral transformation (ridit) approach (Biometrics 1958; 14:18-38). The statistical considerations are illustrated using pooled data from clinical trials studying antidepressants. The pros and cons of using GBR assessments in the setting of clinical trials are discussed.
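
    As a toy illustration of a linear benefit-risk score of the general form score = w_b·benefit − w_r·risk (the outcome definitions and weights below are hypothetical, not those of the cited trial):

```python
# Per-patient linear global benefit-risk score; weights and outcomes are assumed.
import numpy as np

benefit = np.array([1, 0, 1, 1, 0])   # e.g., responder status per patient
risk = np.array([0, 1, 0, 1, 0])      # e.g., clinically relevant adverse event
w_benefit, w_risk = 1.0, 2.0          # pre-specified weights (assumed values)

gbr_score = w_benefit * benefit - w_risk * risk
print("mean GBR score:", gbr_score.mean())
```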

  12. Proper Assessment of the JFK Assassination Bullet Lead Evidence from Metallurgical and Statistical Perspectives

    Energy Technology Data Exchange (ETDEWEB)

    Randich, E; Grant, P M

    2006-08-29

    The bullet evidence in the JFK assassination investigation was reexamined from metallurgical and statistical standpoints. The questioned specimens are comprised of soft lead, possibly from full-metal-jacketed Mannlicher-Carcano, 6.5-mm ammunition. During lead refining, contaminant elements are removed to specified levels for a desired alloy or composition. Microsegregation of trace and minor elements during lead casting and processing can account for the experimental variabilities measured in various evidentiary and comparison samples by laboratory analysts. Thus, elevated concentrations of antimony and copper at crystallographic grain boundaries, the widely varying sizes of grains in Mannlicher-Carcano bullet lead, and the 5-60 mg bullet samples analyzed for assassination intelligence effectively resulted in operational sampling error for the analyses. This deficiency was not considered in the original data interpretation and resulted in an invalid conclusion in favor of the single-bullet theory of the assassination. Alternate statistical calculations, based on the historic analytical data, incorporating weighted averaging and propagation of experimental uncertainties also considerably weaken support for the single-bullet theory. In effect, this assessment of the material composition of the lead specimens from the assassination concludes that the extant evidence is consistent with any number between two and five rounds fired in Dealey Plaza during the shooting.

  13. Proper assessment of the JFK assassination bullet lead evidence from metallurgical and statistical perspectives.

    Science.gov (United States)

    Randich, Erik; Grant, Patrick M

    2006-07-01

    The bullet evidence in the JFK assassination investigation was reexamined from metallurgical and statistical standpoints. The questioned specimens are comprised of soft lead, possibly from full-metal-jacketed Mannlicher-Carcano (MC), 6.5-mm ammunition. During lead refining, contaminant elements are removed to specified levels for a desired alloy or composition. Microsegregation of trace and minor elements during lead casting and processing can account for the experimental variabilities measured in various evidentiary and comparison samples by laboratory analysts. Thus, elevated concentrations of antimony and copper at crystallographic grain boundaries, the widely varying sizes of grains in MC bullet lead, and the 5-60 mg bullet samples analyzed for assassination intelligence effectively resulted in operational sampling error for the analyses. This deficiency was not considered in the original data interpretation and resulted in an invalid conclusion in favor of the single-bullet theory of the assassination. Alternate statistical calculations, based on the historic analytical data, incorporating weighted averaging and propagation of experimental uncertainties also considerably weaken support for the single-bullet theory. In effect, this assessment of the material composition of the lead specimens from the assassination concludes that the extant evidence is consistent with any number between two and five rounds fired in Dealey Plaza during the shooting.
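
    The "weighted averaging and propagation of experimental uncertainties" mentioned in the two records above is standard inverse-variance weighting; a generic sketch with hypothetical values:

```python
# Inverse-variance weighted mean of replicate measurements with propagated
# 1-sigma uncertainty; the concentration values are illustrative only.
import numpy as np

conc = np.array([8.1, 9.0, 7.6])        # e.g., Sb concentrations (ppm)
sigma = np.array([0.9, 1.4, 0.7])       # 1-sigma analytical uncertainties

w = 1.0 / sigma**2
mean = np.sum(w * conc) / np.sum(w)      # weighted mean
mean_sigma = np.sqrt(1.0 / np.sum(w))    # propagated uncertainty of the mean
print(f"weighted mean = {mean:.2f} +/- {mean_sigma:.2f} ppm")
```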

  14. Assessing signal-to-noise in quantitative proteomics: multivariate statistical analysis in DIGE experiments.

    Science.gov (United States)

    Friedman, David B

    2012-01-01

    All quantitative proteomics experiments measure variation between samples. When performing large-scale experiments that involve multiple conditions or treatments, the experimental design should include the appropriate number of individual biological replicates from each condition to enable the distinction between a relevant biological signal from technical noise. Multivariate statistical analyses, such as principal component analysis (PCA), provide a global perspective on experimental variation, thereby enabling the assessment of whether the variation describes the expected biological signal or the unanticipated technical/biological noise inherent in the system. Examples will be shown from high-resolution multivariable DIGE experiments where PCA was instrumental in demonstrating biologically significant variation as well as sample outliers, fouled samples, and overriding technical variation that would not be readily observed using standard univariate tests.
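
    A minimal sketch of the kind of PCA screen described, on a synthetic spots-by-samples matrix (scikit-learn; all values are illustrative):

```python
# PCA of a replicates-by-features abundance matrix to expose dominant variation.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
# 12 biological replicates (rows) x 500 protein spot abundances (columns),
# with a deliberate group offset in the first 250 spots.
X = rng.normal(0, 1, (12, 500))
X[:6, :250] += 1.0                     # condition A vs. condition B signal

scores = PCA(n_components=2).fit_transform(X)
# Samples separating along PC1 reflect the dominant (here, biological) signal;
# an isolated point far from its group would suggest an outlier or fouled sample.
print(scores)
```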

  15. Sea cliff instability susceptibility at regional scale: A statistically based assessment in southern Algarve, Portugal.

    Science.gov (United States)

    Marques, F.; Matildes, R.; Redweik, P.

    2012-04-01

    Mass movements are the dominant process of sea cliff evolution, being a considerable source of natural hazard and a significant constraint on human activities in coastal areas. Related hazards include cliff top retreat, with implications for planning and land management, and unstable soil or rock movements at the cliff face and toe, with implications mainly for beach users and support structures. To assess the spatial component of sea cliff hazard with implications for planning, i.e. the susceptibility of a given cliff section to be affected by instabilities causing retreat of the cliff top, a statistically based study was carried out along the top of the sea cliffs of the Burgau-Lagos coastal section (Southwest Algarve, Portugal). The study was based on bivariate and multivariate statistics applied to a set of predisposing factors, mainly related to geology and geomorphology, which were correlated with an inventory of past cliff failures. The multi-temporal inventory of past cliff failures was produced using aerial digital photogrammetric methods, which included special procedures to enable the extraction of accurate data from old aerial photos, and was validated by systematic stereo photo interpretation, helped by oblique aerial photos and field surveys. This study identified 137 cliff failures that occurred between 1947 and 2007 along the 13 km of cliffs, causing the loss of 10,234 m2 of horizontal area at the cliff top. The cliff failures correspond to planar slides (58%), mainly in Cretaceous alternating limestones and marls; toppling failures (17%), mainly in Miocene calcarenites; slumps (15%) in Plio-Pleistocene silty sands that infill the karst in the Miocene rocks; the remaining 10% correspond to complex movements, rockfalls and undetermined cases. The spatial distribution of cliff failures is quite irregular but enables the objective separation of subsections with homogeneous retreat behavior, for which mean retreat rates between 5x10-3 m

  16. The price of electricity from private power producers: Stage 2, Expansion of sample and preliminary statistical analysis

    Energy Technology Data Exchange (ETDEWEB)

    Comnes, G.A.; Belden, T.N.; Kahn, E.P.

    1995-02-01

    The market for long-term bulk power is becoming increasingly competitive and mature. Given that many privately developed power projects have been or are being developed in the US, it is possible to begin to evaluate the performance of the market by analyzing its revealed prices. Using a consistent method, this paper presents levelized contract prices for a sample of privately developed US generation properties. The sample includes 26 projects with a total capacity of 6,354 MW. Contracts are described in terms of their choice of technology, choice of fuel, treatment of fuel price risk, geographic location, dispatchability, expected dispatch niche, and size. The contract price analysis shows that gas technologies clearly stand out as the most attractive. At an 80% capacity factor, coal projects have an average 20-year levelized price of $0.092/kWh, whereas natural gas combined cycle and/or cogeneration projects have an average price of $0.069/kWh. Within each technology type subsample, however, there is considerable variation. Prices for natural gas combustion turbines and one wind project are also presented. A preliminary statistical analysis is conducted to understand the relationship between price and four categories of explanatory factors including product heterogeneity, geographic heterogeneity, economic and technological change, and other buyer attributes (including avoided costs). Because of residual price variation, we are unable to accept the hypothesis that electricity is a homogeneous product. Instead, the analysis indicates that buyer value still plays an important role in the determination of price for competitively-acquired electricity.

  17. Multivariate statistical process control in product quality review assessment - A case study.

    Science.gov (United States)

    Kharbach, M; Cherrah, Y; Vander Heyden, Y; Bouklouze, A

    2017-11-01

    According to the Food and Drug Administration and the European Good Manufacturing Practices (GMP) guidelines, the Annual Product Review (APR) is a mandatory requirement in GMP. It consists of evaluating a large collection of qualitative or quantitative data in order to verify the consistency of an existing process. According to the Code of Federal Regulations (21 CFR 211.180), all finished products should be reviewed annually against their quality standards to determine the need for any change in the specification or manufacturing of drug products. Conventional Statistical Process Control (SPC) evaluates the pharmaceutical production process by examining the effect of only a single factor at a time using a Shewhart chart, and so neglects to take into account the interactions between variables. In order to overcome this issue, Multivariate Statistical Process Control (MSPC) can be used. Our case study concerns an APR assessment in which 164 historical batches containing six active ingredients, manufactured in Morocco, were collected during one year. Each batch was checked by assaying the six active ingredients by High Performance Liquid Chromatography according to European Pharmacopoeia monographs. The data matrix was evaluated both by SPC and by MSPC. The SPC indicated that all batches were under control, while the MSPC, based on Principal Component Analysis (PCA) with the data either autoscaled or robust-scaled, showed four and seven batches, respectively, outside the Hotelling T² 95% ellipse. An improvement in the capability of the process is also observed without the most extreme batches. MSPC can be used for monitoring subtle changes in the manufacturing process during an APR assessment. Copyright © 2017 Académie Nationale de Pharmacie. Published by Elsevier Masson SAS. All rights reserved.
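
    A sketch of the PCA-based Hotelling T² monitoring step described above, on synthetic stand-ins for the 164 batches of six assays (the 95% limit below uses the common F-distribution form; all data are illustrative):

```python
# PCA scores -> per-batch Hotelling T^2 and an approximate 95% control limit.
import numpy as np
from scipy.stats import f
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
X = rng.normal(0, 1, (164, 6))            # 164 batches x 6 active-ingredient assays
Xs = StandardScaler().fit_transform(X)    # autoscaling

a = 2                                     # retained principal components
pca = PCA(n_components=a).fit(Xs)
t = pca.transform(Xs)
t2 = np.sum(t**2 / pca.explained_variance_, axis=1)   # Hotelling's T^2 per batch

n = X.shape[0]
limit = a * (n - 1) * (n + 1) / (n * (n - a)) * f.ppf(0.95, a, n - a)
print("batches outside the 95% limit:", np.where(t2 > limit)[0])
```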

  18. A global assessment of civil registration and vital statistics systems: monitoring data quality and progress.

    Science.gov (United States)

    Mikkelsen, Lene; Phillips, David E; AbouZahr, Carla; Setel, Philip W; de Savigny, Don; Lozano, Rafael; Lopez, Alan D

    2015-10-03

    Increasing demand for better quality data and more investment to strengthen civil registration and vital statistics (CRVS) systems will require increased emphasis on objective, comparable, cost-effective monitoring and assessment methods to measure progress. We apply a composite index (the vital statistics performance index [VSPI]) to assess the performance of CRVS systems in 148 countries or territories during 1980-2012 and classify them into five distinct performance categories, ranging from rudimentary (with scores close to zero) to satisfactory (with scores close to one), with a mean VSPI score since 2005 of 0.61 (SD 0.31). As expected, the best performing systems were mostly in the European region, the Americas, and Australasia, with only two countries from east Asia and Latin America. Most low-scoring countries were in the African or Asian regions. Globally, only modest progress has been made since 2000, with the percentage of deaths registered increasing from 36% to 38%, and the percentage of children aged under 5 years whose birth has been registered increasing from 58% to 65%. However, several individual countries have made substantial improvements to their CRVS systems in the past 30 years by capturing more deaths and improving the accuracy of cause-of-death information. Future monitoring of the effects of CRVS strengthening will greatly benefit from application of a metric like the VSPI, which is objective, costless to compute, and able to identify components of the system that make the largest contributions to good or poor performance. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Preliminary assessment of alternative PFBC power plant systems

    Science.gov (United States)

    Wysocki, J.; Rogali, R.

    1980-07-01

    Design and economic comparisons of the following nominal 1000 MWe pressurized fluidized bed combustion (PFBC) power plants are presented for both eastern and western coal: Curtiss-Wright PFBC power plants with an air-cooled design; General Electric PFBC power plants with a steam-cooled design; and AEP/Stal-Laval PFBC power plants with a steam-cooled design. In addition, reference pulverized coal-fired (PCF) power plants are included for comparison purposes. The results of the analysis indicate: (1) the steam-cooled PFBC designs show potential savings of 10% and 11% over PCF plants for eastern and western coal, respectively, in terms of busbar power cost; (2) the air-cooled PFBC designs show potential savings of 1% and 2% over PCF plants for eastern and western coal, respectively, in terms of busbar power cost.

  20. Statistical and Measurement Properties of Features Used in Essay Assessment. Research Report. ETS RR-04-21

    Science.gov (United States)

    Haberman, Shelby J.

    2004-01-01

    Statistical and measurement properties are examined for features used in essay assessment to determine the generalizability of the features across populations, prompts, and individuals. Data are employed from TOEFL® and GMAT® examinations and from writing for Criterion.

  1. Statistical modeling of complex health outcomes and air pollution data: Application of air quality health indexing for asthma risk assessment

    Directory of Open Access Journals (Sweden)

    Swarna Weerasinghe

    2017-03-01

    Conclusion: This study demonstrated the importance of using complex statistical models that account for data structures in public health risk assessments, and the consequences of failing to do so.

  2. Discrimination power of short-term heart rate variability measures for CHF assessment.

    Science.gov (United States)

    Pecchia, Leandro; Melillo, Paolo; Sansone, Mario; Bracale, Marcello

    2011-01-01

    In this study, we investigated the power of short-term heart rate variability (HRV) measures to discriminate between normal subjects and chronic heart failure (CHF) patients. We analyzed 1914.40 h of ECG recordings from 83 subjects, of which 54 were normal and 29 suffered from CHF with New York Heart Association (NYHA) classification I, II, or III, extracted from public databases. Following standard guidelines, we performed time- and frequency-domain analyses to measure HRV features. To assess the discrimination power of the HRV features, we designed a classifier based on the classification and regression tree (CART) method, a nonparametric statistical technique that is highly effective for mining non-normally distributed medical data. The best subset of features for subject classification includes the square root of the mean of the sum of the squares of differences between adjacent NN intervals (RMSSD), total power, high-frequency power, and the ratio between low- and high-frequency power (LF/HF). The classifier we developed achieved sensitivity and specificity values of 79.3% and 100%, respectively. Moreover, we demonstrated that it is possible to achieve sensitivity and specificity of 89.7% and 100%, respectively, by introducing two nonstandard features, ΔAVNN and ΔLF/HF, which account, respectively, for the variation over 24 h of the average of consecutive normal intervals (AVNN) and of LF/HF. Our results are comparable with other similar studies, but the method we used is particularly valuable because it allows a fully human-understandable description of the classification procedures, in terms of intelligible "if ... then ..." rules.
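
    A compact sketch of the classification step, assuming the HRV features have already been extracted (synthetic feature values and placeholder labels; scikit-learn's CART implementation):

```python
# CART decision tree separating CHF patients from normal subjects on HRV features.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(3)
n = 83
X = np.column_stack([
    rng.normal(30, 10, n),     # RMSSD (ms)
    rng.normal(2500, 800, n),  # total power (ms^2)
    rng.normal(400, 150, n),   # HF power (ms^2)
    rng.normal(2.0, 0.8, n),   # LF/HF ratio
])
y = rng.integers(0, 2, n)      # 0 = normal, 1 = CHF (placeholder labels)

tree = DecisionTreeClassifier(max_depth=3).fit(X, y)
# The fitted tree can be printed as human-readable "if ... then ..." rules:
print(export_text(tree, feature_names=["RMSSD", "TotPow", "HF", "LF/HF"]))
```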

  3. A Meteorological Information Mining-Based Wind Speed Model for Adequacy Assessment of Power Systems With Wind Power

    DEFF Research Database (Denmark)

    Guo, Yifei; Gao, Houlei; Wu, Qiuwei

    2017-01-01

    Accurate wind speed simulation is an essential prerequisite to analyzing power systems with wind power. A wind speed model considering meteorological conditions and seasonal variations is proposed in this paper. Firstly, using the path analysis method, the influence weights of meteorological... in capturing the characteristics of probability distribution, auto-correlation and seasonal variations of wind speed compared with the traditional Markov chain Monte Carlo (MCMC) and autoregressive moving average (ARMA) models. Furthermore, the proposed model was applied to adequacy assessment of generation... systems with wind power. The assessment results for the modified IEEE-RTS79 and IEEE-RTS96 demonstrated the effectiveness and accuracy of the proposed model...
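
    The record names ARMA models as a baseline for wind speed simulation; a minimal sketch of generating a synthetic hourly wind speed series from an assumed AR(1)-type process (coefficients, mean and scale are illustrative values, not the paper's):

```python
# Synthetic wind speed series from an ARMA process (statsmodels).
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess

ar = np.array([1, -0.8])   # AR polynomial (note the statsmodels sign convention)
ma = np.array([1])         # no MA terms
process = ArmaProcess(ar, ma)

hourly_anomaly = process.generate_sample(nsample=8760)        # one year, hourly
wind_speed = np.clip(7.0 + 2.0 * hourly_anomaly, 0, None)     # mean 7 m/s, no negatives
print(wind_speed[:10])
```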

  4. Power to detect trends in Missouri River fish populations within the Habitat Assessment Monitoring Program

    Science.gov (United States)

    Bryan, Janice L.; Wildhaber, Mark L.; Gladish, Dan W.

    2010-01-01

    As with all large rivers in the United States, the Missouri River has been altered, with approximately one-third of the mainstem length impounded and one-third channelized. These physical alterations to the environment have affected the fish populations, but studies examining the effects of alterations have been localized and for short periods of time, thereby preventing generalization. In response to the U.S. Fish and Wildlife Service Biological Opinion, the U.S. Army Corps of Engineers (USACE) initiated monitoring of habitat improvements of the Missouri River in 2005. The goal of the Habitat Assessment Monitoring Program (HAMP) is to provide information on the response of target fish species to the USACE habitat creation on the Lower Missouri River. To determine the statistical power of the HAMP and in cooperation with USACE, a power analysis was conducted using a normal linear mixed model with variance component estimates based on the first complete year of data. At a level of 20/16 (20 bends with 16 subsamples in each bend), at least one species/month/gear model has the power to determine differences between treated and untreated bends. The trammel net in September had the most species models with adequate power at the 20/16 level and overall, the trammel net had the most species/month models with adequate power at the 20/16 level. However, using only one gear or gear/month combination would eliminate other species of interest, such as three chub species (Macrhybopsis meeki, Macrhybopsis aestivalis, and Macrhybopsis gelida), sand shiners (Notropis stramineus), pallid sturgeon (Scaphirhynchus albus), and juvenile sauger (Sander canadensis). Since gear types are selective in their species efficiency, the strength of the HAMP approach is using multiple gears that have statistical power to differentiate habitat treatment differences in different fish species within the Missouri River. As is often the case with sampling rare species like the pallid sturgeon, the
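
    A simplified sketch of how such a design-level power question can be asked by simulation, loosely mirroring the 20/16 level with assumed (not the study's) effect and variance components and an assumed equal split of bends between treatment arms:

```python
# Monte Carlo power for a bends-and-subsamples design: simulate bend-level and
# subsample-level variance, then test treated vs. untreated bend means.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(4)

def simulate_power(n_bends=10, n_sub=16, effect=0.5, sd_bend=1.0, sd_sub=2.0,
                   n_sims=2000, alpha=0.05):
    hits = 0
    for _ in range(n_sims):
        bend_means_t = effect + rng.normal(0, sd_bend, n_bends)   # treated bends
        bend_means_c = rng.normal(0, sd_bend, n_bends)            # untreated bends
        # Averaging the 16 subsamples within each bend shrinks subsample noise.
        obs_t = bend_means_t + rng.normal(0, sd_sub / np.sqrt(n_sub), n_bends)
        obs_c = bend_means_c + rng.normal(0, sd_sub / np.sqrt(n_sub), n_bends)
        if ttest_ind(obs_t, obs_c).pvalue < alpha:
            hits += 1
    return hits / n_sims

print("approximate power at the assumed 10+10/16 split:", simulate_power())
```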

  5. Wind power in Eritrea, Africa: A preliminary resource assessment

    Energy Technology Data Exchange (ETDEWEB)

    Garbesi, K.; Rosen, K. [San Jose State Univ., CA (United States); Van Buskirk, R. [Dept. of Energy, Eritrea (Ethiopia)

    1997-12-31

    The authors' preliminary assessment of Eritrean wind energy potential identified two promising regions: (1) the southeastern Red Sea coast and (2) the mountain passes that channel winds between the coastal lowlands and the interior highlands. The coastal site, near the port city of Aseb, has an exceptionally good resource, with estimated average annual wind speeds at 10-m height above 9 m/s at the airport and 7 m/s in the port. Furthermore, the southern 200 km of coastline has offshore average annual wind speeds above 6 m/s. This area has strong potential for development, having a local 20 MW grid and unmet demand from the fishing industry and development. Although the highland sites contain only marginal wind resources (approximately 5 m/s), they warrant further investigation because of their proximity to the capital city, Asmera, which has the largest unmet demand and a larger power grid (40 MW with an additional 80 MW planned) to absorb an intermittent source without storage.

  6. Universal blind image quality assessment metrics via natural scene statistics and multiple kernel learning.

    Science.gov (United States)

    Gao, Xinbo; Gao, Fei; Tao, Dacheng; Li, Xuelong

    2013-12-01

    Universal blind image quality assessment (IQA) metrics that can work for various distortions are of great importance for image processing systems, because in practice ground truths are not available and the distortion types are not always known. Existing state-of-the-art universal blind IQA algorithms are developed based on natural scene statistics (NSS). Although NSS-based metrics have obtained promising performance, they have some limitations: 1) they use either the Gaussian scale mixture model or the generalized Gaussian density to predict the non-Gaussian marginal distribution of wavelet, Gabor, or discrete cosine transform coefficients, and the prediction error makes the extracted features unable to reflect changes in non-Gaussianity (NG) accurately; the existing algorithms use a joint statistical model and structural similarity to model the local dependency (LD), but although this LD essentially encodes the information redundancy in natural images, these models do not use information divergence to measure the LD; and although the exponential decay characteristic (EDC) represents the property of natural images that large/small wavelet coefficient magnitudes tend to persist across scales, which is highly correlated with image degradations, it has not been applied to universal blind IQA metrics; and 2) all the universal blind IQA metrics use the same similarity measure for different features when learning the metrics, though these features have different properties. To address the aforementioned problems, we propose to construct new universal blind quality indicators using all three types of NSS, i.e., the NG, LD, and EDC, and incorporating the heterogeneous property of multiple kernel learning (MKL). By analyzing how different distortions affect these statistical properties, we present two universal blind quality assessment models, an NSS global scheme and an NSS two-step scheme. In the proposed metrics: 1) we exploit the NG of natural images
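
    One NSS ingredient named above, the generalized Gaussian density, can be fitted to transform coefficients in a few lines; a minimal sketch with synthetic coefficients standing in for a real wavelet subband:

```python
# Fit a generalized Gaussian density; the shape parameter tracks non-Gaussianity.
import numpy as np
from scipy.stats import gennorm

rng = np.random.default_rng(5)
coeffs = rng.laplace(0, 1, 5000)          # stand-in for subband coefficients

beta, loc, scale = gennorm.fit(coeffs, floc=0)
# beta ~ 1 for Laplacian-like coefficients, 2 for Gaussian; blur tends to push
# coefficient statistics toward Gaussianity, raising the fitted shape parameter.
print(f"fitted GGD shape = {beta:.2f}, scale = {scale:.2f}")
```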

  7. Application of statistical parametric mapping to SPET in the assessment of intractable childhood epilepsy

    Energy Technology Data Exchange (ETDEWEB)

    Bruggemann, Jason M.; Lawson, John A.; Cunningham, Anne M. [Department of Neurology, Sydney Children's Hospital and School of Women's and Children's Health, Faculty of Medicine, University of New South Wales, Randwick, New South Wales (Australia); Som, Seu S.; Haindl, Walter [Department of Nuclear Medicine, Prince of Wales Hospital, Randwick, New South Wales (Australia); Bye, Ann M.E. [Department of Neurology, Sydney Children's Hospital and School of Women's and Children's Health, Faculty of Medicine, University of New South Wales, Randwick, New South Wales (Australia); Department of Neurology, Sydney Children's Hospital, High Street, 2031, Randwick, NSW (Australia)

    2004-03-01

    Statistical parametric mapping (SPM) quantification and analysis has been successfully applied to functional imaging studies of partial epilepsy syndromes in adults. The present study evaluated whether localisation of the epileptogenic zone (determined by SPM) improves upon visually examined single-photon emission tomography (SPET) imaging in the presurgical assessment of children with temporal lobe epilepsy (TLE) and frontal lobe epilepsy (FLE). The patient sample consisted of 24 children (15 males) aged 2.1-17.8 years (9.8±4.3 years; mean±SD) with intractable TLE or FLE. SPET imaging was acquired routinely in presurgical evaluation. All patient images were transformed into the standard stereotactic space of the adult SPM SPET template prior to SPM statistical analysis. Individual patient images were contrasted with a control group of 22 healthy adult females. Resultant statistical parametric maps were rendered over the SPM canonical magnetic resonance imaging (MRI). Two corresponding sets of ictal and interictal SPM and SPET images were then generated for each patient. Experienced clinicians independently reviewed the image sets, blinded to clinical details. Concordance of the reports between SPM and SPET images, syndrome classification and MRI abnormality was studied. A fair level of inter-rater reliability (kappa=0.73) was evident for SPM localisation. SPM was concordant with SPET in 71% of all patients, the majority of the discordance coming from the FLE group. SPM and SPET localisation were concordant with the epilepsy syndrome in 80% of the TLE cases. Concordant localisation to syndrome was worse for both SPM (33%) and SPET (44%) in the FLE group. Data from a small sample of patients with varied focal structural pathologies suggested that SPM performed poorly relative to SPET in these cases. Concordance of SPM and SPET with syndrome was lower in patients younger than 6 years than in those aged 6 years and above. SPM is effective in localising the

  8. 75 FR 2164 - Entergy Nuclear Operations, Inc.; Pilgrim Nuclear Power Station; Environmental Assessment and...

    Science.gov (United States)

    2010-01-14

    ... From the Federal Register Online via the Government Publishing Office NUCLEAR REGULATORY COMMISSION Entergy Nuclear Operations, Inc.; Pilgrim Nuclear Power Station; Environmental Assessment and...), for operation of Pilgrim Nuclear Power Station (Pilgrim), located in Plymouth County, MA. Therefore...

  9. Accuracy of genome-wide imputation of untyped markers and impacts on statistical power for association studies

    Directory of Open Access Journals (Sweden)

    McElwee Joshua

    2009-06-01

    -eQTL discoveries detected by various methods can be interpreted as their relative statistical power in the GWAS. In this study, we find that imputation offers modest additional power (4%) on top of either Ilmn317K or Ilmn650Y, much less than the power gain from Ilmn317K to Ilmn650Y (13%). Conclusion: Current algorithms can accurately impute genotypes for untyped markers, which enables researchers to pool data between studies conducted using different SNP sets. While genotyping itself results in a small error rate (e.g. 0.5%), imputing genotypes is surprisingly accurate. We found that dense marker sets (e.g. Ilmn650Y) outperform sparser ones (e.g. Ilmn317K) in terms of imputation yield and accuracy. We also noticed it was harder to impute genotypes for African American samples, partially due to population admixture, although using a pooled reference boosts performance. Interestingly, GWAS carried out using imputed genotypes only slightly increased power on top of assayed SNPs, likely because adding more markers via imputation yields only a modest gain in genetic coverage while worsening the multiple-testing penalty. Furthermore, cis-eQTL mapping using a dense SNP set derived from imputation achieves better resolution and locates association peaks closer to causal variants than the conventional approach.

  10. Statistical analysis about corrosion in nuclear power plants; Analisis estadistico de la corrosion en centrales nucleares de potencia

    Energy Technology Data Exchange (ETDEWEB)

    Naquid G, C.; Medina F, A.; Zamora R, L. [Instituto Nacional de Investigaciones Nucleares, Gerencia de Ciencia de Materiales, A.P. 18-1027, 11801 Mexico D.F. (Mexico)

    2000-07-01

    Nowadays, investigations are being carried out into the degradation mechanisms of structures, systems and components in nuclear power plants, since many of the processes involved are responsible for plant reliability, component integrity, safety and other aspects. This work presents statistics from studies of materials corrosion in its wide variety of specific mechanisms, as found worldwide in PWR, BWR and WWER reactors, analysing the AIRS (Advanced Incident Reporting System) records for the period 1993-1998 for the first two reactor types and for the period 1982-1995 for the WWER. The identification of factors allows them to be characterized as those which apply, i.e. incidents caused by the presence of some corrosion mechanism, and those which do not apply, i.e. incidents due to natural factors, mechanical failures and human errors. Finally, the total number of cases analysed corresponds to the sum of the cases which apply and which do not apply. (Author)

  11. Can we successfully monitor a population density decline of elusive invertebrates? A statistical power analysis on Lucanus cervus

    Directory of Open Access Journals (Sweden)

    Arno Thomaes

    2017-07-01

    Monitoring global biodiversity is essential for understanding and countering its current loss. However, monitoring of many species is hindered by their difficult detection due to crepuscular activity, hidden phases of the life cycle, short activity periods and low population density. Few statistical power analyses of declining trends have been published for terrestrial invertebrates; consequently, the success rate of monitoring elusive invertebrates is not known. Here, data from monitoring transects of the European stag beetle, Lucanus cervus, are used to investigate whether the population trend of this elusive species can be adequately monitored. Data from studies in the UK, Switzerland and Germany were compiled to parameterize a simulation model explaining stag beetle abundance as a function of temperature and seasonality. A Monte Carlo simulation was used to evaluate the effort needed to detect a population abundance decline of 1%/year over a period of 12 years. To reveal such a decline, at least 240 one-hour transect walks on 40 to 100 transects need to be carried out at weekly intervals during warm evenings. It is concluded that monitoring of stag beetles is feasible and that the effort is no greater than has been found for other invertebrates. Based on this example, it is assumed that many other elusive species with similar life history traits can be monitored with moderate effort. As saproxylic invertebrates account for a large share of forest biodiversity, although many are elusive, it is proposed that at least some flagship species be included in monitoring programmes.
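
    A condensed sketch of the abstract's Monte Carlo logic: simulate weekly transect counts declining 1%/year over 12 years and ask how often a Poisson regression detects the negative trend (the baseline count is an assumed value, and the real model additionally conditions on temperature and seasonality):

```python
# Monte Carlo power for detecting a 1%/year decline in simulated count data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)

def power(n_transects=40, mean_count=2.0, decline=0.01, years=12,
          n_sims=500, alpha=0.05):
    yrs = np.repeat(np.arange(years), n_transects)   # one count per transect-year
    hits = 0
    for _ in range(n_sims):
        mu = mean_count * (1 - decline) ** yrs
        counts = rng.poisson(mu)
        fit = sm.GLM(counts, sm.add_constant(yrs),
                     family=sm.families.Poisson()).fit()
        if fit.pvalues[1] < alpha and fit.params[1] < 0:
            hits += 1
    return hits / n_sims

print("approximate power:", power())
```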

  12. Constructing mathematical models for simulating the technological processes in thermal power equipment on the basis of statistical approximation methods

    Science.gov (United States)

    Kolchev, K. K.; Mezin, S. V.

    2015-07-01

    A technique for constructing mathematical models that simulate the technological processes in thermal power equipment, developed on the basis of statistical approximation methods, is described. The method was used in a software module (plug-in) developed for calculating nonlinear mathematical models of gas turbine units and for diagnosing them. The mathematical models constructed using this module describe the current state of a system; deviations of the system's actual state from the estimate obtained using the mathematical model point to malfunctions in the operation of that system. The multidimensional interpolation and approximation method and the theory of random functions serve as the theoretical basis of the developed technique. Using the technique, it is possible to construct complex static models of plants that are subject to control and diagnostics. The module developed with the proposed technique makes it possible to carry out periodic diagnostics of operating equipment to reveal deviations from its normal mode of operation. The specific features of constructing such mathematical models are considered, and examples of their application using observations obtained on gas turbine unit equipment are given.
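
    A generic sketch of the underlying idea, approximating normal-operation data and flagging deviations, using an RBF interpolator as a stand-in for the multidimensional approximation described (all inputs and the measured value are hypothetical):

```python
# Fit an approximation of normal-operation data; large residuals suggest faults.
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(7)
X_train = rng.uniform(0, 1, (200, 2))          # e.g., load and ambient temperature
y_train = np.sin(X_train[:, 0] * 3) + X_train[:, 1] + rng.normal(0, 0.02, 200)

model = RBFInterpolator(X_train, y_train, smoothing=1e-3)

x_new = np.array([[0.4, 0.6]])
y_obs = 1.9                                     # measured value (hypothetical)
residual = y_obs - model(x_new)[0]
print("deviation from model:", residual)        # threshold this for diagnostics
```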

  13. Sample-Size Planning for More Accurate Statistical Power: A Method Adjusting Sample Effect Sizes for Publication Bias and Uncertainty.

    Science.gov (United States)

    Anderson, Samantha F; Kelley, Ken; Maxwell, Scott E

    2017-11-01

    The sample size necessary to obtain a desired level of statistical power depends in part on the population value of the effect size, which is, by definition, unknown. A common approach to sample-size planning uses the sample effect size from a prior study as an estimate of the population value of the effect to be detected in the future study. Although this strategy is intuitively appealing, effect-size estimates, taken at face value, are typically not accurate estimates of the population effect size because of publication bias and uncertainty. We show that the use of this approach often results in underpowered studies, sometimes to an alarming degree. We present an alternative approach that adjusts sample effect sizes for bias and uncertainty, and we demonstrate its effectiveness for several experimental designs. Furthermore, we discuss an open-source R package, BUCSS, and user-friendly Web applications that we have made available to researchers so that they can easily implement our suggested methods.
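
    The underpowering problem the authors describe is easy to see with a standard power calculation; this illustrates the issue rather than the BUCSS adjustment itself, and the two effect sizes are hypothetical:

```python
# Required n per group for 80% power at alpha = 0.05, for a published effect
# of d = 0.5 versus a bias-adjusted guess of d = 0.35.
from statsmodels.stats.power import TTestIndPower

solver = TTestIndPower()
for d in (0.5, 0.35):
    n = solver.solve_power(effect_size=d, alpha=0.05, power=0.8)
    print(f"d = {d}: about {n:.0f} participants per group")
```

    Planning on the optimistic published effect would leave the study badly underpowered if the true effect is closer to the adjusted value.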

  14. NO-REFERENCE IMAGE QUALITY ASSESSMENT FOR ZY3 IMAGERY IN URBAN AREAS USING STATISTICAL MODEL

    Directory of Open Access Journals (Sweden)

    Y. Zhang

    2016-06-01

    More and more high-spatial-resolution satellite images are produced as satellite technology improves. However, the quality of the images is not always satisfactory for application. Due to complicated atmospheric conditions and the complex radiation transmission process involved in imaging, the images often suffer deterioration. In order to assess the quality of remote sensing images over urban areas, we propose a general-purpose image quality assessment method based on feature extraction and machine learning. We use two types of features at multiple scales: one derived from the shape of the histogram, the other from natural scene statistics based on the Generalized Gaussian distribution (GGD). A 20-D feature vector is extracted for each scale and is assumed to capture the RS image quality degradation characteristics. We use an SVM to learn to predict image quality scores from these features. For evaluation, we construct a medium-scale dataset for training and testing, with human subjects providing opinions of the degraded images. We use ZY3 satellite images over the Wuhan area (a city in China) to conduct experiments. Experimental results show the correlation of the predicted scores with the subjective perceptions.
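
    A sketch of the learning step, assuming the per-scale feature vectors have already been extracted (random placeholder data; scikit-learn's support vector regression, a close relative of the SVM learner named in the abstract):

```python
# SVR mapping quality-degradation features to subjective quality scores.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(8)
X = rng.normal(0, 1, (300, 40))          # features for two scales (20-D each)
y = rng.uniform(1, 5, 300)               # subjective opinion scores

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = SVR(kernel="rbf", C=10.0).fit(X_tr, y_tr)
print("predicted quality scores:", model.predict(X_te[:5]))
```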

  15. Assessing the statistical robustness of inter- and intra-basinal carbon isotope chemostratigraphic correlation

    Science.gov (United States)

    Hay, C.; Creveling, J. R.; Huybers, P. J.

    2016-12-01

    Excursions in the stable carbon isotopic composition of carbonate rocks (δ13Ccarb) can facilitate correlation of Precambrian and Phanerozoic sedimentary successions at a higher temporal resolution than radiometric and biostratigraphic frameworks typically afford. Within the bounds of litho- and biostratigraphic constraints, stratigraphers often correlate isotopic patterns between distant stratigraphic sections through visual alignment of local maxima and minima of isotopic values. The reproducibility of this method can prove challenging; thus, evaluating the statistical robustness of intrabasinal composite carbon isotope curves, and of global correlations to these reference curves, remains difficult. To assess the reproducibility of stratigraphic alignments of δ13Ccarb data, and of correlations between carbon isotope excursions, we employ a numerical dynamic time warping methodology that stretches and squeezes the time axis of a record to obtain an optimal correlation (in a least-squares sense) between time-uncertain series of data. In particular, we assess various alignments between series of Early Cambrian δ13Ccarb data with respect to plausible matches. We first show that an alignment of these records obtained visually, and published previously, is broadly reproducible using dynamic time warping. Alternative alignments with similar goodness of fit are also obtainable, and their stratigraphic plausibility is discussed. This approach should be generalizable to an algorithm for developing a library of plausible alignments between multiple time-uncertain stratigraphic records.
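
    A bare-bones dynamic time warping cost computation in the least-squares spirit described above; real chemostratigraphic applications add slope constraints and litho-/biostratigraphic bounds, and the δ13C values below are illustrative:

```python
# Minimal DTW: optimal squared-difference alignment cost of two series.
import numpy as np

def dtw(a, b):
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (a[i - 1] - b[j - 1]) ** 2        # least-squares local cost
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

curve1 = np.array([0.0, 2.1, 4.0, -6.5, -2.0, 1.0])       # d13C values, section A
curve2 = np.array([0.2, 1.8, 3.9, 4.1, -6.0, -1.5, 0.8])  # d13C values, section B
print("optimal alignment cost:", dtw(curve1, curve2))
```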

  16. The use of test scores from large-scale assessment surveys: psychometric and statistical considerations

    Directory of Open Access Journals (Sweden)

    Henry Braun

    2017-11-01

    Background: Economists are making increasing use of measures of student achievement obtained through large-scale survey assessments such as NAEP, TIMSS, and PISA. The construction of these measures, employing plausible value (PV) methodology, is quite different from that of the more familiar test scores associated with assessments such as the SAT or ACT. These differences have important implications both for utilization and interpretation. Although much has been written about PVs, it appears that there are still misconceptions about whether and how to employ them in secondary analyses. Methods: We address a range of technical issues, including those raised in a recent article that was written to inform economists using these databases. First, an extensive review of the relevant literature was conducted, with particular attention to key publications that describe the derivation and psychometric characteristics of such achievement measures. Second, a simulation study was carried out to compare the statistical properties of estimates based on the use of PVs with those based on other, commonly used methods. Results: It is shown, through both theoretical analysis and simulation, that under fairly general conditions appropriate use of PVs yields approximately unbiased estimates of model parameters in regression analyses of large-scale survey data. The superiority of the PV methodology is particularly evident when measures of student achievement are employed as explanatory variables. Conclusions: The PV methodology used to report student test performance in large-scale surveys remains the state of the art for secondary analyses of these databases.
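
    The standard combining rules for plausible values (Rubin's rules) are short enough to sketch; the five PV columns below are synthetic, and in a real survey the within-PV sampling variance would come from the survey's replicate weights rather than the simple formula used here:

```python
# Pool a statistic across plausible values: mean of per-PV estimates, plus
# within-PV and between-PV variance components.
import numpy as np

rng = np.random.default_rng(9)
pvs = rng.normal(500, 100, (1000, 5))       # 1000 students x 5 plausible values

est = pvs.mean(axis=0)                              # statistic (the mean) per PV
var_within = pvs.var(axis=0, ddof=1) / len(pvs)     # sampling variance per PV

M = pvs.shape[1]
point = est.mean()
between = est.var(ddof=1)
total_var = var_within.mean() + (1 + 1 / M) * between
print(f"pooled estimate = {point:.1f}, SE = {np.sqrt(total_var):.2f}")
```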

  17. Measuring classroom management expertise (CME) of teachers: A video-based assessment approach and statistical results

    Directory of Open Access Journals (Sweden)

    Johannes König

    2015-12-01

    The study aims at developing and exploring a novel video-based assessment that captures classroom management expertise (CME) of teachers, and for which statistical results are provided. CME measurement is conceptualized by using four video clips that refer to typical classroom management situations in which teachers are heavily challenged (involving the challenges to manage transitions, instructional time, student behavior, and instructional feedback) and by applying three cognitive demands posed on respondents when responding to test items related to the video clips (accuracy of perception, holistic perception, and justification of action). Research questions are raised regarding reliability, testlet effects (related to the four video clips applied for measurement), intercorrelations of cognitive demands, and criterion-related validity of the instrument. Evidence is provided that (1) using a video-based assessment, CME can be measured in a reliable way, (2) the CME total score represents a general ability that is only slightly influenced by testlet effects related to the four video clips, (3) the three cognitive demands conceptualized for the measurement of CME are highly intercorrelated, and (4) the CME measure is positively correlated with declarative-conceptual general pedagogical knowledge (medium effect size), whereas it shows only small correlations with non-cognitive teacher variables.

  18. Using a statistical process control chart during the quality assessment of cancer registry data.

    Science.gov (United States)

    Myles, Zachary M; German, Robert R; Wilson, Reda J; Wu, Manxia

    2011-01-01

    Statistical process control (SPC) charts may be used to detect acute variations in the data while simultaneously evaluating unforeseen aberrations that may warrant further investigation by the data user. Using cancer stage data captured by the Summary Stage 2000 (SS2000) variable, we present a brief report highlighting the utility of the SPC chart during the quality assessment of cancer registry data. Using a county-level caseload for the diagnosis period 2001-2004 (n=25,648), we found that the overall variation of the SS2000 variable was in control during the 2001 and 2002 diagnosis years, exceeded the lower control limit (LCL) in 2003, and exceeded the upper control limit (UCL) in 2004; in situ/localized stages were in control throughout the diagnosis period, regional stage exceeded the UCL in 2004, and distant stage exceeded the LCL in 2001 and the UCL in 2004. Our application of the SPC chart to cancer registry data illustrates that it may serve as a readily available and timely tool for identifying areas of concern during the data collection and quality assessment of central cancer registry data.
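
    A minimal sketch of the control-chart arithmetic for a staging proportion (a p-chart with 3-sigma limits; the yearly counts below are hypothetical, not the study's):

```python
# p-chart: flag years whose stage proportion falls outside 3-sigma limits.
import numpy as np

cases = np.array([6300, 6400, 6450, 6498])       # cases per diagnosis year
distant = np.array([1220, 1235, 1180, 1400])     # distant-stage cases per year

p = distant / cases
p_bar = distant.sum() / cases.sum()              # overall (center-line) proportion
sigma = np.sqrt(p_bar * (1 - p_bar) / cases)     # per-year standard error
ucl, lcl = p_bar + 3 * sigma, p_bar - 3 * sigma

for year, (pi, lo, hi) in enumerate(zip(p, lcl, ucl), start=2001):
    flag = "out of control" if (pi < lo or pi > hi) else "in control"
    print(year, f"{pi:.3f}", flag)
```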

  19. Power supply risk assessment method for relay protection system faults

    Directory of Open Access Journals (Sweden)

    Tuyou Si

    2016-12-01

    The influence of hidden faults in relay protection systems on power supply in distribution systems, and the potential risk they pose, are receiving more and more attention. A probability analysis method is used to analyse the fault characteristics and action mechanisms of dominant faults, hidden misoperation and hidden non-operation of relay protection systems, and a failure probability model of the relay protection system is constructed and simplified. The effects of dominant faults, hidden misoperation and non-operation of the relay protection systems on the reduction of supplied load power are analysed, and a probabilistic model for the reduced load power is constructed from three parts corresponding to dominant faults, hidden misoperation and non-operation. A method for calculating the probability of power supply risk due to hidden faults of the relay protection system is proposed, considering the fault probability of the relay protection systems, the frequency of hidden faults occurring in the operation period, the reduced load power or load power outage, and the connection mode of the in-lines, out-lines and transformers in a substation. The feasibility and applicability of the proposed method for estimating the risk probability of relay protection systems are verified by two worked examples.

  20. Improving effect size estimation and statistical power with multi-echo fMRI and its impact on understanding the neural systems supporting mentalizing.

    Science.gov (United States)

    Lombardo, Michael V; Auyeung, Bonnie; Holt, Rosemary J; Waldman, Jack; Ruigrok, Amber N V; Mooney, Natasha; Bullmore, Edward T; Baron-Cohen, Simon; Kundu, Prantik

    2016-11-15

    Functional magnetic resonance imaging (fMRI) research is routinely criticized for being statistically underpowered due to characteristically small sample sizes and much larger sample sizes are being increasingly recommended. Additionally, various sources of artifact inherent in fMRI data can have detrimental impact on effect size estimates and statistical power. Here we show how specific removal of non-BOLD artifacts can improve effect size estimation and statistical power in task-fMRI contexts, with particular application to the social-cognitive domain of mentalizing/theory of mind. Non-BOLD variability identification and removal is achieved in a biophysical and statistically principled manner by combining multi-echo fMRI acquisition and independent components analysis (ME-ICA). Without smoothing, group-level effect size estimates on two different mentalizing tasks were enhanced by ME-ICA at a median rate of 24% in regions canonically associated with mentalizing, while much more substantial boosts (40-149%) were observed in non-canonical cerebellar areas. Effect size boosting occurs via reduction of non-BOLD noise at the subject-level and consequent reductions in between-subject variance at the group-level. Smoothing can attenuate ME-ICA-related effect size improvements in certain circumstances. Power simulations demonstrate that ME-ICA-related effect size enhancements enable much higher-powered studies at traditional sample sizes. Cerebellar effects observed after applying ME-ICA may be unobservable with conventional imaging at traditional sample sizes. Thus, ME-ICA allows for principled design-agnostic non-BOLD artifact removal that can substantially improve effect size estimates and statistical power in task-fMRI contexts. ME-ICA could mitigate some issues regarding statistical power in fMRI studies and enable novel discovery of aspects of brain organization that are currently under-appreciated and not well understood. Copyright © 2016 The Authors. Published

  1. Assessing Regional Scale Variability in Extreme Value Statistics Under Altered Climate Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Brunsell, Nathaniel [Univ. of Kansas, Lawrence, KS (United States); Mechem, David [Univ. of Kansas, Lawrence, KS (United States); Ma, Chunsheng [Wichita State Univ., KS (United States)

    2015-02-20

    Recent studies have suggested that low-frequency modes of climate variability can significantly influence regional climate. The climatology associated with extreme events has been shown to be particularly sensitive. This has profound implications for droughts, heat waves, and food production. We propose to examine regional climate simulations conducted over the continental United States by applying a recently developed technique which combines wavelet multi-resolution analysis with information theory metrics. This research is motivated by two fundamental questions concerning the spatial and temporal structure of extreme events: 1) which temporal scales of the extreme value distributions are most sensitive to alteration by low-frequency climate forcings, and 2) what is the nature of the spatial structure of variation in these timescales? The primary objective is to assess the extent to which information theory metrics can be useful in characterizing the nature of extreme weather phenomena. Specifically, we hypothesize that (1) changes in the nature of extreme events will impact the temporal probability density functions, and information theory metrics will be sensitive to these changes, and (2) via a wavelet multi-resolution analysis, we will be able to characterize the relative contribution of different timescales to the stochastic nature of extreme events. In order to address these hypotheses, we propose a unique combination of an established regional climate modeling approach and advanced statistical techniques to assess the effects of low-frequency modes on climate extremes over North America. The behavior of climate extremes in RCM simulations for the 20th century will be compared with statistics calculated from the United States Historical Climatology Network (USHCN) and simulations from the North American Regional Climate Change Assessment Program (NARCCAP). This effort will serve to establish the baseline behavior of climate extremes, the
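
    A small sketch of the combination of tools named above: wavelet multiresolution decomposition of a series followed by a simple information-theoretic summary per scale (PyWavelets; the series, wavelet choice, and entropy estimator are illustrative assumptions):

```python
# Wavelet multiresolution analysis + Shannon entropy of each scale's coefficients.
import numpy as np
import pywt

rng = np.random.default_rng(10)
t = np.arange(1024)
series = np.sin(2 * np.pi * t / 64) + 0.5 * rng.normal(size=t.size)

coeffs = pywt.wavedec(series, "db4", level=4)    # [cA4, cD4, cD3, cD2, cD1]

def shannon_entropy(x, bins=32):
    p, _ = np.histogram(x, bins=bins)
    p = p[p > 0] / p.sum()
    return -np.sum(p * np.log2(p))

for name, c in zip(["A4", "D4", "D3", "D2", "D1"], coeffs):
    print(name, f"entropy = {shannon_entropy(c):.2f} bits")
```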

  2. Assessment of regional metal levels in ambient air by statistical regression models.

    Science.gov (United States)

    Arruti, A; Fernández-Olmo, I; Irabien, A

    2011-07-01

    The assessment of particulate matter (PM) levels and their constituents present in the atmosphere is an important requirement of air quality management and air pollution abatement. The heavy metal levels in PM10 are commonly evaluated by experimental measurements; nevertheless, the EC Directives also allow Regional Governments to estimate the regulated metal levels (Pb in Directive 2008/50/EC and As, Ni and Cd in Directive 2004/107/EC) by objective estimation and modelling techniques. These techniques are proper alternatives to experimental determination because the required analyses and/or the number of required sampling sites are reduced. The present work aims to estimate the annual levels of the regulated heavy metals by means of multivariate linear regression (MLR) and principal component regression (PCR) at four sites in the Cantabria region (Northern Spain). Since the objective estimation techniques may only be applied when the regulated metal concentrations are below the lower assessment threshold, a preliminary evaluation of the determined annual levels of heavy metals was conducted to test fulfilment of the EC Directives' requirements. At the four studied sites, the results show that objective estimations are acceptable alternatives to experimental determination. The annual average metal concentrations are well estimated by the MLR technique at all the studied sites; furthermore, the EC quality requirements for objective estimations are fulfilled by the developed statistical MLR models. Hence, these estimations may be used by Regional Governments as a proper alternative to experimental measurements for the assessment of regulated metal levels.
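
    A minimal sketch of the two estimation techniques named in the abstract, MLR and PCR, on synthetic data (scikit-learn; predictor and response names are hypothetical stand-ins for the co-measured air quality variables):

```python
# MLR vs. PCR (PCA followed by linear regression) for estimating a metal level.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(11)
X = rng.normal(0, 1, (120, 8))          # e.g., PM10, other metals, gaseous pollutants
y = X[:, 0] * 0.7 + X[:, 3] * 0.2 + rng.normal(0, 0.1, 120)   # e.g., weekly Pb level

mlr = LinearRegression().fit(X, y)
pcr = make_pipeline(PCA(n_components=3), LinearRegression()).fit(X, y)
print("MLR R^2:", mlr.score(X, y))
print("PCR R^2:", pcr.score(X, y))
```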

  3. Statistical Assessment of Proton Treatment Plans Under Setup and Range Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Park, Peter C.; Cheung, Joey P.; Zhu, X. Ronald [Department of Radiation Physics, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Lee, Andrew K. [Department of Radiation Oncology, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Sahoo, Narayan [Department of Radiation Physics, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Tucker, Susan L. [Department of Bioinformatics and Computational Biology, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Liu, Wei; Li, Heng; Mohan, Radhe; Court, Laurence E. [Department of Radiation Physics, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Dong, Lei, E-mail: dong.lei@scrippshealth.org [Scripps Proton Therapy Center, San Diego, California (United States)

    2013-08-01

    Purpose: To evaluate a method for quantifying the effect of setup errors and range uncertainties on dose distributions and dose–volume histograms using statistical parameters, and to assess existing planning practice in selected treatment sites under setup and range uncertainties. Methods and Materials: Twenty passively scattered proton lung cancer plans and 10 prostate and 1 brain cancer scanning-beam proton plans were analyzed. To account for the dose under uncertainties, we performed a comprehensive simulation in which the dose was recalculated 600 times per plan under the influence of random and systematic setup errors and proton range errors. On the basis of the simulation results, we determined the probability of dose variations and calculated the expected values and standard deviations of the dose–volume histograms. The uncertainties in dose were spatially visualized on the planning CT as a probability map of failure to achieve target coverage or of overdose of critical structures. Results: The expected value of target coverage under the uncertainties was consistently lower than the nominal value determined from the clinical target volume coverage without setup error or range uncertainty, with mean differences of −1.1% (−0.9% for breath-hold), −0.3%, and −2.2% for the lung, prostate, and brain cases, respectively. The organs whose dose was most sensitive to uncertainties were the esophagus and spinal cord for lung cancer, the rectum for prostate cancer, and the brain stem for brain cancer. Conclusions: A clinically feasible robustness plan analysis tool based on direct dose calculation and statistical simulation has been developed. Both the expected value and the standard deviation are useful for evaluating the impact of uncertainties. The existing proton beam planning method used in this institution seems to be adequate in terms of target coverage. However, structures that are small in volume or located near the target area showed greater sensitivity to uncertainties.

  4. Energy Storage for Power Systems Applications: A Regional Assessment for the Northwest Power Pool (NWPP)

    Energy Technology Data Exchange (ETDEWEB)

    Kintner-Meyer, Michael CW; Balducci, Patrick J.; Jin, Chunlian; Nguyen, Tony B.; Elizondo, Marcelo A.; Viswanathan, Vilayanur V.; Guo, Xinxin; Tuffner, Francis K.

    2010-04-01

    Wind production, which has expanded rapidly in recent years, could be an important element in the future efficient management of the electric power system; however, wind energy generation is uncontrollable and intermittent in nature. Thus, while wind power represents a significant opportunity for the Bonneville Power Administration (BPA), integrating high levels of wind resources into the power system will bring great challenges to generation scheduling and the provision of ancillary services. This report addresses several key questions in the broader discussion on the integration of renewable energy resources in the Pacific Northwest power grid. More specifically, it addresses the following questions: a) how much total reserve or balancing capacity is necessary to accommodate the simulated expansion of intermittent renewable energy resources during the 2019 time horizon, and b) what are the most cost-effective technological solutions for meeting load balancing requirements in the Northwest Power Pool (NWPP)?

  5. Assessment and analysis of wind energy generation and power ...

    African Journals Online (AJOL)

    This study concerns the evaluation of the wind power potential and the choice of a wind turbine to be installed near Rabah Bitat international airport of Annaba. Furthermore, the power-control performance of this turbine is analysed. For this, the wind speed data measured by the meteorological station of the airport are used.

  6. Surveys Assessing Students' Attitudes toward Statistics: A Systematic Review of Validity and Reliability

    Science.gov (United States)

    Nolan, Meaghan M.; Beran, Tanya; Hecker, Kent G.

    2012-01-01

    Students with positive attitudes toward statistics are likely to show strong academic performance in statistics courses. Multiple surveys measuring students' attitudes toward statistics exist; however, a comparison of the validity and reliability of interpretations based on their scores is needed. A systematic review of relevant electronic…

  7. Power

    OpenAIRE

    Bowles, Samuel; Gintis, Herbert

    2007-01-01

    We consider the exercise of power in competitive markets for goods, labour and credit. We offer a definition of power and show that if contracts are incomplete it may be exercised either in Pareto-improving ways or to the disadvantage of those without power. Contrasting conceptions of power including bargaining power, market power, and consumer sovereignty are considered. Because the exercise of power may alter prices and other aspects of exchanges, abstracting from power may miss essential a...

  8. Combining the Power of Statistical Analyses and Community Interviews to Identify Adoption Barriers for Stormwater Best-Management Practices

    Science.gov (United States)

    Hoover, F. A.; Bowling, L. C.; Prokopy, L. S.

    2015-12-01

    Urban stormwater is an ongoing management concern in municipalities of all sizes. In both combined and separated sewer systems, pollutants from stormwater runoff enter the natural waterway system during heavy rain events. Urban flooding during frequent and more intense storms is also a growing concern. Therefore, stormwater best-management practices (BMPs) are being implemented in efforts to reduce and manage stormwater pollution and overflow. The majority of BMP water quality studies focus on the small-scale, individual effects of the BMP and the change in water quality directly from the runoff of these infrastructures. At the watershed scale, it is difficult to establish statistically whether or not these BMPs are making a difference in water quality, given that watershed-scale monitoring is often costly and time consuming, relying on significant sources of funds which a city may not have. Hence, there is a need to quantify the level of sampling needed to detect the water quality impact of BMPs at the watershed scale. In this study, a power analysis was performed on data from an urban watershed in Lafayette, Indiana, to determine the frequency of sampling required to detect a significant change in water quality measurements. Using the R platform, results indicate that detecting a significant change in watershed-level water quality would require hundreds of weekly measurements, even when improvement is present. The second part of this study investigates whether the difficulty in demonstrating water quality change represents a barrier to adoption of stormwater BMPs. Semi-structured interviews of community residents and organizations in Chicago, IL are being used to investigate residents' understanding of water quality and best-management practices and to identify their attitudes and perceptions toward stormwater BMPs. Second-round interviews will examine how information on uncertainty in water quality improvements influences their BMP attitudes and perceptions.
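
    A minimal sketch of the kind of prospective power analysis the study describes, here using statsmodels rather than R; the effect size, significance level, and target power below are illustrative assumptions, not values reported by the authors.

    ```python
    # Sketch: how many weekly samples are needed to detect a given water-quality
    # improvement? Parameters are hypothetical.
    from statsmodels.stats.power import TTestIndPower

    analysis = TTestIndPower()
    # Cohen's d = 0.2 stands in for a subtle watershed-scale improvement (assumed).
    n_per_group = analysis.solve_power(effect_size=0.2, alpha=0.05, power=0.8)
    print(f"weekly samples needed before AND after BMP installation: {n_per_group:.0f}")
    # Roughly 400 samples per period, i.e. several years of weekly monitoring,
    # in line with the 'hundreds of weekly measurements' conclusion above.
    ```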

  9. Fuel consumption and fire emissions estimates using Fire Radiative Power, burned area and statistical modelling on the fire event scale

    Science.gov (United States)

    Ruecker, Gernot; Leimbach, David; Guenther, Felix; Barradas, Carol; Hoffmann, Anja

    2016-04-01

    Fire Radiative Power (FRP) retrieved by infrared sensors, such as flown on several polar orbiting and geostationary satellites, has been shown to be proportional to fuel consumption rates in vegetation fires, and hence the total radiative energy released by a fire (Fire Radiative Energy, FRE) is proportional to the total amount of biomass burned. However, due to the sparse temporal coverage of polar orbiting and the coarse spatial resolution of geostationary sensors, it is difficult to estimate fuel consumption for single fire events. Here we explore an approach for estimating FRE through temporal integration of MODIS FRP retrievals over MODIS-derived burned areas. Temporal integration is aided by statistical modelling to estimate missing observations using a generalized additive model (GAM) and taking advantage of additional information such as land cover and a global dataset of the Canadian Fire Weather Index (FWI), as well as diurnal and annual FRP fluctuation patterns. Based on results from study areas located in savannah regions of Southern and Eastern Africa and Brazil, we compare this method to estimates based on simple temporal integration of FRP retrievals over the fire lifetime, and estimate the potential variability of FRP integration results across a range of fire sizes. We compare FRE-based fuel consumption against a database of field experiments in similar landscapes. Results show that for larger fires, this method yields realistic estimates and is more robust when only a small number of observations is available than the simple temporal integration. Finally, we offer an outlook on the integration of data from other satellites, specifically FireBird, S-NPP VIIRS and Sentinel-3, as well as on using higher resolution burned area data sets derived from Landsat and similar sensors.
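
    A minimal sketch of the baseline approach the authors compare against, simple temporal integration of FRP over the fire lifetime; the overpass times, FRP values, and the FRE-to-biomass conversion factor (~0.368 kg per MJ, attributed in the literature to Wooster et al.) are assumptions for illustration.

    ```python
    # Sketch: Fire Radiative Energy (FRE) by trapezoidal integration of sparse
    # FRP retrievals, then conversion to fuel consumed. Data are hypothetical.
    import numpy as np

    t_hours = np.array([0.0, 1.6, 3.2, 12.8, 14.4, 26.0])    # satellite overpass times (h)
    frp_mw  = np.array([45.0, 120.0, 95.0, 30.0, 60.0, 5.0])  # FRP retrievals (MW)

    t_sec = t_hours * 3600.0
    # Trapezoidal rule by hand: MW integrated over seconds gives MJ.
    fre_mj = float(np.sum(0.5 * (frp_mw[1:] + frp_mw[:-1]) * np.diff(t_sec)))

    fuel_kg = 0.368 * fre_mj   # assumed conversion factor, kg of biomass per MJ of FRE
    print(f"FRE = {fre_mj:.3e} MJ, fuel consumed = {fuel_kg / 1000.0:.1f} t")
    ```

    The GAM-assisted method in the record above essentially fills the long gaps between such overpasses with modeled FRP before integrating, which is why it is more robust when only a few observations are available.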

  10. Extreme weather exposure identification for road networks - a comparative assessment of statistical methods

    Science.gov (United States)

    Schlögl, Matthias; Laaha, Gregor

    2017-04-01

    The assessment of road infrastructure exposure to extreme weather events is of major importance for scientists and practitioners alike. In this study, we compare different extreme value approaches and fitting methods with respect to their value for assessing the exposure of transport networks to extreme precipitation and temperature impacts. Based on an Austrian data set from 25 meteorological stations representing diverse meteorological conditions, we assess the added value of partial duration series (PDS) over the standard annual maxima series (AMS) in order to give recommendations for performing extreme value statistics of meteorological hazards. Results show the merits of the robust L-moment estimation, which yielded better results than maximum likelihood estimation in 62 % of all cases. At the same time, the results question the general assumption that the threshold excess approach (employing PDS) is superior to the block maxima approach (employing AMS) due to information gain. For low return periods (non-extreme events) the PDS approach tends to overestimate return levels compared to the AMS approach, whereas the opposite behavior was found for high return levels (extreme events). In extreme cases, an inappropriate threshold was shown to lead to considerable biases that may far outweigh the possible gain of information from including additional extreme events. This effect was visible from neither the square-root criterion nor the standard graphical diagnostics (mean residual life plot), but rather from a direct comparison of AMS and PDS in combined quantile plots. We therefore recommend performing the AMS and PDS approaches simultaneously in order to select the best-suited approach. This will make the analyses more robust, not only in cases where threshold selection and dependency introduce biases to the PDS approach but also in cases where the AMS contains non-extreme events that may introduce similar biases. For assessing the performance of
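
    A sketch of the two competing approaches on synthetic data, fitted by maximum likelihood with scipy (the study additionally used L-moment estimation, which scipy does not provide); the gamma-distributed daily rainfall and the 99.5% threshold are assumptions.

    ```python
    # Sketch: block maxima (AMS/GEV) vs threshold excess (PDS/GPD) estimation
    # of a 100-year return level, on synthetic daily precipitation.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    daily = rng.gamma(shape=0.8, scale=6.0, size=30 * 365)   # 30 years of daily rain (mm)
    years = daily.reshape(30, 365)

    # Block maxima approach: fit a GEV to the annual maxima series (AMS).
    ams = years.max(axis=1)
    c, loc, scale = stats.genextreme.fit(ams)
    rl100_ams = stats.genextreme.ppf(1 - 1 / 100, c, loc, scale)

    # Threshold excess approach: fit a GPD to the partial duration series (PDS).
    threshold = np.quantile(daily, 0.995)                    # assumed threshold choice
    excess = daily[daily > threshold] - threshold
    c_g, _, scale_g = stats.genpareto.fit(excess, floc=0)
    rate = excess.size / 30.0                                # exceedances per year
    rl100_pds = threshold + stats.genpareto.ppf(1 - 1 / (100 * rate), c_g, 0, scale_g)

    print(f"100-yr return level: AMS/GEV = {rl100_ams:.1f} mm, PDS/GPD = {rl100_pds:.1f} mm")
    ```

    Running both approaches side by side, as the authors recommend, makes threshold-induced biases in the PDS fit visible as a divergence between the two return-level estimates.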

  11. Assessment of nuclear reactor concepts for low power space applications

    Science.gov (United States)

    Klein, Andrew C.; Gedeon, Stephen R.; Morey, Dennis C.

    1988-01-01

    The results of a preliminary small reactor concepts feasibility and safety evaluation designed to provide a first order validation of the nuclear feasibility and safety of six small reactor concepts are given. These small reactor concepts have potential space applications for missions in the 1 to 20 kWe power output range. It was concluded that low power concepts are available from the U.S. nuclear industry that have the potential for meeting both the operational and launch safety space mission requirements. However, each design has its uncertainties, and further work is required. The reactor concepts must be mated to a power conversion technology that can offer safe and reliable operation.

  12. Green Power Marketing in Retail Competition: An Early Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Wiser, R. (LBL); Fang, J.; Porter, K.; Houston, A. (NREL)

    1999-02-26

    Green power marketing, the business of selling electricity products or services based in part on their environmental values, is still in an early stage of development. This Topical Issues Brief presents a summary of early results with green power marketing under retail competition, covering both fully competitive markets and relevant direct access pilot programs. The brief provides an overview of green products that are or were offered, and discusses consumers' interest in these products. Critical issues that will impact the availability and success of green power products under retail competition are highlighted.

  13. Application of multivariate statistical technique for hydrogeochemical assessment of groundwater within the Lower Pra Basin, Ghana

    Science.gov (United States)

    Tay, C. K.; Hayford, E. K.; Hodgson, I. O. A.

    2017-06-01

    Multivariate statistical techniques and a hydrogeochemical approach were employed for groundwater assessment within the Lower Pra Basin. The main objective was to delineate the main processes that are responsible for the water chemistry and pollution of groundwater within the basin. Fifty-four (54) boreholes were sampled in January 2012 for quality assessment. PCA using Varimax with Kaiser Normalization as the method of extraction, for both the rotated space and the component matrix, was applied to the data. Results show that Spearman's correlation matrix of major ions revealed expected process-based relationships derived mainly from geochemical processes, such as ion exchange and silicate/aluminosilicate weathering within the aquifer. Three main principal components influence the water chemistry and pollution of groundwater within the basin. The three principal components accounted for approximately 79% of the total variance in the hydrochemical data. Component 1 delineates the main natural processes (water-soil-rock interactions) through which groundwater within the basin acquires its chemical characteristics, Component 2 delineates the incongruent dissolution of silicates/aluminosilicates, while Component 3 delineates the prevalence of pollution, principally from agricultural inputs as well as trace metal mobilization in groundwater within the basin. The loadings and score plots of the first two PCs show a grouping pattern which indicates the strength of the mutual relation among the hydrochemical variables. In terms of proper management and development of groundwater within the basin, communities where intense agriculture is taking place should be monitored and protected from agricultural activities, especially where inorganic fertilizers are used, by creating buffer zones. Monitoring of the water quality, especially the water pH, is recommended to ensure the acid-neutralizing potential of groundwater within the basin, thereby curtailing further trace metal
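
    A sketch of the PCA-with-varimax step on standardized data; the borehole matrix is randomly generated, and the varimax routine is a hand-rolled implementation of the standard algorithm, since scikit-learn does not ship one.

    ```python
    # Sketch: PCA on standardized hydrochemical data followed by varimax rotation
    # of the loading matrix. Data are hypothetical stand-ins.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
        """Varimax rotation of a (variables x factors) loading matrix."""
        p, k = loadings.shape
        rotation = np.eye(k)
        var = 0.0
        for _ in range(max_iter):
            lam = loadings @ rotation
            u, s, vt = np.linalg.svd(
                loadings.T @ (lam**3 - (gamma / p) * lam @ np.diag(np.sum(lam**2, axis=0)))
            )
            rotation = u @ vt
            if s.sum() < var * (1 + tol):
                break
            var = s.sum()
        return loadings @ rotation

    rng = np.random.default_rng(0)
    X = rng.normal(size=(54, 8))            # 54 boreholes x 8 hydrochemical variables
    Z = StandardScaler().fit_transform(X)

    pca = PCA(n_components=3).fit(Z)
    loadings = pca.components_.T * np.sqrt(pca.explained_variance_)
    print("rotated loadings:\n", varimax(loadings).round(2))
    print("variance explained:", pca.explained_variance_ratio_.round(2))
    ```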

  14. Quantitative imaging biomarkers: a review of statistical methods for technical performance assessment.

    Science.gov (United States)

    Raunig, David L; McShane, Lisa M; Pennello, Gene; Gatsonis, Constantine; Carson, Paul L; Voyvodic, James T; Wahl, Richard L; Kurland, Brenda F; Schwarz, Adam J; Gönen, Mithat; Zahlmann, Gudrun; Kondratovich, Marina V; O'Donnell, Kevin; Petrick, Nicholas; Cole, Patricia E; Garra, Brian; Sullivan, Daniel C

    2015-02-01

    Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in designs, analysis methods, and metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult or impossible to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America and the Quantitative Imaging Biomarker Alliance, with technical, radiological, and statistical experts, developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined.

  15. Assessment of Reservoir Water Quality Using Multivariate Statistical Techniques: A Case Study of Qiandao Lake, China

    Directory of Open Access Journals (Sweden)

    Qing Gu

    2016-03-01

    Qiandao Lake (Xin'an Jiang reservoir) plays a significant role in drinking water supply for eastern China, and it is an attractive tourist destination. Three multivariate statistical methods were comprehensively applied to assess the spatial and temporal variations in water quality as well as potential pollution sources in Qiandao Lake. Data sets of nine parameters from 12 monitoring sites during 2010–2013 were obtained for analysis. Cluster analysis (CA) was applied to classify the 12 sampling sites into three groups (Groups A, B and C) and the 12 monitoring months into two clusters (April–July, and the remaining months). Discriminant analysis (DA) identified Secchi disc depth, dissolved oxygen, permanganate index and total phosphorus as the significant variables for distinguishing variations of different years, with 79.9% correct assignments. Dissolved oxygen, pH and chlorophyll-a were determined to discriminate between the two sampling periods classified by CA, with 87.8% correct assignments. For spatial variation, DA identified Secchi disc depth and ammonia nitrogen as the significant discriminating parameters, with 81.6% correct assignments. Principal component analysis (PCA) identified organic pollution, nutrient pollution, domestic sewage, and agricultural and surface runoff as the primary pollution sources, explaining 84.58%, 81.61% and 78.68% of the total variance in Groups A, B and C, respectively. These results demonstrate the effectiveness of the integrated use of CA, DA and PCA for reservoir water quality evaluation and could assist managers in improving water resources management.
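
    A sketch of the cluster analysis (CA) step: hierarchical Ward clustering of standardized site profiles into three groups, mirroring the study's grouping of sites; the 12 x 9 data matrix below is a random stand-in for the Qiandao Lake measurements.

    ```python
    # Sketch: hierarchical clustering of monitoring sites on standardized
    # water-quality parameters. Data are hypothetical.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(3)
    X = rng.normal(size=(12, 9))              # 12 sites x 9 quality parameters
    Z = StandardScaler().fit_transform(X)

    tree = linkage(Z, method="ward")          # Ward linkage on Euclidean distances
    groups = fcluster(tree, t=3, criterion="maxclust")  # cut into three site groups
    for g in np.unique(groups):
        print(f"Group {g}: sites {np.where(groups == g)[0] + 1}")
    ```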

  16. Assessment of Seismic Vulnerability of Steel and RC Moment Buildings Using HAZUS and Statistical Methodologies

    Directory of Open Access Journals (Sweden)

    Iman Mansouri

    2017-01-01

    Designer engineers face a serious challenge in choosing the kind of structure to use in areas with significant seismic activity. The development of fragility curves provides an opportunity for designers to select a structure that will have the least fragility. This paper presents an investigation into the seismic vulnerability of both steel and reinforced concrete (RC) moment frames using fragility curves obtained by HAZUS and statistical methodologies. Fragility curves are used to assess several probability parameters. Furthermore, the paper examines whether the probability of exceedance of the damage limit state is reduced as expected. Nonlinear dynamic analyses of five-, eight-, and twelve-story frames are carried out using Perform 3D. The definition of damage states is based on the descriptions provided by HAZUS, which gives the limit states and the associated interstory drift limits for structures. The fragility curves show that the HAZUS procedure reduces the probability of damage, and this reduction is higher for RC frames. Generally, the RC frames have higher fragility compared to steel frames.
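
    Fragility curves of this kind are conventionally expressed as lognormal CDFs of the intensity measure; the sketch below assumes hypothetical median capacities and dispersions for the two frame types, not values derived from the paper's analyses.

    ```python
    # Sketch: fragility curve in the usual lognormal form, giving the probability
    # that a frame reaches a damage state at a given intensity measure.
    import numpy as np
    from scipy.stats import norm

    def fragility(im, theta, beta):
        """P(damage state reached | intensity measure im), lognormal CDF with
        median capacity theta and logarithmic dispersion beta."""
        return norm.cdf(np.log(im / theta) / beta)

    pga = np.linspace(0.05, 1.5, 6)           # peak ground acceleration (g)
    for name, theta, beta in [("steel frame", 0.65, 0.50), ("RC frame", 0.50, 0.55)]:
        print(name, np.round(fragility(pga, theta, beta), 2))
    # With these assumed parameters the RC frame shows a higher exceedance
    # probability at every intensity level, mirroring the conclusion above.
    ```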

  17. Climatic change of summer temperature and precipitation in the Alpine region - a statistical-dynamical assessment

    Energy Technology Data Exchange (ETDEWEB)

    Heimann, D.; Sept, V.

    1998-12-01

    Climatic changes in the Alpine region due to increasing greenhouse gas concentrations are assessed by using statistical-dynamical downscaling. The downscaling procedure is applied to two 30-year periods (1971-2000 and 2071-2100, summer months only) of the output of a transient coupled ocean/atmosphere climate scenario simulation. The downscaling results for the present-day climate are in sufficient agreement with observations. The estimated regional climate change during the next 100 years shows a general warming. The mean summer temperatures increase by about 3 K to more than 5 K. The most intense climatic warming is predicted in the western parts of the Alps. The amount of summer precipitation decreases in most parts of central Europe by more than 20 percent. An increase in precipitation is simulated only over the Adriatic area and parts of eastern central Europe. The results are compared with observed trends and with the results of regional climate change simulations by other authors. The observed trends and the majority of the simulated trends agree with our results. However, there are also climate change estimates that completely contradict ours. (orig.) 29 refs.

  18. Power Quality assessment and mitigation at Walya-Steel Industries ...

    African Journals Online (AJOL)

    In this study, power quality (PQ) at Walya-Steel Industries and Ethio-Plastic Share Company (EPSC), which are customers of Ethiopian Electric Power Corporation (EEPCo), is examined. PQ indicators of the two industries are recorded and analyzed. The study demonstrated that harmonic distortion is ...

  19. Assessment of flywheel energy storage for spacecraft power systems

    Science.gov (United States)

    Rodriguez, G. E.; Studer, P. A.; Baer, D. A.

    1983-01-01

    The feasibility of inertial energy storage in a spacecraft power system is evaluated on the basis of a conceptual integrated design that encompasses a composite rotor, magnetic suspension, and a permanent magnet (PM) motor/generator for a 3-kW orbital average payload at a bus distribution voltage of 250 volts dc. The conceptual design, which evolved at the Goddard Space Flight Center (GSFC), is referred to as a Mechanical Capacitor. The baseline power system configuration selected is a series system employing peak-power-tracking for a low Earth-orbiting application. Power processing, required in the motor/generator, enables alternative configurations that can be achieved in systems with electrochemical energy storage only by adding power processing components. One such alternative configuration provides for peak-power-tracking of the solar array while still maintaining a regulated bus, without the expense of additional power processing components. Precise speed control of the two counterrotating wheels is required to reduce interaction with the attitude control system (ACS); alternatively, the wheels can be used to perform attitude control functions. The critical technologies identified are those pertaining to the energy storage element and are prioritized as composite wheel development, magnetic suspension, motor/generator, containment, and momentum control. Comparison with a 3-kW, 250-Vdc power system using either NiCd or NiH2 for energy storage shows that inertial energy storage offers potential advantages in lifetime, operating temperature, voltage regulation, energy density, charge control, and overall system weight reduction.

  20. 75 FR 12311 - Entergy Nuclear Operations, Inc; Vermont Yankee Nuclear Power Station Environmental Assessment...

    Science.gov (United States)

    2010-03-15

    ... COMMISSION Entergy Nuclear Operations, Inc; Vermont Yankee Nuclear Power Station Environmental Assessment and... Nuclear Operations, Inc. (Entergy or the licensee), for operation of Vermont Yankee Nuclear Power Station... Statement for Vermont Yankee Nuclear Power Station, Docket No. 50-271, dated July 1972, as supplemented...

  1. Dynamic Security Assessment of Danish Power System Based on Decision Trees: Today and Tomorrow

    DEFF Research Database (Denmark)

    Rather, Zakir Hussain; Liu, Leo; Chen, Zhe

    2013-01-01

    The research work presented in this paper analyzes the impact of wind energy, the phasing out of central power plants, and cross-border power exchange on the dynamic security of the Danish power system. A contingency-based decision tree (DT) approach is used to assess the dynamic security of present and future...

  2. 75 FR 11575 - James A. Fitzpatrick Nuclear Power Plant Environmental Assessment and Finding of No Significant...

    Science.gov (United States)

    2010-03-11

    ... COMMISSION James A. Fitzpatrick Nuclear Power Plant Environmental Assessment and Finding of No Significant... Program for Nuclear Power Facilities Operating Prior to January 1, 1979,'' issued to Entergy Nuclear Operations, Inc. (the licensee), for the operation of the James A. FitzPatrick Nuclear Power Plant (JAFNPP...

  3. Condition assessment of power cables using partial discharge diagnosis at damped AC voltages

    NARCIS (Netherlands)

    Wester, F.J.

    2004-01-01

    The thesis focuses on the condition assessment of distribution power cables, which play a very critical part in the distribution of electrical power over regional distances. The majority of the outages in the power system are related to the distribution cables, of which more than 60% ...

  4. Comparison of Asian Aquaculture Products by Use of Statistically Supported Life Cycle Assessment.

    Science.gov (United States)

    Henriksson, Patrik J G; Rico, Andreu; Zhang, Wenbo; Ahmad-Al-Nahid, Sk; Newton, Richard; Phan, Lam T; Zhang, Zongfeng; Jaithiang, Jintana; Dao, Hai M; Phu, Tran M; Little, David C; Murray, Francis J; Satapornvanit, Kriengkrai; Liu, Liping; Liu, Qigen; Haque, M Mahfujul; Kruijssen, Froukje; de Snoo, Geert R; Heijungs, Reinout; van Bodegom, Peter M; Guinée, Jeroen B

    2015-12-15

    We investigated aquaculture production of Asian tiger shrimp, whiteleg shrimp, giant river prawn, tilapia, and pangasius catfish in Bangladesh, China, Thailand, and Vietnam by using life cycle assessments (LCAs), with the purpose of evaluating the comparative eco-efficiency of producing different aquatic food products. Our starting hypothesis was that different production systems are associated with significantly different environmental impacts, as the production of these aquatic species differs in intensity and management practices. In order to test this hypothesis, we estimated each system's global warming, eutrophication, and freshwater ecotoxicity impacts. The contribution to these impacts and the overall dispersions relative to results were propagated by Monte Carlo simulations and dependent sampling. Paired testing showed significant (p < 0.05) differences between production systems in the intraspecies comparisons, even after a Bonferroni correction. For the full distributions instead of only the median, only for Asian tiger shrimp did more than 95% of the propagated Monte Carlo results favor certain farming systems. The major environmental hot-spots driving the differences in environmental performance among systems were fishmeal from mixed fisheries for global warming, pond runoff and sediment discards for eutrophication, and agricultural pesticides, metals, benzalkonium chloride, and other chlorine-releasing compounds for freshwater ecotoxicity. The Asian aquaculture industry should therefore strive toward farming systems relying upon pelleted species-specific feeds, where the fishmeal inclusion is limited and sourced sustainably. Also, excessive nutrients should be recycled in integrated organic agriculture, together with efficient aeration solutions powered by renewable energy sources.

  5. Assessment and mitigation of power quality problems for PUSPATI TRIGA Reactor (RTP)

    Science.gov (United States)

    Zakaria, Mohd Fazli; Ramachandaramurthy, Vigna K.

    2017-01-01

    Electrical power systems are exposed to different types of power quality disturbances. Investigation and monitoring of power quality are necessary to maintain the accurate operation of sensitive equipment, especially in nuclear installations. This paper discusses the power quality problems observed at the electrical sources of the PUSPATI TRIGA Reactor (RTP). Assessment of power quality requires the identification of any anomalous behavior on a power system which adversely affects the normal operation of electrical or electronic equipment. A power quality assessment involves gathering data resources and analyzing the data with reference to power quality standards; then, if problems exist, mitigation techniques must be recommended. Field power quality data were collected by a power quality recorder and analyzed with reference to power quality standards. Electrical power is normally supplied to the RTP via two sources, each designed to carry the full load, in order to maintain good reliability. The assessment of power quality during reactor operation was performed for both electrical sources. Several disturbances, such as voltage harmonics and flicker, exceeded the thresholds. To reduce these disturbances, mitigation techniques have been proposed, such as installing passive harmonic filters to reduce harmonic distortion, a dynamic voltage restorer (DVR) to reduce voltage disturbances, and isolating all sensitive and critical loads.

  6. An application and verification of ensemble forecasting on wind power to assess operational risk indicators in power grids

    Energy Technology Data Exchange (ETDEWEB)

    Alessandrini, S.; Ciapessoni, E.; Cirio, D.; Pitto, A.; Sperati, S. [Ricerca sul Sistema Energetico RSE S.p.A., Milan (Italy). Power System Development Dept. and Environment and Sustainable Development Dept.; Pinson, P. [Technical University of Denmark, Lyngby (Denmark). DTU Informatics

    2012-07-01

    Wind energy is one of the so-called non-schedulable renewable sources, i.e. it must be exploited when it is available, otherwise it is lost. In European regulation it has priority of dispatch over conventional generation, to maximize green energy production. However, being variable and uncertain, wind (and solar) generation raises several issues for the security of power grid operation. In particular, Transmission System Operators (TSOs) need forecasts that are as accurate as possible. Nowadays a deterministic approach to wind power forecasting (WPF) can easily be considered insufficient to face the uncertainty associated with wind energy. In order to obtain information about the accuracy of a forecast and a reliable estimation of its uncertainty, probabilistic forecasting is becoming increasingly widespread. In this paper we investigate the performance of the COnsortium for Small-scale MOdelling Limited area Ensemble Prediction System (COSMO-LEPS). First, the ensemble's properties (i.e. consistency, reliability) are assessed using different verification indices and diagrams calculated on wind power. Then we provide examples of how EPS-based wind power forecasts can be used in power system security analyses. Quantifying the forecast uncertainty allows regulation reserve requirements to be determined more accurately, hence improving security of operation and reducing system costs. In particular, the paper also presents a probabilistic power flow (PPF) technique developed at RSE and aimed at evaluating the impact of wind power forecast accuracy on the probability of security violations in power systems. (orig.)
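
    One standard consistency check for an ensemble prediction system of this kind is the rank histogram; the sketch below uses synthetic forecast-observation pairs, not COSMO-LEPS output.

    ```python
    # Sketch: rank histogram for ensemble consistency. A reliable ensemble yields
    # a roughly flat histogram, i.e. the observation is statistically
    # indistinguishable from the members. Data are synthetic.
    import numpy as np

    rng = np.random.default_rng(7)
    n_cases, n_members = 2000, 16
    ensemble = rng.normal(0.0, 1.0, size=(n_cases, n_members))  # wind power forecasts
    obs = rng.normal(0.0, 1.0, size=n_cases)                    # matching observations

    # Rank of each observation within its sorted ensemble (0 .. n_members).
    ranks = (ensemble < obs[:, None]).sum(axis=1)
    hist = np.bincount(ranks, minlength=n_members + 1)
    print("rank histogram counts:", hist)   # roughly uniform -> consistent ensemble
    ```

    A U-shaped histogram would instead indicate an under-dispersive ensemble, one of the reliability defects the verification diagrams in the paper are designed to expose.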

  7. Life cycle assessment of fossil and biomass power generation chains. An analysis carried out for ALSTOM Power Services

    Energy Technology Data Exchange (ETDEWEB)

    Bauer, Ch.

    2008-12-15

    This final report, issued by the Technology Assessment Department of the Paul Scherrer Institute (PSI), presents the results of an analysis carried out on behalf of the Alstom Power Services company. Fossil and biomass chains as well as co-combustion power plants are assessed. The general objective of this analysis is an evaluation of the specific as well as the overall environmental burdens resulting from these different options for electricity production. The results obtained for fuel chains including hard coal, lignite, wood, natural gas and synthetic natural gas are discussed. An overall comparison is made and the conclusions drawn from the results of the analysis are presented.

  8. Satellite Power Systems (SPS): Concept development and evaluation program: Preliminary assessment

    Science.gov (United States)

    1979-01-01

    A preliminary assessment of a potential Satellite Power System (SPS) is provided. The assessment includes discussion of technical and economic feasibility; the effects of microwave power transmission beams on biological, ecological, and electromagnetic systems; the impact of SPS construction, deployment, and operations on the biosphere and on society; and the merits of SPS compared to other future energy alternatives.

  9. 75 FR 20991 - Upper Peninsula Power Company; Notice of Availability of Environmental Assessment

    Science.gov (United States)

    2010-04-22

    ... Energy Regulatory Commission Upper Peninsula Power Company; Notice of Availability of Environmental Assessment April 15, 2010. In accordance with the National Environmental Policy Act of 1969 and the Federal... environmental assessment (EA) regarding Upper Peninsula Power Company's plan to replace the spillway at the Bond...

  10. Area Based Approach for Three Phase Power Quality Assessment in Clarke Plane

    OpenAIRE

    S. CHATTOPADHYAY; M. MITRA; S. SENGUPTA

    2008-01-01

    This paper presents an area-based approach to electric power quality analysis. Specific reference signals are defined, and the areas formed by real power system data with the reference signals are calculated, from which the contributions of the fundamental waveform and of harmonic components are assessed separately. Active power, reactive power and total harmonic distortion factors have been measured. Clarke transformation technique has been used for analysis in three-phase systems, w...

  11. Potential of wind power for Thailand: an assessment

    Directory of Open Access Journals (Sweden)

    Terry Commins

    2008-03-01

    This paper reviews the potential for wind-power generated electricity in Thailand by means of a wide-ranging literature survey. Proposed application at a university campus is used as a case study to demonstrate that wind power is unlikely to be economically competitive where grid-connected electricity is available. The need for improved low wind speed turbine performance for Thai applications is highlighted by comparing the output of commercially available wind turbines with the characteristics of Thai wind; the challenges of improving low wind speed turbine performance are discussed. It is concluded that for Thailand in the foreseeable future the benefits of economic wind power electricity generation will probably be confined to small remote isolated installations including traditional applications.

  12. Inferential, non-parametric statistics to assess the quality of probabilistic forecast systems

    NARCIS (Netherlands)

    Maia, A.H.N.; Meinke, H.B.; Lennox, S.; Stone, R.C.

    2007-01-01

    Many statistical forecast systems are available to interested users. To be useful for decision making, these systems must be based on evidence of underlying mechanisms. Once causal connections between the mechanism and its statistical manifestation have been firmly established, the forecasts must

  13. Enhancing an Undergraduate Business Statistics Course: Linking Teaching and Learning with Assessment Issues

    Science.gov (United States)

    Fairfield-Sonn, James W.; Kolluri, Bharat; Rogers, Annette; Singamsetti, Rao

    2009-01-01

    This paper examines several ways in which teaching effectiveness and student learning in an undergraduate Business Statistics course can be enhanced. First, we review some key concepts in Business Statistics that are often challenging to teach and show how using real data sets assists students in developing a deeper understanding of the concepts.…

  14. Using Innovative Statistical Analyses to Assess Soil Degradation due to Land Use Change

    Science.gov (United States)

    Khaledian, Yones; Kiani, Farshad; Ebrahimi, Soheila; Brevik, Eric C.; Aitkenhead-Peterson, Jacqueline

    2016-04-01

    Soil erosion and the overall loss of soil fertility are serious issues for the loess soils of the Golestan province, northern Iran. The assessment of soil degradation at large watershed scales is urgently required. This research investigated the effect of land use change on soil degradation, comparing cultivated, pasture, and urban lands to native forest in terms of declines in soil fertility. Some novel statistical methods including partial least squares (PLS), principal component regression (PCR), and ordinary least squares regression (OLS) were used to predict soil cation-exchange capacity (CEC) from soil characteristics. PCA identified five primary components of soil quality. The PLS model was used to predict soil CEC from the soil characteristics including bulk density (BD), electrical conductivity (EC), pH, calcium carbonate equivalent (CCE), soil particle density (DS), mean weight diameter (MWD), soil porosity (F), organic carbon (OC), labile carbon (LC), mineral carbon, saturation percentage (SP), soil particle size (clay, silt and sand), exchangeable cations (Ca2+, Mg2+, K+, Na+), and soil microbial respiration (SMR) collected in the Ziarat watershed. In order to evaluate the best fit, two other methods, PCR and OLS, were also examined. An exponential semivariogram using PLS predictions revealed stronger spatial dependence for CEC [r2 = 0.80, and RMSE = 1.99] than the other methods, PCR [r2 = 0.84, and RMSE = 2.45] and OLS [r2 = 0.84, and RMSE = 2.45]. Therefore, the PLS method provided the best model for the data. In stepwise regression analysis, MWD and LC were selected as influential variables in all soils, whereas the other influential parameters differed among the land uses. This study quantified reductions in numerous soil quality parameters resulting from extensive land-use changes and urbanization in the Ziarat watershed in Northern Iran.
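
    A sketch of the PLS-versus-OLS comparison with scikit-learn; the soil matrix, coefficients, and noise below are randomly generated stand-ins for the Ziarat watershed data, so the printed scores will not match the paper's r2 and RMSE values.

    ```python
    # Sketch: predicting CEC from soil properties with partial least squares,
    # compared against ordinary least squares. Data are hypothetical.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import r2_score, mean_squared_error
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(5)
    X = rng.normal(size=(120, 18))                # 120 samples x 18 soil characteristics
    cec = X[:, :4] @ np.array([2.0, 1.5, -1.0, 0.5]) + rng.normal(0, 1.0, 120)

    X_tr, X_te, y_tr, y_te = train_test_split(X, cec, random_state=0)
    for name, model in [("PLS", PLSRegression(n_components=5)), ("OLS", LinearRegression())]:
        y_hat = model.fit(X_tr, y_tr).predict(X_te).ravel()
        rmse = mean_squared_error(y_te, y_hat) ** 0.5
        print(f"{name}: r2 = {r2_score(y_te, y_hat):.2f}, RMSE = {rmse:.2f}")
    ```

    PLS compresses the correlated predictors into a few latent components before regressing, which is why it tends to be more stable than OLS when soil variables are strongly collinear.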

  15. Multivariate statistical assessment of a polluted river under nitrification inhibition in the tropics.

    Science.gov (United States)

    Le, Thi Thu Huyen; Zeunert, Stephanie; Lorenz, Malte; Meon, Günter

    2017-05-01

    A large, complex water quality data set of a polluted river, the Tay Ninh River, was evaluated to identify its water quality problems, to assess spatial variation, to determine the main pollution sources, and to detect relationships between parameters. This river is highly polluted with organic substances, nutrients, and total iron. An important problem of the river is the inhibition of nitrification. For the evaluation, different statistical techniques including cluster analysis (CA), discriminant analysis (DA), and principal component analysis (PCA) were applied. CA clustered 10 water quality stations into three groups corresponding to extreme, high, and moderate pollution. DA used only seven parameters to differentiate the defined clusters. The PCA resulted in four principal components. The first PC is related to conductivity, NH4-N, PO4-P, and TP and characterizes nutrient pollution. The second PC represents the organic pollution. The iron pollution is illustrated in the third PC, which has strong positive loadings for TSS and total Fe. The fourth PC explains the dependence of DO on nitrate production. The nitrification inhibition was further investigated by PCA. The results showed a clear negative correlation between DO and NH4-N and a positive correlation between DO and NO3-N. The influence of pH on the NH4-N oxidation could not be detected by PCA because of the very low nitrification rate due to the constantly low pH of the river and because of the effect of wastewater discharge with very high NH4-N concentrations. The results deepen the understanding of the governing water quality processes and hence help to manage the river basin sustainably.

  16. Sea piracy sequelae: assessment according to the Diagnostic and Statistical Manual of Mental Disorders-5.

    Science.gov (United States)

    Ziello, Antonio Rosario; Fasanaro, Angiola Maria; Petrelli, Cristina; Ricci, Giovanna; Sirignano, Ascanio; Amenta, Francesco

    2014-01-01

    Our previous studies investigated the psychological consequences of kidnapping in a group of Italian seafarers assaulted by sea pirates and held in captivity, and in their family members, using the criteria of the Diagnostic and Statistical Manual of Mental Disorders (DSM)-4. These studies showed that both the victims and the family members had significant psychological disturbances, corresponding to chronic Post-Traumatic Stress Disorder (PTSD) in the victims and a pattern of anxiety and depression in their family members. After publication of these studies, an updated edition of the DSM became available, namely the DSM-5. The DSM-5 redefines some diagnostic criteria, including those related to PTSD. This work was focused on the re-evaluation of the results of our previous studies in the light of the DSM-5 diagnostic criteria. Sixteen Italians, including 4 kidnapped seafarers and 12 family members, were examined by a semi-structured interview followed by the Clinician-Administered PTSD Scale (CAPS-DX) and the Cognitive Behavioural Assessment (CBA 2.0) for victims, and by the State-Trait Anxiety Inventory (STAI) X-1 and X-2 of the CBA 2.0 and the Hamilton Depression Rating Scale (HDRS) for family members. Data already obtained were reviewed and re-analysed according to the DSM-5 criteria and the Clinician-Administered PTSD Scale for DSM-5 (CAPS-5). The use of the CAPS-5 did not modify the diagnosis for the victims' group: 3 of 4 had a PTSD diagnosis performed through the CAPS-5. Seven of 12 family members had a PTSD diagnosis performed through the CAPS-5, with negative cognitions and mood symptoms obtaining the highest scores. Using DSM-5 criteria, the diagnosis of PTSD in the direct victims of piracy was confirmed. The same diagnosis could apply to a group of their family members. Besides anxiety and fear, in fact, we found in 7 out of 12 subjects the presence of symptoms included by the DSM-5 in the PTSD spectrum. These symptoms were: avoidance, negative

  17. National Hydroelectric Power Resources Study: Environmental Assessment. Volume 8

    Science.gov (United States)

    1981-09-01

    bioaccumulation related to pesticides and industrial chemicals in the Great Lakes region and other areas. In addition, anoxic conditions from thermal... hydrothermal power program." DOE/EIS-0066. U.S. Government Printing Office, Washington, D.C. U.S. Department of Energy, January 1979a. "A Review of State

  18. Integrated safety assessment of Indian nuclear power plants for ...

    Indian Academy of Sciences (India)

    Nuclear energy professionals need to understand and address the catastrophe syndrome that of late seems to be increasingly at work in the public mind in the context of nuclear energy. Classically, the nuclear power reactor design and system evolution has been based on the logic of minimization of risk to an ...

  19. Journal of EEA, Vol. 31, 2014 ASSESSMENT OF POWER ...

    African Journals Online (AJOL)

    This paper presents the use of smart reclosers for improving reliability of a distribution system of one of the major cities of Ethiopia. As frequent power interruptions are posing a huge problem to the life of the people and the economy, finding a solution to the problem is very essential. Electric reliability has affected social well ...

  20. Assessment of power reliability and improvement potential by using ...

    African Journals Online (AJOL)

    This paper presents the use of smart reclosers for improving reliability of a distribution system of one of the major cities of Ethiopia. As frequent power interruptions are posing a huge problem to the life of the people and the economy, finding a solution to the problem is very essential. Electric reliability has affected social well ...

  1. Assessment and analysis of wind energy generation and power ...

    African Journals Online (AJOL)

    Keywords: wind energy, wind speed, Weibull distribution, capacity factor, power output. Abstract: ... greenhouse effect. According to a publication of the International Energy Agency, the world production of electricity should double during the next 25 years. Hence, to ... reduce the use of fossil fuels and to reduce carbon.

  2. Economic Impact Assessment of Wind Power Integration: A Quasi-Public Goods Property Perspective

    Directory of Open Access Journals (Sweden)

    Huiru Zhao

    2015-08-01

    The integration of wind power into the power grid will have impacts on the multiple subjects of the electric power system. In this paper, the economic impacts of wind power integration on the multiple subjects of China's electric power system were quantitatively assessed from a quasi-public goods property perspective. Firstly, the quasi-public goods property of the transmission services provided by power grid corporations was elaborated. Secondly, the multiple subjects of China's electric power system, which include electricity generation enterprises (EGEs), power grid corporations (PGCs), electricity consumers (ECs), and the environment, were analyzed in detail. Thirdly, based on the OPF-based nodal price model and a transmission service cost allocation model, an economic impact assessment model of wind power integration was built from the quasi-public goods property perspective. Then, the IEEE 24-bus system employed in this paper was introduced according to the current status of China's electric power system, and the modeling of the wind turbine was also introduced. Finally, a simulation analysis was performed, and the economic impacts of wind power integration on EGEs, PGCs, ECs, and the environment were calculated. The results indicate that, from the quasi-public goods property perspective, wind power integration will bring positive impacts on EGEs, PGCs, and the environment, and negative impacts on ECs. The findings can provide references for power system managers, energy planners, and policy makers.

  3. Considerations on Cyber Security Assessments of Korean Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung-Woon; Song, Jae-Gu; Han, Kyung-Soo; Lee, Cheol Kwon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Kang, Mingyun [E-Gonggam Co. Ltd., Daejeon (Korea, Republic of)

    2015-10-15

    Korea Institute of Nuclear Nonproliferation and Control (KINAC) has prepared the regulatory standard RS-015 based on RG 5.71. RS-015 defines the elements of a cyber security program to be established in nuclear facilities and describes the security control items and relevant requirements. Cyber security assessments are important initial activities in a cyber security program for NPPs. Cyber security assessments can be performed in the following key steps: 1) formation of a cyber security assessment team (CSAT); 2) identification of critical systems and critical digital assets (CDAs); 3) plant compliance checks against the security control requirements in RS-015. Through the assessments, the current status of the security controls applied to NPPs can be determined. The assessments provide baseline data for remedial activities. Additional analyses of the results from the assessments should be performed before the implementation of remedial security controls. The cyber security team at the Korea Atomic Energy Research Institute (KAERI) has studied how to perform cyber security assessments for NPPs based on the regulatory requirements. Recently, KAERI's cyber security team performed pilot cyber security assessments of a Korean NPP. Based on this assessment experience, considerations and checkpoints which would be helpful for full-scale cyber security assessments of Korean NPPs and for the implementation of remedial security controls are discussed in this paper. Cyber security assessment is one of the important and immediate activities for NPP cyber security. The quality of the first assessment will be a barometer for NPP cyber security. Hence, cyber security assessments of Korean NPPs should be performed thoroughly.

  4. Environmental Impact Assessment for Olkiluoto 4 Nuclear Power Plant Unit in Finland

    Energy Technology Data Exchange (ETDEWEB)

    Dersten, Riitta; Gahmberg, Sini; Takala, Jenni [Teollisuuden Voima Oyj, Olkiluoto, FI-27160 Eurajoki (Finland)

    2008-07-01

    In order to improve its readiness for constructing additional production capacity, Teollisuuden Voima Oyj (TVO) initiated in spring 2007 the environmental impact assessment procedure (EIA procedure) concerning a new nuclear power plant unit that would possibly be located at Olkiluoto. When assessing the environmental impacts of the Olkiluoto nuclear power plant extension project, the present state of the environment was first examined, and after that, the changes caused by the projects as well as their significance were assessed, taking into account the combined impacts of the operations at Olkiluoto. The environmental impact assessment for the planned nuclear power plant unit covers the entire life cycle of the plant unit. (authors)

  5. Assessing climate change impacts on the Iberian power system using a coupled water-power model

    DEFF Research Database (Denmark)

    Cardenal, Silvio Javier Pereira; Madsen, Henrik; Arnbjerg-Nielsen, Karsten

    2014-01-01

    Climate change is expected to have a negative impact on the power system of the Iberian Peninsula: changes in river runoff are expected to reduce hydropower generation, while higher temperatures are expected to increase summer electricity demand, when water resources are already limited. However, these impacts have not yet been evaluated at the peninsular level. We coupled a hydrological model with a power market model to study three impacts of climate change on the current Iberian power system: changes in hydropower production caused by changes in precipitation and temperature, changes in temporal patterns of electricity demand caused by temperature changes, and changes in irrigation water use caused by temperature and precipitation changes. A stochastic dynamic programming approach was used to develop operating rules for the integrated system given hydrological uncertainty. We found that changes...
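
    A compact sketch of the stochastic dynamic programming idea: backward value iteration over discretized storage with probabilistic inflows yields a release rule for a single toy reservoir. The state grid, inflow distribution, water value, and horizon are all simplifying assumptions; the actual study couples a full hydrological model with a power market model.

    ```python
    # Sketch: SDP value iteration for a toy reservoir under inflow uncertainty.
    import numpy as np

    storages = np.arange(0, 11)            # discretized storage states (volume units)
    releases = np.arange(0, 5)             # feasible turbine releases per stage
    inflows = np.array([0, 1, 2, 3])       # possible weekly inflows
    p_inflow = np.array([0.2, 0.4, 0.3, 0.1])  # their assumed probabilities
    price = 1.0                            # assumed value of one unit of released water

    V = np.zeros(storages.size)            # terminal value function
    policy = np.zeros(storages.size, dtype=int)
    for _ in range(52):                    # weekly stages, backward recursion
        V_new = np.empty_like(V)
        for i, s in enumerate(storages):
            best = -np.inf
            for r in releases[releases <= s]:
                # expected immediate benefit + expected value of next storage state
                nxt = np.clip(s - r + inflows, 0, storages[-1])
                val = price * r + p_inflow @ V[nxt]
                if val > best:
                    best, policy[i] = val, r
            V_new[i] = best
        V = V_new
    print("release rule by storage state:", policy)
    ```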

  6. Hazard Identification, Risk Assessment and Risk Control (HIRARC Accidents at Power Plant

    Directory of Open Access Journals (Sweden)

    Ahmad Asmalia Che

    2016-01-01

    Power plants have a reputation of being among the most hazardous workplace environments. Workers in a power plant face many safety risks due to the nature of the job. Although power plants are safer nowadays, since the industry has urged employers to improve employee safety, workers still encounter many hazards and thus accidents at the workplace. The aim of the present study is to investigate work-related accidents at power plants based on the HIRARC (Hazard Identification, Risk Assessment and Risk Control) process. The data were collected at two coal-fired power plants located in Malaysia. The findings of the study identify the hazards and assess the risks related to accidents that occurred at the power plants. The study also suggests possible control measures and corrective actions to reduce or eliminate the risks, which power plants can use to prevent accidents.

  7. Controlling the joint local false discovery rate is more powerful than meta-analysis methods in joint analysis of summary statistics from multiple genome-wide association studies.

    Science.gov (United States)

    Jiang, Wei; Yu, Weichuan

    2017-02-15

    In genome-wide association studies (GWASs) of common diseases/traits, we often analyze multiple GWASs with the same phenotype together to discover associated genetic variants with higher power. Since it is difficult to access data with detailed individual measurements, summary-statistics-based meta-analysis methods have become popular for jointly analyzing datasets from multiple GWASs. In this paper, we propose a novel summary-statistics-based joint analysis method based on controlling the joint local false discovery rate (Jlfdr). We prove that our method is the most powerful summary-statistics-based joint analysis method when controlling the false discovery rate at a certain level. In particular, the Jlfdr-based method achieves higher power than commonly used meta-analysis methods when analyzing heterogeneous datasets from multiple GWASs. Simulation experiments demonstrate the superior power of our method over meta-analysis methods. Also, our method discovers more associations than meta-analysis methods from empirical datasets of four phenotypes. The R package is available at http://bioinformatics.ust.hk/Jlfdr.html. Contact: eeyu@ust.hk. Supplementary data are available at Bioinformatics online.
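
    For contrast with the Jlfdr approach, a sketch of the standard fixed-effects inverse-variance meta-analysis of summary statistics, i.e. the baseline the paper compares against; the per-study effect sizes and standard errors are hypothetical.

    ```python
    # Sketch: fixed-effects inverse-variance meta-analysis of one variant's
    # summary statistics across three studies. Inputs are hypothetical.
    import numpy as np
    from scipy.stats import norm

    beta = np.array([0.12, 0.08, 0.15])     # per-study effect estimates
    se = np.array([0.05, 0.04, 0.07])       # their standard errors

    w = 1.0 / se**2                          # inverse-variance weights
    beta_meta = np.sum(w * beta) / np.sum(w)
    se_meta = np.sqrt(1.0 / np.sum(w))
    z = beta_meta / se_meta
    p = 2 * norm.sf(abs(z))
    print(f"meta-analysis: beta = {beta_meta:.3f}, z = {z:.2f}, p = {p:.2e}")
    ```

    The paper's argument is that when per-study effects are heterogeneous, a single pooled estimate of this kind loses power relative to jointly modeling the studies' local false discovery rates.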

  8. Undersampling power-law size distributions: effect on the assessment of extreme natural hazards

    Science.gov (United States)

    Geist, Eric L.; Parsons, Thomas E.

    2014-01-01

    The effect of undersampling on estimating the size of extreme natural hazards from historical data is examined. Tests using synthetic catalogs indicate that the tail of an empirical size distribution sampled from a pure Pareto probability distribution can range from having one to several unusually large events to appearing depleted, relative to the parent distribution. Both of these effects are artifacts caused by limited catalog length. It is more difficult to diagnose the artificially depleted empirical distributions, since one expects that a pure Pareto distribution is physically limited in some way. Using maximum likelihood methods and the method of moments, we estimate the power-law exponent and the corner size parameter of tapered Pareto distributions for several natural hazard examples: tsunamis, floods, and earthquakes. Each of these examples has varying catalog lengths and measurement thresholds, relative to the largest event sizes. In many cases where there are only several orders of magnitude between the measurement threshold and the largest events, joint two-parameter estimation techniques are necessary to account for estimation dependence between the power-law scaling exponent and the corner size parameter. Results indicate that whereas the corner size parameter of a tapered Pareto distribution can be estimated, its upper confidence bound cannot be determined, and the estimate itself is often unstable with time. Correspondingly, one cannot statistically reject a pure Pareto null hypothesis using natural hazard catalog data. Although physical limits to the hazard source size and attenuation mechanisms from source to site constrain the maximum hazard size, historical data alone often cannot reliably determine the corner size parameter. Probabilistic assessments incorporating theoretical constraints on source size and propagation effects are preferred over deterministic assessments of extreme natural hazards based on historical data.
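
    A sketch of joint two-parameter maximum likelihood estimation for a tapered Pareto distribution, using Kagan's density form f(x) = (beta/x + 1/x_c)(x_t/x)^beta exp((x_t - x)/x_c); the synthetic catalog, threshold, and starting values are assumptions.

    ```python
    # Sketch: joint MLE of the scaling exponent beta and corner size x_c of a
    # tapered Pareto distribution, on a synthetic catalog.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(11)
    x_t = 1.0                                 # measurement threshold (assumed)
    # Synthetic catalog: pure Pareto sample thinned by an exponential taper
    # (rejection sampling of the tapered Pareto with beta=1.2, x_c=50).
    x = x_t * (1 - rng.random(500)) ** (-1 / 1.2)
    x = x[rng.random(x.size) < np.exp((x_t - x) / 50.0)]

    def nll(params):
        """Negative log-likelihood of the tapered Pareto density."""
        beta, x_c = params
        if beta <= 0 or x_c <= 0:
            return np.inf
        logf = np.log(beta / x + 1 / x_c) + beta * np.log(x_t / x) + (x_t - x) / x_c
        return -logf.sum()

    fit = minimize(nll, x0=[1.0, 10.0], method="Nelder-Mead")
    print(f"beta = {fit.x[0]:.2f}, corner size x_c = {fit.x[1]:.1f} (n = {x.size})")
    ```

    With short synthetic catalogs, repeating this fit shows exactly the instability the record describes: the exponent is recovered well, while the corner size estimate wanders and its upper bound is effectively unconstrained.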

  9. Quaterly Assessment of Irradiance Variation on Power Output and Storable Excess Power of Solar Panels

    National Research Council Canada - National Science Library

    T O Familusi; Y K Sanusi; H O Efunwole; A M Raimi

    2014-01-01

    This paper verified the input solar irradiance and average power output per day of a 10 W polycrystalline silicon solar panel and a 10 W gallium arsenide solar panel, both of dimensions 350 x 290 x 25 mm, fill-factor...

  10. ASSESSMENT OF COMBINED HEAT AND POWER SYSTEM"PREMIUM POWER" APPLICATIONS IN CALIFORNIA

    Energy Technology Data Exchange (ETDEWEB)

    Norwood, Zack; Lipman, Timothy; Stadler, Michael; Marnay, Chris

    2010-06-01

    The effectiveness of combined heat and power (CHP) systems for power-interruption-intolerant, "premium power" facilities is the focus of this study. Through three real-world case studies and economic cost minimization modeling, the economic and environmental performance of "premium power" CHP is analyzed. The results of the analysis for a brewery, a data center, and a hospital lead to some interesting conclusions about CHP, limited to the specific CHP technologies installed at those sites. Firstly, facilities with high heating loads prove to be the most appropriate for CHP installations from a purely economic standpoint. Secondly, waste-heat-driven thermal cooling systems are only economically attractive if the efficiency of these chillers can rise above the current best system efficiency. Thirdly, if the reliability of CHP systems proves to be as high as that of diesel generators, they could replace these generators at little or no additional cost where the thermal-to-electric (relative) load of the facility is already high enough to economically justify a CHP system. Lastly, in terms of greenhouse gas emissions, the modeled CHP systems provide some degree of decreased emissions, estimated at approximately 10 percent for the hospital, the application with the highest relative thermal load in this case.

  11. The Power of the Cloud: Google Forms for Transition Assessment

    Science.gov (United States)

    Scheef, Andrew R.; Johnson, Cinda

    2017-01-01

    The inclusion of age-appropriate transition assessments is a key component of transition services for students with disabilities. Although these assessments may focus on a variety of areas, their general purpose is to provide guidance in developing individualized postschool goals and design transition services to help students achieve these goals.…

  12. OPR-PPR, a Computer Program for Assessing Data Importance to Model Predictions Using Linear Statistics

    Energy Technology Data Exchange (ETDEWEB)

    Matthew J. Tonkin; Claire R. Tiedeman; D. Matthew Ely; and Mary C. Hill

    2007-08-16

    The OPR-PPR program calculates the Observation-Prediction (OPR) and Parameter-Prediction (PPR) statistics that can be used to evaluate the relative importance of various kinds of data to simulated predictions. The data considered fall into three categories: (1) existing observations, (2) potential observations, and (3) potential information about parameters. The first two are addressed by the OPR statistic; the third is addressed by the PPR statistic. The statistics are based on linear theory and measure the leverage of the data, which depends on the location, the type, and possibly the time of the data being considered. For example, in a ground-water system the type of data might be a head measurement at a particular location and time. As a measure of leverage, the statistics do not take into account the value of the measurement. As linear measures, the OPR and PPR statistics require minimal computational effort once sensitivities have been calculated. Sensitivities need to be calculated for only one set of parameter values; commonly these are the values estimated through model calibration. OPR-PPR can calculate the OPR and PPR statistics for any mathematical model that produces the necessary OPR-PPR input files. In this report, OPR-PPR capabilities are presented in the context of using the ground-water model MODFLOW-2000 and the universal inverse program UCODE_2005. The method used to calculate the OPR and PPR statistics is based on the linear equation for prediction standard deviation. Using sensitivities and other information, OPR-PPR calculates (a) the percent increase in the prediction standard deviation that results when one or more existing observations are omitted from the calibration data set; (b) the percent decrease in the prediction standard deviation that results when one or more potential observations are added to the calibration data set; or (c) the percent decrease in the prediction standard deviation that results when potential information on one

  13. OPR-PPR, a Computer Program for Assessing Data Importance to Model Predictions Using Linear Statistics

    Science.gov (United States)

    Tonkin, Matthew J.; Tiedeman, Claire R.; Ely, D. Matthew; Hill, Mary C.

    2007-01-01

    The OPR-PPR program calculates the Observation-Prediction (OPR) and Parameter-Prediction (PPR) statistics that can be used to evaluate the relative importance of various kinds of data to simulated predictions. The data considered fall into three categories: (1) existing observations, (2) potential observations, and (3) potential information about parameters. The first two are addressed by the OPR statistic; the third is addressed by the PPR statistic. The statistics are based on linear theory and measure the leverage of the data, which depends on the location, the type, and possibly the time of the data being considered. For example, in a ground-water system the type of data might be a head measurement at a particular location and time. As a measure of leverage, the statistics do not take into account the value of the measurement. As linear measures, the OPR and PPR statistics require minimal computational effort once sensitivities have been calculated. Sensitivities need to be calculated for only one set of parameter values; commonly these are the values estimated through model calibration. OPR-PPR can calculate the OPR and PPR statistics for any mathematical model that produces the necessary OPR-PPR input files. In this report, OPR-PPR capabilities are presented in the context of using the ground-water model MODFLOW-2000 and the universal inverse program UCODE_2005. The method used to calculate the OPR and PPR statistics is based on the linear equation for prediction standard deviation. Using sensitivities and other information, OPR-PPR calculates (a) the percent increase in the prediction standard deviation that results when one or more existing observations are omitted from the calibration data set; (b) the percent decrease in the prediction standard deviation that results when one or more potential observations are added to the calibration data set; or (c) the percent decrease in the prediction standard deviation that results when potential information on one

  14. Insulation Condition Assessment of Power Transformers Using Accelerated Ageing Tests

    OpenAIRE

    MIRZAIE, Mohammad; Gholami, Ahmad; TAYEBI, Hamid Reza

    2009-01-01

    Thermal stress due to losses and environment temperature causes degradation of paper/oil insulation systems in transformers, even at operating temperature. Experience indicates that thermal ageing of oil and paper in power transformers leads to changes in some insulation characteristics. In this paper, insulating papers immersed in oil were aged under accelerated conditions at 140, 150, and 160 °C in the laboratory. Some of the oil properties, such as water content, breakdown vo...

  15. Assessing the Army Power and Energy Efforts for the Warfighter

    Science.gov (United States)

    2011-03-01

    finds that steam reforming of methane to make methanol is the most attractive alternative. Methanol can then be transformed into gasoline. The various...been slow but steady. The Army is studying several designs, including direct methanol, reformed methanol, and polymer exchange membrane cells. Cells...operating on methanol without reforming appear to be of most interest. Current status is 30 W/kg (available power draw) and 1,000 Wh/kg (energy

  16. Assessing segmentation processes by click detection: online measure of statistical learning, or simple interference?

    Science.gov (United States)

    Franco, Ana; Gaillard, Vinciane; Cleeremans, Axel; Destrebecqz, Arnaud

    2015-12-01

    Statistical learning can be used to extract the words from continuous speech. Gómez, Bion, and Mehler (Language and Cognitive Processes, 26, 212-223, 2011) proposed an online measure of statistical learning: They superimposed auditory clicks on a continuous artificial speech stream made up of a random succession of trisyllabic nonwords. Participants were instructed to detect these clicks, which could be located either within or between words. The results showed that, over the length of exposure, reaction times (RTs) increased more for within-word than for between-word clicks. This result has been accounted for by means of statistical learning of the between-word boundaries. However, even though statistical learning occurs without an intention to learn, it nevertheless requires attentional resources. Therefore, this process could be affected by a concurrent task such as click detection. In the present study, we evaluated the extent to which the click detection task indeed reflects successful statistical learning. Our results suggest that the emergence of RT differences between within- and between-word click detection is neither systematic nor related to the successful segmentation of the artificial language. Therefore, instead of being an online measure of learning, the click detection task seems to interfere with the extraction of statistical regularities.

  17. Assessment of Microbial Fuel Cell Configurations and Power Densities

    KAUST Repository

    Logan, Bruce E.

    2015-07-30

    Different microbial electrochemical technologies are being developed for many diverse applications, including wastewater treatment, biofuel production, water desalination, remote power sources, and as biosensors. Current and energy densities will always be limited relative to batteries and chemical fuel cells, but these technologies have other advantages based on the self-sustaining nature of the microorganisms that can donate or accept electrons from an electrode, the range of fuels that can be used, and versatility in the chemicals that can be produced. The high cost of membranes will likely limit applications of microbial electrochemical technologies that might require a membrane. For microbial fuel cells, which do not need a membrane, questions remain on whether larger-scale systems can produce power densities similar to those obtained in laboratory-scale systems. It is shown here that configuration and fuel (pure chemicals in laboratory media versus actual wastewaters) remain the key factors in power production, rather than the scale of the application. Systems must be scaled up through careful consideration of electrode spacing and packing per unit volume of reactor.

  18. Assessment of distributed solar power systems: Issues and impacts

    Science.gov (United States)

    Moyle, R. A.; Chernoff, H.; Schweizer, T. C.; Patton, J. B.

    1982-11-01

    The installation of distributed solar-power systems presents electric utilities with a host of questions. Some of the technical and economic impacts of these systems are discussed. Among the technical interconnect issues are isolated operation, power quality, line safety, and metering options. Economic issues include user purchase criteria, structures and installation costs, marketing and product distribution costs, and interconnect costs. An interactive computer program that allows easy calculation of allowable system prices and allowable generation-equipment prices was developed as part of this project. It is concluded that the technical problems raised by distributed solar systems are surmountable, but their resolution may be costly. The stringent purchase criteria likely to be imposed by many potential system users and the economies of large-scale systems make small systems (less than 10 to 20 kW) less attractive than larger systems. Utilities that consider life-cycle costs in making investment decisions and third-party investors who have tax and financial advantages are likely to place the highest value on solar-power systems.

  19. Assessment of roadside surface water quality of Savar, Dhaka, Bangladesh using GIS and multivariate statistical techniques

    Science.gov (United States)

    Ahmed, Fahad; Fakhruddin, A. N. M.; Imam, MD. Toufick; Khan, Nasima; Abdullah, Abu Tareq Mohammad; Khan, Tanzir Ahmed; Rahman, Md. Mahfuzur; Uddin, Mohammad Nashir

    2017-11-01

    In this study, multivariate statistical techniques are used in conjunction with GIS to assess the roadside surface water quality of the Savar region. Nineteen water samples were collected in the dry season, and 15 water quality parameters including TSS, TDS, pH, DO, BOD, Cl-, F-, NO3-, NO2-, SO42-, Ca, Mg, K, Zn and Pb were measured. The univariate overview of the water quality parameters is: TSS 25.154 ± 8.674 mg/l, TDS 840.400 ± 311.081 mg/l, pH 7.574 ± 0.256 pH units, DO 4.544 ± 0.933 mg/l, BOD 0.758 ± 0.179 mg/l, Cl- 51.494 ± 28.095 mg/l, F- 0.771 ± 0.153 mg/l, NO3- 2.211 ± 0.878 mg/l, NO2- 4.692 ± 5.971 mg/l, SO42- 69.545 ± 53.873 mg/l, Ca 48.458 ± 22.690 mg/l, Mg 19.676 ± 7.361 mg/l, K 12.874 ± 11.382 mg/l, Zn 0.027 ± 0.029 mg/l, Pb 0.096 ± 0.154 mg/l. The water quality data were subjected to R-mode PCA, which resulted in five major components. PC1 explains 28% of the total variance and indicates that roadside and brick-field dust (TDS, TSS) settles in the nearby water bodies. PC2 explains 22.123% of the total variance and indicates agricultural influence (K, Ca, and NO2-). PC3 describes the contribution of nonpoint pollution from agricultural and soil erosion processes (SO42-, Cl-, and K). PC4 is heavily positively loaded by vehicle emissions and diffusion from battery stores (Zn, Pb). PC5 shows strong positive loading of BOD and strong negative loading of pH. Cluster analysis yielded three major clusters for both the water quality parameters and the sampling sites. The site-based clusters showed a grouping pattern similar to that of the R-mode factor score map. The present work opens a new scope for monitoring roadside water quality in future research in Bangladesh.
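
    A minimal sketch of the pipeline the abstract describes (standardization, R-mode PCA, and hierarchical clustering into three clusters); the data matrix below is a hypothetical stand-in for the 19 samples and 15 parameters:

    ```python
    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(1)
    data = rng.normal(size=(19, 15))          # 19 samples x 15 parameters (placeholder)

    Z = StandardScaler().fit_transform(data)  # standardize mixed-unit parameters
    pca = PCA(n_components=5).fit(Z)          # five major components, as in the study
    print(pca.explained_variance_ratio_)      # compare with the reported 28% for PC1
    loadings = pca.components_.T              # parameter loadings on each component

    # Ward hierarchical clustering of the sampling sites into three clusters
    sites = fcluster(linkage(Z, method="ward"), t=3, criterion="maxclust")
    print(sites)
    ```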

  20. Statistical approach for assessing the influence of synoptic and meteorological conditions on ozone concentrations over Europe

    Science.gov (United States)

    Otero, Noelia; Butler, Tim; Sillmann, Jana

    2015-04-01

    Air pollution has become a serious problem in many industrialized and densely populated urban areas due to its negative effects on human health and the damage it causes to agricultural crops and ecosystems. The concentration of air pollutants is the result of several factors, including emission sources, the lifetime and spatial distribution of the pollutants, atmospheric properties and interactions, wind speed and direction, and topographic features. Episodes of air pollution are often associated with stationary or slowly migrating anticyclonic (high-pressure) systems that reduce advection, diffusion, and deposition of atmospheric pollutants. Certain weather conditions facilitate the accumulation of pollutants; for instance, light winds contribute to an increase in stagnation episodes that affect air quality. The atmospheric circulation therefore plays an important role in air quality, which is affected by both synoptic-scale and local-scale processes. This study assesses the influence of the large-scale circulation, together with meteorological conditions, on tropospheric ozone in Europe. The frequency of weather types (WTs) is examined under a novel approach based on an automated version of the Lamb Weather Types catalog (Jenkinson and Collison, 1977), implemented point-by-point over the European domain. Moreover, the analysis uses a new grid-averaged climatology (1°x1°) of daily surface ozone concentrations, built from observations at individual sites, that matches the resolution of global models (Schnell et al., 2014). Daily WT frequencies and meteorological conditions are combined in a multiple regression approach to investigate their influence on ozone concentrations. Different subsets of predictors are examined within multiple linear regression models (MLRs) for each grid cell in order to identify the best regression model. Several statistical metrics are applied for estimating the robustness of the
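
    A hedged sketch of the per-grid-cell selection step: regress daily ozone on candidate predictors (weather-type frequencies plus meteorological covariates, all synthetic here) and keep the subset with the lowest AIC. The predictor names and the AIC criterion are illustrative assumptions, not the authors' exact choices.

    ```python
    import itertools
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    n = 365
    predictors = {                            # hypothetical daily series, one grid cell
        "wt_anticyclonic": rng.random(n),
        "wt_westerly": rng.random(n),
        "tmax": rng.normal(25, 5, n),
        "wind_speed": rng.gamma(2, 2, n),
    }
    ozone = 2 * predictors["tmax"] + rng.normal(0, 5, n)   # synthetic response

    X_full = np.column_stack(list(predictors.values()))
    names = list(predictors)
    best = (np.inf, None)
    for k in range(1, len(names) + 1):        # exhaustive subset search
        for subset in itertools.combinations(range(len(names)), k):
            fit = sm.OLS(ozone, sm.add_constant(X_full[:, subset])).fit()
            if fit.aic < best[0]:
                best = (fit.aic, [names[j] for j in subset])
    print("best predictor subset:", best[1])
    ```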

  1. Assessment of roadside surface water quality of Savar, Dhaka, Bangladesh using GIS and multivariate statistical techniques

    Science.gov (United States)

    Ahmed, Fahad; Fakhruddin, A. N. M.; Imam, MD. Toufick; Khan, Nasima; Abdullah, Abu Tareq Mohammad; Khan, Tanzir Ahmed; Rahman, Md. Mahfuzur; Uddin, Mohammad Nashir

    2017-09-01

    In this study, multivariate statistical techniques are used in conjunction with GIS to assess the roadside surface water quality of the Savar region. Nineteen water samples were collected in the dry season, and 15 water quality parameters including TSS, TDS, pH, DO, BOD, Cl-, F-, NO3-, NO2-, SO42-, Ca, Mg, K, Zn and Pb were measured. The univariate overview of the water quality parameters is: TSS 25.154 ± 8.674 mg/l, TDS 840.400 ± 311.081 mg/l, pH 7.574 ± 0.256 pH units, DO 4.544 ± 0.933 mg/l, BOD 0.758 ± 0.179 mg/l, Cl- 51.494 ± 28.095 mg/l, F- 0.771 ± 0.153 mg/l, NO3- 2.211 ± 0.878 mg/l, NO2- 4.692 ± 5.971 mg/l, SO42- 69.545 ± 53.873 mg/l, Ca 48.458 ± 22.690 mg/l, Mg 19.676 ± 7.361 mg/l, K 12.874 ± 11.382 mg/l, Zn 0.027 ± 0.029 mg/l, Pb 0.096 ± 0.154 mg/l. The water quality data were subjected to R-mode PCA, which resulted in five major components. PC1 explains 28% of the total variance and indicates that roadside and brick-field dust (TDS, TSS) settles in the nearby water bodies. PC2 explains 22.123% of the total variance and indicates agricultural influence (K, Ca, and NO2-). PC3 describes the contribution of nonpoint pollution from agricultural and soil erosion processes (SO42-, Cl-, and K). PC4 is heavily positively loaded by vehicle emissions and diffusion from battery stores (Zn, Pb). PC5 shows strong positive loading of BOD and strong negative loading of pH. Cluster analysis yielded three major clusters for both the water quality parameters and the sampling sites. The site-based clusters showed a grouping pattern similar to that of the R-mode factor score map. The present work opens a new scope for monitoring roadside water quality in future research in Bangladesh.

  2. Area Based Approach for Three Phase Power Quality Assessment in Clarke Plane

    Directory of Open Access Journals (Sweden)

    S. CHATTOPADHYAY

    2008-03-01

    This paper presents an area-based approach for electric power quality analysis. Specific reference signals have been defined, and the areas formed by the real power-system data with each reference signal have been calculated, from which the contributions of the fundamental waveform and of the harmonic components have been assessed separately. Active power, reactive power and total harmonic distortion factors have been measured. The Clarke transformation technique has been used for the analysis of the three-phase system, which reduces the computational effort to a great extent. Distortion factors of the individual phases of the three-phase system have also been assessed.
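
    The Clarke transformation behind the computational saving is standard; the sketch below applies it to a synthetic three-phase signal and estimates THD with a generic FFT-based ratio (the paper's own area-based distortion factors are not reproduced here).

    ```python
    import numpy as np

    def clarke(va, vb, vc):
        """Power-variant Clarke (alpha-beta) transform of three phase signals."""
        v_alpha = (2 / 3) * (va - 0.5 * vb - 0.5 * vc)
        v_beta = (2 / 3) * (np.sqrt(3) / 2) * (vb - vc)
        return v_alpha, v_beta

    def thd(signal, fs, f0, n_harmonics=6):
        """Generic FFT-based total harmonic distortion of one component."""
        spec = np.abs(np.fft.rfft(signal)) / len(signal)
        k0 = int(round(f0 * len(signal) / fs))          # fundamental bin
        hs = [spec[k0 * h] for h in range(2, 2 + n_harmonics) if k0 * h < len(spec)]
        return np.sqrt(sum(a ** 2 for a in hs)) / spec[k0]

    fs, f0 = 6400, 50
    t = np.arange(0, 0.2, 1 / fs)
    va = np.sin(2 * np.pi * f0 * t) + 0.1 * np.sin(2 * np.pi * 5 * f0 * t)
    vb = np.sin(2 * np.pi * f0 * t - 2 * np.pi / 3)
    vc = np.sin(2 * np.pi * f0 * t + 2 * np.pi / 3)
    v_alpha, v_beta = clarke(va, vb, vc)
    print(round(thd(v_alpha, fs, f0), 3))               # distortion from the 5th harmonic
    ```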

  3. Assessment of offshore wind power potential in the Aegean and Ionian Seas based on high-resolution hindcast model results

    Directory of Open Access Journals (Sweden)

    Takvor Soukissian

    2017-03-01

    In this study, long-term wind data obtained from high-resolution hindcast simulations are used to analytically assess the offshore wind power potential in the Aegean and Ionian Seas and to provide wind climate and wind power potential characteristics at selected locations where offshore wind farms are at the concept/planning phase. After ensuring good model performance through detailed validation against buoy measurements, offshore wind speed and wind direction at 10 m above sea level are statistically analyzed on the annual and seasonal time scales. The spatial distributions of the mean wind speed and wind direction are provided at the appropriate time scales, along with the mean annual and inter-annual variability; these statistical quantities are useful in the offshore wind energy sector for the preliminary identification of favorable sites for the exploitation of offshore wind energy. Moreover, the offshore wind power potential and its variability are also estimated at 80 m height above sea level. The obtained results reveal that there are specific areas in the central and eastern Aegean Sea that combine intense annual winds with low variability; the annual offshore wind power potential in these areas reaches values close to 900 W/m2, suggesting that a detailed assessment of offshore wind energy would be worthwhile and could lead to attractive investments. Furthermore, as a rough estimate of the availability factor, the equiprobable contours of the event [4 m/s ≤ wind speed ≤ 25 m/s] are also estimated and presented. The selected lower and upper bounds of wind speed correspond to typical cut-in and cut-out wind speed thresholds, respectively, for commercial offshore wind turbines. Finally, the main wind climate and wind power density characteristics are also provided for seven offshore wind farms that are at the concept/planning phase.
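
    As a back-of-the-envelope companion to the abstract, the sketch below extrapolates synthetic 10 m winds to 80 m with an assumed power-law shear exponent, then computes the mean wind power density (0.5·ρ·v³) and the availability fraction between the 4 and 25 m/s cut-in/cut-out bounds; none of the numbers are the authors' hindcast values.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    v10 = rng.weibull(2.0, size=8760) * 8.0    # hypothetical hourly 10 m wind speeds

    alpha = 0.11                               # assumed offshore shear exponent
    v80 = v10 * (80 / 10) ** alpha             # power-law extrapolation to hub height
    rho = 1.225                                # air density, kg/m^3

    power_density = 0.5 * rho * np.mean(v80 ** 3)       # W/m^2
    availability = np.mean((v80 >= 4) & (v80 <= 25))    # fraction inside cut-in/cut-out
    print(round(float(power_density)), round(float(availability), 2))
    ```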

  4. 75 FR 61779 - R.E. Ginna Nuclear Power Plant, LLC; R.E. Ginna Nuclear Power Plant Environmental Assessment and...

    Science.gov (United States)

    2010-10-06

    ... COMMISSION R.E. Ginna Nuclear Power Plant, LLC; R.E. Ginna Nuclear Power Plant Environmental Assessment and... Operating License No. DPR-18, issued to R.E. Ginna Nuclear Power Plant, LLC (the licensee), for operation of the R.E. Ginna Nuclear Power Plant (Ginna), located in Ontario, New York. In accordance with 10 CFR 51...

  5. Life cycle assessment of coal-fired power plants and sensitivity analysis of CO2 emissions from power generation side

    Science.gov (United States)

    Yin, Libao; Liao, Yanfen; Zhou, Lianjie; Wang, Zhao; Ma, Xiaoqian

    2017-05-01

    A life cycle assessment of a 1000 MW coal-fired power plant, including its environmental impacts, was carried out in this paper. The results showed that the operation stage of the power plant has the highest energy consumption and pollutant emissions of all sub-processes, accounting for 93.93% of total energy consumption and 92.20% of total emissions. Among the pollutant emissions from the coal-fired power plant, CO2 accounted for up to 99.28%. Therefore, controlling CO2 emissions from coal-fired power plants is very important. Based on a BP neural network, the amount of CO2 emitted from the generation side of coal-fired power plants was calculated via the carbon balance method. The results showed that unit capacity, coal quality and unit operation load have a great influence on the CO2 emissions of coal-fired power plants in Guangdong Province. Using coal with high volatile matter and high heating value can also reduce CO2 emissions. Moreover, under higher operating loads, the CO2 emissions per kWh of electric energy were lower.
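
    The carbon balance method reduces to simple mass bookkeeping: carbon that burns leaves the stack as CO2 at a 44/12 mass ratio. The sketch below illustrates this with made-up plant figures; the BP-neural-network regression step is not reproduced.

    ```python
    def co2_from_coal(coal_kg, carbon_fraction, unburned_fraction=0.02):
        """kg CO2 from burning coal_kg of coal: burned carbon times 44/12."""
        burned_carbon = coal_kg * carbon_fraction * (1 - unburned_fraction)
        return burned_carbon * 44.0 / 12.0

    coal_per_kwh = 0.30    # hypothetical kg of coal per kWh on the generation side
    print(round(co2_from_coal(coal_per_kwh, carbon_fraction=0.60), 3), "kg CO2/kWh")
    ```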

  6. Statistical and dynamical assessment of land-ocean-atmosphere interactions across North Africa

    Science.gov (United States)

    Yu, Yan

    North Africa is highly vulnerable to hydrologic variability and extremes, including impacts of climate change. The current understanding of oceanic versus terrestrial drivers of North African droughts and pluvials is largely model-based, with vast disagreement among models in terms of the simulated oceanic impacts and vegetation feedbacks. Regarding oceanic impacts, the relative importance of the tropical Pacific, tropical Indian, and tropical Atlantic Oceans in regulating the North African rainfall variability, as well as the underlying mechanism, remains debated among different modeling studies. Classic theory of land-atmosphere interactions across the Sahel ecotone, largely based on climate modeling experiments, has promoted positive vegetation-rainfall feedbacks associated with a dominant surface albedo mechanism. However, neither the proposed positive vegetation-rainfall feedback with its underlying albedo mechanism, nor its relative importance compared with oceanic drivers, has been convincingly demonstrated up to now using observational data. Here, the multivariate Generalized Equilibrium Feedback Assessment (GEFA) is applied in order to identify the observed oceanic and terrestrial drivers of North African climate and quantify their impacts. The reliability of the statistical GEFA method is first evaluated against dynamical experiments within the Community Earth System Model (CESM). In order to reduce the sampling error caused by short data records, the traditional GEFA approach is refined through stepwise GEFA, in which unimportant forcings are dropped through stepwise selection. In order to evaluate GEFA's reliability in capturing oceanic impacts, the atmospheric response to a sea-surface temperature (SST) forcing across the tropical Pacific, tropical Indian, and tropical Atlantic Ocean is estimated independently through ensembles of dynamical experiments and compared with GEFA-based assessments. Furthermore, GEFA's performance in capturing terrestrial
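
    Under simplifying assumptions, the core GEFA idea can be sketched as a lagged-covariance regression, B ≈ C_yx(τ) C_xx(τ)^{-1}: fast weather noise is uncorrelated with earlier values of the persistent forcings, so the lagged covariances isolate the forced response. The synthetic example below is only an illustration of that estimator and omits the stepwise predictor-dropping refinement described in the abstract.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    T, tau = 480, 1                        # 40 years of monthly data, 1-month lag

    # Persistent (AR(1)) forcings standing in for, e.g., three SST modes
    eps = rng.normal(size=(T, 3))
    x = np.zeros((T, 3))
    for t in range(1, T):
        x[t] = 0.9 * x[t - 1] + eps[t]

    B_true = np.array([[0.5, 0.0, -0.2]])                  # assumed response matrix
    y = x @ B_true.T + rng.normal(scale=1.0, size=(T, 1))  # fast atmospheric noise

    C_yx = y[tau:].T @ x[:-tau] / (T - tau)   # lagged cross-covariance
    C_xx = x[tau:].T @ x[:-tau] / (T - tau)   # lagged covariance of the forcings
    B_hat = C_yx @ np.linalg.inv(C_xx)
    print(np.round(B_hat, 2))                 # should approximate B_true
    ```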

  7. Planck 2013 results. XXI. Power spectrum and high-order statistics of the Planck all-sky Compton parameter map

    DEFF Research Database (Denmark)

    Ade, P. A. R.; Aghanim, N.; Armitage-Caplan, C.

    2014-01-01

    with blindly detected clusters in the Planck SZ catalogue. To characterize the signal in the tSZ map we have computed its angular power spectrum. At large angular scales (ℓ < 60) … (ℓ > 500) the clustered cosmic...

  8. Assessment of engine׳s power budget for hydrogen powered hybrid buoyant aircraft

    Directory of Open Access Journals (Sweden)

    Anwar U. Haque

    2016-03-01

    It is well known that hydrogen produces fewer undesirable exhaust emissions than other types of liquid fuels. It can be used as an alternative fuel for a hybrid buoyant aircraft in which half of the gross takeoff weight is balanced by aerostatic lift. In the present study, the weight advantage of liquid hydrogen as an ideal fuel is explored for its further utilization in such aircraft. Existing relationships for estimating the zero-lift drag of an airship are discussed, with special focus on applying such analytical relationships to aircraft whose fuselage resembles the hull of an airship. Taking the analytical relationships of aircraft and airship design as a reference, existing relationships for estimating the power budget are systematically re-derived for the defined constraints of rate of climb, maximum velocity and takeoff ground roll. When propulsion sizing for liquid hydrogen is required, the presented framework for estimating the power budget provides a starting point for the analysis. An example of estimating the power requirement is also presented as a test case.

  9. Assessment of solar-powered cooling of buildings. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Curran, H.M.

    1975-04-01

    Three solar-powered cooling concepts are analyzed and evaluated. These are: (1) the solar Rankine concept in which a Rankine cycle driven by solar energy is used to drive a vapor compression refrigeration machine, (2) the solar-assisted Rankine concept in which a Rankine cycle driven by both solar energy and fuel combustion is used to drive a vapor compression refrigeration machine, and (3) the solar absorption concept in which solar energy is used to drive an absorption refrigeration machine. These concepts are compared on the bases of coefficient of performance, requirements for primary fuel input, and economic considerations. Conclusions and recommendations are presented. (WHK)

  10. Harmonic statistics

    Energy Technology Data Exchange (ETDEWEB)

    Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il

    2017-05-15

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
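
    A short simulation sketch of the object the abstract describes, a Poisson process on [a, b] with harmonic intensity λ(x) = c/x: the cumulative intensity is c·ln(b/a), and conditional on the count the points are log-uniform, which makes the scale invariance visible (a constant expected number of points per decade). The parameters are arbitrary.

    ```python
    import numpy as np

    def harmonic_poisson(c, a, b, rng):
        """Points of a Poisson process with intensity c/x on [a, b]."""
        n = rng.poisson(c * np.log(b / a))            # cumulative intensity c*ln(b/a)
        return np.sort(a * (b / a) ** rng.random(n))  # log-uniform positions

    rng = np.random.default_rng(5)
    pts = harmonic_poisson(c=3.0, a=0.01, b=100.0, rng=rng)
    # Scale invariance: every decade [x, 10x) holds c*ln(10) ~ 6.9 points on average.
    print(len(pts),
          np.sum((pts >= 0.1) & (pts < 1.0)),
          np.sum((pts >= 10.0) & (pts < 100.0)))
    ```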

  11. 77 FR 39695 - Idaho Power Company; Notice of Availability of Draft Environmental Assessment

    Science.gov (United States)

    2012-07-05

    ...-102 and 2061-086] Idaho Power Company; Notice of Availability of Draft Environmental Assessment In.... 2061) and has prepared a Draft Environmental Assessment. The projects are located on the Snake River in... Monument managed by the National Park Service. The Draft Environmental Assessment contains the Commission...

  12. Expert judgment-based risk assessment using statistical scenario analysis: a case study-running the bulls in Pamplona (Spain).

    Science.gov (United States)

    Mallor, Fermín; García-Olaverri, Carmen; Gómez-Elvira, Sagrario; Mateo-Collazas, Pedro

    2008-08-01

    In this article, we present a methodology to assess the risk incurred by a participant in an activity involving danger of injury. The lack of high-quality historical data for the case considered prevented us from constructing a sufficiently detailed statistical model. It was therefore decided to generate a risk assessment model based on expert judgment. The methodology is illustrated in a real case context: the assessment of risk to participants in a San Fermin bull-run in Pamplona (Spain). The members of the panel of "experts on the bull-run" represented very different perspectives on the phenomenon: runners, surgeons and other health care personnel, journalists, civil defense workers, security staff, organizers, herdsmen, authors of books on the bull-run, etc. We consulted 55 experts. Our methodology includes the design of a survey instrument to elicit the experts' views and the statistical and mathematical procedures used to aggregate their subjective opinions.

  13. Development of a New Safety Culture Assessment Method for Nuclear Power Plants (NPPs) (A study to suggest a new safety culture assessment method in nuclear power plants)

    Energy Technology Data Exchange (ETDEWEB)

    Han, Sang Min; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of)

    2014-08-15

    This study suggests a new safety culture assessment method for nuclear power plants. Criteria from various existing safety culture analysis methods are unified, and reliability analysis methods are applied. The concepts of the most representative such methods, Fault Tree Analysis (FTA) and Failure Mode and Effect Analysis (FMEA), are adopted to assess safety culture. Through this application, the suggested method is expected to yield convenient and objective results.
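
    The abstract gives no model details, so the sketch below is only a generic fault-tree evaluation with independent basic events and AND/OR gates, using invented probabilities, to show the kind of top-event calculation that FTA contributes:

    ```python
    # Generic fault-tree gates over independent basic-event probabilities
    def gate_and(*probs):                  # top fails only if all inputs fail
        out = 1.0
        for p in probs:
            out *= p
        return out

    def gate_or(*probs):                   # top fails if any input fails
        survive = 1.0
        for p in probs:
            survive *= (1.0 - p)
        return 1.0 - survive

    # Hypothetical top event: "safety culture degradation goes uncorrected"
    p_weak_reporting = 0.10                # invented basic-event probabilities
    p_poor_training = 0.05
    p_no_management_review = 0.02
    top = gate_and(gate_or(p_weak_reporting, p_poor_training), p_no_management_review)
    print(round(top, 4))                   # ~0.0029
    ```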

  14. Performance assessment and improvement of control charts for statistical batch process monitoring

    NARCIS (Netherlands)

    Ramaker, Henk-Jan; van Sprang, Eric N. M.; Westerhuis, Johan A.; Gurden, Stephen P.; Smilde, Age K.; van der Meulen, Frank H.

    2006-01-01

    This paper describes the concepts of statistical batch process monitoring and the associated problems. It starts with an introduction to process monitoring in general which is then extended to batch process monitoring. The performance of control charts for batch process monitoring is discussed by

  15. Investigation of properties of the statistics used to assess the reliability

    Directory of Open Access Journals (Sweden)

    С.В. Єгоров

    2007-03-01

    Monte Carlo modeling of random samples drawn from a general population is carried out for the purpose of determining the probability of non-failure operation. The quality of statistics of this kind is analyzed.
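
    A hedged reconstruction of this kind of experiment: draw random samples from an assumed life distribution and study the sampling behavior of the estimated probability of non-failure operation. The exponential life law, mission time, and sample sizes are all assumptions, not values from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    mission_time = 1000.0                  # hours (assumed)
    n, reps = 50, 2000                     # sample size and Monte Carlo repetitions

    estimates = np.empty(reps)
    for r in range(reps):
        lifetimes = rng.exponential(scale=4000.0, size=n)   # assumed life law
        estimates[r] = np.mean(lifetimes > mission_time)    # P(no failure) estimate

    true_p = np.exp(-mission_time / 4000.0)
    print(round(true_p, 3), round(estimates.mean(), 3), round(estimates.std(), 3))
    ```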

  16. Probabilistic risk assessment course documentation. Volume 2. Probability and statistics for PRA applications

    Energy Technology Data Exchange (ETDEWEB)

    Iman, R.L.; Prairie, R.R.; Cramond, W.R.

    1985-08-01

    This course is intended to provide the necessary probabilistic and statistical skills to perform a PRA. Fundamental background information is reviewed, but the principal purpose is to address specific techniques used in PRAs and to illustrate them with applications. Specific examples and problems are presented for most of the topics.

  17. A statistical strategy to assess cleaning level of surfaces using fluorescence spectroscopy and Wilks’ ratio

    DEFF Research Database (Denmark)

    Stoica, Iuliana-Madalina; Babamoradi, Hamid; van den Berg, Frans

    2017-01-01

    • A statistical strategy combining fluorescence spectroscopy, multivariate analysis and Wilks’ ratio is proposed. • The method was tested both off-line and on-line, with riboflavin as a (controlled) contaminant. • Wilks’ ratio signals unusual recordings based on shifts in variance and covariance stru...

  18. Assessing the Disconnect between Grade Expectation and Achievement in a Business Statistics Course

    Science.gov (United States)

    Berenson, Mark L.; Ramnarayanan, Renu; Oppenheim, Alan

    2015-01-01

    In an institutional review board--approved study aimed at evaluating differences in learning between a large-sized introductory business statistics course section using courseware assisted examinations compared with small-sized sections using traditional paper-and-pencil examinations, there appeared to be a severe disconnect between the final…

  19. Radiological Assessment for the Removal of Legacy BPA Power Lines that Cross the Hanford Site

    Energy Technology Data Exchange (ETDEWEB)

    Millsap, William J.; Brush, Daniel J.

    2013-11-13

    This paper discusses some radiological field monitoring and assessment methods used to assess the components of an old electrical power transmission line that ran across the Hanford Site between the production reactors area (100 Area) and the chemical processing area (200 Area). This task was complicated by the presence of radon daughters, both beta and alpha emitters, residing on the surfaces, particularly the surfaces of weathered metals and metals that had been electrically charged. In many cases, these activities were high compared to the DOE Surface Contamination Guidelines, which were used as guides for the assessment. The methods included the use of the Toulmin model of argument, represented using Toulmin diagrams, to capture the combined force of several strands of evidence, rather than a single measurement of activity, in demonstrating beyond a reasonable doubt that no or very little Hanford activity was present and mixed with the natural activity. A number of forms of evidence were used: the overall chance of Hanford contamination; measurements of removable activity, beta and alpha; 1-minute scaler counts of total surface activity, beta and alpha, using "background markers"; the beta-to-alpha activity ratios; measured contamination on nearby components; NaI gamma spectral measurements to compare uncontaminated and potentially contaminated spectra, as well as measurements for the sentinel radionuclides Am-241 and Cs-137 on conducting wire; comparative statistical analyses; and in-situ measurements of alpha spectra on conducting wire showing that the alpha activity was natural Po-210, together with comparisons of uncontaminated and potentially contaminated spectra.

  20. Combining Statistical Tools and Ecological Assessments in the Study of Biodeterioration Patterns of Stone Temples in Angkor (Cambodia)

    Science.gov (United States)

    Caneva, G.; Bartoli, F.; Savo, V.; Futagami, Y.; Strona, G.

    2016-09-01

    Biodeterioration is a major problem for the conservation of cultural heritage materials. We provide a new and original approach to analyzing changes in patterns of colonization (Biodeterioration patterns, BPs) by biological agents responsible for the deterioration of outdoor stone materials. Here we analyzed BPs of four Khmer temples in Angkor (Cambodia) exposed to variable environmental conditions, using qualitative ecological assessments and statistical approaches. The statistical analyses supported the findings obtained with the qualitative approach. Both approaches provided additional information not otherwise available using one single method. Our results indicate that studies on biodeterioration can benefit from integrating diverse methods so that conservation efforts might become more precise and effective.

  1. Sport-specific power assessment for rock climbing.

    Science.gov (United States)

    Draper, N; Dickson, T; Blackwell, G; Priestley, S; Fryer, S; Marshall, H; Shearman, J; Hamlin, M; Winter, D; Ellis, G

    2011-09-01

    The popularity of rock climbing has resulted in a growing research base for the sport. However, at present there is a lack of sport-specific measures of performance in the field. The aim of this study was to examine the use of the powerslap test as a sport-specific power measure. The participants in this study were categorised into four ability groups (novice, intermediate, advanced and elite) based on self-reported lead grade. Two separate experiments were conducted to determine validity and reliability. The powerslap test was conducted on a revolution board with two variations - wide and narrow grip - for both sides of the body. The test started with the climber hanging at full extension from two holds, from which a pull-up movement was made, releasing one hand to slap a scaled score board above. There was a significant relationship between powerslap scores and climbing ability (Left Wide: r=0.7, P…) …climbing performance.

  2. 77 FR 51071 - Indiana Michigan Power Company, Donald C. Cook Nuclear Plant, Unit 2, Environmental Assessment...

    Science.gov (United States)

    2012-08-23

    ... COMMISSION Indiana Michigan Power Company, Donald C. Cook Nuclear Plant, Unit 2, Environmental Assessment and... Indiana Michigan Power Company (the licensee), for operation of Donald C. Cook Nuclear Plant, Unit 2 (CNP... exemption, these proposed actions do not result in changes to land use or water use, or result in changes to...

  3. Life Cycle Assessment of a Natural Gas Combined Cycle Power Generation System

    Energy Technology Data Exchange (ETDEWEB)

    Spath, P.L.; Mann, M.K.

    2000-12-27

    Natural gas is used for steam and heat production in industrial processes, residential and commercial heating, and electric power generation. Because of its importance in the power mix, a life cycle assessment on electricity generation via a natural gas combined cycle system has been performed.

  4. 76 FR 26718 - Verdant Power, LLC; Notice of Availability of Environmental Assessment

    Science.gov (United States)

    2011-05-09

    ... Energy Regulatory Commission Verdant Power, LLC; Notice of Availability of Environmental Assessment In... Projects has reviewed Verdant Power, LLC's application for a 10-year pilot project license for the Roosevelt Island Tidal Energy Project (FERC Project No. 12611-005), which would be located on the East River...

  5. 77 FR 1674 - Ocean Renewable Power Company Maine, LLC; Notice of Availability of Environmental Assessment for...

    Science.gov (United States)

    2012-01-11

    ... Renewable Power Company, LLC's application for an 8-year pilot license for the proposed Cobscook Bay Tidal... Energy Regulatory Commission Ocean Renewable Power Company Maine, LLC; Notice of Availability of Environmental Assessment for the Proposed Cobscook Bay Tidal Energy Project In accordance with the National...

  6. 75 FR 14638 - FirstEnergy Nuclear Operating Company; Perry Nuclear Power Plant; Environmental Assessment and...

    Science.gov (United States)

    2010-03-26

    ... COMMISSION FirstEnergy Nuclear Operating Company; Perry Nuclear Power Plant; Environmental Assessment and Finding of No Significant Impact The U.S. Nuclear Regulatory Commission (NRC) is considering issuance of...Energy Nuclear Operating Company (FENOC, the licensee), for operation of the Perry Nuclear Power Plant...

  7. Multi-Level Risk Assessment of a Power Plant Gas Turbine Applying ...

    African Journals Online (AJOL)

    Multi-Level Risk Assessment of a Power Plant Gas Turbine Applying the Criticality Index Model. ... However, turbine maintenance in Nigerian power generating plants is unimaginably low; there are incessant plant shutdowns, and up to 95% of both foreign and local manufacturers have either shut down production or have ...

  8. 76 FR 52356 - Indiana Michigan Power Company, Donald C. Cook Nuclear Plant, Unit 1; Environmental Assessment...

    Science.gov (United States)

    2011-08-22

    ... COMMISSION Indiana Michigan Power Company, Donald C. Cook Nuclear Plant, Unit 1; Environmental Assessment and... to Indiana Michigan Power Company (the licensee), for operation of Donald C. Cook Nuclear Plant, Unit... Statement for Donald C. Cook Nuclear Plant, Unit 1, or the Generic Environmental Impact Statement for...

  9. A METHOD TO ASSESS THE DEVELOPMENT OF MUSCLE POWER IN PRETERMS AFTER TERM AGE

    NARCIS (Netherlands)

    DEGROOT, L; HOPKINS, B; TOUWEN, BCL

    The purpose of this paper is to report a detailed description of an instrument for evaluating the development of active and passive muscle power in preterms beyond term age. The instrument is constructed on the basis of age-specific items that assess these two components of muscle power and on the

  10. 77 FR 8903 - Environmental Assessment and Finding of No Significant Impact; Carolina Power and Light Company...

    Science.gov (United States)

    2012-02-15

    ... effects on the aquatic or terrestrial habitat in the vicinity of the plant, or to threatened, endangered... COMMISSION Environmental Assessment and Finding of No Significant Impact; Carolina Power and Light Company Shearon Harris Nuclear Power Plant, Unit 1 AGENCY: Nuclear Regulatory Commission. ACTION: Notice of...

  11. Assessment of Power Potential of Tidal Currents and Impacts of Power Extraction on Flow Speeds in Indonesia

    Science.gov (United States)

    Orhan, K.; Mayerle, R.

    2016-12-01

    A methodology comprising estimates of power yield, evaluation of the effects of power extraction on flow conditions, and near-field investigations to deliver wake characteristics, recovery and interactions is described and applied to several straits in Indonesia. Site selection is done with high-resolution, three-dimensional flow models providing sufficient spatiotemporal coverage. Much attention has been given to the meteorological forcing and the conditions at the open sea boundaries to adequately capture the density gradients and flow fields. Model verification using tidal records shows excellent agreement. Sites with adequate depth for energy conversion using horizontal-axis tidal turbines, average kinetic power density greater than 0.5 kW/m2, and surface area larger than 0.5 km2 are defined as energy hotspots. The spatial variation of the average extractable electric power is determined, and the annual tidal energy resource is estimated for the straits in question. The results show that the potential for tidal power generation in Indonesia is likely to exceed previous predictions, reaching around 4,800 MW. To assess the impact of the devices, flexible-mesh models with higher resolutions have been developed. Effects on flow conditions and near-field turbine wakes are resolved in greater detail with triangular horizontal grids. The energy is assumed to be removed uniformly by sub-grid-scale arrays of turbines, and calculations are made based on velocities at the hub heights of the devices. An additional drag force, resulting in dissipation of 10% to 60% of the pre-existing kinetic power within a flow cross-section, is introduced to capture the impacts. It was found that the effect of power extraction on water levels and flow speeds in adjacent areas is not significant. Results show that the method captures wake characteristics and recovery reasonably well at low computational cost.

  12. A New Statistical Approach for the Evaluation of Gap-prepulse Inhibition of the Acoustic Startle Reflex (GPIAS for Tinnitus Assessment

    Directory of Open Access Journals (Sweden)

    Achim Schilling

    2017-10-01

    Background: An increasingly used behavioral paradigm for the objective assessment of a possible tinnitus percept in animal models was proposed by Turner and coworkers in 2006. It is based on gap-prepulse inhibition (PPI) of the acoustic startle reflex (ASR) and is usually referred to as GPIAS. As it does not require conditioning, it became the method of choice to study neuroplastic phenomena associated with the development of tinnitus. Objective: It is still controversial whether GPIAS is really appropriate for tinnitus screening, as the hypothesis that a tinnitus percept impairs the gap detection ability (the “filling-in interpretation”) is still questioned. Furthermore, a wide range of criteria for positive tinnitus detection in GPIAS have been used across different laboratories, and there is still no consensus on a best practice for the statistical evaluation of GPIAS results. Current approaches are often based on simple averaging of measured PPI values and comparisons at the population level, without the possibility of performing valid statistics at the level of the single animal. Methods: A total of 32 animals were measured using the standard GPIAS paradigm with a varying number of measurement repetitions. Based on these data, further statistical considerations were performed. Results: We here present a new statistical approach to overcome the methodological limitations of GPIAS. In a first step we show that ASR amplitudes are not normally distributed. Next we estimate the distribution of the measured PPI values by exploiting the full combinatorial power of all measured ASR amplitudes. We demonstrate that the amplitude ratios (1-PPI) are approximately lognormally distributed, allowing for parametric testing of the logarithmized values, and we present a new statistical approach allowing for a valid and reliable statistical assessment of PPI changes in GPIAS. Conclusion: Based on our statistical approach we recommend using a constant criterion, which does not
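
    A minimal sketch of the recommended evaluation, under assumptions about the data layout: form all combinatorial gap/no-gap amplitude ratios (1 − PPI) for one animal, logarithmize them (approximately normal), and compare exposure conditions parametrically. The synthetic amplitude arrays and the two-sample t-test are illustrative stand-ins, not the authors' exact procedure.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    # Hypothetical startle amplitudes for one animal (lognormal, as the study finds)
    nogap_pre = rng.lognormal(3.0, 0.3, 15)
    gap_pre = rng.lognormal(2.6, 0.3, 15)      # strong inhibition before exposure
    nogap_post = rng.lognormal(3.0, 0.3, 15)
    gap_post = rng.lognormal(2.9, 0.3, 15)     # weaker inhibition afterwards

    def log_ratios(gap, nogap):
        """log(1 - PPI) for every gap/no-gap trial combination."""
        return np.log(gap[:, None] / nogap[None, :]).ravel()

    t, p = stats.ttest_ind(log_ratios(gap_pre, nogap_pre),
                           log_ratios(gap_post, nogap_post))
    print(round(float(t), 2), float(p))        # shift toward 0 suggests less inhibition
    ```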

  13. Assessment of economic factors affecting the satellite power system. Volume 2: The systems implications of rectenna siting issues

    Science.gov (United States)

    Chapman, P. K.; Bugos, B. J.; Csigi, K. I.; Glaser, P. E.; Schimke, G. R.; Thomas, R. G.

    1979-01-01

    The feasibility of finding potential sites for Solar Power Satellite (SPS) receiving antennas (rectennas) in the continental United States was evaluated, in sufficient numbers to permit the SPS to make a major contribution to U.S. generating facilities and to give statistical validity to an assessment of the characteristics of such sites and their implications for the design of the SPS system. It is found that the cost-optimum power output of the SPS does not depend on the particular value assigned to the cost per unit area of a rectenna and its site, as long as that cost is independent of rectenna area. Many characteristics of the chosen sites affect the optimum design of the rectenna itself.

  14. Life-assessment technique for nuclear power plant cables

    Energy Technology Data Exchange (ETDEWEB)

    Bartonicek, B.; Hnat, V.; Placek, V

    1998-06-01

    The condition of polymer-based cable material can be best characterized by measuring the elongation at break of its insulating materials. However, it is often not possible to take sufficiently large samples for measurement with the tensile testing machine. The problem has been conveniently solved by utilizing the differential scanning calorimetry technique. From the tested cable, several microsamples are taken and the oxidation induction time (OIT) is determined. For each cable subject to lifetime assessment, OIT has to be correlated with elongation at break, and elongation at break with the cable service time. A reliable assessment of the cable lifetime depends on the accuracy of these correlations. Consequently, synergistic effects that are well known at this time (dose rate effects and effects resulting from the different sequence of applying radiation and elevated temperature) must be taken into account.

  15. Power System Assessment for the Burnt Mountain Seismic Observatory

    Science.gov (United States)

    1994-03-01

    Generators Appendix B. Operation Manual for Sentinel Radioisotope Thermoelectric Generators Used in AFTAC Seismic Sensor Stations, Burnt Mountain, Alaska...and Maintenance Manual", Teledyne Energy Systems, Timonium, MD, 1991. US Army Corps of Engineers, Environmental Assessment AFTAC Project Ff1081 Alaska...not result in any appreciable radioactivity being dissolved in water. 2. Operational Accidents a. Terrestrial Applications Several situations can be

  16. Water treatment plants assessment at Talkha power plant.

    Science.gov (United States)

    El-Sebaie, Olfat D; Abd El-Kerim, Ghazy E; Ramadan, Mohamed H; Abd El-Atey, Magda M; Taha, Sahr Ahmed

    2002-01-01

    Talkha power plant is the only power plant located in El-Mansoura. It generates electricity using two different methods: steam turbines and gas turbines. Both plants draw water from the River Nile (208 m3/h). The Nile raw water passes through different treatment processes to make it suitable for drinking and operational uses. At Talkha power plant, there are two purification plants, one for drinking water supply (100 m3/h) and one for demineralized water supply (108 m3/h). This study aimed at evaluating the efficiency of the water purification plants. For the drinking water purification plant, the annual River Nile water was characterized by slightly alkaline pH (7.4-8) and high annual mean values of turbidity (10.06 NTU), standard plate count (SPC) (313.3 CFU/1 ml), total coliforms (2717/100 ml), fecal coliforms (0-2400/100 ml), and total algae (3 × 10^4 org/l). The dominant group of algae over the whole study period was green algae; blue-green algae were abundant in the summer and autumn seasons. The pH range and the annual mean values of turbidity, TDS, total hardness, sulfates, chlorides, nitrates, nitrites, fluoride, and residual chlorine for purified water were in compliance with Egyptian drinking water standards. All the recorded SPC values, with an annual mean of 10.13 CFU/1 ml, indicated that the chlorine dose and contact time were not enough to kill the bacteria; however, they were in compliance with the Egyptian decree (should not exceed 50 CFU/1 ml). Although the removal efficiency of the plant for total coliforms and blue-green algae was high (98.5% and 99.2%, respectively), the obtained results, with annual mean values of 40/100 ml and 15.6 org/l, were not in compliance with the Egyptian decree (water should be free from total coliforms, fecal coliforms and blue-green algae). For the water demineralization treatment plant, the raw water was characterized by slightly alkaline pH. The annual mean values of conductivity, turbidity, and TDS were 354.6 μS/cm, 10.84 NTU, and 214

  17. Performance Assessment of the Pico OWC Power Plant Following the Equimar Methodology

    DEFF Research Database (Denmark)

    Pecher, Arthur; Crom, I. Le; Kofoed, Jens Peter

    2011-01-01

    This paper presents the power performance of the Oscillating Water Column (OWC) wave energy converter installed on the Island of Pico. The performance assessment of the device is based on real performance data gathered over the last years during normal operation. In addition to the estimation and assessment of the wave energy converting capabilities, an investigation has also been made of the transmission of the wave power through the conversion chain.

  18. High Altitude Electromagnetic Pulse (HEMP) and High Power Microwave (HPM) Devices: Threat Assessments

    Science.gov (United States)

    2008-07-21

    against HEMP effects resulting from a nuclear exchange. The Limited Test Ban Treaty of 1963 prohibits nuclear explosions in the atmosphere, in space, and...

  19. Toxicological evaluation of realistic emission source aerosols (TERESA)--power plant studies: assessment of breathing pattern.

    Science.gov (United States)

    Diaz, Edgar A; Lemos, Miriam; Coull, Brent; Long, Mark S; Rohr, Annette C; Ruiz, Pablo; Gupta, Tarun; Kang, Choong-Min; Godleski, John J

    2011-08-01

    Our approach to study multi-pollutant aerosols isolates a single emissions source, evaluates the toxicity of primary and secondary particles derived from this source, and simulates chemical reactions that occur in the atmosphere after emission. Three U.S. coal-fired power plants utilizing different coals and with different emission controls were evaluated. Secondary organic aerosol (SOA) derived from α-pinene and/or ammonia was added in some experiments. Male Sprague-Dawley rats were exposed for 6 h to filtered air or different atmospheric mixtures. Scenarios studied at each plant included the following: primary particles (P); secondary (oxidized) particles (PO); oxidized particles + SOA (POS); and oxidized and neutralized particles + SOA (PONS); additional control scenarios were also studied. Continuous respiratory data were obtained during exposures using whole body plethysmography chambers. Of the 12 respiratory outcomes assessed, each had statistically significant changes at some plant and with some of the 4 scenarios. The most robust outcomes were found with exposure to the PO scenario (increased respiratory frequency with decreases in inspiratory and expiratory time); and the PONS scenario (decreased peak expiratory flow and expiratory flow at 50%). PONS findings were most strongly associated with ammonium, neutralized sulfate, and elemental carbon (EC) in univariate analyses, but only with EC in multivariate analyses. Control scenario O (oxidized without primary particles) had similar changes to PO. Adjusted R(2) analyses showed that scenario was a better predictor of respiratory responses than individual components, suggesting that the complex atmospheric mixture was responsible for respiratory effects.

  20. Power

    DEFF Research Database (Denmark)

    Elmholdt, Claus Westergård; Fogsgaard, Morten

    2016-01-01

    In this chapter, we will explore the dynamics of power in processes of creativity, and show its paradoxical nature as both a bridge and a barrier to creativity in organisations. Recent social psychological experimental research (Sligte, De Dreu & Nijstad, 2011) on the relation between power… and creativity suggests that when managers give people the opportunity to gain power and explicate that there is reason to be more creative, people will show a boost in creative behaviour. Moreover, this process works best in unstable power hierarchies, which implies that power is treated as a negotiable… and floating source for empowering people in the organisation. We will explore and discuss here the potentials, challenges and pitfalls of power in relation to creativity in the life of organisations today. The aim is to demonstrate that power struggles may be utilised as constructive sources of creativity...

  1. Final Report: Assessment of Combined Heat and Power Premium Power Applications in California

    Energy Technology Data Exchange (ETDEWEB)

    Norwood, Zack; Lipman, Tim; Marnay, Chris; Kammen, Dan

    2008-09-30

    This report analyzes the current economic and environmental performance of combined heat and power (CHP) systems in power-interruption-intolerant commercial facilities. Through a series of three case studies, key trade-offs are analyzed with regard to the provision of blackout ride-through capability with the CHP systems and the resulting ability to avoid the need for at least some diesel backup generator capacity located at the case study sites. Each of the selected sites currently has a CHP or combined heating, cooling, and power (CCHP) system in addition to diesel backup generators. In all cases the CHP/CCHP system has a small fraction of the electrical capacity of the diesel generators. Although none of the selected sites can currently run their CHP systems as emergency backup power, all could be retrofitted to provide this blackout ride-through capability, and new CHP systems can be installed with this capability. The following three sites/systems were used for this analysis: (1) Sierra Nevada Brewery - Using 1 MW of installed molten carbonate fuel cells operating on a combination of digester gas (from the beer brewing process) and natural gas, this facility can produce electricity and heat for the brewery and attached bottling plant. The major thermal load on-site is keeping the brewing tanks at appropriate temperatures. (2) NetApp Data Center - Using 1.125 MW of Hess Microgen natural-gas-fired reciprocating engine-generators, with exhaust gas and jacket water heat recovery attached to over 300 tons of adsorption chillers, this combined cooling and power system provides electricity and cooling to a data center with a 1,200 kW peak electrical load. (3) Kaiser Permanente Hayward Hospital - With 180 kW of Tecogen natural-gas-fired reciprocating engine-generators, this CHP system generates steam for space heating and hot water for a city hospital. For all sites, similar assumptions are made about the economic and technological constraints of the

  2. Assessment of Lower Limb Muscle Strength and Power Using Hand-Held and Fixed Dynamometry: A Reliability and Validity Study.

    Directory of Open Access Journals (Sweden)

    Benjamin F Mentiplay

    Hand-held dynamometry (HHD) has never previously been used to examine isometric muscle power. Rate of force development (RFD) is often used for muscle power assessment, however no consensus currently exists on the most appropriate method of calculation. The aim of this study was to examine the reliability of different algorithms for RFD calculation and to examine the intra-rater, inter-rater, and inter-device reliability of HHD as well as the concurrent validity of HHD for the assessment of isometric lower limb muscle strength and power. 30 healthy young adults (age: 23±5 yrs, male: 15) were assessed on two sessions. Isometric muscle strength and power were measured using peak force and RFD respectively using two HHDs (Lafayette Model-01165 and Hoggan microFET2) and a criterion-reference KinCom dynamometer. Statistical analysis of reliability and validity comprised intraclass correlation coefficients (ICC), Pearson correlations, concordance correlations, standard error of measurement, and minimal detectable change. Comparison of RFD methods revealed that a peak 200 ms moving window algorithm provided optimal reliability results. Intra-rater, inter-rater, and inter-device reliability analysis of peak force and RFD revealed mostly good to excellent reliability (coefficients ≥ 0.70) for all muscle groups. Concurrent validity analysis showed moderate to excellent relationships between HHD and fixed dynamometry for the hip and knee (ICCs ≥ 0.70) for both peak force and RFD, with mostly poor to good results shown for the ankle muscles (ICCs = 0.31-0.79). Hand-held dynamometry has good to excellent reliability and validity for most measures of isometric lower limb strength and power in a healthy population, particularly for proximal muscle groups. To aid implementation we have created freely available software to extract these variables from data stored on the Lafayette device. Future research should examine the reliability and validity of these variables in

  3. Assessment of Lower Limb Muscle Strength and Power Using Hand-Held and Fixed Dynamometry: A Reliability and Validity Study.

    Science.gov (United States)

    Mentiplay, Benjamin F; Perraton, Luke G; Bower, Kelly J; Adair, Brooke; Pua, Yong-Hao; Williams, Gavin P; McGaw, Rebekah; Clark, Ross A

    2015-01-01

    Hand-held dynamometry (HHD) has never previously been used to examine isometric muscle power. Rate of force development (RFD) is often used for muscle power assessment; however, no consensus currently exists on the most appropriate method of calculation. The aim of this study was to examine the reliability of different algorithms for RFD calculation and to examine the intra-rater, inter-rater, and inter-device reliability of HHD as well as the concurrent validity of HHD for the assessment of isometric lower limb muscle strength and power. 30 healthy young adults (age: 23±5 yrs, male: 15) were assessed on two sessions. Isometric muscle strength and power were measured using peak force and RFD, respectively, using two HHDs (Lafayette Model-01165 and Hoggan microFET2) and a criterion-reference KinCom dynamometer. Statistical analysis of reliability and validity comprised intraclass correlation coefficients (ICC), Pearson correlations, concordance correlations, standard error of measurement, and minimal detectable change. Comparison of RFD methods revealed that a peak 200 ms moving window algorithm provided optimal reliability results. Intra-rater, inter-rater, and inter-device reliability analysis of peak force and RFD revealed mostly good to excellent reliability (coefficients ≥ 0.70) for all muscle groups. Concurrent validity analysis showed moderate to excellent relationships between HHD and fixed dynamometry for the hip and knee (ICCs ≥ 0.70) for both peak force and RFD, with mostly poor to good results shown for the ankle muscles (ICCs = 0.31-0.79). Hand-held dynamometry has good to excellent reliability and validity for most measures of isometric lower limb strength and power in a healthy population, particularly for proximal muscle groups. To aid implementation we have created freely available software to extract these variables from data stored on the Lafayette device. Future research should examine the reliability and validity of these variables in clinical
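
    As an illustration of the peak 200 ms moving-window approach to RFD named above, a minimal Python sketch is given below. The function name, the window handling, and the synthetic force trace are illustrative assumptions; the study's own freely available software for the Lafayette device is not reproduced here.

        import numpy as np

        def peak_rfd(force, fs, window_ms=200):
            # Peak rate of force development (N/s): the steepest mean slope
            # found by sliding a window of `window_ms` over the force trace.
            n = int(round(fs * window_ms / 1000.0))
            if not 0 < n < len(force):
                raise ValueError("window must be shorter than the recording")
            dt = n / fs                        # window duration in seconds
            diffs = force[n:] - force[:-n]     # force change over each window
            return diffs.max() / dt

        # Usage sketch with a synthetic 1 kHz isometric trace (values assumed).
        fs = 1000
        t = np.arange(0, 2, 1 / fs)
        force = 400 / (1 + np.exp(-12 * (t - 0.5)))   # sigmoid rise to ~400 N
        print(peak_rfd(force, fs))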

  4. Demonstration of Recessed Downlight Technologies: Power and Illumination Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Parker, Steven A.; Beeson, Tracy A.

    2009-11-20

    Solid-state lighting (SSL), specifically light-emitting diodes (LED), has been advancing at a rapid pace, and multiple products are now available that serve as direct replacements for traditional luminaires. In this demonstration, conventional recessed lights in a conference room were used to compare conventional incandescent A-lamps, incandescent reflector R-lamps, and dimming compact fluorescent lamps (CFL) to an LED replacement product. The primary focus of the study was the light delivered to the task plane relative to the power required by the lighting system. Vertical illuminance, dimming range, and color shift are also important indicators of lighting quality and are discussed in the report. The results clearly showed that LEDs, with dimming-capable drivers, are much more efficient than incandescent lamps and CFLs. Further, LEDs provide much smoother and more consistent dimming than dimmable CFLs. On the potential negative side, it is important that the dimming switch be identified as compatible with the LED driver. A wide variety of dimmer switches can dim LEDs down to 15% of full light output, while select others can dim LEDs down to 5%. In addition, LEDs can be intense light sources, which can result in uncomfortable glare in some applications and for some occupants. Higher ceilings (9 feet or greater) or non-specular reflectors can alleviate the potential for glare.

  5. Thermal Enhancement of Silicon Carbide (SiC) Power Electronics and Laser Bars: Statistical Design Optimization of a Liquid-Cooled Power Electronic Heat Sink

    Science.gov (United States)

    2015-08-01

    Scofield, James D. (AFRL in-house research, program element 62203F, project 3145). Only report documentation page fragments are recoverable for this record; the work concerns statistical design optimization of a liquid-cooled heat sink for SiC power electronics and related thermal management components.

  6. Assessing statistical views of natural selection: Room for non-local causation?

    Science.gov (United States)

    Huneman, Philippe

    2013-12-01

    Recently some philosophers (the "statisticalists") have emphasized a potentially irreconcilable conceptual antagonism between the statistical characterization of natural selection (derived from population genetics) and the standard scientific discussion of natural selection in terms of forces and causes. Other philosophers have developed an account of the causal character of selectionist statements represented in terms of counterfactuals. I examine the compatibility between such statisticalism and counterfactually based causal accounts of natural selection (and related arguments about counterfactuals and causality) by distinguishing two distinct statisticalist claims: firstly, the suggested impossibility for natural selection to be a cause acting upon populations; and secondly, the conceptualization that all evolutionary causes occur at the level of interactions between individual organisms. I argue that deriving the latter from the former involves supplementary assumptions concerning precisely what causation is. I critically examine two of these assumptions, purportedly preventing natural selection from being regarded as a cause: the locality claim and the modularity claim. I conclude that justifying the strongest version of statisticalism, i.e. that evolutionary causation only occurs at the level of individual interactions between organisms, would require further metaphysical arguments that are likely to be deemed highly problematic. Additionally, I argue that such a metaphysical position would be incongruous with both our scientific and ordinary use of the concepts of causality and explanation. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. Rainfall Downscaling Conditional on Upper-air Atmospheric Predictors: Improved Assessment of Rainfall Statistics in a Changing Climate

    Science.gov (United States)

    Langousis, Andreas; Mamalakis, Antonis; Deidda, Roberto; Marrocu, Marino

    2015-04-01

    regional level. This is done for an intermediate-sized catchment in Italy, i.e. the Flumendosa catchment, using climate model rainfall and atmospheric data from the ENSEMBLES project (http://ensembleseu.metoffice.com). In doing so, we split the historical rainfall record of mean areal precipitation (MAP) in 15-year calibration and 45-year validation periods, and compare the historical rainfall statistics to those obtained from: a) Q-Q corrected climate model rainfall products, and b) synthetic rainfall series generated by the suggested downscaling scheme. To our knowledge, this is the first time that climate model rainfall and statistically downscaled precipitation are compared to catchment-averaged MAP at a daily resolution. The obtained results are promising, since the proposed downscaling scheme is more accurate and robust in reproducing a number of historical rainfall statistics, independent of the climate model used and the length of the calibration period. This is particularly the case for the yearly rainfall maxima, where direct statistical correction of climate model rainfall outputs shows increased sensitivity to the length of the calibration period and the climate model used. The robustness of the suggested downscaling scheme in modeling rainfall extremes at a daily resolution, is a notable feature that can effectively be used to assess hydrologic risk at a regional level under changing climatic conditions. Acknowledgments The research project is implemented within the framework of the Action «Supporting Postdoctoral Researchers» of the Operational Program "Education and Lifelong Learning" (Action's Beneficiary: General Secretariat for Research and Technology), and is co-financed by the European Social Fund (ESF) and the Greek State. CRS4 highly acknowledges the contribution of the Sardinian regional authorities.
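
    The record compares the suggested downscaling scheme against Q-Q corrected climate model rainfall. For readers unfamiliar with that baseline, a minimal Python sketch of empirical quantile-quantile mapping is shown below; the function name and the synthetic gamma-distributed rainfall are assumptions for illustration, not the ENSEMBLES processing chain.

        import numpy as np

        def qq_correct(model_cal, obs_cal, model_out, n_quantiles=100):
            # Empirical Q-Q (quantile mapping) bias correction: map each model
            # value through the calibration-period transfer function so that
            # corrected quantiles match the observed quantiles.
            probs = np.linspace(0.0, 1.0, n_quantiles)
            model_q = np.quantile(model_cal, probs)
            obs_q = np.quantile(obs_cal, probs)
            return np.interp(model_out, model_q, obs_q)

        # Usage sketch with synthetic daily rainfall (mm); parameters assumed.
        rng = np.random.default_rng(0)
        obs_cal = rng.gamma(0.6, 8.0, 15 * 365)    # 15-year calibration record
        mod_cal = rng.gamma(0.5, 6.0, 15 * 365)    # biased model, same period
        mod_val = rng.gamma(0.5, 6.5, 45 * 365)    # 45-year validation output
        corrected = qq_correct(mod_cal, obs_cal, mod_val)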

  8. A statistical model for stock assessment of southern bluefin tuna with ...

    African Journals Online (AJOL)

    Assessment of the status of southern bluefin tuna (SBT) by Australia and Japan has used a method (ADAPT) that imposes a number of structural restrictions, and is similar to methods used for a number of stocks worldwide. A flexible method for assessment of the SBT population is presented that is much less restrictive and ...

  9. New adaptive statistical iterative reconstruction ASiR-V: Assessment of noise performance in comparison to ASiR.

    Science.gov (United States)

    De Marco, Paolo; Origgi, Daniela

    2018-01-24

    To assess the noise characteristics of the new adaptive statistical iterative reconstruction (ASiR-V) in comparison to ASiR. A water phantom was acquired with common clinical scanning parameters at five different levels of CTDIvol. Images were reconstructed with different kernels (STD, SOFT, and BONE), different IR levels (40%, 60%, and 100%), and different slice thicknesses (ST) (0.625 and 2.5 mm), both for ASiR-V and ASiR. Noise properties were investigated and the noise power spectrum (NPS) was evaluated. ASiR-V significantly reduced noise relative to FBP: noise reduction was in the range 23%-60% for the 0.625 mm ST and 12%-64% for the 2.5 mm ST. Above 2 mGy, noise reduction for ASiR-V had no dependence on dose. Noise reduction for ASiR-V depends on ST, being greater for the STD and SOFT kernels at 2.5 mm. For the STD kernel, ASiR-V gives greater noise reduction than ASiR for both STs. For the SOFT kernel, results vary according to dose and ST, while for the BONE kernel ASiR-V shows less noise reduction. The NPS for the CT Revolution has dose-dependent behavior at lower doses. The NPS for ASiR-V and ASiR is similar, showing a shift toward lower frequencies as the IR level increases for the STD and SOFT kernels. The NPS differs between ASiR-V and ASiR with the BONE kernel. The NPS for ASiR-V appears to be ST dependent, having a shift toward lower frequencies for the 2.5 mm ST. ASiR-V showed greater noise reduction than ASiR for the STD and SOFT kernels, while keeping the same NPS. For the BONE kernel, ASiR-V presents a completely different behavior, with less noise reduction and a modified NPS. The noise properties of ASiR-V depend on reconstruction slice thickness and suggest the need for further measurements and efforts to establish new CT protocols to optimize clinical imaging. © 2018 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
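
    A minimal Python sketch of how a 2-D noise power spectrum can be estimated from uniform water-phantom regions of interest is given below; the function name, the ROI handling, and the white-noise test data are assumptions for illustration and do not reproduce the authors' measurement procedure.

        import numpy as np

        def nps_2d(rois, pixel_mm):
            # 2-D NPS from a stack of noise-only ROIs: average the squared
            # magnitude of the FFT of each mean-subtracted ROI, scaled by
            # pixel area over the number of pixels (units: HU^2 * mm^2).
            n_roi, ny, nx = rois.shape
            spectra = np.empty((n_roi, ny, nx))
            for i, roi in enumerate(rois):
                detrended = roi - roi.mean()          # remove the DC term
                f = np.fft.fftshift(np.fft.fft2(detrended))
                spectra[i] = np.abs(f) ** 2
            nps = spectra.mean(axis=0) * pixel_mm * pixel_mm / (nx * ny)
            freqs = np.fft.fftshift(np.fft.fftfreq(nx, d=pixel_mm))
            return nps, freqs

        # Usage sketch: 64 synthetic 128x128 ROIs of 10 HU white noise.
        rng = np.random.default_rng(1)
        rois = rng.normal(0.0, 10.0, (64, 128, 128))
        nps, freqs = nps_2d(rois, pixel_mm=0.5)

    A shift of the spectrum's peak toward lower frequencies, as reported for increasing IR levels, would show up directly in such a map after radial averaging.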

  10. Wind farm efficiency assessed by WRF with a statistical-dynamical approach

    DEFF Research Database (Denmark)

    Volker, Patrick; Badger, Jake; Hahmann, Andrea N.

    2016-01-01

    A pledge to increase the share of renewable energies has led to a focus on offshore wind energy in many western European countries. With an increasing number of offshore wind farms to be installed, it becomes important to understand (I) the degree to which wakes from neighbouring wind farms affect the power production of a target wind farm and (II) how large wind farms can get if they are to remain efficient and productive power generators. The modelling of wind farm wake flows is challenging, since it includes processes from the micro- to mesoscale meteorology. We use the Weather Research and Forecast (WRF) model, which allows us to simulate mesoscale features of wind farm wakes. Its limited horizontal resolution – in microscale terms – however, requires flow characteristics, such as single turbine wakes, to be parametrised.

  11. AN ASSESSMENT OF FLYWHEEL HIGH POWER ENERGY STORAGE TECHNOLOGY FOR HYBRID VEHICLES

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, James Gerald [ORNL

    2012-02-01

    An assessment has been conducted for the DOE Vehicle Technologies Program to determine the state of the art of advanced flywheel high power energy storage systems to meet hybrid vehicle needs for high power energy storage and energy/power management. Flywheel systems can be implemented with either an electrical or a mechanical powertrain. The assessment elaborates upon flywheel rotor design issues of stress, materials, and aspect ratio. Twelve organizations that produce flywheel systems submitted specifications for flywheel energy storage systems to meet minimum energy and power requirements for both light-duty and heavy-duty hybrid applications of interest to DOE. The most extensive experience operating flywheel high power energy storage systems in heavy-duty and light-duty hybrid vehicles is in Europe. Recent advances in Europe, in a number of vehicle racing venues and in advanced road-car evaluations, are discussed. As a frame of reference, nominal weight and specific power for non-energy-storage components of Toyota hybrid electric vehicles are summarized. The most effective utilization of flywheels is in providing high power while providing just enough energy storage to accomplish the power assist mission effectively. Flywheels are shown to meet or exceed the USABC power-related goals (discharge power, regenerative power, specific power, power density, weight, and volume) for HEV and EV batteries and ultracapacitors. The greatest technical challenge facing the developer of vehicular flywheel systems remains the issue of safety and containment. Flywheel safety issues must be addressed during the design and testing phases to ensure that production flywheel systems can be operated with adequately low risk.

  12. Gasification/combined-cycle power generation: environmental assessment of alternative systems

    Energy Technology Data Exchange (ETDEWEB)

    1978-11-01

    This report provides a basis for the comparative assessment of the potential performance capability, technological development, and economic and environmental impact associated with the operation of integrated low-Btu coal-gasification/combined-cycle power systems. Characterization of the integrated power system in terms of fuel processing, power production, and auxiliary systems is followed by comparisons of alternative integrated-plant-design/fuel combinations with reference to the conventional coal-fired power plant, taking into account both economic and environmental factors. The report includes an assessment of the effects of recent regulatory changes on the prospects for integrated power systems and establishes a timetable for the probable commercial development of such systems by the utilities.

  13. Objective assessment of the effect of pupil size upon the power distribution of multifocal contact lenses.

    Science.gov (United States)

    Papadatou, Eleni; Del Águila-Carrasco, Antonio J; Esteve-Taboada, José J; Madrid-Costa, David; Cerviño-Expósito, Alejandro

    2017-01-01

    To analytically assess the effect of pupil size upon the refractive power distributions of different designs of multifocal contact lenses. Two multifocal contact lenses of center-near design and one multifocal contact lens of center-distance design were used in this study. Their power profiles were measured using the NIMO TR1504 device (LAMBDA-X, Belgium). Based on these power profiles, the power distribution was assessed as a function of pupil size. For the high-addition lenses, the resulting refractive power as a function of viewing distance (far, intermediate, and near) and pupil size was also analyzed. The power distribution of the lenses was affected by pupil size in different ways. One of the lenses showed a significant spread in refractive power distribution, from about -3 D to 0 D. Generally, the power distribution of the lenses expanded as the pupil diameter became greater. The surface of the lens dedicated to each distance varied substantially with the design of the lens. On an experimental basis, our results show how the lenses' power distribution is affected by pupil size and underline the necessity of careful evaluation of the patient's visual needs and of the optical properties of a multifocal contact lens for achieving the optimal visual outcome.
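
    Because each annulus of the optic zone contributes area in proportion to its radius, the pupil-size dependence described above can be reproduced from a measured power profile by area-weighting. The Python sketch below is a generic illustration; the function name and the synthetic center-near profile are assumptions, not the NIMO TR1504 output format.

        import numpy as np

        def mean_power_in_pupil(radii_mm, power_D, pupil_diam_mm):
            # Area-weighted mean refractive power inside the pupil: an annulus
            # at radius r with width dr contributes area 2*pi*r*dr.
            inside = radii_mm <= pupil_diam_mm / 2.0
            r, p = radii_mm[inside], power_D[inside]
            weights = 2.0 * np.pi * r * np.gradient(r)
            return np.average(p, weights=weights)

        # Usage sketch: synthetic center-near profile, -3 D distance power
        # with a +2.5 D central add (parameters assumed).
        r = np.linspace(0.05, 4.0, 200)
        profile = -3.0 + 2.5 * np.exp(-(r / 1.2) ** 2)
        for d_mm in (3.0, 5.0, 7.0):
            print(d_mm, round(mean_power_in_pupil(r, profile, d_mm), 2))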

  14. Power

    OpenAIRE

    Hafford-Letchfield, Trish

    2015-01-01

    This chapter looks at the concept of power in social work by focusing on what this means as a ‘professional’ and theorizes competing discourses of empowerment in social work and its key concepts, drawing in particular on the explanatory powers of critical theorist Michel Foucault (1991). The chapter problematizes the concept of power by explicitly drawing on both users’ and carers’ accounts from the literature to demonstrate different external and internal influences on the root causes of dis...

  15. From eyeballing to statistical modelling : methods for assessment of occupational exposure

    NARCIS (Netherlands)

    Kromhout, H.

    1994-01-01

    In this thesis methods for assessment of occupational exposure are evaluated and developed. These methods range from subjective methods (qualitative and semiquantitative) to more objective quantitative methods based on actual measurement of personal exposure to chemical and physical

  16. Individual cortical connectivity changes after stroke: a resampling approach to enable statistical assessment at single-subject level.

    Science.gov (United States)

    Petti, M; Pichiorri, F; Toppi, J; Cincotti, F; Salinari, S; Babiloni, F; Mattia, D; Astolfi, L

    2014-01-01

    One of the main limitations commonly encountered when dealing with the estimation of brain connectivity is the difficulty of performing a statistical assessment of significant changes in brain networks at a single-subject level. This is mainly due to the lack of information about the distribution of the connectivity estimators under different conditions. While group analysis is commonly adopted to perform a statistical comparison between conditions, it may impose major limitations when dealing with the heterogeneity expressed by a given clinical condition in patients. This holds true particularly for stroke, when seeking quantitative measurements of the efficacy of any rehabilitative intervention promoting recovery of function. An assessment is therefore needed that can account for the individual pathological network configuration associated with different levels of patients' response to treatment; such network configuration is highly related to the effect that a given brain lesion has on neural networks. In this study we propose a resampling-based approach to the assessment of statistically significant changes in cortical connectivity networks at a single-subject level. First, we provide the results of a simulation study testing the performance of the proposed approach under different conditions. Then, to show the sensitivity of the method, we describe its application to electroencephalographic (EEG) data recorded from two post-stroke patients who showed different clinical recovery after a rehabilitative intervention.
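
    The resampling idea can be illustrated with a generic label-shuffling test on trial-wise connectivity estimates. The Python sketch below is an assumption-laden analogue, not the authors' estimator-specific procedure for EEG connectivity.

        import numpy as np

        def permutation_test(conn_a, conn_b, n_perm=5000, seed=0):
            # Two-sided single-subject test: build the null distribution of the
            # mean connectivity difference by shuffling condition labels across
            # trials, then compare the observed difference against it.
            rng = np.random.default_rng(seed)
            pooled = np.concatenate([conn_a, conn_b])
            n_a = len(conn_a)
            observed = conn_a.mean() - conn_b.mean()
            null = np.empty(n_perm)
            for i in range(n_perm):
                rng.shuffle(pooled)
                null[i] = pooled[:n_a].mean() - pooled[n_a:].mean()
            p_value = (np.abs(null) >= abs(observed)).mean()
            return observed, p_value

        # Usage sketch with synthetic trial-wise estimates for one network edge.
        rng = np.random.default_rng(42)
        pre = rng.normal(0.30, 0.05, 40)    # 40 trials before intervention
        post = rng.normal(0.36, 0.05, 40)   # 40 trials after intervention
        print(permutation_test(post, pre))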

  17. The Risk Assessment Study for Electric Power Marketing Competitiveness Based on Cloud Model and TOPSIS

    Science.gov (United States)

    Li, Cunbin; Wang, Yi; Lin, Shuaishuai

    2017-09-01

    With the rapid development of the energy internet and the deepening of electric power reform, the traditional electric power marketing mode no longer suits most electric power enterprises, which must therefore seek a breakthrough. In the face of increasingly complex marketing information, however, making a quick and reasonable transformation while keeping the assessment of electric power marketing competitiveness accurate and objective becomes a major problem. In this paper, a cloud model and TOPSIS method is proposed. First, the electric power marketing competitiveness evaluation index system is built. Then the cloud model is used to transform qualitative evaluations of the marketing data into quantitative values, and the entropy weight method is used to weaken the subjective factors in the evaluation index weights. Finally, the closeness degrees of the alternatives are obtained by the TOPSIS method. This method provides a novel solution for electric power marketing competitiveness evaluation. A case analysis verifies the effectiveness and feasibility of the model.
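
    The TOPSIS step named above ranks alternatives by their closeness to an ideal solution. A minimal Python sketch follows; the weights, criteria, and scores are invented for illustration, and the cloud-model transformation that would produce them is not shown.

        import numpy as np

        def topsis(matrix, weights, benefit):
            # matrix:  (alternatives x criteria) decision matrix
            # weights: criterion weights (e.g. from the entropy weight method)
            # benefit: True where larger is better, False for cost criteria
            norm = matrix / np.linalg.norm(matrix, axis=0)  # vector normalisation
            v = norm * weights
            ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
            anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
            d_pos = np.linalg.norm(v - ideal, axis=1)
            d_neg = np.linalg.norm(v - anti, axis=1)
            return d_neg / (d_pos + d_neg)   # closeness: nearer 1 is better

        # Usage sketch: three utilities scored on four marketing criteria
        # (market share, revenue, complaint rate, service score; all assumed).
        m = np.array([[0.7, 120.0, 0.3, 8.0],
                      [0.9, 100.0, 0.2, 7.5],
                      [0.6, 140.0, 0.4, 9.0]])
        w = np.array([0.30, 0.20, 0.25, 0.25])
        print(topsis(m, w, benefit=np.array([True, True, False, True])))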

  18. Assessment of the Economic Potential of Microgrids for Reactive Power Supply

    Energy Technology Data Exchange (ETDEWEB)

    Appen, Jan von; Marnay, Chris; Stadler, Michael; Momber, Ilan; Klapp, David; Scheven, Alexander von

    2011-05-01

    As power generation from variable distributed energy resources (DER) grows, energy flows in the network are changing, increasing the requirements for ancillary services, including voltage support. With the appropriate power converter, DER can provide ancillary services such as frequency control and voltage support. This paper outlines the economic potential of DERs coordinated in a microgrid to provide reactive power and voltage support at the microgrid's point of common coupling. The DER Customer Adoption Model assesses the costs of providing reactive power, given local utility rules. Depending on the installed DER, the cost-minimizing solution for supplying reactive power locally is chosen. Costs include the variable cost of the additional losses and the investment cost of appropriately over-sizing converters or purchasing capacitors. A case study of a large health care building in San Francisco is used to evaluate different revenue possibilities for creating an incentive for microgrids to provide reactive power.
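
    The cost of "appropriately over-sizing converters" mentioned above follows from the standard apparent-power relation S = sqrt(P^2 + Q^2). The short Python sketch below illustrates the sizing arithmetic only; the figures are assumed and the DER Customer Adoption Model's cost treatment is not reproduced.

        import math

        def converter_rating_kva(p_kw, q_kvar):
            # Apparent power a converter must carry to deliver P and Q at once;
            # the margin over P alone is the 'oversize' attributable to
            # reactive power supply.
            s_kva = math.hypot(p_kw, q_kvar)
            return s_kva, s_kva - p_kw

        # Usage sketch: a 500 kW DER asked for 250 kvar of voltage support.
        s, oversize = converter_rating_kva(500.0, 250.0)
        print(f"rating {s:.0f} kVA, {oversize:.0f} kVA above a P-only design")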

  19. A statistical toolbox for metagenomics: assessing functional diversity in microbial communities

    Directory of Open Access Journals (Sweden)

    Handelsman Jo

    2008-01-01

    Full Text Available Abstract Background The 99% of bacteria in the environment that are recalcitrant to culturing have spurred the development of metagenomics, a culture-independent approach to sample and characterize microbial genomes. Massive datasets of metagenomic sequences have been accumulated, but analysis of these sequences has focused primarily on the descriptive comparison of the relative abundance of proteins that belong to specific functional categories. More robust statistical methods are needed to make inferences from metagenomic data. In this study, we developed and applied a suite of tools to describe and compare the richness, membership, and structure of microbial communities using peptide fragment sequences extracted from metagenomic sequence data. Results Application of these tools to acid mine drainage, soil, and whale fall metagenomic sequence collections revealed groups of peptide fragments with a relatively high abundance and no known function. When combined with analysis of 16S rRNA gene fragments from the same communities, these tools enabled us to demonstrate that although there was no overlap in the types of 16S rRNA gene sequence observed, there was a core collection of operational protein families that was shared among the three environments. Conclusion The results of the comparisons between the three habitats were surprising considering the relatively low overlap of membership and the distinctly different characteristics of the three habitats. These tools will facilitate the use of metagenomics to pursue statistically sound genome-based ecological analyses.
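
    As a concrete example of the kind of richness inference such a toolbox performs, the Python sketch below computes the widely used Chao1 estimate from per-family abundance counts; the function and the toy counts are illustrative assumptions, not the authors' published tools.

        from collections import Counter

        def chao1(family_counts):
            # Chao1 richness: S_obs + f1^2 / (2 * f2), where f1 and f2 are the
            # numbers of families seen exactly once and exactly twice; the
            # bias-corrected fallback handles f2 == 0.
            tallies = Counter(family_counts)
            s_obs = len(family_counts)
            f1, f2 = tallies[1], tallies[2]
            if f2 > 0:
                return s_obs + f1 * f1 / (2.0 * f2)
            return s_obs + f1 * (f1 - 1) / 2.0

        # Usage sketch: abundances of eight observed operational protein families.
        print(chao1([5, 1, 2, 1, 8, 1, 2, 3]))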

  20. Performance statistics of wind power systems in Germany. September through December 2004; Leistungsstatistik der Windkraftanlagen in Deutschland. September, Oktober, November und Dezember 2004

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    2005-07-01

    Statistics are presented for the months September through December 2004, covering 4,472 plants with a total capacity of 4,405 MW, based on performance data reported by manufacturers and operators. Only plants whose monthly yields are reported to the Ingenieur-Werkstatt Energietechnik are included; plants without performance reports are not printed, so the statistics are necessarily incomplete, containing only about one in three wind power systems in Germany. Still, after 'Windstats' it is the most extensive data collection on wind power system performance worldwide. As of 31 December 2004, a total of 16,453 wind power systems with a combined capacity of 16,451.09 MW were operating in Germany. (orig.)