WorldWideScience

Sample records for assessment statistical power

  1. Statistical Diagnosis of the Best Weibull Methods for Wind Power Assessment for Agricultural Applications

    OpenAIRE

    Abul Kalam Azad; Mohammad Golam Rasul; Talal Yusaf

    2014-01-01

    The best Weibull distribution methods for the assessment of wind energy potential at different altitudes in desired locations are statistically diagnosed in this study. Seven different methods, namely graphical method (GM), method of moments (MOM), standard deviation method (STDM), maximum likelihood method (MLM), power density method (PDM), modified maximum likelihood method (MMLM) and equivalent energy method (EEM) were used to estimate the Weibull parameters and six statistical tools, name...

  2. Statistical Diagnosis of the Best Weibull Methods for Wind Power Assessment for Agricultural Applications

    Directory of Open Access Journals (Sweden)

    Abul Kalam Azad

    2014-05-01

    Full Text Available The best Weibull distribution methods for the assessment of wind energy potential at different altitudes in desired locations are statistically diagnosed in this study. Seven different methods, namely the graphical method (GM), method of moments (MOM), standard deviation method (STDM), maximum likelihood method (MLM), power density method (PDM), modified maximum likelihood method (MMLM) and equivalent energy method (EEM), were used to estimate the Weibull parameters, and six statistical tools, namely relative percentage of error, root mean square error (RMSE), mean percentage of error, mean absolute percentage of error, chi-square error and analysis of variance, were used to precisely rank the methods. The statistical fittings of the measured and calculated wind speed data are assessed to justify the performance of the methods. The capacity factor and total energy generated by a small model wind turbine are calculated by numerical integration using trapezoidal sums and Simpson's rule. The results show that MOM and MLM are the most efficient methods for determining the values of k and c to fit Weibull distribution curves.
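
    A minimal sketch of two of these estimators on synthetic data (my illustration, not the authors' code): MOM solves the coefficient-of-variation equation for k, MLM uses numerical maximum likelihood, and Simpson's rule integrates the fitted density, here for the mean wind power density rather than the paper's turbine-specific capacity factor.

```python
# Fit Weibull shape k and scale c by the method of moments (MOM) and the
# maximum likelihood method (MLM); synthetic wind speeds stand in for real data.
import numpy as np
from scipy.integrate import simpson
from scipy.optimize import brentq
from scipy.special import gamma
from scipy.stats import weibull_min

v = weibull_min.rvs(2.0, scale=7.0, size=5000, random_state=0)  # synthetic speeds, m/s

# MOM: solve CV^2 = Gamma(1 + 2/k) / Gamma(1 + 1/k)^2 - 1 for k, then c from the mean.
cv2 = v.var() / v.mean() ** 2
k_mom = brentq(lambda k: gamma(1 + 2 / k) / gamma(1 + 1 / k) ** 2 - 1 - cv2, 0.2, 20.0)
c_mom = v.mean() / gamma(1 + 1 / k_mom)

# MLM: numerical maximum likelihood with the location parameter fixed at zero.
k_mlm, _, c_mlm = weibull_min.fit(v, floc=0)
print(f"MOM: k={k_mom:.3f}, c={c_mom:.3f}   MLM: k={k_mlm:.3f}, c={c_mlm:.3f}")

# Mean wind power density (W/m^2, rho = 1.225 kg/m^3) by Simpson's rule.
vv = np.linspace(0.0, 30.0, 601)
pdf = weibull_min.pdf(vv, k_mlm, scale=c_mlm)
print(f"power density: {0.5 * 1.225 * simpson(vv**3 * pdf, x=vv):.1f} W/m^2")
```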

  3. Distance matters. Assessing socioeconomic impacts of the Dukovany nuclear power plant in the Czech Republic: Local perceptions and statistical evidence

    Directory of Open Access Journals (Sweden)

    Frantál Bohumil

    2016-03-01

    Full Text Available The effect of geographical distance on the extent of socioeconomic impacts of the Dukovany nuclear power plant in the Czech Republic is assessed by combining two different research approaches. First, we survey how people living in municipalities in the vicinity of the power plant perceive impacts on their personal quality of life. Second, we explore the effects of the power plant on regional development by analysing long-term statistical data about the unemployment rate, the share of workers in the energy sector and overall job opportunities in the respective municipalities. The results indicate that the power plant has had significant positive impacts on surrounding communities both as perceived by residents and as evidenced by the statistical data. The level of impacts is, however, significantly influenced by the spatial and social distances of communities and individuals from the power plant. The perception of positive impacts correlates with geographical proximity to the power plant, while the hypothetical distance where positive effects on the quality of life are no longer perceived was estimated at about 15 km. Positive effects are also more likely to be reported by highly educated, young and middle-aged and economically active persons, whose work is connected to the power plant.

  4. DISTRIBUTED GRID-CONNECTED PHOTOVOLTAIC POWER SYSTEM EMISSION OFFSET ASSESSMENT: STATISTICAL TEST OF SIMULATED- AND MEASURED-BASED DATA

    Science.gov (United States)

    This study assessed the pollutant emission offset potential of distributed grid-connected photovoltaic (PV) power systems. Computer-simulated performance results were utilized for 211 PV systems located across the U.S. The PV systems' monthly electrical energy outputs were based ...

  5. Statistical Power in Meta-Analysis

    Science.gov (United States)

    Liu, Jin

    2015-01-01

    Statistical power is important in a meta-analysis study, although few studies have examined the performance of simulated power in meta-analysis. The purpose of this study is to inform researchers about statistical power estimation for the two-sample mean difference test under different situations: (1) the discrepancy between the analytical power and…
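
    The large-sample calculation behind such analytical power estimates can be written down directly; the following is my own illustration for a fixed-effect meta-analysis z-test of a standardized mean difference, not the study's simulation code.

```python
# Analytical power of a fixed-effect meta-analysis of k equally sized two-group
# studies, using the usual large-sample variance of the standardized mean difference.
import numpy as np
from scipy.stats import norm

def meta_power(delta: float, n1: int, n2: int, k: int, alpha: float = 0.05) -> float:
    v = (n1 + n2) / (n1 * n2) + delta**2 / (2 * (n1 + n2))  # Var(d_i) per study
    se = np.sqrt(v / k)                                      # SE of the pooled effect
    z = norm.ppf(1 - alpha / 2)
    return norm.cdf(delta / se - z) + norm.cdf(-delta / se - z)

print(f"{meta_power(delta=0.3, n1=30, n2=30, k=10):.3f}")  # ~0.95 with 10 studies
```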

  6. The power of statistical tests using field trial count data of non-target organisms in environmental risk assessment of genetically modified plants

    NARCIS (Netherlands)

    Voet, van der H.; Goedhart, P.W.

    2015-01-01

    Publications on power analyses for field trial count data comparing transgenic and conventional crops have reported widely varying requirements for the replication needed to obtain statistical tests with adequate power. These studies are critically reviewed and complemented with a new simulation study…

  7. Power and environmental assessment

    DEFF Research Database (Denmark)

    Cashmore, Matthew Asa; Richardson, Tim

    2013-01-01

    The significance of politics and power dynamics has long been recognised in environmental assessment (EA) research, but there has not been sustained attention to power, either theoretically or empirically. The aim of this special issue is to encourage the EA community to engage more consistently … with the issue of power. The introduction represents a ground-clearing exercise intended to clarify the terms of the debate about power in the EA field, and to contribute to the development of a research agenda. Research trends in the field are outlined, and potential analytic and normative lines of inquiry … are identified. The contributions to this special issue represent contrasting conceptual and methodological approaches that navigate the analytical and normative terrain of power dynamics in EA. Together, they demonstrate that power cannot be removed from EA policy or practices, and is a necessary research focus…

  8. Potential for accidents in a nuclear power plant: probabilistic risk assessment, applied statistical decision theory, and implications of such considerations to mathematics education

    Energy Technology Data Exchange (ETDEWEB)

    Dios, R.A.

    1984-01-01

    This dissertation focuses upon the field of probabilistic risk assessment and its development in nuclear engineering. To provide background, the related areas of population dynamics (demography), epidemiology and actuarial science are studied by presenting information on how risk has been viewed in these areas over the years. A second major problem involves presenting an overview of the mathematical models related to risk analysis to mathematics educators and making recommendations for presenting this theory in classes of probability and statistics for mathematics and engineering majors at the undergraduate and graduate levels.

  9. Evaluating and Reporting Statistical Power in Counseling Research

    Science.gov (United States)

    Balkin, Richard S.; Sheperis, Carl J.

    2011-01-01

    Despite recommendations from the "Publication Manual of the American Psychological Association" (6th ed.) to include information on statistical power when publishing quantitative results, authors seldom include analysis or discussion of statistical power. The rationale for discussing statistical power is addressed, approaches to using "G*Power" to…

  10. Statistical assessment of biosimilar products.

    Science.gov (United States)

    Chow, Shein-Chung; Liu, Jen-Pei

    2010-01-01

    Biological products or medicines are therapeutic agents that are produced using a living system or organism. Access to these life-saving biological products is limited because of their expensive costs. Patents on the early biological products will expire in the next few years. This allows other biopharmaceutical/biotech companies to manufacture generic versions of the biological products, which are referred to as follow-on biological products by the U.S. Food and Drug Administration (FDA) or as biosimilar medicinal products by the European Medicines Agency (EMEA) of the European Union (EU). Competition from cost-effective follow-on biological products with equivalent efficacy and safety can cut down the costs and hence increase patients' access to the much-needed biological pharmaceuticals. Unlike conventional small-molecule pharmaceuticals, the complexity and heterogeneity of the molecular structure, complicated manufacturing process, different analytical methods, and possibility of severe immunogenicity reactions make evaluation of equivalence (similarity) between biosimilar products and their corresponding innovator product a great challenge for both the scientific community and regulatory agencies. In this paper, we provide an overview of the current regulatory requirements for approval of biosimilar products. A review of current criteria for evaluation of bioequivalence for traditional chemical generic products is provided. A detailed description of the differences between biosimilar and chemical generic products is given with respect to size and structure, immunogenicity, product quality attributes, and manufacturing processes. In addition, statistical considerations including design criteria, fundamental biosimilar assumptions, and statistical methods are proposed. The possibility of using genomic data in the evaluation of biosimilar products is also explored.

  11. Statistical aspects of fish stock assessment

    DEFF Research Database (Denmark)

    Berg, Casper Willestofte

    Fish stock assessments are conducted for two main purposes: 1) To estimate past and present fish abundances and their commercial exploitation rates. 2) To predict the consequences of different management strategies in order to ensure a sustainable fishery in the future. This thesis concerns … statistical aspects of fish stock assessment, which includes topics such as time series analysis, generalized additive models (GAMs), and non-linear state-space/mixed models capable of handling missing data and a high number of latent states and parameters. The aim is to improve the existing methods … for stock assessment by application of state-of-the-art statistical methodology. The main contributions are presented in the form of six research papers. The major part of the thesis deals with age-structured assessment models, which is the most common approach. Conversion from length to age distributions…

  12. Assessing statistical significance in causal graphs

    Directory of Open Access Journals (Sweden)

    Chindelevitch Leonid

    2012-02-01

    Full Text Available Abstract Background Causal graphs are an increasingly popular tool for the analysis of biological datasets. In particular, signed causal graphs--directed graphs whose edges additionally have a sign denoting upregulation or downregulation--can be used to model regulatory networks within a cell. Such models allow prediction of downstream effects of regulation of biological entities; conversely, they also enable inference of causative agents behind observed expression changes. However, due to their complex nature, signed causal graph models present special challenges with respect to assessing statistical significance. In this paper we frame and solve two fundamental computational problems that arise in practice when computing appropriate null distributions for hypothesis testing. Results First, we show how to compute a p-value for agreement between observed and model-predicted classifications of gene transcripts as upregulated, downregulated, or neither. Specifically, how likely are the classifications to agree to the same extent under the null distribution of the observed classification being randomized? This problem, which we call "Ternary Dot Product Distribution" owing to its mathematical form, can be viewed as a generalization of Fisher's exact test to ternary variables. We present two computationally efficient algorithms for computing the Ternary Dot Product Distribution and investigate its combinatorial structure analytically and numerically to establish computational complexity bounds. Second, we develop an algorithm for efficiently performing random sampling of causal graphs. This enables p-value computation under a different, equally important null distribution obtained by randomizing the graph topology but keeping fixed its basic structure: connectedness and the positive and negative in- and out-degrees of each vertex. We provide an algorithm for sampling a graph from this distribution uniformly at random. We also highlight theoretical
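
    The exact algorithms for the "Ternary Dot Product Distribution" are the paper's contribution; as a rough stand-in, the agreement p-value they formalize can be approximated by a Monte Carlo permutation test on ternary vectors (+1 upregulated, -1 downregulated, 0 neither). All data below are invented.

```python
# Monte Carlo approximation of a ternary dot-product agreement p-value.
import numpy as np

rng = np.random.default_rng(1)
predicted = rng.choice([-1, 0, 1], size=200)              # model-predicted calls
observed = np.where(rng.random(200) < 0.7, predicted,     # observed calls, ~70% agreement
                    rng.choice([-1, 0, 1], size=200))

score = int(predicted @ observed)                         # #sign agreements - #disagreements
null = np.array([predicted @ rng.permutation(observed) for _ in range(10_000)])
p_value = (1 + np.sum(null >= score)) / (1 + null.size)   # one-sided, add-one smoothed
print(f"score={score}, p={p_value:.4f}")
```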

  13. Designing Intervention Studies: Selected Populations, Range Restrictions, and Statistical Power

    Science.gov (United States)

    Miciak, Jeremy; Taylor, W. Pat; Stuebing, Karla K.; Fletcher, Jack M.; Vaughn, Sharon

    2016-01-01

    An appropriate estimate of statistical power is critical for the design of intervention studies. Although the inclusion of a pretest covariate in the test of the primary outcome can increase statistical power, samples selected on the basis of pretest performance may demonstrate range restriction on the selection measure and other correlated…

  14. Asking Sensitive Questions: A Statistical Power Analysis of Randomized Response Models

    Science.gov (United States)

    Ulrich, Rolf; Schroter, Hannes; Striegel, Heiko; Simon, Perikles

    2012-01-01

    This article derives the power curves for a Wald test that can be applied to randomized response models when small prevalence rates must be assessed (e.g., detecting doping behavior among elite athletes). These curves enable the assessment of the statistical power that is associated with each model (e.g., Warner's model, crosswise model, unrelated…
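
    Under Warner's design a respondent receives the sensitive statement with probability p and its complement otherwise, so P("yes") = p*pi + (1-p)(1-pi). A sketch of the resulting one-sided Wald power curve, using the standard large-sample formula rather than the article's derivations:

```python
# Power of a one-sided Wald test for prevalence pi under Warner's randomized
# response model with design parameter p (probability of the direct statement).
import numpy as np
from scipy.stats import norm

def warner_power(pi1: float, n: int, p: float = 0.7,
                 pi0: float = 0.0, alpha: float = 0.05) -> float:
    lam = lambda pi: p * pi + (1 - p) * (1 - pi)                       # P("yes")
    var = lambda pi: lam(pi) * (1 - lam(pi)) / (n * (2 * p - 1) ** 2)  # Var(pi_hat)
    z = norm.ppf(1 - alpha)
    return norm.cdf((pi1 - pi0 - z * np.sqrt(var(pi0))) / np.sqrt(var(pi1)))

print(f"{warner_power(pi1=0.05, n=2000):.3f}")  # small prevalence rates need large n
```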

  15. Editor’s note: The uncorrupted statistical power

    Directory of Open Access Journals (Sweden)

    Jean Descôteaux

    2007-09-01

    Full Text Available In 1999, Wilkinson and the Task Force on Statistical Inference published a number of recommendations concerning testing-related issues including, most importantly, statistical power. These recommendations are discussed prior to the presentation of the structure and the various articles of this special issue on statistical power. The contents of these articles will most certainly prove quite useful to those wishing to follow the Task Force's recommendations.

  16. When Mathematics and Statistics Collide in Assessment Tasks

    Science.gov (United States)

    Bargagliotti, Anna; Groth, Randall

    2016-01-01

    Because the disciplines of mathematics and statistics are naturally intertwined, designing assessment questions that disentangle mathematical and statistical reasoning can be challenging. We explore the writing of statistics assessment tasks that take into consideration the potential mathematical reasoning they may inadvertently activate.

  17. Statistical methods for assessment of blend homogeneity

    DEFF Research Database (Denmark)

    Madsen, Camilla

    2002-01-01

    In this thesis the use of various statistical methods to address some of the problems related to assessment of the homogeneity of powder blends in tablet production is discussed. It is not straightforward to assess the homogeneity of a powder blend. The reason is partly that in bulk materials … as powder blends there is no natural unit or amount to define a sample from the blend, and partly that current technology does not provide a method of universally collecting small representative samples from large static powder beds. In the thesis a number of methods to assess (in)homogeneity are presented. … Some methods have a focus on exploratory analysis where the aim is to investigate the spatial distribution of drug content in the batch. Other methods presented focus on describing the overall (total) (in)homogeneity of the blend. The overall (in)homogeneity of the blend is relevant as it is closely…

  18. Statistical Analysis of Loss of Offsite Power Events

    Directory of Open Access Journals (Sweden)

    Andrija Volkanovski

    2016-01-01

    Full Text Available This paper presents the results of the statistical analysis of the loss of offsite power (LOOP) events registered in four reviewed databases. The reviewed databases include the IRSN (Institut de Radioprotection et de Sûreté Nucléaire) SAPIDE database and the GRS (Gesellschaft für Anlagen- und Reaktorsicherheit mbH) VERA database, reviewed over the period from 1992 to 2011. The US NRC (Nuclear Regulatory Commission) Licensee Event Reports (LERs) database and the IAEA International Reporting System (IRS) database were screened for relevant events registered over the period from 1990 to 2013. The number of LOOP events in each year of the analysed period and the mode of operation were assessed during the screening. The LOOP frequencies obtained for the French and German nuclear power plants (NPPs) during critical operation are of the same order of magnitude, with plant-related events as the dominant contributor. A frequency of one LOOP event per shutdown year is obtained for German NPPs in the shutdown mode of operation. For the US NPPs, the obtained LOOP frequency for critical and shutdown modes is comparable to the one assessed in NUREG/CR-6890. A decreasing trend is obtained for the LOOP events registered in three databases (IRSN, GRS, and NRC).
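
    Frequencies of this kind are typically reported as events per reactor-year with exact Poisson confidence bounds; a generic sketch with invented counts, not figures from the reviewed databases:

```python
# Maximum likelihood LOOP frequency with an exact (Garwood) Poisson interval.
from scipy.stats import chi2

def loop_frequency(events: int, reactor_years: float, conf: float = 0.95):
    lam = events / reactor_years
    a = 1 - conf
    lo = chi2.ppf(a / 2, 2 * events) / (2 * reactor_years) if events else 0.0
    hi = chi2.ppf(1 - a / 2, 2 * (events + 1)) / (2 * reactor_years)
    return lam, lo, hi

lam, lo, hi = loop_frequency(events=12, reactor_years=400.0)  # hypothetical counts
print(f"{lam:.3f} /reactor-year, 95% CI [{lo:.3f}, {hi:.3f}]")
```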

  19. Wind power error estimation in resource assessments.

    Science.gov (United States)

    Rodríguez, Osvaldo; Del Río, Jesús A; Jaramillo, Oscar A; Martínez, Manuel

    2015-01-01

    Estimating the power output is one of the elements that determine the techno-economic feasibility of a renewable project. At present, there is a need to develop reliable methods that achieve this goal, thereby contributing to wind power penetration. In this study, we propose a method for wind power error estimation based on the wind speed measurement error, probability density function, and wind turbine power curves. This method uses actual wind speed data without prior statistical treatment, together with 28 wind turbine power curves fitted by Lagrange's method, to calculate the estimated wind power output and the corresponding error propagation. We found that wind speed percentage errors of 10% were propagated into the power output estimates, thereby yielding an error of 5%. The proposed error propagation complements traditional power resource assessments. The wind power estimation error also allows us to estimate intervals for the levelized cost of power production or the investment return time. The implementation of this method increases the reliability of techno-economic resource assessment studies.
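
    The core idea, propagating a wind speed error through a turbine power curve, can be sketched by Monte Carlo; the curve points and the 10% relative error below are illustrative, and this is not the authors' Lagrange-fitted implementation.

```python
# Propagate wind speed measurement error through an interpolated power curve.
import numpy as np

curve_v = np.array([0.0, 3.0, 5.0, 7.0, 9.0, 11.0, 13.0, 25.0])          # m/s
curve_p = np.array([0.0, 0.0, 45.0, 140.0, 300.0, 480.0, 600.0, 600.0])  # kW (toy)

rng = np.random.default_rng(2)
v = rng.weibull(2.0, 100_000) * 7.0                 # synthetic site wind speeds
p_true = np.interp(v, curve_v, curve_p)
v_noisy = v * (1 + rng.normal(0.0, 0.10, v.size))   # 10% relative speed error
p_noisy = np.interp(v_noisy, curve_v, curve_p)

rel_err = (p_noisy.mean() - p_true.mean()) / p_true.mean()
print(f"mean power {p_true.mean():.0f} kW, relative error {rel_err:+.1%}")
```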

  20. Wind Power Error Estimation in Resource Assessments

    Science.gov (United States)

    Rodríguez, Osvaldo; del Río, Jesús A.; Jaramillo, Oscar A.; Martínez, Manuel

    2015-01-01

    Estimating the power output is one of the elements that determine the techno-economic feasibility of a renewable project. At present, there is a need to develop reliable methods that achieve this goal, thereby contributing to wind power penetration. In this study, we propose a method for wind power error estimation based on the wind speed measurement error, probability density function, and wind turbine power curves. This method uses actual wind speed data without prior statistical treatment, together with 28 wind turbine power curves fitted by Lagrange's method, to calculate the estimated wind power output and the corresponding error propagation. We found that wind speed percentage errors of 10% were propagated into the power output estimates, thereby yielding an error of 5%. The proposed error propagation complements traditional power resource assessments. The wind power estimation error also allows us to estimate intervals for the levelized cost of power production or the investment return time. The implementation of this method increases the reliability of techno-economic resource assessment studies. PMID:26000444

  1. Wind power error estimation in resource assessments.

    Directory of Open Access Journals (Sweden)

    Osvaldo Rodríguez

    Full Text Available Estimating the power output is one of the elements that determine the techno-economic feasibility of a renewable project. At present, there is a need to develop reliable methods that achieve this goal, thereby contributing to wind power penetration. In this study, we propose a method for wind power error estimation based on the wind speed measurement error, probability density function, and wind turbine power curves. This method uses actual wind speed data without prior statistical treatment, together with 28 wind turbine power curves fitted by Lagrange's method, to calculate the estimated wind power output and the corresponding error propagation. We found that wind speed percentage errors of 10% were propagated into the power output estimates, thereby yielding an error of 5%. The proposed error propagation complements traditional power resource assessments. The wind power estimation error also allows us to estimate intervals for the levelized cost of power production or the investment return time. The implementation of this method increases the reliability of techno-economic resource assessment studies.

  2. Power Plant Water Intake Assessment.

    Science.gov (United States)

    Zeitoun, Ibrahim H.; And Others

    1980-01-01

    In order to adequately assess the impact of power plant cooling water intake on an aquatic ecosystem, total ecosystem effects must be considered, rather than merely numbers of impinged or entrained organisms. (Author/RE)

  3. Assessment Methods in Statistical Education An International Perspective

    CERN Document Server

    Bidgood, Penelope; Jolliffe, Flavia

    2010-01-01

    This book is a collaboration of leading figures in statistical education and is designed primarily for academic audiences involved in teaching statistics and mathematics. The book is divided into four sections: (1) assessment using real-world problems, (2) assessing statistical thinking, (3) individual assessment, and (4) successful assessment strategies.

  4. Study on the statistical characteristics of solar power

    Science.gov (United States)

    Liu, Jun

    2017-01-01

    Solar power in China has grown rapidly in recent years. Solar power output varies with cloud and dust conditions in a way that wind power does not, so a sound approach to evaluating the statistical characteristics of solar power is important for power system planning and operation analysis. In this study, a multi-scale spatial and temporal framework of evaluation indices was established to describe the variation in solar power's own natural features and the interaction between solar generation, load, and the grid. Finally, we present a case study of these indices covering variation, comparison, and penetration.

  5. A Technology-Based Statistical Reasoning Assessment Tool in Descriptive Statistics for Secondary School Students

    Science.gov (United States)

    Chan, Shiau Wei; Ismail, Zaleha

    2014-01-01

    The focus of assessment in statistics has gradually shifted from traditional assessment towards alternative assessment, where more attention is paid to core statistical concepts such as center, variability, and distribution. In spite of this, there are comparatively few assessments that combine the three significant types of statistical…

  6. "Using Power Tables to Compute Statistical Power in Multilevel Experimental Designs"

    Science.gov (United States)

    Konstantopoulos, Spyros

    2009-01-01

    Power computations for one-level experimental designs that assume simple random samples are greatly facilitated by power tables such as those presented in Cohen's book about statistical power analysis. However, in education and the social sciences experimental designs have naturally nested structures and multilevel models are needed to compute the…

  7. Robust Statistical Detection of Power-Law Cross-Correlation

    Science.gov (United States)

    Blythe, Duncan A. J.; Nikulin, Vadim V.; Müller, Klaus-Robert

    2016-06-01

    We show that widely used approaches in statistical physics incorrectly indicate the existence of power-law cross-correlations between financial stock market fluctuations measured over several years and the neuronal activity of the human brain lasting for only a few minutes. While such cross-correlations are nonsensical, no current methodology allows them to be reliably discarded, leaving researchers at greater risk when the spurious nature of cross-correlations is not clear from the unrelated origin of the time series and rather requires careful statistical estimation. Here we propose a theory and method (PLCC-test) which allows us to rigorously and robustly test for power-law cross-correlations, correctly detecting genuine and discarding spurious cross-correlations, thus establishing meaningful relationships between processes in complex physical systems. Our method reveals for the first time the presence of power-law cross-correlations between amplitudes of the alpha and beta frequency ranges of the human electroencephalogram.

  8. Power laws statistics of cliff failures, scaling and percolation

    CERN Document Server

    Baldassarri, Andrea

    2014-01-01

    The size of large cliff failures may be described in several ways, for instance by considering the horizontal eroded area at the cliff top and the maximum local retreat of the coastline. Field studies suggest that, for large failures, the frequencies of these two quantities decrease as power laws of the respective magnitudes, defining two different decay exponents. Moreover, the horizontal area increases as a power law of the maximum local retreat, identifying a third exponent. Such observations suggest that the geometry of cliff failures is statistically similar across different magnitudes. Power laws are familiar in the physics of critical systems. The corresponding exponents satisfy precise relations and are proven to be universal features, common to very different systems. Following the approach typical of statistical physics, we propose a "scaling hypothesis" resulting in a relation between the three above exponents: there is a precise, mathematical relation between the distributions of magnitudes of erosion ...

  9. Using Tree Diagrams as an Assessment Tool in Statistics Education

    Science.gov (United States)

    Yin, Yue

    2012-01-01

    This study examines the potential of the tree diagram, a type of graphic organizer, as an assessment tool to measure students' knowledge structures in statistics education. Students' knowledge structures have not been sufficiently assessed in statistics education, despite their importance. This article first presents the rationale and method…

  10. Statistical analyses support power law distributions found in neuronal avalanches.

    Science.gov (United States)

    Klaus, Andreas; Yu, Shan; Plenz, Dietmar

    2011-01-01

    The size distribution of neuronal avalanches in cortical networks has been reported to follow a power law distribution with exponent close to -1.5, which is a reflection of long-range spatial correlations in spontaneous neuronal activity. However, identifying power law scaling in empirical data can be difficult and sometimes controversial. In the present study, we tested the power law hypothesis for neuronal avalanches by using more stringent statistical analyses. In particular, we performed the following steps: (i) analysis of finite-size scaling to identify scale-free dynamics in neuronal avalanches, (ii) model parameter estimation to determine the specific exponent of the power law, and (iii) comparison of the power law to alternative model distributions. Consistent with critical state dynamics, avalanche size distributions exhibited robust scaling behavior in which the maximum avalanche size was limited only by the spatial extent of sampling ("finite size" effect). This scale-free dynamics suggests the power law as a model for the distribution of avalanche sizes. Using both the Kolmogorov-Smirnov statistic and a maximum likelihood approach, we found the slope to be close to -1.5, which is in line with previous reports. Finally, the power law model for neuronal avalanches was compared to the exponential and to various heavy-tail distributions based on the Kolmogorov-Smirnov distance and by using a log-likelihood ratio test. Both the power law distribution without and with exponential cut-off provided significantly better fits to the cluster size distributions in neuronal avalanches than the exponential, the lognormal and the gamma distribution. In summary, our findings strongly support the power law scaling in neuronal avalanches, providing further evidence for critical state dynamics in superficial layers of cortex.
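
    Two of the named ingredients are compact enough to sketch: the continuous power-law MLE and the Kolmogorov-Smirnov distance to the fitted model. This is a toy Clauset-style re-implementation on Pareto samples, not the authors' avalanche analysis.

```python
# MLE exponent and KS distance for P(x) ~ x^(-alpha), x >= xmin.
import numpy as np

rng = np.random.default_rng(3)
xmin = 1.0
x = xmin * (1 - rng.random(5000)) ** (-1 / 0.5)   # Pareto sample, true alpha = 1.5

alpha = 1 + x.size / np.log(x / xmin).sum()       # continuous power-law MLE

xs = np.sort(x)
ecdf = np.arange(1, xs.size + 1) / xs.size
model_cdf = 1 - (xs / xmin) ** (1 - alpha)
ks = np.abs(ecdf - model_cdf).max()               # Kolmogorov-Smirnov distance
print(f"alpha_hat = {alpha:.3f}, KS = {ks:.4f}")
```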

  11. Statistical analyses support power law distributions found in neuronal avalanches.

    Directory of Open Access Journals (Sweden)

    Andreas Klaus

    Full Text Available The size distribution of neuronal avalanches in cortical networks has been reported to follow a power law distribution with exponent close to -1.5, which is a reflection of long-range spatial correlations in spontaneous neuronal activity. However, identifying power law scaling in empirical data can be difficult and sometimes controversial. In the present study, we tested the power law hypothesis for neuronal avalanches by using more stringent statistical analyses. In particular, we performed the following steps: (i) analysis of finite-size scaling to identify scale-free dynamics in neuronal avalanches, (ii) model parameter estimation to determine the specific exponent of the power law, and (iii) comparison of the power law to alternative model distributions. Consistent with critical state dynamics, avalanche size distributions exhibited robust scaling behavior in which the maximum avalanche size was limited only by the spatial extent of sampling ("finite size" effect). This scale-free dynamics suggests the power law as a model for the distribution of avalanche sizes. Using both the Kolmogorov-Smirnov statistic and a maximum likelihood approach, we found the slope to be close to -1.5, which is in line with previous reports. Finally, the power law model for neuronal avalanches was compared to the exponential and to various heavy-tail distributions based on the Kolmogorov-Smirnov distance and by using a log-likelihood ratio test. Both the power law distribution without and with exponential cut-off provided significantly better fits to the cluster size distributions in neuronal avalanches than the exponential, the lognormal and the gamma distribution. In summary, our findings strongly support the power law scaling in neuronal avalanches, providing further evidence for critical state dynamics in superficial layers of cortex.

  12. Effect size, confidence intervals and statistical power in psychological research.

    Directory of Open Access Journals (Sweden)

    Téllez A.

    2015-07-01

    Full Text Available Quantitative psychological research is focused on detecting the occurrence of certain population phenomena by analyzing data from a sample, and statistics is a particularly helpful mathematical tool that is used by researchers to evaluate hypotheses and make decisions to accept or reject such hypotheses. In this paper, the various statistical tools in psychological research are reviewed. The limitations of null hypothesis significance testing (NHST and the advantages of using effect size and its respective confidence intervals are explained, as the latter two measurements can provide important information about the results of a study. These measurements also can facilitate data interpretation and easily detect trivial effects, enabling researchers to make decisions in a more clinically relevant fashion. Moreover, it is recommended to establish an appropriate sample size by calculating the optimum statistical power at the moment that the research is designed. Psychological journal editors are encouraged to follow APA recommendations strictly and ask authors of original research studies to report the effect size, its confidence intervals, statistical power and, when required, any measure of clinical significance. Additionally, we must account for the teaching of statistics at the graduate level. At that level, students do not receive sufficient information concerning the importance of using different types of effect sizes and their confidence intervals according to the different types of research designs; instead, most of the information is focused on the various tools of NHST.
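
    The reporting practice recommended above fits in a few lines; a sketch using a large-sample confidence interval for Cohen's d and the power routine in statsmodels (illustrative numbers, not data from the paper):

```python
# Report Cohen's d with a 95% CI and the achieved power of the two-sample t-test.
import numpy as np
from statsmodels.stats.power import TTestIndPower

n1 = n2 = 40
d = 0.5                                             # observed standardized difference

se = np.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))  # approx. SE of d
ci = (d - 1.96 * se, d + 1.96 * se)
power = TTestIndPower().power(effect_size=d, nobs1=n1, ratio=n2 / n1, alpha=0.05)
print(f"d = {d:.2f}, 95% CI [{ci[0]:.2f}, {ci[1]:.2f}], power = {power:.2f}")
```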

  13. Statistical analysis of cascading failures in power grids

    Energy Technology Data Exchange (ETDEWEB)

    Chertkov, Michael [Los Alamos National Laboratory; Pfitzner, Rene [Los Alamos National Laboratory; Turitsyn, Konstantin [Los Alamos National Laboratory

    2010-12-01

    We introduce a new microscopic model of cascading failures in transmission power grids. This model accounts for automatic response of the grid to load fluctuations that take place on the scale of minutes, when optimum power flow adjustments and load shedding controls are unavailable. We describe extreme events, initiated by load fluctuations, which cause cascading failures of loads, generators and lines. Our model is quasi-static in the causal, discrete time and sequential resolution of individual failures. The model, in its simplest realization based on the Direct Current description of the power flow problem, is tested on three standard IEEE systems consisting of 30, 39 and 118 buses. Our statistical analysis suggests a straightforward classification of cascading and islanding phases in terms of the ratios between average number of removed loads, generators and links. The analysis also demonstrates sensitivity to variations in line capacities. Future research challenges in modeling and control of cascading outages over real-world power networks are discussed.

  14. Fractional-power-law level statistics due to dynamical tunneling.

    Science.gov (United States)

    Bäcker, Arnd; Ketzmerick, Roland; Löck, Steffen; Mertig, Normann

    2011-01-14

    For systems with a mixed phase space we demonstrate that dynamical tunneling universally leads to a fractional power law of the level-spacing distribution P(s) over a wide range of small spacings s. Going beyond Berry-Robnik statistics, we take into account that dynamical tunneling rates between the regular and the chaotic region vary over many orders of magnitude. This results in a prediction of P(s) which excellently describes the spectral data of the standard map. Moreover, we show that the power-law exponent is proportional to the effective Planck constant h(eff).

  15. Development of PowerMap: a software package for statistical power calculation in neuroimaging studies.

    Science.gov (United States)

    Joyce, Karen E; Hayasaka, Satoru

    2012-10-01

    Although there are a number of statistical software tools for voxel-based massively univariate analysis of neuroimaging data, such as fMRI (functional MRI), PET (positron emission tomography), and VBM (voxel-based morphometry), very few software tools exist for power and sample size calculation for neuroimaging studies. Unlike typical biomedical studies, outcomes from neuroimaging studies are 3D images of correlated voxels, requiring a correction for massive multiple comparisons. Thus, a specialized power calculation tool is needed for planning neuroimaging studies. To facilitate this process, we developed a software tool specifically designed for neuroimaging data. The software tool, called PowerMap, implements theoretical power calculation algorithms based on non-central random field theory. It can also calculate power for statistical analyses with FDR (false discovery rate) corrections. This GUI (graphical user interface)-based tool enables neuroimaging researchers without advanced knowledge in imaging statistics to calculate power and sample size in the form of 3D images. In this paper, we provide an overview of the statistical framework behind the PowerMap tool. Three worked examples are also provided, a regression analysis, an ANOVA (analysis of variance), and a two-sample T-test, in order to demonstrate the study planning process with PowerMap. We envision that PowerMap will be a great aid for future neuroimaging research.

  16. Development and testing of improved statistical wind power forecasting methods.

    Energy Technology Data Exchange (ETDEWEB)

    Mendes, J.; Bessa, R.J.; Keko, H.; Sumaili, J.; Miranda, V.; Ferreira, C.; Gama, J.; Botterud, A.; Zhou, Z.; Wang, J. (Decision and Information Sciences); (INESC Porto)

    2011-12-06

    Wind power forecasting (WPF) provides important inputs to power system operators and electricity market participants. It is therefore not surprising that WPF has attracted increasing interest within the electric power industry. In this report, we document our research on improving statistical WPF algorithms for point, uncertainty, and ramp forecasting. Below, we provide a brief introduction to the research presented in the following chapters. For a detailed overview of the state-of-the-art in wind power forecasting, we refer to [1]. Our related work on the application of WPF in operational decisions is documented in [2]. Point forecasts of wind power are highly dependent on the training criteria used in the statistical algorithms that are used to convert weather forecasts and observational data to a power forecast. In Chapter 2, we explore the application of information theoretic learning (ITL) as opposed to the classical minimum square error (MSE) criterion for point forecasting. In contrast to the MSE criterion, ITL criteria do not assume a Gaussian distribution of the forecasting errors. We investigate to what extent ITL criteria yield better results. In addition, we analyze time-adaptive training algorithms and how they enable WPF algorithms to cope with non-stationary data and, thus, to adapt to new situations without requiring additional offline training of the model. We test the new point forecasting algorithms on two wind farms located in the U.S. Midwest. Although there have been advancements in deterministic WPF, a single-valued forecast cannot provide information on the dispersion of observations around the predicted value. We argue that it is essential to generate, together with (or as an alternative to) point forecasts, a representation of the wind power uncertainty. Wind power uncertainty representation can take the form of probabilistic forecasts (e.g., probability density function, quantiles), risk indices (e.g., prediction risk index) or scenarios
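
    The contrast between the MSE criterion and an ITL criterion can be illustrated with the maximum correntropy loss, which down-weights heavy-tailed forecast errors; the toy linear correction below is my sketch of the idea, not the project's algorithms.

```python
# MSE vs. maximum correntropy criterion (an ITL loss) for a linear correction
# of a toy NWP wind speed forecast with heavy-tailed "measurement" noise.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
x = rng.uniform(0.0, 15.0, 500)                    # forecast speeds (toy)
y = 0.9 * x + 0.5 + rng.standard_t(1.5, 500)       # observations, heavy tails

def mse(theta):
    e = y - (theta[0] * x + theta[1])
    return np.mean(e**2)

def neg_correntropy(theta, sigma=1.0):             # maximize Gaussian-kernel similarity
    e = y - (theta[0] * x + theta[1])
    return -np.mean(np.exp(-e**2 / (2 * sigma**2)))

print(minimize(mse, [1.0, 0.0]).x)                 # dragged around by outliers
print(minimize(neg_correntropy, [1.0, 0.0]).x)     # stays near (0.9, 0.5)
```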

  17. Statistical Classification of Cascading Failures in Power Grids

    CERN Document Server

    Pfitzner, René; Chertkov, Michael

    2010-01-01

    We introduce a new microscopic model of the outages in transmission power grids. This model accounts for the automatic response of the grid to load fluctuations that take place on the scale of minutes, when the optimum power flow adjustments and load shedding controls are unavailable. We describe extreme events, initiated by load fluctuations, which cause cascading failures of loads, generators and lines. Our model is quasi-static in the causal, discrete time and sequential resolution of individual failures. The model, in its simplest realization based on the Direct Current description of the power flow problem, is tested on three standard IEEE systems consisting of 30, 39 and 118 buses. Our statistical analysis suggests a straightforward classification of cascading and islanding phases in terms of the ratios between average number of removed loads, generators and links. The analysis also demonstrates sensitivity to variations in line capacities. Future research challenges in modeling and control of cascading...

  18. Statistics of power input into a vibrated granular system

    Science.gov (United States)

    Wang, Hongqiang; Feitosa, Klebert; Menon, Narayanan

    2004-03-01

    Motivated by the recent fluctuation theorem of Gallavotti and Cohen, we present a numerical and experimental exploration of the fluctuations in power input and energy dissipation in a sub-volume of a vibrated granular system. Both experimental and simulation results are in accord with the fluctuation relation, even for short-time fluctuations. In the simulations, we are also able to compare power fluctuations in rotational and translational modes; we discuss the effective temperatures arising from this fluctuation relation. Finally, in the simulations, we also study the dependence of our results on the size of the sub-volume considered in the system. Supported by: NSF DMR 9878433, DMR 0216719

  19. Statistics of the Sunyaev-Zel'dovich Effect power spectrum

    CERN Document Server

    Peel, Michael W; Kay, Scott T

    2009-01-01

    Using large numbers of simulations of the microwave sky, incorporating the Cosmic Microwave Background (CMB) and the Sunyaev-Zel'dovich (SZ) effect due to clusters, we investigate the statistics of the power spectrum at microwave frequencies between spherical multipoles of 1000 and 10000. From these virtual sky maps, we find that the spectrum of the SZ effect has a larger standard deviation by a factor of 3 than would be expected from purely Gaussian realizations, and has a distribution that is significantly skewed towards higher values, especially when small map sizes are used. The standard deviation is also increased by around 10 percent compared to the trispectrum calculation due to the clustering of galaxy clusters. We also consider the effects of including residual point sources and uncertainties in the gas physics. This has implications for the excess power measured in the CMB power spectrum by the Cosmic Background Imager and BIMA experiments. Our results indicate that the observed excess could be expl...

  20. Teaching, Learning and Assessing Statistical Problem Solving

    Science.gov (United States)

    Marriott, John; Davies, Neville; Gibson, Liz

    2009-01-01

    In this paper we report the results from a major UK government-funded project, started in 2005, to review statistics and handling data within the school mathematics curriculum for students up to age 16. As a result of a survey of teachers we developed new teaching materials that explicitly use a problem-solving approach for the teaching and…

  1. HVDC power transmission technology assessment

    Energy Technology Data Exchange (ETDEWEB)

    Hauth, R.L.; Tatro, P.J.; Railing, B.D. [New England Power Service Co., Westborough, MA (United States); Johnson, B.K.; Stewart, J.R. [Power Technologies, Inc., Schenectady, NY (United States); Fink, J.L.

    1997-04-01

    The purpose of this study was to develop an assessment of the national utility system's needs for electric transmission during the period 1995-2020 that could be met by future reduced-cost HVDC systems. The assessment was to include an economic evaluation of HVDC as a means for meeting those needs as well as a comparison with competing technologies such as ac transmission with and without Flexible AC Transmission System (FACTS) controllers. The role of force commutated dc converters was to be assumed where appropriate. The assessment begins by identifying the general needs for transmission in the U.S. in the context of a future deregulated power industry. The possible roles for direct current transmission are then postulated in terms of representative scenarios. A few of the scenarios are illustrated with the help of actual U.S. system examples. Non-traditional applications as well as traditional applications such as long lines and asynchronous interconnections are discussed. The classical "break-even distance" concept for comparing HVDC and ac lines is used to assess the selected scenarios. The impact of reduced-cost converters is reflected in terms of the break-even distance. This report presents a comprehensive review of the functional benefits of HVDC transmission and updated cost data for both ac and dc system components. It also provides some provocative thoughts on how direct current transmission might be applied to better utilize and expand our nation's increasingly stressed transmission assets.

  2. Metrology Optical Power Budgeting in SIM Using Statistical Analysis Techniques

    Science.gov (United States)

    Kuan, Gary M

    2008-01-01

    The Space Interferometry Mission (SIM) is a space-based stellar interferometry instrument, consisting of up to three interferometers, which will be capable of micro-arcsecond resolution. Alignment knowledge of the three interferometer baselines requires a three-dimensional, 14-leg truss with each leg being monitored by an external metrology gauge. In addition, each of the three interferometers requires an internal metrology gauge to monitor the optical path length differences between the two sides. Both external and internal metrology gauges are interferometry based, operating at a wavelength of 1319 nanometers. Each gauge has fiber inputs delivering measurement and local oscillator (LO) power, split into probe-LO and reference-LO beam pairs. These beams experience power loss due to a variety of mechanisms including, but not restricted to, design efficiency, material attenuation, element misalignment, diffraction, and coupling efficiency. Since the attenuation due to these sources may worsen over time, an accounting of the range of expected attenuation is needed so that an optical power margin can be budgeted. A method of statistical optical power analysis and budgeting, based on a technique developed for deep space RF telecommunications, is described in this paper and provides a numerical confidence level for having sufficient optical power relative to mission metrology performance requirements.
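
    The essence of that technique is to treat each loss term as a random variable and read the margin off the simulated distribution; a toy Monte Carlo version, with every loss term and requirement value invented for illustration:

```python
# Statistical optical power budgeting by Monte Carlo over dB loss terms.
import numpy as np

rng = np.random.default_rng(5)
N = 100_000
loss_db = (rng.normal(1.5, 0.2, N)                  # coupling loss, dB (invented)
           + rng.normal(0.8, 0.1, N)                # material attenuation, dB
           + rng.triangular(0.1, 0.3, 1.0, N))      # misalignment + diffraction, dB

delivered_dbm = 0.0 - loss_db    # assume 0 dBm launched into the gauge
required_dbm = -4.0              # hypothetical gauge sensitivity requirement
margin = delivered_dbm - required_dbm

print(f"P(margin > 0) = {np.mean(margin > 0):.4f}")
print(f"margin met with 99% confidence: {np.percentile(margin, 1):.2f} dB")
```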

  3. Caveats for using statistical significance tests in research assessments

    OpenAIRE

    2011-01-01

    This paper raises concerns about the advantages of using statistical significance tests in research assessments, as has recently been suggested in the debate about proper normalization procedures for citation indicators. Statistical significance tests are highly controversial and numerous criticisms have been leveled against their use. Based on examples from articles by proponents of the use of statistical significance tests in research assessments, we address some of the numerous problems with s...

  4. GNSS Spoofing Detection Based on Signal Power Measurements: Statistical Analysis

    Directory of Open Access Journals (Sweden)

    V. Dehghanian

    2012-01-01

    Full Text Available A threat to GNSS receivers is posed by a spoofing transmitter that emulates authentic signals but with randomized code phase and Doppler values over a small range. Such spoofing signals can result in large navigational solution errors that are passed onto the unsuspecting user with potentially dire consequences. An effective spoofing detection technique, based on signal power measurements, is developed in this paper; it can be readily applied to present consumer-grade GNSS receivers with minimal firmware changes. An extensive statistical analysis is carried out based on formulating a multi-hypothesis detection problem. Expressions are developed to devise a set of thresholds required for signal detection and identification. The detection processing methods developed are further manipulated to exploit incidental antenna motion arising from user interaction with a GNSS handheld receiver to further enhance the detection performance of the proposed algorithm. The statistical analysis supports the effectiveness of the proposed spoofing detection technique under various multipath conditions.

  5. Improved power performance assessment methods

    Energy Technology Data Exchange (ETDEWEB)

    Frandsen, S.; Antoniou, I.; Dahlberg, J.A. [and others]

    1999-03-01

    The uncertainty of presently-used methods for retrospective assessment of the productive capacity of wind farms is unacceptably large. The possibilities of improving the accuracy have been investigated and are reported. A method is presented that includes an extended power curve and site calibration. In addition, blockage effects with respect to reference wind speed measurements are analysed. It is found that significant accuracy improvements are possible by the introduction of more input variables such as turbulence and wind shear, in addition to mean wind speed and air density. Also, the testing of several or all machines in the wind farm - instead of only one or two - may provide a better estimate of the average performance. (au)

  6. Model output statistics applied to wind power prediction

    Energy Technology Data Exchange (ETDEWEB)

    Joensen, A.; Giebel, G.; Landberg, L. [Risoe National Lab., Roskilde (Denmark); Madsen, H.; Nielsen, H.A. [The Technical Univ. of Denmark, Dept. of Mathematical Modelling, Lyngby (Denmark)

    1999-03-01

    Being able to predict the output of a wind farm online for a day or two in advance has significant advantages for utilities, such as a better ability to schedule fossil-fuelled power plants and a better position on electricity spot markets. In this paper prediction methods based on Numerical Weather Prediction (NWP) models are considered. The spatial resolution used in NWP models implies that these predictions are not valid locally at a specific wind farm. Furthermore, due to the non-stationary nature and complexity of the processes in the atmosphere, and occasional changes of NWP models, the deviation between the predicted and the measured wind will be time dependent. If observational data are available, and if the deviation between the predictions and the observations exhibits systematic behavior, this should be corrected for; if statistical methods are used, this approach is usually referred to as MOS (Model Output Statistics). The influence of atmospheric turbulence intensity, topography, prediction horizon length and auto-correlation of wind speed and power is considered, and to take the time-variations into account, adaptive estimation methods are applied. Three estimation techniques are considered and compared: extended Kalman filtering, recursive least squares and a new modified recursive least squares algorithm. (au) EU-JOULE-3. 11 refs.
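
    Of the three estimation techniques, recursive least squares with a forgetting factor is the most compact to sketch; below it adaptively regresses "measured" wind speed on the NWP forecast and re-converges after a simulated model change (toy data, not the paper's setup).

```python
# MOS via recursive least squares (RLS) with forgetting factor lam.
import numpy as np

rng = np.random.default_rng(6)
lam, theta, P = 0.995, np.zeros(2), np.eye(2) * 1000.0

for t in range(2000):
    v_nwp = rng.uniform(2.0, 14.0)              # NWP-predicted speed at the site
    bias = 1.2 if t < 1000 else 0.4             # NWP model change halfway through
    v_obs = 0.85 * v_nwp + bias + rng.normal(0.0, 0.5)

    x = np.array([v_nwp, 1.0])                  # regressors: forecast + intercept
    k = P @ x / (lam + x @ P @ x)               # gain
    theta = theta + k * (v_obs - x @ theta)     # coefficient update
    P = (P - np.outer(k, x) @ P) / lam          # covariance update

print(theta)                                    # tracks (0.85, 0.4) after the change
```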

  7. Assessing the Power of Exome Chips.

    Directory of Open Access Journals (Sweden)

    Christian Magnus Page

    Full Text Available Genotyping chips for rare and low-frequency variants have recently gained popularity with the introduction of exome chips, but the utility of these chips remains unclear. These chips were designed using exome sequencing data from mainly American-European individuals, enriched for a narrow set of common diseases. In addition, it is well known that the statistical power of detecting associations with rare and low-frequency variants is much lower compared to studies exclusively involving common variants. We developed a simulation program adaptable to any exome chip design to empirically evaluate the power of the exome chips. We implemented the main properties of the Illumina HumanExome BeadChip array. The simulated data sets were used to assess the power of exome chip based studies for varying effect sizes and causal variant scenarios. We applied two widely-used statistical approaches for rare and low-frequency variants, which collapse the variants into genetic regions or genes. Under optimal conditions, we found that a sample size between 20,000 and 30,000 individuals was needed in order to detect modest effect sizes (0.5% < PAR < 1%) with 80% power. For small effect sizes (PAR < 0.5%), 60,000-100,000 individuals were needed in the presence of non-causal variants. In conclusion, we found that at least tens of thousands of individuals are necessary to detect modest effects under optimal conditions. In addition, when using rare variant chips on cohorts or diseases they were not originally designed for, the identification of associated variants or genes will be even more challenging.
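
    A stripped-down version of that simulation idea collapses rare variants to per-gene carrier counts and tests a 2x2 table; the authors' program models the actual chip content, whereas every number below is invented.

```python
# Empirical power of a toy per-gene burden test by simulation.
import numpy as np
from scipy.stats import chi2_contingency

def burden_power(n_cases, n_controls, carrier_freq=0.01, odds_ratio=1.5,
                 n_sims=2000, alpha=2.5e-6, seed=7):        # alpha ~ gene-wise Bonferroni
    rng = np.random.default_rng(seed)
    f0 = carrier_freq
    f1 = f0 * odds_ratio / (1 - f0 + f0 * odds_ratio)       # carrier freq. in cases
    hits = 0
    for _ in range(n_sims):
        a = rng.binomial(n_cases, f1)                       # case carriers
        b = rng.binomial(n_controls, f0)                    # control carriers
        _, p, _, _ = chi2_contingency([[a, n_cases - a], [b, n_controls - b]])
        hits += p < alpha
    return hits / n_sims

print(burden_power(15_000, 15_000))   # modest power even at 30k individuals
```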

  8. Error, Power, and Blind Sentinels: The Statistics of Seagrass Monitoring

    Science.gov (United States)

    Schultz, Stewart T.; Kruschel, Claudia; Bakran-Petricioli, Tatjana; Petricioli, Donat

    2015-01-01

    We derive statistical properties of standard methods for monitoring of habitat cover worldwide, and criticize them in the context of mandated seagrass monitoring programs, as exemplified by Posidonia oceanica in the Mediterranean Sea. We report the novel result that cartographic methods with non-trivial classification errors are generally incapable of reliably detecting habitat cover losses less than about 30 to 50%, and the field labor required to increase their precision can be orders of magnitude higher than that required to estimate habitat loss directly in a field campaign. We derive a universal utility threshold of classification error in habitat maps that represents the minimum habitat map accuracy above which direct methods are superior. Widespread government reliance on blind-sentinel methods for monitoring seafloor can obscure the gradual and currently ongoing losses of benthic resources until the time has long passed for meaningful management intervention. We find two classes of methods with very high statistical power for detecting small habitat cover losses: 1) fixed-plot direct methods, which are over 100 times as efficient as direct random-plot methods in a variable habitat mosaic; and 2) remote methods with very low classification error such as geospatial underwater videography, which is an emerging, low-cost, non-destructive method for documenting small changes at millimeter visual resolution. General adoption of these methods and their further development will require a fundamental cultural change in conservation and management bodies towards the recognition and promotion of requirements of minimal statistical power and precision in the development of international goals for monitoring these valuable resources and the ecological services they provide. PMID:26367863

  9. A Framework for Assessing High School Students' Statistical Reasoning.

    Science.gov (United States)

    Chan, Shiau Wei; Ismail, Zaleha; Sumintono, Bambang

    2016-01-01

    Based on a synthesis of literature, earlier studies, analyses and observations of high school students, this study developed an initial framework for assessing students' statistical reasoning about descriptive statistics. Framework descriptors were established across five levels of statistical reasoning and four key constructs. The former consisted of idiosyncratic reasoning, verbal reasoning, transitional reasoning, procedural reasoning, and integrated process reasoning. The latter included describing data, organizing and reducing data, representing data, and analyzing and interpreting data. In contrast to earlier studies, this framework offers a complete and coherent account of statistical reasoning. A statistical reasoning assessment tool was then constructed from this initial framework. The tool was administered to 10 tenth-grade students in a task-based interview. The initial framework was refined, and the statistical reasoning assessment tool was revised. The ten students then participated in a second task-based interview, and the data obtained were used to validate the framework. The findings showed that the students' statistical reasoning levels were consistent across the four constructs, and this result confirmed the framework's cohesion. Developed to contribute to statistics education, this newly developed statistical reasoning framework provides a guide for planning learning goals and designing instruction and assessments.

  10. Assessment of alternatives to correct inventory difference statistical treatment deficiencies

    Energy Technology Data Exchange (ETDEWEB)

    Byers, K.R.; Johnston, J.W.; Bennett, C.A.; Brouns, R.J.; Mullen, M.F.; Roberts, F.P.

    1983-11-01

    This document presents an analysis of alternatives to correct deficiencies in the statistical treatment of inventory differences in the NRC guidance documents and licensee practice. Pacific Northwest Laboratory's objective for this study was to assess alternatives developed by the NRC and a panel of safeguards statistical experts. Criteria were developed for the evaluation, and the assessment was made against those criteria. The results of this assessment are PNL recommendations, which are intended to provide NRC decision makers with a logical and statistically sound basis for correcting the deficiencies.

  11. Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.

    Science.gov (United States)

    Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg

    2009-11-01

    G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.
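    As a rough illustration of the kind of computation such a program performs, the sketch below approximates the power of a two-sided test of a bivariate correlation via the Fisher z transformation. This is a textbook approximation, not G*Power's exact routine, and the sample values are hypothetical.

```python
from math import atanh, sqrt

from scipy.stats import norm

def correlation_power(r, n, alpha=0.05):
    """Approximate power of the two-sided test of H0: rho = 0 for a
    Pearson correlation, via the Fisher z transform (needs n > 3)."""
    z_effect = atanh(r) * sqrt(n - 3)   # noncentrality under H1
    z_crit = norm.ppf(1 - alpha / 2)    # two-sided critical value
    # Power = P(|Z| > z_crit) when Z ~ N(z_effect, 1)
    return norm.sf(z_crit - z_effect) + norm.cdf(-z_crit - z_effect)

# A medium effect (r = 0.3) with n = 84 gives power close to 0.80
print(f"{correlation_power(r=0.3, n=84):.3f}")
```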

  12. Model of risk assessment under ballistic statistical tests

    Science.gov (United States)

    Gabrovski, Ivan; Karakaneva, Juliana

    The material presents the application of a mathematical method for risk assessment under statistical determination of the ballistic limits of protection equipment. The authors have implemented a mathematical model based on Pierson's criteria. The software implementation of the model allows evaluation of the V50 indicator and assessment of the reliability of the statistical hypotheses. The results supply specialists with information about the interval estimates of the probability determined during the testing process.
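    The abstract does not give the model's details, so as a hedged illustration of the general idea only: a V50 ballistic limit (the impact velocity at which penetration probability is 50%) is commonly estimated by fitting a logistic response curve to go/no-go test data. The data below are hypothetical, and this is not necessarily the authors' implementation.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical go/no-go data: impact velocity (m/s), 1 = penetration
v = np.array([420, 430, 440, 450, 460, 470, 480, 490, 500, 510])
y = np.array([0,   0,   0,   1,   0,   1,   1,   1,   1,   1])

X = sm.add_constant(v)
fit = sm.Logit(y, X).fit(disp=0)   # P(penetration) = logistic(b0 + b1*v)
b0, b1 = fit.params
v50 = -b0 / b1                     # velocity where P(penetration) = 0.5
print(f"estimated V50 = {v50:.1f} m/s")
```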

  13. Caveats for using statistical significance tests in research assessments

    CERN Document Server

    Schneider, Jesper W

    2011-01-01

    This paper raises concerns about the advantages of using statistical significance tests in research assessments as has recently been suggested in the debate about proper normalization procedures for citation indicators. Statistical significance tests are highly controversial and numerous criticisms have been leveled against their use. Based on examples from articles by proponents of the use of statistical significance tests in research assessments, we address some of the numerous problems with such tests. The issues specifically discussed are the ritual practice of such tests, their dichotomous application in decision making, the difference between statistical and substantive significance, the implausibility of most null hypotheses, the crucial assumption of randomness, as well as the utility of standard errors and confidence intervals for inferential purposes. We argue that applying statistical significance tests and mechanically adhering to their results is highly problematic and detrimental to critical thinking.

  14. APPLICATION OF THE UNIFIED STATISTICAL MATERIAL DATABASE FOR DESIGN AND LIFE/RISK ASSESSMENT OF HIGH TEMPERATURE COMPONENTS

    Institute of Scientific and Technical Information of China (English)

    K.Fujiyama; T.Fujiwara; Y.Nakatani; K.Saito; A.Sakuma; Y.Akikuni; S.Hayashi; S.Matsumoto

    2004-01-01

    Statistical manipulation of material data was conducted for probabilistic life assessment and risk-based design and maintenance of high-temperature components in power plants. To obtain the statistical distribution of material properties, dominant parameters affecting those properties, such as hardness, chemical composition, and characteristic microstructural features, are introduced to normalize the statistical variables. Creep and fatigue properties are expressed by the normalized parameters, and unified statistical distributions are obtained. These probability distribution functions agree well statistically with the field database of steam turbine components. It was concluded that the unified statistical baseline approach is useful for the risk management of power plant components.

  15. Environmental Assessment for power marketing policy for Southwestern Power Administration

    Energy Technology Data Exchange (ETDEWEB)

    1993-12-01

    Southwestern Power Administration (Southwestern) needs to renew expiring power sales contracts with new term (10 year) sales contracts. The existing contracts have been in place for several years and many will expire over the next ten years. Southwestern completed an Environmental Assessment on the existing power allocation in June 1979 (a copy of the EA is attached), and there are no proposed additions of any major new generation resources, service to discrete major new loads, or major changes in operating parameters beyond those included in the existing power allocation. Impacts from a no-action plan, the proposed alternative, and marketing power for terms of less than 10 years are described.

  16. The effect of cluster size variability on statistical power in cluster-randomized trials.

    Directory of Open Access Journals (Sweden)

    Stephen A Lauer

    The frequency of cluster-randomized trials (CRTs) in peer-reviewed literature has increased exponentially over the past two decades. CRTs are a valuable tool for studying interventions that cannot be effectively implemented or randomized at the individual level. However, some aspects of the design and analysis of data from CRTs are more complex than those for individually randomized controlled trials. One of the key components of designing a successful CRT is calculating the proper sample size (i.e., the number of clusters) needed to attain an acceptable level of statistical power. In order to do this, a researcher must make assumptions about the value of several variables, including a fixed mean cluster size. In practice, cluster size can often vary dramatically. Few studies account for the effect of cluster size variation when assessing the statistical power for a given trial. We conducted a simulation study to investigate how the statistical power of CRTs changes with variable cluster sizes. In general, we observed that increases in cluster size variability lead to a decrease in power.
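    A minimal simulation sketch of the phenomenon the study describes, assuming a simple cluster-level analysis (two-sample t-test on unweighted cluster means) and gamma-distributed cluster sizes. The parameter values are illustrative, not those of the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def crt_power(k_per_arm=15, mean_size=50, size_cv=0.0, icc=0.05,
              effect=0.25, n_sims=2000, alpha=0.05):
    """Simulated power of a two-arm CRT analysed by a t-test on cluster
    means; size_cv is the coefficient of variation of cluster sizes."""
    sig_b, sig_w = np.sqrt(icc), np.sqrt(1 - icc)  # between/within SDs (total var 1)
    hits = 0
    for _ in range(n_sims):
        means = []
        for delta in (0.0, effect):                # control arm, treated arm
            if size_cv == 0:
                sizes = np.full(k_per_arm, mean_size)
            else:
                shape = 1 / size_cv**2             # gamma with the requested CV
                sizes = np.maximum(2, rng.gamma(shape, mean_size / shape,
                                                k_per_arm).astype(int))
            u = rng.normal(0, sig_b, k_per_arm)    # cluster random effects
            # cluster mean = fixed effect + random effect + sampling noise
            means.append(delta + u + rng.normal(0, sig_w / np.sqrt(sizes)))
        hits += stats.ttest_ind(means[0], means[1]).pvalue < alpha
    return hits / n_sims

for cv in (0.0, 0.4, 0.8):
    print(f"cluster-size CV {cv:.1f}: power ~ {crt_power(size_cv=cv):.2f}")
```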

  17. A statistical model of uplink inter-cell interference with slow and fast power control mechanisms

    KAUST Repository

    Tabassum, Hina

    2013-09-01

    Uplink power control is in essence an interference mitigation technique that aims at minimizing the inter-cell interference (ICI) in cellular networks by reducing the transmit power levels of the mobile users while maintaining their target received signal quality levels at base stations. Power control mechanisms directly impact the interference dynamics and, thus, affect the overall achievable capacity and consumed power in cellular networks. Due to the stochastic nature of wireless channels and mobile users' locations, it is important to derive theoretical models for ICI that can capture the impact of design alternatives related to power control mechanisms. To this end, we derive and verify a novel statistical model for uplink ICI in Generalized-K composite fading environments as a function of various slow and fast power control mechanisms. The derived expressions are then utilized to quantify numerically key network performance metrics that include average resource fairness, average reduction in power consumption, and ergodic capacity. The accuracy of the derived expressions is validated via Monte-Carlo simulations. Results are generated for multiple network scenarios, and insights are extracted to assess various power control mechanisms as a function of system parameters. © 1972-2012 IEEE.

  18. Caveats for using statistical significance tests in research assessments

    DEFF Research Database (Denmark)

    Schneider, Jesper Wiborg

    2013-01-01

    This article raises concerns about the advantages of using statistical significance tests in research assessments as has recently been suggested in the debate about proper normalization procedures for citation indicators by Opthof and Leydesdorff (2010). Statistical significance tests are highly controversial and numerous criticisms have been leveled against their use. Based on examples from articles by proponents of the use of statistical significance tests in research assessments, we address some of the numerous problems with such tests. The issues specifically discussed are the ritual practice of such tests, their dichotomous application in decision making, the difference between statistical and substantive significance, the implausibility of most null hypotheses, the crucial assumption of randomness, as well as the utility of standard errors and confidence intervals for inferential purposes. We argue that applying statistical significance tests and mechanically adhering to their results is highly problematic and detrimental to critical thinking.

  19. The power and statistical behaviour of allele-sharing statistics when applied to models with two disease loci

    Indian Academy of Sciences (India)

    Yin Y. Shugart; Bing-Jian Feng; Andrew Collins

    2002-11-01

    We have evaluated the power for detecting a common trait determined by two loci, using seven statistics, of which five are implemented in the computer program SimWalk2 and two in GENEHUNTER. Unlike most previous reports, which evaluate the power of allele-sharing statistics for a single disease locus, we have used a simulated data set of general pedigrees in which a two-locus disease is segregating and evaluated several non-parametric linkage statistics implemented in the two programs. We found that the power for detecting linkage using the $S_{\text{all}}$ statistic in GENEHUNTER (GH, version 2.1) differs from that of the corresponding statistic in SimWalk2 (version 2.82). The P values associated with this statistic as output by SimWalk2 are consistently more conservative than those from GENEHUNTER, except when the underlying model includes heterogeneity at a level of 50%, where the output values are very comparable. On the other hand, when the thresholds are determined empirically under the null hypothesis, $S_{\text{all}}$ in GENEHUNTER and the corresponding SimWalk2 statistic have similar power.

  20. Gene set analysis for GWAS: assessing the use of modified Kolmogorov-Smirnov statistics.

    Science.gov (United States)

    Debrabant, Birgit; Soerensen, Mette

    2014-10-01

    We discuss the use of modified Kolmogorov-Smirnov (KS) statistics in the context of gene set analysis and review the corresponding null and alternative hypotheses. In particular, we show that, when enhancing the impact of highly significant genes in the calculation of the test statistic, the corresponding test can be considered to infer the classical self-contained null hypothesis. We use simulations to estimate the power for different kinds of alternatives, and to assess the impact of the weight parameter of the modified KS statistic on the power. Finally, we show the analogy between the weight parameter and the genesis and distribution of the gene-level statistics, and illustrate the effects of differential weighting in a real-life example.
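    A minimal sketch of a weighted KS-type running-sum statistic of the kind discussed here (GSEA-style), on hypothetical gene-level statistics. Setting the weight parameter p = 0 recovers the classical unweighted KS statistic; this is an illustration of the family of statistics, not the authors' exact implementation.

```python
import numpy as np

def weighted_ks_stat(gene_scores, in_set, p=1.0):
    """Weighted KS-like running-sum statistic.
    gene_scores: gene-level statistics; in_set: boolean membership mask;
    p = 0 gives the classical (unweighted) KS statistic."""
    order = np.argsort(gene_scores)[::-1]          # rank genes, descending
    scores = np.abs(gene_scores[order]) ** p
    members = in_set[order]
    hit = np.where(members, scores, 0.0)
    hit /= hit.sum()                               # weighted steps up at set members
    miss = np.where(members, 0.0, 1.0 / (~members).sum())  # uniform steps down
    running = np.cumsum(hit - miss)
    return running[np.argmax(np.abs(running))]     # signed maximum deviation

rng = np.random.default_rng(0)
stats_ = rng.normal(0, 1, 1000)
mask = np.zeros(1000, dtype=bool)
mask[:30] = True              # hypothetical 30-gene set
stats_[:30] += 1.0            # shift: set genes are more significant
print(f"ES = {weighted_ks_stat(stats_, mask, p=1.0):.3f}")
```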

  1. Wind power assessment in Uruguay

    Energy Technology Data Exchange (ETDEWEB)

    Cataldo, J. [Universidad de la Republica, Montevideo (Uruguay). Instituto de Mecanica de los Fluidos e Ingenieria Ambiental; Nunes, V. [Universidad de la Republica, Montevideo (Uruguay). Instituto de Ingeneria Electrica

    1996-09-01

    Wind power appears to be a significant alternative energy source for Uruguay. A nested method was developed to obtain mean wind velocity time series at complex-terrain sites and to describe the turbulence. Sites with a mean velocity over 9 m/s and a capacity factor over 40% were found. Aerodynamic interference losses between wind generators were evaluated using a numerical model, and a model was developed to design an optimal wind farm cluster. As a bulk result, an installed capacity of 300 MW with a production cost of less than 0.065 US$/kWh can be estimated over the whole studied region. (author)

  2. Statistical Power Analysis with Missing Data A Structural Equation Modeling Approach

    CERN Document Server

    Davey, Adam

    2009-01-01

    Statistical power analysis has revolutionized the ways in which we conduct and evaluate research. Similar developments in the statistical analysis of incomplete (missing) data are gaining more widespread application. This volume brings statistical power and incomplete data together under a common framework, in a way that is readily accessible to those with only an introductory familiarity with structural equation modeling. It answers many practical questions, such as how missing data affect the statistical power of a study, and how much power is likely with different amounts and types of missing data.

  3. Comparative environmental assessment of unconventional power installations

    Science.gov (United States)

    Sosnina, E. N.; Masleeva, O. V.; Kryukov, E. V.

    2015-08-01

    A procedure for the strategic environmental assessment of power installations operating on the basis of renewable energy sources (RES) was developed and described. This procedure takes into account not only the operation of the power installation but its whole life cycle: from the production and distribution of the power resources used for manufacturing the installation to its final recovery. Such an approach gives an opportunity to make a more comprehensive assessment of the influence of power installations on the environment and may be used when adapting current regulations and developing new regulations for different types of unconventional power installations with due account of the ecological factor. Application of the procedure of integrated environmental assessment was considered in the context of a mini-HPP (hydro power plant); wind, solar, and biogas power installations; and a traditional power installation operating on natural gas. Comparison of environmental influence revealed advantages of the new energy technologies over traditional ones. It is shown that solar energy installations hardly pollute the environment during operation, but the negative influence of the mining operations and of the manufacturing and utilization of the materials used for solar modules is the highest. Biogas power installations rank second in environmental impact, owing to the considerable mass of the biogas installation and its gas reciprocating engine. The minimum impact on the environment is exerted by the mini-HPP. Consumption of material and energy resources for producing a traditional power installation is lower than for installations on RES; however, this factor grows incomparably once fuel extraction and transfer are taken into account. The greatest environmental impact of traditional power installations comes from their operation.

  4. The Role of Previous Experience and Attitudes toward Statistics in Statistics Assessment Outcomes among Undergraduate Psychology Students

    Science.gov (United States)

    Dempster, Martin; McCorry, Noleen K.

    2009-01-01

    Previous research has demonstrated that students' cognitions about statistics are related to their performance in statistics assessments. The purpose of this research is to examine the nature of the relationships between undergraduate psychology students' previous experiences of maths, statistics and computing; their attitudes toward statistics;…

  5. Basic Statistic Data of Chinese Power Industry in 2005

    Institute of Scientific and Technical Information of China (English)

    Department of Statistics & Information, China Elec

    2006-01-01

    The Chinese power industry developed rapidly in the period from 2003 to 2004, but the supply-demand situation remained precarious. In 2005, the power industry across the country maintained high-speed development as before. Power supply and demand remained tight in some areas, but compared with the previous two years the situation eased somewhat.

  6. Assessing agreement with multiple raters on correlated kappa statistics.

    Science.gov (United States)

    Cao, Hongyuan; Sen, Pranab K; Peery, Anne F; Dellon, Evan S

    2016-07-01

    In clinical studies, it is often of interest to assess the diagnostic agreement among clinicians on certain symptoms. Previous work has focused on the agreement between two clinicians under two different conditions or the agreement among multiple clinicians under one condition. Few have discussed agreement studies with a design in which multiple clinicians examine the same group of patients under two different conditions. In this paper, we use the intraclass kappa statistic for assessing nominal-scale agreement with such a design. We derive an explicit variance formula for the difference of correlated kappa statistics and conduct hypothesis testing for the equality of kappa statistics. Simulation studies show that the method performs well with realistic sample sizes and may be superior to a method that does not take into account the measurement dependence structure. The practical utility of the method is illustrated on data from an eosinophilic esophagitis (EoE) study.
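    A minimal sketch of the intraclass kappa for binary ratings under a common-marginal assumption, with hypothetical data; the paper's variance formula for differences of correlated kappas is not reproduced here.

```python
import numpy as np

def intraclass_kappa(x, y):
    """Intraclass kappa for binary ratings of the same subjects by two
    raters (or one rater under two conditions); assumes a common marginal."""
    x, y = np.asarray(x), np.asarray(y)
    po = np.mean(x == y)               # observed agreement
    p = (x.mean() + y.mean()) / 2      # pooled marginal probability of a "1"
    pe = p**2 + (1 - p)**2             # chance agreement under a common marginal
    return (po - pe) / (1 - pe)

# Hypothetical ratings of 12 patients under two conditions
a = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0]
b = [1, 0, 0, 1, 0, 1, 1, 1, 0, 1, 1, 0]
print(f"kappa = {intraclass_kappa(a, b):.2f}")
```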

  7. Comparisons of power of statistical methods for gene-environment interaction analyses.

    Science.gov (United States)

    Ege, Markus J; Strachan, David P

    2013-10-01

    Any genome-wide analysis is hampered by reduced statistical power due to multiple comparisons. This is particularly true for interaction analyses, which have lower statistical power than analyses of associations. To assess gene-environment interactions in population settings we have recently proposed a statistical method based on a modified two-step approach, where first genetic loci are selected by their associations with disease and environment, respectively, and subsequently tested for interactions. We have simulated various data sets resembling real world scenarios and compared single-step and two-step approaches with respect to true positive rate (TPR) in 486 scenarios and (study-wide) false positive rate (FPR) in 252 scenarios. Our simulations confirmed that in all two-step methods the two steps are not correlated. In terms of TPR, two-step approaches combining information on gene-disease association and gene-environment association in the first step were superior to all other methods, while preserving a low FPR in over 250 million simulations under the null hypothesis. Our weighted modification yielded the highest power across various degrees of gene-environment association in the controls. An optimal threshold for step 1 depended on the interacting allele frequency and the disease prevalence. In all scenarios, the least powerful method was to proceed directly to an unbiased full interaction model, applying conventional genome-wide significance thresholds. This simulation study confirms the practical advantage of two-step approaches to interaction testing over more conventional one-step designs, at least in the context of dichotomous disease outcomes and other parameters that might apply in real-world settings.
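    A minimal simulation sketch of a generic two-step interaction analysis, assuming step 1 screens SNPs on their marginal gene-disease association and step 2 tests the interaction only in the screened subset with Bonferroni correction over that subset. The thresholds and data are illustrative, this is not the authors' weighted modification, and with these settings the causal SNP is usually, but not always, recovered.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n, m = 2000, 200                                   # subjects, SNPs (small for speed)
G = rng.binomial(2, 0.3, (n, m)).astype(float)     # additive genotype coding
E = rng.binomial(1, 0.5, n).astype(float)          # binary exposure
# Disease depends on SNP 0 only through a gene-environment interaction
logit = -1.0 + 0.4 * G[:, 0] * E
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

def interaction_p(g):
    X = sm.add_constant(np.column_stack([g, E, g * E]))
    return sm.Logit(y, X).fit(disp=0).pvalues[-1]  # p-value of the GxE term

# Step 1: screen SNPs on the marginal gene-disease association
marg_p = np.array([sm.Logit(y, sm.add_constant(G[:, j])).fit(disp=0).pvalues[1]
                   for j in range(m)])
screened = np.argsort(marg_p)[: m // 20]           # keep the top 5%

# Step 2: test the interaction only for screened SNPs
hits = [j for j in screened if interaction_p(G[:, j]) < 0.05 / len(screened)]
print("SNPs with significant GxE interaction:", hits)
```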

  8. A Commentary on Statistical Assessment of Violence Recidivism Risk

    OpenAIRE

    Imrey, Peter B.; Dawid, A. Philip

    2015-01-01

    Increasing integration and availability of data on large groups of persons has been accompanied by proliferation of statistical and other algorithmic prediction tools in banking, insurance, marketing, medicine, and other fields (see e.g., Steyerberg (2009a;b)). Controversy may ensue when such tools are introduced to fields traditionally reliant on individual clinical evaluations. Such controversy has arisen about "actuarial" assessments of violence recidivism risk, i.e., the probability that ...

  9. In vivo Comet assay--statistical analysis and power calculations of mice testicular cells.

    Science.gov (United States)

    Hansen, Merete Kjær; Sharma, Anoop Kumar; Dybdahl, Marianne; Boberg, Julie; Kulahci, Murat

    2014-11-01

    The in vivo Comet assay is a sensitive method for evaluating DNA damage. A recurrent concern is how to analyze the data appropriately and efficiently. A popular approach is to summarize the raw data into a summary statistic prior to the statistical analysis. However, consensus on which summary statistic to use has yet to be reached. Another important consideration concerns the assessment of proper sample sizes in the design of Comet assay studies. This study aims to identify a statistic suitably summarizing the % tail DNA of mice testicular samples in Comet assay studies. A second aim is to provide curves for this statistic outlining the number of animals and gels to use. The current study was based on 11 compounds administered via oral gavage in three doses to male mice: CAS no. 110-26-9, CAS no. 512-56-1, CAS no. 111873-33-7, CAS no. 79-94-7, CAS no. 115-96-8, CAS no. 598-55-0, CAS no. 636-97-5, CAS no. 85-28-9, CAS no. 13674-87-8, CAS no. 43100-38-5 and CAS no. 60965-26-6. Testicular cells were examined using the alkaline version of the Comet assay and the DNA damage was quantified as % tail DNA using a fully automatic scoring system. From the raw data 23 summary statistics were examined. A linear mixed-effects model was fitted to the summarized data and the estimated variance components were used to generate power curves as a function of sample size. The statistic that most appropriately summarized the within-sample distributions was the median of the log-transformed data, as it most consistently conformed to the assumptions of the statistical model. Power curves for 1.5-, 2-, and 2.5-fold changes of the highest dose group compared to the control group when 50 and 100 cells were scored per gel are provided to aid in the design of future Comet assay studies on testicular cells.
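    A hedged sketch of how such power curves can be generated from the variance components of a linear mixed-effects model, using the median of log-transformed % tail DNA as the per-gel summary statistic and a normal approximation. The variance components below are hypothetical placeholders, not the values estimated in the study.

```python
import numpy as np
from scipy.stats import norm

def comet_power(fold_change, n_animals, n_gels,
                var_animal=0.06, var_gel=0.02, alpha=0.05):
    """Approximate power for comparing a dose group with a control group on
    the median of log-transformed % tail DNA; variance components are
    hypothetical (animal-to-animal and gel-within-animal on the log scale)."""
    delta = np.log(fold_change)                      # effect on the log scale
    var_mean = (var_animal + var_gel / n_gels) / n_animals
    se = np.sqrt(2 * var_mean)                       # SE of the group difference
    return norm.sf(norm.ppf(1 - alpha / 2) - delta / se)

for fc in (1.5, 2.0, 2.5):
    p = comet_power(fc, n_animals=6, n_gels=2)
    print(f"{fc}-fold change, 6 animals x 2 gels: power ~ {p:.2f}")
```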

  10. Fundamentals of modern statistical methods substantially improving power and accuracy

    CERN Document Server

    Wilcox, Rand R

    2001-01-01

    Conventional statistical methods have a very serious flaw: they routinely miss differences among groups or associations among variables that are detected by more modern techniques, even under very small departures from normality. Hundreds of journal articles have described the reasons standard techniques can be unsatisfactory, but simple, intuitive explanations are generally unavailable. Improved methods have been derived, but they are far from obvious or intuitive based on the training most researchers receive. Situations arise where even highly nonsignificant results become significant when analyzed with more modern methods. Without assuming any prior training in statistics, Part I of this book describes basic statistical principles from a point of view that makes their shortcomings intuitive and easy to understand. The emphasis is on verbal and graphical descriptions of concepts. Part II describes modern methods that address the problems covered in Part I. Using data from actual studies, many examples are included.

  11. Quantitative statistical assessment of conditional models for synthetic aperture radar.

    Science.gov (United States)

    DeVore, Michael D; O'Sullivan, Joseph A

    2004-02-01

    Many applications of object recognition in the presence of pose uncertainty rely on statistical models-conditioned on pose-for observations. The image statistics of three-dimensional (3-D) objects are often assumed to belong to a family of distributions with unknown model parameters that vary with one or more continuous-valued pose parameters. Many methods for statistical model assessment, for example the tests of Kolmogorov-Smirnov and K. Pearson, require that all model parameters be fully specified or that sample sizes be large. Assessing pose-dependent models from a finite number of observations over a variety of poses can violate these requirements. However, a large number of small samples, corresponding to unique combinations of object, pose, and pixel location, are often available. We develop methods for model testing which assume a large number of small samples and apply them to the comparison of three models for synthetic aperture radar images of 3-D objects with varying pose. Each model is directly related to the Gaussian distribution and is assessed both in terms of goodness-of-fit and underlying model assumptions, such as independence, known mean, and homoscedasticity. Test results are presented in terms of the functional relationship between a given significance level and the percentage of samples that would fail a test at that level.

  12. Equivalence versus classical statistical tests in water quality assessments.

    Science.gov (United States)

    Ngatia, Murage; Gonzalez, David; San Julian, Steve; Conner, Arin

    2010-01-01

    To evaluate whether two unattended field organic carbon instruments could provide data comparable to laboratory-generated data, we needed a practical assessment. Null hypothesis statistical testing (NHST) is commonly utilized for such evaluations in environmental assessments, but researchers in other disciplines have identified weaknesses that may limit NHST's usefulness. For example, in NHST, large sample sizes change p-values and a statistically significant result can be obtained by merely increasing the sample size. In addition, p-values can indicate that observed results are statistically significantly different, but in reality the differences could be trivial in magnitude. Equivalence tests, on the other hand, allow the investigator to incorporate decision criteria that have practical relevance to the study. In this paper, we demonstrate the potential use of equivalence tests as an alternative to NHST. We first compare data between the two field instruments, and then compare the field instruments' data to laboratory-generated data using both NHST and equivalence tests. NHST indicated that the data between the two field instruments and the data between the field instruments and the laboratory were significantly different. Equivalence tests showed that the data were equivalent because they fell within a pre-determined equivalence interval based on our knowledge of laboratory precision. We conclude that equivalence tests provide more useful comparisons and interpretation of water quality data than NHST and should be more widely used in similar environmental assessments.
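    A minimal sketch of a two-one-sided-tests (TOST) equivalence procedure for two independent means, with a hypothetical ±0.3 mg/L equivalence interval and synthetic data; in the study, the decision criteria would come from knowledge of laboratory precision.

```python
import numpy as np
from scipy import stats

def tost_ind(x, y, low, upp, alpha=0.05):
    """Two one-sided t-tests (TOST) for equivalence of two independent means.
    Equivalence is declared if the mean difference lies within (low, upp)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    nx, ny = len(x), len(y)
    diff = x.mean() - y.mean()
    sp2 = ((nx - 1) * x.var(ddof=1) + (ny - 1) * y.var(ddof=1)) / (nx + ny - 2)
    se = np.sqrt(sp2 * (1 / nx + 1 / ny))
    df = nx + ny - 2
    p_lower = stats.t.sf((diff - low) / se, df)    # H0: diff <= low
    p_upper = stats.t.cdf((diff - upp) / se, df)   # H0: diff >= upp
    p = max(p_lower, p_upper)                      # both must reject
    return diff, p, p < alpha

rng = np.random.default_rng(3)
field = rng.normal(5.00, 0.4, 60)   # hypothetical field-instrument carbon, mg/L
lab = rng.normal(5.05, 0.4, 60)     # hypothetical laboratory carbon, mg/L
d, p, eq = tost_ind(field, lab, low=-0.3, upp=0.3)
print(f"difference {d:.3f} mg/L, TOST p = {p:.4f}, equivalent: {eq}")
```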

  13. The Statistics Concept Inventory: Development and analysis of a cognitive assessment instrument in statistics

    Science.gov (United States)

    Allen, Kirk

    The Statistics Concept Inventory (SCI) is a multiple choice test designed to assess students' conceptual understanding of topics typically encountered in an introductory statistics course. This dissertation documents the development of the SCI from Fall 2002 up to Spring 2006. The first phase of the project essentially sought to answer the question: "Can you write a test to assess topics typically encountered in introductory statistics?" Book One presents the results utilized in answering this question in the affirmative. The bulk of the results present the development and evolution of the items, primarily relying on objective metrics to gauge effectiveness but also incorporating student feedback. The second phase boils down to: "Now that you have the test, what else can you do with it?" This includes an exploration of Cronbach's alpha, the most commonly-used measure of test reliability in the literature. An online version of the SCI was designed, and its equivalency to the paper version is assessed. Adding an extra wrinkle to the online SCI, subjects rated their answer confidence. These results show a general positive trend between confidence and correct responses. However, some items buck this trend, revealing potential sources of misunderstandings, with comparisons offered to the extant statistics and probability educational research. The third phase is a re-assessment of the SCI: "Are you sure?" A factor analytic study favored a uni-dimensional structure for the SCI, although maintaining the likelihood of a deeper structure if more items can be written to tap similar topics. A shortened version of the instrument is proposed, demonstrated to be able to maintain a reliability nearly identical to that of the full instrument. Incorporating student feedback and a faculty topics survey, improvements to the items and recommendations for further research are proposed. The state of the concept inventory movement is assessed, to offer a comparison to the work presented
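    As a small illustration of the reliability measure discussed, the sketch below computes Cronbach's alpha from its standard formula on simulated item-response data; the data are hypothetical, not actual SCI responses.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_subjects x k_items) score matrix."""
    items = np.asarray(items, float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(4)
ability = rng.normal(0, 1, 200)                   # latent trait, 200 students
# 10 hypothetical items, each a noisy 0/1 reflection of the latent trait
items = (ability[:, None] + rng.normal(0, 1, (200, 10)) > 0).astype(int)
print(f"alpha = {cronbach_alpha(items):.2f}")
```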

  14. Enrichment of statistical power for genome-wide association studies

    Science.gov (United States)

    The inheritance of most human diseases and agriculturally important traits is controlled by many genes with small effects. Identifying these genes, while simultaneously controlling false positives, is challenging. Among available statistical methods, the mixed linear model (MLM) has been the most fl...

  15. Afterglow Light Curves and Broken Power Laws: A Statistical Study

    CERN Document Server

    Jóhannesson, Gudlaugur; Björnsson, Gunnlaugur; Gudmundsson, Einar H.

    2006-01-01

    In gamma-ray burst research it is quite common to fit the afterglow light curves with a broken power law to interpret the data. We apply this method to a computer simulated population of afterglows and find systematic differences between the known model parameters of the population and the ones derived from the power law fits. In general, the slope of the electron energy distribution is overestimated from the pre-break light curve slope while being underestimated from the post-break slope. We also find that the jet opening angle derived from the fits is overestimated in narrow jets and underestimated in wider ones. Results from fitting afterglow light curves with broken power laws must therefore be interpreted with caution since the uncertainties in the derived parameters might be larger than estimated from the fit. This may have implications for Hubble diagrams constructed using gamma-ray burst data.
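    A minimal sketch of the fitting procedure in question: a sharply broken power law fitted to a synthetic light curve with scipy. The injected parameter values are invented for illustration; consistent with the abstract, slopes recovered this way should be interpreted with caution, since fitted parameters can differ systematically from the underlying physical ones.

```python
import numpy as np
from scipy.optimize import curve_fit

def broken_power_law(t, f0, t_break, a1, a2):
    """Sharply broken power law: decay slope a1 before t_break, a2 after."""
    return np.where(t < t_break,
                    f0 * (t / t_break) ** (-a1),
                    f0 * (t / t_break) ** (-a2))

rng = np.random.default_rng(5)
t = np.logspace(-1, 2, 60)                       # days since burst (synthetic)
true = broken_power_law(t, 1e-3, 5.0, 1.0, 2.0)  # injected "true" parameters
flux = true * rng.lognormal(0, 0.1, t.size)      # 10% log-normal scatter

popt, pcov = curve_fit(broken_power_law, t, flux,
                       p0=[1e-3, 3.0, 0.8, 1.8], sigma=0.1 * flux)
for name, val, err in zip(("f0", "t_break", "alpha1", "alpha2"),
                          popt, np.sqrt(np.diag(pcov))):
    print(f"{name:8s} = {val:.3g} +/- {err:.2g}")
```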

  16. Seismic reliability assessment of electric power systems

    Energy Technology Data Exchange (ETDEWEB)

    Singhal, A. [Stanford Univ., CA (United States); Bouabid, J. [Risk Management Solutions, Menlo Park, CA (United States)

    1995-12-31

    This paper presents a methodology for the seismic risk assessment of electric power systems. In evaluating damage and loss of functionality to the electric power components, fragility curves and restoration functions are used. These vulnerability parameters are extracted from the GIS-based regional loss estimation methodology being developed for the US. Observed damage in electric power components during the Northridge earthquake is used to benchmark the methodology. The damage predicted using these vulnerability parameters is found to be in good agreement with the damage observed during the earthquake.
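    Fragility curves of the kind used here are commonly parameterized as lognormal functions of ground-motion intensity; below is a minimal sketch with hypothetical parameters, not values from the GIS-based methodology described in the paper.

```python
import numpy as np
from scipy.stats import norm

def fragility(pga, median, beta):
    """Lognormal fragility curve: P(damage state reached | PGA).
    median: PGA (g) at 50% exceedance probability; beta: log-standard deviation."""
    return norm.cdf(np.log(pga / median) / beta)

# Hypothetical parameters for a substation component (illustrative only)
pga = np.array([0.1, 0.2, 0.3, 0.5, 0.8])
print(np.round(fragility(pga, median=0.4, beta=0.5), 3))
```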

  17. Statistical Pattern-Based Assessment of Structural Health Monitoring Data

    Directory of Open Access Journals (Sweden)

    Mohammad S. Islam

    2014-01-01

    In structural health monitoring (SHM), various sensors are installed at critical locations of a structure. The signals from the sensors are either continuously or periodically analyzed to determine the state and performance of the structure. An objective comparison of the sensor data at different time ranges is essential for assessing the structural condition or excessive load experienced by the structure, which leads to potential damage in the structure. The objectives of the current study are to establish relationships between the data from various sensors in order to estimate the reliability of the data and potential damage using statistical pattern-matching techniques. In order to achieve these goals, new methodologies based on statistical pattern recognition techniques have been developed. The proposed methodologies have been developed and validated using sensor data obtained from an instrumented bridge and road test data from heavy vehicles. The application of statistical pattern-matching techniques is relatively new in SHM data interpretation, and the current research demonstrates that it has high potential for assessing structural conditions, especially when the data are noisy and susceptible to environmental disturbances.

  18. Mathematical Power: Exploring Critical Pedagogy in Mathematics and Statistics

    Science.gov (United States)

    Lesser, Lawrence M.; Blake, Sally

    2007-01-01

    Though traditionally viewed as value-free, mathematics is actually one of the most powerful, yet underutilized, venues for working towards the goals of critical pedagogy--social, political and economic justice for all. This emerging awareness is due to how critical mathematics educators such as Frankenstein, Skovsmose and Gutstein have applied the…

  19. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses.

    Science.gov (United States)

    Dinov, Ivo D; Sanchez, Juana; Christou, Nicolas

    2008-01-01

    Technology-based instruction represents a recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools of communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present the results on the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction, and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We have used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment per individual

  20. Statistical Power of Psychological Research: What Have We Gained in 20 Years?

    Science.gov (United States)

    Rossi, Joseph S.

    1990-01-01

    Calculated power for 6,155 statistical tests in 221 journal articles published in 1982 volumes of "Journal of Abnormal Psychology,""Journal of Consulting and Clinical Psychology," and "Journal of Personality and Social Psychology." Power to detect small, medium, and large effects was .17, .57, and .83, respectively. Concluded that power of…

  1. Near and Far from Equilibrium Power-Law Statistics

    CERN Document Server

    Biro, Tamas S; Biro, Gabor; Shen, Ke Ming

    2016-01-01

    We analyze the connection between $p_T$ and multiplicity distributions in a statistical framework. We connect the Tsallis parameters, $T$ and $q$, to physical properties like average energy per particle and the second scaled factorial moment, $F_2=\\langle n(n-1) \\rangle / {\\langle n \\rangle}^2$, measured in multiplicity distributions. Near and far from equilibrium scenarios with master equations for the probability of having $n$ particles, $P_n$, are reviewed based on hadronization transition rates, $\\mu_n$, from $n$ to $n+1$ particles.

  2. The issue of statistical power for overall model fit in evaluating structural equation models

    Directory of Open Access Journals (Sweden)

    Richard HERMIDA

    2015-06-01

    Statistical power is an important concept for psychological research. However, examining the power of a structural equation model (SEM) is rare in practice. This article provides an accessible review of the concept of statistical power for the Root Mean Square Error of Approximation (RMSEA) index of overall model fit in structural equation modeling. By way of example, we examine the current state of power in the literature by reviewing studies in top Industrial-Organizational (I/O) Psychology journals using SEMs. Results indicate that in many studies power is very low, which implies acceptance of invalid models. Additionally, we examined methodological situations which may have an influence on the statistical power of SEMs. Results showed that power varies significantly as a function of model type and whether or not the model is the main model for the study. Finally, results indicated that power is significantly related to the model fit statistics used in evaluating SEMs. The results from this quantitative review imply that researchers should be more vigilant with respect to power in structural equation modeling. We therefore conclude by offering methodological best practices to increase confidence in the interpretation of structural equation modeling results with respect to statistical power issues.
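    A minimal sketch of the standard RMSEA power computation (in the spirit of MacCallum, Browne and Sugawara, 1996), testing close fit against an assumed true RMSEA via noncentral chi-square distributions; the df, n, and RMSEA values below are illustrative.

```python
from scipy.stats import ncx2

def rmsea_power(df, n, rmsea0=0.05, rmsea_a=0.08, alpha=0.05):
    """Power of the RMSEA test of close fit: H0: rmsea <= rmsea0,
    evaluated against an assumed true (alternative) value rmsea_a."""
    ncp0 = (n - 1) * df * rmsea0**2        # noncentrality under H0
    ncp_a = (n - 1) * df * rmsea_a**2      # noncentrality under H1
    crit = ncx2.ppf(1 - alpha, df, ncp0)   # critical chi-square value
    return ncx2.sf(crit, df, ncp_a)

print(f"power = {rmsea_power(df=50, n=200):.3f}")
```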

  3. Causality in Statistical Power: Isomorphic Properties of Measurement, Research Design, Effect Size, and Sample Size

    Science.gov (United States)

    Heidel, R. Eric

    2016-01-01

    Statistical power is the ability to detect a significant effect, given that the effect actually exists in a population. Like most statistical concepts, statistical power tends to induce cognitive dissonance in hepatology researchers. However, planning for statistical power by an a priori sample size calculation is of paramount importance when designing a research study. There are five specific empirical components that make up an a priori sample size calculation: the scale of measurement of the outcome, the research design, the magnitude of the effect size, the variance of the effect size, and the sample size. A framework grounded in the phenomenon of isomorphism, or interdependencies amongst different constructs with similar forms, will be presented to understand the isomorphic effects of decisions made on each of the five aforementioned components of statistical power. PMID:27073717

  4. Causality in Statistical Power: Isomorphic Properties of Measurement, Research Design, Effect Size, and Sample Size

    Directory of Open Access Journals (Sweden)

    R. Eric Heidel

    2016-01-01

    Statistical power is the ability to detect a significant effect, given that the effect actually exists in a population. Like most statistical concepts, statistical power tends to induce cognitive dissonance in hepatology researchers. However, planning for statistical power by an a priori sample size calculation is of paramount importance when designing a research study. There are five specific empirical components that make up an a priori sample size calculation: the scale of measurement of the outcome, the research design, the magnitude of the effect size, the variance of the effect size, and the sample size. A framework grounded in the phenomenon of isomorphism, or interdependencies amongst different constructs with similar forms, will be presented to understand the isomorphic effects of decisions made on each of the five aforementioned components of statistical power.

  5. Combining heuristic and statistical techniques in landslide hazard assessments

    Science.gov (United States)

    Cepeda, Jose; Schwendtner, Barbara; Quan, Byron; Nadim, Farrokh; Diaz, Manuel; Molina, Giovanni

    2014-05-01

    As a contribution to the Global Assessment Report 2013 - GAR2013, coordinated by the United Nations International Strategy for Disaster Reduction - UNISDR, a drill-down exercise for landslide hazard assessment was carried out by entering the results of both heuristic and statistical techniques into a new but simple combination rule. The data available for this evaluation included landslide inventories, both historical and event-based. In addition to the application of a heuristic method used in the previous editions of GAR, the availability of inventories motivated the use of statistical methods. The heuristic technique is largely based on the Mora & Vahrson method, which estimates hazard as the product of susceptibility and triggering factors, where classes are weighted based on expert judgment and experience. Two statistical methods were also applied: the landslide index method, which estimates weights of the classes for the susceptibility and triggering factors based on the evidence provided by the density of landslides in each class of the factors; and the weights of evidence method, which extends the previous technique to include both positive and negative evidence of landslide occurrence in the estimation of weights for the classes. One key aspect during the hazard evaluation was the decision on the methodology to be chosen for the final assessment. Instead of opting for a single methodology, it was decided to combine the results of the three implemented techniques using a combination rule based on a normalization of the results of each method. The hazard evaluation was performed for both earthquake- and rainfall-induced landslides. The country chosen for the drill-down exercise was El Salvador. The results indicate that highest hazard levels are concentrated along the central volcanic chain and at the centre of the northern mountains.
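    A minimal sketch of the weights-of-evidence calculation for a single binary factor, on simulated raster cells: the positive and negative weights W+ and W- are estimated from the density of landslides inside and outside the factor class. The factor and landslide probabilities are invented for illustration.

```python
import numpy as np

def weights_of_evidence(factor_present, landslide):
    """Weights of evidence for one binary predisposing factor.
    W+ applies where the factor is present, W- where it is absent."""
    f = np.asarray(factor_present, bool)
    s = np.asarray(landslide, bool)
    w_plus = (np.log((f & s).sum() / s.sum())
              - np.log((f & ~s).sum() / (~s).sum()))
    w_minus = (np.log((~f & s).sum() / s.sum())
               - np.log((~f & ~s).sum() / (~s).sum()))
    return w_plus, w_minus

rng = np.random.default_rng(6)
steep = rng.random(10000) < 0.3                          # hypothetical factor map
slide = rng.random(10000) < np.where(steep, 0.08, 0.02)  # landslides favour steep cells
wp, wm = weights_of_evidence(steep, slide)
print(f"W+ = {wp:.2f}, W- = {wm:.2f}, contrast C = {wp - wm:.2f}")
```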

  6. Power-law distributions in economics: a nonextensive statistical approach

    CERN Document Server

    Queiros, S M D; Tsallis, C; Queiros, Silvio M. Duarte; Anteneodo, Celia; Tsallis, Constantino

    2005-01-01

    The cornerstone of Boltzmann-Gibbs ($BG$) statistical mechanics is the Boltzmann-Gibbs-Jaynes-Shannon entropy $S_{BG} \equiv -k\int dx f(x)\ln f(x)$, where $k$ is a positive constant and $f(x)$ a probability density function. This theory has exhibited, over more than a century, great success in the treatment of systems where short spatio/temporal correlations dominate. There are, however, anomalous natural and artificial systems that violate the basic requirements for its applicability. Different physical entropies, other than the standard one, appear to be necessary in order to satisfactorily deal with such anomalies. One of such entropies is $S_q \equiv k (1-\int dx [f(x)]^q)/(1-q)$ (with $S_1=S_{BG}$), where the entropic index $q$ is a real parameter. It has been proposed as the basis for a generalization, referred to as {\it nonextensive statistical mechanics}, of the $BG$ theory. $S_q$ shares with $S_{BG}$ four remarkable properties, namely {\it concavity} ($\forall q>0$), {\it Lesche-stability} ($\for...

  7. Statistical methods for assessing agreement between continuous measurements

    DEFF Research Database (Denmark)

    Sokolowski, Ineta; Hansen, Rikke Pilegaard; Vedsted, Peter

    …concordance coefficient, Bland-Altman limits of agreement, and percentage of agreement to assess the agreement between patient-reported delay and doctor-reported delay in diagnosis of cancer in general practice. Key messages: The correct statistical approach is not obvious. Many studies give the product-moment correlation coefficient (r) between the results of the two measurement methods as an indicator of agreement, which is wrong. Several alternative methods have been proposed, which we describe together with the preconditions for their use.
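    A minimal sketch of the Bland-Altman limits of agreement mentioned above, on hypothetical patient- and doctor-reported delays; unlike the product-moment correlation, this directly quantifies how far apart the two methods' readings can be expected to lie.

```python
import numpy as np

def bland_altman(a, b):
    """Bland-Altman bias and 95% limits of agreement for two methods."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, bias - half_width, bias + half_width

# Hypothetical patient- and doctor-reported delays (days)
patient = np.array([10, 14, 30, 7, 21, 60, 12, 45, 18, 25])
doctor = np.array([12, 10, 28, 7, 25, 50, 14, 40, 20, 22])
bias, lo, hi = bland_altman(patient, doctor)
print(f"bias = {bias:.1f} days, 95% limits of agreement: ({lo:.1f}, {hi:.1f})")
```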

  8. Preliminary statistical assessment towards characterization of biobotic control.

    Science.gov (United States)

    Latif, Tahmid; Meng Yang; Lobaton, Edgar; Bozkurt, Alper

    2016-08-01

    Biobotic research involving neurostimulation of instrumented insects to control their locomotion is finding potential as an alternative solution towards development of centimeter-scale distributed swarm robotics. To improve the reliability of biobotic agents, their control mechanism needs to be precisely characterized. To achieve this goal, this paper presents our initial efforts for statistical assessment of the angular response of roach biobots to the applied bioelectrical stimulus. Subsequent findings can help to understand the effect of each stimulation parameter individually or collectively and eventually reach reliable and consistent biobotic control suitable for real life scenarios.

  9. Efficiency statistics at all times: Carnot limit at finite power.

    Science.gov (United States)

    Polettini, M; Verley, G; Esposito, M

    2015-02-01

    We derive the statistics of the efficiency under the assumption that thermodynamic fluxes fluctuate with normal law, parametrizing it in terms of time, macroscopic efficiency, and a coupling parameter ζ. It has a peculiar behavior: no moments, one sub-, and one super-Carnot maxima corresponding to reverse operating regimes (engine or pump), the most probable efficiency decreasing in time. The limit ζ→0 where the Carnot bound can be saturated gives rise to two extreme situations, one where the machine works at its macroscopic efficiency, with Carnot limit corresponding to no entropy production, and one where for a transient time scaling like 1/ζ microscopic fluctuations are enhanced in such a way that the most probable efficiency approaches the Carnot limit at finite entropy production.

  10. Statistical modeling and analysis of the influence of antenna polarization error on received power

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The problem of statistical modeling of antenna polarization error is studied and the statistical characteristics of the antenna's received power are analyzed. A novel Stokes-vector-based method is presented to describe the concept of an antenna's polarization purity. A statistical model of the antenna's polarization error in the polarization domain is then built. When an antenna with a polarization error of uniform distribution is illuminated by an arbitrarily polarized incident field, the probability density of the antenna's received power is derived analytically. Finally, a group of curves of the deviation and standard deviation of the received power is plotted numerically.

  11. Statistical assessment of quality of credit activity of Ukrainian banks

    Directory of Open Access Journals (Sweden)

    Moldavska Olena V.

    2013-03-01

    The article conducts an economic and statistical analysis of the current state of credit activity of Ukrainian banks and the main tendencies of its development. It justifies the urgency of the statistical study of the credit activity of banks. It offers a complex system of assessment of bank lending at two levels: the level of the banking system and the level of an individual bank. The use of systems analysis allows reflecting the interconnection between the effectiveness of functioning of the banking system and the quality of the credit portfolio. The article considers the main aspects of managing the quality of the credit portfolio: the level of troubled debt and credit risk. It touches on the problem of adequate quantitative assessment of troubled loans in the credit portfolios of banks, since the methodologies for its calculation used by the National Bank of Ukraine and international rating agencies are quite different. The article presents a system of methods of credit risk management, both theoretically and with specific examples, in the context of preventing the occurrence of risk situations or eliminating their consequences.

  12. On the statistical assessment of classifiers using DNA microarray data

    Directory of Open Access Journals (Sweden)

    Carella M

    2006-08-01

    Background: In this paper we present a method for the statistical assessment of cancer predictors which make use of gene expression profiles. The methodology is applied to a new data set of microarray gene expression data collected in Casa Sollievo della Sofferenza Hospital, Foggia, Italy. The data set is made up of normal (22) and tumor (25) specimens extracted from 25 patients affected by colon cancer. We propose to give answers to some questions which are relevant for the automatic diagnosis of cancer, such as: Is the size of the available data set sufficient to build accurate classifiers? What is the statistical significance of the associated error rates? In what ways can accuracy be considered dependent on the adopted classification scheme? How many genes are correlated with the pathology and how many are sufficient for an accurate colon cancer classification? The method we propose answers these questions whilst avoiding the potential pitfalls hidden in the analysis and interpretation of microarray data. Results: We estimate the generalization error, evaluated through the Leave-K-Out Cross Validation error, for three different classification schemes by varying the number of training examples and the number of genes used. The statistical significance of the error rate is measured by using a permutation test. We provide a statistical analysis in terms of the frequencies of the genes involved in the classification. Using the whole set of genes, we found that the Weighted Voting Algorithm (WVA) classifier learns the distinction between normal and tumor specimens with 25 training examples, providing e = 21% (p = 0.045) as an error rate. This remains constant even when the number of examples increases. Moreover, Regularized Least Squares (RLS) and Support Vector Machines (SVM) classifiers can learn with only 15 training examples, with error rates of e = 19% (p = 0.035) and e = 18% (p = 0.037), respectively. Moreover, the error rate
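    A minimal sketch of the permutation-test logic used to attach significance to an error rate: the cross-validated error is re-estimated under shuffled labels, and the p-value is the fraction of permutations that do at least as well. A simple nearest-centroid classifier, leave-one-out error, and synthetic expression data stand in for the WVA/RLS/SVM classifiers of the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

def loo_error(X, y):
    """Leave-one-out error of a nearest-centroid classifier."""
    errs = 0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        c0 = X[mask & (y == 0)].mean(axis=0)
        c1 = X[mask & (y == 1)].mean(axis=0)
        pred = int(np.linalg.norm(X[i] - c1) < np.linalg.norm(X[i] - c0))
        errs += pred != y[i]
    return errs / len(y)

# Synthetic expression matrix: 47 specimens x 100 genes, 10 informative genes
y = np.array([0] * 22 + [1] * 25)
X = rng.normal(0, 1, (47, 100))
X[y == 1, :10] += 1.0

e_obs = loo_error(X, y)
perm = [loo_error(X, rng.permutation(y)) for _ in range(200)]
p = (1 + sum(e <= e_obs for e in perm)) / 201
print(f"error = {e_obs:.0%}, permutation p = {p:.3f}")
```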

  13. Statistical tools to improve assessing agreement between several observers.

    Science.gov (United States)

    Ruddat, I; Scholz, B; Bergmann, S; Buehring, A-L; Fischer, S; Manton, A; Prengel, D; Rauch, E; Steiner, S; Wiedmann, S; Kreienbrock, L; Campe, A

    2014-04-01

    In the context of assessing the impact of management and environmental factors on animal health, behaviour or performance, it has become increasingly important to conduct (epidemiological) studies in the field. Hence, the number of investigated farms per study is considerably high, so that numerous observers are needed for the investigation. In order to maintain the quality and validity of study results, calibration meetings, where observers are trained and the current level of agreement is assessed, have to be conducted to minimise the observer effect. When study animals are rated independently by the same observers on a categorical variable, the exclusion test can be performed to identify disagreeing observers. This statistical test compares, for each variable and each observer, the observer-specific agreement with the overall agreement among all observers, based on kappa coefficients. It accounts for two major challenges, namely the absence of a gold-standard observer and different data types comprising ordinal, nominal and binary data. The presented methods are applied to a reliability study assessing the agreement among eight observers rating welfare parameters of laying hens. The degree to which the observers agreed depended on the investigated item (global weighted kappa coefficients: 0.37 to 0.94). The proposed method and graphical description serve to assess the direction and degree to which an observer deviates from the others. It is suggested to further improve studies with numerous observers by conducting calibration meetings and accounting for observer bias.

  14. Statistical Power Flow Analysis of an Imperfect Ribbed Cylinder

    Science.gov (United States)

    Blakemore, M.; Woodhouse, J.; Hardie, D. J. W.

    1999-05-01

    Prediction of the noise transmitted from machinery and flow sources on a submarine to the sonar arrays poses a complex problem. Vibrations in the pressure hull provide the main transmission mechanism. The pressure hull is characterised by a very large number of modes over the frequency range of interest (at least 100,000) and by high modal overlap, both of which place its analysis beyond the scope of finite element or boundary element methods. A method for calculating the transmission is presented, which is broadly based on Statistical Energy Analysis, but extended in two important ways: (1) a novel subsystem breakdown which exploits the particular geometry of a submarine pressure hull; (2) explicit modelling of energy density variation within a subsystem due to damping. The method takes account of fluid-structure interaction, the underlying pass/stop band characteristics resulting from the near-periodicity of the pressure hull construction, the effect of vibration isolators such as bulkheads, and the cumulative effect of irregularities (e.g., attachments and penetrations).

  15. Violation of statistical isotropy and homogeneity in the 21-cm power spectrum

    CERN Document Server

    Shiraishi, Maresuke; Kamionkowski, Marc; Raccanelli, Alvise

    2016-01-01

    Most inflationary models predict primordial perturbations to be statistically isotropic and homogeneous. Cosmic-Microwave-Background (CMB) observations, however, indicate a possible departure from statistical isotropy in the form of a dipolar power modulation at large angular scales. Alternative models of inflation, beyond the simplest single-field slow-roll models, can generate a small power asymmetry, consistent with these observations. Observations of clustering of quasars show, however, agreement with statistical isotropy at much smaller angular scales. Here we propose to use off-diagonal components of the angular power spectrum of the 21-cm fluctuations during the dark ages to test this power asymmetry. We forecast results for the planned SKA radio array, a future radio array, and the cosmic-variance-limited case as a theoretical proof of principle. Our results show that the 21-cm-line power spectrum will enable access to information at very small scales and at different redshift slices, thus improving u...

  16. Low statistical power in biomedical science: a review of three human research domains

    Science.gov (United States)

    Dumas-Mallet, Estelle; Button, Katherine S.; Boraud, Thomas; Gonon, Francois

    2017-01-01

    Studies with low statistical power increase the likelihood that a statistically significant finding represents a false positive result. We conducted a review of meta-analyses of studies investigating the association of biological, environmental or cognitive parameters with neurological, psychiatric and somatic diseases, excluding treatment studies, in order to estimate the average statistical power across these domains. Taking the effect size indicated by a meta-analysis as the best estimate of the likely true effect size, and assuming a threshold for declaring statistical significance of 5%, we found that approximately 50% of studies have statistical power in the 0–10% or 11–20% range, well below the minimum of 80% that is often considered conventional. Studies with low statistical power appear to be common in the biomedical sciences, at least in the specific subject areas captured by our search strategy. However, we also observe evidence that this depends in part on research methodology, with candidate gene studies showing very low average power and studies using cognitive/behavioural measures showing high average power. This warrants further investigation.
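    A minimal sketch of the underlying calculation: taking a meta-analytic standardized effect size as the best estimate of the true effect, the approximate power of a two-sided two-sample comparison follows from the normal approximation. The effect size d and the sample sizes below are illustrative.

```python
from math import sqrt

from scipy.stats import norm

def two_sample_power(d, n_per_group, alpha=0.05):
    """Approximate power of a two-sided two-sample test for a standardized
    effect size d, using the normal approximation."""
    ncp = d * sqrt(n_per_group / 2)     # noncentrality of the test statistic
    z = norm.ppf(1 - alpha / 2)
    return norm.sf(z - ncp) + norm.cdf(-z - ncp)

# With d = 0.3 as the best estimate of the true effect, roughly 175 subjects
# per group are needed for the conventional 80% power
for n in (20, 50, 100, 175):
    print(f"n = {n:3d} per group: power ~ {two_sample_power(0.3, n):.2f}")
```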

  17. Direct flash steam geothermal power plant assessment

    Science.gov (United States)

    Alt, T. E.

    1982-01-01

    The objective was to analyze the capacity and availability factors of an operating direct flash geothermal power plant. System and component specifications, operating procedures, maintenance history, malfunctions, and outage rate are discussed. The plant studied was the 75 MW(e) geothermal power plant at Cerro Prieto, Mexico, for the years 1973 to 1979. To describe and assess the plant, the project staff reviewed documents, visited the plant, and met with staff of the operating utility. The high reliability and availability of the plant was documented and actions responsible for the good performance were identified and reported. The results are useful as guidance to US utilities considering use of hot water geothermal resources for power generation through a direct flash conversion cycle.

  18. Nuclear power plant security assessment technical manual.

    Energy Technology Data Exchange (ETDEWEB)

    O'Connor, Sharon L.; Whitehead, Donnie Wayne; Potter, Claude S., III

    2007-09-01

    This report (Nuclear Power Plant Security Assessment Technical Manual) is a revision to NUREG/CR-1345 (Nuclear Power Plant Design Concepts for Sabotage Protection) that was published in January 1981. It provides conceptual and specific technical guidance for U.S. Nuclear Regulatory Commission nuclear power plant design certification and combined operating license applicants as they: (1) develop the layout of a facility (i.e., how buildings are arranged on the site property and how they are arranged internally) to enhance protection against sabotage and facilitate the use of physical security features; (2) design the physical protection system (PPS) to be used at the facility; and (3) analyze the effectiveness of the PPS against the design basis threat. It should be used as a technical manual in conjunction with the 'Nuclear Power Plant Security Assessment Format and Content Guide'. The opportunity to optimize physical protection in the design of a nuclear power plant is obtained when an applicant utilizes both documents when performing a security assessment. This document provides a set of best practices that incorporates knowledge gained from more than 30 years of physical protection system design and evaluation activities at Sandia National Laboratories and insights derived from U.S. Nuclear Regulatory Commission technical staff into a manual that describes a development and analysis process of physical protection systems suitable for future nuclear power plants. In addition, selected security system technologies that may be used in a physical protection system are discussed. The scope of this document is limited to the identification of a set of best practices associated with the design and evaluation of physical security at future nuclear power plants in general. As such, it does not provide specific recommendations for the design and evaluation of physical security for any specific reactor design. These best practices should be applicable to the design and

  19. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the

  20. The Statistical Power of the Cluster Randomized Block Design with Matched Pairs--A Simulation Study

    Science.gov (United States)

    Dong, Nianbo; Lipsey, Mark

    2010-01-01

    This study uses simulation techniques to examine the statistical power of the group-randomized design and the matched-pair (MP) randomized block design under various parameter combinations. Both nearest neighbor matching and random matching are used for the MP design. The power of each design for any parameter combination was calculated from…
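
    The simulation logic for such a design can be sketched briefly; a toy Monte Carlo estimate of power for a matched-pair cluster design with random matching, person-level variance 1, and hypothetical parameter values (not the study's settings):

    ```python
    import numpy as np
    from scipy.stats import ttest_rel

    rng = np.random.default_rng(11)

    def mp_power(pairs=10, icc=0.1, effect=0.3, m=20, reps=2000, alpha=0.05):
        """Rejection rate of a paired t-test on cluster means."""
        hits = 0
        for _ in range(reps):
            cluster = rng.normal(0, np.sqrt(icc), (pairs, 2))          # cluster effects
            means = cluster + rng.normal(0, 1, (pairs, 2, m)).mean(axis=2)
            means[:, 1] += effect                                      # treated cluster in each pair
            hits += ttest_rel(means[:, 1], means[:, 0]).pvalue < alpha
        return hits / reps

    print(mp_power())
    ```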

  1. A critical discussion of null hypothesis significance testing and statistical power analysis within psychological research

    DEFF Research Database (Denmark)

    Jones, Allan; Sommerlund, Bo

    2007-01-01

    The uses of null hypothesis significance testing (NHST) and statistical power analysis within psychological research are critically discussed. The article looks at the problems of relying solely on NHST when dealing with small and large sample sizes. The use of power analysis in estimating the potential error introduced by small and large samples is advocated. Power analysis is not recommended as a replacement for NHST but as an additional source of information about the phenomena under investigation. Moreover, the importance of conceptual analysis in relation to statistical analysis of hypothesis…

  2. Statistical-Based Joint Power Control for Wireless Ad Hoc CDMA Networks

    Institute of Scientific and Technical Information of China (English)

    ZHANG Shu; RONG Mongtian; CHEN Bo

    2005-01-01

    Current power control algorithms for CDMA-based ad hoc networks rely on SIR and interference measurements that are based on historical information. However, traffic in today's and tomorrow's networks is characterized by burstiness, so the interference at a given receiving node may fluctuate dramatically; as a consequence, power control converges slowly and performance degrades. This paper presents a joint power control model: for a given receiving node, all transmitting nodes assigned to the same time slot adjust their transmitter power based on current information, taking into account the adjustments made by the other transmitting nodes. Based on the joint power control model, the paper proposes a statistical power control algorithm through which the interference is estimated more accurately. Simulation results indicate that the proposed power control algorithm outperforms the old one.
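
    For background, the classical distributed power-control iteration that such schemes refine scales each transmitter's power by the ratio of a target SIR to its measured SIR; a minimal sketch with synthetic link gains (the Foschini-Miljanic update, not the paper's joint algorithm):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 4                                       # transmitter-receiver pairs
    G = rng.uniform(0.01, 0.1, (n, n))          # synthetic cross-link gains
    np.fill_diagonal(G, rng.uniform(0.5, 1.0, n))
    noise, target = 1e-3, 2.0                   # receiver noise, target SIR
    p = np.full(n, 0.1)

    def sir(p):
        interference = G @ p - np.diag(G) * p + noise
        return np.diag(G) * p / interference

    for _ in range(50):
        p = np.clip(target / sir(p) * p, 0.0, 10.0)   # scale toward target SIR

    print(np.round(sir(p), 3))   # approaches the target if it is feasible
    ```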

  3. Geotechnical assessments of upgrading power transmission lines

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Andrew [Coffey Geotechnics Ltd., Harrogate (United Kingdom)]

    2012-11-01

    One of the consequences of increasing demand for energy is a corresponding requirement for increased energy distribution. This trend is likely to be magnified by the current tendency to generate power in locations remote from centres of population. New power transmission routes are expensive and awkward to develop, and there are therefore benefits to be gained by upgrading existing routes. However, this in turn raises problems of a different nature. The re-use of any structure must necessarily imply the acceptance of unknowns. The upgrading of transmission lines is no exception to this, particularly when assessing foundations, which in their nature are not visible. A risk-based approach is therefore used. This paper describes some of the geotechnical aspects of the assessment of electric power transmission lines for upgrading. It briefly describes the background, then discusses some of the problems encountered and the methods used to address them. These methods are based mainly on information obtained from desk studies and walkover surveys, with a limited amount of intrusive investigation. (orig.)

  4. An assessment of recently published gene expression data analyses: reporting experimental design and statistical factors

    Directory of Open Access Journals (Sweden)

    Azuaje Francisco

    2006-06-01

    Background: The analysis of large-scale gene expression data is a fundamental approach to functional genomics and the identification of potential drug targets. Results derived from such studies cannot be trusted unless they are adequately designed and reported. The purpose of this study is to assess current practices on the reporting of experimental design and statistical analyses in gene expression-based studies. Methods: We reviewed hundreds of MEDLINE-indexed papers involving gene expression data analysis, which were published between 2003 and 2005. These papers were examined on the basis of their reporting of several factors, such as sample size, statistical power and software availability. Results: Among the examined papers, we concentrated on 293 papers consisting of applications and new methodologies. These papers did not report approaches to sample size and statistical power estimation. Explicit statements on data transformation and descriptions of the normalisation techniques applied prior to data analyses (e.g. classification) were not reported in 57 (37.5%) and 104 (68.4%) of the methodology papers, respectively. With regard to papers presenting biomedically relevant applications, 41 (29.1%) did not report on data normalisation and 83 (58.9%) did not describe the normalisation technique applied. Clustering-based analysis, the t-test and ANOVA represent the most widely applied techniques in microarray data analysis. But remarkably, only 5 (3.5%) of the application papers included statements or references to assumptions about variance homogeneity for the application of the t-test and ANOVA. There is still a need to promote the reporting of software packages applied or their availability. Conclusion: Recently published gene expression data analysis studies may lack key information required for properly assessing their design quality and potential impact. There is a need for more rigorous reporting of important experimental

  5. Statistics

    Science.gov (United States)

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  6. Monte Carlo based statistical power analysis for mediation models: methods and software.

    Science.gov (United States)

    Zhang, Zhiyong

    2014-12-01

    The existing literature on statistical power analysis for mediation models often assumes data normality and is based on a less powerful Sobel test instead of the more powerful bootstrap test. This study proposes to estimate statistical power to detect mediation effects on the basis of the bootstrap method through Monte Carlo simulation. Nonnormal data with excessive skewness and kurtosis are allowed in the proposed method. A free R package called bmem is developed to conduct the power analysis discussed in this study. Four examples, including a simple mediation model, a multiple-mediator model with a latent mediator, a multiple-group mediation model, and a longitudinal mediation model, are provided to illustrate the proposed method.
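
    The proposed approach nests a bootstrap test inside a Monte Carlo loop; a stripped-down Python sketch rather than the author's bmem R package, assuming a simple mediation model X -> M -> Y with no direct effect (parameters hypothetical):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def slope(x, y):
        """OLS slope of y on x."""
        xc = x - x.mean()
        return (xc @ (y - y.mean())) / (xc @ xc)

    def mediation_power(n, a=0.3, b=0.3, reps=200, boots=300, alpha=0.05):
        hits = 0
        for _ in range(reps):
            x = rng.normal(size=n)
            m = a * x + rng.normal(size=n)      # mediator
            y = b * m + rng.normal(size=n)      # outcome, no direct effect
            ab = np.empty(boots)
            for j in range(boots):              # percentile bootstrap of a*b
                i = rng.integers(0, n, n)
                ab[j] = slope(x[i], m[i]) * slope(m[i], y[i])
            lo, hi = np.percentile(ab, [100 * alpha / 2, 100 * (1 - alpha / 2)])
            hits += (lo > 0) or (hi < 0)        # CI excludes zero
        return hits / reps

    print(mediation_power(n=100))
    ```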

  7. The power of alternative assessments (AAs)

    Institute of Scientific and Technical Information of China (English)

    张千茜

    2013-01-01

    This article starts by discussing the potential disadvantages of traditional assessment for young English as a Second Language (ESL) learners within the American public school education system. In response to such disadvantages, researchers’ call for the implementation of alternative assessments (AAs) is introduced, along with the various benefits of AAs. However, the current mainstream education policy in the US, namely the No Child Left Behind (NCLB) policy, is still largely based on traditional ways of testing, making policy-oriented implementation of AAs on a large scale remarkably difficult. After careful analysis, the author points out several implications concerning how, under the existing NCLB policy, practitioners can effectively accommodate young ESL learners by applying the power of AAs.

  8. Implications of primordial power spectra with statistical anisotropy on CMB temperature fluctuation and polarizations

    CERN Document Server

    Chang, Zhe

    2013-01-01

    Both the Wilkinson Microwave Anisotropy Probe (WMAP) and Planck observations reported a hemispherical asymmetry of the cosmic microwave background (CMB) temperature fluctuation. The hemispherical asymmetry might stem from primordial statistical anisotropy during the inflationary era of the universe. In this paper, we study possible implications of primordial power spectra with dipolar anisotropy for the CMB temperature fluctuation and polarizations. We explicitly show that the statistical dipolar anisotropy may induce off-diagonal (\(\ell'\)…

  9. New statistic for financial return distributions: power-law or exponential?

    CERN Document Server

    Pisarenko, V F

    2004-01-01

    We introduce a new statistical tool (the TP-statistic and TE-statistic) designed specifically to compare the behavior of the sample tail of distributions with power-law and exponential tails as a function of the lower threshold u. One important property of these statistics is that they converge to zero for power laws or exponentials, respectively, regardless of the value of the exponent or of the form parameter. This is particularly useful for testing the structure of a distribution (power law or not, exponential or not) independently of the possibility of quantifying the values of the parameters. We apply these statistics to the distribution of returns of one century of daily data for the Dow Jones Industrial Average and over one year of 5-minute data of the Nasdaq Composite index. Our analysis confirms previous work showing the tendency for the tails to resemble more and more a power law for the highest quantiles, but we can detect clear deviations that suggest that the structure of the tails of the ...
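
    The TP/TE statistics themselves are not reproduced here, but the underlying idea, comparing power-law and exponential tail fits above a moving threshold u, can be sketched with a plain likelihood comparison on synthetic heavy-tailed data:

    ```python
    import numpy as np
    from scipy.stats import expon, pareto

    rng = np.random.default_rng(2)
    returns = np.abs(rng.standard_t(df=3, size=20000))    # heavy-tailed stand-in

    for q in (0.90, 0.95, 0.99):
        u = np.quantile(returns, q)
        tail = returns[returns > u]
        # MLE fits of the exceedances: shifted exponential vs. Pareto tail.
        ll_exp = expon.logpdf(tail, loc=u, scale=tail.mean() - u).sum()
        alpha_hat = len(tail) / np.log(tail / u).sum()    # Hill estimator
        ll_pow = pareto.logpdf(tail, b=alpha_hat, scale=u).sum()
        print(f"u={u:5.2f}  n={len(tail):5d}  logL(power)-logL(exp)={ll_pow - ll_exp:7.1f}")
    ```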

  10. Statistical power analysis a simple and general model for traditional and modern hypothesis tests

    CERN Document Server

    Murphy, Kevin R; Wolach, Allen

    2014-01-01

    Noted for its accessible approach, this text applies the latest approaches of power analysis to both null hypothesis and minimum-effect testing using the same basic unified model. Through the use of a few simple procedures and examples, the authors show readers with little expertise in statistical analysis how to obtain the values needed to carry out the power analysis for their research. Illustrations of how these analyses work and how they can be used to choose the appropriate criterion for defining statistically significant outcomes are sprinkled throughout. The book presents a simple and g

  11. Statistical Power Supply Dynamic Noise Prediction in Hierarchical Power Grid and Package Networks

    OpenAIRE

    Piccinini, Gianluca; Graziano, Mariagrazia

    2008-01-01

    One of the most crucial challenges in high-performance system-on-chip design is coping with power supply noise due to high frequencies, a huge number of functional blocks, and technology scaling. Marking a difference from traditional post-physical-design static voltage drop analysis, a priori dynamic voltage drop evaluation is the focus of this work. It takes into account transient currents and on-chip and package RLC parasitics while exploring the power grid design solution s...

  12. A Statistical Model for Uplink Intercell Interference with Power Adaptation and Greedy Scheduling

    KAUST Repository

    Tabassum, Hina

    2012-10-03

    This paper deals with the statistical modeling of uplink inter-cell interference (ICI) considering greedy scheduling with power adaptation based on channel conditions. The derived model is implicitly generalized for any kind of shadowing and fading environments. More precisely, we develop a generic model for the distribution of ICI based on the locations of the allocated users and their transmit powers. The derived model is utilized to evaluate important network performance metrics such as ergodic capacity, average fairness and average power preservation numerically. Monte-Carlo simulation details are included to support the analysis and show the accuracy of the derived expressions. In parallel to the literature, we show that greedy scheduling with power adaptation reduces the ICI, average power consumption of users, and enhances the average fairness among users, compared to the case without power adaptation. © 2012 IEEE.

  13. Thermal Enhancement of Silicon Carbide (SiC) Power Electronics and Laser Bars: Statistical Design Optimization of a Liquid-Cooled Power Electronic Heat Sink

    Science.gov (United States)

    2015-08-01

    Final report AFRL-RQ-WP-TR-2015-0138, "Thermal Enhancement of Silicon Carbide (SiC) Power Electronics and Laser Bars: Statistical Design Optimization of a Liquid-Cooled Power Electronic Heat Sink", James D. Scofield, Electrical Systems Branch, Power and Control Division, August 2015; approved for public release.

  14. Atomic Bomb Survivors Life-Span Study: Insufficient Statistical Power to Select Radiation Carcinogenesis Model.

    Science.gov (United States)

    Socol, Yehoshua; Dobrzyński, Ludwik

    2015-01-01

    The atomic bomb survivors life-span study (LSS) is often claimed to support the linear no-threshold hypothesis (LNTH) of radiation carcinogenesis. This paper shows that this claim is baseless. The LSS data are equally or better described by an s-shaped dependence on radiation exposure with a threshold of about 0.3 Sievert (Sv) and saturation level at about 1.5 Sv. A Monte-Carlo simulation of possible LSS outcomes demonstrates that, given the weak statistical power, LSS cannot provide support for LNTH. Even if the LNTH is used at low dose and dose rates, its estimation of excess cancer mortality should be communicated as 2.5% per Sv, i.e., an increase of cancer mortality from about 20% spontaneous mortality to about 22.5% per Sv, which is about half of the usually cited value. The impact of the "neutron discrepancy problem" - the apparent difference between the calculated and measured values of neutron flux in Hiroshima - was studied and found to be marginal. Major revision of the radiation risk assessment paradigm is required.

  15. Jacobian integration method increases the statistical power to measure gray matter atrophy in multiple sclerosis

    Directory of Open Access Journals (Sweden)

    Kunio Nakamura

    2014-01-01

    Gray matter atrophy provides important insights into neurodegeneration in multiple sclerosis (MS and can be used as a marker of neuroprotection in clinical trials. Jacobian integration is a method for measuring volume change that uses integration of the local Jacobian determinants of the nonlinear deformation field registering two images, and is a promising tool for measuring gray matter atrophy. Our main objective was to compare the statistical power of the Jacobian integration method to commonly used methods in terms of the sample size required to detect a treatment effect on gray matter atrophy. We used multi-center longitudinal data from relapsing–remitting MS patients and evaluated combinations of cross-sectional and longitudinal pre-processing with SIENAX/FSL, SPM, and FreeSurfer, as well as the Jacobian integration method. The Jacobian integration method outperformed these other commonly used methods, reducing the required sample size by a factor of 4–5. The results demonstrate the advantage of using the Jacobian integration method to assess neuroprotection in MS clinical trials.
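
    The core computation is the integration of local Jacobian determinants of the deformation field over a region of interest; a simplified numpy sketch (a finite-difference approximation, not the paper's pipeline):

    ```python
    import numpy as np

    def percent_volume_change(disp, mask, spacing=(1.0, 1.0, 1.0)):
        """disp: (3, X, Y, Z) displacement field of the registration mapping
        baseline to follow-up; mask: boolean region of interest (e.g. gray matter)."""
        J = np.zeros(disp.shape[1:] + (3, 3))
        for i in range(3):                       # J = I + grad(u)
            for j, g in enumerate(np.gradient(disp[i], *spacing)):
                J[..., i, j] = g
        J += np.eye(3)
        det = np.linalg.det(J)                   # local volume ratio per voxel
        return 100.0 * (det[mask].mean() - 1.0)  # percent change over the region
    ```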

  16. Waste Heat to Power Market Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Elson, Amelia [ICF International, Fairfax, VA (United States)]; Tidball, Rick [ICF International, Fairfax, VA (United States)]; Hampson, Anne [ICF International, Fairfax, VA (United States)]

    2015-03-01

    Waste heat to power (WHP) is the process of capturing heat discarded by an existing process and using that heat to generate electricity. In the industrial sector, waste heat streams are generated by kilns, furnaces, ovens, turbines, engines, and other equipment. In addition to processes at industrial plants, waste heat streams suitable for WHP are generated at field locations, including landfills, compressor stations, and mining sites. Waste heat streams are also produced in the residential and commercial sectors, but compared to industrial sites these waste heat streams typically have lower temperatures and much lower volumetric flow rates. The economic feasibility for WHP declines as the temperature and flow rate decline, and most WHP technologies are therefore applied in industrial markets where waste heat stream characteristics are more favorable. This report provides an assessment of the potential market for WHP in the industrial sector in the United States.

  17. Statistical Analysis of the Impact of Wind Power on Market Quantities and Power Flows

    DEFF Research Database (Denmark)

    Pinson, Pierre; Jónsson, Tryggvi; Zugno, Marco

    2012-01-01

    …in Europe. Due to the dimensionality and nonlinearity of these effects, the necessary concepts of dimension reduction using Principal Component Analysis (PCA), as well as nonlinear regression, are described. Example application results are given for European cross-border flows, as well as for the impact of load and wind power forecasts on Danish and German electricity markets.
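
    The PCA step can be sketched compactly; an illustration on hypothetical cross-border flow data (synthetic numbers, not the study's data):

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(3)
    # Hypothetical hourly flows on 12 interconnections over one year.
    flows = rng.normal(size=(8760, 12)) @ rng.normal(size=(12, 12))

    pca = PCA(n_components=3)
    scores = pca.fit_transform(flows)           # dominant flow patterns
    print(pca.explained_variance_ratio_)        # variance captured per component
    ```

    The retained scores could then serve as low-dimensional responses in (possibly nonlinear) regressions on load and wind power forecasts.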

  18. Power Systems Development Facility. Environmental Assessment

    Energy Technology Data Exchange (ETDEWEB)

    1993-06-01

    The objective of the PSDF would be to provide a modular facility which would support the development of advanced, pilot-scale, coal-based power systems and hot gas clean-up components. These pilot-scale components would be designed to be large enough so that the results can be related and projected to commercial systems. The facility would use a modular approach to enhance the flexibility and capability for testing; consequently, overall capital and operating costs when compared with stand-alone facilities would be reduced by sharing resources common to different modules. The facility would identify and resolve technical barriers, as well as provide a structure for long-term testing and performance assessment. It is also intended that the facility would evaluate the operational and performance characteristics of the advanced power systems with both bituminous and subbituminous coals. Five technology-based experimental modules are proposed for the PSDF: (1) an advanced gasifier module, (2) a fuel cell test module, (3) a PFBC module, (4) a combustion gas turbine module, and (5) a module comprising five hot gas cleanup particulate control devices (PCDs). The final module, the PCD module, would capture coal-derived ash and particles from both the PFBC and advanced gasifier gas streams to provide for overall particulate emission control, as well as to protect the combustion turbine and the fuel cell.

  19. Statistical modeling of the power grid from a wind farm standpoint

    DEFF Research Database (Denmark)

    Farajzadeh, Saber; Ramezani, Mohammad H.; Nielsen, Peter;

    2017-01-01

    In this study, we derive a statistical model of a power grid from the wind farm's standpoint based on dynamic principal component analysis. The main advantages of our model compared to the previously developed models are twofold. Firstly, our proposed model benefits from logged data of an offshore...

  20. Comparison of Three Common Experimental Designs to Improve Statistical Power When Data Violate Parametric Assumptions.

    Science.gov (United States)

    Porter, Andrew C.; McSweeney, Maryellen

    A Monte Carlo technique was used to investigate the small sample goodness of fit and statistical power of several nonparametric tests and their parametric analogues when applied to data which violate parametric assumptions. The motivation was to facilitate choice among three designs, simple random assignment with and without a concomitant variable…
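
    The same Monte Carlo logic is easy to reproduce; a small sketch comparing the power of a parametric test and its nonparametric analogue on skewed data (illustrative settings, not the study's design):

    ```python
    import numpy as np
    from scipy.stats import ttest_ind, mannwhitneyu

    rng = np.random.default_rng(4)

    def power(test, n=20, shift=0.8, reps=2000, alpha=0.05):
        hits = 0
        for _ in range(reps):
            a = rng.lognormal(size=n)            # skewed: violates normality
            b = rng.lognormal(size=n) + shift
            hits += test(a, b).pvalue < alpha
        return hits / reps

    print("t test      :", power(ttest_ind))
    print("Mann-Whitney:", power(mannwhitneyu))
    ```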

  1. [Effect sizes, statistical power and sample sizes in "the Japanese Journal of Psychology"].

    Science.gov (United States)

    Suzukawa, Yumi; Toyoda, Hideki

    2012-04-01

    This study analyzed the statistical power of research studies published in the "Japanese Journal of Psychology" in 2008 and 2009. Sample effect sizes and sample statistical powers were calculated for each statistical test and analyzed with respect to the analytical methods and the fields of the studies. The results show that in fields such as perception, cognition or learning, the effect sizes were relatively large, although the sample sizes were small. At the same time, because of the small sample sizes, some meaningful effects could not be detected. In the other fields, because of the large sample sizes, meaningless effects could be detected. This implies that researchers who could not obtain large enough effect sizes would use larger samples to obtain significant results.

  2. On Statistics of Log-Ratio of Arithmetic Mean to Geometric Mean for Nakagami-m Fading Power

    Science.gov (United States)

    Wang, Ning; Cheng, Julian; Tellambura, Chintha

    To assess the performance of maximum-likelihood (ML) based Nakagami-m parameter estimators, current methods rely on Monte Carlo simulation. In order to enable the analytical performance evaluation of ML-based m parameter estimators, we study the statistical properties of a parameter Δ, which is defined as the log-ratio of the arithmetic mean to the geometric mean for Nakagami-m fading power. Closed-form expressions are derived for the probability density function (PDF) of Δ. It is found that for large sample size, the PDF of Δ can be well approximated by a two-parameter Gamma PDF.
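
    Δ is cheap to simulate, which makes the Gamma approximation easy to check; a sketch using the fact that Nakagami-m fading power is Gamma distributed (the moment-matched fit below is an assumption, not the paper's closed-form PDF):

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    m, N, reps = 2.0, 100, 20000
    # Nakagami-m fading power is Gamma(shape=m, scale=omega/m); omega = 1 here.
    w = rng.gamma(shape=m, scale=1.0 / m, size=(reps, N))
    delta = np.log(w.mean(axis=1)) - np.log(w).mean(axis=1)   # ln(AM / GM)

    # Moment-match a two-parameter Gamma PDF to the sampled Delta.
    shape = delta.mean() ** 2 / delta.var()
    scale = delta.var() / delta.mean()
    print(f"Gamma fit: shape = {shape:.1f}, scale = {scale:.2e}")
    ```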

  3. On detection and assessment of statistical significance of Genomic Islands

    Directory of Open Access Journals (Sweden)

    Chaudhuri Probal

    2008-04-01

    Background: Many of the available methods for detecting Genomic Islands (GIs) in prokaryotic genomes use markers such as transposons, proximal tRNAs, flanking repeats etc., or they use other supervised techniques requiring training datasets. Most of these methods are primarily based on the biases in GC content or codon and amino acid usage of the islands. However, these methods either do not use any formal statistical test of significance or use statistical tests for which the critical values and the P-values are not adequately justified. We propose a method, which is unsupervised in nature and uses Monte-Carlo statistical tests based on randomly selected segments of a chromosome. Such tests are supported by precise statistical distribution theory, and consequently, the resulting P-values are quite reliable for making the decision. Results: Our algorithm (named Design-Island, an acronym for Detection of Statistically Significant Genomic Island) runs in two phases. Some 'putative GIs' are identified in the first phase, and those are refined into smaller segments containing horizontally acquired genes in the refinement phase. This method is applied to the Salmonella typhi CT18 genome, leading to the discovery of several new pathogenicity, antibiotic resistance and metabolic islands that were missed by earlier methods. Many of these islands contain mobile genetic elements like phage-mediated genes, transposons, integrase and IS elements, confirming their horizontal acquirement. Conclusion: The proposed method is based on statistical tests supported by precise distribution theory and reliable P-values, along with a technique for visualizing statistically significant islands. The performance of our method is better than many other well known methods in terms of sensitivity and accuracy, and in terms of specificity, it is comparable to other methods.
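
    The flavor of such a Monte Carlo test can be sketched with GC content alone (a deliberate simplification; Design-Island itself uses more features and a refinement phase):

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    def gc(seq):
        return sum(c in "GC" for c in seq) / len(seq)

    def island_pvalue(genome, start, width, n_rand=999):
        """Monte Carlo p-value that a window's GC content deviates from the
        genome background, using randomly placed equal-width segments."""
        background = gc(genome)
        obs = abs(gc(genome[start:start + width]) - background)
        starts = rng.integers(0, len(genome) - width, n_rand)
        null = [abs(gc(genome[s:s + width]) - background) for s in starts]
        return (1 + sum(d >= obs for d in null)) / (n_rand + 1)
    ```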

  4. CAPABILITY ASSESSMENT OF MEASURING EQUIPMENT USING STATISTIC METHOD

    Directory of Open Access Journals (Sweden)

    Pavel POLÁK

    2014-10-01

    Capability assessment of the measurement device is one of the methods of process quality control. Only if the measurement device is capable can the capability of the measurement, and consequently of the production process, be assessed. This paper deals with assessment of the capability of the measuring device using the indices Cg and Cgk.
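
    A sketch of the indices under one common convention (k = 20% of the tolerance and a 4-sigma study spread; conventions vary, so treat the constants as assumptions rather than the paper's definitions):

    ```python
    import numpy as np

    def cg_cgk(readings, tol, target, k=0.2, spread=4.0):
        """Gauge capability indices; k and spread follow one common
        convention and should be adapted to the applicable standard."""
        s = np.std(readings, ddof=1)
        xbar = np.mean(readings)
        cg = (k * tol) / (spread * s)                              # precision only
        cgk = (k / 2 * tol - abs(xbar - target)) / (spread / 2 * s)  # precision + bias
        return cg, cgk

    readings = np.random.default_rng(10).normal(10.002, 0.004, 25)
    print(cg_cgk(readings, tol=0.1, target=10.0))
    ```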

  5. Water Polo Game-Related Statistics in Women’s International Championships: Differences and Discriminatory Power

    Science.gov (United States)

    Escalante, Yolanda; Saavedra, Jose M.; Tella, Victor; Mansilla, Mirella; García-Hermoso, Antonio; Dominguez, Ana M.

    2012-01-01

    The aims of this study were (i) to compare women’s water polo game-related statistics by match outcome (winning and losing teams) and phase (preliminary, classificatory, and semi-final/bronze medal/gold medal), and (ii) to identify characteristics that discriminate performances for each phase. The game-related statistics of the 124 women’s matches played in five International Championships (World and European Championships) were analyzed. Differences between winning and losing teams in each phase were determined using the chi-squared statistic. A discriminant analysis was then performed according to context in each of the three phases. It was found that the game-related statistics differentiate the winning from the losing teams in each phase of an international championship. The differentiating variables were both offensive (centre goals, power-play goals, counterattack goals, assists, offensive fouls, steals, blocked shots, and won sprints) and defensive (goalkeeper-blocked shots, goalkeeper-blocked inferiority shots, and goalkeeper-blocked 5-m shots). The discriminant analysis showed the game-related statistics to discriminate performance in all phases: preliminary, classificatory, and final phases (92%, 90%, and 83%, respectively). Two variables were discriminatory by match outcome (winning or losing teams) in all three phases: goals and goalkeeper-blocked shots. Key points: (i) in the preliminary phase, more than one variable was involved in this differentiation, including both offensive and defensive aspects of the game; (ii) the game-related statistics were found to have a high discriminatory power in predicting the result of matches, with shots and goalkeeper-blocked shots being discriminatory variables in all three phases; (iii) knowledge of the characteristics of winning teams’ game-related statistics in women’s water polo and their power to predict match outcomes will allow coaches to take these characteristics into account when planning training and match preparation.

  6. Statistical evaluation of malfunctions in wind power plant operation to optimize availability (Statistische Fehlerauswertungen beim Windkraftwerksbetrieb zur Optimierung der Verfuegbarkeit)

    Energy Technology Data Exchange (ETDEWEB)

    Fleischer, C.; Sucrow, W. [E.ON Energy Projects GmbH, Muenchen (Germany)]

    2007-07-01

    Wind energy raises new challenges: the availability of wind power plants has to be increased and breakdowns minimized, so that ultimately operational management costs can be reduced. The article reviews the efforts taken by operational management to compensate for manufacturers' frequently inadequate documentation and to provide operations, after a laborious classification effort, with statistical evaluations of incoming error messages. These statistical evaluations lead to the identification of breakdown times as well as idle times. Finally, operating costs can be monitored in cents per kilowatt hour. (orig.)

  7. An investigation of the statistical power of neutrality tests based on comparative and population genetic data

    DEFF Research Database (Denmark)

    Zhai, Weiwei; Nielsen, Rasmus; Slatkin, Montgomery

    2009-01-01

    In this report, we investigate the statistical power of several tests of selective neutrality based on patterns of genetic diversity within and between species. The goal is to compare tests based solely on population genetic data with tests using comparative data or a combination of comparative and population genetic data. We show that in the presence of repeated selective sweeps on a relatively neutral background, tests based on the dN/dS ratios in comparative data almost always have more power to detect selection than tests based on population genetic data, even if the overall level of divergence… selection. The Hudson-Kreitman-Aguadé test is the most powerful test for detecting positive selection among the population genetic tests investigated, whereas the McDonald-Kreitman test typically has more power to detect negative selection. We discuss our findings in the light of the discordant results obtained…

  8. A statistical assessment of differences and equivalences between genetically modified and reference plant varieties

    NARCIS (Netherlands)

    Voet, van der H.; Perry, J.N.; Amzal, B.; Paoletti, C.

    2011-01-01

    Background: Safety assessment of genetically modified organisms is currently often performed by comparative evaluation. However, natural variation of plant characteristics between commercial varieties is usually not considered explicitly in the statistical computations underlying the assessment. Re…

  9. A statistical assessment of differences and equivalences between genetically modified and reference plant varieties

    OpenAIRE

    Amzal, Billy; Perry, Joe N.; van der Voet, Hilko; Paoletti, Claudia

    2011-01-01

    Background: Safety assessment of genetically modified organisms is currently often performed by comparative evaluation. However, natural variation of plant characteristics between commercial varieties is usually not considered explicitly in the statistical computations underlying the assessment. Results: Statistical methods are described for the assessment of the difference between a genetically modified (GM) plant variety and a conventional non-GM counterpart, and for the assessment o...

  10. Structural vulnerability assessment of electric power grids

    NARCIS (Netherlands)

    Koç, Y.; Warnier, M.; Kooij, R.E.; Brazier, F.

    2014-01-01

    Cascading failures are the typical reasons of blackouts in power grids. The grid topology plays an important role in determining the dynamics of cascading failures in power grids. Measures for vulnerability analysis are crucial to assure a higher level of robustness of power grids. Metrics from Comp

  11. Automated Data Collection for Determining Statistical Distributions of Module Power Undergoing Potential-Induced Degradation

    DEFF Research Database (Denmark)

    Hacke, Peter; Spataru, Sergiu

    We propose a method for increasing the frequency of data collection and reducing the time and cost of accelerated lifetime testing of photovoltaic modules undergoing potential-induced degradation (PID). This consists of in-situ measurements of dark current-voltage curves of the modules at elevated stress temperature, their use to determine the maximum power at 25°C standard test conditions (STC), and distribution statistics for determining degradation rates as a function of stress level. The semi-continuous data obtained by this method clearly show degradation curves of the maximum power…

  12. How Teachers Understand and Use Power in Alternative Assessment

    Directory of Open Access Journals (Sweden)

    Kelvin H. K. Tan

    2012-01-01

    “Alternative assessment” is an increasingly common and popular discourse in education. The potential benefit of alternative assessment practices is premised on significant changes in assessment practices. However, assessment practices embody power relations between institutions, teachers and students, and these power relationships determine the possibility and the extent of actual changes in assessment practices. Labelling a practice as “alternative assessment” does not guarantee meaningful departure from existing practice. Recent research has warned that assessment practices in education cannot be presumed to empower students in ways that enhance their learning. This is partly due to a tendency to speak of power in assessment in undefined terms. Hence, it would be useful to identify the types of power present in assessment practices and the contexts which give rise to them. This paper seeks to examine power in the context of different ways that alternative assessment is practiced and understood by teachers. Research on teachers’ conceptions of alternative assessment is presented, and each of the conceptions is then analysed for insights into teachers’ meanings and practices of power. In particular, instances of sovereign, epistemological and disciplinary power in alternative assessment are identified to illuminate new ways of understanding and using alternative assessment.

  13. Using DEWIS and R for Multi-Staged Statistics e-Assessments

    Science.gov (United States)

    Gwynllyw, D. Rhys; Weir, Iain S.; Henderson, Karen L.

    2016-01-01

    We demonstrate how the DEWIS e-Assessment system may use embedded R code to facilitate the assessment of students' ability to perform involved statistical analyses. The R code has been written to emulate SPSS output and thus the statistical results for each bespoke data set can be generated efficiently and accurately using standard R routines.

  14. Development of nuclear power plant online monitoring system using statistical quality control

    Energy Technology Data Exchange (ETDEWEB)

    An, Sang Ha

    2006-02-15

    Statistical quality control techniques have been applied to many aspects of industrial engineering. An application to nuclear power plant maintenance and control is also presented that can greatly improve plant safety. As a demonstration of such an approach, a specific system is analyzed: the reactor coolant pumps (RCPs) and the fouling resistance of the heat exchanger. This research uses Shewhart X-bar and R charts, cumulative sum (CUSUM) charts, and the sequential probability ratio test (SPRT) to analyze whether the process is in a state of statistical control, and a Control Chart Analyzer (CCA) was developed to support these analyses and flag process errors. The analysis shows that statistical process control methods can be applied as an early warning system capable of identifying significant equipment problems well in advance of traditional control room alarm indicators. Such a system would provide operators with enough time to respond to possible emergency situations and thus improve plant safety and reliability.
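
    Textbook forms of two of the charts mentioned can be sketched in a few lines (standard formulas, not the thesis's CCA implementation):

    ```python
    import numpy as np

    def xbar_limits(subgroup_means, sigma_within, n):
        """3-sigma Shewhart X-bar limits from a known within-subgroup sigma."""
        center = np.mean(subgroup_means)
        h = 3 * sigma_within / np.sqrt(n)
        return center - h, center, center + h

    def cusum_alarm(x, target, sigma, k=0.5, h=5.0):
        """Tabular CUSUM; returns the first index where either one-sided
        sum exceeds h (in sigma units), or None if in control."""
        sp = sn = 0.0
        for i, xi in enumerate(x):
            z = (xi - target) / sigma
            sp = max(0.0, sp + z - k)
            sn = max(0.0, sn - z - k)
            if sp > h or sn > h:
                return i
        return None
    ```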

  15. A Parameter-free Statistical Measurement of Halos with Power Spectra

    CERN Document Server

    He, Ping; Feng, Long-Long; Fang, Li-Zhi

    2005-01-01

    We show that, in the halo model of large-scale structure formation, the difference between the Fourier and the DWT (discrete wavelet transform) power spectra provides a statistical measurement of the halos. This statistical quantity is free from parameters related to the shape of the mass profile and the identification scheme of halos. That is, the statistical measurement is invariant in the sense that models with reasonably defined and selected parameters of the halo models should yield the same difference of the Fourier and DWT spectra. This feature is useful to extract ensemble averaged properties of halos, which cannot be obtained with the identification of individual halo. To demonstrate this point, we show with WIGEON hydrodynamical simulation samples that the spectrum difference provides a quantitative measurement of the discrepancy of the distribution of baryonic gas from that of the underlying dark matter field within halos. We also show that the mass density profile of halos in physical space can be...

  16. Statistical Design Model (SDM) of satellite power supply and communication subsystems

    Science.gov (United States)

    Mirshams, Mehran; Zabihian, Ehsan; Zabihian, Ahmadreza

    In designing the power supply and communication subsystems of satellites, most approaches and relations are empirical and statistical; moreover, since the aerospace sciences and their connections to other engineering fields such as electrical engineering are young, there are no analytic or fully proven empirical relations in many areas. We therefore consider the statistical design of these subsystems. The approach presented in this paper is entirely innovative, and all parts of the satellite's power supply and communication subsystems are specified. In codifying this approach, data from 602 satellites and software such as SPSS have been used. In this approach, after the design procedure is proposed, the total power needed by the satellite, the mass of the power supply and communication subsystems, the power needed by the communication subsystem, the working band, the type of antenna, the number of transponders, the material of the solar arrays and, finally, the placement of these arrays on the satellite are designed. All these parts are designed based on the mission of the satellite and its weight class. This procedure increases the performance rate, avoids wasting energy, and reduces costs. Keywords: database, statistical model, design procedure, power supply subsystem, communication subsystem

  17. Assessment - A Powerful Lever for Learning

    Directory of Open Access Journals (Sweden)

    Lorna Earl

    2010-05-01

    Classroom assessment practices have been part of schooling for hundreds of years. There are, however, new findings about the nature of learning and about the roles that assessment can play in enhancing learning for all students. This essay provides a brief history of the changing role of assessment in schooling, describes three different purposes for assessment and foreshadows some implications that shifting to a more differentiated view of assessment can have for policy, practice and research.

  18. Predicting future wind power generation and power demand in France using statistical downscaling methods developed for hydropower applications

    Science.gov (United States)

    Najac, Julien

    2014-05-01

    For many applications in the energy sector, it is crucial to have downscaling methods that conserve space-time dependences at very fine spatial and temporal scales between the variables affecting electricity production and consumption. For climate change impact studies this is an extremely difficult task, particularly as reliable climate information is usually found at regional and monthly scales at best, although many industry-oriented applications need more refined information (hydropower production models, wind energy production models, power demand models, power balance models…). Here we thus propose to investigate how to predict and quantify the influence of climate change on climate-related energies and on energy demand. To do so, statistical downscaling methods originally developed for studying climate change impacts on hydrological cycles in France (and which have been used to compute hydropower production in France) have been applied to predicting wind power generation in France and an air temperature indicator commonly used for predicting power demand in France. We show that those methods provide satisfactory results over the recent past and apply this methodology to several climate model runs from the ENSEMBLES project.

  19. Comparison and validation of statistical methods for predicting power outage durations in the event of hurricanes.

    Science.gov (United States)

    Nateghi, Roshanak; Guikema, Seth D; Quiring, Steven M

    2011-12-01

    This article compares statistical methods for modeling power outage durations during hurricanes and examines the predictive accuracy of these methods. Being able to make accurate predictions of power outage durations is valuable because the information can be used by utility companies to plan their restoration efforts more efficiently. This information can also help inform customers and public agencies of the expected outage times, enabling better collective response planning, and coordination of restoration efforts for other critical infrastructures that depend on electricity. In the long run, outage duration estimates for future storm scenarios may help utilities and public agencies better allocate risk management resources to balance the disruption from hurricanes with the cost of hardening power systems. We compare the out-of-sample predictive accuracy of five distinct statistical models for estimating power outage duration times caused by Hurricane Ivan in 2004. The methods compared include both regression models (accelerated failure time (AFT) and Cox proportional hazard models (Cox PH)) and data mining techniques (regression trees, Bayesian additive regression trees (BART), and multivariate additive regression splines). We then validate our models against two other hurricanes. Our results indicate that BART yields the best prediction accuracy and that it is possible to predict outage durations with reasonable accuracy.

  20. Capacity value assessments of wind power

    Energy Technology Data Exchange (ETDEWEB)

    Milligan, Michael [National Renewable Energy Laboratory, Golden, CO (United States)]; Frew, Bethany [National Renewable Energy Laboratory, Golden, CO (United States)]; Ibanez, Eduardo [General Electric (GE) Energy Consulting, Schenectady, NY (United States)]; Kiviluoma, Juha [Valtion Teknillinen Tutkimuskeskus (VTT), Espoo (Finland)]; Holttinen, Hannele [Valtion Teknillinen Tutkimuskeskus (VTT), Espoo (Finland)]; Söder, Lennart [Royal Institute of Technology, Stockholm (Sweden)]

    2016-10-05

    This article describes some of the recent research into the capacity value of wind power. With the worldwide increase in wind power during the past several years, there is increasing interest and significance regarding its capacity value because this has a direct influence on the amount of other (nonwind) capacity that is needed. We build on previous reviews from IEEE and IEA Wind Task 25a and examine recent work that evaluates the impact of multiple-year data sets and the impact of interconnected systems on resource adequacy. We also provide examples that explore the use of alternative reliability metrics for wind capacity value calculations. We show how multiple-year data sets significantly increase the robustness of results compared to single-year assessments. Assumptions regarding the transmission interconnections play a significant role. To date, results regarding which reliability metric to use for probabilistic capacity valuation show little sensitivity to the metric.

  1. Identifying Useful Statistical Indicators of Proximity to Instability in Stochastic Power Systems

    CERN Document Server

    Ghanavati, Goodarz; Lakoba, Taras I

    2014-01-01

    Prior research has shown that autocorrelation and variance in voltage measurements tend to increase as power systems approach instability. This paper seeks to identify the conditions under which these statistical indicators provide reliable early warning of instability in power systems. First, the paper derives and validates a semi-analytical method for quickly calculating the expected variance and autocorrelation of all voltages and currents in an arbitrary power system model. Building on this approach, the paper describes the conditions under which filtering can be used to detect these signs in the presence of measurement noise. Finally, several experiments show which types of measurements are good indicators of proximity to instability for particular types of state changes. For example, increased variance in voltages can reliably indicate the location of increased stress, while growth of autocorrelation in certain line currents is a reliable indicator of system-wide instability.
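
    The indicators are easy to demonstrate on a toy model; an AR(1) sketch in which the autoregressive coefficient approaching 1 mimics a system nearing instability (an illustration only, not the paper's semi-analytical method):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def simulate(phi, n=5000, noise=0.01):
        """AR(1) voltage deviations; phi -> 1 mimics critical slowing down."""
        v = np.zeros(n)
        for t in range(1, n):
            v[t] = phi * v[t - 1] + noise * rng.normal()
        return v

    for phi in (0.9, 0.99, 0.999):
        v = simulate(phi)
        ac1 = np.corrcoef(v[:-1], v[1:])[0, 1]
        print(f"phi={phi}: variance={v.var():.2e}, lag-1 autocorr={ac1:.3f}")
    ```

    Both statistics grow as phi approaches 1, matching the qualitative behavior the paper exploits.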

  2. Statistical interpretation of transient current power-law decay in colloidal quantum dot arrays

    Energy Technology Data Exchange (ETDEWEB)

    Sibatov, R T, E-mail: ren_sib@bk.ru [Ulyanovsk State University, 432000, 42 Leo Tolstoy Street, Ulyanovsk (Russian Federation)]

    2011-08-01

    A new statistical model of the charge transport in colloidal quantum dot arrays is proposed. It takes into account Coulomb blockade forbidding multiple occupancy of nanocrystals and the influence of energetic disorder of interdot space. The model explains power-law current transients and the presence of the memory effect. The fractional differential analogue of the Ohm law is found phenomenologically for nanocrystal arrays. The model combines ideas that were considered as conflicting by other authors: the Scher-Montroll idea about the power-law distribution of waiting times in localized states for disordered semiconductors is applied taking into account Coulomb blockade; Novikov's condition about the asymptotic power-law distribution of time intervals between successful current pulses in conduction channels is fulfilled; and the carrier injection blocking predicted by Ginger and Greenham (2000 J. Appl. Phys. 87 1361) takes place.

  3. Tools for Assessing Readability of Statistics Teaching Materials

    Science.gov (United States)

    Lesser, Lawrence; Wagler, Amy

    2016-01-01

    This article provides tools and rationale for instructors in math and science to make their assessment and curriculum materials (more) readable for students. The tools discussed (MSWord, LexTutor, Coh-Metrix TEA) are readily available linguistic analysis applications that are grounded in current linguistic theory, but present output that can…

  4. A comprehensive statistical assessment of star-planet interaction

    CERN Document Server

    Miller, Brendan P; Wright, Jason T; Pearson, Elliott G

    2014-01-01

    We investigate whether magnetic interaction between close-in giant planets and their host stars produces observable statistical enhancements in stellar coronal or chromospheric activity. New Chandra observations of 12 nearby (d…) planet-hosting stars … > 450 Mjup/AU^2, which here are all X-ray luminous but to a degree commensurate with their Ca II H and K activity, in contrast to presented magnetic star-planet interaction scenarios that predict enhancements relatively larger in Lx. We discuss these results in the context of cumulative tidal spin-up of stars hosting close-in gas giants (potentially followed by planetary infall and destruction). We also test our main-sequence sample for correlations between planetary properties and UV luminosity or Ca II H and K emission, and find no significant dependence.

  5. Statistical assessment of groundwater resources in Washim district (India).

    Science.gov (United States)

    Rajankar, P N; Tambekar, D H; Ramteke, D S; Wate, S R

    2011-01-01

    Groundwater quality of Washim district of Maharashtra (India) was assessed using quality parameters and a water quality index (WQI). In this study, the WQI was computed from pH, turbidity, temperature, nitrates, total phosphates, dissolved oxygen, biochemical oxygen demand, total solids, total coliforms and faecal coliforms, respectively, for residential and commercial uses. All parameters were analyzed in both pre-monsoon and post-monsoon seasons to assess the groundwater quality and its seasonal variations. The parameters turbidity, solids and coliforms showed seasonal variations. The WQI varied from 72 to 88 in the pre-monsoon season and 64 to 88 in the post-monsoon season. The results indicate that all groundwater samples in the study area have good water quality in the pre-monsoon season, but in the post-monsoon season 9 percent of samples changed from good to medium water quality, which reveals seasonal variation and deterioration of groundwater quality.

  6. Impact of dependencies in risk assessments of power distribution systems

    OpenAIRE

    Alvehag, Karin

    2008-01-01

    Society has become increasingly dependent on electricity, so power system reliability is of crucial importance. However, while underinvestment leads to an unacceptable number of power outages, overinvestment will result in costs that are too high for society. The challenge is to find a socioeconomically adequate level of risk. In this risk assessment, not only the probability of power outages but also the severity of their consequences should be included. A risk assessment can be performe...

  7. Racialized customer service in restaurants: a quantitative assessment of the statistical discrimination explanatory framework.

    Science.gov (United States)

    Brewster, Zachary W

    2012-01-01

    Despite popular claims that racism and discrimination are no longer salient issues in contemporary society, racial minorities continue to experience disparate treatment in everyday public interactions. The context of full-service restaurants is one such public setting wherein racial minority patrons, African Americans in particular, encounter racial prejudices and discriminatory treatment. To further understand the causes of such discriminatory treatment within the restaurant context, this article analyzes primary survey data derived from a community sample of servers (N = 200) to assess the explanatory power of one posited explanation: statistical discrimination. Taken as a whole, findings suggest that while a statistical discrimination framework for understanding variability in servers’ discriminatory behaviors should not be disregarded, the framework’s explanatory utility is limited. Servers’ inferences about the potential profitability of waiting on customers across racial groups explain little of the overall variation in subjects’ self-reported discriminatory behaviors, suggesting that other factors not explored in this research are clearly operating and should be the focus of future inquiries.

  8. Efficient statistical analysis method of power/ground (P/G) network

    Institute of Scientific and Technical Information of China (English)

    Zuying Luo; Sheldon X.D. Tan

    2008-01-01

    In this paper, we propose an incremental statistical analysis method with complexity reduction as a pre-process for on-chip power/ground (P/G) networks. The new method exploits the locality of P/G network analyses and targets P/G networks with a large number of strongly connected subcircuits (called strong connects) such as trees and chains. The method consists of three steps. First, it compresses P/G circuits by removing strong connects; as a result, current variations (CVs) of nodes in strong connects are transferred to some remaining nodes. Then, based on the locality of power grid voltage responses to current inputs, it efficiently calculates the correlative resistor (CR) matrix in a local way to directly compute the voltage variations using small parts of the remaining circuit. Finally, it statistically recovers the voltage variations of the suppressed nodes inside strong connects. This new method for statistically compressing and expanding strong connects in terms of current or voltage variations in closed form is very efficient owing to its incremental analysis. Experimental results demonstrate that the method can efficiently compute lower bounds of voltage variations for P/G networks, with a speedup of two to three orders of magnitude over the traditional Monte-Carlo-based simulation method at only 2.0% accuracy loss.

  9. A statistical simulation model for field testing of non-target organisms in environmental risk assessment of genetically modified plants.

    Science.gov (United States)

    Goedhart, Paul W; van der Voet, Hilko; Baldacchino, Ferdinando; Arpaia, Salvatore

    2014-04-01

    Genetic modification of plants may result in unintended effects causing potentially adverse effects on the environment. A comparative safety assessment is therefore required by authorities, such as the European Food Safety Authority, in which the genetically modified plant is compared with its conventional counterpart. Part of the environmental risk assessment is a comparative field experiment in which the effects on non-target organisms are compared. Statistical analysis of such trials comes in two flavors: difference testing and equivalence testing. It is important to know the statistical properties of these, for example the power to detect environmental change of a given magnitude, before the start of an experiment. Such prospective power analysis can best be studied by means of a statistical simulation model. This paper describes a general framework for simulating data typically encountered in environmental risk assessment of genetically modified plants. The simulation model, available as Supplementary Material, can be used to generate count data having different statistical distributions, possibly with excess zeros. In addition, the model employs completely randomized or randomized block experiments, can be used to simulate single or multiple trials across environments, enables genotype-by-environment interaction by adding random variety effects, and includes repeated measures in time following a constant, linear or quadratic pattern, possibly with some form of autocorrelation. The model also allows a set of reference varieties to be added to the GM plant and its comparator to assess the natural variation, which can then be used to set limits of concern for equivalence testing. The different count distributions are described in some detail and some examples of how to use the simulation model to study various aspects, including a prospective power analysis, are provided.
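
    A miniature version of such a simulation, zero-inflated negative binomial counts in a randomized block layout with a paired nonparametric test, can convey the idea (all parameter values hypothetical; the published framework is far more general):

    ```python
    import numpy as np
    from scipy.stats import wilcoxon

    rng = np.random.default_rng(8)

    def zinb(mean, k=2.0, p_zero=0.2, size=1):
        """Zero-inflated negative binomial draws via a Gamma-Poisson mixture."""
        y = rng.poisson(rng.gamma(k, mean / k, size))
        return np.where(rng.random(size) < p_zero, 0, y)

    def power(effect=0.5, blocks=8, reps=1000, alpha=0.05):
        """Rejection rate of a paired Wilcoxon test, GM vs. comparator per block."""
        hits = 0
        for _ in range(reps):
            b = np.exp(3.0 + rng.normal(0, 0.3, blocks))   # block mean abundances
            gm = zinb(b * effect, size=blocks)             # GM plots, reduced counts
            conv = zinb(b, size=blocks)                    # comparator plots
            hits += wilcoxon(gm, conv).pvalue < alpha
        return hits / reps

    print(power())   # chance of detecting a 50% reduction in abundance
    ```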

  10. Climate change assessment for Mediterranean agricultural areas by statistical downscaling

    Directory of Open Access Journals (Sweden)

    L. Palatella

    2010-07-01

    Full Text Available In this paper we produce projections of seasonal precipitation for four Mediterranean areas: the Apulia region (Italy), the Ebro river basin (Spain), the Po valley (Italy) and the Antalya province (Turkey). We performed the statistical downscaling using Canonical Correlation Analysis (CCA) in two versions: in one case the Principal Component Analysis (PCA) filter is applied only to the predictor, and in the other to both predictor and predictand. After performing a validation test, CCA with the PCA filter on both predictor and predictand was chosen. Sea-level pressure (SLP) is used as the predictor. Downscaling has been carried out for scenarios A2 and B2 on the basis of three GCMs: CCCma-GCM2, CSIRO-Mk2 and HadCM3. Three consecutive 30-year periods have been considered. For summer precipitation in the Apulia region we also use the 500 hPa temperature (T500) as a predictor, obtaining comparable results. Results show different climate change signals in the four areas and confirm the need for an analysis capable of resolving internal differences within the Mediterranean region. The most robust signal is the reduction of summer precipitation in the Ebro river basin. Other significant results are the increase of precipitation over Apulia in summer, the reduction over the Po valley in spring and autumn, and the increase over the Antalya province in summer and autumn.
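
    A compact sketch of the PCA-then-CCA pipeline selected above, using scikit-learn on synthetic stand-in fields; the array sizes, component counts, and data are all assumptions for illustration, not the paper's configuration.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
# Hypothetical fields: 40 seasons of sea-level pressure on a 500-point grid
# (predictor) and seasonal precipitation at 25 stations (predictand).
slp = rng.normal(size=(40, 500))
precip = rng.normal(size=(40, 25))

# PCA filter on both predictor and predictand, as in the selected setup.
x_pc = PCA(n_components=5).fit_transform(slp)
y_pc = PCA(n_components=3).fit_transform(precip)

# Canonical correlation analysis between the two PC spaces.
cca = CCA(n_components=2).fit(x_pc, y_pc)
x_c, y_c = cca.transform(x_pc, y_pc)
corr = [np.corrcoef(x_c[:, i], y_c[:, i])[0, 1] for i in range(2)]
print("canonical correlations:", np.round(corr, 2))
```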

  11. Implications of Monte Carlo Statistical Errors in Criticality Safety Assessments

    Energy Technology Data Exchange (ETDEWEB)

    Pevey, Ronald E.

    2005-09-15

    Most criticality safety calculations are performed using Monte Carlo techniques because of Monte Carlo's ability to handle complex three-dimensional geometries. For Monte Carlo calculations, the more histories sampled, the lower the standard deviation of the resulting estimates. The common intuition is, therefore, that more histories are better; as a result, analysts tend to run Monte Carlo analyses for as long as possible (or at least to a minimum acceptable uncertainty). For Monte Carlo criticality safety analyses, however, the optimization situation is complicated by the fact that procedures usually require an extra margin of safety to be added because of the statistical uncertainty of the Monte Carlo calculations. This additional safety margin affects the impact of the choice of the calculational standard deviation, both on production and on safety. This paper shows that, under the assumptions of normally distributed benchmarking calculational errors and exact compliance with the upper subcritical limit (USL), the standard deviation that optimizes production is zero, but there is a non-zero value of the calculational standard deviation that minimizes the risk of inadvertently labeling a supercritical configuration as subcritical. Furthermore, this value is shown to be a simple function of the typical benchmarking step outcomes: the bias, the standard deviation of the bias, the upper subcritical limit, and the number of standard deviations added to calculated k-effectives before comparison to the USL.
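
    The acceptance rule at issue compares k_calc + n·σ to the USL. The toy calculation below (all numbers hypothetical) computes the probability that a truly supercritical configuration slips under that rule for several values of the calculational standard deviation; note that the paper's full treatment also folds the benchmarking bias distribution into the USL, which is what produces a non-zero optimal σ.

```python
import numpy as np
from scipy.stats import norm

# Acceptance rule: label subcritical when k_calc + n*sigma <= USL.
# k_calc is modeled as normal about k_true + bias with std dev sigma.
usl, bias, n_sigma = 0.98, -0.005, 2.0   # illustrative values only
k_true = 1.0                             # a truly critical configuration

for sigma in (0.002, 0.005, 0.01, 0.02):
    p_false_accept = norm.cdf(usl - n_sigma * sigma,
                              loc=k_true + bias, scale=sigma)
    print(f"sigma={sigma:.3f}  P(supercritical accepted)={p_false_accept:.2e}")
```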

  12. Statistical assessment of trophic conditions: squared Euclidean distance approach

    Directory of Open Access Journals (Sweden)

    Chatchai Ratanachai

    2003-05-01

    Full Text Available The classification of the trophic conditions of water bodies often faces contradictory cases in which a given lake is assigned to one trophic category by one trophic variable but to another category by other trophic variables. To solve this problem, this paper proposes a new methodology based on the concept of squared Euclidean distance and the boundary values recommended by the OECD (Organization for Economic Cooperation and Development). The methodology requires that the trophic-variable data set of the water body under consideration and these boundary values be compared by a measure of similarity, computed using basic statistical techniques, to determine the trophic condition of the water body. The methodology was tested on two sample data sets, the Pattani Dam Reservoir and the North Adriatic Sea data sets, taken from Kietpawpan (2002) and Zurlini (1996), respectively. The squared Euclidean distance analysis was then applied to these data sets in order to classify trophic conditions, based on four trophic variables: total nitrogen, total phosphorus, chlorophyll-a, and Secchi depth. Our results show that squared Euclidean distance analysis is a useful methodology for preliminary classification of trophic conditions and for resolving the contradictory classifications that often arise when applying the present OECD methodology alone.
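
    A minimal sketch of the distance rule described above: each candidate trophic category is represented by a standardized boundary profile, and the water body is assigned to the nearest profile in squared Euclidean distance. The profiles and observation here are illustrative placeholders, not the OECD boundary values.

```python
import numpy as np

# Standardized profiles for TN, TP, chlorophyll-a, Secchi depth.
# These numbers are invented placeholders for illustration only.
categories = {
    "oligotrophic": np.array([0.3, 0.2, 0.2, 0.9]),
    "mesotrophic":  np.array([0.5, 0.5, 0.5, 0.5]),
    "eutrophic":    np.array([0.9, 0.8, 0.8, 0.2]),
}
# Observed (already standardized) values for one water body.
obs = np.array([0.55, 0.60, 0.45, 0.40])

d2 = {name: float(np.sum((obs - prof) ** 2)) for name, prof in categories.items()}
print(d2, "->", min(d2, key=d2.get))   # classify by minimum squared distance
```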

  13. Assessing heart rate variability through wavelet-based statistical measures.

    Science.gov (United States)

    Wachowiak, Mark P; Hay, Dean C; Johnson, Michel J

    2016-10-01

    Because of its utility in the investigation and diagnosis of clinical abnormalities, heart rate variability (HRV) has been quantified with both time and frequency analysis tools. Recently, time-frequency methods, especially wavelet transforms, have been applied to HRV. In the current study, a complementary computational approach is proposed wherein continuous wavelet transforms are applied directly to ECG signals to quantify time-varying frequency changes in the lower bands. Such variations are compared for resting and lower body negative pressure (LBNP) conditions using statistical and information-theoretic measures, and compared with standard HRV metrics. The latter confirm the expected lower variability in the LBNP condition due to sympathetic nerve activity (e.g. RMSSD: p=0.023; SDSD: p=0.023; LF/HF: p=0.018). Conversely, using the standard Morlet wavelet and a new transform based on windowed complex sinusoids, wavelet analysis of the ECG within the observed range of heart rate (0.5-1.25Hz) exhibits significantly higher variability, as measured by frequency band roughness (Morlet CWT: p=0.041), entropy (Morlet CWT: p=0.001), and approximate entropy (Morlet CWT: p=0.004). Consequently, this paper proposes that, when used with well-established HRV approaches, time-frequency analysis of ECG can provide additional insights into the complex phenomenon of heart rate variability.
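
    For readers who want to reproduce the flavor of this approach, the following self-contained sketch applies a complex Morlet continuous wavelet transform directly to a toy ECG-like signal over the heart-rate band quoted above. The sampling rate, signal, and wavelet parameter are assumptions, not the authors' settings.

```python
import numpy as np

def morlet_cwt(signal, fs, freqs, w0=6.0):
    """Continuous wavelet transform with a complex Morlet wavelet,
    computed by FFT-based correlation (illustrative sketch)."""
    n = len(signal)
    t = (np.arange(n) - n // 2) / fs                 # time axis centred at zero
    sig_f = np.fft.fft(signal)
    out = np.empty((len(freqs), n), dtype=complex)
    for i, f in enumerate(freqs):
        s = w0 / (2 * np.pi * f)                     # scale for centre frequency f
        wavelet = (np.pi ** -0.25 / np.sqrt(s)
                   * np.exp(2j * np.pi * f * t - t ** 2 / (2 * s ** 2)))
        w_f = np.fft.fft(np.fft.ifftshift(wavelet))
        out[i] = np.fft.ifft(sig_f * np.conj(w_f))   # correlation with the wavelet
    return out

fs = 250.0                                           # assumed ECG sampling rate, Hz
t = np.arange(0, 60, 1 / fs)
hr = 0.9 + 0.05 * np.sin(2 * np.pi * 0.1 * t)        # slowly varying "heart rate", Hz
ecg = np.sin(2 * np.pi * np.cumsum(hr) / fs)         # toy frequency-modulated signal
freqs = np.linspace(0.5, 1.25, 16)                   # band examined in the study
power = np.abs(morlet_cwt(ecg, fs, freqs)) ** 2
print(power.shape)                                   # (n_freqs, n_samples)
```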

  14. WATER POLO GAME-RELATED STATISTICS IN WOMEN'S INTERNATIONAL CHAMPIONSHIPS: DIFFERENCES AND DISCRIMINATORY POWER

    Directory of Open Access Journals (Sweden)

    Yolanda Escalante

    2012-09-01

    Full Text Available The aims of this study were (i) to compare women's water polo game-related statistics by match outcome (winning and losing teams) and phase (preliminary, classificatory, and semi-final/bronze medal/gold medal), and (ii) to identify characteristics that discriminate performance in each phase. The game-related statistics of the 124 women's matches played in five international championships (World and European Championships) were analyzed. Differences between winning and losing teams in each phase were determined using the chi-squared test. A discriminant analysis was then performed according to context in each of the three phases. It was found that the game-related statistics differentiate the winning from the losing teams in each phase of an international championship. The differentiating variables were both offensive (centre goals, power-play goals, counterattack goals, assists, offensive fouls, steals, blocked shots, and won sprints) and defensive (goalkeeper-blocked shots, goalkeeper-blocked inferiority shots, and goalkeeper-blocked 5-m shots). The discriminant analysis showed the game-related statistics to discriminate performance in all phases: preliminary, classificatory, and final (92%, 90%, and 83%, respectively). Two variables were discriminatory by match outcome (winning or losing teams) in all three phases: goals and goalkeeper-blocked shots.

  15. A new non-invasive statistical method to assess the spontaneous cardiac baroreflex in humans.

    Science.gov (United States)

    Ducher, M; Fauvel, J P; Gustin, M P; Cerutti, C; Najem, R; Cuisinaud, G; Laville, M; Pozet, N; Paultre, C Z

    1995-06-01

    1. A new method was developed to evaluate cardiac baroreflex sensitivity. The association of a high systolic blood pressure with a low heart rate or the converse is considered to be under the influence of cardiac baroreflex activity. This method is based on the determination of the statistical dependence between systolic blood pressure and heart rate values obtained non-invasively by a Finapres device. Our computerized analysis selects the associations with the highest statistical dependence. A 'Z-coefficient' quantifies the strength of the statistical dependence. The slope of the linear regression, computed on these selected associations, is used to estimate baroreflex sensitivity. 2. The present study was carried out in 11 healthy resting male subjects. The results obtained by the 'Z-coefficient' method were compared with those obtained by cross-spectrum analysis, which has already been validated in humans. Furthermore, the reproducibility of both methods was checked after 1 week. 3. The results obtained by the two methods were significantly correlated (r = 0.78 for the first and r = 0.76 for the second experiment, P < 0.01). When repeated after 1 week, the average results were not significantly different. Considering individual results, test-retest correlation coefficients were higher with the Z-analysis (r = 0.79, P < 0.01) than with the cross-spectrum analysis (r = 0.61, P < 0.05). 4. In conclusion, as the Z-method gives results similar to but more reproducible than the cross-spectrum method, it might be a powerful and reliable tool to assess baroreflex sensitivity in humans.

  16. Statistics of power injection in a plate set into chaotic vibration

    Science.gov (United States)

    Cadot, O.; Boudaoud, A.; Touzé, C.

    2008-12-01

    A vibrating plate is set into a chaotic state of wave turbulence by either a periodic or a random local forcing. Correlations between the forcing and the local velocity response of the plate at the forcing point are studied. Statistical models with fairly good agreement with the experiments are proposed for each forcing. Both distributions of injected power have a logarithmic cusp for zero power, while the tails are Gaussian for the periodic driving and exponential for the random one. The distributions of injected work over long time intervals are investigated in the framework of the fluctuation theorem, also known as the Gallavotti-Cohen theorem. It appears that the conclusions of the theorem are verified only for the periodic, deterministic forcing. Using independent estimates of the phase space contraction, this result is discussed in the light of available theoretical framework.

  17. From probabilistic forecasts to statistical scenarios of short-term wind power production

    DEFF Research Database (Denmark)

    Pinson, Pierre; Papaefthymiou, George; Klockl, Bernd;

    2009-01-01

    Short-term (up to 2-3 days ahead) probabilistic forecasts of wind power provide forecast users with highly valuable information on the uncertainty of expected wind generation. Whatever the type of these probabilistic forecasts, they are produced on a per-horizon basis, and hence do not inform on the development of the forecast uncertainty through forecast series. This issue is addressed here by describing a method that permits the generation of statistical scenarios of short-term wind generation that account for both the interdependence structure of prediction errors and the predictive distributions of wind power production. The method is based on the conversion of series of prediction errors to a multivariate Gaussian random variable, the interdependence structure of which can then be summarized by a unique covariance matrix. This matrix is recursively estimated in order to accommodate long-term variations in the prediction error characteristics. The quality...
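
    A minimal numerical sketch of that pipeline, under assumed toy distributions: per-horizon errors are mapped to Gaussian margins by a rank transform, the covariance matrix is updated recursively with a forgetting factor, and joint scenarios are then drawn from it. The horizon count, forgetting factor, and error model are all hypothetical.

```python
import numpy as np
from scipy.stats import norm, rankdata

rng = np.random.default_rng(1)

H, lam = 24, 0.99                        # forecast horizons, forgetting factor
cov = np.eye(H)                          # running covariance estimate

def to_gaussian(errors):
    """Rank-based transform of one day's error series to N(0,1) margins."""
    u = (rankdata(errors) - 0.5) / len(errors)
    return norm.ppf(u)

for day in range(200):                   # recursive covariance update
    e = rng.multivariate_normal(np.zeros(H), 0.5 * np.eye(H) + 0.5)  # toy errors
    z = to_gaussian(e)
    cov = lam * cov + (1 - lam) * np.outer(z, z)

# Draw 5 interdependent scenarios; in practice each Gaussian margin would
# then be mapped back through the predictive distributions of wind power.
scenarios = rng.multivariate_normal(np.zeros(H), cov, size=5)
print(scenarios.shape)
```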

  18. Testing University Rankings Statistically: Why this Perhaps is not such a Good Idea after All. Some Reflections on Statistical Power, Effect Size, Random Sampling and Imaginary Populations

    DEFF Research Database (Denmark)

    Schneider, Jesper Wiborg

    2012-01-01

    In this paper we discuss and question the use of statistical significance tests in relation to university rankings, as recently suggested. We outline the assumptions behind, and interpretations of, statistical significance tests and relate these to examples from the recent SCImago Institutions Ranking. By use of statistical power analyses and demonstration of effect sizes, we emphasize that the importance of empirical findings lies in "differences that make a difference" and not in statistical significance tests per se. Finally, we discuss the crucial assumption of randomness and question the presumption that randomness is present in the university ranking data. We conclude that the application of statistical significance tests in relation to university rankings, as recently advocated, is problematic and can be misleading.

  19. Unbiased Group-Level Statistical Assessment of Independent Component Maps by Means of Automated Retrospective Matching

    NARCIS (Netherlands)

    Langers, Dave R. M.

    2010-01-01

    This report presents and validates a method for the group-level statistical assessment of independent component analysis (ICA) outcomes. The method is based on a matching of individual component maps to corresponding aggregate maps that are obtained from concatenated data. Group-level statistics are

  20. Assessing the performance of statistical validation tools for megavariate metabolomics data

    NARCIS (Netherlands)

    Rubingh, C.M.; Bijlsma, S.; Derks, E.P.P.A.; Bobeldijk, I.; Verheij, E.R.; Kochhar, S.; Smilde, A.K.

    2006-01-01

    Statistical model validation tools such as cross-validation, jack-knifing model parameters and permutation tests are meant to obtain an objective assessment of the performance and stability of a statistical model. However, little is known about the performance of these tools for megavariate data set

  1. Use of Statistical Information for Damage Assessment of Civil Engineering Structures

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Andersen, P.

    This paper considers the problem of damage assessment of civil engineering structures using statistical information. The aim of the paper is to review how researchers recently have tried to solve the problem. It is pointed out that the problem consists of not only how to use the statistical...

  2. Classification of Underlying Causes of Power Quality Disturbances: Deterministic versus Statistical Methods

    Directory of Open Access Journals (Sweden)

    Emmanouil Styvaktakis

    2007-01-01

    Full Text Available This paper presents the two main types of classification methods for power quality disturbances based on underlying causes: deterministic classification, with an expert system as an example, and statistical classification, with support vector machines (a novel method) as an example. An expert system is suitable when one has a limited amount of data and sufficient power-system expert knowledge; however, its application requires a set of threshold values. Statistical methods are suitable when a large amount of data is available for training. Two issues that are important for guaranteeing the effectiveness of a classifier, data segmentation and feature extraction, are discussed. Segmentation of a recorded data sequence is a preprocessing step that partitions the data into segments, each representing a duration containing either an event or a transition between two events. Feature extraction is then applied to each segment individually. Some useful features and their effectiveness are discussed. Experimental results are included to demonstrate the effectiveness of both systems. Finally, conclusions are given together with a discussion of some future research directions.

  3. Indoor Soiling Method and Outdoor Statistical Risk Analysis of Photovoltaic Power Plants

    Science.gov (United States)

    Rajasekar, Vidyashree

    This is a two-part thesis. Part 1 presents an approach for working towards the development of a standardized artificial soiling method for laminated photovoltaic (PV) cells or mini-modules. The construction of an artificial chamber to maintain controlled environmental conditions, and the components/chemicals used in artificial soil formulation, are briefly explained. Both poly-Si mini-modules and single-cell mono-Si coupons were soiled, and characterization tests such as I-V, reflectance and quantum efficiency (QE) were carried out on both soiled and cleaned coupons. From the results obtained, poly-Si mini-modules proved to be a good measure of soil uniformity, as any non-uniformity present would prevent a smooth curve during I-V measurements. The challenges faced while executing reflectance and QE characterization tests on poly-Si, due to its smaller cells, were eliminated on the mono-Si coupons with large cells, yielding highly repeatable measurements. This study indicates that reflectance measurements between 600-700 nm wavelengths can be used as a direct measure of soil density on the modules. Part 2 determines the most dominant failure modes of field-aged PV modules using experimental data obtained in the field and a statistical analysis, FMECA (Failure Mode, Effect, and Criticality Analysis). The failure and degradation modes of about 744 poly-Si glass/polymer frameless modules fielded for 18 years in the cold-dry climate of New York were evaluated. A defect chart, degradation rates (at both string and module levels) and a safety map were generated using the field-measured data. FMECA, a statistical reliability tool that uses the Risk Priority Number (RPN), is used to determine the dominant failure or degradation modes in the strings and modules by ranking and prioritizing the modes. This study on PV power plants considers all the failure and degradation modes from both safety and performance perspectives. The indoor and outdoor soiling studies were jointly
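
    To make the RPN ranking step concrete, here is a minimal sketch of FMECA-style prioritization, where RPN is the product of severity, occurrence, and detection scores. The failure modes and 1-10 scores below are hypothetical, not the thesis data.

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    severity: int      # 1 (benign)  .. 10 (safety-critical)
    occurrence: int    # 1 (rare)    .. 10 (frequent)
    detection: int     # 1 (obvious) .. 10 (hard to detect)

    @property
    def rpn(self) -> int:
        """Risk Priority Number: severity x occurrence x detection."""
        return self.severity * self.occurrence * self.detection

modes = [
    FailureMode("encapsulant discoloration", 4, 8, 3),
    FailureMode("solder bond fatigue",        7, 5, 6),
    FailureMode("glass breakage",             9, 2, 2),
]
for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"{m.name:28s} RPN={m.rpn}")
```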

  4. Windfarm generation assessment for reliability analysis of power systems

    DEFF Research Database (Denmark)

    Negra, N.B.; Holmstrøm, O.; Bak-Jensen, B.;

    2007-01-01

    Due to the fast development of wind generation in the past ten years, increasing interest has been paid to techniques for assessing different aspects of power systems with a large amount of installed wind generation. One of these aspects concerns power system reliability. Windfarm modelling plays...

  5. Case Studies for the Statistical Design of Experiments Applied to Powered Rotor Wind Tunnel Tests

    Science.gov (United States)

    Overmeyer, Austin D.; Tanner, Philip E.; Martin, Preston B.; Commo, Sean A.

    2015-01-01

    The application of statistical Design of Experiments (DOE) to helicopter wind tunnel testing was explored during two powered rotor wind tunnel entries during the summers of 2012 and 2013. These tests were performed jointly by the U.S. Army Aviation Development Directorate Joint Research Program Office and NASA Rotary Wing Project Office, currently the Revolutionary Vertical Lift Project, at NASA Langley Research Center located in Hampton, Virginia. Both entries were conducted in the 14- by 22-Foot Subsonic Tunnel with a small portion of the overall tests devoted to developing case studies of the DOE approach as it applies to powered rotor testing. A 16-47 times reduction in the number of data points required was estimated by comparing the DOE approach to conventional testing methods. The average error for the DOE surface response model for the OH-58F test was 0.95 percent and 4.06 percent for drag and download, respectively. The DOE surface response model of the Active Flow Control test captured the drag within 4.1 percent of measured data. The operational differences between the two testing approaches are identified, but did not prevent the safe operation of the powered rotor model throughout the DOE test matrices.

  6. A statistical survey of ultralow-frequency wave power and polarization in the Hermean magnetosphere.

    Science.gov (United States)

    James, Matthew K; Bunce, Emma J; Yeoman, Timothy K; Imber, Suzanne M; Korth, Haje

    2016-09-01

    We present a statistical survey of ultralow-frequency wave activity within the Hermean magnetosphere using the entire MErcury Surface, Space ENvironment, GEochemistry, and Ranging magnetometer data set. The study is focused upon wave activity in the ultralow-frequency range. Wave activity is mapped to the magnetic equatorial plane of the magnetosphere, and to magnetic latitudes and local times on Mercury, using the KT14 magnetic field model. Wave power mapped to the planetary surface indicates the average location of the polar cap boundary. Compressional wave power is dominant throughout most of the magnetosphere, while azimuthal wave power close to the dayside magnetopause provides evidence that interactions between the magnetosheath and the magnetopause, such as the Kelvin-Helmholtz instability, may be driving wave activity. Further evidence of this is found in the average wave polarization: left-hand polarized waves dominate the dawnside magnetosphere, while right-hand polarized waves dominate the duskside. A possible field line resonance event is also presented, where a time-of-flight calculation is used to provide an estimated local plasma mass density of ∼240 amu cm^-3.

  7. Statistical analysis of regional capital and operating costs for electric power generation

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez, L.R.; Myers, M.G.; Herrman, J.A.; Provanizano, A.J.

    1977-10-01

    This report presents the results of a three-and-one-half-month study conducted for Brookhaven National Laboratory to develop capital and operating cost relationships for seven electric power generating technologies: oil-, coal-, gas-, and nuclear-fired steam-electric plants, hydroelectric plants, and gas-turbine plants. The methodology is based primarily on statistical analysis of Federal Power Commission data for plant construction and annual operating costs. The development of cost-output relationships for electric power generation is emphasized, considering the effects of scale, technology, and location on each of the generating processes investigated. Regional effects on cost are measured at the Census Region level to be consistent with the Brookhaven Multi-Regional Energy and Interindustry Regional Model of the United States. Preliminary cost relationships for system-wide costs (transmission, distribution, and general expenses) were also derived. These preliminary results cover the demand for transmission and distribution capacity and operating and maintenance costs in terms of system-service characteristics. 15 references, 6 figures, 23 tables.

  8. Calibrating the Difficulty of an Assessment Tool: The Blooming of a Statistics Examination

    Science.gov (United States)

    Dunham, Bruce; Yapa, Gaitri; Yu, Eugenia

    2015-01-01

    Bloom's taxonomy is proposed as a tool by which to assess the level of complexity of assessment tasks in statistics. Guidelines are provided for how to locate tasks at each level of the taxonomy, along with descriptions and examples of suggested test questions. Through the "Blooming" of an examination--that is, locating its constituent…

  9. Statistical power and optimal design in experiments in which samples of participants respond to samples of stimuli.

    Science.gov (United States)

    Westfall, Jacob; Kenny, David A; Judd, Charles M

    2014-10-01

    Researchers designing experiments in which a sample of participants responds to a sample of stimuli are faced with difficult questions about optimal study design. The conventional procedures of statistical power analysis fail to provide appropriate answers to these questions because they are based on statistical models in which stimuli are not assumed to be a source of random variation in the data, models that are inappropriate for experiments involving crossed random factors of participants and stimuli. In this article, we present new methods of power analysis for designs with crossed random factors, and we give detailed, practical guidance to psychology researchers planning experiments in which a sample of participants responds to a sample of stimuli. We extensively examine 5 commonly used experimental designs, describe how to estimate statistical power in each, and provide power analysis results based on a reasonable set of default parameter values. We then develop general conclusions and formulate rules of thumb concerning the optimal design of experiments in which a sample of participants responds to a sample of stimuli. We show that in crossed designs, statistical power typically does not approach unity as the number of participants goes to infinity but instead approaches a maximum attainable power value that is possibly small, depending on the stimulus sample. We also consider the statistical merits of designs involving multiple stimulus blocks. Finally, we provide a simple and flexible Web-based power application to aid researchers in planning studies with samples of stimuli.
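
    The paper's central point, that power saturates below 1 as the number of participants grows while the stimulus sample stays fixed, can be reproduced with a small Monte Carlo sketch. The analysis below tests condition differences across stimulus means (stimuli treated as the random unit); all variance components and sample sizes are assumptions, and this is not the authors' web application.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

def power_by_stimulus(n_part, n_stim=16, d=0.8, sd_part=1.0, sd_stim=0.5,
                      sd_noise=2.0, n_sim=2000, alpha=0.05):
    """Monte Carlo power for a design where every participant responds to
    every stimulus and stimuli are nested in two conditions."""
    half = n_stim // 2
    hits = 0
    for _ in range(n_sim):
        p_eff = rng.normal(0, sd_part, (n_part, 1))    # participant effects
        s_eff = rng.normal(0, sd_stim, (1, n_stim))    # stimulus effects
        cond = np.repeat([0.0, d], half)[None, :]      # condition shift
        y = p_eff + s_eff + cond + rng.normal(0, sd_noise, (n_part, n_stim))
        stim_means = y.mean(axis=0)                    # averages out trial noise
        t = stats.ttest_ind(stim_means[:half], stim_means[half:])
        hits += t.pvalue < alpha
    return hits / n_sim

# More participants help, but power levels off below 1 with only 16 stimuli.
for n_part in (5, 20, 80, 320):
    print(n_part, power_by_stimulus(n_part))
```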

  10. Transient Stability Assessment of Power System with Large Amount of Wind Power Penetration

    DEFF Research Database (Denmark)

    Liu, Leo; Chen, Zhe; Bak, Claus Leth;

    2012-01-01

    Recently, the security and stability of power systems with a large amount of wind power have become issues of concern, especially transient stability. In Denmark, onshore and offshore wind farms are connected to the distribution system and the transmission system, respectively. The control and protection methodologies of onshore and offshore wind farms clearly affect the transient stability of the power system. In this paper, the onshore and offshore wind farms are modeled in detail in order to assess the transient stability of the western Danish power system. Further, the computation of critical clearing times (CCT) in different scenarios is proposed to evaluate the vulnerable areas in the western Danish power system. The resulting CCTs in different scenarios can evaluate the impact of wind power on power system transient stability. Besides, some other influencing factors, such as the load level of generators in central power...

  11. Evaluation and assessment of nuclear power plant seismic methodology

    Energy Technology Data Exchange (ETDEWEB)

    Bernreuter, D.; Tokarz, F.; Wight, L.; Smith, P.; Wells, J.; Barlow, R.

    1977-03-01

    The major emphasis of this study is to develop a methodology that can be used to assess the current methods used for assuring the seismic safety of nuclear power plants. The proposed methodology makes use of system-analysis techniques and Monte Carlo schemes. Also, in this study, we evaluate previous assessments of the current seismic-design methodology.

  12. Statistical power calculation and sample size determination for environmental studies with data below detection limits

    Science.gov (United States)

    Shao, Quanxi; Wang, You-Gan

    2009-09-01

    Power calculation and sample size determination are critical in designing environmental monitoring programs. The traditional approach based on comparing the mean values may become statistically inappropriate and even invalid when substantial proportions of the response values are below the detection limits or censored because strong distributional assumptions have to be made on the censored observations when implementing the traditional procedures. In this paper, we propose a quantile methodology that is robust to outliers and can also handle data with a substantial proportion of below-detection-limit observations without the need of imputing the censored values. As a demonstration, we applied the methods to a nutrient monitoring project, which is a part of the Perth Long-Term Ocean Outlet Monitoring Program. In this example, the sample size required by our quantile methodology is, in fact, smaller than that by the traditional t-test, illustrating the merit of our method.

  13. Secure Wireless Communication and Optimal Power Control under Statistical Queueing Constraints

    CERN Document Server

    Qiao, Deli; Velipasalar, Senem

    2010-01-01

    In this paper, secure transmission of information over fading broadcast channels is studied in the presence of statistical queueing constraints. Effective capacity is employed as a performance metric to identify the secure throughput of the system, i.e., effective secure throughput. It is assumed that perfect channel side information (CSI) is available at both the transmitter and the receivers. Initially, the scenario in which the transmitter sends common messages to two receivers and confidential messages to one receiver is considered. For this case, effective secure throughput region, which is the region of constant arrival rates of common and confidential messages that can be supported by the buffer-constrained transmitter and fading broadcast channel, is defined. It is proven that this effective throughput region is convex. Then, the optimal power control policies that achieve the boundary points of the effective secure throughput region are investigated and an algorithm for the numerical computation of t...

  14. Generation of statistical scenarios of short-term wind power production

    DEFF Research Database (Denmark)

    Pinson, Pierre; Papaefthymiou, George; Klockl, Bernd;

    2007-01-01

    Short-term (up to 2-3 days ahead) probabilistic forecasts of wind power provide forecast users with paramount information on the uncertainty of expected wind generation. Whatever the type of these probabilistic forecasts, they are produced on a per-horizon basis, and hence do not inform on the development of the forecast uncertainty through forecast series. This issue is addressed here by describing a method that permits the generation of statistical scenarios of wind generation that account for the interdependence structure of prediction errors, in addition to respecting the predictive distributions of wind generation. The approach is evaluated on the test case of a multi-MW wind farm over a period of more than two years. Its interest for a large range of applications is discussed.

  15. Assessment of Electric Power Quality in Ships'Modern Systems

    Institute of Scientific and Technical Information of China (English)

    Janusz Mindykowski; XU Xiao-yan

    2004-01-01

    The paper deals with selected problems of electric power quality in ships' modern systems. The introduction briefly presents the fundamentals of electric power quality assessment, such as the relations and consequences among power quality phenomena and indices, as well as the relevant methods, tools and instrumentation. Afterwards, the basic characteristics of power systems on modern ships are given. The main focus of the paper is the assessment of electric power quality in ships' systems fitted with converter subsystems. The state of the art and current tendencies in the discussed matter are shown. Some experimental results, based on research carried out under the supervision of the author, are also presented. Finally, some concluding issues are briefly commented on.

  16. Power Transmission System Vulnerability Assessment Using Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    E. Karimi

    2012-11-01

    Full Text Available Recent blackouts in power systems have shown the necessity of vulnerability assessment. Among all factors, transmission system components play the most important role. Power system vulnerability assessment can capture the cascading outages that result in large blackouts, and it is an effective tool for power system engineers to identify system bottlenecks and weak points. In this paper a new method based on the fault-chain concept is developed, using new measures. A genetic algorithm with an effective structure is used to find vulnerable branches in a practical power transmission system. The analytic hierarchy process is used to determine the weighting factors in the fitness function of the genetic algorithm. Finally, numerical results for the Isfahan Regional Electric Company are presented, which verify the effectiveness and precision of the proposed method in practical experiments.

  17. Assessment of a satellite power system and six alternative technologies

    Energy Technology Data Exchange (ETDEWEB)

    Wolsko, T.; Whitfield, R.; Samsa, M.; Habegger, L.S.; Levine, E.; Tanzman, E.

    1981-04-01

    The satellite power system is assessed in comparison to six alternative technologies. The alternatives are: central-station terrestrial photovoltaic systems, conventional coal-fired power plants, coal-gasification/combined-cycle power plants, light water reactor power plants, liquid-metal fast-breeder reactors, and fusion. The comparison is made regarding issues of cost and performance, health and safety, environmental effects, resources, socio-economic factors, and institutional issues. The criteria for selecting the issues and the alternative technologies are given, and the methodology of the comparison is discussed. Brief descriptions of each of the technologies considered are included. (LEW)

  18. Statistical connection of peak counts to power spectrum and moments in weak-lensing field

    Science.gov (United States)

    Shirasaki, Masato

    2017-02-01

    The number density of local maxima of the weak-lensing field, referred to as weak-lensing peak counts, can be used as a cosmological probe. However, its relevant cosmological information is still unclear. We study the relationship between peak counts and other statistics of the weak-lensing field by using 1000 ray-tracing simulations. We construct a local transformation of the lensing field K to a new Gaussian field y, named the local-Gaussianized transformation. We calibrate the transformation with numerical simulations so that the one-point distribution and the power spectrum of K can be reproduced from a single Gaussian field y and a monotonic relation between y and K. Therefore, the correct information on two-point clustering and moments of any order in the weak-lensing field should be preserved under the local-Gaussianized transformation. We then examine whether the local-Gaussianized transformation can predict weak-lensing peak counts in the simulations. The local-Gaussianized transformation is insufficient to explain weak-lensing peak counts in the absence of shape noise. The prediction by the local-Gaussianized transformation underestimates the simulated peak counts at a level of ∼20-30 per cent over a wide range of peak heights. The local-Gaussianized transformation can predict the weak-lensing peak counts with ∼10 per cent accuracy in the presence of shape noise. Our analyses suggest that cosmological information beyond the power spectrum and its moments would be necessary to predict the weak-lensing peak counts with percent-level accuracy, which is the expected statistical uncertainty in upcoming wide-field galaxy surveys.
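
    A pointwise Gaussianization of the kind described can be sketched with a rank-based quantile match; the toy lognormal field below merely stands in for the convergence field, and none of the numbers come from the paper's simulations.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)

# Toy non-Gaussian field standing in for the lensing convergence K.
kappa = rng.lognormal(mean=0.0, sigma=0.5, size=100_000) - 1.0

# Monotonic, rank-based map to a Gaussian field y: match empirical
# quantiles of K to quantiles of N(0,1), preserving one-point statistics.
ranks = kappa.argsort().argsort()              # empirical CDF ranks
u = (ranks + 0.5) / kappa.size
y = norm.ppf(u)                                # Gaussianized field

# The map is monotonic, so K is recoverable from y by inverse lookup.
print(abs(y.mean()) < 1e-2, abs(y.std() - 1.0) < 1e-2)
```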

  19. Skew-Laplace and Cell-Size Distribution in Microbial Axenic Cultures: Statistical Assessment and Biological Interpretation

    Directory of Open Access Journals (Sweden)

    Olga Julià

    2010-01-01

    Full Text Available We report a skew-Laplace statistical analysis of both flow cytometry scatters and cell sizes from microbial strains grown primarily in batch cultures, others in chemostat cultures, and from bacterial aquatic populations. Cytometry scatters fit the skew-Laplace distribution best, while cell size, as assessed by an electronic particle analyzer, exhibited a moderate fit. Unlike the cultures, the aquatic bacterial communities clearly do not fit a skew-Laplace distribution. Due to its versatile nature, the skew-Laplace distribution approach offers an easy, efficient, and powerful tool for frequency-distribution analysis in tandem with flow cytometric cell sorting.

  20. Robust Statistical Tests of Dragon-Kings beyond Power Law Distributions

    CERN Document Server

    Pisarenko, V F

    2011-01-01

    We ask the question whether it is possible to diagnose the existence of "Dragon-Kings" (DK), namely anomalous observations compared to a power law background distribution of event sizes. We present two new statistical tests, the U-test and the DK-test, aimed at identifying the existence of even a single anomalous event in the tail of the distribution of just a few tens of observations. The DK-test in particular is derived such that the p-value of its statistic is independent of the exponent characterizing the null hypothesis. We demonstrate how to apply these two tests on the distributions of cities and of agglomerations in a number of countries. We find the following evidence for Dragon-Kings: London in the distribution of city sizes of Great Britain; Moscow and St-Petersburg in the distribution of city sizes in the Russian Federation; and Paris in the distribution of agglomeration sizes in France. True negatives are also reported, for instance the absence of Dragon-Kings in the distribution of cities in Ger...

  1. Detecting temporal change in freshwater fisheries surveys: statistical power and the important linkages between management questions and monitoring objectives

    Science.gov (United States)

    Wagner, Tyler; Irwin, Brian J.; James R. Bence,; Daniel B. Hayes,

    2016-01-01

    Monitoring to detect temporal trends in biological and habitat indices is a critical component of fisheries management. Thus, it is important that management objectives are linked to monitoring objectives. This linkage requires a definition of what constitutes a management-relevant “temporal trend.” It is also important to develop expectations for the amount of time required to detect a trend (i.e., statistical power) and for choosing an appropriate statistical model for analysis. We provide an overview of temporal trends commonly encountered in fisheries management, review published studies that evaluated statistical power of long-term trend detection, and illustrate dynamic linear models in a Bayesian context, as an additional analytical approach focused on shorter term change. We show that monitoring programs generally have low statistical power for detecting linear temporal trends and argue that often management should be focused on different definitions of trends, some of which can be better addressed by alternative analytical approaches.
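
    Evaluations of this kind typically rest on Monte Carlo power simulations; below is a minimal sketch for detecting a log-linear trend in an annual index, with invented process and observation error levels.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)

def trend_power(years, slope_pct, sd_proc=0.15, sd_obs=0.20,
                mean=100.0, n_sim=2000, alpha=0.05):
    """Monte Carlo power to detect a log-linear trend in an annual index,
    with process (year-to-year) and observation noise (toy values)."""
    t = np.arange(years)
    beta = np.log(1 + slope_pct / 100)            # trend on the log scale
    hits = 0
    for _ in range(n_sim):
        logy = (np.log(mean) + beta * t
                + rng.normal(0, sd_proc, years)    # process error
                + rng.normal(0, sd_obs, years))    # sampling/observation error
        hits += stats.linregress(t, logy).pvalue < alpha
    return hits / n_sim

# Power to detect a 3%/yr decline stays low over short monitoring horizons.
for years in (5, 10, 20):
    print(years, trend_power(years, slope_pct=-3.0))
```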

  2. Probabilistic safety assessment for optimum nuclear power plant life management (PLiM) theory and application of reliability analysis methods for major power plant components

    CERN Document Server

    Arkadov, G V; Rodionov, A N

    2012-01-01

    Probabilistic safety assessment methods are used to calculate nuclear power plant durability and resource lifetime. Successful calculation of the reliability and ageing of components is critical for forecasting safety and directing preventative maintenance, and this title provides a comprehensive review of the theory and application of these methods. Part one reviews probabilistic methods for predicting the reliability of equipment. Following an introduction to key terminology, concepts and definitions, formal-statistical and various physico-statistical approaches are discussed. Approaches based on the use of defect-free models are considered, along with those using binomial distribution and models bas...

  3. Risk assessment of power systems models, methods, and applications

    CERN Document Server

    Li, Wenyuan

    2014-01-01

    Risk Assessment of Power Systems addresses the regulations and functions of risk assessment with regard to its relevance in system planning, maintenance, and asset management. Brimming with practical examples, this edition introduces the latest risk information on renewable resources, the smart grid, voltage stability assessment, and fuzzy risk evaluation. It is a comprehensive reference of a highly pertinent topic for engineers, managers, and upper-level students who seek examples of risk theory applications in the workplace.

  4. National-Scale Wind Resource Assessment for Power Generation (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Baring-Gould, E. I.

    2013-08-01

    This presentation describes the current standards for conducting a national-scale wind resource assessment for power generation, along with the risk/benefit considerations to be considered when beginning a wind resource assessment. The presentation describes changes in turbine technology and viable wind deployment due to more modern turbine technology and taller towers and shows how the Philippines national wind resource assessment evolved over time to reflect changes that arise from updated technologies and taller towers.

  5. Statistical analysis of wind power in the region of Veracruz (Mexico)

    Energy Technology Data Exchange (ETDEWEB)

    Cancino-Solorzano, Yoreley [Departamento de Ing Electrica-Electronica, Instituto Tecnologico de Veracruz, Calzada Miguel A. de Quevedo 2779, 91860 Veracruz (Mexico); Xiberta-Bernat, Jorge [Departamento de Energia, Escuela Tecnica Superior de Ingenieros de Minas, Universidad de Oviedo, C/Independencia, 13, 2a Planta, 33004 Oviedo (Spain)

    2009-06-15

    The capacity of the Mexican electricity sector faces the challenge of satisfying a demand forecast of 80 GW by 2016, which implies a steady average yearly increase of some 4.9%. Capacity additions over the next eight years will mainly consist of combined-cycle power plants, which could be a threat to the country's energy supply, since Mexico is not self-sufficient in natural gas. Wind energy could be a more suitable alternative to combined-cycle power plants. This option is backed by market trends indicating that wind technology costs will continue to decrease in the near future, as has happened in recent years. Evaluation of the wind potential in different areas of the country must be carried out in order to make the best possible use of this option. This paper gives a statistical analysis of the wind characteristics in the region of Veracruz. Daily, monthly and annual wind speed values have been studied, together with their prevailing direction. The data analyzed correspond to five meteorological stations and two anemometric stations located in the aforementioned area. (author)
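
    Analyses of this kind typically summarize a station's record with a two-parameter Weibull fit; here is a small sketch under assumed synthetic data (one year of hourly speeds with a hypothetical shape and scale), not the Veracruz measurements.

```python
import math
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Hypothetical hourly wind speeds standing in for one station's record.
speeds = stats.weibull_min.rvs(c=2.1, scale=7.5, size=8760, random_state=rng)

# Maximum-likelihood Weibull fit with the location fixed at zero.
k, loc, c = stats.weibull_min.fit(speeds, floc=0)   # shape k, scale c
mean_speed = c * math.gamma(1 + 1 / k)              # analytic Weibull mean
print(f"shape k={k:.2f}, scale c={c:.2f} m/s, mean speed={mean_speed:.2f} m/s")
```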

  6. Automated Data Collection for Determining Statistical Distributions of Module Power Undergoing Potential-Induced Degradation: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Hacke, P.; Spataru, S.

    2014-08-01

    We propose a method for increasing the frequency of data collection and reducing the time and cost of accelerated lifetime testing of photovoltaic modules undergoing potential-induced degradation (PID). This consists of in-situ measurements of dark current-voltage curves of the modules at elevated stress temperature, their use to determine the maximum power at 25 degrees C standard test conditions (STC), and distribution statistics for determining degradation rates as a function of stress level. The semi-continuous data obtained by this method clearly show degradation curves of the maximum power, including an incubation phase, rates and extent of degradation, precise time to failure, and partial recovery. Stress tests were performed on crystalline silicon modules at 85% relative humidity and 60 degrees C, 72 degrees C, and 85 degrees C. An activation energy of 0.85 eV for the mean time to failure (1% relative) was determined, and a mean time to failure of 8,000 h at 25 degrees C and 85% relative humidity is predicted. No clear trend in maximum degradation as a function of stress temperature was observed.
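
    The temperature extrapolation implied by that activation energy follows the standard Arrhenius acceleration model; the sketch below uses the study's Ea of 0.85 eV but an invented stress-test result, so the printed number is illustrative only.

```python
import numpy as np

K_B = 8.617e-5                       # Boltzmann constant, eV/K
EA = 0.85                            # activation energy from the study, eV

def arrhenius_mttf(mttf_stress_h, t_stress_c, t_use_c=25.0, ea=EA):
    """Extrapolate mean time to failure from a stress temperature to a use
    temperature via the Arrhenius acceleration factor (same humidity)."""
    ts, tu = t_stress_c + 273.15, t_use_c + 273.15
    accel = np.exp(ea / K_B * (1.0 / tu - 1.0 / ts))
    return mttf_stress_h * accel

# Hypothetical stress result: MTTF of 50 h observed at 85 C / 85% RH.
print(f"predicted MTTF at 25 C: {arrhenius_mttf(50.0, 85.0):.0f} h")
```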

  7. Assessment of alternative power sources for mobile mining machinery

    Science.gov (United States)

    Cairelli, J. E.; Tomazic, W. A.; Evans, D. G.; Klann, J. L.

    1981-12-01

    Alternative mobile power sources for mining applications were assessed. A wide variety of heat engines and energy systems was examined as potential alternatives to presently used power systems. The present mobile power systems are electrical trailing cable, electrical battery, and diesel - with diesel being largely limited in the United States to noncoal mines. Each candidate power source was evaluated for the following requirements: (1) ability to achieve the duty cycle; (2) ability to meet Government regulations; (3) availability (production readiness); (4) market availability; and (5) packaging capability. Screening reduced the list of candidates to the following power sources: diesel, Stirling, gas turbine, Rankine (steam), advanced electric (batteries), mechanical energy storage (flywheel), and use of hydrogen evolved from metal hydrides. This list of candidates is divided into two classes of alternative power sources for mining applications: heat engines and energy storage systems.

  8. Satellite Power System (SPS) societal assessment

    Energy Technology Data Exchange (ETDEWEB)

    1980-12-01

    Construction and operation of a 60-unit (300 GW) domestic SPS over the period 2000 to 2030 would stress many segments of US society. A significant commitment of resources (land, energy, materials) would be required, and a substantial proportion of them would have to be committed prior to the production of any SPS electricity. Estimated resource demands, however, seem to be within US capabilities. Modifications will be required of institutions called upon to deal with SPS. These include financial, managerial and regulatory entities and, most particularly, the utility industry. Again, the required changes, while certainly profound, seem to be well within the realm of possibility. Enhanced cooperation in international affairs will be necessary to accommodate development and operation of the SPS. To remove its potential as a military threat and to reduce its vulnerability, either the SPS itself must become an international enterprise, or it must be subject to unrestricted international inspection. How either of these objectives could, in fact, be achieved, or which is preferable, remains unclear. Forty-four concerns about the SPS were identified via a public outreach experiment involving 9000 individuals from three special interest organizations. The concerns focused on environmental impacts (particularly the effects of microwave radiation) and the centralizing tendency of the SPS on society. The interim results of the public outreach experiment influenced the scope and direction of the CDEP; the final results will be instrumental in defining further societal assessment efforts.

  9. Evaluation of a regional monitoring program's statistical power to detect temporal trends in forest health indicators

    Science.gov (United States)

    Perles, Stephanie J.; Wagner, Tyler; Irwin, Brian J.; Manning, Douglas R.; Callahan, Kristina K.; Marshall, Matthew R.

    2014-01-01

    Forests are socioeconomically and ecologically important ecosystems that are exposed to a variety of natural and anthropogenic stressors. As such, monitoring forest condition and detecting temporal changes therein remain critical to sound public and private forestland management. The National Park Service's Vital Signs monitoring program collects information on many forest health indicators, including species richness, cover by exotics, browse pressure, and forest regeneration. We applied a mixed-model approach to partition variability in data for 30 forest health indicators collected from several national parks in the eastern United States. We then used the estimated variance components in a simulation model to evaluate trend detection capabilities for each indicator. We investigated the extent to which the following factors affected ability to detect trends: (a) sample design: using simple panel versus connected panel design, (b) effect size: increasing trend magnitude, (c) sample size: varying the number of plots sampled each year, and (d) stratified sampling: post-stratifying plots into vegetation domains. Statistical power varied among indicators; however, indicators that measured the proportion of a total yielded higher power when compared to indicators that measured absolute or average values. In addition, the total variability for an indicator appeared to influence power to detect temporal trends more than how total variance was partitioned among spatial and temporal sources. Based on these analyses and the monitoring objectives of the Vital Signs program, the current sampling design is likely overly intensive for detecting a 5% trend per year for all indicators and is appropriate for detecting a 1% trend per year in most indicators.

  10. Reliability and statistical power analysis of cortical and subcortical FreeSurfer metrics in a large sample of healthy elderly.

    Science.gov (United States)

    Liem, Franziskus; Mérillat, Susan; Bezzola, Ladina; Hirsiger, Sarah; Philipp, Michel; Madhyastha, Tara; Jäncke, Lutz

    2015-03-01

    FreeSurfer is a tool to quantify cortical and subcortical brain anatomy automatically and noninvasively. Previous studies have reported reliability and statistical power analyses in relatively small samples or only selected one aspect of brain anatomy. Here, we investigated reliability and statistical power of cortical thickness, surface area, volume, and the volume of subcortical structures in a large sample (N=189) of healthy elderly subjects (64+ years). Reliability (intraclass correlation coefficient) of cortical and subcortical parameters is generally high (cortical: ICCs>0.87, subcortical: ICCs>0.95). Surface-based smoothing increases reliability of cortical thickness maps, while it decreases reliability of cortical surface area and volume. Nevertheless, statistical power of all measures benefits from smoothing. When aiming to detect a 10% difference between groups, the number of subjects required to test effects with sufficient power over the entire cortex varies between cortical measures (cortical thickness: N=39, surface area: N=21, volume: N=81; 10 mm smoothing, power=0.8, α=0.05). For subcortical regions this number is between 16 and 76 subjects, depending on the region. We also demonstrate the advantage of within-subject designs over between-subject designs. Furthermore, we publicly provide a tool that allows researchers to perform a priori power analysis and sensitivity analysis to help evaluate previously published studies and to design future studies with sufficient statistical power.
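
    The intraclass correlation used here can be computed directly from a subjects-by-sessions matrix; below is a self-contained sketch using the textbook Shrout-Fleiss ICC(2,1) formula with invented test-retest thickness data (not the study's measurements).

```python
import numpy as np

def icc_2_1(x):
    """Two-way random-effects, absolute-agreement ICC(2,1) for an
    (n_subjects, k_sessions) matrix (Shrout & Fleiss formulation)."""
    n, k = x.shape
    grand = x.mean()
    ms_r = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # subjects
    ms_c = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # sessions
    sse = ((x - x.mean(axis=1, keepdims=True)
              - x.mean(axis=0, keepdims=True) + grand) ** 2).sum()
    ms_e = sse / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

rng = np.random.default_rng(9)
true_thick = rng.normal(2.5, 0.15, 189)                   # hypothetical thickness, mm
scans = true_thick[:, None] + rng.normal(0, 0.03, (189, 2))  # two sessions
print(f"ICC(2,1) = {icc_2_1(scans):.3f}")
```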

  11. Transient stability risk assessment of power systems incorporating wind farms

    DEFF Research Database (Denmark)

    Miao, Lu; Fang, Jiakun; Wen, Jinyu;

    2013-01-01

    Large-scale wind farm integration has brought several challenges to the transient stability of power systems. This paper focuses on the transient stability of power systems incorporating wind farms by utilizing risk assessment methods. A detailed model of the doubly fed induction generator has been established. Wind penetration variation and multiple stochastic factors of power systems have been considered. The process of transient stability risk assessment based on the Monte Carlo method is described, and a comprehensive risk indicator is proposed. An investigation has been conducted on an improved 10-generator 39-bus system with a wind farm incorporated, to verify the validity and feasibility of the proposed risk assessment method.

  12. On-line Dynamic Security Assessment in Power Systems

    DEFF Research Database (Denmark)

    Weckesser, Johannes Tilman Gabriel

The thesis concerns the development of tools and methods for on-line dynamic security assessment (DSA). In a future power system with low dependence on, or even independence of, fossil fuels, generation will be based to a large extent on non-controllable renewable energy sources (RES) such as wind and solar radiation. Moreover, ongoing research suggests that demand response will be introduced to maintain the power balance between generation and consumption at all times. Due to these changes, the operating point of the power system will be less predictable, and today's stability and security assessment tools may no longer be feasible, since they are generally based on extensive off-line studies. A core component of an efficient on-line dynamic security assessment is a fast and reliable contingency screening. As part of this thesis, a contingency screening method is developed and its performance...

  13. Assessment of novel power generation systems for the biomass industry

    OpenAIRE

    Codeceira Neto, Alcides

    1999-01-01

    The objective of this programme of research is to produce a method for assessing and optimising the performance of advanced gas turbine power plants for electricity generation within the Brazilian electric sector. With the privatisation of the Brazilian electric sector, attention has turned to thermal plants, and studies have been carried out on the use of alternative fuels rather than fossil fuels. Biomass is a fuel of increasing interest for power gener...

  14. Life cycle assessment analysis of supercritical coal power units

    Science.gov (United States)

    Ziębik, Andrzej; Hoinka, Krzysztof; Liszka, Marcin

    2010-09-01

    This paper presents the Life Cycle Assessment (LCA) analysis concerning the selected options of supercritical coal power units. The investigation covers a pulverized power unit without a CCS (Carbon Capture and Storage) installation, a pulverized unit with a "post-combustion" installation (MEA type) and a pulverized power unit working in the "oxy-combustion" mode. For each variant the net electric power amounts to 600 MW. The energy component of the LCA analysis has been determined. It describes the depletion of non-renewable natural resources. The energy component is determined by the coefficient of cumulative energy consumption in the life cycle. For the calculation of the ecological component of the LCA analysis the cumulative CO2 emission has been applied. At present it is the basic emission factor for the LCA analysis of power plants. The work also presents the sensitivity analysis of calculated energy and ecological factors.

  15. Windfarm Generation Assessment for Reliability Analysis of Power Systems

    DEFF Research Database (Denmark)

    Barberis Negra, Nicola; Bak-Jensen, Birgitte; Holmstrøm, O.

    2007-01-01

    Due to the fast development of wind generation in the past ten years, increasing interest has been paid to techniques for assessing different aspects of power systems with a large amount of installed wind generation. One of these aspects concerns power system reliability. Windfarm modelling plays...... in a reliability model and the generation of a windfarm is evaluated by means of sequential Monte Carlo simulation. Results are used to analyse how each mentioned factor influences the assessment, and why and when they should be included in the model....

  16. Windfarm Generation Assessment for Reliability Analysis of Power Systems

    DEFF Research Database (Denmark)

    Negra, Nicola Barberis; Holmstrøm, Ole; Bak-Jensen, Birgitte

    2007-01-01

    Due to the fast development of wind generation in the past ten years, increasing interest has been paid to techniques for assessing different aspects of power systems with a large amount of installed wind generation. One of these aspects concerns power system reliability. Windfarm modelling plays...... in a reliability model and the generation of a windfarm is evaluated by means of sequential Monte Carlo simulation. Results are used to analyse how each mentioned factor influences the assessment, and why and when they should be included in the model....

  17. Employment of kernel methods on wind turbine power performance assessment

    DEFF Research Database (Denmark)

    Skrimpas, Georgios Alexandros; Sweeney, Christian Walsted; Marhadi, Kun S.

    2015-01-01

    A power performance assessment technique is developed for the detection of power production discrepancies in wind turbines. The method employs a widely used nonparametric pattern recognition technique, the kernel methods. The evaluation is based on the trending of a feature extracted from the kernel matrix, called the similarity index, which is introduced by the authors for the first time. The operation of the turbine, and consequently the computation of the similarity indexes, is classified into five power bins, offering better resolution and thus more consistent root cause analysis. The accurate...
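
    The abstract does not give the exact definition of the similarity index, so the following sketch only illustrates the general shape of such a kernel-based check: live (wind speed, power) samples are compared against a healthy reference set with a Gaussian kernel, and the mean similarity is trended per power bin. The kernel bandwidth, bin count, and the use of mean kernel similarity are all assumptions.

```python
# Hedged sketch of a kernel-based similarity check per power bin;
# this is NOT the authors' exact similarity index definition.
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    """Pairwise RBF kernel between rows of A and B (features pre-normalized)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def similarity_index(reference, live, n_bins=5):
    """Mean kernel similarity of each live sample to the reference set,
    averaged within power bins (column 1 is assumed to hold power)."""
    sim = gaussian_kernel(live, reference).mean(axis=1)
    edges = np.quantile(reference[:, 1], np.linspace(0, 1, n_bins + 1)[1:-1])
    bins = np.digitize(live[:, 1], edges)          # bin label 0..n_bins-1
    return [sim[bins == b].mean() if np.any(bins == b) else np.nan
            for b in range(n_bins)]

# A drop of the index in one bin would flag a power-production discrepancy there.
```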

  18. Developing a PQ monitoring system for assessing power quality and critical areas detection

    Directory of Open Access Journals (Sweden)

    Miguel Romero

    2011-10-01

    This paper outlines the development of a power quality monitoring system. The system is aimed at assessing power quality and detecting critical areas throughout a distribution system. It integrates a hardware system and a software processing tool developed in four main stages. Power quality disturbances are registered by PQ meters and the data is transmitted through a 3G wireless network. This data is processed and filtered in an open source database. Statistical indices related to voltage sags, swells, flicker and voltage unbalance are obtained. The last stage displays the indices geo-referenced on power quality maps, allowing the identification of critical areas according to different criteria. The results can be analyzed using clustering tools to identify differentiated quality groups in a city. The proposed system is an open source tool useful to electricity utilities for analyzing and managing large amounts of data.

  19. Assessing Statistical Change Indices in Selected Social Work Intervention Research Studies

    Science.gov (United States)

    Ham, Amanda D.; Huggins-Hoyt, Kimberly Y.; Pettus, Joelle

    2016-01-01

    Objectives: This study examined how evaluation and intervention research (IR) studies assessed statistical change to ascertain effectiveness. Methods: Studies from six core social work journals (2009-2013) were reviewed (N = 1,380). Fifty-two evaluation (n = 27) and intervention (n = 25) studies met the inclusion criteria. These studies were…

  20. QQ-plots for assessing distributions of biomarker measurements and generating defensible summary statistics

    Science.gov (United States)

    One of the main uses of biomarker measurements is to compare different populations to each other and to assess risk in comparison to established parameters. This is most often done using summary statistics such as central tendency, variance components, confidence intervals, excee...
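
    As an illustration of the approach (not drawn from the record itself), the sketch below builds QQ-plots for a synthetic, log-normally distributed biomarker: the raw data curve away from the normal reference line, while the log-transformed data are near-linear, which supports geometric-mean style summary statistics.

```python
# Minimal QQ-plot sketch for biomarker measurements; the data are synthetic.
import numpy as np
from scipy import stats
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
biomarker = rng.lognormal(mean=1.0, sigma=0.5, size=200)   # synthetic concentrations

fig, axes = plt.subplots(1, 2, figsize=(8, 3))
stats.probplot(biomarker, dist="norm", plot=axes[0])
axes[0].set_title("raw scale (curved: not normal)")
stats.probplot(np.log(biomarker), dist="norm", plot=axes[1])
axes[1].set_title("log scale (near-linear: lognormal)")
plt.tight_layout()
plt.show()
```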

  1. Business Statistics and Management Science Online: Teaching Strategies and Assessment of Student Learning

    Science.gov (United States)

    Sebastianelli, Rose; Tamimi, Nabil

    2011-01-01

    Given the expected rise in the number of online business degrees, issues regarding quality and assessment in online courses will become increasingly important. The authors focus on the suitability of online delivery for quantitative business courses, specifically business statistics and management science. They use multiple approaches to assess…

  2. Statistical connection of peak counts to power spectrum and moments in weak lensing field

    CERN Document Server

    Shirasaki, Masato

    2016-01-01

    The number density of local maxima of a weak lensing field, referred to as weak-lensing peak counts, can be used as a cosmological probe. However, its relevant cosmological information is still unclear. We study the relationship between the peak counts and other statistics in the weak lensing field by using 1000 ray-tracing simulations. We construct a local transformation of the lensing field $\mathcal{K}$ to a new Gaussian field $y$, named the local-Gaussianized transformation. We calibrate the transformation with numerical simulations so that the one-point distribution and the power spectrum of $\mathcal{K}$ can be reproduced from a single Gaussian field $y$ and a monotonic relation between $y$ and $\mathcal{K}$. Therefore, the correct information of two-point clustering and any order of moments in the weak lensing field should be preserved under the local-Gaussianized transformation. We then examine if the local-Gaussianized transformation can predict weak-lensing peak counts in simulations. The local-Gaussianized transformation is insufficient to ...
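
    A rank-based stand-in for such a monotonic Gaussianization is easy to sketch; the paper calibrates its transformation against simulations, whereas the version below simply maps empirical quantiles of the field to standard-normal quantiles.

```python
# Minimal sketch of a monotonic Gaussianization (rank-based quantile mapping);
# a simplified stand-in for the paper's simulation-calibrated transform.
import numpy as np
from scipy import stats

def gaussianize(kappa):
    """Monotonic map of samples of K to a field y with an N(0,1) one-point PDF."""
    ranks = stats.rankdata(kappa)                 # ranks 1..N, ties averaged
    u = ranks / (len(kappa) + 1.0)                # uniform quantiles in (0,1)
    return stats.norm.ppf(u)

kappa = np.random.default_rng(1).gamma(2.0, 1.0, 10000) - 2.0  # skewed mock field
y = gaussianize(kappa)
print(stats.skew(y), stats.kurtosis(y))           # both approximately 0
```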

  3. Dynamic security risk assessment and optimization of power transmission system

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The paper presents a practical dynamic security region (PDSR) based dynamic security risk assessment and optimization model for power transmission systems. The cost of comprehensive security control and the influence of uncertainties of power injections are considered in the model of dynamic security risk assessment. Transient stability constraints and uncertainties of power injections can be considered easily by PDSR in the form of a hyper-box. A method to define and classify the contingency set is presented, and a risk control optimization model is given which takes total dynamic insecurity risk as the objective function for a dominant contingency set. An optimal solution of dynamic insecurity risk is obtained by optimizing preventive and emergency control cost and contingency set decomposition. The effectiveness of this model has been proved by test results on the New England 10-generator 39-bus system.

  4. Statistical model specification and power: recommendations on the use of test-qualified pooling in analysis of experimental data.

    Science.gov (United States)

    Colegrave, Nick; Ruxton, Graeme D

    2017-03-29

    A common approach to the analysis of experimental data across much of the biological sciences is test-qualified pooling. Here non-significant terms are dropped from a statistical model, effectively pooling the variation associated with each removed term with the error term used to test hypotheses (or estimate effect sizes). This pooling is only carried out if statistical testing on the basis of applying that data to a previous more complicated model provides motivation for this model simplification; hence the pooling is test-qualified. In pooling, the researcher increases the degrees of freedom of the error term with the aim of increasing statistical power to test their hypotheses of interest. Despite this approach being widely adopted and explicitly recommended by some of the most widely cited statistical textbooks aimed at biologists, here we argue that (except in highly specialized circumstances that we can identify) the hoped-for improvement in statistical power will be small or non-existent, and there is likely to be much reduced reliability of the statistical procedures through deviation of type I error rates from nominal levels. We thus call for greatly reduced use of test-qualified pooling across experimental biology, more careful justification of any use that continues, and a different philosophy for initial selection of statistical models in the light of this change in procedure.

  5. Automatic Assessment of Pathological Voice Quality Using Higher-Order Statistics in the LPC Residual Domain

    Directory of Open Access Journals (Sweden)

    JiYeoun Lee

    2009-01-01

    A preprocessing scheme based on the linear prediction coefficient (LPC) residual is applied to higher-order statistics (HOSs) for automatic assessment of overall pathological voice quality. The normalized skewness and kurtosis are estimated from the LPC residual and show statistically meaningful distributions for characterizing pathological voice quality. 83 voice samples of the sustained vowel /a/ phonation are used in this study and are independently assessed by a speech and language therapist (SALT) according to the grade of severity of dysphonia on the GRBAS scale. These are used to train and test a classification and regression tree (CART). The best result is obtained using an optimal decision tree implemented with a combination of the normalized skewness and kurtosis, with an accuracy of 92.9%. It is concluded that the method can be used as an assessment tool, providing a valuable aid to the SALT during clinical evaluation of overall pathological voice quality.
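
    A minimal sketch of the preprocessing chain, with an assumed model order and no framing: estimate LPC coefficients by the autocorrelation method, form the prediction residual, and summarize it with normalized skewness and kurtosis. This is not the paper's exact pipeline.

```python
# Sketch: LPC residual + higher-order statistics; order and signal are assumptions.
import numpy as np
from scipy.linalg import solve_toeplitz
from scipy.stats import skew, kurtosis

def lpc_residual(x, order=12):
    """One-step linear-prediction residual via the autocorrelation method."""
    x = x - x.mean()
    r = np.correlate(x, x, mode="full")[len(x) - 1:]            # r[0..N-1]
    a = solve_toeplitz(r[:order], r[1:order + 1])               # predictor coeffs
    pred = np.convolve(x, np.concatenate(([0.0], a)))[:len(x)]  # sum_k a_k x[n-k]
    return x - pred

sig = np.random.default_rng(2).standard_normal(16000)  # stand-in for /a/ phonation
res = lpc_residual(sig)
print("skewness:", skew(res), "kurtosis:", kurtosis(res))
```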

  6. Statistics of the Chi-Square Type, with Application to the Analysis of Multiple Time-Series Power Spectra

    CERN Document Server

    Sturrock, P A

    2003-01-01

    It is often necessary to compare the power spectra of two or more time series: one may, for instance, wish to estimate what the power spectrum of the combined data sets might have been, or one may wish to estimate the significance of a particular peak that shows up in two or more power spectra. Also, one may occasionally need to search for a complex of peaks in a single power spectrum, such as a fundamental and one or more harmonics, or a fundamental plus sidebands, etc. Visual inspection can be revealing, but it can also be misleading. This leads one to look for one or more ways of forming statistics, which readily lend themselves to significance estimation, from two or more power spectra. The familiar chi-square statistic provides a convenient mechanism for combining variables drawn from normal distributions, and one may generalize the chi-square statistic to be any function of any number of variables with arbitrary distributions. In dealing with power spectra, we are interested mainly in exponential distri...
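
    The familiar chi-square combination the abstract starts from can be sketched directly; the paper's generalized statistics are not reproduced here. If each spectrum is normalized so its powers are approximately unit-mean exponential variates, then twice the sum of the powers at a given frequency across m independent spectra is approximately chi-square with 2m degrees of freedom, and a p-value follows from the survival function. The numbers below are illustrative.

```python
# Sketch: combining normalized spectral powers with a chi-square statistic.
import numpy as np
from scipy.stats import chi2

def combined_significance(powers):
    """p-value for the summed normalized powers of m independent spectra."""
    m = len(powers)
    z = 2.0 * np.sum(powers)          # 2*S_i ~ chi2(2) per spectrum
    return chi2.sf(z, df=2 * m)       # survival function of chi2(2m)

# e.g. the same peak shows powers 6.1, 5.4 and 4.8 in three normalized spectra:
print(combined_significance([6.1, 5.4, 4.8]))     # ~1e-5, far beyond chance
```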

  7. The significance of structural power in Strategic Environmental Assessment

    DEFF Research Database (Denmark)

    Hansen, Anne Merrild; Kørnøv, Lone; Cashmore, Matthew Asa;

    2013-01-01

    This article presents a study of how power dynamics enable and constrain the influence of actors upon decision-making and Strategic Environmental Assessment (SEA). Based on Anthony Giddens' structuration theory (ST), a model for studying power dynamics in strategic decision-making processes... that actors influence both the outcome and the frames of strategic decision making, and attention needs to be paid not only to the formal interactions between the SEA process and the strategic decision-making process but also to informal interaction and communication between actors. The informal structures show crucial... to the outcome of the decision-making process. The article is meant as a supplement to the understanding of how power dynamics influence IA processes, emphasising the capacity of agents to mobilise and create change. Despite the epistemological challenges of using ST as an approach to power analysis, this meta...

  8. Steady state security assessment in deregulated power systems

    Science.gov (United States)

    Manjure, Durgesh Padmakar

    Power system operations are undergoing changes, brought about primarily due to deregulation and subsequent restructuring of the power industry. The primary intention of the introduction of deregulation in power systems was to bring about competition and improved customer focus. The underlying motive was increased economic benefit. Present day power system analysis is much different than what it was earlier, essentially due to the transformation of the power industry from being cost-based to one that is price-based, and due to open access of transmission networks to the various market participants. Power is now treated as a commodity and is traded in an open market. The resultant interdependence of technical criteria and economic considerations has only accentuated the need for accurate analysis in power systems. The main impetus in security analysis studies is on efficient assessment of the post-contingency status of the system, accuracy being of secondary consideration. In most cases, given the time frame involved, it is not feasible to run a complete AC load flow to determine the post-contingency state of the system. Quite often it is not warranted either, as an indication of the state of the system is desired rather than exact quantification of the various state variables. With the inception of deregulation, transmission networks are subjected to a host of multilateral transactions, which influence physical system quantities like real power flows, security margins and voltage levels. For efficient asset utilization and maximization of revenue, more often than not, transmission networks are operated under stressed conditions, close to security limits. Therefore, a quantitative assessment of the extent to which each transaction adversely affects the transmission network is required. This needs to be done accurately, as the feasibility of the power transactions and subsequent decisions (execution, curtailment, pricing) would depend upon the...

  9. Data base of accident and agricultural statistics for transportation risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Saricks, C.L.; Williams, R.G.; Hopf, M.R.

    1989-11-01

    A state-level data base of accident and agricultural statistics has been developed to support risk assessment for transportation of spent nuclear fuels and high-level radioactive wastes. This data base will enhance the modeling capabilities for more route-specific analyses of potential risks associated with transportation of these wastes to a disposal site. The data base and methodology used to develop state-specific accident and agricultural data bases are described, and summaries of accident and agricultural statistics are provided. 27 refs., 9 tabs.

  10. The Statistical Analysis and Assessment of the Solvency of Forest Enterprises

    Directory of Open Access Journals (Sweden)

    Vyniatynska Liudmila V.

    2016-05-01

    The aim of the article is to conduct a statistical analysis of the solvency of forest enterprises through a system of statistical indicators using the sampling method (the sampling is based on the forest cover percentage of the regions of Ukraine). Financial statements of forest enterprises, which form a system of information and analytical support for the statistical analysis of the level of solvency of forestry in Ukraine, have been analyzed and evaluated for 2009-2015. With the help of the developed recommended values, the results of the statistical analysis of the forest enterprises' solvency under conditions of self-financing and commercial consideration have been summarized and systematized. Using the methodology of the statistical analysis of the forest enterprises' solvency, conducted on a corresponding conceptual framework that is relevant and meets current needs, a system of statistical indicators has been calculated, enabling assessment of the level of solvency of forest enterprises and identification of the reasons for a low level.

  11. Assessing the relative effectiveness of statistical downscaling and distribution mapping in reproducing rainfall statistics based on climate model results

    Science.gov (United States)

    Langousis, Andreas; Mamalakis, Antonios; Deidda, Roberto; Marrocu, Marino

    2016-01-01

    To improve the skill of climate models (CMs) in reproducing the statistics of daily rainfall at a basin level, two types of statistical approaches have been suggested. One is statistical correction of CM rainfall outputs based on historical series of precipitation. The other, usually referred to as statistical rainfall downscaling, is the use of stochastic models to conditionally simulate rainfall series, based on large-scale atmospheric forcing from CMs. While promising, the latter approach has attracted less attention in recent years, since the developed downscaling schemes involved complex weather identification procedures, while demonstrating limited success in reproducing several statistical features of rainfall. In a recent effort, Langousis and Kaleris developed a statistical framework for simulation of daily rainfall intensities conditional on upper-air variables, which is simpler to implement and more accurately reproduces several statistical properties of actual rainfall records. Here we study the relative performance of: (a) direct statistical correction of CM rainfall outputs using nonparametric distribution mapping, and (b) the statistical downscaling scheme of Langousis and Kaleris, in reproducing the historical rainfall statistics, including rainfall extremes, at a regional level. This is done for an intermediate-sized catchment in Italy, i.e., the Flumendosa catchment, using rainfall and atmospheric data from four CMs of the ENSEMBLES project. The obtained results are promising, since the proposed downscaling scheme is more accurate and robust in reproducing a number of historical rainfall statistics, independent of the CM used and the characteristics of the calibration period. This is particularly the case for yearly rainfall maxima.
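
    Nonparametric distribution mapping, option (a) above, can be sketched in a few lines: build a piecewise-linear transfer function from the model's empirical quantiles to the observed ones over a calibration period and apply it to new model output. Wet-day frequency correction and tail extrapolation, which a production scheme would need, are ignored here.

```python
# Minimal sketch of nonparametric distribution (quantile) mapping for CM rainfall.
import numpy as np

def quantile_map(cm_calib, obs_calib, cm_future):
    """Map model values to observed quantiles via the two empirical CDFs."""
    q = np.linspace(0.0, 1.0, 101)
    cm_q = np.quantile(cm_calib, q)               # model quantiles (calibration)
    obs_q = np.quantile(obs_calib, q)             # observed quantiles (calibration)
    return np.interp(cm_future, cm_q, obs_q)      # piecewise-linear transfer

# usage: corrected = quantile_map(cm_hist, obs_hist, cm_scenario)
```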

  12. A Sample Selection Strategy to Boost the Statistical Power of Signature Detection in Cancer Expression Profile Studies

    NARCIS (Netherlands)

    Jia, Zhenyu; Wang, Yipeng; Hu, Yuanjie; McLaren, Christine; Yu, Yingyan; Ye, Kai; Xia, Xiao-Qin; Koziol, James A.; Lernhardt, Waldemar; McClelland, Michael; Mercola, Dan

    2013-01-01

    In case-control profiling studies, increasing the sample size does not always improve statistical power because the variance may also be increased if samples are highly heterogeneous. For instance, tumor samples used for gene expression assay are often heterogeneous in terms of tissue composition or

  13. Mathematical Safety Assessment Approaches for Thermal Power Plants

    Directory of Open Access Journals (Sweden)

    Zong-Xiao Yang

    2014-01-01

    How to use system analysis methods to identify the hazards in the industrialized process, working environment, and production management for complex industrial processes, such as thermal power plants, is one of the challenges in systems engineering. A mathematical system safety assessment model is proposed for thermal power plants in this paper by integrating fuzzy analytical hierarchy process, set pair analysis, and system functionality analysis. On this basis, the key factors influencing thermal power plant safety are analyzed. The influence factors are determined based on the fuzzy analytical hierarchy process. The connection degree among the factors is obtained by set pair analysis. The system safety preponderant function is constructed through system functionality analysis for inherent properties and nonlinear influence. The decision analysis system is developed using active server page technology, web resource integration, and cross-platform capabilities for applications to the industrialized process. The applicability of the proposed safety assessment approach is verified on an actual thermal power plant, improving the enforceability and predictability of enterprise safety assessment.

  14. The intermediates take it all: asymptotics of higher criticism statistics and a powerful alternative based on equal local levels.

    Science.gov (United States)

    Gontscharuk, Veronika; Landwehr, Sandra; Finner, Helmut

    2015-01-01

    The higher criticism (HC) statistic, which can be seen as a normalized version of the famous Kolmogorov-Smirnov statistic, has a long history, dating back to the mid-seventies. Originally, HC statistics were used in connection with goodness of fit (GOF) tests, but they recently gained some attention in the context of testing the global null hypothesis in high dimensional data. The continuing interest in HC seems to be inspired by a series of nice asymptotic properties related to this statistic. For example, unlike Kolmogorov-Smirnov tests, GOF tests based on the HC statistic are known to be asymptotically sensitive in the moderate tails; hence they are favorably applied for detecting the presence of signals in sparse mixture models. However, some questions around the asymptotic behavior of the HC statistic are still open. We focus on two of them, namely, why a specific intermediate range is crucial for GOF tests based on the HC statistic and why the convergence of the HC distribution to the limiting one is extremely slow. Moreover, the inconsistency between the asymptotic and finite behavior of the HC statistic prompts us to provide a new HC test that has better finite properties than the original HC test while showing the same asymptotics. This test is motivated by the asymptotic behavior of the so-called local levels related to the original HC test. By means of numerical calculations and simulations we show that the new HC test is typically more powerful than the original HC test in normal mixture models.
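
    For orientation, the original (Donoho-Jin style) HC statistic can be sketched as below; the equal-local-levels modification proposed in the paper is not reproduced. Scanning only the smallest alpha0 fraction of sorted p-values is a conventional choice.

```python
# Sketch of the classical higher criticism statistic from n p-values.
import numpy as np

def higher_criticism(pvals, alpha0=0.5):
    """HC* = max over the smallest alpha0 fraction of sorted p-values."""
    p = np.clip(np.sort(pvals), 1e-12, 1 - 1e-12)   # avoid division by zero
    n = len(p)
    i = np.arange(1, n + 1)
    hc = np.sqrt(n) * (i / n - p) / np.sqrt(p * (1.0 - p))
    k = max(1, int(alpha0 * n))
    return hc[:k].max()

pvals = np.random.default_rng(8).uniform(size=1000)  # global null: HC stays moderate
print(higher_criticism(pvals))
```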

  15. Real-time dynamic security assessment of power grids

    Science.gov (United States)

    Kerin, Uros; Heyde, Chris; Krebs, Rainer; Lerch, Edwin

    2014-10-01

    This paper presents a dynamic security assessment solution which can be used in the power system control room to improve system stability. It is based on a set of security indices. The indices are able to establish the severity levels of contingencies as a measure of different aspects of power system security. A system based on fuzzy logic is used to combine the indices into a single composite index. The composite index is able to alert the control operator to network conditions that represent a significant risk to system security based on overall system performance.

  16. A systems assessment of the five Starlite tokamak power plants

    Energy Technology Data Exchange (ETDEWEB)

    Bathke, C.G.

    1996-07-01

    The ARIES team has assessed the power-plant attractiveness of the following five tokamak physics regimes: (1) steady state, first stability regime; (2) pulsed, first stability regime; (3) steady state, second stability regime; (4) steady state, reversed shear; and (5) steady state, low aspect ratio. Cost-based systems analysis of these five tokamak physics regimes suggests that an electric power plant based upon a reversed-shear tokamak is significantly more economical than one based on any of the other four physics regimes. Details of this comparative systems analysis are described herein.

  17. Identifying potentially induced seismicity and assessing statistical significance in Oklahoma and California

    CERN Document Server

    McClure, Mark; Chiu, Kitkwan; Ranganath, Rajesh

    2016-01-01

    In this study, we develop a statistical method for identifying induced seismicity from large datasets and apply the method to decades of wastewater disposal and seismicity data in California and Oklahoma. The method is robust against a variety of potential pitfalls. The study regions are divided into gridblocks. We use a longitudinal study design, seeking associations between seismicity and wastewater injection along time-series within each gridblock. The longitudinal design helps control for non-random application of wastewater injection. We define a statistical model that is flexible enough to describe the seismicity observations, which have temporal correlation and high kurtosis. In each gridblock, we find the maximum likelihood estimate for a model parameter that relates induced seismicity hazard to total volume of wastewater injected each year. To assess significance, we compute likelihood ratio test statistics in each gridblock and each state, California and Oklahoma. Resampling is used to empirically d...

  18. Study of creep cavity growth for power plant lifetime assessment

    Energy Technology Data Exchange (ETDEWEB)

    Wu Rui; Sandstroem, Rolf

    2001-01-01

    This report relates to the subproject on lifetime assessment by creep (livslängdsprediktering vid kryp), which is part of the project package on strength in high temperature power plants, KME 708. Physical creep damage mainly comprises cavities and their development. Wu and Sandstroem have observed that cavity size increases linearly with increasing creep strain in a 12%Cr steel. Sandstroem has shown that, based on the relations between the nucleation and growth of creep cavities with creep strain, the physical creep damage can be modelled as a function of creep strain. In the present paper the growth of the creep cavity radius R in relation to time t and strain ε in low alloy and 12%Cr steels, as well as a Type 347 steel, has been studied. The results show that power-law relations of cavity radius with creep time (R-t) and with creep strain (R-ε) hold for these materials at various testing conditions. The power-law R-t and R-ε relations are in most cases dependent and independent of testing conditions, respectively. The empirical power-law R-ε relations give a description of cavity evolution which can be used for lifetime assessment. Experimental data have also been compared with the estimates of the classical models for cavity growth, including the power-law growth due to Hancock, the diffusion growth due to Speight and Harris, the constrained diffusion growth models due to Dyson and due to Rice, and the enhanced diffusion growth due to Beere. It appears that the constrained diffusion growth models give a reasonable estimate of the R-ε relation in many cases. The diffusion growth model is applicable only in limited cases where the exponent of t in the R-t relation is about 1/3. The power-law and enhanced diffusion models are found in most cases to overestimate cavity growth.
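
    Fitting such an empirical power-law R-ε relation is a one-line regression in log space; the strain and radius values below are illustrative, not the report's measurements.

```python
# Sketch: fit R = A * eps^m by linear regression in log-log space (mock data).
import numpy as np

eps = np.array([0.005, 0.01, 0.02, 0.04, 0.08])   # creep strain (illustrative)
R = np.array([0.42, 0.55, 0.74, 0.98, 1.30])      # cavity radius, micrometres

m, logA = np.polyfit(np.log(eps), np.log(R), 1)   # slope m is the growth exponent
print(f"R ~ {np.exp(logA):.2f} * eps^{m:.2f}")
```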

  19. Multivariate Statistical Process Control and Case-Based Reasoning for situation assessment of Sequencing Batch Reactors

    OpenAIRE

    Ruiz Ordóñez, Magda Liliana

    2008-01-01

    This thesis focuses on the monitoring, fault detection and diagnosis of Wastewater Treatment Plants (WWTP), which are important fields of research for a wide range of engineering disciplines. The main objective is to evaluate and apply a novel artificial intelligence methodology based on situation assessment for monitoring and diagnosis of Sequencing Batch Reactor (SBR) operation. To this end, Multivariate Statistical Process Control (MSPC) in combination with Case-Based Reasoning (CBR)...

  20. Blind image quality assessment using statistical independence in the divisive normalization transform domain

    Science.gov (United States)

    Chu, Ying; Mou, Xuanqin; Fu, Hong; Ji, Zhen

    2015-11-01

    We present a general purpose blind image quality assessment (IQA) method using the statistical independence hidden in the joint distributions of divisive normalization transform (DNT) representations for natural images. The DNT simulates the redundancy reduction process of the human visual system and has good statistical independence for natural undistorted images; meanwhile, this statistical independence changes as the images suffer from distortion. Inspired by this, we investigate the changes in statistical independence between neighboring DNT outputs across the space and scale for distorted images and propose an independence uncertainty index as a blind IQA (BIQA) feature to measure the image changes. The extracted features are then fed into a regression model to predict the image quality. The proposed BIQA metric is called statistical independence (STAIND). We evaluated STAIND on five public databases: LIVE, CSIQ, TID2013, IRCCyN/IVC Art IQA, and intentionally blurred background images. The performances are relatively high for both single- and cross-database experiments. When compared with the state-of-the-art BIQA algorithms, as well as representative full-reference IQA metrics, such as SSIM, STAIND shows fairly good performance in terms of quality prediction accuracy, stability, robustness, and computational costs.

  1. A statistical assessment of differences and equivalences between genetically modified and reference plant varieties

    Directory of Open Access Journals (Sweden)

    Amzal Billy

    2011-02-01

    Background: Safety assessment of genetically modified organisms is currently often performed by comparative evaluation. However, natural variation of plant characteristics between commercial varieties is usually not considered explicitly in the statistical computations underlying the assessment. Results: Statistical methods are described for the assessment of the difference between a genetically modified (GM) plant variety and a conventional non-GM counterpart, and for the assessment of the equivalence between the GM variety and a group of reference plant varieties which have a history of safe use. It is proposed to present the results of both difference and equivalence testing for all relevant plant characteristics simultaneously in one or a few graphs, as an aid for further interpretation in safety assessment. A procedure is suggested to derive equivalence limits from the observed results for the reference plant varieties using a specific implementation of the linear mixed model. Three different equivalence tests are defined to classify any result in one of four equivalence classes. The performance of the proposed methods is investigated by a simulation study, and the methods are illustrated on compositional data from a field study on maize grain. Conclusions: A clear distinction of practical relevance is shown between difference and equivalence testing. The proposed tests are shown to have appropriate performance characteristics by simulation, and the proposed simultaneous graphical representation of results was found to be helpful for the interpretation of results from a practical field trial data set.
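
    For one endpoint, the difference/equivalence logic can be sketched with a standard t-test plus two one-sided tests (TOST). The equivalence limits below are a crude stand-in (reference mean +/- 1.96 SD) for the paper's mixed-model derivation, and all data are synthetic.

```python
# Sketch: difference test plus TOST equivalence test on synthetic data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
gm = rng.normal(10.2, 1.0, 20)        # GM variety measurements (synthetic)
conv = rng.normal(10.0, 1.0, 20)      # conventional counterpart (synthetic)
refs = rng.normal(10.0, 1.2, 120)     # pooled reference varieties (synthetic)

# Difference test: GM vs. its conventional counterpart
_, p_diff = stats.ttest_ind(gm, conv)

# Equivalence limits from the reference varieties (crude stand-in)
low = refs.mean() - 1.96 * refs.std(ddof=1)
high = refs.mean() + 1.96 * refs.std(ddof=1)

# TOST: equivalence is declared only if BOTH one-sided tests reject
se, df = stats.sem(gm), len(gm) - 1
p_low = stats.t.sf((gm.mean() - low) / se, df)     # H0: mean <= lower limit
p_high = stats.t.cdf((gm.mean() - high) / se, df)  # H0: mean >= upper limit
print(f"difference p = {p_diff:.3f}; equivalence p = {max(p_low, p_high):.4f}")
```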

  2. Statistical power to detect change in a mangrove shoreline fish community adjacent to a nuclear power plant.

    Science.gov (United States)

    Dolan, T E; Lynch, P D; Karazsia, J L; Serafy, J E

    2016-03-01

    An expansion is underway of a nuclear power plant on the shoreline of Biscayne Bay, Florida, USA. While the precise effects of its construction and operation are unknown, impacts on surrounding marine habitats and biota are considered by experts to be likely. The objective of the present study was to determine the adequacy of an ongoing monitoring survey of fish communities associated with mangrove habitats directly adjacent to the power plant to detect fish community changes, should they occur, at three spatial scales. Using seasonally resolved data recorded during 532 fish surveys over an 8-year period, power analyses were performed for four mangrove fish metrics: fish diversity, fish density, and the occurrence of two ecologically important fish species, gray snapper (Lutjanus griseus) and goldspotted killifish (Floridichthys carpio). Results indicated that the monitoring program at its current sampling intensity allows for detection of change; redesign of monitoring programs for improved, focused change detection deserves consideration from both ecological and cost-benefit perspectives.
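
    A simulation-based power check of this kind is straightforward to sketch. Everything below is illustrative rather than taken from the study: counts per survey are assumed Poisson, the hypothetical change is a 10% decline in mean density, and the survey numbers are arbitrary.

```python
# Sketch: simulation-based statistical power for detecting a density decline.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

def simulated_power(mu=5.0, decline=0.10, n=266, reps=2000, alpha=0.05):
    """Fraction of simulated before/after comparisons that detect the decline."""
    hits = 0
    for _ in range(reps):
        before = rng.poisson(mu, n)                       # counts before change
        after = rng.poisson(mu * (1.0 - decline), n)      # counts after decline
        if stats.mannwhitneyu(before, after).pvalue < alpha:
            hits += 1
    return hits / reps

print(f"power to detect a 10% decline in density: {simulated_power():.2f}")
```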

  3. Quantitative assessment of aquatic impacts of power plants

    Energy Technology Data Exchange (ETDEWEB)

    McKenzie, D.H.; Arnold, E.M.; Skalski, J.R.; Fickeisen, D.H.; Baker, K.S.

    1979-08-01

    Progress is reported in a continuing study of the design and analysis of aquatic environmental monitoring programs for assessing the impacts of nuclear power plants. Analysis of data from Calvert Cliffs, Pilgrim, and San Onofre nuclear power plants confirmed the generic applicability of the control-treatment pairing design suggested by McKenzie et al. (1977). Substantial progress was made on the simulation model evaluation task. A process notebook was compiled in which each model equation was translated into a standardized notation. Individual model testing and evaluating was started. The Aquatic Generalized Environmental Impact Simulator (AGEIS) was developed and will be tested using data from Lake Keowee, South Carolina. Further work is required to test the various models and perfect AGEIS for impact analyses at actual power plant sites. Efforts on the hydrologic modeling task resulted in a compendium of models commonly applied to nuclear power plants and the application of two well-received hydrodynamic models to data from the Surry Nuclear Power Plant in Virginia. Conclusions from the study of these models indicate that slight inaccuracies of boundary data have little influence on mass conservation and accurate bathymetry data are necessary for conservation of mass through the model calculations. The hydrologic modeling task provides valuable reference information for model users and monitoring program designers.

  4. Empirical assessment of published effect sizes and power in the recent cognitive neuroscience and psychology literature.

    Science.gov (United States)

    Szucs, Denes; Ioannidis, John P A

    2017-03-01

    We have empirically assessed the distribution of published effect sizes and estimated power by analyzing 26,841 statistical records from 3,801 cognitive neuroscience and psychology papers published recently. The reported median effect size was D = 0.93 (interquartile range: 0.64-1.46) for nominally statistically significant results and D = 0.24 (0.11-0.42) for nonsignificant results. Median power to detect small, medium, and large effects was 0.12, 0.44, and 0.73, reflecting no improvement through the past half-century. This is so because sample sizes have remained small. Assuming similar true effect sizes in both disciplines, power was lower in cognitive neuroscience than in psychology. Journal impact factors negatively correlated with power. Assuming a realistic range of prior probabilities for null hypotheses, false report probability is likely to exceed 50% for the whole literature. In light of our findings, the recently reported low replication success in psychology is realistic, and worse performance may be expected for cognitive neuroscience.
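
    Power figures of this kind can be reproduced in spirit (not in exact value, since the underlying records mix many designs) with a standard two-sample t-test calculation, assuming roughly 52 subjects per group for a total N of about 104:

```python
# Sketch: power at a typical sample size, and the n needed for 80% power.
from statsmodels.stats.power import TTestIndPower

tt = TTestIndPower()
for d in (0.2, 0.5, 0.8):                          # small / medium / large effects
    power = tt.power(effect_size=d, nobs1=52, alpha=0.05)
    n80 = tt.solve_power(effect_size=d, power=0.8, alpha=0.05)
    print(f"d={d}: power at 52/group = {power:.2f}; n/group for 80% power = {n80:.0f}")
```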

  5. Empirical assessment of published effect sizes and power in the recent cognitive neuroscience and psychology literature

    Science.gov (United States)

    Szucs, Denes; Ioannidis, John P. A.

    2017-01-01

    We have empirically assessed the distribution of published effect sizes and estimated power by analyzing 26,841 statistical records from 3,801 cognitive neuroscience and psychology papers published recently. The reported median effect size was D = 0.93 (interquartile range: 0.64–1.46) for nominally statistically significant results and D = 0.24 (0.11–0.42) for nonsignificant results. Median power to detect small, medium, and large effects was 0.12, 0.44, and 0.73, reflecting no improvement through the past half-century. This is so because sample sizes have remained small. Assuming similar true effect sizes in both disciplines, power was lower in cognitive neuroscience than in psychology. Journal impact factors negatively correlated with power. Assuming a realistic range of prior probabilities for null hypotheses, false report probability is likely to exceed 50% for the whole literature. In light of our findings, the recently reported low replication success in psychology is realistic, and worse performance may be expected for cognitive neuroscience. PMID:28253258

  6. A power comparison of generalized additive models and the spatial scan statistic in a case-control setting

    Science.gov (United States)

    2010-01-01

    Background: A common, important problem in spatial epidemiology is measuring and identifying variation in disease risk across a study region. In application of statistical methods, the problem has two parts. First, spatial variation in risk must be detected across the study region and, second, areas of increased or decreased risk must be correctly identified. The location of such areas may give clues to environmental sources of exposure and disease etiology. One statistical method applicable in spatial epidemiologic settings is a generalized additive model (GAM), which can be applied with a bivariate LOESS smoother to account for geographic location as a possible predictor of disease status. A natural hypothesis when applying this method is whether residential location of subjects is associated with the outcome, i.e. is the smoothing term necessary? Permutation tests are a reasonable hypothesis testing method and provide adequate power under a simple alternative hypothesis. These tests have yet to be compared to other spatial statistics. Results: This research uses simulated point data generated under three alternative hypotheses to evaluate the properties of the permutation methods and compare them to the popular spatial scan statistic in a case-control setting. Case 1 was a single circular cluster centered in a circular study region. The spatial scan statistic had the highest power, though the GAM method estimates did not fall far behind. Case 2 was a single point source located at the center of a circular cluster and Case 3 was a line source at the center of the horizontal axis of a square study region. Each had linearly decreasing log-odds with distance from the source. The GAM methods outperformed the scan statistic in Cases 2 and 3. Comparing sensitivity, measured as the proportion of the exposure source correctly identified as high or low risk, the GAM methods outperformed the scan statistic in all three cases. Conclusions: The GAM permutation testing methods...

  7. Impact of statistical learning methods on the predictive power of multivariate normal tissue complication probability models

    NARCIS (Netherlands)

    Xu, Cheng-Jian; van der Schaaf, Arjen; Schilstra, Cornelis; Langendijk, Johannes A.; van t Veld, Aart A.

    2012-01-01

    PURPOSE: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. METHODS AND MATERIALS: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator

  8. Sex differences in discriminative power of volleyball game-related statistics.

    Science.gov (United States)

    João, Paulo Vicente; Leite, Nuno; Mesquita, Isabel; Sampaio, Jaime

    2010-12-01

    To identify sex differences in volleyball game-related statistics, the game-related statistics of several World Championships in 2007 (N = 132) were analyzed using the software VIS from the International Volleyball Federation. Discriminant analysis was used to identify the game-related statistics which better discriminated performances by sex. The analysis emphasised fault serves (SC = -.40), shot spikes (SC = .40), and reception digs (SC = .31). Considerable variability was evident in the game-related statistics profiles: men's volleyball games were better associated with terminal actions (errors of service), whereas women's volleyball games were characterized by continuous actions (in defense and attack). These differences may be related to the anthropometric and physiological differences between women and men and their influence on performance profiles.
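
    The style of analysis, descriptive discriminant analysis with structure coefficients (SC), can be sketched on mock data; the group means, spreads, and match counts below are invented for illustration.

```python
# Sketch: discriminant analysis with structure coefficients on mock match stats.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(5)
n = 66                                             # matches per sex (illustrative)
X_men = rng.normal([14, 30, 20], 3, size=(n, 3))   # fault serves, shot spikes, reception digs
X_wom = rng.normal([10, 26, 24], 3, size=(n, 3))
X = np.vstack([X_men, X_wom])
y = np.array([0] * n + [1] * n)

lda = LinearDiscriminantAnalysis().fit(X, y)
scores = lda.transform(X).ravel()                  # one discriminant axis (2 groups)
# Structure coefficients: correlation of each variable with the discriminant scores
sc = [np.corrcoef(X[:, j], scores)[0, 1] for j in range(X.shape[1])]
print("structure coefficients:", np.round(sc, 2))
```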

  9. Preliminary environmental assessment for the satellite power system (SPS)

    Energy Technology Data Exchange (ETDEWEB)

    1978-10-01

    A preliminary assessment of the impact of the Satellite Power System (SPS) on the environment is presented. Information that has appeared in documents referenced herein is integrated and assimilated. The state-of-knowledge as perceived from recently completed DOE-sponsored studies is disclosed, and prospective research and study programs that can advance the state-of-knowledge and provide an expanded data base for use in an assessment planned for 1980 are defined. Alternatives for research that may be implemented in order to achieve this advancement are also discussed in order that a plan can be selected which will be consistent with the fiscal and time constraints on the SPS Environmental Assessment Program. Health and ecological effects of microwave radiation, nonmicrowave effects on health and the environment (terrestrial operations and space operations), effects on the atmosphere, and effects on communications systems are examined in detail. (WHK)

  10. Observer variability in the assessment of type and dysplasia of colorectal adenomas, analyzed using kappa statistics

    DEFF Research Database (Denmark)

    Jensen, P; Krogsgaard, M R; Christiansen, J

    1995-01-01

    ... of adenomas were assessed twice by three experienced pathologists, with an interval of two months. Results were analyzed using kappa statistics. RESULTS: For agreement between the first and second assessment (both type and grade of dysplasia), kappa values for the three specialists were 0.5345, 0.9022, and 0... The kappa values for Observer A vs. B and Observer C vs. B were 0.3480 and 0.3770, respectively (both type and dysplasia). Values for type were better than for dysplasia, but agreement was only fair to moderate. CONCLUSION: The intraobserver agreement was moderate to almost perfect, but the interobserver agreement was only fair to moderate. A simpler classification system or a centralization of assessments would probably increase kappa values.
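
    Cohen's kappa itself is readily computed; the toy ratings below are invented and merely show the mechanics for two raters classifying the same adenomas.

```python
# Sketch: Cohen's kappa for two raters on mock adenoma classifications.
from sklearn.metrics import cohen_kappa_score

rater_a = ["tubular", "tubulovillous", "tubular", "villous", "tubular", "tubulovillous"]
rater_b = ["tubular", "tubular", "tubular", "villous", "tubular", "villous"]

# Kappa corrects raw agreement for agreement expected by chance;
# prints roughly 0.45 for these toy ratings (1.0 would be perfect agreement).
print(f"kappa = {cohen_kappa_score(rater_a, rater_b):.2f}")
```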

  11. The Development of Statistics Textbook Supported with ICT and Portfolio-Based Assessment

    Science.gov (United States)

    Hendikawati, Putriaji; Yuni Arini, Florentina

    2016-02-01

    This research was development research aimed at producing a model Statistics textbook supported with information and communication technology (ICT) and portfolio-based assessment. The book was designed for mathematics students at college level, to improve students' ability in mathematical connection and communication. There were three stages in this research, i.e. define, design, and develop. The textbook consists of 10 chapters, each containing an introduction, core materials, examples, and exercises. The development phase began with the initial design of the book (draft 1), which was then validated by experts. Revision of draft 1 produced draft 2, which was then given a limited readability test. Furthermore, revision of draft 2 produced textbook draft 3, which was trialled on a small sample to produce a valid model textbook. The data were analysed with descriptive statistics. The analysis showed that the model Statistics textbook supported with ICT and portfolio-based assessment is valid and fulfils the criteria of practicality.

  12. New statistical potential for quality assessment of protein models and a survey of energy functions

    Directory of Open Access Journals (Sweden)

    Rykunov Dmitry

    2010-03-01

    Background: Scoring functions, such as molecular mechanics forcefields and statistical potentials, are fundamentally important tools in protein structure modeling and quality assessment. Results: The performances of a number of publicly available scoring functions are compared with statistical rigor, with an emphasis on knowledge-based potentials. We explored the effect on accuracy of alternative choices for representing interaction center types and other features of scoring functions, such as using information on solvent accessibility and torsion angles, accounting for secondary structure preferences, and side chain orientation. Partially based on the observations made, we present a novel residue-based statistical potential, which employs a shuffled reference state definition and takes into account the mutual orientation of residue side chains. Atom- and residue-level statistical potentials and Linux executables to calculate the energy of a given protein proposed in this work can be downloaded from http://www.fiserlab.org/potentials. Conclusions: Among the most influential terms we observed a critical role of a proper reference state definition and the benefits of including information about the microenvironment of interaction centers. Molecular mechanics potentials were also tested and found to be over-sensitive to small local imperfections in a structure, requiring unfeasibly long energy relaxation before energy scores started to correlate with model quality.

  13. Optimal Power Allocation for CC-HARQ-based Cognitive Radio with Statistical CSI in Nakagami Slow Fading Channels

    Science.gov (United States)

    Xu, Ding; Li, Qun

    2017-01-01

    This paper addresses the power allocation problem for cognitive radio (CR) based on hybrid automatic repeat request (HARQ) with chase combining (CC) in Nakagami-m slow fading channels. We assume that, instead of perfect instantaneous channel state information (CSI), only statistical CSI is available at the secondary user (SU) transmitter. The aim is to minimize the SU outage probability under the primary user (PU) interference outage constraint. Using the Lagrange multiplier method, an iterative and recursive algorithm is derived to obtain the optimal power allocation for each transmission round. Extensive numerical results are presented to illustrate the performance of the proposed algorithm.

  14. Influence of motor unit firing statistics on the median frequency of the EMG power spectrum

    NARCIS (Netherlands)

    van Boxtel, Anton; Schomaker, L R

    1984-01-01

    Changes in the EMG power spectrum during static fatiguing contractions are often attributed to changes in muscle fibre action potential conduction velocity. Mathematical models of the EMG power spectrum, which have been empirically confirmed, predict that under certain conditions a distinct maximum

  15. Understanding Statistical Power in Cluster Randomized Trials: Challenges Posed by Differences in Notation and Terminology

    Science.gov (United States)

    Spybrook, Jessaca; Hedges, Larry; Borenstein, Michael

    2014-01-01

    Research designs in which clusters are the unit of randomization are quite common in the social sciences. Given the multilevel nature of these studies, the power analyses for these studies are more complex than in a simple individually randomized trial. Tools are now available to help researchers conduct power analyses for cluster randomized…

  16. Suppressing the non-Gaussian statistics of Renewable Power from Wind and Solar

    CERN Document Server

    Anvari, M; Tabar, M Reza Rahimi; Wächter, M; Milan, P; Heinemann, D; Peinke, Joachim; Lorenz, E

    2015-01-01

    The power from wind and solar exhibits a nonlinear flickering variability, which typically occurs at time scales of a few seconds. We show that high-frequency monitoring of such renewable powers enables us to detect a transition, controlled by the field size, where the output power qualitatively changes its behaviour from a flickering type to a diffusive stochastic behaviour. We find that the intermittency and strong non-Gaussian behavior in the cumulative power of the total field survives for both renewable sources, even for a country-wide installation. To overcome the short-time intermittency, we introduce a time-delayed feedback method for the power output of wind farms and solar fields that can further change the underlying stochastic process and suppress its strong non-Gaussian fluctuations.

  17. OVERVIEW OF ENVIRONMENTAL ASSESSMENT FOR CHINA NUCLEAR POWER INDUSTRY AND COAL-FIRED POWER INDUSTRY

    Institute of Scientific and Technical Information of China (English)

    张少华; 潘自强; et al.

    1994-01-01

    A quantitative environmental assessment method and the corresponding computer code are introduced in this paper. Considering all fuel cycle steps, it finds that the public health risk of the China nuclear power industry is 5.2×10⁻¹ man/(GW·a), the occupational health risk is 2.5 man/(GW·a), and the total health risk is 3.0 man/(GW·a). From the health risk calculation for coal mining, transport, burning and ash disposal, it finds that the public health risk of the China coal-fired power industry is 3.6 man/(GW·a), the occupational health risk is 50 man/(GW·a), and the total is 54 man/(GW·a). Accordingly, the conclusion that the China nuclear power industry is an industry with high safety and cleanness is derived at the end.

  18. The N-pact factor: evaluating the quality of empirical journals with respect to sample size and statistical power.

    Science.gov (United States)

    Fraley, R Chris; Vazire, Simine

    2014-01-01

    The authors evaluate the quality of research reported in major journals in social-personality psychology by ranking those journals with respect to their N-pact Factors (NF): the statistical power of the empirical studies they publish to detect typical effect sizes. Power is a particularly important attribute for evaluating research quality because, relative to studies that have low power, studies that have high power are more likely (a) to provide accurate estimates of effects, (b) to produce literatures with low false positive rates, and (c) to lead to replicable findings. The authors show that the average sample size in social-personality research is 104 and that the power to detect the typical effect size in the field is approximately 50%. Moreover, they show that there is considerable variation among journals in the sample sizes and power of the studies they publish, with some journals consistently publishing higher power studies than others. The authors hope that these rankings will be of use to authors who are choosing where to submit their best work, provide hiring and promotion committees with a superior way of quantifying journal quality, and encourage competition among journals to improve their NF rankings.

  19. Power spectral density in balance assessment. Description of methodology.

    Science.gov (United States)

    Syczewska, Małgorzata; Zielińska, Teresa

    2010-01-01

    One of the methods used in clinical settings to assess the balance function is measurement of the centre of pressure (COP) trajectory. The COP trajectory is strongly dependent on the body centre of mass (COM) trajectory, but in the case of balance problems corrective signals influence this dependence. The aim of the present study is to explore the possibility of using the power spectral density function of the COP vs. COM signal to assess the amount of correction signals. As the aim was a methodological one, only one healthy adult subject participated in the study. This subject performed five balance tasks of increasing difficulty. The COP trajectory was recorded using a Kistler force plate, and the COM trajectory was calculated from the trajectories of markers placed on the subject's body, simultaneously recorded with a VICON 460 system. The COM data were subtracted from the COP trajectory in the anteroposterior (AP) and lateral directions. Next, the power spectral density (PSD) was calculated for the new signals. The power spectral density is very low for the easiest condition but increases with the difficulty of the task. Moreover, it also provides information on the plane (sagittal or frontal) in which more corrective movements are needed to maintain stability.
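
    A minimal sketch of the described computation on synthetic signals, with an assumed force-plate sampling rate: subtract the COM trajectory from the COP trajectory and estimate the power spectral density of the difference with Welch's method.

```python
# Sketch: PSD of the COP-COM difference signal; all signals are synthetic.
import numpy as np
from scipy.signal import welch

fs = 100.0                                          # Hz (assumed force-plate rate)
t = np.arange(0, 30, 1 / fs)
com = 2.0 * np.sin(2 * np.pi * 0.3 * t)             # slow sway: mock COM (AP, mm)
cop = com + 0.5 * np.random.default_rng(6).standard_normal(t.size)  # corrections

f, pxx = welch(cop - com, fs=fs, nperseg=1024)      # PSD of the correction signal
band = (f >= 0.5) & (f <= 5.0)
band_power = np.sum(pxx[band]) * (f[1] - f[0])      # integrate over 0.5-5 Hz
print(f"correction power in 0.5-5 Hz band: {band_power:.3f} mm^2")
```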

  20. Wide Area Measurement Based Security Assessment & Monitoring of Modern Power System: A Danish Power System Case Study

    DEFF Research Database (Denmark)

    Rather, Zakir Hussain; Chen, Zhe; Thøgersen, Paul

    2013-01-01

    Power system security has become a major concern across the global power system community. This paper presents wide area measurement system (WAMS) based security assessment and monitoring of a modern power system. A new three-dimensional security index (TDSI) has been proposed for online security... monitoring of a modern power system with large-scale renewable energy penetration. Phasor measurement unit (PMU) based WAMS has been implemented in the western Danish power system to realize online security monitoring and assessment in the power system control center. The proposed security monitoring system has been...

  1. Assessment of environmental external effects in power generation

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, H.; Morthorst, P.E.; Schleisner, L. [Risoe National Lab. (Denmark); Meyer, N.I.; Nielsen, P.S.; Nielsen, V. [The Technical Univ. of Denmark (Denmark)

    1996-12-01

    This report summarises some of the results achieved in a project carried out in Denmark in 1994 concerning externalities. The main objective was to identify, quantify and - if possible - monetize the external effects in the production of energy, especially in relation to renewable technologies. The report compares environmental externalities in the production of energy using renewable and non-renewable energy sources, respectively. The comparison is demonstrated on two specific case studies. The first case is the production of electricity based on wind power plants compared to the production of electricity based on a coal-fired conventional plant. In the second case heat/power generation by means of a combined heat and power plant based on biomass-generated gas is compared to that of a combined heat and power plant fuelled by natural gas. In the report the individual externalities from the different ways of producing energy are identified, the stress caused by the effect is assessed, and finally the monetary value of the damage is estimated. The method is applied to the local as well as the regional and global externalities. (au) 8 tabs., 7 ills., 4 refs.

  2. Market assessment of photovoltaic power systems for agricultural applications worldwide

    Science.gov (United States)

    Cabraal, A.; Delasanta, D.; Rosen, J.; Nolfi, J.; Ulmer, R.

    1981-01-01

    Agricultural sector PV market assessments conducted in the Philippines, Nigeria, Mexico, Morocco, and Colombia are extrapolated worldwide. The types of applications evaluated are those requiring less than 15 kW of power and operating in a stand-alone mode. The major conclusions were as follows: PV will be competitive in applications requiring 2 to 3 kW of power prior to 1983; by 1986 PV system competitiveness will extend to applications requiring 4 to 6 kW of power; due to capital constraints, the private sector market may be restricted to applications requiring less than about 2 kW of power; the ultimate purchasers of larger systems will be governments, either through direct purchase or loans from development banks. Though fragmented, a significant agricultural sector market for PV exists; however, the market for PV in telecommunications, signalling, rural services, and TV will be larger. Major market-related factors influencing the potential for U.S. PV sales are: lack of awareness; high first costs; shortage of long-term capital; competition from German, French and Japanese companies who have government support; and low fuel prices in capital-surplus countries. Strategies that may aid in overcoming some of these problems are: setting up a trade association aimed at overcoming problems due to lack of awareness, innovative financing schemes such as lease arrangements, and designing products to match current user needs as opposed to attempting to change consumer behavior.

  3. Quadrennial Technology Review 2015: Technology Assessments--Wind Power

    Energy Technology Data Exchange (ETDEWEB)

    none,

    2015-10-07

    Wind power has become a mainstream power source in the U.S. electricity portfolio, supplying 4.9% of the nation's electricity demand in 2014. With more than 65 GW installed across 39 states at the end of 2014, utility-scale wind power is a cost-effective source of low-emissions power generation throughout much of the nation. The United States has significant sustainable land-based and offshore wind resource potential, greater than 10 times current total U.S. electricity consumption. A technical wind resource assessment conducted by the Department of Energy (DOE) in 2009 estimated that the land-based wind energy potential for the contiguous United States is equivalent to 10,500 GW of capacity at an 80 meter (m) hub height and 12,000 GW at a 100 m hub height, assuming a capacity factor of at least 30%. A subsequent 2010 DOE report estimated the technical offshore wind energy potential to be 4,150 GW. The estimate was calculated from the total offshore area within 50 nautical miles of shore, in areas where average annual wind speeds are at least 7 m per second at a hub height of 90 m.

  4. Market assessment of photovoltaic power systems for agricultural applications worldwide

    Science.gov (United States)

    Cabraal, A.; Delasanta, D.; Rosen, J.; Nolfi, J.; Ulmer, R.

    1981-11-01

    Agricultural sector PV market assessments conducted in the Philippines, Nigeria, Mexico, Morocco, and Colombia are extrapolated worldwide. The types of applications evaluated are those requiring less than 15 kW of power and operating in a stand-alone mode. The major conclusions were as follows: PV will be competitive in applications requiring 2 to 3 kW of power prior to 1983; by 1986 PV system competitiveness will extend to applications requiring 4 to 6 kW of power; due to capital constraints, the private sector market may be restricted to applications requiring less than about 2 kW of power; the ultimate purchasers of larger systems will be governments, either through direct purchase or loans from development banks. Though fragmented, a significant agricultural sector market for PV exists; however, the market for PV in telecommunications, signalling, rural services, and TV will be larger. Major market-related factors influencing the potential for U.S. PV sales are: lack of awareness; high first costs; shortage of long-term capital; competition from German, French and Japanese companies who have government support; and low fuel prices in capital-surplus countries. Strategies that may aid in overcoming some of these problems are: setting up a trade association aimed at overcoming problems due to lack of awareness, innovative financing schemes such as lease arrangements, and designing products to match current user needs as opposed to attempting to change consumer behavior.

  5. Power Spectrum Analysis and Missing Level Statistics of Microwave Graphs with Violated Time Reversal Invariance

    Science.gov (United States)

    Białous, Małgorzata; Yunko, Vitalii; Bauch, Szymon; Ławniczak, Michał; Dietz, Barbara; Sirko, Leszek

    2016-09-01

    We present experimental studies of the power spectrum and other fluctuation properties in the spectra of microwave networks simulating chaotic quantum graphs with violated time reversal invariance. On the basis of our data sets, we demonstrate that the power spectrum in combination with other long-range and also short-range spectral fluctuations provides a powerful tool for the identification of the symmetries and the determination of the fraction of missing levels. Such a procedure is indispensable for the evaluation of the fluctuation properties in the spectra of real physical systems like, e.g., nuclei or molecules, where one has to deal with the problem of missing levels.

  6. Blinking in quantum dots: The origin of the grey state and power law statistics

    Science.gov (United States)

    Ye, Mao; Searson, Peter C.

    2011-09-01

    Quantum dot (QD) blinking is characterized by switching between an “on” state and an “off” state, and a power-law distribution of on and off times with exponents from 1.0 to 2.0. The origin of blinking behavior in QDs, however, has remained a mystery. Here we describe an energy-band model for QDs that captures the full range of blinking behavior reported in the literature and provides new insight into features such as the gray state, the power-law distribution of on and off times, and the power-law exponents.
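
    The exponents quoted above are typically estimated from measured on/off durations by maximum likelihood. A minimal Python sketch, assuming a continuous power law above a cutoff t_min and using synthetic draws rather than QD measurements:

        import numpy as np

        rng = np.random.default_rng(1)
        t_min, alpha = 0.01, 1.5           # assumed cutoff (s) and true exponent
        # Inverse-transform sampling from p(t) ~ t^-alpha for t >= t_min
        u = rng.random(10000)
        t = t_min * (1.0 - u) ** (-1.0 / (alpha - 1.0))

        # Maximum-likelihood estimate of the exponent (Clauset et al. form)
        alpha_hat = 1.0 + t.size / np.log(t / t_min).sum()
        print(alpha_hat)                   # should recover ~1.5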

  7. Optical methodology for the health assessment of power transformers

    Science.gov (United States)

    Palmer, John A.; Wang, Xianghui; Shoureshi, Rahmat A.; Mander, Arthur A.; Torgerson, Duane

    2000-06-01

    Among the most critical components in the electric power system is the power transformer. As such, a significant body of research has been put forward to anticipate transformer maintenance needs. Traditional health assessment has required sampling of oil for submission to a laboratory for analysis, but this has been deemed undesirable in light of budgetary constraints on maintenance staffing and new predictive maintenance philosophies for substation equipment. A number of processes have been developed in recent years for on-line health assessment of transformers, most of which have focused on dissolved gas analysis. This paper describes a novel optical methodology for on-line transformer health assessment that utilizes an ultraviolet absorption measurement to identify the degradation of the transformer oil. An optical system was selected because of its immunity to the electromagnetic noise typical of substations, and because of the minimal impact that non-conducting materials have on the insulation system design of the transformer. The system is designed to identify deterioration and premature aging resulting from overheating, low-level arcing, or excessive exposure to atmospheric air. The system consists of a light source, filter, guide and detection components, and a very simple computational requirement. The measurements performed with the prototype system are validated with a high-precision spectrophotometry measurement and an independent oil-testing laboratory.

  8. Statistical Approaches Used to Assess the Equity of Access to Food Outlets: A Systematic Review

    Directory of Open Access Journals (Sweden)

    Karen E. Lamb

    2015-07-01

    Full Text Available Background: Inequalities in eating behaviours are often linked to the types of food retailers accessible in neighbourhood environments. Numerous studies have aimed to identify if access to healthy and unhealthy food retailers is socioeconomically patterned across neighbourhoods, and thus a potential risk factor for dietary inequalities. Existing reviews have examined differences between methodologies, particularly focussing on neighbourhood and food outlet access measure definitions. However, no review has informatively discussed the suitability of the statistical methodologies employed; a key issue determining the validity of study findings. Our aim was to examine the suitability of statistical approaches adopted in these analyses. Methods: Searches were conducted for articles published from 2000-2014. Eligible studies included objective measures of the neighbourhood food environment and neighbourhood-level socio-economic status, with a statistical analysis of the association between food outlet access and socio-economic status. Results: Fifty-four papers were included. Outlet accessibility was typically defined as the distance to the nearest outlet from the neighbourhood centroid, or as the number of food outlets within a neighbourhood (or buffer). To assess if these measures were linked to neighbourhood disadvantage, common statistical methods included ANOVA, correlation, and Poisson or negative binomial regression. Although all studies involved spatial data, few considered spatial analysis techniques or spatial autocorrelation. Conclusions: With advances in GIS software, sophisticated measures of neighbourhood outlet accessibility can be considered. However, approaches to statistical analysis often appear less sophisticated. Care should be taken to consider assumptions underlying the analysis and the possibility of spatially correlated residuals which could affect the results.
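
    As a rough illustration of the count-regression approach named above, the sketch below fits a Poisson GLM of outlet counts on a neighbourhood SES score and checks for overdispersion (which would motivate a negative binomial model instead); all data are simulated, not taken from any reviewed study.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 200                                  # hypothetical neighbourhoods
        ses = rng.normal(size=n)                 # neighbourhood SES score
        counts = rng.poisson(np.exp(1.0 - 0.3 * ses))  # toy outlet counts

        X = sm.add_constant(ses)
        fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
        print(fit.params)                        # intercept and SES effect
        # Overdispersion check: Pearson chi2 / df >> 1 suggests negative binomial
        print(fit.pearson_chi2 / fit.df_resid)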

  9. Security assessment for intentional island operation in modern power system

    DEFF Research Database (Denmark)

    Chen, Yu; Xu, Zhao; Østergaard, Jacob

    2011-01-01

    There has been a high penetration level of Distributed Generations (DGs) in distribution systems in Denmark. Even more DGs are expected to be installed in the coming years. With that, to utilize them in maintaining the security of power supply is of great concern for Danish utilities. During the emergency in the power system, some distribution networks may be intentionally separated from the main grid to avoid complete system collapse. If DGs in those networks could continuously run instead of immediately being shut down, the blackout could be avoided and the reliability of supply could be improved. ... the operator can clearly know if it is suitable to conduct island operation at one specific moment. Besides, in order to improve the computation efficiency, the Artificial Neural Network (ANN) is applied for fast ISR formation. Thus, online application of ISR-based islanding security assessment could be achieved.

  10. A follow-up power analysis of the statistical tests used in the Journal of Research in Science Teaching

    Science.gov (United States)

    Woolley, Thomas W.; Dawson, George O.

    It has been two decades since the first power analysis of a psychological journal and 10 years since the Journal of Research in Science Teaching made its contribution to this debate. One purpose of this article is to investigate what power-related changes, if any, have occurred in science education research over the past decade as a result of the earlier survey. In addition, previous recommendations are expanded and expounded upon within the context of more recent work in this area. The absence of any consistent mode of presenting statistical results, as well as little change with regard to power-related issues, is reported. Guidelines for reporting the minimal amount of information demanded for clear and independent evaluation of research results by readers are also proposed.

  11. Characterizing Key Developmental Understandings and Pedagogically Powerful Ideas within a Statistical Knowledge for Teaching Framework

    Science.gov (United States)

    Groth, Randall E.

    2013-01-01

    A hypothetical framework to characterize statistical knowledge for teaching (SKT) is described. Empirical grounding for the framework is provided by artifacts from an undergraduate course for prospective teachers that concentrated on the development of SKT. The theoretical notion of "key developmental understanding" (KDU) is used to identify…

  12. Statistical Power in Evaluations That Investigate Effects on Multiple Outcomes: A Guide for Researchers

    Science.gov (United States)

    Porter, Kristin E.

    2016-01-01

    In education research and in many other fields, researchers are often interested in testing the effectiveness of an intervention on multiple outcomes, for multiple subgroups, at multiple points in time, or across multiple treatment groups. The resulting multiplicity of statistical hypothesis tests can lead to spurious findings of effects. Multiple…

  13. The Surprising Power of Statistical Learning: When Fragment Knowledge Leads to False Memories of Unheard Words

    Science.gov (United States)

    Endress, Ansgar D.; Mehler, Jacques

    2009-01-01

    Word-segmentation, that is, the extraction of words from fluent speech, is one of the first problems language learners have to master. It is generally believed that statistical processes, in particular those tracking "transitional probabilities" (TPs), are important to word-segmentation. However, there is evidence that word forms are stored in…

  14. Using statistical analysis and artificial intelligence tools for automatic assessment of video sequences

    Science.gov (United States)

    Ekobo Akoa, Brice; Simeu, Emmanuel; Lebowsky, Fritz

    2014-01-01

    This paper proposes two novel approaches to Video Quality Assessment (VQA). Both approaches attempt to develop video evaluation techniques capable of replacing human judgment when rating video quality in subjective experiments. The underlying study consists of selecting fundamental quality metrics based on Human Visual System (HVS) models and using artificial intelligence solutions as well as advanced statistical analysis. This new combination enables suitable video quality ratings while taking as input multiple quality metrics. The first method uses a neural network based machine learning process. The second method consists of evaluating video quality using a non-linear regression model. The efficiency of the proposed methods is demonstrated by comparing their results with those of existing work done on synthetic video artifacts. The results obtained by each method are compared with scores from a database resulting from subjective experiments.

  15. A Framework for Assessing the Commercialization of Photovoltaic Power Generation

    Science.gov (United States)

    Yaqub, Mahdi

    An effective framework does not currently exist with which to assess the viability of commercializing photovoltaic (PV) power generation in the US energy market. Adopting a new technology, such as utility-scale PV power generation, requires a commercialization assessment framework. The framework developed here assesses the economic viability of a set of alternatives of identified factors. Economic viability focuses on simulating the levelized cost of electricity (LCOE) as a key performance measure to realize `grid parity', or the equivalence between the PV electricity prices and grid electricity prices for established energy technologies. Simulation results confirm that `grid parity' could be achieved without the current federal 30% investment tax credit (ITC) via a combination of three strategies: 1) using economies of scale to reduce the LCOE by 30% from its current value of 3.6 cents/kWh to 2.5 cents/kWh, 2) employing a longer power purchase agreement (PPA) over 30 years at a 4% interest rate, and 3) improving by 15% the "capacity factor", which is the ratio of the total annual generated energy to the full potential annual generation when the utility is continuously operating at its rated output. The lower than commercial-market interest rate of 4% that is needed to realize `grid parity' is intended to replace the current federal 30% ITC subsidy, which does not have a cash inflow to offset the outflow of subsidy payments. The 4% interest rate can be realized through two proposed finance plans: The first plan involves the implementation of carbon fees on polluting power plants to produce the capital needed to lower the utility PPA loan term interest rate from its current 7% to the necessary 4% rate. The second plan entails a proposed public debt finance plan. Under this plan, the US Government leverages its guarantee power to issue bonds and uses the proceeds to finance the construction and operation of PV power plants with PPA loan with a 4% interest rate for a
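
    A minimal sketch of an LCOE calculation of the kind the framework simulates (discounted lifetime cost divided by discounted lifetime output); every plant parameter below is hypothetical, chosen only to illustrate the PPA-term and interest-rate levers discussed above.

        import numpy as np

        def lcoe(capex, opex_per_yr, annual_kwh, rate, years):
            """Levelized cost: discounted costs over discounted energy."""
            t = np.arange(1, years + 1)
            disc = (1.0 + rate) ** -t
            return (capex + (opex_per_yr * disc).sum()) / (annual_kwh * disc).sum()

        base = lcoe(1.2e9, 1.5e7, 2.0e9, rate=0.07, years=20)   # 7%, 20-yr PPA
        low = lcoe(1.2e9, 1.5e7, 2.3e9, rate=0.04, years=30)    # 4%, 30-yr, +CF
        print(round(base * 100, 2), round(low * 100, 2))        # cents/kWh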

  16. Statistical tests with accurate size and power for balanced linear mixed models.

    Science.gov (United States)

    Muller, Keith E; Edwards, Lloyd J; Simpson, Sean L; Taylor, Douglas J

    2007-08-30

    The convenience of linear mixed models for Gaussian data has led to their widespread use. Unfortunately, standard mixed model tests often have greatly inflated test size in small samples. Many applications with correlated outcomes in medical imaging and other fields have simple properties which do not require the generality of a mixed model. Alternately, stating the special cases as a general linear multivariate model allows analysing them with either the univariate or multivariate approach to repeated measures (UNIREP, MULTIREP). Even in small samples, an appropriate UNIREP or MULTIREP test always controls test size and has a good power approximation, in sharp contrast to mixed model tests. Hence, mixed model tests should never be used when one of the UNIREP tests (uncorrected, Huynh-Feldt, Geisser-Greenhouse, Box conservative) or MULTIREP tests (Wilks, Hotelling-Lawley, Roy's, Pillai-Bartlett) apply. Convenient methods give exact power for the uncorrected and Box conservative tests. Simulations demonstrate that new power approximations for all four UNIREP tests eliminate most inaccuracy in existing methods. In turn, free software implements the approximations to give a better choice of sample size. Two repeated measures power analyses illustrate the methods. The examples highlight the advantages of examining the entire response surface of power as a function of sample size, mean differences, and variability.
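
    The univariate-approach corrections named above (e.g., Geisser-Greenhouse) hinge on an epsilon computed from the repeated-measures covariance matrix. A small Python sketch of that computation, with a toy covariance matrix rather than anything from the paper:

        import numpy as np

        def gg_epsilon(S):
            """Geisser-Greenhouse epsilon for a k x k covariance matrix."""
            k = S.shape[0]
            C = np.eye(k) - np.ones((k, k)) / k   # centering projection
            lam = np.linalg.eigvalsh(C @ S @ C)
            return lam.sum() ** 2 / ((k - 1) * (lam ** 2).sum())

        S = np.array([[1.0, 0.6, 0.3],
                      [0.6, 1.0, 0.6],
                      [0.3, 0.6, 1.0]])           # sphericity clearly violated
        print(gg_epsilon(S))   # 1.0 = spherical; smaller values shrink the df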

  17. Statistical Characterization of Solar Photovoltaic Power Variability at Small Timescales: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Shedd, S.; Hodge, B.-M.; Florita, A.; Orwig, K.

    2012-08-01

    Integrating large amounts of variable and uncertain solar photovoltaic power into the electricity grid is a growing concern for power system operators in a number of different regions. Power system operators typically accommodate variability, whether from load, wind, or solar, by carrying reserves that can quickly change their output to match the changes in the solar resource. At timescales in the seconds-to-minutes range, this is known as regulation reserve. Previous studies have shown that increasing the geographic diversity of solar resources can reduce the short-term variability of the power output. As the price of solar has decreased, the emergence of very large PV plants (greater than 10 MW) has become more common. These plants present an interesting case because they are large enough to exhibit some spatial smoothing by themselves. This work examines the variability of solar PV output among different arrays in a large (approximately 50 MW) PV plant in the western United States, including the correlation in power output changes between different arrays, as well as the aggregated plant output, at timescales ranging from one second to five minutes.
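
    A sketch of the kind of ramp-rate analysis described above, run on synthetic one-second output for two hypothetical arrays; it illustrates how aggregation and timescale affect short-term variability, not the plant's actual data.

        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(2)
        idx = pd.date_range("2012-06-01", periods=86400, freq="s")  # one day, 1 s
        base = 25 + 5 * np.sin(np.linspace(0, np.pi, idx.size))     # diurnal shape
        a = pd.Series(base + rng.normal(0, 0.5, idx.size), index=idx)
        b = pd.Series(base + rng.normal(0, 0.5, idx.size), index=idx)

        for dt in ("1s", "1min", "5min"):
            ra = a.resample(dt).mean().diff()         # per-array ramps (MW)
            rb = b.resample(dt).mean().diff()
            agg = (a + b).resample(dt).mean().diff()  # aggregated plant ramps
            print(dt, round(ra.std(), 3), round(agg.std() / 2, 3),
                  round(ra.corr(rb), 3))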

  18. Algebraic Statistics

    OpenAIRE

    Norén, Patrik

    2013-01-01

    Algebraic statistics brings together ideas from algebraic geometry, commutative algebra, and combinatorics to address problems in statistics and its applications. Computer algebra provides powerful tools for the study of algorithms and software. However, these tools are rarely prepared to address statistical challenges and therefore new algebraic results need often be developed. This way of interplay between algebra and statistics fertilizes both disciplines. Algebraic statistics is a relativ...

  19. A Powerful Test of the Autoregressive Unit Root Hypothesis Based on a Tuning Parameter Free Statistic

    DEFF Research Database (Denmark)

    Nielsen, Morten Ørregaard

    reflects the parameter chosen to implement the test, and (iii) since the asymptotic distribution depends on d and the test remains consistent for all d > 0, it is possible to analyze the power of the test for different values of d. The usual Phillips-Perron or Dickey-Fuller type tests are indexed...... good size properties, with finite sample power that is higher than that of Breitung's (2002) test and even rivals the (nearly) optimal parametric GLS detrended augmented Dickey-Fuller test with lag length chosen by an information criterion....

  20. Informing the judgments of fingerprint analysts using quality metric and statistical assessment tools.

    Science.gov (United States)

    Langenburg, Glenn; Champod, Christophe; Genessay, Thibault

    2012-06-10

    The aim of this research was to evaluate how fingerprint analysts would incorporate information from newly developed tools into their decision making processes. Specifically, we assessed effects using the following: (1) a quality tool to aid in the assessment of the clarity of the friction ridge details, (2) a statistical tool to provide likelihood ratios representing the strength of the corresponding features between compared fingerprints, and (3) consensus information from a group of trained fingerprint experts. The measured variables for the effect on examiner performance were the accuracy and reproducibility of the conclusions against the ground truth (including the impact on error rates) and the analyst accuracy and variation for feature selection and comparison. The results showed that participants using the consensus information from other fingerprint experts demonstrated more consistency and accuracy in minutiae selection. They also demonstrated higher accuracy, sensitivity, and specificity in the decisions reported. The quality tool also affected minutiae selection (which, in turn, had limited influence on the reported decisions); the statistical tool did not appear to influence the reported decisions.

  1. Discriminatory power of game-related statistics in 14-15 year age group male volleyball, according to set.

    Science.gov (United States)

    García-Hermoso, Antonio; Dávila-Romero, Carlos; Saavedra, Jose M

    2013-02-01

    This study compared volleyball game-related statistics by outcome (winners and losers of sets) and set number (total, initial, and last) to identify characteristics that discriminated game performance. Game-related statistics from 314 sets (44 matches) played by teams of male 14- to 15-year-olds in a regional volleyball championship were analysed (2011). Differences between contexts (winning or losing teams) and set number (total, initial, and last) were assessed. A discriminant analysis was then performed according to outcome (winners and losers of sets) and set number (total, initial, and last). The results showed differences (winning or losing sets) in several variables of Complexes I (attack point and error reception) and II (serve and aces). The game-related statistics that discriminated performance across sets were the serve, positive reception, and attack point. These predictors of performance at ages when players are still learning could help coaches plan their training.
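
    The discriminant step can be sketched with scikit-learn as follows; the per-set statistics and their weights are simulated for illustration and do not reproduce the championship data.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(3)
        n = 314                           # sets, matching the study's design
        # Toy per-set statistics: serve, positive reception, attack point
        X = rng.normal(size=(n, 3))
        won = (0.8 * X[:, 0] + 0.6 * X[:, 1] + 1.0 * X[:, 2]
               + rng.normal(0, 1, n)) > 0

        lda = LinearDiscriminantAnalysis().fit(X, won)
        print(lda.coef_)                  # relative discriminating weight
        print(lda.score(X, won))          # reclassification accuracy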

  2. Aging assessment of surge protective devices in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Davis, J.F.; Subudhi, M. [Brookhaven National Lab., Upton, NY (United States); Carroll, D.P. [Florida Univ., Gainesville, FL (United States)

    1996-01-01

    An assessment was performed to determine the effects of aging on the performance and availability of surge protective devices (SPDs), used in electrical power and control systems in nuclear power plants. Although SPDs have not been classified as safety-related, they are risk-important because they can minimize the initiating event frequencies associated with loss of offsite power and reactor trips. Conversely, their failure due to age might cause some of those initiating events, e.g., through short circuit failure modes, or by allowing deterioration of the safety-related component(s) they are protecting from overvoltages, perhaps preventing a reactor trip, from an open circuit failure mode. From the data evaluated during 1980-1994, it was found that failures of surge arresters and suppressors by short circuits were neither a significant risk nor a safety concern, and there were no failures of surge suppressors preventing a reactor trip. Simulations, using the ElectroMagnetic Transients Program (EMTP), were performed to determine the adequacy of high voltage surge arresters.

  3. A statistical assessment of population trends for data deficient Mexican amphibians

    Directory of Open Access Journals (Sweden)

    Esther Quintero

    2014-12-01

    Full Text Available Background. Mexico has the world’s fifth largest amphibian fauna and the second highest number of threatened amphibian species. About 10% of Mexican amphibians lack enough data to be assigned to a risk category by the IUCN, so in this paper we test a statistical tool that, in the absence of specific demographic data, can assess a species’ risk of extinction and population trend, and improve understanding of which variables increase their vulnerability. Recent studies have demonstrated that the risk of species decline depends on extrinsic and intrinsic traits, thus including both of them when assessing extinction risk might render a more accurate assessment of threats. Methods. We harvested data from the Encyclopedia of Life (EOL) and the published literature for Mexican amphibians, and used these data to assess the population trend of some of the Mexican species that have been assigned to the Data Deficient category of the IUCN using Random Forests, a Machine Learning method that gives a prediction of complex processes and identifies the most important variables that account for the predictions. Results. Our results show that most of the data deficient Mexican amphibians that we used have decreasing population trends. We found that Random Forests is a solid way to identify species with decreasing population trends when no demographic data are available. Moreover, we point to the most important variables that make species more vulnerable to extinction. This exercise is a very valuable first step in assigning conservation priorities for poorly known species.
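
    A minimal sketch of the Random Forests workflow described above, using scikit-learn on invented trait data (the traits, sample size, and labels are illustrative only):

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(4)
        n = 300                            # hypothetical species records
        # Toy intrinsic/extrinsic traits: body size, range area, habitat loss
        X = rng.normal(size=(n, 3))
        declining = (0.2 * X[:, 0] - 0.7 * X[:, 1] + 1.1 * X[:, 2]
                     + rng.normal(0, 0.5, n)) > 0

        rf = RandomForestClassifier(n_estimators=500, oob_score=True,
                                    random_state=0).fit(X, declining)
        print(rf.oob_score_)               # out-of-bag accuracy
        print(rf.feature_importances_)     # which traits drive the predictions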

  4. In vivo Comet assay – statistical analysis and power calculations of mice testicular cells

    DEFF Research Database (Denmark)

    Hansen, Merete Kjær; Sharma, Anoop Kumar; Dybdahl, Marianne

    2014-01-01

    is to provide curves for this statistic outlining the number of animals and gels to use. The current study was based on 11 compounds administered via oral gavage in three doses to male mice: CAS no. 110-26-9, CAS no. 512-56-1, CAS no. 111873-33-7, CAS no. 79-94-7, CAS no. 115-96-8, CAS no. 598-55-0, CAS no. 636...

  5. The power of 41%: A glimpse into the life of a statistic.

    Science.gov (United States)

    Tanis, Justin

    2016-01-01

    "Forty-one percent?" the man said with anguish on his face as he addressed the author, clutching my handout. "We're talking about my granddaughter here." He was referring to the finding from the National Transgender Discrimination Survey (NTDS) that 41% of 6,450 respondents said they had attempted suicide at some point in their lives. The author had passed out the executive summary of the survey's findings during a panel discussion at a family conference to illustrate the critical importance of acceptance of transgender people. During the question and answer period, this gentleman rose to talk about his beloved 8-year-old granddaughter who was in the process of transitioning socially from male to female in her elementary school. The statistics that the author was citing were not just numbers to him; and he wanted strategies-effective ones-to keep his granddaughter alive and thriving. The author has observed that the statistic about suicide attempts has, in essence, developed a life of its own. It has had several key audiences-academics and researchers, public policymakers, and members of the community, particularly transgender people and our families. This article explores some of the key takeaways from the survey and the ways in which the 41% statistic has affected conversations about the injustices transgender people face and the importance of family and societal acceptance. (PsycINFO Database Record

  6. Planck 2013 results. XXI. All-sky Compton parameter power spectrum and high-order statistics

    CERN Document Server

    Ade, P.A.R.; Armitage-Caplan, C.; Arnaud, M.; Ashdown, M.; Atrio-Barandela, F.; Aumont, J.; Baccigalupi, C.; Banday, A.J.; Barreiro, R.B.; Bartlett, J.G.; Battaner, E.; Benabed, K.; Benoit, A.; Benoit-Levy, A.; Bernard, J.P.; Bersanelli, M.; Bielewicz, P.; Bobin, J.; Bock, J.J.; Bonaldi, A.; Bond, J.R.; Borrill, J.; Bouchet, F.R.; Bridges, M.; Bucher, M.; Burigana, C.; Butler, R.C.; Cardoso, J.F.; Carvalho, P.; Catalano, A.; Challinor, A.; Chamballu, A.; Chiang, L.Y.; Chiang, H.C.; Christensen, P.R.; Church, S.; Clements, D.L.; Colombi, S.; Colombo, L.P.L.; Comis, B.; Couchot, F.; Coulais, A.; Crill, B.P.; Curto, A.; Cuttaia, F.; Da Silva, A.; Danese, L.; Davies, R.D.; Davis, R.J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Delouis, J.M.; Desert, F.X.; Dickinson, C.; Diego, J.M.; Dolag, K.; Dole, H.; Donzelli, S.; Dore, O.; Douspis, M.; Dupac, X.; Efstathiou, G.; Ensslin, T.A.; Eriksen, H.K.; Finelli, F.; Flores-Cacho, I.; Forni, O.; Frailis, M.; Franceschi, E.; Galeotta, S.; Ganga, K.; Genova-Santos, R.T.; Giard, M.; Giardino, G.; Giraud-Heraud, Y.; Gonzalez-Nuevo, J.; Gorski, K.M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Hansen, F.K.; Hanson, D.; Harrison, D.; Henrot-Versille, S.; Hernandez-Monteagudo, C.; Herranz, D.; Hildebrandt, S.R.; Hivon, E.; Hobson, M.; Holmes, W.A.; Hornstrup, A.; Hovest, W.; Huffenberger, K.M.; Hurier, G.; Jaffe, T.R.; Jaffe, A.H.; Jones, W.C.; Juvela, M.; Keihanen, E.; Keskitalo, R.; Kisner, T.S.; Kneissl, R.; Knoche, J.; Knox, L.; Kunz, M.; Kurki-Suonio, H.; Lacasa, F.; Lagache, G.; Lahteenmaki, A.; Lamarre, J.M.; Lasenby, A.; Laureijs, R.J.; Lawrence, C.R.; Leahy, J.P.; Leonardi, R.; Leon-Tavares, J.; Lesgourgues, J.; Liguori, M.; Lilje, P.B.; Linden-Vornle, M.; Lopez-Caniego, M.; Lubin, P.M.; Macias-Perez, J.F.; Maffei, B.; Maino, D.; Mandolesi, N.; Marcos-Caballero, A.; Maris, M.; Marshall, D.J.; Martin, P.G.; Martinez-Gonzalez, E.; Masi, S.; Matarrese, S.; Matthai, F.; Mazzotta, P.; Melchiorri, A.; Melin, J.B.; Mendes, L.; Mennella, A.; Migliaccio, M.; Mitra, S.; Miville-Deschenes, M.A.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C.B.; Norgaard-Nielsen, H.U.; Noviello, F.; Novikov, D.; Novikov, I.; Osborne, S.; Oxborrow, C.A.; Paci, F.; Pagano, L.; Pajot, F.; Paoletti, D.; Partridge, B.; Pasian, F.; Patanchon, G.; Perdereau, O.; Perotto, L.; Perrotta, F.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Popa, L.; Poutanen, T.; Pratt, G.W.; Prezeau, G.; Prunet, S.; Puget, J.L.; Rachen, J.P.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Ricciardi, S.; Riller, T.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Rossetti, M.; Roudier, G.; Rubino-Martin, J.A.; Rusholme, B.; Sandri, M.; Santos, D.; Savini, G.; Scott, D.; Seiffert, M.D.; Shellard, E.P.S.; Spencer, L.D.; Starck, J.L.; Stolyarov, V.; Stompor, R.; Sudiwala, R.; Sunyaev, R.; Sureau, F.; Sutton, D.; Suur-Uski, A.S.; Sygnet, J.F.; Tauber, J.A.; Tavagnacco, D.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Tuovinen, J.; Umana, G.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Varis, J.; Vielva, P.; Villa, F.; Vittorio, N.; Wade, L.A.; Wandelt, B.D.; White, S.D.M.; Yvon, D.; Zacchei, A.; Zonca, A.

    2014-01-01

    We have constructed the first all-sky map of the thermal Sunyaev-Zeldovich (tSZ) effect by applying specifically tailored component separation algorithms to the 100 to 857 GHz frequency channel maps from the Planck survey. These maps show an obvious galaxy cluster tSZ signal that is well matched with blindly detected clusters in the Planck SZ catalogue. To characterize the signal in the tSZ map we have computed its angular power spectrum. At large angular scales ($\ell < 60$), the Galactic emission dominates the signal and must be removed; at small angular scales ($\ell > 500$), the clustered Cosmic Infrared Background (CIB) and residual point sources are the major contaminants. These foregrounds are carefully modelled and subtracted. We measure the tSZ power spectrum in angular scales, $0.17^{\circ} \lesssim \theta \lesssim 3.0^{\circ}$, that were previously unexplored. The measured tSZ power spectrum is consistent with that expected from the Planck catalogue of SZ sources, with additional clear evidence of signal from unresolved clusters and, potentially, diffuse warm baryons. We use the tSZ power spectrum to ...

  7. Point Processes Modeling of Time Series Exhibiting Power-Law Statistics

    CERN Document Server

    Kaulakys, B; Gontis, V

    2010-01-01

    We consider stochastic point processes generating time series exhibiting power laws of spectrum and distribution density (Phys. Rev. E 71, 051105 (2005)) and apply them for modeling the trading activity in the financial markets and for the frequencies of word occurrences in the language.

  8. Statistical Analysis of Power Production from OWC Type Wave Energy Converters

    DEFF Research Database (Denmark)

    Martinelli, L.; Zanuttigh, B.; Kofoed, Jens Peter

    2009-01-01

    a method that allows the choice of the optimal power generation capacity for which the device should be designed, when subjected to any given wave climate. The analysis is based on the experimental results of existing tests carried out in the 3D deep water wave tank at Aalborg University, Denmark. First...

  9. Use assessment of electronic power sources for SMAW

    Directory of Open Access Journals (Sweden)

    Scotti, A.

    1999-04-01

    Full Text Available The aim of the present work was to assess the efficacy of the use of modern technologies for power supplies in Shielded Metal Arc Welding (SMAW). Test coupons were welded using a series of five different classes of commercial electrodes, covering their current ranges. Both a conventional electromagnetic power source and an electronic (inverter) power source were employed. Fusion rate, deposition efficiency, bead finish and weld geometry were measured in each experiment. Current and voltage signals were acquired at a high rate to evaluate the dynamic behavior of the power sources. The static performances of both power sources were also determined. The results showed that despite the remarkable differences between the power supplies, based on static and dynamic characterizations, no significant difference was noticed in the operational behavior of the electrodes, in the given conditions, apart from a better anti-stick performance obtained with the electronic power source.

    The aim of the present work was to evaluate the efficacy of the use of modern technologies for power sources in shielded metal arc welding (SMAW). The test materials were welded using a series of five different classes of commercial electrodes, covering their current ranges. A conventional electromagnetic power source and an electronic (inverter) power source were used. The fusion rate, deposition efficiency, and bead finish, as well as the weld geometry, were measured in each experiment. Current and voltage signals were acquired at a high rate to evaluate the dynamic behavior of the power sources. The static performance of both sources was also determined. The results showed that, despite the notable differences between the power supplies, no significant difference was observed in the working behavior of the electrodes, in

  10. Selection of nontarget arthropod taxa for field research on transgenic insecticidal crops: using empirical data and statistical power.

    Science.gov (United States)

    Prasifka, J R; Hellmich, R L; Dively, G P; Higgins, L S; Dixon, P M; Duan, J J

    2008-02-01

    One of the possible adverse effects of transgenic insecticidal crops is the unintended decline in the abundance of nontarget arthropods. Field trials designed to evaluate potential nontarget effects can be more complex than expected because decisions to conduct field trials and the selection of taxa to include are not always guided by the results of laboratory tests. Also, recent studies emphasize the potential for indirect effects (adverse impacts to nontarget arthropods without feeding directly on plant tissues), which are difficult to predict because of interactions among nontarget arthropods, target pests, and transgenic crops. As a consequence, field studies may attempt to monitor expansive lists of arthropod taxa, making the design of such broad studies more difficult and reducing the likelihood of detecting any negative effects that might be present. To improve the taxonomic focus and statistical rigor of future studies, existing field data and corresponding power analysis may provide useful guidance. Analysis of control data from several nontarget field trials using repeated-measures designs suggests that while detection of small effects may require considerable increases in replication, there are taxa from different ecological roles that are sampled effectively using standard methods. The use of statistical power to guide selection of taxa for nontarget trials reflects scientists' inability to predict the complex interactions among arthropod taxa, particularly when laboratory trials fail to provide guidance on which groups are more likely to be affected. However, scientists still may exercise judgment, including taxa that are not included in or supported by power analyses.
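
    As an illustration of how statistical power translates into replication requirements, the sketch below solves for the sample size per treatment at conventional effect sizes; it is a generic two-sample t-test calculation, not the repeated-measures designs analyzed in the paper.

        from statsmodels.stats.power import TTestIndPower

        solver = TTestIndPower()
        for d in (0.2, 0.5, 0.8):          # small, medium, large effect sizes
            n = solver.solve_power(effect_size=d, alpha=0.05, power=0.8)
            print(d, round(n))             # replicates needed per treatment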

  11. Statistical and Measurement Properties of Features Used in Essay Assessment. Research Report. ETS RR-04-21

    Science.gov (United States)

    Haberman, Shelby J.

    2004-01-01

    Statistical and measurement properties are examined for features used in essay assessment to determine the generalizability of the features across populations, prompts, and individuals. Data are employed from TOEFL® and GMAT® examinations and from writing for Criterion.

  12. Increasing Confidence in a Statistics Course: Assessing Students' Beliefs and Using the Data to Design Curriculum with Students

    Science.gov (United States)

    Huchting, Karen

    2013-01-01

    Students were involved in the curriculum design of a statistics course. They completed a pre-assessment of their confidence and skills using quantitative methods and statistics. Scores were aggregated, and anonymous data were shown on the first night of class. Using these data, the course was designed, demonstrating evidence-based instructional…

  14. Preliminary environmental assessment for the Satellite Power System (SPS). Revision 1. Volume 2. Detailed assessment

    Energy Technology Data Exchange (ETDEWEB)

    1980-01-01

    The Department of Energy (DOE) is considering several options for generating electrical power to meet future energy needs. The satellite power system (SPS), one of these options, would collect solar energy through a system of satellites in space and transfer this energy to earth. A reference system has been described that would convert the energy to microwaves and transmit the microwave energy via directive antennas to large receiving/rectifying antennas (rectennas) located on the earth. At the rectennas, the microwave energy would be converted into electricity. The potential environmental impacts of constructing and operating the satellite power system are being assessed as a part of the Department of Energy's SPS Concept Development and Evaluation Program. This report is Revision 1 of the Preliminary Environmental Assessment for the Satellite Power System published in October 1978. It refines and extends the 1978 assessment and provides a basis for a 1980 revision that will guide and support DOE recommendations regarding future SPS development. This is Volume 2 of two volumes. It contains the technical detail suitable for peer review and integrates information appearing in documents referenced herein. The key environmental issues associated with the SPS concern human health and safety, ecosystems, climate, and electromagnetic systems interactions. In order to address these issues in an organized manner, five tasks are reported: (I) microwave-radiation health and ecological effects; (II) nonmicrowave health and ecological effects; (III) atmospheric effects; (IV) effects on communication systems due to ionospheric disturbance; and (V) electromagnetic compatibility. (WHK)

  15. On the Statistics of Noisy Space Vector in Power Quality Analysis

    Directory of Open Access Journals (Sweden)

    Diego Bellan

    2016-10-01

    Full Text Available This work deals with the analysis of the impact of additive noise on the geometrical features of the space vector shape on the complex plane. The space vector shape is of great importance in the power quality analysis of modern three-phase power systems since its geometrical features are closely related to the voltage supply quality. In case of voltage sag or swell the space vector shape changes accordingly. In case of additive noise the geometrical parameters of the space vector shape can be treated as random variables. In the paper, the mean values and the variances of such parameters are derived in closed form as functions of the noise level and of the sampling conditions. Analytical results are validated through numerical simulation of the whole measurement process.

  16. Statistical analysis of data from limiting dilution cloning to assess monoclonality in generating manufacturing cell lines.

    Science.gov (United States)

    Quiroz, Jorge; Tsao, Yung-Shyeng

    2016-07-08

    Assurance of monoclonality of recombinant cell lines is a critical issue to gain regulatory approval in biological license application (BLA). Some of the requirements of regulatory agencies are the use of proper documentations and appropriate statistical analysis to demonstrate monoclonality. In some cases, one round may be sufficient to demonstrate monoclonality. In this article, we propose the use of confidence intervals for assessing monoclonality for limiting dilution cloning in the generation of recombinant manufacturing cell lines based on a single round. The use of confidence intervals instead of point estimates allow practitioners to account for the uncertainty present in the data when assessing whether an estimated level of monoclonality is consistent with regulatory requirements. In other cases, one round may not be sufficient and two consecutive rounds are required to assess monoclonality. When two consecutive subclonings are required, we improved the present methodology by reducing the infinite series proposed by Coller and Coller (Hybridoma 1983;2:91-96) to a simpler series. The proposed simpler series provides more accurate and reliable results. It also reduces the level of computation and can be easily implemented in any spreadsheet program like Microsoft Excel. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:1061-1068, 2016.
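
    One standard way to build such confidence intervals is the exact (Clopper-Pearson) binomial interval; a small Python sketch follows, with the well counts invented for illustration (the paper's own series-based derivation is not reproduced here).

        from scipy.stats import beta

        def clopper_pearson(k, n, conf=0.95):
            """Exact binomial CI for the proportion of monoclonal wells."""
            a = (1.0 - conf) / 2.0
            lo = beta.ppf(a, k, n - k + 1) if k > 0 else 0.0
            hi = beta.ppf(1.0 - a, k + 1, n - k) if k < n else 1.0
            return lo, hi

        # e.g., 76 of 80 screened wells judged monoclonal after one round
        print(clopper_pearson(76, 80))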

  17. Network Theory Integrated Life Cycle Assessment for an Electric Power System

    Directory of Open Access Journals (Sweden)

    Heetae Kim

    2015-08-01

    Full Text Available In this study, we allocate greenhouse gas (GHG) emissions of electricity transmission to the consumers. As an allocation basis, we introduce energy distance. Energy distance takes the transmission load on the electricity energy system into account in addition to the amount of electricity consumption. As a case study, we estimate regional GHG emissions of electricity transmission loss in Chile. Life cycle assessment (LCA) is used to estimate the total GHG emissions of the Chilean electric power system. The regional GHG emission of transmission loss is calculated from the total GHG emissions. We construct the network model of the Chilean electric power grid as an undirected network with 466 nodes and 543 edges holding the topology of the power grid based on the statistical record. We estimate the total annual GHG emissions of the Chilean electricity energy system as 23.07 Mt CO2-eq., of which 1.61 Mt CO2-eq. corresponds to transmission loss. The total energy distance for the electricity transmission accounts for 12,842.10 TWh km based on network analysis. We argue that when the GHG emission of electricity transmission loss is estimated, the electricity transmission load should be considered separately. We propose network theory as a useful complement to LCA analysis for complex allocation. Energy distance is especially useful on a very large-scale electric power grid such as an intercontinental transmission network.
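
    A toy sketch of the energy-distance idea (consumption weighted by network distance from the generation source), using networkx on an invented four-node grid; node names, line lengths, and demands are all assumptions.

        import networkx as nx

        G = nx.Graph()  # buses connected by lines weighted in km
        G.add_weighted_edges_from([("plant", "a", 120), ("a", "b", 80),
                                   ("b", "c", 60)])
        demand_twh = {"a": 5.0, "b": 3.0, "c": 2.0}   # regional consumption

        energy_distance = {
            r: demand_twh[r] * nx.shortest_path_length(G, "plant", r,
                                                       weight="weight")
            for r in demand_twh
        }
        print(energy_distance)                # TWh*km per region
        print(sum(energy_distance.values()))  # basis for allocating loss GHG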

  18. Planck 2013 results. XXI. Power spectrum and high-order statistics of the Planck all-sky Compton parameter map

    DEFF Research Database (Denmark)

    Ade, P. A. R.; Aghanim, N.; Armitage-Caplan, C.

    2014-01-01

    We have constructed the first all-sky map of the thermal Sunyaev-Zeldovich (tSZ) effect by applying specifically tailored component separation algorithms to the 100 to 857 GHz frequency channel maps from the Planck survey. This map shows an obvious galaxy cluster tSZ signal that is well matched with blindly detected clusters in the Planck SZ catalogue. ... The non-Gaussianity of the Compton parameter map is further characterized by computing its 1D probability distribution function and its bispectrum. The measured tSZ power spectrum and high order statistics are used to place constraints on $\sigma_8$.

  19. Does bisphenol A induce superfeminization in Marisa cornuarietis? Part II: toxicity test results and requirements for statistical power analyses.

    Science.gov (United States)

    Forbes, Valery E; Aufderheide, John; Warbritton, Ryan; van der Hoeven, Nelly; Caspers, Norbert

    2007-03-01

    This study presents results of the effects of bisphenol A (BPA) on adult egg production, egg hatchability, egg development rates and juvenile growth rates in the freshwater gastropod, Marisa cornuarietis. We observed no adult mortality, substantial inter-snail variability in reproductive output, and no effects of BPA on reproduction during 12 weeks of exposure to 0, 0.1, 1.0, 16, 160 or 640 µg/L BPA. We observed no effects of BPA on egg hatchability or timing of egg hatching. Juveniles showed good growth in the control and all treatments, and there were no significant effects of BPA on this endpoint. Our results do not support previous claims of enhanced reproduction in Marisa cornuarietis in response to exposure to BPA. Statistical power analysis indicated high levels of inter-snail variability in the measured endpoints and highlighted the need for sufficient replication when testing treatment effects on reproduction in M. cornuarietis with adequate power.

  20. Methodology for Assessment of Inertial Response from Wind Power Plants

    DEFF Research Database (Denmark)

    Altin, Müfit; Teodorescu, Remus; Bak-Jensen, Birgitte;

    2012-01-01

    High wind power penetration levels result in additional requirements from wind power in order to improve frequency stability. Replacement of conventional power plants with wind power plants reduces the power system inertia due to the wind turbine technology. Consequently, the rate of change...

  1. Sea cliff instability susceptibility at regional scale: A statistically based assessment in southern Algarve, Portugal.

    Science.gov (United States)

    Marques, F.; Matildes, R.; Redweik, P.

    2012-04-01

    Mass movements are the dominant process of sea cliff evolution, being a considerable source of natural hazard and a significant constraint for human activities in coastal areas. Related hazards include cliff top retreat, with implications on planning and land management, and unstable soil or rock movements at the cliff face and toe, with implications mainly on beach users and support structures. To assess the spatial component of sea cliff hazard assessment with implications on planning, i.e. the susceptibility of a given cliff section to be affected by instabilities causing retreat of the cliff top, a statistically based study was carried out along the top of the sea cliffs of the Burgau-Lagos coastal section (Southwest Algarve, Portugal). The study was based on bivariate and multivariate statistics applied to a set of predisposing factors, mainly related with geology and geomorphology, which were correlated with an inventory of past cliff failures. The multi-temporal inventory of past cliff failures was produced using aerial digital photogrammetric methods, which included special procedures to enable the extraction of accurate data from old aerial photos, and validated by systematic stereo photo interpretation, helped by oblique aerial photos and field surveys. This study identified 137 cliff failures that occurred between 1947 and 2007 along the 13 km long cliffs, causing the loss of 10,234 m2 of horizontal area at the cliff top. The cliff failures correspond to planar slides (58%) mainly in Cretaceous alternating limestone and marls, toppling failures (17%) mainly in Miocene calcarenites, slumps (15%) in Plio-Pleistocene silty sands that infill the karst in the Miocene rocks, and the remaining 10% correspond to complex movements, rockfalls and undetermined cases. The space distribution of cliff failures is quite irregular but enables the objective separation of subsections with homogeneous retreat behavior, for which mean retreat rates were computed between 5x10^-3 m

  2. Assessment of applications of conducting polymers in power equipment

    Energy Technology Data Exchange (ETDEWEB)

    Schock, K.F.; Bennett, A.I.; Burghardt, R.R.; Cookson, A.H.; Saunders, H.E.; Smith, J.D.B (Westinghouse Electric Corp., Pittsburgh, PA (United States)); Kennedy, W.N. (ABB Power Transmission and Distribution Co., Muncie, IN (United States)); Oommen, T.V. (ABB Power Transmission and Distribution Co., Raleigh, NC (United States)); Voshall, R.E. (Gannon Coll., Erie, PA (United States)); Fort, E.M. (Westinghouse Electric Corp., Orlando, FL (United States))

    1992-10-01

    A feasibility study has been completed to assess potential applications of conducting polymers in the manufacture of power equipment. Ten areas were studied: solid and liquid dielectric cables, capacitors, rotating machinery, transformers, bushings, surge suppressors, vacuum interrupters, gas-insulated equipment, and other miscellaneous applications. Each application was rated according to technical impact, probability of success, economic impact and time frame for implementation. Of the 32 potential applications proposed, the top ranking areas were: coating of dielectric films for capacitors, conducting compounds to cover conductors in rotating machines, surface coatings to dissipate charges in bushings, coatings for controlled surface conductivity in gas-insulated equipment, and thermal history monitors. Finally, the issues that have to be resolved before conducting polymers can find use in the identified applications are discussed in this paper.

  3. Assessment on thermoelectric power factor in silicon nanowire networks

    Energy Technology Data Exchange (ETDEWEB)

    Lohn, Andrew J.; Kobayashi, Nobuhiko P. [Baskin School of Engineering, University of California Santa Cruz, CA (United States); Nanostructured Energy Conversion Technology and Research (NECTAR), Advanced Studies Laboratories, University of California Santa Cruz, NASA Ames Research Center, Moffett Field, CA (United States); Coleman, Elane; Tompa, Gary S. [Structured Materials Industries, Inc., Piscataway, NJ (United States)

    2012-01-15

    Thermoelectric devices based on three-dimensional networks of highly interconnected silicon nanowires were fabricated and the parameters that contribute to the power factor, namely the Seebeck coefficient and electrical conductivity, were assessed. The large area (2 cm x 2 cm) devices were fabricated at low cost utilizing a highly scalable process involving silicon nanowires grown on steel substrates. Temperature dependence of the Seebeck coefficient was found to be weak over the range of 20-80 °C, at approximately -400 µV/K for unintentionally doped devices and ±50 µV/K for p-type and n-type devices, respectively. (Copyright © 2012 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  4. Experimental Assessment of Derating Guidelines Applied to Power Electronics Converters

    Directory of Open Access Journals (Sweden)

    S. E. De León-Aldaco

    2013-01-01

    Full Text Available Power transistors are the most vulnerable components in switching converters, and derating is usually applied to increase their reliability. In this paper, the effectiveness of derating guidelines is experimentally assessed using a push-pull DC-DC converter as a case study, operating in three different environments. After measuring the electrical variables and temperature, reliability was predicted following the guidelines in MIL-HDBK-217F. The sensitivity analysis performed indicates that temperature has the largest impact on reliability, followed by environment and device quality. The results obtained demonstrate that a derating procedure based solely on DC ratings does not ensure an adequate performance. Therefore, additional guidelines are suggested to help increase the overall reliability obtained from a power circuit.
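
    The parts-stress prediction mentioned above has the generic multiplicative form lambda_p = lambda_b * pi_T * pi_Q * pi_E, with an Arrhenius-type temperature factor. The sketch below illustrates that structure only; the numeric factors are invented and are not the MIL-HDBK-217F table values.

        import math

        def failure_rate(lambda_b, t_junction_c, pi_q, pi_e, ea_ev=0.4):
            """Illustrative parts-stress model (failures per 1e6 hours)."""
            k = 8.617e-5                      # Boltzmann constant, eV/K
            t_ref, t_j = 298.0, t_junction_c + 273.0
            pi_t = math.exp(-ea_ev / k * (1.0 / t_j - 1.0 / t_ref))
            return lambda_b * pi_t * pi_q * pi_e

        # Cooler junction -> markedly lower predicted failure rate
        print(failure_rate(0.012, 110, pi_q=2.4, pi_e=6.0))
        print(failure_rate(0.012, 70, pi_q=2.4, pi_e=6.0))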

  5. Sea cliff instability susceptibility at regional scale: a statistically based assessment in southern Algarve, Portugal

    Directory of Open Access Journals (Sweden)

    F. M. S. F. Marques

    2013-05-01

    along their top. The study was based on the application of the bivariate Information Value and multivariate Logistic Regression statistical methods, using a set of predisposing factors for cliff failures, mainly related with geology (lithology, bedding dip, faults) and geomorphology (maximum and mean slope, height, aspect, plan curvature, toe protection), which were correlated with a photogrammetry-based inventory of cliff failures that occurred in a 60 yr period (1947-2007). The susceptibility models were validated against the inventory data using standard success rate and ROC curves, and provided encouraging results, indicating that the proposed approaches are effective for susceptibility assessment. The results obtained also stress the need for improvement of the predisposing factors to be used in this type of studies and the need for detailed and systematic cliff failure inventories.
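
    A minimal sketch of the logistic-regression-plus-ROC validation step described above, on simulated terrain units (the predisposing factors, coefficients, and failure labels are all invented):

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(5)
        n = 1000                           # hypothetical cliff-top terrain units
        # Toy factors: mean slope (deg), cliff height (m), lithology class
        X = np.column_stack([rng.normal(30, 8, n), rng.normal(40, 15, n),
                             rng.integers(0, 2, n)])
        failed = (0.05 * X[:, 0] + 0.03 * X[:, 1] + 0.8 * X[:, 2]
                  + rng.normal(0, 1, n)) > 4.0

        model = LogisticRegression(max_iter=1000).fit(X, failed)
        auc = roc_auc_score(failed, model.predict_proba(X)[:, 1])
        print(auc)                         # success-rate analogue of the ROC test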

  6. Robust statistical approaches to assess the degree of agreement of clinical data

    Science.gov (United States)

    Grilo, Luís M.; Grilo, Helena L.

    2016-06-01

    To analyze the blood of patients who took vitamin B12 for a period of time, two different medicine measurement methods were used (one is the established method, with more human intervention, and the other method uses essentially machines). Given the non-normality of the differences between both measurement methods, the limits of agreement are estimated using also a non-parametric approach to assess the degree of agreement of the clinical data. The bootstrap resampling method is applied in order to obtain robust confidence intervals for the mean and median of the differences. The approaches used are easy to apply with user-friendly software, and their outputs are also easy to interpret. In this case study the results obtained with (non)parametric approaches lead us to different statistical conclusions, but the decision whether agreement is acceptable or not is always a clinical judgment.

  7. Multivariate statistical techniques for the assessment of seasonal variations in surface water quality of pasture ecosystems.

    Science.gov (United States)

    Ajorlo, Majid; Abdullah, Ramdzani B; Yusoff, Mohd Kamil; Halim, Ridzwan Abd; Hanif, Ahmad Husni Mohd; Willms, Walter D; Ebrahimian, Mahboubeh

    2013-10-01

    This study investigates the applicability of multivariate statistical techniques including cluster analysis (CA), discriminant analysis (DA), and factor analysis (FA) for the assessment of seasonal variations in the surface water quality of tropical pastures. The study was carried out in the TPU catchment, Kuala Lumpur, Malaysia. The dataset consisted of 1-year monitoring of 14 parameters at six sampling sites. The CA yielded two groups of similarity between the sampling sites, i.e., less polluted (LP) and moderately polluted (MP) at temporal scale. Fecal coliform (FC), NO3, DO, and pH were significantly related to the stream grouping in the dry season, whereas NH3, BOD, Escherichia coli, and FC were significantly related to the stream grouping in the rainy season. The best predictors for distinguishing clusters in temporal scale were FC, NH3, and E. coli, respectively. FC, E. coli, and BOD with strong positive loadings were introduced as the first varifactors in the dry season which indicates the biological source of variability. EC with a strong positive loading and DO with a strong negative loading were introduced as the first varifactors in the rainy season, which represents the physiochemical source of variability. Multivariate statistical techniques were effective analytical techniques for classification and processing of large datasets of water quality and the identification of major sources of water pollution in tropical pastures.

  8. OPLS statistical model versus linear regression to assess sonographic predictors of stroke prognosis.

    Science.gov (United States)

    Vajargah, Kianoush Fathi; Sadeghi-Bazargani, Homayoun; Mehdizadeh-Esfanjani, Robab; Savadi-Oskouei, Daryoush; Farhoudi, Mehdi

    2012-01-01

    The objective of the present study was to assess the comparable applicability of orthogonal projections to latent structures (OPLS) statistical modeling vs. traditional linear regression in order to investigate the role of transcranial Doppler (TCD) sonography in predicting ischemic stroke prognosis. The study was conducted on 116 ischemic stroke patients admitted to a specialty neurology ward. The Unified Neurological Stroke Scale was used once for clinical evaluation on the first week of admission and again six months later. All data were primarily analyzed using simple linear regression and later considered for multivariate analysis using PLS/OPLS models through the SIMCA P+12 statistical software package. The linear regression analysis results used for the identification of TCD predictors of stroke prognosis were confirmed through the OPLS modeling technique. Moreover, in comparison to linear regression, the OPLS model appeared to have higher sensitivity in detecting the predictors of ischemic stroke prognosis and detected several more predictors. Applying the OPLS model made it possible to use both single TCD measures/indicators and arbitrarily dichotomized measures of TCD single vessel involvement as well as the overall TCD result. In conclusion, the authors recommend PLS/OPLS methods as complementary rather than alternative to the available classical regression models such as linear regression.
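
    scikit-learn offers PLS regression (not OPLS), which can stand in for the latent-structure modeling described above; the TCD measures and outcome below are simulated, so only the workflow, not the clinical findings, is illustrated.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(6)
        n = 116                            # patients, matching the study size
        X = rng.normal(size=(n, 8))        # hypothetical per-vessel TCD measures
        y = X @ np.array([0.5, 0.3, 0, 0, 0.2, 0, 0, 0]) + rng.normal(0, 1, n)

        pls = PLSRegression(n_components=2).fit(X, y)
        print(pls.score(X, y))                  # R^2 of the latent-structure fit
        print(np.round(pls.coef_.ravel(), 2))   # predictor weights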

  9. Proper Assessment of the JFK Assassination Bullet Lead Evidence from Metallurgical and Statistical Perspectives

    Energy Technology Data Exchange (ETDEWEB)

    Randich, E; Grant, P M

    2006-08-29

    The bullet evidence in the JFK assassination investigation was reexamined from metallurgical and statistical standpoints. The questioned specimens are comprised of soft lead, possibly from full-metal-jacketed Mannlicher-Carcano, 6.5-mm ammunition. During lead refining, contaminant elements are removed to specified levels for a desired alloy or composition. Microsegregation of trace and minor elements during lead casting and processing can account for the experimental variabilities measured in various evidentiary and comparison samples by laboratory analysts. Thus, elevated concentrations of antimony and copper at crystallographic grain boundaries, the widely varying sizes of grains in Mannlicher-Carcano bullet lead, and the 5-60 mg bullet samples analyzed for assassination intelligence effectively resulted in operational sampling error for the analyses. This deficiency was not considered in the original data interpretation and resulted in an invalid conclusion in favor of the single-bullet theory of the assassination. Alternate statistical calculations, based on the historic analytical data, incorporating weighted averaging and propagation of experimental uncertainties also considerably weaken support for the single-bullet theory. In effect, this assessment of the material composition of the lead specimens from the assassination concludes that the extant evidence is consistent with any number between two and five rounds fired in Dealey Plaza during the shooting.

  10. Proper assessment of the JFK assassination bullet lead evidence from metallurgical and statistical perspectives.

    Science.gov (United States)

    Randich, Erik; Grant, Patrick M

    2006-07-01

    The bullet evidence in the JFK assassination investigation was reexamined from metallurgical and statistical standpoints. The questioned specimens are composed of soft lead, possibly from full-metal-jacketed Mannlicher-Carcano (MC), 6.5-mm ammunition. During lead refining, contaminant elements are removed to specified levels for a desired alloy or composition. Microsegregation of trace and minor elements during lead casting and processing can account for the experimental variabilities measured in various evidentiary and comparison samples by laboratory analysts. Thus, elevated concentrations of antimony and copper at crystallographic grain boundaries, the widely varying sizes of grains in MC bullet lead, and the 5-60 mg bullet samples analyzed for assassination intelligence effectively resulted in operational sampling error for the analyses. This deficiency was not considered in the original data interpretation and resulted in an invalid conclusion in favor of the single-bullet theory of the assassination. Alternate statistical calculations, based on the historic analytical data and incorporating weighted averaging and propagation of experimental uncertainties, also considerably weaken support for the single-bullet theory. In effect, this assessment of the material composition of the lead specimens from the assassination concludes that the extant evidence is consistent with any number between two and five rounds fired in Dealey Plaza during the shooting.
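    The weighted-averaging step referred to above amounts to an inverse-variance weighted mean with a propagated uncertainty; a minimal sketch with invented replicate values:

        # Sketch: inverse-variance weighted mean of replicate composition
        # measurements with propagated uncertainty (all values hypothetical).
        import numpy as np

        sb_ppm = np.array([790.0, 815.0, 805.0])   # antimony concentrations
        sigma  = np.array([25.0, 30.0, 20.0])      # 1-sigma measurement errors

        w = 1.0 / sigma**2
        mean = np.sum(w * sb_ppm) / np.sum(w)
        err  = np.sqrt(1.0 / np.sum(w))            # propagated 1-sigma error
        print(f"weighted mean = {mean:.1f} +/- {err:.1f} ppm")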

  11. Statistical Analysis of Meteorological Data to Assess Evapotranspiration and Infiltration at the Rifle Site, CO, USA

    Science.gov (United States)

    Faybishenko, B.; Long, P. E.; Tokunaga, T. K.; Christensen, J. N.

    2015-12-01

    Net infiltration to the vadose zone, especially in arid or semi-arid climates, is an important control on microbial activity and solute and greenhouse gas fluxes. To assess net infiltration, we performed a statistical analysis of meteorological data as the basis for hydrological and climatic investigations and predictions for the Rifle site, Colorado, USA, located within a floodplain in a mountainous region along the Colorado River, with a semi-arid climate. We carried out a statistical analysis of meteorological 30-year time series data (1985-2015), including: (1) precipitation data, taking into account the evaluation of the snowmelt, (2) evaluation of the evapotranspiration (reference and actual), (3) estimation of the multi-timescale Standardized Precipitation-Evapotranspiration Index (SPEI), (4) evaluation of the net infiltration rate, and (5) corroborative analysis of the calculated net infiltration rate and groundwater recharge from radioisotopic measurements of samples collected in 2013. We determined that the annual net infiltration varies from 4.7% to ~18% of precipitation, with a mean of ~10%, and concluded that calculations of net infiltration based on long-term meteorological data are comparable with those from strontium isotopic investigations. The evaluation of the SPEI showed an intermittent pattern of droughts and wet periods over the past 30 years, with a detectable decrease in the duration of droughts with time. Local measurements within the floodplain indicate a recharge gradient with increased recharge closer to the Colorado River.
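    A minimal sketch of the closing water-balance arithmetic, with invented annual values chosen to land near the reported ~10% mean:

        # Sketch: annual net infiltration as the residual of a simple water
        # balance, expressed as a percentage of precipitation (values invented).
        precip_mm = 280.0        # annual precipitation incl. snowmelt
        aet_mm    = 252.0        # actual evapotranspiration estimate
        runoff_mm = 0.0          # assumed negligible on the floodplain

        net_inf = precip_mm - aet_mm - runoff_mm
        print(f"net infiltration = {net_inf:.0f} mm "
              f"({100 * net_inf / precip_mm:.1f}% of precipitation)")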

  12. Assessment of toxic interactions of heavy metals in binary mixtures: A statistical approach

    Science.gov (United States)

    Ince; Dirilgen; Apikyan; Tezcanli; Ustun

    1999-05-01

    Toxicity of zinc, copper, cobalt, and chromium ions and their binary interactions were studied at varying test levels by using a battery of two tests, Microtox and duckweed, with Vibrio fischeri and Lemna minor as test organisms, respectively. The type of toxic interaction at each test combination was assessed by a statistical approach based on testing the null hypothesis of "additive toxicity" at the 95% confidence level. The interactions were called "antagonistic," "additive," or "synergistic" in accordance with the statistical significance and the sign of the difference between the tested hypothesis and the value of the observed toxicity at the binary test level concerned. In the majority of the combinations studied by the two bioassays, the interactions were of an antagonistic nature. Additive toxicity was the next most frequently predicted interaction in both tests, the frequency being much higher in Microtox responses than in those of duckweed. Finally, synergism was found to be a rare interaction in Microtox results, and absent altogether in duckweed within the selected test combinations.
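    The additivity test can be sketched as a one-sample t-test of observed mixture responses against the response predicted under additive toxicity; all numbers below are invented:

        # Sketch: testing the null hypothesis of additive toxicity. Replicate
        # observed inhibition (%) in a binary mixture is compared with the
        # inhibition predicted under additivity.
        from scipy import stats

        observed = [38.0, 41.5, 36.2, 39.8]   # replicate mixture responses
        predicted_additive = 52.0             # combined single-metal prediction

        t, p = stats.ttest_1samp(observed, predicted_additive)
        if p < 0.05:
            kind = "antagonistic" if t < 0 else "synergistic"
        else:
            kind = "additive"
        print(f"t = {t:.2f}, p = {p:.3f} -> interaction: {kind}")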

  13. A Statistical Method for Assessing Peptide Identification Confidence in Accurate Mass and Time Tag Proteomics

    Energy Technology Data Exchange (ETDEWEB)

    Stanley, Jeffrey R.; Adkins, Joshua N.; Slysz, Gordon W.; Monroe, Matthew E.; Purvine, Samuel O.; Karpievitch, Yuliya V.; Anderson, Gordon A.; Smith, Richard D.; Dabney, Alan R.

    2011-07-15

    High-throughput proteomics is rapidly evolving to require high mass measurement accuracy for a variety of different applications. Increased mass measurement accuracy in bottom-up proteomics specifically allows for an improved ability to distinguish and characterize detected MS features, which may in turn be identified by, e.g., matching to entries in a database for both precursor and fragmentation mass identification methods. Many tools exist with which to score the identification of peptides from LC-MS/MS measurements or to assess matches to an accurate mass and time (AMT) tag database, but these two calculations remain distinctly unrelated. Here we present a statistical method, Statistical Tools for AMT tag Confidence (STAC), which extends our previous work by incorporating prior probabilities of correct sequence identification from LC-MS/MS, as well as the quality with which LC-MS features match AMT tags, to evaluate peptide identification confidence. Compared to existing tools, we are able to obtain significantly more high-confidence peptide identifications at a given false discovery rate and additionally assign confidence estimates to individual peptide identifications. Freely available implementations of STAC exist as both a command-line tool and a Windows graphical application.

  14. Enhanced statistical tests for GWAS in admixed populations: assessment using African Americans from CARe and a Breast Cancer Consortium.

    Directory of Open Access Journals (Sweden)

    Bogdan Pasaniuc

    2011-04-01

    While genome-wide association studies (GWAS) have primarily examined populations of European ancestry, more recent studies often involve additional populations, including admixed populations such as African Americans and Latinos. In admixed populations, linkage disequilibrium (LD) exists both at a fine scale in ancestral populations and at a coarse scale (admixture-LD) due to chromosomal segments of distinct ancestry. Disease association statistics in admixed populations have previously considered SNP association (LD mapping) or admixture association (mapping by admixture-LD), but not both. Here, we introduce a new statistical framework for combining SNP and admixture association in case-control studies, as well as methods for local ancestry-aware imputation. We illustrate the gain in statistical power achieved by these methods by analyzing data from 6,209 unrelated African Americans from the CARe project genotyped on the Affymetrix 6.0 chip, in conjunction with both simulated and real phenotypes, as well as by analyzing the FGFR2 locus using breast cancer GWAS data from 5,761 African-American women. We show that, at typed SNPs, our method yields an 8% increase in statistical power for finding disease risk loci compared to the power achieved by standard methods in case-control studies. At imputed SNPs, we observe an 11% increase in statistical power for mapping disease loci when our local ancestry-aware imputation framework and the new scoring statistic are jointly employed. Finally, we show that our method increases statistical power in regions harboring the causal SNP in the case when the causal SNP is untyped and cannot be imputed. Our methods and our publicly available software are broadly applicable to GWAS in admixed populations.
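    A generic illustration of the combination idea (assuming independence, two 1-df chi-square statistics sum to a 2-df statistic; this is not the paper's own scoring statistic):

        # Sketch: combining a SNP-association statistic with an admixture-
        # association statistic into a single 2-df test (hypothetical values).
        from scipy.stats import chi2

        chi2_snp, chi2_adm = 6.8, 4.1          # hypothetical 1-df statistics
        combined = chi2_snp + chi2_adm
        p = chi2.sf(combined, df=2)
        print(f"combined chi2 = {combined:.1f}, p = {p:.2e}")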

  15. Wind power potential assessment for three locations in Algeria

    Energy Technology Data Exchange (ETDEWEB)

    Himri, Y. [Electricity and Gas National Enterprise (Sonelgaz), Bechar (Algeria); Rehman, S. [Engineering Analysis Section, Center for Engineering Research, Research Institute, King Fahd University of Petroleum and Minerals, Box 767, Dhahran 31261 (Saudi Arabia); Draoui, B. [Department of Mechanical Engineering, University of Bechar (Algeria); Himri, S. [Department of Fundamental Sciences, University of Bechar (Algeria)

    2008-12-15

    This paper utilized wind speed data over a period of almost 10 years between 1977 and 1988 from three stations, namely Adrar, Timimoun and Tindouf, to assess the wind power potential at these sites. The long-term annual mean wind speed values along with the wind turbine power curve values were used to estimate the annual energy output for a 30 MW installed capacity wind farm at each site. A total of 30 wind turbines, each of 1000 kW rated power, were used in the analysis. The long-term mean wind speed at Adrar, Timimoun and Tindouf was 5.9, 5.1 and 4.3 m/s at 10 m above ground level (AGL), respectively. Higher wind speeds were observed in the daytime between 09:00 and 18:00 h, and relatively lower speeds during the rest of the period. Wind farms of 30 MW installed capacity at Adrar, Timimoun and Tindouf, if developed, could produce 98,832, 78,138 and 56,040 MWh of electricity annually, taking into consideration the temperature and pressure adjustment coefficients of about 6% and all other losses of about 10%. The plant capacity factors at Adrar, Timimoun and Tindouf were found to be 38%, 30% and 21%, respectively. Finally, the cost of energy (COE) was found to be 3.1, 4.3 and 6.6 US cents/kWh at Adrar, Timimoun and Tindouf, respectively. It was noticed that such a development at these sites could avoid the release of 48,577, 38,406 and 27,544 tons/year of CO2-equivalent greenhouse gases (GHG) into the local atmosphere, thus creating a clean and healthy atmosphere for local inhabitants. (author)
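    The reported plant capacity factors follow from the annual yields by simple arithmetic; for example, for Adrar:

        # Sketch: reproducing the plant capacity factor from the reported
        # annual energy yield of a 30 MW wind farm (Adrar figures).
        rated_mw = 30.0
        annual_mwh = 98_832.0                  # reported energy at Adrar

        capacity_factor = annual_mwh / (rated_mw * 8760.0)
        print(f"capacity factor = {capacity_factor:.0%}")   # ~38%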

  16. Assessment of conducting polymer applications in power equipment technology

    Energy Technology Data Exchange (ETDEWEB)

    Schoch, K.F. Jr.; Bennett, A.I.; Burghardt, R.R.; Cookson, A.H.; Kennedy, W.N.; Oommen, T.V.; Saunders, H.E.; Smith, J.D.B.; Voshall, R.E. (Westinghouse Electric Corp., Pittsburgh, PA (USA)); Fort, E.M. (Westinghouse Electric Corp., Orlando, FL (USA)); Robbins, B. (Reynolds Metals Co., Richmond, VA (USA))

    1991-05-01

    This report describes for the first time the state-of-the-art in conducting polymer technology specifically relating to electric power apparatus for transmission, distribution and generation. Thirty-two new applications in power equipment are proposed and assessed. The areas of the proposed applications include solid dielectric cable, oil-filled cable, capacitors, transformers, rotating machines, bushings, surge suppressors, vacuum interrupters, gas-insulated equipment, and miscellaneous applications. The best applications will result in improved reliability and efficiency, design innovations, and simpler manufacturing procedures by taking advantage of the particular characteristics of conducting polymers. These characteristics include good control of conductivity over a range of 10^-8 to 10^3 S/cm, compatibility with organic compounds, simple preparation, and development of anisotropic conductivity by polymer orientation. The proposed applications were evaluated according to technical impact, probability of success, economic impact, and time frame for implementation. The state-of-the-art of conducting polymers is also reviewed and areas requiring further research for these applications are discussed. Because of substantial recent progress in developing more practical conducting polymer materials, now is an excellent time to pursue the additional research needed. 37 refs., 16 figs., 15 tabs.

  17. Water Quality Assessment of Gufu River in Three Gorges Reservoir (China) Using Multivariable Statistical Methods

    Directory of Open Access Journals (Sweden)

    Jiwen Ge

    2013-07-01

    To provide a reasonable basis for the scientific management of water resources, with directive significance for sustaining the health of the Gufu River and even maintaining the stability of the water ecosystem of the Three Gorges Reservoir of the Yangtze River, central China, multiple statistical methods including Cluster Analysis (CA), Discriminant Analysis (DA) and Principal Component Analysis (PCA) were performed to assess the spatial-temporal variations and interpret the water quality data. The data were obtained during one year (2010~2011) of monitoring of 13 parameters at 21 different sites (3003 observations). Hierarchical CA classified the 11 months into 2 periods (the first and second periods) and the 21 sampling sites into 2 clusters, namely the upper reaches with little anthropogenic interference (UR) and the lower reaches running through farming areas and towns that are subject to some human interference (LR), based on similarities in the water quality characteristics. Eight significant parameters (total phosphorus, total nitrogen, temperature, nitrate nitrogen, total organic carbon, total hardness, total alkalinity and silicon dioxide) were identified by DA, affording 100% correct assignations for temporal variation analysis, and five significant parameters (total phosphorus, total nitrogen, ammonia nitrogen, electrical conductivity and total organic carbon) were confirmed with 88% correct assignations for spatial variation analysis. PCA (with varimax rotation) was applied to identify potential pollution sources based on the two clustered regions. Four Principal Components (PCs) with 91.19 and 80.57% total variance were obtained for the Upper Reaches (UR) and Lower Reaches (LR) regions, respectively. For the UR region, rainfall runoff, soil erosion, scouring weathering of crustal materials and forest areas are the main sources of pollution. The pollution sources for the LR region are anthropogenic sources (domestic and agricultural runoff
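    A generic sketch of the CA-plus-PCA workflow on a standardized site-by-parameter matrix (synthetic data; the library choices are illustrative, not the study's software):

        # Sketch: hierarchical clustering of sites plus PCA on standardized
        # water-quality data of the study's dimensions (synthetic values).
        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(1)
        X = rng.normal(size=(21, 13))                  # 21 sites x 13 parameters
        Xs = StandardScaler().fit_transform(X)

        clusters = fcluster(linkage(Xs, method="ward"), t=2, criterion="maxclust")
        pcs = PCA(n_components=4).fit(Xs)
        print("site clusters:", clusters)
        print("explained variance:", pcs.explained_variance_ratio_.round(2))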

  18. Fighting bias with statistics: Detecting gender differences in responses to items on a preschool science assessment

    Science.gov (United States)

    Greenberg, Ariela Caren

    Differential item functioning (DIF) and differential distractor functioning (DDF) are methods used to screen for item bias (Camilli & Shepard, 1994; Penfield, 2008). Using an applied empirical example, this mixed-methods study examined the congruency and relationship of DIF and DDF methods in screening multiple-choice items. Data for Study I were drawn from item responses of 271 female and 236 male low-income children on a preschool science assessment. Item analyses employed a common statistical approach, the Mantel-Haenszel log-odds ratio (MH-LOR), to detect DIF in dichotomously scored items (Holland & Thayer, 1988), and extended the approach to identify DDF (Penfield, 2008). Findings demonstrated that using MH-LOR to detect DIF and DDF supported the theoretical relationship that the magnitude and form of DIF are dependent on the DDF effects, and demonstrated the advantages of studying DIF and DDF in multiple-choice items. A total of 4 items with DIF and DDF and 5 items with only DDF were detected. Study II incorporated an item content review, an important but often overlooked and under-published step of DIF and DDF studies (Camilli & Shepard). Interviews with 25 female and 22 male low-income preschool children and an expert review helped to interpret the DIF and DDF results and their comparison, and determined that a content review process of studied items can reveal reasons for potential item bias that are often congruent with the statistical results. Patterns emerged and are discussed in detail. The quantitative and qualitative analyses were conducted in an applied framework of examining the validity of the preschool science assessment scores for evaluating science programs serving low-income children; however, the techniques can be generalized for use with measures across various disciplines of research.
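    The MH-LOR itself can be computed directly from the stratified 2x2 tables; a sketch with invented counts:

        # Sketch: Mantel-Haenszel common odds ratio (and its log, MH-LOR)
        # across ability strata for one item. Each stratum table is
        # [[group-1 correct, group-1 incorrect],
        #  [group-2 correct, group-2 incorrect]].
        import numpy as np

        strata = [np.array([[30, 20], [25, 25]]),
                  np.array([[40, 10], [32, 18]]),
                  np.array([[20, 5],  [18, 7]])]

        num = sum(t[0, 0] * t[1, 1] / t.sum() for t in strata)
        den = sum(t[0, 1] * t[1, 0] / t.sum() for t in strata)
        mh_or = num / den
        print(f"MH odds ratio = {mh_or:.2f}, MH-LOR = {np.log(mh_or):.2f}")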

  19. Statistical characteristics of the observed Ly-α forest and the shape of initial power spectrum

    Science.gov (United States)

    Demiański, M.; Doroshkevich, A. G.; Turchaninov, V.

    2003-04-01

    Properties of approximately 4500 observed Ly-α absorbers are investigated using the model of formation and evolution of dark matter (DM) structure elements based on the modified Zel'dovich theory. This model is generally consistent with simulations of absorber formation, describes the large-scale structure (LSS) observed in the galaxy distribution at small redshifts reasonably well, and emphasizes the generic similarity of the LSS and absorbers. The simple physical model of absorbers asserts that they are composed of DM and gaseous matter. It allows us to estimate the column density and overdensity of the DM and gaseous components and the entropy of the gas trapped within the DM potential wells. The parameters of the DM component are found to be consistent with theoretical expectations for Gaussian initial perturbations with a warm dark matter-like power spectrum. The basic physical factors responsible for the evolution of the absorbers are discussed. The analysis of the redshift distribution of absorbers confirms the self-consistency of the adopted physical model and the Gaussianity of the initial perturbations, and allows one to estimate the shape of the initial power spectrum at small scales which, in turn, restricts the mass of the dominant fraction of DM particles to M_DM ≥ 1.5-5 keV. Our results indicate possible redshift variation of the intensity of the ultraviolet background by a factor of approximately 2-3 at redshifts z ~ 2-3.

  20. Statistical modelling and power analysis for detecting trends in total suspended sediment loads

    Science.gov (United States)

    Wang, You-Gan; Wang, Shen S. J.; Dunlop, Jason

    2015-01-01

    The export of sediments from coastal catchments can have detrimental impacts on estuaries and nearshore reef ecosystems such as the Great Barrier Reef. Catchment management approaches aimed at reducing sediment loads require monitoring to evaluate their effectiveness in reducing loads over time. However, load estimation is not a trivial task due to the complex behaviour of constituents in natural streams, the variability of water flows and often a limited amount of data. Regression is commonly used for load estimation and provides a fundamental tool for trend estimation by standardising the other time-specific covariates such as flow. This study investigates whether load estimates and the resultant power to detect trends can be enhanced by (i) modelling the error structure so that temporal correlation can be better quantified, (ii) making use of predictive variables, and (iii) identifying an efficient and feasible sampling strategy that may be used to reduce sampling error. To achieve this, we propose a new regression model that includes an innovative compounding-errors model structure and uses two additional predictive variables (average discounted flow and turbidity). By combining this modelling approach with a new, regularly optimised sampling strategy, which adds uniformity to the event sampling strategy, the power to detect trends was increased to 90%. Using the enhanced regression model proposed here, it was possible to detect a trend of 20% over 20 years. This result is in stark contrast to previous conclusions presented in the literature.
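    A simulation sketch of this kind of trend power analysis, with illustrative AR(1) error parameters (plain OLS p-values ignore the autocorrelation, so this simple version is somewhat optimistic):

        # Sketch: simulation-based power to detect a 20% decline over 20 years
        # in annual log-loads with AR(1) errors (parameters are illustrative).
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        years = np.arange(20)
        trend = np.log(0.8) / 20 * years       # 20% total decline on log scale
        phi, sigma, n_sim, hits = 0.4, 0.15, 2000, 0

        for _ in range(n_sim):
            e = np.zeros(20)
            for t in range(1, 20):
                e[t] = phi * e[t - 1] + rng.normal(scale=sigma)
            res = stats.linregress(years, trend + e)
            hits += res.pvalue < 0.05 and res.slope < 0
        print(f"power = {hits / n_sim:.0%}")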

  1. A statistical analysis to assess the maturity and stability of six composts.

    Science.gov (United States)

    Komilis, Dimitrios P; Tziouvaras, Ioannis S

    2009-05-01

    Despite the long-standing application of organic waste derived composts to crops, there is still no universally accepted index to assess compost maturity and stability. The research presented in this article investigated the suitability of seven types of seeds for use in germination bioassays to assess the maturity and phytotoxicity of six composts. The composts used in the study were derived from cow manure, sea weeds, olive pulp, poultry manure and municipal solid waste. The seeds used in the germination bioassays were radish, pepper, spinach, tomato, cress, cucumber and lettuce. Data were analyzed with an analysis of variance at two levels and with pair-wise comparisons. The analysis revealed that composts rendered as phytotoxic to one type of seed could enhance the growth of another type of seed. Therefore, germination indices, which ranged from 0% to 262%, were highly dependent on the type of seed used in the germination bioassay. The poultry manure compost was highly phytotoxic to all seeds. At the 99% confidence level, the type of seed and the interaction between the seeds and the composts were found to significantly affect germination. In addition, the stability of the composts was assessed by their microbial respiration, which ranged from approximately 4 to 16 g O2/kg organic matter and from 2.6 to approximately 11 g CO2-C/kg C, after seven days. Initial average oxygen uptake rates were all less than approximately 0.35 g O2/kg organic matter/h for all six composts. A high and statistically significant correlation coefficient was calculated between the cumulative carbon dioxide production, over a 7-day period, and the radish seed germination index. It appears that a germination bioassay with radish can be a valid test to assess both compost stability and compost phytotoxicity.

  2. Statistical and regulatory considerations in assessments of interchangeability of biological drug products.

    Science.gov (United States)

    Tóthfalusi, Lászlo; Endrényi, László; Chow, Shein-Chung

    2014-05-01

    When the patent of a brand-name, marketed drug expires, new generic products are usually offered. Small-molecule generic and originator drug products are expected to be chemically identical. Their pharmaceutical similarity can typically be assessed by simple regulatory criteria, such as the expectation that the 90% confidence interval for the ratio of geometric means of some pharmacokinetic parameters be between 0.80 and 1.25. When such criteria are satisfied, the drug products are generally considered to exhibit therapeutic equivalence. They are then usually interchanged freely within individual patients. Biological drugs are complex proteins because of, for instance, their large size, intricate structure, sensitivity to environmental conditions, difficult manufacturing procedures, and the possibility of immunogenicity. Generic and brand-name biologic products can be expected to show only similarity, not identity, in their various features and clinical effects. Consequently, the determination of biosimilarity is also a complicated process which involves assessment of the totality of the evidence for the close similarity of the two products. Moreover, even when biosimilarity has been established, it may not be assumed that the two biosimilar products can be automatically substituted by pharmacists. This generally requires additional, careful considerations. Without declaring interchangeability, a new product could still be prescribed, i.e. it is prescribable. However, two products can be automatically substituted only if they are interchangeable. Interchangeability is a statistical term and it means that products can be used in any order in the same patient without considering the treatment history. The concepts of interchangeability and prescribability have been widely discussed in the past but only in relation to small-molecule generics. In this paper we apply these concepts to biosimilars and we discuss: definitions of prescribability and interchangeability and
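    The small-molecule criterion mentioned above can be sketched as a 90% confidence interval for the ratio of geometric means, computed on the log scale (paired data invented for illustration):

        # Sketch: average-bioequivalence check -- a 90% CI for the ratio of
        # geometric means of a PK parameter (e.g., AUC) from paired data.
        import numpy as np
        from scipy import stats

        auc_test = np.array([102., 95., 110., 98., 105., 99.])
        auc_ref  = np.array([100., 97., 104., 101., 108., 96.])

        d = np.log(auc_test) - np.log(auc_ref)
        se = d.std(ddof=1) / np.sqrt(len(d))
        t90 = stats.t.ppf(0.95, df=len(d) - 1)
        lo, hi = np.exp(d.mean() - t90 * se), np.exp(d.mean() + t90 * se)
        print(f"90% CI for GMR: ({lo:.3f}, {hi:.3f}); BE if within (0.80, 1.25)")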

  3. Statistical characteristics of observed Ly-$\\alpha$ forest and the shape of linear power spectrum

    CERN Document Server

    Demianski, M

    2005-01-01

    Properties of $\\sim$ 6 000 Ly-$\\alpha$ absorbers observed in 19 high resolution spectra of QSOs are investigated using the model of formation and evolution of DM structure elements based on the Zel'dovich theory. This model asserts that absorbers are formed in the course of both linear and nonlinear adiabatic or shock compression of dark matter (DM) and gaseous matter. It allows us to link the column density and overdensity of DM and gaseous components with the observed column density of neutral hydrogen, redshifts and Doppler parameters of absorbers and demonstrates that at high redshifts we observe a self similar period of structure evolution with the Gaussian initial perturbations. We show that the colder absorbers are associated with rapidly expanded regions of a galactic scale which represent large amplitude negative density perturbations. We extend and improve the method of measuring the power spectrum of initial perturbations proposed in Demia\\'nski & Doroshkevich (2003b). Our method links the obse...

  4. The Power (Law) of Indian Markets: Analysing NSE and BSE trading statistics

    CERN Document Server

    Sinha, S; Sinha, Sitabhra; Pan, Raj Kumar

    2006-01-01

    The nature of fluctuations in the Indian financial market is analyzed in this paper. We have looked at the price returns of individual stocks, with tick-by-tick data from the National Stock Exchange (NSE) and daily closing price data from both NSE and the Bombay Stock Exchange (BSE), the two largest exchanges in India. We find that the price returns in Indian markets follow a fat-tailed cumulative distribution, consistent with a power law having exponent $\alpha \sim 3$, similar to that observed in developed markets. However, the distributions of trading volume and the number of trades differ in nature from those seen in the New York Stock Exchange (NYSE). Further, the price movements of different stocks are highly correlated in Indian markets.
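    A standard way to estimate such a tail exponent is the Hill estimator over the k largest absolute returns; a sketch on heavy-tailed synthetic data:

        # Sketch: Hill estimator of the tail exponent alpha for absolute
        # returns, of the kind used to check the "inverse cubic" law.
        import numpy as np

        rng = np.random.default_rng(3)
        returns = rng.standard_t(df=3, size=100_000)   # heavy-tailed stand-in

        x = np.sort(np.abs(returns))[::-1]             # descending order
        k = 1000                                       # number of tail points
        alpha = k / np.sum(np.log(x[:k] / x[k]))
        print(f"Hill tail exponent ~ {alpha:.2f}")     # near 3 for t(3) data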

  5. Statistical characteristics of observed Ly-$\\alpha$ forest and the shape of initial power spectrum

    CERN Document Server

    Demianski, M

    2002-01-01

    Properties of $\\sim$ 5000 observed Ly-$\\alpha$ absorbers are investigated using the model of formation and evolution of DM structure elements based on the Zel'dovich theory. This model is generally consistent with simulations of absorbers formation, accurately describes the Large Scale Structure observed in the galaxy distribution at small redshifts and emphasizes the generic similarity of the LSS and absorbers. The simple physical model of absorbers asserts that they are composed of DM and gaseous matter and it allows us to estimate the column density and overdensity of DM and gaseous components and the entropy of the gas trapped within the DM potential wells. The parameters of DM component are found to be consistent with theoretical expectations for the Gaussian initial perturbations with the WDM--like power spectrum. We demonstrate the influence of the main physical factors responsible for the absorbers evolution. The analysis of redshift distribution of absorbers confirms the self consistence of the assum...

  6. Wind power prognosis statistical system; Sistema estadistico de pronostico de la energia eoloelectrica

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez Garcia, Alfredo; De la Torre Vega, Eli [Instituto de Investigaciones Electricas, Cuernavaca, Morelos (Mexico)

    2009-07-01

    The integration of the first large-scale wind farm (La Venta II) into the National Interconnected System requires taking into account the random and discontinuous nature of wind energy. An important tool for this task is a system for short-term wind energy forecasting. For this reason, the Instituto de Investigaciones Electricas (IIE) developed a statistical model to produce this forecast. The prediction is made through an adaptive linear combination of alternative competing models, where the weights given to each model are based on its most recent forecast quality. The application results of the forecasting system are also presented and analyzed.
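    A generic sketch of an adaptive combination in this spirit (inverse recent-MSE weights; not the IIE system's actual algorithm):

        # Sketch: adaptive linear combination of competing forecasts, with
        # weights driven by each model's recent forecast errors.
        import numpy as np

        def combine(forecasts, recent_errors, eps=1e-9):
            """forecasts: (n_models,); recent_errors: (n_models, window)."""
            mse = np.mean(np.asarray(recent_errors) ** 2, axis=1)
            w = 1.0 / (mse + eps)          # better recent quality -> more weight
            w /= w.sum()
            return float(np.dot(w, forecasts)), w

        combo, w = combine([42.0, 50.0, 46.5],
                           [[3.0, 2.5, 4.0], [9.0, 7.5, 8.0], [4.0, 5.0, 3.5]])
        print(f"combined forecast = {combo:.1f} MWh, weights = {w.round(2)}")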

  7. Of Disasters and Dragon Kings: A Statistical Analysis of Nuclear Power Incidents & Accidents

    CERN Document Server

    Wheatley, Spencer; Sornette, Didier

    2015-01-01

    We provide, and perform a risk-theoretic statistical analysis of, a dataset that is 75 percent larger than the previous best dataset on nuclear incidents and accidents, comparing three measures of severity: INES (International Nuclear Event Scale), radiation released, and damage in dollar losses. The annual rate of nuclear accidents with size above 20 million US$, per plant, decreased from the 1950s until dropping significantly after Chernobyl (April 1986). The rate is now roughly stable at 0.002 to 0.003 per plant per year, i.e., around 1 event per year across the current fleet. The distribution of damage values changed after Three Mile Island (TMI; March 1979), where moderate damages were suppressed but the tail became very heavy, being described by a Pareto distribution with tail index 0.55. Further, there is a runaway disaster regime, associated with the "dragon-king" phenomenon, amplifying the risk of extreme damage. In fact, the damage of the largest event (Fukushima; March 2011) is equal to 60 percent of the total damag...
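    The quoted per-plant rate translates into fleet-level expectations by simple Poisson arithmetic (fleet size assumed ~440 reactors for illustration):

        # Sketch: fleet-level event expectations from the per-plant rate.
        from scipy.stats import poisson

        rate_per_plant, fleet = 0.0025, 440
        lam = rate_per_plant * fleet                   # expected events/year
        print(f"expected events/year = {lam:.2f}")
        print(f"P(at least one event in a year) = {1 - poisson.pmf(0, lam):.2f}")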

  8. Probabilistic Tsunami Hazard Assessment for Nuclear Power Plants in Japan

    Science.gov (United States)

    Satake, K.

    2012-12-01

    Tsunami hazard assessments for nuclear power stations (NPS) in Japan had been conducted by a deterministic method, but probabilistic methods are being adopted following the accident at the Fukushima Daiichi NPS. The deterministic tsunami hazard assessment (DTHA), proposed by the Japan Society of Civil Engineers in 2002 (Yanagisawa et al., 2007, Pageoph), considers various uncertainties through parameter studies. The design tsunami height at the Fukushima NPS was set at 6.1 m, based on parameter studies varying the location, depth, and strike, dip and slip angles of the 1938 off-Fukushima earthquake (M 7.4). The maximum tsunami height for a hypothetical "tsunami earthquake" off Fukushima, similar to the 1896 Sanriku earthquake (Mt 8.2), and that for the 869 Jogan earthquake model (Mw 8.4), were estimated as 15.7 m and 8.9 m, respectively, before the 2011 accident (TEPCO report, 2012). The actual tsunami height at the Fukushima NPS on March 11, 2011 was 12 to 16 m. A probabilistic tsunami hazard assessment (PTHA) has also been proposed by JSCE (An'naka et al., 2007, Pageoph), and was recently adopted in the "Implementation Standard of Tsunami Probabilistic Risk Assessment (PRA) of NPPs" published in 2012 by the Atomic Energy Society of Japan. In PTHA, tsunami hazard curves, or probabilities of exceedance for tsunami heights, are constructed by integrating over aleatory uncertainties. The epistemic uncertainties are treated as branches of logic trees. The logic-tree branches for the earthquake source include the earthquake type, magnitude range, recurrence interval and the parameters of the BPT distribution for recurrent earthquakes. Because no "tsunami earthquake" was recorded off the Fukushima NPS, whether or not a "tsunami earthquake" occurs along the Japan trench off Fukushima was one of the logic-tree branches, and the weight was determined by experts' opinions. Possibilities for multi-segment earthquakes are now added as logic-tree branches, after the 2011 Tohoku earthquake, which is considered as
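    At its core, a PTHA aggregates per-branch hazard estimates with logic-tree weights; a toy sketch with invented numbers:

        # Sketch: weighted aggregation of logic-tree branches. Each branch
        # supplies an annual exceedance probability for a given tsunami
        # height; branch weights (e.g., expert opinion) sum to one.
        import numpy as np

        weights = np.array([0.5, 0.3, 0.2])            # logic-tree branch weights
        p_exceed_10m = np.array([1e-4, 5e-4, 2e-3])    # per-branch annual P(H > 10 m)

        mean_hazard = np.dot(weights, p_exceed_10m)
        print(f"weighted annual exceedance P(H > 10 m) = {mean_hazard:.2e}")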

  9. No-Reference Image Quality Assessment for ZY3 Imagery in Urban Areas Using Statistical Model

    Science.gov (United States)

    Zhang, Y.; Cui, W. H.; Yang, F.; Wu, Z. C.

    2016-06-01

    More and more high-spatial-resolution satellite images are produced with the improvement of satellite technology. However, the quality of the images is not always satisfactory for application. Because of complicated atmospheric conditions and the complex radiation transmission involved in the imaging process, the images often suffer deterioration. In order to assess the quality of remote sensing images over urban areas, we proposed a general-purpose image quality assessment method based on feature extraction and machine learning. We use two types of features at multiple scales: one derived from the shape of the histogram, the other from natural scene statistics based on the Generalized Gaussian distribution (GGD). A 20-D feature vector for each scale is extracted and is assumed to capture the quality-degradation characteristics of the RS image. We use SVM to learn to predict image quality scores from these features. For the evaluation, we constructed a medium-scale dataset for training and testing, with human subjects taking part to give opinion scores for the degraded images. We use ZY3 satellite images over the Wuhan area (a city in China) to conduct experiments. Experimental results show correlation between the predicted scores and the subjective perceptions.
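    One common natural-scene-statistics feature of this kind is the GGD shape parameter estimated by moment matching; a sketch (the specific estimator is illustrative, not necessarily the paper's):

        # Sketch: moment-matching estimate of the Generalized Gaussian shape
        # parameter via the ratio E[x^2] / E[|x|]^2.
        import numpy as np
        from scipy.optimize import brentq
        from scipy.special import gamma

        def ggd_shape(x):
            x = np.asarray(x, dtype=float)
            r = np.mean(x**2) / np.mean(np.abs(x))**2
            f = lambda b: gamma(1/b) * gamma(3/b) / gamma(2/b)**2 - r
            return brentq(f, 0.05, 10.0)   # solve for the shape parameter

        coeffs = np.random.default_rng(4).normal(size=50_000)
        print(f"estimated shape = {ggd_shape(coeffs):.2f}")  # ~2 for Gaussian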

  10. Using Saliency-Weighted Disparity Statistics for Objective Visual Comfort Assessment of Stereoscopic Images

    Science.gov (United States)

    Zhang, Wenlan; Luo, Ting; Jiang, Gangyi; Jiang, Qiuping; Ying, Hongwei; Lu, Jing

    2016-06-01

    Visual comfort assessment (VCA) for stereoscopic images is a particularly significant yet challenging task in the 3D quality-of-experience research field. Although the subjective assessment given by human observers is known as the most reliable way to evaluate the experienced visual discomfort, it is time-consuming and non-systematic. Therefore, it is of great importance to develop objective VCA approaches that can faithfully predict the degree of visual discomfort as human beings do. In this paper, a novel two-stage objective VCA framework is proposed. The main contribution of this study is that the important visual attention mechanism of the human visual system is incorporated for visual comfort-aware feature extraction. Specifically, in the first stage, we first construct an adaptive 3D visual saliency detection model to derive the saliency map of a stereoscopic image, and then a set of saliency-weighted disparity statistics is computed and combined to form a single feature vector representing the stereoscopic image in terms of visual comfort. In the second stage, the high-dimensional feature vector is fused into a single visual comfort score by applying a random forest algorithm. Experimental results on two benchmark databases confirm the superior performance of the proposed approach.

  11. PowerStaTim 1.0 – a new statistical program for computing effect size and statistical power

    Directory of Open Access Journals (Sweden)

    Florin A. Sava

    2008-01-01

    This paper presents the main characteristics of a new software package for computing effect size and statistical power indicators: PowerStaTim 1.0 (Maricuțoiu & Sava, 2007). The first part of the paper presents the rationale for computing effect size and statistical power in psychological research. The second part introduces the reader to the technical characteristics of PowerStaTim 1.0 and to the processing options of this software.
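    For comparison, the two quantities such a program reports, an effect size (Cohen's d) and the corresponding power of a two-sample t-test, can be computed as follows (invented data; statsmodels used for the power step):

        # Sketch: Cohen's d from two samples, then the power of a two-sample
        # t-test at that effect size and sample size.
        import numpy as np
        from statsmodels.stats.power import TTestIndPower

        g1 = np.array([5.1, 4.8, 6.0, 5.5, 4.9, 5.7])
        g2 = np.array([4.2, 4.6, 4.1, 4.9, 4.3, 4.5])
        sp = np.sqrt((g1.var(ddof=1) + g2.var(ddof=1)) / 2)   # pooled SD
        d = (g1.mean() - g2.mean()) / sp
        power = TTestIndPower().power(effect_size=d, nobs1=len(g1),
                                      alpha=0.05, ratio=1.0)
        print(f"Cohen's d = {d:.2f}, power = {power:.2f}")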

  12. Assessing Regional Scale Variability in Extreme Value Statistics Under Altered Climate Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Brunsell, Nathaniel [Univ. of Kansas, Lawrence, KS (United States); Mechem, David [Univ. of Kansas, Lawrence, KS (United States); Ma, Chunsheng [Wichita State Univ., KS (United States)

    2015-02-20

    Recent studies have suggested that low-frequency modes of climate variability can significantly influence regional climate. The climatology associated with extreme events has been shown to be particularly sensitive. This has profound implications for droughts, heat waves, and food production. We propose to examine regional climate simulations conducted over the continental United States by applying a recently developed technique which combines wavelet multi-resolution analysis with information theory metrics. This research is motivated by two fundamental questions concerning the spatial and temporal structure of extreme events. These questions are 1) what temporal scales of the extreme value distributions are most sensitive to alteration by low-frequency climate forcings and 2) what is the nature of the spatial structure of variation in these timescales? The primary objective is to assess to what extent information theory metrics can be useful in characterizing the nature of extreme weather phenomena. Specifically, we hypothesize that (1) changes in the nature of extreme events will impact the temporal probability density functions and that information theory metrics will be sensitive to these changes and (2) via a wavelet multi-resolution analysis, we will be able to characterize the relative contribution of different timescales to the stochastic nature of extreme events. In order to address these hypotheses, we propose a unique combination of an established regional climate modeling approach and advanced statistical techniques to assess the effects of low-frequency modes on climate extremes over North America. The behavior of climate extremes in RCM simulations for the 20th century will be compared with statistics calculated from the United States Historical Climatology Network (USHCN) and simulations from the North American Regional Climate Change Assessment Program (NARCCAP). This effort will serve to establish the baseline behavior of climate extremes, the

  13. Reporting characteristics of meta-analyses in orthodontics: methodological assessment and statistical recommendations.

    Science.gov (United States)

    Papageorgiou, Spyridon N; Papadopoulos, Moschos A; Athanasiou, Athanasios E

    2014-02-01

    Ideally, meta-analyses (MAs) should consolidate the characteristics of orthodontic research in order to produce an evidence-based answer. However, severe flaws are frequently observed in most of them. The aim of this study was to evaluate the statistical methods, the methodology, and the quality characteristics of orthodontic MAs and to assess their reporting quality during the last years. Electronic databases were searched for MAs (with or without a proper systematic review) in the field of orthodontics, indexed up to 2011. The AMSTAR tool was used for quality assessment of the included articles. Data were analyzed with Student's t-test, one-way ANOVA, and generalized linear modelling. Risk ratios with 95% confidence intervals were calculated to represent changes during the years in the reporting of key items associated with quality. A total of 80 MAs with 1086 primary studies were included in this evaluation. Using the AMSTAR tool, 25 (27.3%) of the MAs were found to be of low quality, 37 (46.3%) of medium quality, and 18 (22.5%) of high quality. Specific characteristics like explicit protocol definition, extensive searches, and quality assessment of included trials were associated with a higher AMSTAR score. Model selection and dealing with heterogeneity or publication bias were often problematic in the identified reviews. The number of published orthodontic MAs is constantly increasing, while their overall quality is considered to range from low to medium. Although the number of MAs of medium and high quality seems lately to be rising, several other aspects need improvement to increase their overall quality.

  14. Statistical Power Law due to Reservoir Fluctuations and the Universal Thermostat Independence Principle

    Directory of Open Access Journals (Sweden)

    Tamás Sándor Biró

    2014-12-01

    Certain fluctuations in particle number, \(n\), at fixed total energy, \(E\), lead exactly to a cut-power law distribution in the one-particle energy, \(\omega\), via the induced fluctuations in the phase-space volume ratio, \(\Omega_n(E-\omega)/\Omega_n(E) = (1-\omega/E)^n\). The only parameters are \(1/T = \langle\beta\rangle = \langle n\rangle/E\) and \(q = 1 - 1/\langle n\rangle + \Delta n^2/\langle n\rangle^2\). For the binomial distribution of \(n\) one obtains \(q = 1 - 1/k\); for the negative binomial, \(q = 1 + 1/(k+1)\). These results also represent an approximation for general particle number distributions in the reservoir up to second order in the canonical expansion \(\omega \ll E\). For general systems the average phase-space volume ratio \(\langle e^{S(E-\omega)}/e^{S(E)}\rangle\) to second order delivers \(q = 1 - 1/C + \Delta\beta^2/\langle\beta\rangle^2\) with \(\beta = S'(E)\) and \(C = dE/dT\) the heat capacity. However, \(q

  15. Towards powerful experimental and statistical approaches to study intraindividual variability in labile traits

    Science.gov (United States)

    Fanson, Benjamin G.; Beckmann, Christa; Biro, Peter A.

    2016-01-01

    There is a long-standing interest in behavioural ecology in exploring the causes and correlates of consistent individual differences in mean behavioural traits (‘personality’) and in the response to the environment (‘plasticity’). Recently, it has been observed that individuals also consistently differ in their residual intraindividual variability (rIIV). This variation will probably have broad biological and methodological implications for the study of trait variation in labile traits, such as behaviour and physiology, though we currently need studies that quantify variation in rIIV using more standardized and powerful methodology. Focusing on activity rates in guppies (Poecilia reticulata), we provide a model example, from sampling design to data analysis, of how to quantify rIIV in labile traits. Building on the doubly hierarchical generalized linear model recently used to quantify individual differences in rIIV, we extend the model to evaluate the covariance between individual mean values and their rIIV. After accounting for time-related change in behaviour, our guppies substantially differed in rIIV, and it was the active individuals that tended to be more consistent (lower rIIV). We provide annotated data analysis code to implement these complex models, and discuss how to further generalize the model to evaluate covariances with other aspects of phenotypic variation. PMID:27853550

  16. Statistical power to detect genetic (co)variance of complex traits using SNP data in unrelated samples.

    Directory of Open Access Journals (Sweden)

    Peter M Visscher

    2014-04-01

    We have recently developed analysis methods (GREML) to estimate the genetic variance of a complex trait/disease and the genetic correlation between two complex traits/diseases using genome-wide single nucleotide polymorphism (SNP) data in unrelated individuals. Here we use analytical derivations and simulations to quantify the sampling variance of the estimate of the proportion of phenotypic variance captured by all SNPs for quantitative traits and case-control studies. We also derive the approximate sampling variance of the estimate of a genetic correlation in a bivariate analysis, when two complex traits are either measured on the same or different individuals. We show that the sampling variance is inversely proportional to the number of pairwise contrasts in the analysis and to the variance in SNP-derived genetic relationships. For bivariate analysis, the sampling variance of the genetic correlation additionally depends on the harmonic mean of the proportion of variance explained by the SNPs for the two traits and the genetic correlation between the traits, and depends on the phenotypic correlation when the traits are measured on the same individuals. We provide an online tool for calculating the power of detecting genetic (co)variation using genome-wide SNP data. The new theory and online tool will be helpful in planning experimental designs to estimate the missing heritability that has not yet been fully revealed through genome-wide association studies, and to estimate the genetic overlap between complex traits (diseases), in particular when the traits (diseases) are not measured on the same samples.
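    A back-of-the-envelope sketch of the headline result, using the commonly quoted variance of genomic relationships among unrelated individuals (~2e-5, an assumption here), which implies a standard error of roughly 316/N:

        # Sketch: approximate standard error of a GREML SNP-heritability
        # estimate, using var(h2) ~ 2 / (N^2 * var(genomic relatedness)).
        import numpy as np

        def se_h2(n, var_relatedness=2e-5):   # var_relatedness is an assumption
            return np.sqrt(2.0 / (n**2 * var_relatedness))

        for n in (5_000, 10_000, 50_000):
            print(f"N = {n:6d}: s.e.(h2_SNP) ~ {se_h2(n):.3f}")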

  17. Insulation Diagnosis of Service Aged XLPE Power Cables Using Statistical Analysis and Fuzzy Inference

    Institute of Scientific and Technical Information of China (English)

    LIU Fei; JIANG Pingkai; LEI Qingquan; ZHANG Li; SU Wenqun

    2013-01-01

    Cables that have been in service for over 20 years in Shanghai, a city with abundant surface water, failed more frequently and induced different cable accidents. This necessitates research on the insulation aging state of cables working in special circumstances. We performed multi-parameter tests with samples from about 300 cable lines in Shanghai. The tests included water tree investigation, tensile testing, dielectric spectroscopy, thermogravimetric analysis (TGA), Fourier transform infrared spectroscopy (FTIR), and electrical aging tests. Then, we carried out regression analysis between every two test parameters. Moreover, through two-sample t-tests and analysis of variance (ANOVA) of each test parameter, we analyzed the influences of the cable-laying method and the sampling section on the degradation of cable insulation, respectively. Furthermore, the test parameters which have strong correlation in the regression analysis or significant differences in the t-test or ANOVA analysis were determined to be the ones identifying the XLPE cable insulation aging state. The thresholds for distinguishing insulation aging states were also obtained with the aid of statistical analysis and fuzzy clustering. Based on fuzzy inference, we established a cable insulation aging diagnosis model using the intensity transfer method. The results of the regression analysis indicate that the degradation of cable insulation accelerates as the degree of in-service aging increases. This validates the rule that the increase of microscopic imperfections in solid material enhances the dielectric breakdown strength. The results of the two-sample t-test and the ANOVA indicate that direct-buried cables are more sensitive to insulation degradation than duct cables. This confirms that tensile strength and breakdown strength are reliable functional parameters in cable insulation evaluations. A case study further indicates that the proposed diagnosis model based on fuzzy inference can reflect the comprehensive

  18. ASSESSMENT OF OIL PALM PLANTATION AND TROPICAL PEAT SWAMP FOREST WATER QUALITY BY MULTIVARIATE STATISTICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Seca Gandaseca

    2014-01-01

    This study reports the spatio-temporal changes in river and canal water quality at peat swamp forest and oil palm plantation sites in Sarawak, Malaysia. To investigate temporal changes, 192 water samples were collected at four stations of BatangIgan, an oil palm plantation site of Sarawak, during July-November 2009 and April-July 2010. Nine water quality parameters, including Electrical Conductivity (EC), pH, Turbidity (TER), Dissolved Oxygen (DO), Temperature (TEMP), Chemical Oxygen Demand (COD), five-day Biochemical Oxygen Demand (BOD5), ammonia-Nitrogen (NH3-N) and Total Suspended Solids (TSS), were analysed. To investigate spatial changes, 432 water samples were collected from six different sites, including BatangIgan, during June-August 2010. Six water quality parameters, including pH, DO, COD, BOD5, NH3-N and TSS, were analysed to see the spatial variations. The most significant parameters contributing to the spatio-temporal variations were assessed by statistical techniques such as Hierarchical Agglomerative Cluster Analysis (HACA), Factor Analysis/Principal Components Analysis (FA/PCA) and Discriminant Function Analysis (DFA). HACA identified three different classes of sites, Relatively Unimpaired, Impaired and Less Impaired Regions, on the basis of similarity among different physicochemical characteristics and pollutant levels between the sampling sites. DFA produced the best results for identification of the main variables for temporal analysis, separating three parameters (EC, TER, COD), and identified three parameters for spatial analysis (pH, NH3-N and BOD5). The results signify that the parameters identified by the statistical analyses were responsible for water quality change and suggest agricultural and oil palm plantation activities as a possible source of pollutants. The results suggest a dire need for proper watershed management measures to restore the water quality of this tributary for a

  19. Assessment of Problem-Based Learning in the Undergraduate Statistics Course

    Science.gov (United States)

    Karpiak, Christie P.

    2011-01-01

    Undergraduate psychology majors (N = 51) at a mid-sized private university took a statistics examination on the first day of the research methods course, a course for which a grade of "C" or higher in statistics is a prerequisite. Students who had taken a problem-based learning (PBL) section of the statistics course (n = 15) were compared to those…

  20. Validation of seismic probabilistic risk assessments of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Ellingwood, B. [Johns Hopkins Univ., Baltimore, MD (United States). Dept. of Civil Engineering

    1994-01-01

    A seismic probabilistic risk assessment (PRA) of a nuclear plant requires identification and information regarding the seismic hazard at the plant site, dominant accident sequences leading to core damage, and structure and equipment fragilities. Uncertainties are associated with each of these ingredients of a PRA. Sources of uncertainty due to seismic hazard and assumptions underlying the component fragility modeling may be significant contributors to uncertainty in estimates of core damage probability. Design and construction errors also may be important in some instances. When these uncertainties are propagated through the PRA, the frequency distribution of core damage probability may span three orders of magnitude or more. This large variability brings into question the credibility of PRA methods and the usefulness of insights to be gained from a PRA. The sensitivity of accident sequence probabilities and high-confidence, low probability of failure (HCLPF) plant fragilities to seismic hazard and fragility modeling assumptions was examined for three nuclear power plants. Mean accident sequence probabilities were found to be relatively insensitive (by a factor of two or less) to: uncertainty in the coefficient of variation (logarithmic standard deviation) describing inherent randomness in component fragility; truncation of lower tail of fragility; uncertainty in random (non-seismic) equipment failures (e.g., diesel generators); correlation between component capacities; and functional form of fragility family. On the other hand, the accident sequence probabilities, expressed in the form of a frequency distribution, are affected significantly by the seismic hazard modeling, including slopes of seismic hazard curves and likelihoods assigned to those curves.
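    The fragility side of such a PRA typically uses the lognormal capacity model; a sketch of the standard HCLPF and mean-fragility formulas with illustrative parameter values:

        # Sketch: lognormal fragility model used in seismic PRAs. HCLPF is the
        # capacity with 95% confidence of less than 5% failure probability:
        # HCLPF = Am * exp(-1.645 * (beta_R + beta_U)).
        import numpy as np
        from scipy.stats import norm

        Am, beta_R, beta_U = 0.9, 0.25, 0.35   # median capacity (g), randomness,
                                               # uncertainty (illustrative values)
        hclpf = Am * np.exp(-1.645 * (beta_R + beta_U))

        a = 0.3                                # peak ground acceleration (g)
        p_fail_mean = norm.cdf(np.log(a / Am) / np.hypot(beta_R, beta_U))
        print(f"HCLPF = {hclpf:.2f} g, mean P(fail | {a} g) = {p_fail_mean:.3f}")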

  1. Assessment of Feeder Wall Thinning of Wolsong Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Han Sub [KEPRI, Daejeon (Korea, Republic of)

    2010-05-15

    The reactor of each CANDU unit at the Wolsong nuclear power generating station is composed of 380 pressure tubes. The primary heat transport circuit of a CANDU connects each pressure tube to headers on the way to and from the steam generators. The feeder is A-106 carbon steel and suffers from wall thinning by flow-accelerated corrosion. Excessive thinning deteriorates the pressure-retaining capability of the piping, so the minimum allowable thickness of the feeder should be maintained throughout its life. Feeder wall thinning should be monitored by in-service inspection. A knowledge-based inspection strategy needs to be developed, since the combination of a high radiation field and geometric restrictions near the tight bend location makes extensive inspection very difficult. A thermohydraulic assessment using computational fluid dynamics software and feeder wall thinning simulation experiments using plaster of Paris may provide valuable information for understanding the characteristic features of feeder wall thinning. The plant in-service inspection database may be another source of valuable information. This paper summarizes a review of feeder wall thinning at the Wolsong CANDU station. The W-1 feeders suffered significant thinning, so they are being replaced along with the plant refurbishment campaign. The other units, W-2 to W-4, are still in the early portion of their operating life. A result of a feeder wall thinning simulation test using plaster of Paris is presented. The knowledge presented in this paper is important information for designing a knowledge-based in-service inspection program for feeder wall thinning.

  2. Wind power in Eritrea, Africa: A preliminary resource assessment

    Energy Technology Data Exchange (ETDEWEB)

    Garbesi, K.; Rosen, K. [San Jose State Univ., CA (United States); Van Buskirk, R. [Dept. of Energy, Eritrea (Ethiopia)

    1997-12-31

    The authors' preliminary assessment of Eritrean wind energy potential identified two promising regions: (1) the southeastern Red Sea coast and (2) the mountain passes that channel winds between the coastal lowlands and the interior highlands. The coastal site, near the port city of Aseb, has an exceptionally good resource, with estimated average annual wind speeds at 10-m height above 9 m/s at the airport and 7 m/s in the port. Furthermore, the southern 200 km of coastline has offshore average annual wind speeds above 6 m/s. This area has strong potential for development, having a local 20 MW grid and unmet demand from the fishing industry and development. Although the highland sites contain only marginal wind resources (approximately 5 m/s), they warrant further investigation because of their proximity to the capital city, Asmera, which has the largest unmet demand and a larger power grid (40 MW, with an additional 80 MW planned) to absorb an intermittent source without storage.

  3. Assessing the Kansas water-level monitoring program: An example of the application of classical statistics to a geological problem

    Science.gov (United States)

    Davis, J.C.

    2000-01-01

    Geologists may feel that geological data are not amenable to statistical analysis, or at best require specialized approaches such as nonparametric statistics and geostatistics. However, there are many circumstances, particularly in systematic studies conducted for environmental or regulatory purposes, where traditional parametric statistical procedures can be beneficial. An example is the application of analysis of variance to data collected in an annual program of measuring groundwater levels in Kansas. Influences such as well conditions, operator effects, and use of the water can be assessed and wells that yield less reliable measurements can be identified. Such statistical studies have resulted in yearly improvements in the quality and reliability of the collected hydrologic data. Similar benefits may be achieved in other geological studies by the appropriate use of classical statistical tools.
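    The kind of operator-effect screening described can be sketched as a one-way ANOVA across measurement crews (data invented):

        # Sketch: one-way ANOVA testing whether mean water-level change
        # differs among measurement crews.
        from scipy.stats import f_oneway

        crew_a = [1.2, 0.9, 1.1, 1.4, 1.0]    # ft of annual decline per well
        crew_b = [1.6, 1.8, 1.5, 1.7, 1.9]
        crew_c = [1.1, 1.3, 1.2, 1.0, 1.4]

        F, p = f_oneway(crew_a, crew_b, crew_c)
        print(f"F = {F:.2f}, p = {p:.4f}")     # small p flags an operator effect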

  4. Application of multivariate statistical technique for hydrogeochemical assessment of groundwater within the Lower Pra Basin, Ghana

    Science.gov (United States)

    Tay, C. K.; Hayford, E. K.; Hodgson, I. O. A.

    2017-02-01

    Multivariate statistical techniques and a hydrogeochemical approach were employed for groundwater assessment within the Lower Pra Basin. The main objective was to delineate the main processes responsible for the water chemistry and pollution of groundwater within the basin. Fifty-four (54) boreholes were sampled in January 2012 for quality assessment. PCA, using varimax rotation with Kaiser normalization as the method of extraction for both the rotated space and the component matrix, was applied to the data. Results show that Spearman's correlation matrix of major ions revealed expected process-based relationships derived mainly from geochemical processes, such as ion exchange and silicate/aluminosilicate weathering within the aquifer. Three main principal components influence the water chemistry and pollution of groundwater within the basin. The three principal components account for approximately 79% of the total variance in the hydrochemical data. Component 1 delineates the main natural processes (water-soil-rock interactions) through which groundwater within the basin acquires its chemical characteristics, Component 2 delineates the incongruent dissolution of silicates/aluminosilicates, while Component 3 delineates the prevalence of pollution, principally from agricultural input, as well as trace metal mobilization in groundwater within the basin. The loadings and score plots of the first two PCs show a grouping pattern which indicates the strength of the mutual relation among the hydrochemical variables. In terms of proper management and development of groundwater within the basin, communities where intense agriculture is taking place should be monitored and protected from agricultural activities, especially where inorganic fertilizers are used, by creating buffer zones. Monitoring of the water quality, especially the water pH, is recommended to ensure the acid-neutralizing potential of groundwater within the basin, thereby curtailing further trace metal

  5. First Aspect of Conventional Power System Assessment for High Wind Power Plants Penetration

    Directory of Open Access Journals (Sweden)

    A Merzic

    2012-11-01

    Full Text Available Most power systems in underdeveloped and developing countries are based on conventional power plants, mainly "slow-response" thermal power plants and a certain number of hydro power plants; characterized by inflexible generating portfolios and traditionally designed to meet own electricity needs. Taking into account the operational capabilities of conventional power systems, their development planning will face problems with the integration of notable amounts of installed capacity in wind power plants (WPP). This is what highlights the purpose of this work and, in that sense, possible variations of simulated output power from WPP in 10-minute and hourly time intervals, which need to be balanced, are investigated, presented and discussed. Comparative calculations are given for the amount of installed WPP power that can be integrated into a certain power system, according to available secondary balancing power, in the cases of concentrated and dispersed future WPP. This was demonstrated using a part of the power system of Bosnia and Herzegovina. In the considered example, by planned geographically distributed WPP construction, up to approximately 74% more installed WPP power can be integrated into the power system than in the case of geographically concentrated WPP construction, for the same available amount of (secondary) balancing power. These calculations have shown a significant benefit of planned, geographically distributed WPP construction, as an important recommendation for the development planning of conventional power systems with limited balancing options. Keywords: balancing reserves, geographical dispersion, output power variations

  6. A Meta-Meta-Analysis: Empirical Review of Statistical Power, Type I Error Rates, Effect Sizes, and Model Selection of Meta-Analyses Published in Psychology

    Science.gov (United States)

    Cafri, Guy; Kromrey, Jeffrey D.; Brannick, Michael T.

    2010-01-01

    This article uses meta-analyses published in "Psychological Bulletin" from 1995 to 2005 to describe meta-analyses in psychology, including examination of statistical power, Type I errors resulting from multiple comparisons, and model choice. Retrospective power estimates indicated that univariate categorical and continuous moderators, individual…

  7. A Fractional Lower Order Statistics-Based MIMO Detection Method in Impulse Noise for Power Line Channel

    Directory of Open Access Journals (Sweden)

    CHEN, Z.

    2014-11-01

    Full Text Available Impulse noise in the power line communication (PLC) channel seriously degrades the performance of Multiple-Input Multiple-Output (MIMO) systems. To remedy this problem, a MIMO detection method based on fractional lower order statistics (FLOS) for PLC channels with impulse noise is proposed in this paper. The alpha-stable distribution is used to model impulse noise, and FLOS is applied to construct the criteria of MIMO detection. The optimal detection solution is then obtained by a recursive least squares algorithm. Finally, the transmitted signals in the PLC MIMO system are restored with the obtained detection matrix. The proposed method does not require channel estimation and has low computational complexity. The simulation results show that the proposed method achieves better PLC MIMO detection performance than existing methods in impulsive noise environments.
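
    The motivation for FLOS can be demonstrated numerically: for alpha-stable noise with characteristic exponent alpha < 2, second-order moments diverge, while fractional moments of order p < alpha remain finite. The sketch below uses illustrative parameters and scipy's levy_stable generator; it is not the detection algorithm itself.

```python
# Sketch: fractional lower-order moment (FLOM) vs. sample variance for
# alpha-stable impulse noise. Parameters are illustrative.
import numpy as np
from scipy.stats import levy_stable

alpha = 1.5                    # characteristic exponent of the impulse noise
noise = levy_stable.rvs(alpha, beta=0.0, size=100_000, random_state=42)

p = 0.8                        # fractional order, chosen so that p < alpha
flom = np.mean(np.abs(noise) ** p)   # finite, stable across realizations
var = np.var(noise)                  # dominated by outliers, unstable
print(f"FLOM (p={p}): {flom:.3f}   sample variance: {var:.1f}")
# Re-running with different seeds shows the FLOM is reproducible while the
# sample variance fluctuates wildly -- the motivation for FLOS-based detection.
```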

  8. Assessment of Reservoir Water Quality Using Multivariate Statistical Techniques: A Case Study of Qiandao Lake, China

    Directory of Open Access Journals (Sweden)

    Qing Gu

    2016-03-01

    Full Text Available Qiandao Lake (Xin'an Jiang reservoir) plays a significant role in drinking water supply for eastern China, and it is an attractive tourist destination. Three multivariate statistical methods were comprehensively applied to assess the spatial and temporal variations in water quality as well as potential pollution sources in Qiandao Lake. Data sets of nine parameters from 12 monitoring sites during 2010–2013 were obtained for analysis. Cluster analysis (CA) was applied to classify the 12 sampling sites into three groups (Groups A, B and C) and the 12 monitoring months into two clusters (April-July, and the remaining months). Discriminant analysis (DA) identified Secchi disc depth, dissolved oxygen, permanganate index and total phosphorus as the significant variables for distinguishing variations of different years, with 79.9% correct assignments. Dissolved oxygen, pH and chlorophyll-a were determined to discriminate between the two sampling periods classified by CA, with 87.8% correct assignments. For spatial variation, DA identified Secchi disc depth and ammonia nitrogen as the significant discriminating parameters, with 81.6% correct assignments. Principal component analysis (PCA) identified organic pollution, nutrient pollution, domestic sewage, and agricultural and surface runoff as the primary pollution sources, explaining 84.58%, 81.61% and 78.68% of the total variance in Groups A, B and C, respectively. These results demonstrate the effectiveness of integrated use of CA, DA and PCA for reservoir water quality evaluation and could assist managers in improving water resources management.
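
    A compact sketch of the CA step on synthetic data shaped like the study's 12-site by 9-parameter matrix. Ward linkage and a three-group cut are assumptions mirroring the description above, not the authors' exact settings.

```python
# Sketch: Ward hierarchical clustering of monitoring sites from
# standardized water-quality parameters (synthetic data).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(7)
X = rng.normal(size=(12, 9))          # 12 sites x 9 parameters (synthetic)
Xs = (X - X.mean(0)) / X.std(0)

Z = linkage(Xs, method="ward")                     # agglomerative clustering
groups = fcluster(Z, t=3, criterion="maxclust")    # cut the tree into 3 groups
for g in sorted(set(groups)):
    print(f"Group {g}: sites {np.where(groups == g)[0] + 1}")
```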

  9. Climatic change of summer temperature and precipitation in the Alpine region - a statistical-dynamical assessment

    Energy Technology Data Exchange (ETDEWEB)

    Heimann, D.; Sept, V.

    1998-12-01

    Climatic changes in the Alpine region due to increasing greenhouse gas concentrations are assessed by using statistical-dynamical downscaling. The downscaling procedure is applied to two 30-year periods (1971-2000 and 2071-2100, summer months only) of the output of a transient coupled ocean/atmosphere climate scenario simulation. The downscaling results for the present-day climate are in sufficient agreement with observations. The estimated regional climate change during the next 100 years shows a general warming. The mean summer temperatures increase by about 3 to more than 5 Kelvin. The most intense climatic warming is predicted in the western parts of the Alps. The amount of summer precipitation decreases in most parts of central Europe by more than 20 percent. An increase in precipitation is simulated only over the Adriatic area and parts of eastern central Europe. The results are compared with observed trends and with results of regional climate change simulations by other authors. The observed trends and the majority of the simulated trends agree with our results. However, there are also climate change estimates that completely contradict ours. (orig.) 29 refs.

  10. Quantitative imaging biomarkers: a review of statistical methods for technical performance assessment.

    Science.gov (United States)

    Raunig, David L; McShane, Lisa M; Pennello, Gene; Gatsonis, Constantine; Carson, Paul L; Voyvodic, James T; Wahl, Richard L; Kurland, Brenda F; Schwarz, Adam J; Gönen, Mithat; Zahlmann, Gudrun; Kondratovich, Marina V; O'Donnell, Kevin; Petrick, Nicholas; Cole, Patricia E; Garra, Brian; Sullivan, Daniel C

    2015-02-01

    Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in designs, analysis methods, and metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult or impossible to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America and the Quantitative Imaging Biomarker Alliance, with technical, radiological, and statistical experts, developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined.
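
    One of the metrology quantities standardized in frameworks like this is repeatability. Below is a worked sketch with invented test-retest measurements, using the common definition RC = 1.96 * sqrt(2) * wSD, where wSD is the within-subject standard deviation; the "tumor diameter" framing is purely illustrative.

```python
# Sketch: repeatability coefficient (RC) from test-retest measurements.
import numpy as np

rng = np.random.default_rng(3)
true_size = rng.uniform(10, 30, size=25)       # 25 subjects (synthetic, mm)
test = true_size + rng.normal(0, 0.8, 25)      # measurement 1
retest = true_size + rng.normal(0, 0.8, 25)    # measurement 2

diff = test - retest
wsd = np.sqrt(np.mean(diff ** 2) / 2)          # within-subject SD
rc = 1.96 * np.sqrt(2) * wsd
print(f"wSD = {wsd:.2f} mm, RC = {rc:.2f} mm")
# Interpretation: changes smaller than RC are within measurement noise at the
# 95% level, so they should not be read as real biological change.
```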

  11. Assessing pneumococcal meningitis association with viral respiratory infections and antibiotics: insights from statistical and mathematical models.

    Science.gov (United States)

    Opatowski, Lulla; Varon, Emmanuelle; Dupont, Claire; Temime, Laura; van der Werf, Sylvie; Gutmann, Laurent; Boëlle, Pierre-Yves; Watier, Laurence; Guillemot, Didier

    2013-08-01

    Pneumococcus is an important human pathogen, highly antibiotic resistant and a major cause of bacterial meningitis worldwide. Better prevention requires understanding the drivers of pneumococcal infection incidence and antibiotic susceptibility. Although respiratory viruses (including influenza) have been suggested to influence pneumococcal infections, the underlying mechanisms are still unknown, and viruses are rarely considered when studying pneumococcus epidemiology. Here, we propose a novel mathematical model to examine hypothetical relationships between Streptococcus pneumoniae meningitis incidence (SPMI), acute viral respiratory infections (AVRIs) and antibiotic exposure. French time series of SPMI, AVRI and penicillin consumption over 2001-2004 are analysed and used to assess four distinct virus-bacteria interaction submodels, ascribing the interaction to pneumococcus transmissibility and/or pathogenicity. The statistical analysis reveals strong associations between time series: SPMI increases shortly after AVRI incidence and decreases overall as the antibiotic-prescription rate rises. Model simulations require a combined impact of AVRI on both pneumococcal transmissibility (up to 1.3-fold increase at the population level) and pathogenicity (up to threefold increase) to reproduce the data accurately, along with diminished epidemic fitness of resistant pneumococcal strains causing meningitis (0.97 (0.96-0.97)). Overall, our findings suggest that AVRI and antibiotics strongly influence SPMI trends. Consequently, vaccination protecting against respiratory viruses could have unexpected benefits in limiting invasive pneumococcal infections.

  12. Multivariate statistical approach for the assessment of groundwater quality in Ujjain City, India.

    Science.gov (United States)

    Vishwakarma, Vikas; Thakur, Lokendra Singh

    2012-10-01

    Groundwater quality assessment is an essential study which plays an important role in the rational development and utilization of groundwater. Groundwater quality greatly influences the health of local people, and variations in water quality are essentially the combination of both anthropogenic and natural contributions. In order to understand the underlying physical and chemical processes, this study analyzes 8 chemical and physical-chemical water quality parameters, viz. pH, turbidity, electrical conductivity, total dissolved solids, total alkalinity, total hardness, chloride and fluoride, recorded at 54 sampling stations during the summer season of 2011, using multivariate statistical techniques. Hierarchical cluster analysis (CA) is first applied to distinguish groundwater quality patterns among the stations, followed by principal component analysis (PCA) and factor analysis (FA) to extract and recognize the major underlying factors contributing to the variations among the water quality measures. The first three components were chosen for interpretation of the data, accounting for 72.502% of the total variance in the data set. The largest number of variables, i.e. turbidity, EC, TDS and chloride, were characterized by the first component, while the second and third were characterized by total alkalinity, total hardness, fluoride and pH, respectively. This shows that the hydrochemical constituents of the groundwater are mainly controlled by EC, TDS and fluoride. The findings of the cluster analysis are presented as dendrograms of both the sampling stations (cases) and the hydrochemical variables, which produced four major groupings, suggesting that groundwater monitoring can be consolidated.

  13. Blind image quality assessment: a natural scene statistics approach in the DCT domain.

    Science.gov (United States)

    Saad, Michele A; Bovik, Alan C; Charrier, Christophe

    2012-08-01

    We develop an efficient, general-purpose, blind/no-reference image quality assessment (NR-IQA) algorithm using a natural scene statistics (NSS) model of discrete cosine transform (DCT) coefficients. The algorithm is computationally appealing, given the availability of platforms optimized for DCT computation. The approach relies on a simple Bayesian inference model to predict image quality scores given certain extracted features. The features are based on an NSS model of the image DCT coefficients. The estimated parameters of the model are utilized to form features that are indicative of perceptual quality. These features are used in a simple Bayesian inference approach to predict quality scores. The resulting algorithm, which we name BLIINDS-II, requires minimal training and adopts a simple probabilistic model for score prediction. Given the extracted features from a test image, the quality score that maximizes the probability of the empirically determined inference model is chosen as the predicted quality score of that image. When tested on the LIVE IQA database, BLIINDS-II is shown to correlate highly with human judgments of quality, at a level that is competitive with the popular SSIM index.
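
    The feature-extraction idea can be sketched as follows: compute block DCT coefficients and summarize their departure from Gaussianity. BLIINDS-II fits a generalized Gaussian model to the coefficients; sample kurtosis is used below as a simpler stand-in statistic, and the "image" is synthetic noise rather than a natural photograph.

```python
# Sketch: per-block DCT statistics as NSS-style features for NR-IQA.
import numpy as np
from scipy.fft import dctn
from scipy.stats import kurtosis

rng = np.random.default_rng(5)
image = rng.normal(size=(128, 128))        # stand-in for a grayscale image

feats = []
for i in range(0, 128, 8):                 # 8x8 blocks, JPEG-style tiling
    for j in range(0, 128, 8):
        block = dctn(image[i:i+8, j:j+8], norm="ortho")
        ac = block.ravel()[1:]             # discard the DC coefficient
        feats.append(kurtosis(ac))

print(f"mean AC-coefficient kurtosis over blocks: {np.mean(feats):.2f}")
# For natural images this kind of statistic shifts systematically with
# distortion, which is what enables quality prediction without a reference.
```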

  14. Continuation Power Flow Method based Assessment of Static Voltage Stability considering the Power System Contingencies

    OpenAIRE

    Khan Aafreen; Tiwari Prasad Shankarshan

    2014-01-01

    Power system security is recognized as one of the major problems in many power systems throughout the world. Power system insecurity such as transmission lines being overloaded causes transmission elements cascade outages, which may lead to complete blackout. In accordance with these reasons, the prediction and recognition of voltage instability in power system has particular importance and it makes the network security stronger. This work, by considering the power system continge...

  15. Combining Multiple Hypothesis Testing with Machine Learning Increases the Statistical Power of Genome-wide Association Studies

    Science.gov (United States)

    Mieth, Bettina; Kloft, Marius; Rodríguez, Juan Antonio; Sonnenburg, Sören; Vobruba, Robin; Morcillo-Suárez, Carlos; Farré, Xavier; Marigorta, Urko M.; Fehr, Ernst; Dickhaus, Thorsten; Blanchard, Gilles; Schunk, Daniel; Navarro, Arcadi; Müller, Klaus-Robert

    2016-11-01

    The standard approach to the analysis of genome-wide association studies (GWAS) is based on testing each position in the genome individually for statistical significance of its association with the phenotype under investigation. To improve the analysis of GWAS, we propose a combination of machine learning and statistical testing that takes correlation structures within the set of SNPs under investigation in a mathematically well-controlled manner into account. The novel two-step algorithm, COMBI, first trains a support vector machine to determine a subset of candidate SNPs and then performs hypothesis tests for these SNPs together with an adequate threshold correction. Applying COMBI to data from a WTCCC study (2007) and measuring performance as replication by independent GWAS published within the 2008-2015 period, we show that our method outperforms ordinary raw p-value thresholding as well as other state-of-the-art methods. COMBI presents higher power and precision than the examined alternatives while yielding fewer false (i.e. non-replicated) and more true (i.e. replicated) discoveries when its results are validated on later GWAS studies. More than 80% of the discoveries made by COMBI upon WTCCC data have been validated by independent studies. Implementations of the COMBI method are available as a part of the GWASpi toolbox 2.0.
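
    A schematic of the two-step idea, not the published COMBI implementation: a linear SVM screens the SNP matrix, and only the top-weighted SNPs are tested, with the significance threshold corrected for the size of the screened subset rather than for all SNPs. Genotypes, phenotype, and all tuning values below are simulated or assumed.

```python
# Sketch: SVM screening followed by association tests on the survivors.
import numpy as np
from sklearn.svm import LinearSVC
from scipy.stats import chi2_contingency

rng = np.random.default_rng(11)
n, p, k = 400, 1000, 20
X = rng.integers(0, 3, size=(n, p)).astype(float)   # genotypes coded 0/1/2
causal = 5
y = (X[:, :causal].sum(axis=1) + rng.normal(0, 2, n) > causal).astype(int)

svm = LinearSVC(C=0.01, dual=False, max_iter=5000).fit(X, y)
top = np.argsort(np.abs(svm.coef_.ravel()))[::-1][:k]   # step 1: screening

alpha = 0.05 / k                                        # step 2: corrected threshold
for snp in top:
    table = np.array([[np.sum((X[:, snp] == g) & (y == c)) for g in (0, 1, 2)]
                      for c in (0, 1)]) + 1             # +1 avoids empty cells
    _, pval, _, _ = chi2_contingency(table)
    if pval < alpha:
        print(f"SNP {snp}: p = {pval:.2e} (significant after screening)")
```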

  16. Power-law dynamics in an auditory-nerve model can account for neural adaptation to sound-level statistics.

    Science.gov (United States)

    Zilany, Muhammad S A; Carney, Laurel H

    2010-08-04

    Neurons in the auditory system respond to recent stimulus-level history by adapting their response functions according to the statistics of the stimulus, partially alleviating the so-called "dynamic-range problem." However, the mechanism and source of this adaptation along the auditory pathway remain unknown. Inclusion of power-law dynamics in a phenomenological model of the inner hair cell (IHC)-auditory nerve (AN) synapse successfully explained neural adaptation to sound-level statistics, including the time course of adaptation of the mean firing rate and changes in the dynamic range observed in AN responses. A direct comparison between model responses to a dynamic stimulus and to an "inversely gated" static background suggested that AN dynamic-range adaptation largely results from the adaptation produced by the response history. These results support the hypothesis that the potential mechanism underlying the dynamic-range adaptation observed at the level of the auditory nerve is located peripheral to the spike generation mechanism and central to the IHC receptor potential.

  17. Assessing the utility of statistical adjustments for imperfect detection in tropical conservation science.

    Science.gov (United States)

    Banks-Leite, Cristina; Pardini, Renata; Boscolo, Danilo; Cassano, Camila Righetto; Püttker, Thomas; Barros, Camila Santos; Barlow, Jos

    2014-08-01

    1. In recent years, there has been a fast development of models that adjust for imperfect detection. These models have revolutionized the analysis of field data, and their use has repeatedly demonstrated the importance of sampling design and data quality. There are, however, several practical limitations associated with the use of detectability models which restrict their relevance to tropical conservation science. 2. We outline the main advantages of detectability models, before examining their limitations associated with their applicability to the analysis of tropical communities, rare species and large-scale data sets. Finally, we discuss whether detection probability needs to be controlled before and/or after data collection. 3. Models that adjust for imperfect detection allow ecologists to assess data quality by estimating uncertainty and to obtain adjusted ecological estimates of populations and communities. Importantly, these models have allowed informed decisions to be made about the conservation and management of target species. 4. Data requirements for obtaining unadjusted estimates are substantially lower than for detectability-adjusted estimates, which require relatively high detection/recapture probabilities and a number of repeated surveys at each location. These requirements can be difficult to meet in large-scale environmental studies where high levels of spatial replication are needed, or in the tropics where communities are composed of many naturally rare species. However, while imperfect detection can only be adjusted statistically, covariates of detection probability can also be controlled through study design. Using three study cases where we controlled for covariates of detection probability through sampling design, we show that the variation in unadjusted ecological estimates from nearly 100 species was qualitatively the same as that obtained from adjusted estimates. Finally, we discuss that the decision as to whether one should control for

  18. Direct integration of intensity-level data from Affymetrix and Illumina microarrays improves statistical power for robust reanalysis

    Directory of Open Access Journals (Sweden)

    Turnbull Arran K

    2012-08-01

    Full Text Available Abstract Background Affymetrix GeneChips and Illumina BeadArrays are the most widely used commercial single-channel gene expression microarrays. Public data repositories are an extremely valuable resource, providing array-derived gene expression measurements from many thousands of experiments. Unfortunately many of these studies are underpowered and it is desirable to improve power by combining data from more than one study; we sought to determine whether platform-specific bias precludes direct integration of probe intensity signals for combined reanalysis. Results Using Affymetrix and Illumina data from the microarray quality control project, from our own clinical samples, and from additional publicly available datasets we evaluated several approaches to directly integrate intensity-level expression data from the two platforms. After mapping probe sequences to Ensembl genes, we demonstrate that ComBat and cross-platform normalisation (XPN) significantly outperform mean-centering and distance-weighted discrimination (DWD) in terms of minimising inter-platform variance. In particular we observed that DWD, a popular method used in a number of previous studies, removed systematic bias at the expense of genuine biological variability, potentially reducing legitimate biological differences from integrated datasets. Conclusion Normalised and batch-corrected intensity-level data from Affymetrix and Illumina microarrays can be directly combined to generate biologically meaningful results with improved statistical power for robust, integrated reanalysis.

  19. A Participatory Approach to Develop the Power Mobility Screening Tool and the Power Mobility Clinical Driving Assessment Tool

    Directory of Open Access Journals (Sweden)

    Deepan C. Kamaraj

    2014-01-01

    Full Text Available The electric powered wheelchair (EPW) is an indispensable assistive device that increases participation among individuals with disabilities. However, due to the lack of standardized assessment tools, developing evidence-based training protocols for EPW users to improve driving skills has been a challenge. In this study, we adopt the principles of participatory research and employ qualitative methods to develop the Power Mobility Screening Tool (PMST) and the Power Mobility Clinical Driving Assessment (PMCDA). Qualitative data from professional experts and expert EPW users who participated in a focus group and a discussion forum were used to establish the content validity of the PMCDA and the PMST. These tools collectively could assess a user's current level of bodily function and their current EPW driving capacity. Further multicenter studies are necessary to evaluate the psychometric properties of these tests and to develop EPW driving training protocols based on these assessment tools.

  20. Comparison of Asian Aquaculture Products by Use of Statistically Supported Life Cycle Assessment.

    Science.gov (United States)

    Henriksson, Patrik J G; Rico, Andreu; Zhang, Wenbo; Ahmad-Al-Nahid, Sk; Newton, Richard; Phan, Lam T; Zhang, Zongfeng; Jaithiang, Jintana; Dao, Hai M; Phu, Tran M; Little, David C; Murray, Francis J; Satapornvanit, Kriengkrai; Liu, Liping; Liu, Qigen; Haque, M Mahfujul; Kruijssen, Froukje; de Snoo, Geert R; Heijungs, Reinout; van Bodegom, Peter M; Guinée, Jeroen B

    2015-12-15

    We investigated aquaculture production of Asian tiger shrimp, whiteleg shrimp, giant river prawn, tilapia, and pangasius catfish in Bangladesh, China, Thailand, and Vietnam by using life cycle assessments (LCAs), with the purpose of evaluating the comparative eco-efficiency of producing different aquatic food products. Our starting hypothesis was that different production systems are associated with significantly different environmental impacts, as the production of these aquatic species differs in intensity and management practices. In order to test this hypothesis, we estimated each system's global warming, eutrophication, and freshwater ecotoxicity impacts. The contribution to these impacts and the overall dispersions relative to results were propagated by Monte Carlo simulations and dependent sampling. Paired testing showed significant (p < 0.05) differences between production systems in the intraspecies comparisons, even after a Bonferroni correction. For the full distributions instead of only the median, only for Asian tiger shrimp did more than 95% of the propagated Monte Carlo results favor certain farming systems. The major environmental hot-spots driving the differences in environmental performance among systems were fishmeal from mixed fisheries for global warming, pond runoff and sediment discards for eutrophication, and agricultural pesticides, metals, benzalkonium chloride, and other chlorine-releasing compounds for freshwater ecotoxicity. The Asian aquaculture industry should therefore strive toward farming systems relying upon pelleted species-specific feeds, where the fishmeal inclusion is limited and sourced sustainably. Also, excessive nutrients should be recycled in integrated organic agriculture together with efficient aeration solutions powered by renewable energy sources.

  1. Assessment of Environmental External Effects in Power Generation

    DEFF Research Database (Denmark)

    Meyer, Henrik Jacob; Morthorst, Poul Erik; Ibsen, Liselotte Schleisner

    1996-01-01

    The report compares environmental externalities in the production of energy using renewable and non-renewable energy sources, respectively. The comparison is demonstrated on two specific case studies. The first case is the production of electricity based on wind power plants compared to the production of electricity based on a coal-fired conventional plant. In the second case, heat/power generation by means of a combined heat and power plant based on biomass-generated gas is compared to that of a combined heat and power plant fuelled by natural gas. In the report the individual externalities from...... technologies.

  2. Toward a No-Reference Image Quality Assessment Using Statistics of Perceptual Color Descriptors.

    Science.gov (United States)

    Lee, Dohyoung; Plataniotis, Konstantinos N

    2016-08-01

    Analysis of the statistical properties of natural images has played a vital role in the design of no-reference (NR) image quality assessment (IQA) techniques. In this paper, we propose parametric models describing the general characteristics of chromatic data in natural images. They provide informative cues for quantifying visual discomfort caused by the presence of chromatic image distortions. The established models capture the correlation of chromatic data between spatially adjacent pixels by means of color invariance descriptors. The use of color invariance descriptors is inspired by their relevance to visual perception, since they provide less sensitive descriptions of image scenes against viewing geometry and illumination variations than luminances. In order to approximate the visual quality perception of chromatic distortions, we devise four parametric models derived from invariance descriptors representing independent aspects of color perception: 1) hue; 2) saturation; 3) opponent angle; and 4) spherical angle. The practical utility of the proposed models is examined by deploying them in our new general-purpose NR IQA metric. The metric initially estimates the parameters of the proposed chromatic models from an input image to constitute a collection of quality-aware features (QAFs). Thereafter, a machine learning technique is applied to predict visual quality given a set of extracted QAFs. Experimentation performed on large-scale image databases demonstrates that the proposed metric correlates well with the provided subjective ratings of image quality over commonly encountered achromatic and chromatic distortions, indicating that it can be deployed on a wide variety of color image processing problems as a generalized IQA solution.

  3. Inferential, non-parametric statistics to assess the quality of probabilistic forecast systems

    NARCIS (Netherlands)

    Maia, A.H.N.; Meinke, H.B.; Lennox, S.; Stone, R.C.

    2007-01-01

    Many statistical forecast systems are available to interested users. To be useful for decision making, these systems must be based on evidence of underlying mechanisms. Once causal connections between the mechanism and its statistical manifestation have been firmly established, the forecasts must al

  4. Assessing the Lexico-Grammatical Characteristics of a Corpus of College-Level Statistics Textbooks: Implications for Instruction and Practice

    Science.gov (United States)

    Wagler, Amy E.; Lesser, Lawrence M.; González, Ariel I.; Leal, Luis

    2015-01-01

    A corpus of current editions of statistics textbooks was assessed to compare aspects and levels of readability for the topics of "measures of center," "line of fit," "regression analysis," and "regression inference." Analysis with lexical software of these text selections revealed that the large corpus can…

  5. Assessing Colour-dependent Occupation Statistics Inferred from Galaxy Group Catalogues

    CERN Document Server

    Campbell, Duncan; Hearin, Andrew; Padmanabhan, Nikhil; Berlind, Andreas; Mo, H J; Tinker, Jeremy; Yang, Xiaohu

    2015-01-01

    We investigate the ability of current implementations of galaxy group finders to recover colour-dependent halo occupation statistics. To test the fidelity of group catalogue inferred statistics, we run three different group finders used in the literature over a mock that includes galaxy colours in a realistic manner. Overall, the resulting mock group catalogues are remarkably similar, and most colour-dependent statistics are recovered with reasonable accuracy. However, it is also clear that certain systematic errors arise as a consequence of correlated errors in group membership determination, central/satellite designation, and halo mass assignment. We introduce a new statistic, the halo transition probability (HTP), which captures the combined impact of all these errors. As a rule of thumb, errors tend to equalize the properties of distinct galaxy populations (i.e. red vs. blue galaxies or centrals vs. satellites), and to result in inferred occupation statistics that are more accurate for red galaxies than f...

  6. Power quality assessment using an adaptive neural network

    Energy Technology Data Exchange (ETDEWEB)

    Dash, P.K.; Swain, D.P.; Mishra, B.R. [Regional Engineering Coll., Rourkela (India). Dept. of Electrical Engineering; Rahman, S. [Virginia Polytechnic Inst. and State Univ., Blacksburg, VA (United States)

    1995-12-31

    The paper presents an adaptive neural network approach for the estimation of harmonic components of a power system and the power quality. The neural estimator is based on the use of an adaptive perceptron consisting of a linear adaptive neuron called Adaline. The learning parameters in the proposed algorithm are adjusted to force the error between the actual and desired outputs to satisfy a stable difference error equation. The estimator tracks the Fourier coefficients of the signal data corrupted with noise and decaying dc components very accurately. Adaptive tracking of harmonic components of a power system can easily be done using this algorithm. Several numerical tests have been conducted for the adaptive estimation of harmonic components, total harmonic distortion and power quality of power system signals mixed with noise and decaying dc components.
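
    The Adaline estimator can be sketched in a few lines: regress the sampled waveform on sine/cosine basis functions with a Widrow-Hoff (LMS) update, so that the learned weights track the Fourier coefficients of each harmonic. The signal content, harmonic set, and step size below are illustrative, not taken from the paper.

```python
# Sketch: Adaline/LMS tracking of harmonic amplitudes in a noisy waveform
# that also contains a decaying dc component.
import numpy as np

f0, fs = 50.0, 3200.0                    # fundamental and sampling frequency
t = np.arange(0, 0.2, 1 / fs)
signal = (1.0 * np.sin(2 * np.pi * f0 * t + 0.3)
          + 0.2 * np.sin(2 * np.pi * 3 * f0 * t)        # 3rd harmonic
          + 0.5 * np.exp(-t / 0.05)                     # decaying dc offset
          + 0.05 * np.random.default_rng(2).normal(size=t.size))

harmonics = [1, 3, 5]
mu = 0.05                                # LMS step size
w = np.zeros(2 * len(harmonics) + 1)     # sin/cos pair per harmonic + dc term
for k, x_k in enumerate(signal):
    phi = np.concatenate(
        [[np.sin(2 * np.pi * h * f0 * t[k]), np.cos(2 * np.pi * h * f0 * t[k])]
         for h in harmonics] + [[1.0]])
    err = x_k - w @ phi                  # prediction error
    w += 2 * mu * err * phi              # Widrow-Hoff (Adaline) update

for i, h in enumerate(harmonics):
    amp = np.hypot(w[2 * i], w[2 * i + 1])
    print(f"harmonic {h}: estimated amplitude {amp:.3f}")
```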

  7. Development on Vulnerability Assessment Methods of PPS of Nuclear Power Plants

    Institute of Scientific and Technical Information of China (English)

    MIAO; Qiang; ZHANG; Wen-liang; ZONG; Bo; BU; Li-xin; YIN; Hong-he; FANG; Xin

    2012-01-01

    We present a set of vulnerability assessment methods for the physical protection system (PPS) of nuclear power plants, developed after investigating and collecting assessment experience in China. The methods are important for strengthening and upgrading the security of nuclear power plants, and also to

  8. 75 FR 30013 - South Feather Water and Power Agency; Notice of Availability of Environmental Assessment

    Science.gov (United States)

    2010-05-28

    ... Energy Regulatory Commission South Feather Water and Power Agency; Notice of Availability of...), Commission staff has prepared an environmental assessment (EA) regarding South Feather Water and Power Agency... Creek development of the South Feather Power Project (FERC No. 2088). Sly Creek is located on Sly...

  9. Condition assessment of power cables using partial discharge diagnosis at damped AC voltages

    NARCIS (Netherlands)

    Wester, F.J.

    2004-01-01

    The thesis focuses on the condition assessment of distribution power cables, which play a very critical part in the distribution of electrical power over regional distances. The majority of the outages in the power system are related to the distribution cables, of which for more than 60% to inter

  10. Assessing colour-dependent occupation statistics inferred from galaxy group catalogues

    Science.gov (United States)

    Campbell, Duncan; van den Bosch, Frank C.; Hearin, Andrew; Padmanabhan, Nikhil; Berlind, Andreas; Mo, H. J.; Tinker, Jeremy; Yang, Xiaohu

    2015-09-01

    We investigate the ability of current implementations of galaxy group finders to recover colour-dependent halo occupation statistics. To test the fidelity of group catalogue inferred statistics, we run three different group finders used in the literature over a mock that includes galaxy colours in a realistic manner. Overall, the resulting mock group catalogues are remarkably similar, and most colour-dependent statistics are recovered with reasonable accuracy. However, it is also clear that certain systematic errors arise as a consequence of correlated errors in group membership determination, central/satellite designation, and halo mass assignment. We introduce a new statistic, the halo transition probability (HTP), which captures the combined impact of all these errors. As a rule of thumb, errors tend to equalize the properties of distinct galaxy populations (i.e. red versus blue galaxies or centrals versus satellites), and to result in inferred occupation statistics that are more accurate for red galaxies than for blue galaxies. A statistic that is particularly poorly recovered from the group catalogues is the red fraction of central galaxies as a function of halo mass. Group finders do a good job in recovering galactic conformity, but also have a tendency to introduce weak conformity when none is present. We conclude that proper inference of colour-dependent statistics from group catalogues is best achieved using forward modelling (i.e. running group finders over mock data) or by implementing a correction scheme based on the HTP, as long as the latter is not too strongly model dependent.

  11. Environmental Impact Assessment for Olkiluoto 4 Nuclear Power Plant Unit in Finland

    Energy Technology Data Exchange (ETDEWEB)

    Dersten, Riitta; Gahmberg, Sini; Takala, Jenni [Teollisuuden Voima Oyj, Olkiluoto, FI-27160 Eurajoki (Finland)

    2008-07-01

    In order to improve its readiness for constructing additional production capacity, Teollisuuden Voima Oyj (TVO) initiated in spring 2007 the environmental impact assessment procedure (EIA procedure) concerning a new nuclear power plant unit that would possibly be located at Olkiluoto. When assessing the environmental impacts of the Olkiluoto nuclear power plant extension project, the present state of the environment was first examined, and after that, the changes caused by the projects as well as their significance were assessed, taking into account the combined impacts of the operations at Olkiluoto. The environmental impact assessment for the planned nuclear power plant unit covers the entire life cycle of the plant unit. (authors)

  12. Expert judgment-based risk assessment using statistical scenario analysis: a case study-running the bulls in Pamplona (Spain).

    Science.gov (United States)

    Mallor, Fermín; García-Olaverri, Carmen; Gómez-Elvira, Sagrario; Mateo-Collazas, Pedro

    2008-08-01

    In this article, we present a methodology to assess the risk incurred by a participant in an activity involving danger of injury. The lack of high-quality historical data for the case considered prevented us from constructing a sufficiently detailed statistical model. It was therefore decided to generate a risk assessment model based on expert judgment. The methodology is illustrated in a real case context: the assessment of risk to participants in a San Fermin bull-run in Pamplona (Spain). The members of the panel of "experts on the bull-run" represented very different perspectives on the phenomenon: runners, surgeons and other health care personnel, journalists, civil defense workers, security staff, organizers, herdsmen, authors of books on the bull-run, etc. We consulted 55 experts. Our methodology includes the design of a survey instrument to elicit the experts' views and the statistical and mathematical procedures used to aggregate their subjective opinions.

  13. THE APPLICATION OF STATISTICAL PARAMETERS OF PHASE RESOLVED PD DISTRIBUTION TO AGING EXTENT ASSESSMENT OF LARGE GENERATOR INSULATION

    Institute of Scientific and Technical Information of China (English)

    谢恒堃; 乐波; 孙翔; 宋建成

    2003-01-01

    Objective To investigate the characteristic parameters employed to describe the aging extent of the stator insulation of large generators and to study the aging laws. Methods Multi-stress aging tests of model generator stator bar specimens were performed, and PD measurements were conducted using a digital PD detector with a frequency range from 40 kHz to 400 kHz at different aging stages. Results From the test results of the model specimens it was found that the skewness of the phase-resolved PD distribution may be taken as a characterization parameter for aging extent assessment of generator insulation. Furthermore, the measurement results of actual generator stator bars showed that the method based on statistical parameters of PD distributions is promising for aging extent assessment and residual lifetime estimation of large generator insulation. Conclusion Statistical parameters of the phase-resolved PD distribution were proposed for aging extent assessment of large generator insulation.
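
    As a minimal illustration of the proposed parameter, the sketch below computes the skewness of a phase-resolved PD distribution, here the mean discharge magnitude per 5-degree phase window, from synthetic pulse data; real measurements would come from the PD detector described above.

```python
# Sketch: skewness of a phase-resolved PD distribution (synthetic pulses).
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(9)
phase = rng.uniform(0, 360, 5000)              # phase angle of each PD pulse
magnitude = np.abs(rng.normal(50, 15, 5000))   # apparent charge (pC)

bins = np.linspace(0, 360, 73)                 # 72 five-degree phase windows
idx = np.digitize(phase, bins)
mean_q = np.array([magnitude[idx == b].mean()
                   for b in range(1, 73) if np.any(idx == b)])

print(f"skewness of the mean-discharge phase profile: {skew(mean_q):.3f}")
# Tracking how this skewness drifts across aging stages is the basis of the
# proposed aging-extent assessment.
```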

  14. Energy Storage for Power Systems Applications: A Regional Assessment for the Northwest Power Pool (NWPP)

    Energy Technology Data Exchange (ETDEWEB)

    Kintner-Meyer, Michael CW; Balducci, Patrick J.; Jin, Chunlian; Nguyen, Tony B.; Elizondo, Marcelo A.; Viswanathan, Vilayanur V.; Guo, Xinxin; Tuffner, Francis K.

    2010-04-01

    Wind production, which has expanded rapidly in recent years, could be an important element in the future efficient management of the electric power system; however, wind energy generation is uncontrollable and intermittent in nature. Thus, while wind power represents a significant opportunity to the Bonneville Power Administration (BPA), integrating high levels of wind resources into the power system will bring great challenges to generation scheduling and in the provision of ancillary services. This report addresses several key questions in the broader discussion on the integration of renewable energy resources in the Pacific Northwest power grid. More specifically, it addresses the following questions: a) how much total reserve or balancing requirements are necessary to accommodate the simulated expansion of intermittent renewable energy resources during the 2019 time horizon, and b) what are the most cost effective technological solutions for meeting load balancing requirements in the Northwest Power Pool (NWPP).

  15. Continuation Power Flow Method based Assessment of Static Voltage Stability considering the Power System Contingencies

    Directory of Open Access Journals (Sweden)

    Khan Aafreen

    2014-11-01

    Full Text Available Power system security is recognized as one of the major problems in many power systems throughout the world. Power system insecurity, such as transmission lines being overloaded, causes cascade outages of transmission elements, which may lead to complete blackout. For these reasons, the prediction and recognition of voltage instability in a power system is particularly important and makes the network security stronger. This work analyses voltage stability using the continuation power flow method, considering power system contingencies according to their effects on the megawatt margin (MWM) and the maximum loading point (MLP). The study has been carried out on the IEEE 30-Bus Test System using MATLAB and PSAT software, and the results are presented.
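
    The MLP that a continuation power flow traces can be illustrated on a two-bus system, where the nose of the P-V curve has a closed form: for a lossless line of reactance X feeding a load P + jQ from a 1.0 p.u. source, the receiving-end voltage satisfies V^4 + (2QX - 1)V^2 + X^2(P^2 + Q^2) = 0. The line reactance and power factor below are illustrative, not the IEEE 30-bus case.

```python
# Sketch: tracing the P-V "nose" curve of a two-bus system; the loading at
# which the discriminant turns negative is the maximum loading point (MLP).
import numpy as np

X = 0.2                                   # line reactance (p.u.)
tan_phi = 0.2                             # load power factor angle
for P in np.arange(0.0, 2.6, 0.25):       # loading parameter (p.u.)
    Q = P * tan_phi
    b = 2 * Q * X - 1.0
    c = X ** 2 * (P ** 2 + Q ** 2)
    disc = b ** 2 - 4 * c                 # discriminant of the quadratic in V^2
    if disc < 0:
        print(f"P = {P:.2f}: no solution -> past the nose (MLP exceeded)")
        break
    v_high = np.sqrt((-b + np.sqrt(disc)) / 2)   # stable upper branch
    v_low = np.sqrt((-b - np.sqrt(disc)) / 2)    # unstable lower branch
    print(f"P = {P:.2f}: V = {v_high:.3f} / {v_low:.3f} p.u.")
```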

  16. Safety Assessment of PowerBeam Flywheel Technology

    Energy Technology Data Exchange (ETDEWEB)

    Starbuck, J Michael [ORNL; Hansen, James Gerald [ORNL

    2009-11-01

    The greatest technical challenge facing the developer of vehicular flywheel systems is the issue of safety. The PowerBeam flywheel system concept, developed by HyKinesys Inc., employs a pair of high aspect ratio, counter-rotating flywheels to provide surge power for hybrid vehicle applications. The PowerBeam approach to safety is to design flywheels conservatively so as to avoid full rotor burst failure modes. A conservative point design was sized for use in a mid-size sedan such as a Chevrolet Malibu. The PowerBeam rotor rims were designed with a steel tube covered by a carbon fiber reinforced composite tube. ORNL conducted rotor design analyses using both nested ring and finite element analysis design codes. The safety factor of the composite material was 7, while that of the steel was greater than 3. The design exceeded the PNGV recommendation for a safety factor of at least 4 for composite material to prevent flywheel burst.

  17. Space Station Freedom power - A reliability, availability, and maintainability assessment of the proposed Space Station Freedom electric power system

    Science.gov (United States)

    Turnquist, S. R.; Twombly, M.; Hoffman, D.

    1989-01-01

    A preliminary reliability, availability, and maintainability (RAM) analysis of the proposed Space Station Freedom electric power system (EPS) was performed using the unit reliability, availability, and maintainability (UNIRAM) analysis methodology. Orbital replacement units (ORUs) having the most significant impact on EPS availability measures were identified. Also, the sensitivity of the EPS to variations in ORU RAM data was evaluated for each ORU. Estimates were made of average EPS power output levels and availability of power to the core area of the space station. The results of assessments of the availability of EPS power and power to load distribution points in the space station are given. Some highlights of continuing studies being performed to understand EPS availability considerations are presented.

  18. Environmental Assessment: Gulf Power Company Military Point Transmission Line Project

    Science.gov (United States)

    2014-05-12

    medium-sized mammals, a variety of reptilian and amphibian species, and a large diversity of water and terrestrial bird species (Tyndall AFB 1999)...John Dingwall, Project Manager, 325th Civil Engineer Squadron, 119 Alabama Avenue...People Serving People...Re: Gulf Power Company Military Point...Force 325th Civil Engineer Squadron, 119 Alabama Avenue, Tyndall AFB, FL 32403-5014. Dear Mr. Dingwall: I have reviewed the proposal for Gulf Power

  19. Assessment of flywheel energy storage for spacecraft power systems

    Science.gov (United States)

    Rodriguez, G. E.; Studer, P. A.; Baer, D. A.

    1983-01-01

    The feasibility of inertial energy storage in a spacecraft power system is evaluated on the basis of a conceptual integrated design that encompasses a composite rotor, magnetic suspension, and a permanent magnet (PM) motor/generator for a 3-kW orbital average payload at a bus distribution voltage of 250 volts dc. The conceptual design, which evolved at the Goddard Space Flight Center (GSFC), is referred to as a Mechanical Capacitor. The baseline power system configuration selected is a series system employing peak-power-tracking for a Low Earth-Orbiting application. Power processing, required in the motor/generator, provides potential alternative configurations that, in systems with electrochemical energy storage, could only be achieved by the addition of power processing components. One such alternative configuration provides for peak-power-tracking of the solar array while still maintaining a regulated bus, without the expense of additional power processing components. Precise speed control of the two counter-rotating wheels is required to reduce interaction with the attitude control system (ACS); alternatively, the wheels can be used to perform attitude control functions. The critical technologies identified pertain to the energy storage element and are prioritized as composite wheel development, magnetic suspension, motor/generator, containment, and momentum control. Comparison with a 3-kW, 250-Vdc power system using either NiCd or NiH2 for energy storage results in a system in which inertial energy storage offers potential advantages in lifetime, operating temperature, voltage regulation, energy density, charge control, and overall system weight reduction.

  20. Statistical approach for assessing the influence of synoptic and meteorological conditions on ozone concentrations over Europe

    Science.gov (United States)

    Otero, Noelia; Butler, Tim; Sillmann, Jana

    2015-04-01

    Air pollution has become a serious problem in many industrialized and densely populated urban areas due to its negative effects on human health, agricultural crops and ecosystems. The concentration of air pollutants is the result of several factors, including emission sources, lifetime and spatial distribution of the pollutants, atmospheric properties and interactions, wind speed and direction, and topographic features. Episodes of air pollution are often associated with stationary or slowly migrating anticyclonic (high-pressure) systems that reduce advection, diffusion, and deposition of atmospheric pollutants. Certain weather conditions facilitate the concentration of pollutants, such as the incidence of light winds, which contributes to an increase in stagnation episodes affecting air quality. Therefore, the atmospheric circulation plays an important role in air quality conditions, which are affected by both synoptic- and local-scale processes. This study assesses the influence of the large-scale circulation along with meteorological conditions on tropospheric ozone in Europe. The frequency of weather types (WTs) is examined under a novel approach, which is based on an automated version of the Lamb Weather Types catalog (Jenkinson and Collison, 1977). Here, we present an implementation of such a classification point-by-point over the European domain. Moreover, the analysis uses a new grid-averaged climatology (1°x1°) of daily surface ozone concentrations from observations of individual sites that matches the resolution of global models (Schnell et al., 2014). Daily frequency of WTs and meteorological conditions are combined in a multiple regression approach for investigating the influence on ozone concentrations. Different subsets of predictors are examined within multiple linear regression models (MLRs) for each grid cell in order to identify the best regression model. Several statistical metrics are applied for estimating the robustness of the
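
    A schematic of the per-grid-cell regression step, with synthetic stand-ins for the predictors (daily temperature, a weather-type indicator, wind speed) and for ozone; the coefficients and statsmodels OLS machinery are assumptions for the sketch, not the study's fitted model.

```python
# Sketch: multiple linear regression of daily ozone on a weather-type
# indicator plus meteorological covariates (all data synthetic).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n_days = 365
temp = rng.normal(20, 5, n_days)              # daily maximum temperature
anticyclonic = rng.binomial(1, 0.3, n_days)   # WT indicator for the day
wind = rng.gamma(2.0, 2.0, n_days)            # wind speed
ozone = (30 + 1.5 * temp + 8 * anticyclonic - 1.0 * wind
         + rng.normal(0, 5, n_days))

X = sm.add_constant(np.column_stack([temp, anticyclonic, wind]))
model = sm.OLS(ozone, X).fit()
print(model.params)                           # intercept and coefficients
print(f"adjusted R^2: {model.rsquared_adj:.3f}")
# Comparing adjusted R^2 across predictor subsets is one simple way to pick
# the best regression model per grid cell.
```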

  1. OPR-PPR, a Computer Program for Assessing Data Importance to Model Predictions Using Linear Statistics

    Science.gov (United States)

    Tonkin, Matthew J.; Tiedeman, Claire R.; Ely, D. Matthew; Hill, Mary C.

    2007-01-01

    The OPR-PPR program calculates the Observation-Prediction (OPR) and Parameter-Prediction (PPR) statistics that can be used to evaluate the relative importance of various kinds of data to simulated predictions. The data considered fall into three categories: (1) existing observations, (2) potential observations, and (3) potential information about parameters. The first two are addressed by the OPR statistic; the third is addressed by the PPR statistic. The statistics are based on linear theory and measure the leverage of the data, which depends on the location, the type, and possibly the time of the data being considered. For example, in a ground-water system the type of data might be a head measurement at a particular location and time. As a measure of leverage, the statistics do not take into account the value of the measurement. As linear measures, the OPR and PPR statistics require minimal computational effort once sensitivities have been calculated. Sensitivities need to be calculated for only one set of parameter values; commonly these are the values estimated through model calibration. OPR-PPR can calculate the OPR and PPR statistics for any mathematical model that produces the necessary OPR-PPR input files. In this report, OPR-PPR capabilities are presented in the context of using the ground-water model MODFLOW-2000 and the universal inverse program UCODE_2005. The method used to calculate the OPR and PPR statistics is based on the linear equation for prediction standard deviation. Using sensitivities and other information, OPR-PPR calculates (a) the percent increase in the prediction standard deviation that results when one or more existing observations are omitted from the calibration data set; (b) the percent decrease in the prediction standard deviation that results when one or more potential observations are added to the calibration data set; or (c) the percent decrease in the prediction standard deviation that results when potential information on one
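
    A toy analogue of the OPR statistic under the same linear theory, simplified to an unweighted linear model rather than a MODFLOW-2000 calibration: the percent increase in the prediction standard deviation when one calibration observation is omitted. The design matrix and prediction sensitivities below are invented.

```python
# Sketch: OPR-style leverage measure for a linear model. The prediction SD
# is computed up to the common error-SD factor s, which cancels in percents.
import numpy as np

rng = np.random.default_rng(8)
X = rng.normal(size=(30, 3))               # 30 observations x 3 parameters
x_pred = np.array([0.5, -1.0, 2.0])        # sensitivity of the prediction

def pred_sd(design):
    cov = np.linalg.inv(design.T @ design)     # parameter covariance (up to s^2)
    return np.sqrt(x_pred @ cov @ x_pred)

base = pred_sd(X)
opr = [100 * (pred_sd(np.delete(X, i, axis=0)) - base) / base
       for i in range(X.shape[0])]

worst = int(np.argmax(opr))
print(f"most important observation: #{worst} (+{opr[worst]:.1f}% prediction SD)")
```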

  2. Considerations on Cyber Security Assessments of Korean Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung-Woon; Song, Jae-Gu; Han, Kyung-Soo; Lee, Cheol Kwon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Kang, Mingyun [E-Gonggam Co. Ltd., Daejeon (Korea, Republic of)

    2015-10-15

    Korea Institute of Nuclear Nonproliferation and Control (KINAC) has prepared the regulatory standard RS-015 based on RG 5.71. RS-015 defines the elements of a cyber security program to be established in nuclear facilities and describes the security control items and relevant requirements. Cyber security assessments are important initial activities in a cyber security program for NPPs. Cyber security assessments can be performed in the following key steps: 1) Formation of a cyber security assessment team (CSAT); 2) Identification of critical systems and critical digital assets (CDAs); 3) Plant compliance checks with the security control requirements in RS-015. Through the assessments, the current status of security controls applied to NPPs can be determined. The assessments provide baseline data for remedial activities. Additional analyses with the results from the assessments should be performed before the implementation of remedial security controls. The cyber security team at the Korea Atomic Energy Research Institute (KAERI) has studied how to perform cyber security assessments for NPPs based on the regulatory requirements. Recently, KAERI's cyber security team has performed pilot cyber security assessments of a Korean NPP. Based on this assessment experience, considerations and checkpoints which would be helpful for full-scale cyber security assessments of Korean NPPs and the implementation of remedial security controls are discussed in this paper. Cyber security assessment is one of the important and immediate activities for NPP cyber security. The quality of the first assessment will be a barometer for NPP cyber security. Hence cyber security assessments of Korean NPPs should be performed thoroughly.

  3. Assessing segmentation processes by click detection: online measure of statistical learning, or simple interference?

    Science.gov (United States)

    Franco, Ana; Gaillard, Vinciane; Cleeremans, Axel; Destrebecqz, Arnaud

    2015-12-01

    Statistical learning can be used to extract the words from continuous speech. Gómez, Bion, and Mehler (Language and Cognitive Processes, 26, 212-223, 2011) proposed an online measure of statistical learning: They superimposed auditory clicks on a continuous artificial speech stream made up of a random succession of trisyllabic nonwords. Participants were instructed to detect these clicks, which could be located either within or between words. The results showed that, over the length of exposure, reaction times (RTs) increased more for within-word than for between-word clicks. This result has been accounted for by means of statistical learning of the between-word boundaries. However, even though statistical learning occurs without an intention to learn, it nevertheless requires attentional resources. Therefore, this process could be affected by a concurrent task such as click detection. In the present study, we evaluated the extent to which the click detection task indeed reflects successful statistical learning. Our results suggest that the emergence of RT differences between within- and between-word click detection is neither systematic nor related to the successful segmentation of the artificial language. Therefore, instead of being an online measure of learning, the click detection task seems to interfere with the extraction of statistical regularities.

  4. Improving effect size estimation and statistical power with multi-echo fMRI and its impact on understanding the neural systems supporting mentalizing.

    Science.gov (United States)

    Lombardo, Michael V; Auyeung, Bonnie; Holt, Rosemary J; Waldman, Jack; Ruigrok, Amber N V; Mooney, Natasha; Bullmore, Edward T; Baron-Cohen, Simon; Kundu, Prantik

    2016-11-15

    Functional magnetic resonance imaging (fMRI) research is routinely criticized for being statistically underpowered due to characteristically small sample sizes and much larger sample sizes are being increasingly recommended. Additionally, various sources of artifact inherent in fMRI data can have detrimental impact on effect size estimates and statistical power. Here we show how specific removal of non-BOLD artifacts can improve effect size estimation and statistical power in task-fMRI contexts, with particular application to the social-cognitive domain of mentalizing/theory of mind. Non-BOLD variability identification and removal is achieved in a biophysical and statistically principled manner by combining multi-echo fMRI acquisition and independent components analysis (ME-ICA). Without smoothing, group-level effect size estimates on two different mentalizing tasks were enhanced by ME-ICA at a median rate of 24% in regions canonically associated with mentalizing, while much more substantial boosts (40-149%) were observed in non-canonical cerebellar areas. Effect size boosting occurs via reduction of non-BOLD noise at the subject-level and consequent reductions in between-subject variance at the group-level. Smoothing can attenuate ME-ICA-related effect size improvements in certain circumstances. Power simulations demonstrate that ME-ICA-related effect size enhancements enable much higher-powered studies at traditional sample sizes. Cerebellar effects observed after applying ME-ICA may be unobservable with conventional imaging at traditional sample sizes. Thus, ME-ICA allows for principled design-agnostic non-BOLD artifact removal that can substantially improve effect size estimates and statistical power in task-fMRI contexts. ME-ICA could mitigate some issues regarding statistical power in fMRI studies and enable novel discovery of aspects of brain organization that are currently under-appreciated and not well understood.
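
    The practical consequence of an effect-size boost is easy to quantify with a standard power calculation. The sketch below uses the 24% median boost reported above together with otherwise conventional assumptions (a baseline Cohen's d of 0.5, a paired/one-sample t-test, alpha = 0.05, 80% power), which are illustrative rather than taken from the paper.

```python
# Sketch: required sample size before and after a 24% effect-size boost.
from statsmodels.stats.power import TTestPower

analysis = TTestPower()
d_before, d_after = 0.5, 0.5 * 1.24     # 24% median boost reported above

for label, d in [("before", d_before), ("after", d_after)]:
    n = analysis.solve_power(effect_size=d, alpha=0.05, power=0.8,
                             alternative="two-sided")
    print(f"{label} ME-ICA: d = {d:.2f} -> n = {n:.0f} subjects")
```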

  5. Undersampling power-law size distributions: effect on the assessment of extreme natural hazards

    Science.gov (United States)

    Geist, Eric L.; Parsons, Thomas E.

    2014-01-01

    The effect of undersampling on estimating the size of extreme natural hazards from historical data is examined. Tests using synthetic catalogs indicate that the tail of an empirical size distribution sampled from a pure Pareto probability distribution can range from having one-to-several unusually large events to appearing depleted, relative to the parent distribution. Both of these effects are artifacts caused by limited catalog length. It is more difficult to diagnose the artificially depleted empirical distributions, since one expects that a pure Pareto distribution is physically limited in some way. Using maximum likelihood methods and the method of moments, we estimate the power-law exponent and the corner size parameter of tapered Pareto distributions for several natural hazard examples: tsunamis, floods, and earthquakes. Each of these examples has varying catalog lengths and measurement thresholds, relative to the largest event sizes. In many cases where there are only several orders of magnitude between the measurement threshold and the largest events, joint two-parameter estimation techniques are necessary to account for estimation dependence between the power-law scaling exponent and the corner size parameter. Results indicate that whereas the corner size parameter of a tapered Pareto distribution can be estimated, its upper confidence bound cannot be determined and the estimate itself is often unstable with time. Correspondingly, one cannot statistically reject a pure Pareto null hypothesis using natural hazard catalog data. Although physical limits to the hazard source size and by attenuation mechanisms from source to site constrain the maximum hazard size, historical data alone often cannot reliably determine the corner size parameter. Probabilistic assessments incorporating theoretical constraints on source size and propagation effects are preferred over deterministic assessments of extreme natural hazards based on historic data.
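
    The undersampling effect is easy to reproduce by simulation: draw synthetic catalogs of increasing length from a pure Pareto law and watch both the maximum-likelihood exponent and the largest observed event stabilize. The exponent, threshold, and catalog lengths below are illustrative.

```python
# Sketch: catalog-length dependence of Pareto tail estimates.
import numpy as np

rng = np.random.default_rng(6)
beta, x_min = 1.0, 1.0                    # true exponent and threshold

for n in (30, 300, 30000):                # catalog lengths
    largest, beta_hat = [], []
    for _ in range(200):                  # repeat synthetic catalogs
        x = x_min * (1 + rng.pareto(beta, size=n))       # classical Pareto
        beta_hat.append(n / np.sum(np.log(x / x_min)))   # Hill/ML estimator
        largest.append(x.max())
    print(f"n={n:>6}: beta_hat = {np.mean(beta_hat):.2f} +/- "
          f"{np.std(beta_hat):.2f}, median largest event = "
          f"{np.median(largest):.1f}")
# Short catalogs show both inflated scatter in the exponent and wildly
# varying "largest events" -- the artifacts discussed in the record above.
```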

  6. Statistical analysis about corrosion in nuclear power plants; Analisis estadistico de la corrosion en centrales nucleares de potencia

    Energy Technology Data Exchange (ETDEWEB)

    Naquid G, C.; Medina F, A.; Zamora R, L. [Instituto Nacional de Investigaciones Nucleares, Gerencia de Ciencia de Materiales, A.P. 18-1027, 11801 Mexico D.F. (Mexico)

    2000-07-01

    Investigations have been carried out into the degradation mechanisms of structures, systems and components in nuclear power plants, since many of the processes involved govern their reliability, the integrity of their components, and safety. This work presents statistics of studies on materials corrosion, in its wide variety of specific mechanisms, reported worldwide for PWR, BWR and WWER reactors, analysing the AIRS (Advanced Incident Reporting System) records for the period 1993-1998 for the first two reactor types and for the period 1982-1995 for the WWER. The identification of factors allows cases to be characterized as applicable, i.e. those that occurred through some corrosion mechanism, or as not applicable, i.e. incidents due to natural factors, mechanical failures or human errors. The total number of cases analysed corresponds to the sum of the applicable and not applicable cases. (Author)

  7. Combining Statistical Tools and Ecological Assessments in the Study of Biodeterioration Patterns of Stone Temples in Angkor (Cambodia)

    Science.gov (United States)

    Caneva, G.; Bartoli, F.; Savo, V.; Futagami, Y.; Strona, G.

    2016-09-01

    Biodeterioration is a major problem for the conservation of cultural heritage materials. We provide a new and original approach to analyzing changes in patterns of colonization (Biodeterioration patterns, BPs) by biological agents responsible for the deterioration of outdoor stone materials. Here we analyzed BPs of four Khmer temples in Angkor (Cambodia) exposed to variable environmental conditions, using qualitative ecological assessments and statistical approaches. The statistical analyses supported the findings obtained with the qualitative approach. Both approaches provided additional information not otherwise available using one single method. Our results indicate that studies on biodeterioration can benefit from integrating diverse methods so that conservation efforts might become more precise and effective.

  8. Combining Statistical Tools and Ecological Assessments in the Study of Biodeterioration Patterns of Stone Temples in Angkor (Cambodia)

    Science.gov (United States)

    Caneva, G.; Bartoli, F.; Savo, V.; Futagami, Y.; Strona, G.

    2016-01-01

    Biodeterioration is a major problem for the conservation of cultural heritage materials. We provide a new and original approach to analyzing changes in patterns of colonization (Biodeterioration patterns, BPs) by biological agents responsible for the deterioration of outdoor stone materials. Here we analyzed BPs of four Khmer temples in Angkor (Cambodia) exposed to variable environmental conditions, using qualitative ecological assessments and statistical approaches. The statistical analyses supported the findings obtained with the qualitative approach. Both approaches provided additional information not otherwise available using one single method. Our results indicate that studies on biodeterioration can benefit from integrating diverse methods so that conservation efforts might become more precise and effective. PMID:27597658

  9. Accuracy of genome-wide imputation of untyped markers and impacts on statistical power for association studies

    Directory of Open Access Journals (Sweden)

    McElwee Joshua

    2009-06-01

    The number of cis-eQTL discoveries detected by various methods can be interpreted as their relative statistical power in the GWAS. In this study, we find that imputation offers modest additional power (by 4%) on top of either Ilmn317K or Ilmn650Y, much less than the power gain from Ilmn317K to Ilmn650Y (13%). Conclusion: Current algorithms can accurately impute genotypes for untyped markers, which enables researchers to pool data between studies conducted using different SNP sets. While genotyping itself results in a small error rate (e.g. 0.5%), imputing genotypes is surprisingly accurate. We found that dense marker sets (e.g. Ilmn650Y) outperform sparser ones (e.g. Ilmn317K) in terms of imputation yield and accuracy. We also noticed it was harder to impute genotypes for African American samples, partially due to population admixture, although using a pooled reference boosts performance. Interestingly, GWAS carried out using imputed genotypes only slightly increased power on top of assayed SNPs, likely because adding more markers via imputation yields only a modest gain in genetic coverage while worsening the multiple-testing penalty. Furthermore, cis-eQTL mapping using the dense SNP set derived from imputation achieves greater resolution, and locates association peaks closer to causal variants than the conventional approach.

  10. A statistical assessment of pesticide pollution in surface waters using environmental monitoring data: Chlorpyrifos in Central Valley, California.

    Science.gov (United States)

    Wang, Dan; Singhasemanon, Nan; Goh, Kean S

    2016-11-15

    Pesticides are routinely monitored in surface waters and the resulting data are analyzed to assess whether their uses will damage aquatic ecosystems. However, the utility of the monitoring data is limited because of insufficient temporal and spatial sampling coverage and the inability to detect and quantify trace concentrations. This study developed a novel assessment procedure that addresses those limitations by combining 1) statistical methods capable of extracting information from concentrations below changing detection limits, 2) statistical resampling techniques that account for uncertainties rooted in the non-detects and insufficient/irregular sampling coverage, and 3) multiple lines of evidence that improve confidence in the final conclusion. This procedure was demonstrated by an assessment of chlorpyrifos monitoring data in surface waters of California's Central Valley (2005-2013). We detected a significant downward trend in the concentrations, which cannot be observed by commonly-used statistical approaches. We assessed that the aquatic risk was low using a probabilistic method that works with non-detects and has the ability to differentiate indicator groups with varying sensitivity. In addition, we showed that the frequency of exceedance of ambient aquatic life water quality criteria was affected by pesticide use, precipitation and irrigation demand in certain periods preceding the water sampling events.
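
    One way to extract a trend from concentrations with non-detects below changing detection limits, as described above, is a censored maximum-likelihood fit. The sketch below assumes a lognormal concentration model with a linear time trend; the data and detection limits are invented, and the model form is our assumption, not necessarily the study's actual procedure.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import norm

        def neg_log_lik(params, t, y, dl, detected):
            # censored lognormal: log C = a + b*t + eps, eps ~ N(0, s)
            a, b, log_s = params
            s = np.exp(log_s)
            mu = a + b * t
            ll_det = norm.logpdf(np.log(y[detected]), mu[detected], s)
            ll_cen = norm.logcdf(np.log(dl[~detected]), mu[~detected], s)  # P(C < DL)
            return -(ll_det.sum() + ll_cen.sum())

        rng = np.random.default_rng(1)
        t = np.sort(rng.uniform(0, 9, 300))            # years since start of record
        conc = np.exp(1.0 - 0.15 * t + rng.normal(0, 0.8, t.size))
        dl = np.where(t < 5, 0.5, 0.2)                 # detection limit improves over time
        detected = conc >= dl
        y = np.where(detected, conc, np.nan)           # non-detects reported only as < DL

        res = minimize(neg_log_lik, x0=[0.0, 0.0, 0.0],
                       args=(t, y, dl, detected), method="Nelder-Mead")
        a_hat, b_hat, _ = res.x
        print(f"fitted trend: {b_hat:+.3f} per year ({np.mean(~detected):.0%} non-detects)")

    Non-detects contribute through the cumulative distribution rather than being dropped or replaced by half the detection limit, which is what lets a trend be recovered from heavily censored records.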

  11. Regulating by the Numbers: Probabilistic Risk Assessment and Nuclear Power.

    Science.gov (United States)

    Nichols, Elizabeth; Wildavsky, Aaron

    1988-01-01

    Probabilistic risk assessment has been promoted within the Nuclear Regulatory Commission as a means of judging risk to the public and of determining regulatory measures. Interviews with engineers and other technically trained personnel reveal the difficulties created by expectations that this form of assessment should be applied. (TJH)

  12. An application and verification of ensemble forecasting on wind power to assess operational risk indicators in power grids

    Energy Technology Data Exchange (ETDEWEB)

    Alessandrini, S.; Ciapessoni, E.; Cirio, D.; Pitto, A.; Sperati, S. [Ricerca sul Sistema Energetico RSE S.p.A., Milan (Italy). Power System Development Dept. and Environment and Sustainable Development Dept.; Pinson, P. [Technical University of Denmark, Lyngby (Denmark). DTU Informatics

    2012-07-01

    Wind energy is part of the so-called non-schedulable renewable sources, i.e. it must be exploited when it is available, otherwise it is lost. In European regulation it has priority of dispatch over conventional generation, to maximize green energy production. However, being variable and uncertain, wind (and solar) generation raises several issues for the security of power grid operation. In particular, Transmission System Operators (TSOs) need forecasts that are as accurate as possible. Nowadays a deterministic approach to wind power forecasting (WPF) can easily be considered insufficient to face the uncertainty associated with wind energy. In order to obtain information about the accuracy of a forecast and a reliable estimation of its uncertainty, probabilistic forecasting is becoming increasingly widespread. In this paper we investigate the performance of the COnsortium for Small-scale MOdelling Limited area Ensemble Prediction System (COSMO-LEPS). First, the ensemble's properties (i.e. consistency, reliability) are assessed using different verification indices and diagrams calculated on wind power. Then we provide examples of how EPS-based wind power forecasts can be used in power system security analyses. Quantifying the forecast uncertainty allows the regulation reserve requirements to be determined more accurately, hence improving security of operation and reducing system costs. In particular, the paper also presents a probabilistic power flow (PPF) technique developed at RSE and aimed at evaluating the impact of wind power forecast accuracy on the probability of security violations in power systems. (orig.)
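
    A standard consistency check of the kind mentioned above is the rank histogram: for a statistically consistent ensemble, the observation's rank among the sorted members is uniformly distributed, giving a flat histogram. The synthetic data below stand in for COSMO-LEPS wind power forecasts; the member count and error model are assumptions for illustration.

        import numpy as np

        def rank_histogram(ens, obs):
            # ens: (n_times, n_members) forecasts; obs: (n_times,) observations
            ranks = np.sum(ens < obs[:, None], axis=1)   # rank 0 .. n_members
            return np.bincount(ranks, minlength=ens.shape[1] + 1)

        rng = np.random.default_rng(2)
        obs = rng.gamma(2.0, 0.2, size=1000)                       # normalized wind power
        ens = obs[:, None] + rng.normal(0, 0.1, size=(1000, 16))   # 16-member ensemble

        counts = rank_histogram(ens, obs)
        print(counts)   # roughly flat histogram -> statistically consistent ensemble

    A U-shaped histogram would indicate an under-dispersive ensemble, a dome-shaped one an over-dispersive ensemble.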

  13. Electric Power quality Analysis in research reactor: Impacts on nuclear safety assessment and electrical distribution reliability

    Energy Technology Data Exchange (ETDEWEB)

    Touati, Said; Chennai, Salim; Souli, Aissa [Nuclear Research Centre of Birine, Ain Oussera, Djelfa Province (Algeria)

    2015-07-01

    The increased requirements on supervision, control, and performance in modern power systems make power quality monitoring a common practice for utilities. Large databases are created and automatic processing of the data is required for fast and effective use of the available information. The aim of the work presented in this paper is the development of tools for analysing power quality monitoring data, in particular measurements of voltages and currents at various levels of the electrical power distribution. The study is extended to evaluate the reliability of the electrical system in a nuclear plant. Power quality is a measure of how well a system supports reliable operation of its loads. A power disturbance or event can involve voltage, current, or frequency. Power disturbances can originate in consumer power systems, consumer loads, or the utility. The effect of power quality problems is the loss of power supply, leading to severe damage to equipment, so we try to track and improve system reliability. The assessment focuses on the impact of short circuits on the system, harmonic distortion, power factor improvement and the effects of transient disturbances on the electrical system during motor starting and power system fault conditions. We also review the electrical system design against the Nuclear Directorate Safety Assessment principles, including those extended after the Fukushima nuclear accident. The simplified configuration of the required system can be extended from this basic scheme. For these studies, we used a demo version of the ETAP power station software for several simulations. (authors)

  14. Assessment of CCGT power plants - Comparison between new and repowering

    Energy Technology Data Exchange (ETDEWEB)

    Giger, Francois; Lehougre, Jean-Francois [EDF/DPIT, Staint Denis (France)

    2013-04-01

    Since 2007, EDF has repowered the Martigues conventional oil-fired power plant, erecting two 460 MWe combined cycle units fired with natural gas. The first unit went on line in 2012; the second is to be connected to the grid in 2013. A completely new 440 MWe combined cycle power plant was built at the Blenod site, which went into operation in the autumn of 2011. Both projects were realised by EDF in-house engineering teams. Post-build analysis confirmed the economic benefit, anticipated at the basic design stage, of the repowering approach compared to a greenfield construction.

  15. Hazard Identification, Risk Assessment and Risk Control (HIRARC) Accidents at Power Plant

    Directory of Open Access Journals (Sweden)

    Ahmad Asmalia Che

    2016-01-01

    Full Text Available Power plants have a reputation of being among the most hazardous workplace environments. Workers in a power plant face many safety risks due to the nature of the job. Although power plants are safer nowadays, since the industry has urged employers to improve employee safety, workers still encounter many hazards, and thus accidents, at the workplace. The aim of the present study is to investigate work-related accidents at power plants based on the HIRARC (Hazard Identification, Risk Assessment and Risk Control) process. The data were collected at two coal-fired power plants located in Malaysia. The findings of the study identify hazards and assess risks related to accidents that occurred at the power plants, and suggest possible control measures and corrective actions that power plants can use to prevent accidents.

  16. Area Based Approach for Three Phase Power Quality Assessment in Clarke Plane

    Directory of Open Access Journals (Sweden)

    S. CHATTOPADHYAY

    2008-03-01

    Full Text Available This paper presents an area-based approach for electric power quality analysis. Some specific reference signals have been defined, and the areas formed by the real power system data with the reference signals have been calculated, from which the contributions of the fundamental waveform and of harmonic components have been assessed separately. Active power, reactive power and total harmonic distortion factors have been measured. The Clarke transformation technique has been used for analysis in the three-phase system, which has reduced the computational effort to a great extent. Distortion factors of the individual phases of a three-phase system have also been assessed.
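
    As a sketch of how the Clarke transformation collapses a three-phase measurement into a two-dimensional (alpha-beta) representation before distortion factors are computed, consider the following example. The signal parameters (50 Hz fundamental, an added 5th harmonic, the sampling rate) are assumptions for illustration, and THD is computed spectrally here rather than by the paper's area-based method.

        import numpy as np

        f0, fs = 50.0, 10_000.0                 # fundamental and sampling frequencies
        t = np.arange(0, 0.2, 1 / fs)           # 10 full cycles -> leakage-free bins
        phases = 2 * np.pi * np.array([0, -1, 1]) / 3
        v_abc = np.array([np.sin(2*np.pi*f0*t + p)
                          + 0.08 * np.sin(5 * (2*np.pi*f0*t + p))  # 5th harmonic
                          for p in phases])

        # power-invariant Clarke transformation: abc -> alpha, beta, zero
        T = np.sqrt(2/3) * np.array([[1.0, -0.5, -0.5],
                                     [0.0, np.sqrt(3)/2, -np.sqrt(3)/2],
                                     [1/np.sqrt(2), 1/np.sqrt(2), 1/np.sqrt(2)]])
        v_ab0 = T @ v_abc

        # THD of the alpha component from its spectrum
        spec = np.abs(np.fft.rfft(v_ab0[0]))
        freqs = np.fft.rfftfreq(t.size, 1 / fs)
        fund = spec[np.argmin(np.abs(freqs - f0))]
        harm = [spec[np.argmin(np.abs(freqs - k * f0))] for k in range(2, 10)]
        print(f"THD(alpha) = {np.sqrt(sum(h**2 for h in harm)) / fund:.1%}")

    For a balanced system the zero component vanishes, so the three phase quantities reduce to the two Clarke components, which is where the computational saving noted in the abstract comes from.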

  17. Peer Assessment Enhances Student Learning: The Results of a Matched Randomized Crossover Experiment in a College Statistics Class.

    Science.gov (United States)

    Sun, Dennis L; Harris, Naftali; Walther, Guenther; Baiocchi, Michael

    2015-01-01

    Feedback has a powerful influence on learning, but it is also expensive to provide. In large classes it may even be impossible for instructors to provide individualized feedback. Peer assessment is one way to provide personalized feedback that scales to large classes. Besides these obvious logistical benefits, it has been conjectured that students also learn from the practice of peer assessment. However, this has never been conclusively demonstrated. Using an online educational platform that we developed, we conducted an in-class matched-set, randomized crossover experiment with high power to detect small effects. We establish that peer assessment causes a small but significant gain in student achievement. Our study also demonstrates the potential of web-based platforms to facilitate the design of high-quality experiments to identify small effects that were previously not detectable.

  18. Peer Assessment Enhances Student Learning: The Results of a Matched Randomized Crossover Experiment in a College Statistics Class.

    Directory of Open Access Journals (Sweden)

    Dennis L Sun

    Full Text Available Feedback has a powerful influence on learning, but it is also expensive to provide. In large classes it may even be impossible for instructors to provide individualized feedback. Peer assessment is one way to provide personalized feedback that scales to large classes. Besides these obvious logistical benefits, it has been conjectured that students also learn from the practice of peer assessment. However, this has never been conclusively demonstrated. Using an online educational platform that we developed, we conducted an in-class matched-set, randomized crossover experiment with high power to detect small effects. We establish that peer assessment causes a small but significant gain in student achievement. Our study also demonstrates the potential of web-based platforms to facilitate the design of high-quality experiments to identify small effects that were previously not detectable.

  19. Radiological Assessment for the Removal of Legacy BPA Power Lines that Cross the Hanford Site

    Energy Technology Data Exchange (ETDEWEB)

    Millsap, William J.; Brush, Daniel J.

    2013-11-13

    This paper discusses some radiological field monitoring and assessment methods used to assess the components of an old electrical power transmission line that ran across the Hanford Site between the production reactors area (100 Area) and the chemical processing area (200 Area). This task was complicated by the presence of radon daughters -- both beta and alpha emitters -- residing on the surfaces, particularly the surfaces of weathered metals and metals that had been electrically charged. In many cases, these activities were high compared to the DOE Surface Contamination Guidelines, which were used as guides for the assessment. The methods included the use of the Toulmin model of argument, represented using Toulmin diagrams, to represent the combined force of several strands of evidence, rather than a single measurement of activity, to demonstrate beyond a reasonable doubt that no or very little Hanford activity was present and mixed with the natural activity. A number of forms of evidence were used: the overall chance of Hanford contamination; measurements of removable activity, beta and alpha; 1-minute scaler counts of total surface activity, beta and alpha, using "background markers"; the beta activity to alpha activity ratios; measured contamination on nearby components; NaI gamma spectral measurements to compare uncontaminated and potentially-contaminated spectra, as well as measurements for the sentinel radionuclides, Am-241 and Cs-137, on conducting wire; comparative statistical analyses; and in-situ measurements of alpha spectra on conducting wire showing that the alpha activity was natural Po-210, as well as to compare uncontaminated and potentially-contaminated spectra.

  20. A Mixed-Methods Assessment of Using an Online Commercial Tutoring System to Teach Introductory Statistics

    Science.gov (United States)

    Xu, Yonghong Jade; Meyer, Katrina A.; Morgan, Dianne D.

    2009-01-01

    This study used a mixed-methods approach to evaluate a hybrid teaching format that incorporated an online tutoring system, ALEKS, to address students' learning needs in a graduate-level introductory statistics course. Student performance in the hybrid course with ALEKS was found to be no different from that in a course taught in a traditional…

  1. Rainfall Downscaling Conditional on Upper-air Atmospheric Predictors: Improved Assessment of Rainfall Statistics in a Changing Climate

    Science.gov (United States)

    Langousis, Andreas; Mamalakis, Antonis; Deidda, Roberto; Marrocu, Marino

    2015-04-01

    regional level. This is done for an intermediate-sized catchment in Italy, i.e. the Flumendosa catchment, using climate model rainfall and atmospheric data from the ENSEMBLES project (http://ensembleseu.metoffice.com). In doing so, we split the historical rainfall record of mean areal precipitation (MAP) into 15-year calibration and 45-year validation periods, and compare the historical rainfall statistics to those obtained from: a) Q-Q corrected climate model rainfall products, and b) synthetic rainfall series generated by the suggested downscaling scheme. To our knowledge, this is the first time that climate model rainfall and statistically downscaled precipitation are compared to catchment-averaged MAP at a daily resolution. The obtained results are promising, since the proposed downscaling scheme is more accurate and robust in reproducing a number of historical rainfall statistics, independently of the climate model used and the length of the calibration period. This is particularly the case for the yearly rainfall maxima, where direct statistical correction of climate model rainfall outputs shows increased sensitivity to the length of the calibration period and the climate model used. The robustness of the suggested downscaling scheme in modeling rainfall extremes at a daily resolution is a notable feature that can effectively be used to assess hydrologic risk at a regional level under changing climatic conditions. Acknowledgments: The research project is implemented within the framework of the Action «Supporting Postdoctoral Researchers» of the Operational Program "Education and Lifelong Learning" (Action's Beneficiary: General Secretariat for Research and Technology), and is co-financed by the European Social Fund (ESF) and the Greek State. CRS4 highly acknowledges the contribution of the Sardinian regional authorities.

  2. Fuel consumption and fire emissions estimates using Fire Radiative Power, burned area and statistical modelling on the fire event scale

    Science.gov (United States)

    Ruecker, Gernot; Leimbach, David; Guenther, Felix; Barradas, Carol; Hoffmann, Anja

    2016-04-01

    Fire Radiative Power (FRP) retrieved by infrared sensors, such as those flown on several polar-orbiting and geostationary satellites, has been shown to be proportional to fuel consumption rates in vegetation fires, and hence the total radiative energy released by a fire (Fire Radiative Energy, FRE) is proportional to the total amount of biomass burned. However, due to the sparse temporal coverage of polar-orbiting sensors and the coarse spatial resolution of geostationary sensors, it is difficult to estimate fuel consumption for single fire events. Here we explore an approach for estimating FRE through temporal integration of MODIS FRP retrievals over MODIS-derived burned areas. Temporal integration is aided by statistical modelling to estimate missing observations using a generalized additive model (GAM), taking advantage of additional information such as land cover and a global dataset of the Canadian Fire Weather Index (FWI), as well as diurnal and annual FRP fluctuation patterns. Based on results from study areas located in savannah regions of Southern and Eastern Africa and Brazil, we compare this method to estimates based on simple temporal integration of FRP retrievals over the fire lifetime, and estimate the potential variability of FRP integration results across a range of fire sizes. We compare FRE-based fuel consumption against a database of field experiments in similar landscapes. Results show that for larger fires, this method yields realistic estimates and is more robust than simple temporal integration when only a small number of observations is available. Finally, we offer an outlook on the integration of data from other satellites, specifically FireBird, S-NPP VIIRS and Sentinel-3, as well as on using higher-resolution burned area datasets derived from Landsat and similar sensors.
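
    The baseline "simple temporal integration" mentioned above can be sketched directly: integrate the FRP time series over the fire's lifetime to obtain FRE, then convert to fuel consumed. The overpass times and FRP values below are invented, and the 0.368 kg/MJ conversion factor is the laboratory-derived coefficient commonly cited in the FRP literature (Wooster and colleagues); treat both as assumptions of the sketch.

        import numpy as np

        t_hours = np.array([0.0, 1.6, 4.1, 10.3, 13.9, 22.7])     # overpass times (h)
        frp_mw = np.array([35.0, 120.0, 260.0, 90.0, 40.0, 5.0])  # retrieved FRP (MW)

        fre_mj = np.trapz(frp_mw, t_hours * 3600.0)   # MW * s = MJ
        fuel_kg = 0.368 * fre_mj                      # assumed kg of biomass per MJ of FRE
        print(f"FRE = {fre_mj:.3e} MJ, fuel consumed = {fuel_kg / 1000:.1f} t")

    The GAM-based approach in the paper essentially fills the gaps between such sparse overpasses before the integration is performed.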

  3. Development and Application of On-line Wind Power Risk Assessment System

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Because of the large-scale integration of wind power, the dynamic characteristics of the power system are subject to many uncertain effects. Being based on deterministic analysis methods that consider only the most serious and credible accidents, a traditional on-line security assessment system cannot quantitatively estimate the actual operating conditions of the power system. Therefore, risk theory is introduced into the on-line security assessment, and an on-line risk assessment and dynamic security assessment system for wind power is designed and implemented based on multiple data integration. Combining the wind power disturbance probability with the security assessment of the power grid yields security indices in different aspects. The operating risk index is an expectation of severity, computed by summing the products of each outcome's probability and its severity. Analysis results are reported to the dispatchers in the on-line environment, while the comprehensive weak links are automatically provided to the power dispatching center. Operation of the risk assessment system has verified its reasonableness.
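
    The operating risk index lends itself to a one-line illustration. The outcome probabilities and severity scores below are illustrative placeholders, not values from the paper:

        # (probability of the outcome, severity score) -- illustrative values only
        outcomes = [
            (0.90, 0.0),    # normal operation
            (0.07, 2.0),    # line overload
            (0.02, 5.0),    # voltage limit violation
            (0.01, 20.0),   # load shedding
        ]

        risk_index = sum(p * severity for p, severity in outcomes)
        print(f"operating risk index = {risk_index:.2f}")

    Because the index is an expectation, a frequent mild violation and a rare severe one can contribute comparably, which is exactly what a deterministic worst-case criterion cannot express.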

  4. A Critical Analysis and Assessment of High Power Switches

    Science.gov (United States)

    1978-09-01

    [The record's abstract is fragmentary OCR of the report's front matter and body. Recoverable fragments include list-of-figures entries (a spark gap switch assembly; a turbulent flow switch), a statement that some spark gaps provide such a low inductance that the current risetime is limited by the load rather than the switch itself, and a reference to M. A. Lutz and G. A. Hofmann, "The Gamitron - A High Power Crossed-Field Switch Tube for HVDC Interruption," IEEE Trans. on Plasma Science.]

  5. Assessment of the thorium fuel cycle in power reactors

    Energy Technology Data Exchange (ETDEWEB)

    Kasten, P.R.; Homan, F.J.; Allen, E.J.

    1977-01-01

    A study was conducted at Oak Ridge National Laboratory to evaluate the role of thorium fuel cycles in power reactors. Three thermal reactor systems were considered: Light Water Reactors (LWRs); High-Temperature Gas-Cooled Reactors (HTGRs); and Heavy Water Reactors (HWRs) of the Canadian Deuterium Uranium Reactor (CANDU) type; most of the effort was on these systems. A summary comparing thorium and uranium fuel cycles in Fast Breeder Reactors (FBRs) was also compiled.

  6. Dynamic Security Assessment of Danish Power System Based on Decision Trees: Today and Tomorrow

    DEFF Research Database (Denmark)

    Rather, Zakir Hussain; Liu, Leo; Chen, Zhe;

    2013-01-01

    The research work presented in this paper analyzes the impact of wind energy, the phasing out of central power plants and cross-border power exchange on the dynamic security of the Danish Power System. A contingency-based decision tree (DT) approach is used to assess the dynamic security of the present and future Danish Power System. Results from offline time-domain simulation for a large number of possible operating conditions (OCs) and critical contingencies are organized to build up the database, which is then used to predict the security of the present and future power system. The mentioned approach is implemented in the DIgSILENT PowerFactory environment and applied to the western Danish Power System, which is passing through a phase of major transformation. The results have shown that the phasing out of central power plants coupled with large-scale wind energy integration and more dependence on international ties can have...
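
    The contingency-based DT workflow can be sketched as follows: off-line simulations over many operating conditions label each case secure or insecure, and a tree trained on steady-state features then classifies unseen conditions on-line. Everything below (the features, the toy labeling rule standing in for time-domain simulation, the tree depth) is an invented stand-in, not the paper's database:

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(3)
        n = 5000
        X = np.column_stack([
            rng.uniform(0.0, 1.0, n),     # wind share of generation
            rng.uniform(0.5, 1.0, n),     # central-plant commitment level
            rng.uniform(-1.0, 1.0, n),    # cross-border exchange (p.u.)
        ])
        # toy labeling rule standing in for off-line time-domain simulation results
        insecure = (X[:, 0] - 0.6 * X[:, 1] + 0.3 * np.abs(X[:, 2]) > 0.55).astype(int)

        X_tr, X_te, y_tr, y_te = train_test_split(X, insecure, random_state=0)
        dt = DecisionTreeClassifier(max_depth=4).fit(X_tr, y_tr)
        print(f"hold-out accuracy: {dt.score(X_te, y_te):.2%}")

    The appeal of the DT for on-line use is that, once trained, classifying a new operating condition costs only a handful of threshold comparisons.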

  7. Assessment of distributed solar power systems: Issues and impacts

    Science.gov (United States)

    Moyle, R. A.; Chernoff, H.; Schweizer, T. C.; Patton, J. B.

    1982-11-01

    The installation of distributed solar-power systems presents electric utilities with a host of questions. Some of the technical and economic impacts of these systems are discussed. Among the technical interconnect issues are isolated operation, power quality, line safety, and metering options. Economic issues include user purchase criteria, structures and installation costs, marketing and product distribution costs, and interconnect costs. An interactive computer program that allows easy calculation of allowable system prices and allowable generation-equipment prices was developed as part of this project. It is concluded that the technical problems raised by distributed solar systems are surmountable, but their resolution may be costly. The stringent purchase criteria likely to be imposed by many potential system users and the economies of large-scale systems make small systems (less than 10 to 20 kW) less attractive than larger systems. Utilities that consider life-cycle costs in making investment decisions and third-party investors who have tax and financial advantages are likely to place the highest value on solar-power systems.

  8. Assessment of Microbial Fuel Cell Configurations and Power Densities

    KAUST Repository

    Logan, Bruce E.

    2015-07-30

    Different microbial electrochemical technologies are being developed for many diverse applications, including wastewater treatment, biofuel production, water desalination, remote power sources, and as biosensors. Current and energy densities will always be limited relative to batteries and chemical fuel cells, but these technologies have other advantages based on the self-sustaining nature of the microorganisms that can donate or accept electrons from an electrode, the range of fuels that can be used, and versatility in the chemicals that can be produced. The high cost of membranes will likely limit applications of microbial electrochemical technologies that might require a membrane. For microbial fuel cells, which do not need a membrane, questions remain on whether larger-scale systems can produce power densities similar to those obtained in laboratory-scale systems. It is shown here that configuration and fuel (pure chemicals in laboratory media versus actual wastewaters) remain the key factors in power production, rather than the scale of the application. Systems must be scaled up through careful consideration of electrode spacing and packing per unit volume of reactor.

  9. An analysis of I/O efficient order-statistic-based techniques for noise power estimation in the HRMS sky survey's operational system

    Science.gov (United States)

    Zimmerman, G. A.; Olsen, E. T.

    1992-01-01

    Noise power estimation in the High-Resolution Microwave Survey (HRMS) sky survey element is considered as an example of a constant false alarm rate (CFAR) signal detection problem. Order-statistic-based noise power estimators for CFAR detection are considered in terms of required estimator accuracy and estimator dynamic range. By limiting the dynamic range of the value to be estimated, the performance of an order-statistic estimator can be achieved by simpler techniques requiring only a single pass of the data. Simple threshold-and-count techniques are examined, and it is shown how several parallel threshold-and-count estimation devices can be used to expand the dynamic range to meet HRMS system requirements with minimal hardware complexity. An input/output (I/O) efficient limited-precision order-statistic estimator with wide but limited dynamic range is also examined.
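
    The two estimator families discussed above can be contrasted on synthetic noise-only spectra: a quantile-based (order-statistic) estimate, which requires a sort, versus a single-pass threshold-and-count estimate that infers the noise power from the fraction of bins falling below a trial threshold. The exponential noise model and the chosen threshold are assumptions for the sketch, not the HRMS system's actual parameters:

        import numpy as np

        rng = np.random.default_rng(4)
        spectrum = rng.exponential(3.0, size=2**16)   # noise-only power bins, true P = 3

        # order-statistic estimate: median scaled for exponential noise
        est_os = np.median(spectrum) / np.log(2)

        # single-pass threshold-and-count: P(x < T) = 1 - exp(-T/P) for exponential noise
        T = 2.0
        frac = np.mean(spectrum < T)
        est_tc = -T / np.log1p(-frac)
        print(f"order-statistic: {est_os:.2f}, threshold-and-count: {est_tc:.2f}")

    Running several such counters at staggered thresholds in parallel is one way to widen the usable dynamic range while keeping the single-pass, sort-free structure the abstract highlights.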

  10. Hanford groundwater modeling: statistical methods for evaluating uncertainty and assessing sampling effectiveness

    Energy Technology Data Exchange (ETDEWEB)

    McLaughlin, D.B.

    1979-01-01

    This report is the first in a series of three documents which address the role of uncertainty in the Rockwell Hanford Operations groundwater model development and application program at Hanford Site. Groundwater data collection activities at Hanford are reviewed as they relate to Rockwell groundwater modeling. Methods of applying statistical and probability theory in quantifying the propagation of uncertainty from field measurements to model predictions are discussed. It is shown that measures of model accuracy or uncertainty provided by a statistical analysis can be useful in guiding model development and sampling network design. Recommendations are presented in the areas of model input data needs, parameter estimation data needs, and model verification and variance estimation data needs. 8 figures.

  11. Assessing climate change impacts on the Iberian power system using a coupled water-power model

    DEFF Research Database (Denmark)

    Cardenal, Silvio Javier Pereira; Madsen, Henrik; Arnbjerg-Nielsen, Karsten;

    2014-01-01

    Climate change is expected to have a negative impact on the power system of the Iberian Peninsula; changes in river runoff are expected to reduce hydropower generation, while higher temperatures are expected to increase summer electricity demand, when water resources are already limited. However, these impacts have not yet been evaluated at the peninsular level. We coupled a hydrological model with a power market model to study three impacts of climate change on the current Iberian power system: changes in hydropower production caused by changes in precipitation and temperature, changes in temporal patterns of electricity demand caused by temperature changes, and changes in irrigation water use caused by temperature and precipitation changes. A stochastic dynamic programming approach was used to develop operating rules for the integrated system given hydrological uncertainty. We found that changes...
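
    A minimal sketch of the stochastic dynamic programming idea follows: backward induction over discretized storage states yields a monthly release rule under uncertain inflows. All numbers, the concave release benefit, and the single-reservoir abstraction are invented; the actual coupled water-power model is far more detailed:

        import numpy as np

        storages = np.linspace(0, 100, 21)        # discretized storage states
        releases = np.linspace(0, 30, 16)         # candidate monthly releases
        inflows = np.array([5.0, 15.0, 30.0])     # inflow scenarios
        p_inflow = np.array([0.3, 0.5, 0.2])      # scenario probabilities
        n_months = 12
        V = np.zeros(storages.size)               # terminal value function
        policy = np.zeros((n_months, storages.size))

        for m in reversed(range(n_months)):
            V_new = np.empty_like(V)
            for i, s in enumerate(storages):
                best_val, best_r = -np.inf, 0.0
                for r in releases:
                    val = 0.0
                    for q, p in zip(inflows, p_inflow):
                        r_feas = min(r, s + q)                 # cannot release more than available
                        s_next = min(s + q - r_feas, 100.0)    # spill above capacity
                        j = int(np.argmin(np.abs(storages - s_next)))
                        val += p * (np.sqrt(r_feas) + V[j])    # concave benefit + value-to-go
                    if val > best_val:
                        best_val, best_r = val, r
                V_new[i], policy[m, i] = best_val, best_r
            V = V_new

        print("month-0 release rule (storage -> release):")
        print(dict(zip(storages[::5], policy[0, ::5])))

    The concave benefit makes spreading releases across months worthwhile, so the derived rule hedges against dry-inflow scenarios instead of always releasing the maximum.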

  12. Independent Orbiter Assessment (IOA): Assessment of the electrical power distribution and control subsystem, volume 1

    Science.gov (United States)

    Schmeckpeper, K. R.

    1988-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA first completed an analysis of the Electrical Power Distribution and Control (EPD and C) hardware, generating draft failure modes and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The IOA results were then compared to the NASA FMEA/CIL baseline with proposed Post 51-L updates included. A resolution of each discrepancy from the comparison is provided through additional analysis as required. This report documents the results of that comparison for the Orbiter EPD and C hardware. The IOA product for the EPD and C analysis consisted of 1671 failure mode analysis worksheets that resulted in 468 potential critical items being identified. Comparison was made to the proposed NASA Post 51-L baseline which consisted of FMEAs and 158 CIL items. Volume 1 contains the EPD and C subsystem description, analysis results, ground rules and assumptions, and some of the IOA worksheets.

  13. 75 FR 61779 - R.E. Ginna Nuclear Power Plant, LLC; R.E. Ginna Nuclear Power Plant Environmental Assessment and...

    Science.gov (United States)

    2010-10-06

    ... COMMISSION R.E. Ginna Nuclear Power Plant, LLC; R.E. Ginna Nuclear Power Plant Environmental Assessment and... Operating License No. DPR-18, issued to R.E. Ginna Nuclear Power Plant, LLC (the licensee), for operation of the R.E. Ginna Nuclear Power Plant (Ginna), located in Ontario, New York. In accordance with 10 CFR...

  14. Assessment of Lower Limb Muscle Strength and Power Using Hand-Held and Fixed Dynamometry: A Reliability and Validity Study.

    Directory of Open Access Journals (Sweden)

    Benjamin F Mentiplay

    Full Text Available Hand-held dynamometry (HHD) has never previously been used to examine isometric muscle power. Rate of force development (RFD) is often used for muscle power assessment, however no consensus currently exists on the most appropriate method of calculation. The aim of this study was to examine the reliability of different algorithms for RFD calculation and to examine the intra-rater, inter-rater, and inter-device reliability of HHD, as well as the concurrent validity of HHD for the assessment of isometric lower limb muscle strength and power. 30 healthy young adults (age: 23±5 yrs, male: 15) were assessed in two sessions. Isometric muscle strength and power were measured using peak force and RFD respectively, using two HHDs (Lafayette Model-01165 and Hoggan microFET2) and a criterion-reference KinCom dynamometer. Statistical analysis of reliability and validity comprised intraclass correlation coefficients (ICC), Pearson correlations, concordance correlations, standard error of measurement, and minimal detectable change. Comparison of RFD methods revealed that a peak 200 ms moving window algorithm provided optimal reliability results. Intra-rater, inter-rater, and inter-device reliability analysis of peak force and RFD revealed mostly good to excellent reliability (coefficients ≥ 0.70) for all muscle groups. Concurrent validity analysis showed moderate to excellent relationships between HHD and fixed dynamometry for the hip and knee (ICCs ≥ 0.70) for both peak force and RFD, with mostly poor to good results shown for the ankle muscles (ICCs = 0.31-0.79). Hand-held dynamometry has good to excellent reliability and validity for most measures of isometric lower limb strength and power in a healthy population, particularly for proximal muscle groups. To aid implementation we have created freely available software to extract these variables from data stored on the Lafayette device. Future research should examine the reliability and validity of these variables in
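
    A sketch of the peak force and the peak 200 ms moving-window RFD computation, which the study found most reliable, is given below. The simulated force trace and sampling rate are our assumptions, and this is not the study's released software:

        import numpy as np

        fs = 1000                                     # sampling rate in Hz (assumed)
        t = np.arange(0, 3, 1 / fs)
        rng = np.random.default_rng(5)
        force = 400 * (1 - np.exp(-t / 0.35)) + rng.normal(0, 4, t.size)  # simulated trace (N)

        peak_force = force.max()                      # isometric strength (N)

        win = int(0.2 * fs)                           # 200 ms window
        rfd_windows = (force[win:] - force[:-win]) / 0.2   # mean slope of every window (N/s)
        peak_rfd = rfd_windows.max()                  # peak 200 ms moving-window RFD
        print(f"peak force = {peak_force:.0f} N, peak RFD = {peak_rfd:.0f} N/s")

    Averaging the slope over a fixed 200 ms window damps sample-to-sample noise, which is a plausible reason this algorithm proved more reliable than instantaneous-slope variants.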

  15. Early Mission Power Assessment of the Dawn Solar Array

    Science.gov (United States)

    Stella, Paul M.; DiStefano, Salvatore; Rayman, Marc D.; Ulloa-Severino, Antonio

    2009-01-01

    NASA's Discovery Mission Dawn was launched in September 2007. Dawn will be the first to orbit two asteroids on a single voyage. The solar array for the Dawn mission will provide power under greatly varying illumination and temperature conditions. Dawn's ion propulsion system (IPS) will provide the spacecraft with enough thrust to reach Vesta and Ceres and orbit both. The demanding mission would be impossible without ion propulsion -- a mission only to the asteroid Vesta (and not including Ceres) would require a much more massive spacecraft and a much larger launch vehicle.

  16. Assessment of solar-powered cooling of buildings. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Curran, H.M.

    1975-04-01

    Three solar-powered cooling concepts are analyzed and evaluated. These are: (1) the solar Rankine concept in which a Rankine cycle driven by solar energy is used to drive a vapor compression refrigeration machine, (2) the solar-assisted Rankine concept in which a Rankine cycle driven by both solar energy and fuel combustion is used to drive a vapor compression refrigeration machine, and (3) the solar absorption concept in which solar energy is used to drive an absorption refrigeration machine. These concepts are compared on the bases of coefficient of performance, requirements for primary fuel input, and economic considerations. Conclusions and recommendations are presented. (WHK)

  17. Assessing China’s Soft Power in Asia: Implications for U.S. Strategy

    Science.gov (United States)

    2012-04-01

    [The record's abstract is fragmentary OCR of the report's documentation page. Recoverable content includes a quotation on Chinese media control ("...and film, the freedom is not there. (The Chinese government) really tries to control and manage it, and there is self-censorship as well.") and the report's title and author: Assessing China's Soft Power in Asia: Implications for U.S. Strategy, by Colonel Michael R. Lwin, United States Army.]

  18. Using Structural Equation Modeling to Assess Functional Connectivity in the Brain: Power and Sample Size Considerations

    Science.gov (United States)

    Sideridis, Georgios; Simos, Panagiotis; Papanicolaou, Andrew; Fletcher, Jack

    2014-01-01

    The present study assessed the impact of sample size on the power and fit of structural equation modeling applied to functional brain connectivity hypotheses. The data consisted of time-constrained minimum norm estimates of regional brain activity during performance of a reading task obtained with magnetoencephalography. Power analysis was first…

  19. 75 FR 42790 - Exelon Generation Company, LLC; Clinton Power Station; Environmental Assessment and Finding of No...

    Science.gov (United States)

    2010-07-22

    ... COMMISSION Exelon Generation Company, LLC; Clinton Power Station; Environmental Assessment and Finding of No...-62, issued to Exelon Generation Company, LLC (the licensee), for operation of the Clinton Power... have access to ADAMS or who encounter problems in accessing the documents located in ADAMS...

  20. 75 FR 14638 - FirstEnergy Nuclear Operating Company; Perry Nuclear Power Plant; Environmental Assessment and...

    Science.gov (United States)

    2010-03-26

    ... COMMISSION FirstEnergy Nuclear Operating Company; Perry Nuclear Power Plant; Environmental Assessment and...Energy Nuclear Operating Company (FENOC, the licensee), for operation of the Perry Nuclear Power Plant... Manager, Plant Licensing Branch III-2, Division of Operating Reactor Licensing, Office of Nuclear...

  1. ASSESSMENT OF COMBINED HEAT AND POWER SYSTEM"PREMIUM POWER" APPLICATIONS IN CALIFORNIA

    Energy Technology Data Exchange (ETDEWEB)

    Norwood, Zack; Lipman, Timothy; Stadler, Michael; Marnay, Chris

    2010-06-01

    The effectiveness of combined heat and power (CHP) systems for power-interruption-intolerant, "premium power," facilities is the focus of this study. Through three real-world case studies and economic cost minimization modeling, the economic and environmental performance of "premium power" CHP is analyzed. The results of the analysis for a brewery, a data center, and a hospital lead to some interesting conclusions about CHP, limited to the specific CHP technologies installed at those sites. Firstly, facilities with high heating loads prove to be the most appropriate for CHP installations from a purely economic standpoint. Secondly, waste-heat-driven thermal cooling systems are only economically attractive if the efficiency of these chillers can increase above the current best system efficiency. Thirdly, if the reliability of CHP systems proves to be as high as that of diesel generators, they could replace those generators at little or no additional cost if the thermal to electric (relative) load of those facilities was already high enough to economically justify a CHP system. Lastly, in terms of greenhouse gas emissions, the modeled CHP systems provide some degree of decreased emissions, estimated at approximately 10 percent for the hospital, the application with the highest relative thermal load in this case.

  2. Life-assessment technique for nuclear power plant cables

    Science.gov (United States)

    Bartoníček, B.; Hnát, V.; Plaček, V.

    1998-06-01

    The condition of polymer-based cable material can be best characterized by measuring the elongation at break of its insulating materials. However, it is often not possible to take sufficiently large samples for measurement with a tensile testing machine. The problem has been conveniently solved by utilizing the differential scanning calorimetry technique: from the tested cable, several microsamples are taken and the oxidation induction time (OIT) is determined. For each cable subject to lifetime assessment, the correlation of OIT with elongation at break and the correlation of elongation at break with the cable service time have to be established. A reliable assessment of the cable lifetime depends on the accuracy of these correlations. Consequently, synergistic effects well known at this time - dose rate effects and effects resulting from the different sequence of applying radiation and elevated temperature - must be taken into account.

  3. Archival Legacy Investigations of Circumstellar Environments (ALICE): Statistical assessment of point source detections

    CERN Document Server

    Choquet, É; Soummer, R; Perrin, M D; Hagan, J B; Gofas-Salas, E; Rajan, A; Aguilar, J

    2015-01-01

    The ALICE program, for Archival Legacy Investigations of Circumstellar Environments, is currently conducting a virtual survey of about 400 stars by re-analyzing the HST-NICMOS coronagraphic archive with advanced post-processing techniques. We present here the strategy that we adopted to identify detections and potential candidates for follow-up observations, and we give a preliminary overview of our detections. We present a statistical analysis conducted to evaluate the confidence level of these detections and the completeness of our candidate search.

  4. Statistical challenges in assessing potential efficacy of complex interventions in pilot or feasibility studies.

    Science.gov (United States)

    Wilson, Duncan T; Walwyn, Rebecca Ea; Brown, Julia; Farrin, Amanda J; Brown, Sarah R

    2016-06-01

    Early phase trials of complex interventions currently focus on assessing the feasibility of a large randomised control trial and on conducting pilot work. Assessing the efficacy of the proposed intervention is generally discouraged, due to concerns of underpowered hypothesis testing. In contrast, early assessment of efficacy is common for drug therapies, where phase II trials are often used as a screening mechanism to identify promising treatments. In this paper, we outline the challenges encountered in extending ideas developed in the phase II drug trial literature to the complex intervention setting. The prevalence of multiple endpoints and clustering of outcome data are identified as important considerations, having implications for timely and robust determination of optimal trial design parameters. The potential for Bayesian methods to help to identify robust trial designs and optimal decision rules is also explored.

  5. Assessment of factors responsible for polymer electrolyte membrane fuel cell electrode performance by statistical analysis

    Energy Technology Data Exchange (ETDEWEB)

    Velayutham, G.; Dhathathreyan, K.S.; Rajalakshmi, N. [Centre For Fuel Cell Technology, Project of ARC International, 120, Mambakkam Main Road, Medavakkam, Chennai 600 100 (India); Sampangi Raman, D. [Indian Statistical Institute, Nelson Manickam Road, Chennai 600 029 (India)

    2009-06-01

    The performance of a fuel cell electrode depends on many factors: the types of materials and their properties, composition, process parameters and fuel cell operating conditions. In the present paper, cathode electrode performance in a PEM fuel cell is studied as a function of the Teflon concentration in the substrate materials and in the micro-layer carbon, the pore former in the micro-layer, the amount of carbon used in the diffusion layer, and the platinum and Nafion loadings in the catalyst layer. These six factors, each at two levels, are considered. A full factorial design would have required 2^6 = 64 experiments to be carried out. With the use of the Taguchi method's L12 design, the number of experiments can be reduced to 12. The electrode current density values are taken as responses for the analysis. Statistical sensitivity analysis (ANOVA) is used to compute the effects and the contributions of the various factors to the fuel cell electrode performance. Some graphic representations are employed in order to display the results of the statistical analysis made for different current values. The behavior of the cathode PEM fuel cell electrode was studied using humidified hydrogen and compressed air. The present paper examines the six main factors, and their levels, responsible for altering the performance, particularly when the fuel cell is operated at ambient pressure. (author)

  6. On the Assessment of Monte Carlo Error in Simulation-Based Statistical Analyses.

    Science.gov (United States)

    Koehler, Elizabeth; Brown, Elizabeth; Haneuse, Sebastien J-P A

    2009-05-01

    Statistical experiments, more commonly referred to as Monte Carlo or simulation studies, are used to study the behavior of statistical methods and measures under controlled situations. Whereas recent computing and methodological advances have permitted increased efficiency in the simulation process, known as variance reduction, such experiments remain limited by their finite nature and hence are subject to uncertainty; when a simulation is run more than once, different results are obtained. However, virtually no emphasis has been placed on reporting the uncertainty, referred to here as Monte Carlo error, associated with simulation results in the published literature, or on justifying the number of replications used. These deserve broader consideration. Here we present a series of simple and practical methods for estimating Monte Carlo error as well as determining the number of replications required to achieve a desired level of accuracy. The issues and methods are demonstrated with two simple examples, one evaluating operating characteristics of the maximum likelihood estimator for the parameters in logistic regression and the other in the context of using the bootstrap to obtain 95% confidence intervals. The results suggest that in many settings, Monte Carlo error may be more substantial than traditionally thought.
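
    The basic recipe, reporting a Monte Carlo standard error alongside a simulation estimate and sizing the number of replications from it, can be sketched as follows. The target quantity (coverage of a nominal 95% confidence interval for a normal mean) is our own example, not one from the paper:

        import numpy as np

        rng = np.random.default_rng(6)
        R, n = 2000, 30                     # replications, sample size per replication
        covered = np.empty(R)
        for r in range(R):
            x = rng.normal(5.0, 2.0, n)
            half = 1.96 * x.std(ddof=1) / np.sqrt(n)
            covered[r] = abs(x.mean() - 5.0) < half

        p_hat = covered.mean()
        mc_se = np.sqrt(p_hat * (1 - p_hat) / R)    # Monte Carlo error of the estimate
        print(f"coverage = {p_hat:.3f} +/- {mc_se:.3f} (MC standard error)")

        target = 0.002                              # desired MC standard error
        print("replications needed:", int(np.ceil(p_hat * (1 - p_hat) / target**2)))

    Because the Monte Carlo error shrinks only as the square root of R, halving it requires four times as many replications, which is why justifying R explicitly matters.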

  7. A statistical and experimental approach for assessing the preservation of plant lipids in soil

    Science.gov (United States)

    Mueller, K. E.; Eissenstat, D. M.; Oleksyn, J.; Freeman, K. H.

    2011-12-01

    Plant-derived lipids contribute to stable soil organic matter, but further interpretations of their abundance in soils are limited because the factors that control lipid preservation are poorly understood. Using data from a long-term field experiment and simple statistical models, we provide novel constraints on several predictors of the concentration of hydrolyzable lipids in forest mineral soils. Focal lipids included common monomers of cutin, suberin, and plant waxes present in tree leaves and roots. Soil lipid concentrations were most strongly influenced by the concentrations of lipids in leaves and roots of the overlying trees, but were also affected by the type of lipid (e.g. alcohols vs. acids), lipid chain length, and whether lipids originated in leaves or roots. Collectively, these factors explained ~80% of the variation in soil lipid concentrations beneath 11 different tree species. In order to use soil lipid analyses to test and improve conceptual models of soil organic matter stabilization, additional studies that provide experimental and quantitative (i.e. statistical) constraints on plant lipid preservation are needed.

  8. Assessment of statistical uncertainty in the quantitative analysis of solid samples in motion using laser-induced breakdown spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Cabalin, L.M.; Gonzalez, A. [Department of Analytical Chemistry, University of Malaga, E-29071 Malaga (Spain); Ruiz, J. [Department of Applied Physics I, University of Malaga, E-29071 Malaga (Spain); Laserna, J.J., E-mail: laserna@uma.e [Department of Analytical Chemistry, University of Malaga, E-29071 Malaga (Spain)

    2010-08-15

    Statistical uncertainty in the quantitative analysis of solid samples in motion by laser-induced breakdown spectroscopy (LIBS) has been assessed. For this purpose, a LIBS demonstrator was designed and constructed in our laboratory. The LIBS system consisted of a laboratory-scale conveyor belt, a compact optical module and a Nd:YAG laser operating at 532 nm. The speed of the conveyor belt was variable and could be adjusted up to a maximum speed of 2 m s⁻¹. Statistical uncertainty in the analytical measurements was estimated in terms of precision (reproducibility and repeatability) and accuracy. The results obtained by LIBS on shredded scrap samples under real conditions have demonstrated that the analytical precision and accuracy of LIBS is dependent on the sample geometry, position on the conveyor belt and surface cleanliness. Flat, relatively clean scrap samples exhibited acceptable reproducibility and repeatability; by contrast, samples with an irregular shape or a dirty surface exhibited a poor relative standard deviation.

  9. Power and related statistical properties of conditional likelihood score tests for association studies in nuclear families with parental genotypes.

    Science.gov (United States)

    Li, Z; Gastwirth, J L; Gail, M H

    2005-05-01

    Both population based and family based case control studies are used to test whether particular genotypes are associated with disease. While population based studies have more power, cryptic population stratification can produce false-positive results. Family-based methods have been introduced to control for this problem. This paper presents the full likelihood function for family-based association studies for nuclear families ascertained on the basis of their number of affected and unaffected children. The likelihood of a family factors into the probability of parental mating type, conditional on offspring phenotypes, times the probability of offspring genotypes given their phenotypes and the parental mating type. The first factor can be influenced by population stratification, whereas the latter factor, called the conditional likelihood, is not. The conditional likelihood is used to obtain score tests with proper size in the presence of population stratification (see also Clayton (1999) and Whittemore & Tu (2000)). Under either the additive or multiplicative model, the TDT is known to be the optimal score test when the family has only one affected child. Thus, the class of score tests explored can be considered as a general family of TDT-like procedures. The relative informativeness of the various mating types is assessed using the Fisher information, which depends on the number of affected and unaffected offspring and the penetrances. When the additive model is true, families with parental mating type Aa x Aa are most informative. Under the dominant (recessive) model, however, a family with mating type Aa x aa(AA x Aa) is more informative than a family with doubly heterozygous (Aa x Aa) parents. Because we derive explicit formulae for all components of the likelihood, we are able to present tables giving required sample sizes for dominant, additive and recessive inheritance models.
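
    A minimal sketch of the factorization described above, in notation we introduce here (M: parental mating type, G: offspring genotypes, Φ: offspring phenotypes):

        \[
          L \;=\; \underbrace{P(M \mid \Phi)}_{\text{sensitive to stratification}}
          \;\times\; \underbrace{P(G \mid \Phi,\, M)}_{\text{conditional likelihood}}
        \]

    Score tests built from the second factor alone retain proper size under cryptic population stratification, because only the mating-type factor depends on the population's allele-frequency structure.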

  10. An Interactive Assessment Framework for Visual Engagement: Statistical Analysis of a TEDx Video

    Science.gov (United States)

    Farhan, Muhammad; Aslam, Muhammad

    2017-01-01

    This study aims to assess the visual engagement of the video lectures. This analysis can be useful for the presenter and student to find out the overall visual attention of the videos. For this purpose, a new algorithm and data collection module are developed. Videos can be transformed into a dataset with the help of data collection module. The…

  11. Comparison of Asian Aquaculture Products by Use of Statistically Supported Life Cycle Assessment

    NARCIS (Netherlands)

    Henriksson, P.J.G.; Rico Artero, A.; Zhang, W.; Nahid, S.S.A.; Newton, R.; Phan, L.T.; Zhang, Z.

    2015-01-01

    We investigated aquaculture production of Asian tiger shrimp, whiteleg shrimp, giant river prawn, tilapia, and pangasius catfish in Bangladesh, China, Thailand, and Vietnam by using life cycle assessments (LCAs), with the purpose of evaluating the comparative eco-efficiency of producing different aq

  12. Statistical Classification for Cognitive Diagnostic Assessment: An Artificial Neural Network Approach

    Science.gov (United States)

    Cui, Ying; Gierl, Mark; Guo, Qi

    2016-01-01

    The purpose of the current investigation was to describe how the artificial neural networks (ANNs) can be used to interpret student performance on cognitive diagnostic assessments (CDAs) and evaluate the performances of ANNs using simulation results. CDAs are designed to measure student performance on problem-solving tasks and provide useful…

  13. Statistical Quality Assessment of Pre-fried Carrots Using Multispectral Imaging

    DEFF Research Database (Denmark)

    Sharifzadeh, Sara; Clemmensen, Line Katrine Harder; Løje, Hanne

    2013-01-01

    Multispectral imaging is increasingly being used for quality assessment of food items due to its non-invasive benefits. In this paper, we investigate the use of multispectral images of pre-fried carrots, to detect changes over a period of 14 days. The idea is to distinguish changes in quality fro...

  14. Computation of Steady State Nodal Voltages for Fast Security Assessment in Power Systems

    DEFF Research Database (Denmark)

    Møller, Jakob Glarbo; Jóhannsson, Hjörtur; Østergaard, Jacob

    2014-01-01

    Development of a method for real-time assessment of post-contingency nodal voltages is introduced. Linear network theory is applied in an algorithm that utilizes a Thevenin equivalent representation of the power system as seen from every voltage-controlled node in the network. The method is evaluated b...
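
    The core computation can be sketched in a few lines: the Thevenin impedance seen from bus k is the k-th diagonal element of the inverse bus admittance matrix. The 3-bus network and the source impedance grounding it are invented for the example:

        import numpy as np

        y = 1 / 0.1j                                    # line admittance (p.u., assumed)
        Ybus = np.array([[2*y, -y,  -y],
                         [-y,  2*y, -y],
                         [-y,  -y,  2*y]], dtype=complex)
        Ybus[0, 0] += 1 / 0.05j                         # source behind impedance at bus 0

        Zbus = np.linalg.inv(Ybus)
        k = 2
        z_th = Zbus[k, k]                               # Thevenin impedance seen from bus k
        print(f"Thevenin impedance at bus {k}: {z_th:.4f} p.u.")

    Precomputing such equivalents for every voltage-controlled node is what makes fast post-contingency voltage screening possible without a full power flow per contingency.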

  15. Meanings and Practices of Power in Academics' Conceptions of Student Self-Assessment

    Science.gov (United States)

    Tan, Kelvin H. K.

    2009-01-01

    Recent publications and research have warned that student self-assessment practices in higher education cannot be presumed to empower students in ways that enhances their learning. This is partly due to a tendency to speak of power in student self-assessment in general and undefined terms. Hence, there is a need to identify the types of power…

  16. Statistical assessment of soil surface roughness for environmental applications using photogrammetric imaging techniques

    Science.gov (United States)

    Marzahn, Philip; Rieke-Zapp, Dirk; Ludwig, Ralf

    2010-05-01

    Micro scale soil surface roughness is a crucial parameter in many environmental applications. Recent soil erosion studies have shown the impact of micro topography on soil erosion rates as well as on overland flow generation due to soil crusting effects. Besides the above mentioned, it is widely recognized that the backscattered signal in SAR remote sensing is strongly influenced by soil surface roughness and by regular higher order tillage patterns. However, there is an ambiguity in the appropriate measurement technique and scale for roughness studies and SAR backscatter model parametrization. While different roughness indices depend on their measurement length, no satisfying roughness parametrization and measurement technique has been found yet, introducing large uncertainty in the interpretation of the radar backscatter. In the presented study, we computed high resolution digital elevation models (DEMs) using a consumer grade digital camera in the frame of photogrammetric imaging techniques to represent soil micro topography of different soil surfaces (ploughed, harrowed, seedbed and crusted). The retrieved DEMs showed sufficient accuracy, with an RMSE of 1.64 mm compared to highly accurate reference points. For roughness characterization, we calculated different roughness indices (RMS height (s), autocorrelation length (l), tortuosity index (TB)). In an extensive statistical investigation we show the behaviour of the roughness indices for different acquisition sizes. Compared to results from profile measurements taken from the literature and profiles generated out of the dataset, the results indicate that, by using a three dimensional measuring device, the calculated roughness indices are more robust against outliers and even saturate faster with increasing acquisition size. Dependent on the roughness condition, the calculated values for the RMS height saturate for ploughed fields at 2.3 m, for harrowed fields at 2.0 m and for crusted fields at 1.2 m. Results also

  17. Water treatment plants assessment at Talkha power plant.

    Science.gov (United States)

    El-Sebaie, Olfat D; Abd El-Kerim, Ghazy E; Ramadan, Mohamed H; Abd El-Atey, Magda M; Taha, Sahr Ahmed

    2002-01-01

    Talkha power plant is the only power plant located in El-Mansoura. It generates electricity using two different methods, by steam turbine and gas turbine. Both plants draw water from the River Nile (208 m3/h). The Nile raw water passes through different treatment processes to make it suitable for drinking and operational uses. At Talkha power plant, there are two purification plants, one for drinking water supply (100 m3/h) and one for water demineralization supply (108 m3/h). This study aimed at evaluating the efficiency of the water purification plants. For the drinking water purification plant, the River Nile water was characterized by a slightly alkaline pH (7.4-8) and high annual mean values of turbidity (10.06 NTU), Standard Plate Count (SPC) (313.3 CFU/1 ml), total coliform (2717/100 ml), fecal coliform (0-2400/100 ml), and total algae (3 x 10(4) org/l). The dominant group of algae over the whole study period was green algae; blue-green algae were abundant in the summer and autumn seasons. The pH range and the annual mean values of turbidity, TDS, total hardness, sulfates, chlorides, nitrates, nitrites, fluoride, and residual chlorine for purified water were in compliance with Egyptian drinking water standards. All the recorded SPC values, with an annual mean value of 10.13 CFU/1 ml, indicated that the chlorine dose and contact time were not enough to kill the bacteria. However, they were in compliance with the Egyptian decree (should not exceed 50 CFU/1 ml). Although the removal efficiency of the plant for total coliform and blue-green algae was high (98.5% and 99.2%, respectively), the obtained results, with annual mean values of 40/100 ml and 15.6 org/l, were not in compliance with the Egyptian decree (water should be free from total coliform, fecal coliform and blue-green algae). For the water demineralization treatment plant, the raw water was characterized by a slightly alkaline pH. The annual mean values of conductivity, turbidity, and TDS were 354.6 microS/cm, 10.84 NTU, and 214

  18. An innovative statistical approach for analysing non-continuous variables in environmental monitoring: assessing temporal trends of TBT pollution.

    Science.gov (United States)

    Santos, José António; Galante-Oliveira, Susana; Barroso, Carlos

    2011-03-01

    The current work presents an innovative statistical approach to model ordinal variables in environmental monitoring studies. An ordinal variable has values that can only be compared as "less", "equal" or "greater", and it is not possible to have information about the size of the difference between two particular values. The ordinal variable under study here is the vas deferens sequence (VDS) used in imposex (superimposition of male sexual characters onto prosobranch females) field assessment programmes for monitoring tributyltin (TBT) pollution. The statistical methodology presented here is the ordered logit regression model. It assumes that the VDS is an ordinal variable whose values correspond to a process of imposex development that can be considered continuous in both the biological and the statistical sense and can be described by a latent, non-observable continuous variable. This model was applied to the case study of Nucella lapillus imposex monitoring surveys conducted on the Portuguese coast between 2003 and 2008 to evaluate the temporal evolution of TBT pollution in this country. In order to produce more reliable conclusions, the proposed model includes covariates that may influence the imposex response besides TBT (e.g. the shell size). The model also provides an analysis of the environmental risk associated with TBT pollution by estimating the probability of the occurrence of females with VDS ≥ 2 in each year, according to OSPAR criteria. We consider that the proposed application of this statistical methodology has great potential in environmental monitoring whenever there is a need to model variables that can only be assessed through an ordinal scale of values.
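
    As an illustration of the model class named above, the following is a minimal from-scratch proportional-odds (ordered logit) fit on synthetic data; the covariates tbt and shell and all coefficient values are hypothetical stand-ins, not the authors' data or code.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

def neg_log_lik(params, X, y):
    """Negative log-likelihood of a proportional-odds (ordered logit) model.
    Cutpoints are parametrized as a first cutpoint plus positive increments
    so they remain ordered during optimization."""
    n_feat = X.shape[1]
    beta, raw = params[:n_feat], params[n_feat:]
    cuts = np.concatenate(([raw[0]], raw[0] + np.cumsum(np.exp(raw[1:]))))
    eta = X @ beta
    cum = expit(cuts[None, :] - eta[:, None])                # P(Y <= j | x)
    cum = np.hstack([np.zeros((len(y), 1)), cum, np.ones((len(y), 1))])
    p = cum[np.arange(len(y)), y + 1] - cum[np.arange(len(y)), y]
    return -np.sum(np.log(np.clip(p, 1e-12, None)))

# Synthetic stand-in for survey data: ordinal VDS stage 0-4 driven by a
# latent continuous variable, with shell size as an extra covariate.
rng = np.random.default_rng(0)
n = 500
tbt = rng.gamma(2.0, 1.0, n)                  # hypothetical exposure proxy
shell = rng.normal(25.0, 5.0, n)              # hypothetical shell size (mm)
latent = 0.8 * tbt + 0.05 * shell + rng.logistic(size=n)
vds = np.digitize(latent, [1.5, 2.5, 3.5, 4.5])

X = np.column_stack([tbt, shell])
x0 = np.zeros(X.shape[1] + 4)                 # 2 slopes + 4 cutpoint params
fit = minimize(neg_log_lik, x0, args=(X, vds), method="BFGS")
print("estimated effects (tbt, shell):", np.round(fit.x[:2], 2))
```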

  19. Multivariate Statistical Analysis: a tool for groundwater quality assessment in the hidrogeologic region of the Ring of Cenotes, Yucatan, Mexico.

    Science.gov (United States)

    Ye, M.; Pacheco Castro, R. B.; Pacheco Avila, J.; Cabrera Sansores, A.

    2014-12-01

    The karstic aquifer of Yucatan is a vulnerable and complex system. The first fifteen meters of this aquifer have been polluted; protecting this resource is therefore important, because it is the only source of potable water for the entire State. Through the assessment of groundwater quality we can gain knowledge about the main processes governing water chemistry as well as spatial patterns that are important for establishing protection zones. In this work multivariate statistical techniques are used to assess the groundwater quality of the supply wells (30 to 40 meters deep) in the hydrogeologic region of the Ring of Cenotes, located in Yucatan, Mexico. Cluster analysis and principal component analysis are applied to groundwater chemistry data of the study area. Results of the principal component analysis show that the main sources of variation in the data are seawater intrusion, the interaction of the water with the carbonate rocks of the system, and some pollution processes. The cluster analysis shows that the data can be divided into four clusters. The spatial distribution of the clusters seems random, but is consistent with seawater intrusion and pollution with nitrates. The overall results show that multivariate statistical analysis can be successfully applied in the groundwater quality assessment of this karstic aquifer.
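
    A minimal sketch of the workflow the abstract describes (standardize, extract principal components, cluster into four groups), run here on synthetic ion concentrations with scikit-learn; the ion list and all values are hypothetical.

```python
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import AgglomerativeClustering

# Hypothetical hydrochemical table: one row per supply well.
rng = np.random.default_rng(42)
ions = ["Ca", "Mg", "Na", "K", "Cl", "SO4", "HCO3", "NO3"]
data = pd.DataFrame(rng.lognormal(mean=3.0, sigma=0.5, size=(30, len(ions))),
                    columns=ions)

# Standardize so each ion contributes equally, then extract components.
z = StandardScaler().fit_transform(data)
pca = PCA(n_components=3)
scores = pca.fit_transform(z)
print("variance explained:", pca.explained_variance_ratio_.round(2))

# Hierarchical clustering on the standardized data (4 clusters, matching
# the number of groups reported in the abstract).
labels = AgglomerativeClustering(n_clusters=4).fit_predict(z)
print("cluster sizes:", np.bincount(labels))
```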

  20. Gasification/combined-cycle power generation: environmental assessment of alternative systems

    Energy Technology Data Exchange (ETDEWEB)

    1978-11-01

    This report provides a basis for the comparative assessment of the potential performance capability, technological development, and economic and environmental impact associated with the operation of integrated low-Btu coal-gasification/combined-cycle power systems. Characterization of the integrated power system in terms of fuel processing, power production, and auxiliary systems is followed up with comparisons of alternative integrated-plant-design/fuel combinations with reference to the conventional coal-fired power plant, taking into account both economic and environmental factors. The report includes an assessment of the effects of recent regulatory changes on the prospects for integrated power systems and establishes a timetable for the probable commercial development of such systems by the utilities.

  1. Assessment of the cogeneration biogas plant possibilities in the autonomous power supply system

    Directory of Open Access Journals (Sweden)

    Sumarokova Liudmila

    2017-01-01

    The use of biomass and wood waste for heat and power production is increasing from year to year. Waste wood has a low carbon footprint and a low sulfur content, and is a renewable energy source. The paper demonstrates the possibility of increasing the energy efficiency of the power supply system of the Stepanovka settlement (Tomsk region) by replacing the diesel power plant (DPP) with a biofuel gas-piston CHP plant. The assessment was based on a technical and economic comparison of the power supply options for the settlement.

  2. Extraction and use of historical extreme climate databases for nuclear power plants safety assessment

    Science.gov (United States)

    Hamdi, Yasser; Bertin, Xavier; Bardet, Lise; Duluc, Claire-Marie; Rebour, Vincent

    2015-04-01

    Safety assessments of nuclear power plants (NPPs) related to natural hazards are a matter of major interest to the nuclear community in France and many European countries. Over the past few decades, France has experienced many such events: heat waves (2003 and 2006), heavy snowstorms (1958, 1990 and 1992), storms giving rise to heavy rain and severe floods (1992, 1999, 2010), and strong straight-line winds and extreme marine surges (1987, 1999 and 2010) much larger than the other local observations (outliers). These outliers clearly illustrated the potential of current statistical methods to underestimate extreme surges. The estimation of extreme surges therefore requires a statistical analysis approach with a more solid theoretical framework, using more reliable databases for the assessment of hazards, so that NPPs can be designed to low or extremely low probabilities of failure. Such databases can be produced by collecting historical information (HI) about severe climatic events that occurred over short and long timescales. Natural hazards such as heat waves, droughts, floods, severe storms and snowstorms have affected France and many European countries since the dawn of time. These catastrophic events have been unforgettably engraved in people's minds, many of them have been traced in archives and history textbooks, and the oldest events have certainly left clues somewhere in the geological layers of the earth or elsewhere. The construction of historical databases, and the development of probabilistic approaches capable of integrating them correctly, is highly challenging for the scientific community (translating these geological clues into historical data to build historical databases that can be used by the statistical models is a different

  3. Spatial scan statistics to assess sampling strategy of antimicrobial resistance monitoring programme

    DEFF Research Database (Denmark)

    Vieira, Antonio; Houe, Hans; Wegener, Henrik Caspar

    2009-01-01

    The collection and analysis of data on antimicrobial resistance in human and animal populations are important for establishing a baseline of the occurrence of resistance and for determining trends over time. In animals, targeted monitoring with a stratified sampling plan is normally used. However, to our knowledge it has not previously been analyzed whether animals have a random chance of being sampled by these programs, regardless of their spatial distribution. In this study, we used spatial scan statistics, based on a Poisson model, as a tool to evaluate the geographical distribution of the collected and susceptibility-tested pig samples in Denmark between 2002 and 2006. For the yearly analysis, both high and low sampling rate areas were significant, with two clusters in 2002 (relative risk [RR]: 2.91, p ...

  4. Statistic-mathematical interpretation of some assessment parameters of the grassland ecosystem according to soil characteristics

    Science.gov (United States)

    Samfira, Ionel; Boldea, Marius; Popescu, Cosmin

    2012-09-01

    Significant parameters of permanent grasslands are represented by the pastoral value and the Shannon and Simpson biodiversity indices. The dynamics of these parameters have been studied in several plant associations in the Banat Plain, Romania. From the point of view of their typology, these permanent grasslands belong to the steppe area, series Festuca pseudovina, type Festuca pseudovina-Achillea millefolium, subtype Lolium perenne. The methods used for the purpose of this research included plant cover analysis (double meter method, calculation of the Shannon and Simpson indices) and statistical methods of regression and correlation. The results show that, in the permanent grasslands of the plain region, when the pastoral value is average to low, the level of interspecific biodiversity increases.
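
    For reference, the two diversity indices named above can be computed directly from species abundance data; the counts below are hypothetical, and one common convention (1 minus the sum of squared proportions) is used for Simpson diversity.

```python
import numpy as np

def shannon_index(abundances):
    """Shannon diversity H' = -sum(p_i * ln p_i) over observed species."""
    p = np.asarray(abundances, dtype=float)
    p = p[p > 0] / p.sum()
    return -np.sum(p * np.log(p))

def simpson_index(abundances):
    """Simpson diversity 1 - sum(p_i^2); higher means more diverse."""
    p = np.asarray(abundances, dtype=float)
    p = p / p.sum()
    return 1.0 - np.sum(p ** 2)

# Hypothetical species cover counts from a double-meter transect.
counts = [40, 25, 15, 10, 5, 5]
print(f"Shannon H' = {shannon_index(counts):.3f}")
print(f"Simpson D  = {simpson_index(counts):.3f}")
```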

  5. The Good, the Bad and the Ugly: Statistical quality assessment of SZ detections

    CERN Document Server

    Aghanim, N; Diego, J -M; Douspis, M; Macias-Perez, J; Pointecouteau, E; Comis, B; Arnaud, M; Montier, L

    2014-01-01

    We examine three approaches to the problem of source classification in catalogues. Our goal is to determine the confidence with which the elements in these catalogues can be separated into populations on the basis of their spectral energy distribution (SED). Our analysis is based on the projection of the measurements onto a comprehensive SED model of the main signals in the considered range of frequencies. We first consider likelihood analysis, which is halfway between supervised and unsupervised methods. Next, we investigate an unsupervised clustering technique. Finally, we consider a supervised classifier based on Artificial Neural Networks. We illustrate the approach and results using catalogues from various surveys, i.e., X-ray (MCXC), optical (SDSS) and millimetric (Planck Sunyaev-Zeldovich (SZ)). We show that the results from the statistical classifications of the three methods are in very good agreement with each other, although the supervised neural network-based classification shows better pe...

  6. A statistical method for assessing peptide identification confidence in accurate mass and time tag proteomics.

    Science.gov (United States)

    Stanley, Jeffrey R; Adkins, Joshua N; Slysz, Gordon W; Monroe, Matthew E; Purvine, Samuel O; Karpievitch, Yuliya V; Anderson, Gordon A; Smith, Richard D; Dabney, Alan R

    2011-08-15

    Current algorithms for quantifying peptide identification confidence in the accurate mass and time (AMT) tag approach assume that the AMT tags themselves have been correctly identified. However, there is uncertainty in the identification of AMT tags, because this is based on matching LC-MS/MS fragmentation spectra to peptide sequences. In this paper, we incorporate confidence measures for the AMT tag identifications into the calculation of probabilities for correct matches to an AMT tag database, resulting in a more accurate overall measure of identification confidence for the AMT tag approach. The method is referred to as Statistical Tools for AMT Tag Confidence (STAC). STAC additionally provides a uniqueness probability (UP) to help distinguish between multiple matches to an AMT tag and a method to calculate an overall false discovery rate (FDR). STAC is freely available for download, as both a command line and a Windows graphical application.

  7. Multivariate statistical assessment of heavy metal pollution sources of groundwater around a lead and zinc plant

    Directory of Open Access Journals (Sweden)

    Zamani Abbas Ali

    2012-12-01

    The contamination of groundwater by heavy metal ions around a lead and zinc plant has been studied. As a case study, groundwater contamination in the Bonab Industrial Estate (Zanjan, Iran) was investigated for iron, cobalt, nickel, copper, zinc, cadmium and lead content using differential pulse polarography (DPP). Although cobalt, copper and zinc were found in 47.8%, 100.0%, and 100.0% of the samples, respectively, none of the samples contained these metals above their maximum contaminant levels (MCLs). Cadmium was detected in 65.2% of the samples and 17.4% of them were polluted by this metal. All samples contained detectable levels of lead and iron, with 8.7% and 13.0% of the samples higher than their MCLs. Nickel was also found in 78.3% of the samples, out of which 8.7% were polluted. In general, the results revealed the contamination of groundwater sources in the studied zone. The higher health risks are related to lead, nickel, and cadmium ions. Multivariate statistical techniques were applied to interpret the experimental data and describe the sources. The data analysis showed correlations and similarities between the investigated heavy metals and helped to classify these ion groups. Cluster analysis identified five clusters among the studied heavy metals. Cluster 1 consisted of Pb and Cu, cluster 3 included Cd and Fe, and each of the elements Zn, Co and Ni formed a single-member group. The same results were obtained by factor analysis. Statistical investigations revealed that anthropogenic factors, notably the lead and zinc plant, and pedo-geochemical pollution sources are influencing water quality in the studied area.

  8. Multivariate statistical assessment of heavy metal pollution sources of groundwater around a lead and zinc plant.

    Science.gov (United States)

    Zamani, Abbas Ali; Yaftian, Mohammad Reza; Parizanganeh, Abdolhossein

    2012-12-17

    The contamination of groundwater by heavy metal ions around a lead and zinc plant has been studied. As a case study, groundwater contamination in the Bonab Industrial Estate (Zanjan, Iran) was investigated for iron, cobalt, nickel, copper, zinc, cadmium and lead content using differential pulse polarography (DPP). Although cobalt, copper and zinc were found in 47.8%, 100.0%, and 100.0% of the samples, respectively, none of the samples contained these metals above their maximum contaminant levels (MCLs). Cadmium was detected in 65.2% of the samples and 17.4% of them were polluted by this metal. All samples contained detectable levels of lead and iron, with 8.7% and 13.0% of the samples higher than their MCLs. Nickel was also found in 78.3% of the samples, out of which 8.7% were polluted. In general, the results revealed the contamination of groundwater sources in the studied zone. The higher health risks are related to lead, nickel, and cadmium ions. Multivariate statistical techniques were applied to interpret the experimental data and describe the sources. The data analysis showed correlations and similarities between the investigated heavy metals and helped to classify these ion groups. Cluster analysis identified five clusters among the studied heavy metals. Cluster 1 consisted of Pb and Cu, cluster 3 included Cd and Fe, and each of the elements Zn, Co and Ni formed a single-member group. The same results were obtained by factor analysis. Statistical investigations revealed that anthropogenic factors, notably the lead and zinc plant, and pedo-geochemical pollution sources are influencing water quality in the studied area.

  9. Assessing artificial neural networks and statistical methods for infilling missing soil moisture records

    Science.gov (United States)

    Dumedah, Gift; Walker, Jeffrey P.; Chik, Li

    2014-07-01

    Soil moisture information is critically important for water management operations including flood forecasting, drought monitoring, and groundwater recharge estimation. While an accurate and continuous record of soil moisture is required for these applications, the available soil moisture data, in practice, is typically fraught with missing values. There is a wide range of methods available for infilling hydrologic variables, but a thorough inter-comparison between statistical methods and artificial neural networks has not been made. This study examines 5 statistical methods: monthly averages, the weighted Pearson correlation coefficient, a method based on the temporal stability of soil moisture, a weighted merging of these three methods, and a method based on the concept of rough sets. Additionally, 9 artificial neural networks are examined, broadly categorized into feedforward, dynamic, and radial basis networks. These 14 infilling methods were used to estimate missing soil moisture records and subsequently validated against known values for 13 soil moisture monitoring stations at three different soil layer depths in the Yanco region in southeast Australia. The evaluation results show that the top three highest performing methods are the nonlinear autoregressive neural network, the rough sets method, and monthly replacement. A high estimation accuracy (root mean square error (RMSE) of about 0.03 m/m) was found for the nonlinear autoregressive network, due to its regression-based dynamic network, which allows feedback connections through discrete-time estimation. An equally high accuracy (0.05 m/m RMSE) in the rough sets procedure illustrates the important role of temporal persistence of soil moisture, with the capability to account for different soil moisture conditions.
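
    A sketch of the simplest method compared above, monthly replacement, together with an RMSE check against withheld values; the daily series, gap fraction and all numbers are synthetic assumptions.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
# Hypothetical daily volumetric soil moisture with a seasonal cycle.
dates = pd.date_range("2010-01-01", "2012-12-31", freq="D")
truth = (0.25 + 0.08 * np.sin(2 * np.pi * dates.dayofyear / 365.25)
         + rng.normal(0, 0.02, len(dates)))
series = pd.Series(truth, index=dates)

# Knock out 10% of the record to mimic sensor gaps.
missing = rng.choice(len(series), size=len(series) // 10, replace=False)
observed = series.copy()
observed.iloc[missing] = np.nan

# Monthly replacement: fill each gap with the mean of all observed values
# from the same calendar month.
monthly_means = observed.groupby(observed.index.month).mean()
month_of = pd.Series(observed.index.month, index=observed.index)
filled = observed.fillna(month_of.map(monthly_means))

# Validate against the withheld true values, as in the study's RMSE check.
mask = observed.isna()
rmse = np.sqrt(np.mean((filled[mask] - series[mask]) ** 2))
print(f"RMSE on infilled values: {rmse:.4f}")
```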

  10. Present-day and future mediterranean precipitation extremes assessed by different statistical approaches

    Science.gov (United States)

    Paxian, A.; Hertig, E.; Seubert, S.; Vogt, G.; Jacobeit, J.; Paeth, H.

    2015-02-01

    The Mediterranean area is strongly vulnerable to future changes in temperature and precipitation, particularly concerning extreme events, and has been identified as a climate change hot spot. This study performs a comprehensive investigation of present-day and future Mediterranean precipitation extremes based on station data, gridded observations and simulations of the regional climate model (REMO) driven by the coupled global general circulation model ECHAM5/MPI-OM. Extreme value estimates from different statistical methods (quantile-based indices, generalized Pareto distribution (GPD) based return values and data from a weather generator) are compared and evaluated. Dynamical downscaling reveals improved small-scale topographic structures and more realistic, higher rainfall totals and extremes over mountain ranges and in summer. REMO tends to overestimate the gridded observational data in winter but is closer to local station information. The dynamical-statistical weather generator provides virtual station rainfall from gridded REMO data that overcomes typical discrepancies between area-averaged model rainfall and local station information, e.g. overestimated numbers of rainy days and underestimated extreme intensities. Concerning future rainfall amounts, strong summer and winter drying over the northern and southern Mediterranean, respectively, is confronted with winter wetting over the northern part. In contrast, precipitation extremes tend to increase in even more Mediterranean areas, implying regions with decreasing totals but intensifying extremes, e.g. southern Europe and Turkey in winter and the Balkans in summer. The GPD based return values reveal slightly larger regions of increasing rainfall extremes than the quantile-based indices, and the virtual stations from the weather generator show even stronger increases.
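
    As a sketch of the GPD-based return values mentioned above: a peaks-over-threshold fit on synthetic daily rainfall using scipy, with the threshold choice and record length as assumptions.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(7)
# Hypothetical daily rainfall record (mm), 50 years of wet-season days.
rain = rng.gamma(shape=0.4, scale=8.0, size=50 * 180)

# Peaks-over-threshold: keep exceedances above a high threshold.
u = np.quantile(rain, 0.95)
excess = rain[rain > u] - u
xi, loc, sigma = genpareto.fit(excess, floc=0.0)

# T-year return level: u + (sigma/xi) * (m^xi - 1), where m is the
# expected number of exceedances in T years.
lam = excess.size / rain.size       # exceedance probability per observation
n_per_year = 180
for T in (10, 50, 100):
    m = lam * n_per_year * T
    level = u + sigma / xi * (m ** xi - 1.0)
    print(f"{T:>3}-year return level: {level:6.1f} mm")
```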

  11. A network-based method to assess the statistical significance of mild co-regulation effects.

    Directory of Open Access Journals (Sweden)

    Emőke-Ágnes Horvát

    Recent development of high-throughput, multiplexing technology has initiated projects that systematically investigate interactions between two types of components in biological networks, for instance transcription factors and promoter sequences, or microRNAs (miRNAs) and mRNAs. In terms of network biology, such screening approaches primarily attempt to elucidate relations between biological components of two distinct types, which can be represented as edges between nodes in a bipartite graph. However, it is often desirable not only to determine regulatory relationships between nodes of different types, but also to understand the connection patterns of nodes of the same type. Especially interesting is the co-occurrence of two nodes of the same type, i.e., the number of their common neighbours, which current high-throughput screening analysis fails to address. The co-occurrence gives the number of circumstances under which both of the biological components are influenced in the same way. Here we present SICORE, a novel network-based method to detect pairs of nodes with a statistically significant co-occurrence. We first show the stability of the proposed method on artificial data sets: when randomly adding and deleting observations we obtain reliable results even with noise exceeding the expected level in large-scale experiments. Subsequently, we illustrate the viability of the method based on the analysis of a proteomic screening data set to reveal regulatory patterns of human microRNAs targeting proteins in the EGFR-driven cell cycle signalling system. Since statistically significant co-occurrence may indicate functional synergy and the mechanisms underlying canalization, and thus hold promise in drug target identification and therapeutic development, we provide a platform-independent implementation of SICORE with a graphical user interface as a novel tool in the arsenal of high-throughput screening analysis.
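
    SICORE itself is not reproduced here; as a sketch of the underlying question, the snippet below computes a permutation p-value for the common-neighbour count of two same-type nodes in a synthetic bipartite adjacency matrix.

```python
import numpy as np

rng = np.random.default_rng(9)

# Hypothetical bipartite adjacency: rows = miRNAs, columns = target proteins.
A = (rng.random((20, 200)) < 0.1).astype(int)

def cooccurrence_pvalue(A, i, j, n_perm=2000):
    """Permutation p-value for the common-neighbour count of rows i and j,
    holding row degrees fixed by redrawing each row's neighbour set."""
    observed = int(A[i] @ A[j])
    n = A.shape[1]
    ki, kj = A[i].sum(), A[j].sum()
    count = 0
    for _ in range(n_perm):
        si = rng.choice(n, ki, replace=False)
        sj = rng.choice(n, kj, replace=False)
        count += len(np.intersect1d(si, sj, assume_unique=True)) >= observed
    return observed, (count + 1) / (n_perm + 1)

obs, p = cooccurrence_pvalue(A, 0, 1)
print(f"common neighbours = {obs}, permutation p = {p:.3f}")
```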

  12. Demonstration of Recessed Downlight Technologies: Power and Illumination Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Parker, Steven A.; Beeson, Tracy A.

    2009-11-20

    Solid state lighting (SSL), specifically light-emitting diodes (LEDs), has been advancing at a rapid pace, and there are presently multiple products available that serve as direct replacements for traditional luminaires. In this demonstration, conventional recessed lights in a conference room were used to compare conventional incandescent A-lamps, incandescent reflector R-lamps, and dimming compact fluorescent lamps (CFLs) with an LED replacement product. The primary focus of the study was the light delivered to the task plane relative to the power required by the lighting system. Vertical illuminance, dimming range, and color shift are also important indicators of lighting quality and are discussed in the report. The results clearly showed that LEDs, with dimming-capable drivers, are much more efficient than incandescents and CFLs. Further, LEDs provide much smoother and more consistent dimming than dimmable CFLs. On the potential negative side, it is important that the dimming switch be identified as compatible with the LED driver. A wide variety of dimmer switches are capable of dimming LEDs down to 15% of full light output, while select others can dim LEDs down to 5%. In addition, LEDs can be intense light sources, which can result in uncomfortable glare in some applications and for some occupants. Higher ceilings (9-foot or greater) or non-specular reflectors can act to alleviate the potential for glare.

  13. Independent Orbiter Assessment (IOA): Analysis of the electrical power distribution and control/electrical power generation subsystem

    Science.gov (United States)

    Patton, Jeff A.

    1986-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA approach features a top-down analysis of the hardware to determine failure modes, criticality, and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. This report documents the independent analysis results corresponding to the Orbiter Electrical Power Distribution and Control (EPD and C)/Electrical Power Generation (EPG) hardware. The EPD and C/EPG hardware is required for performing critical functions of cryogenic reactant storage, electrical power generation and product water distribution in the Orbiter. Specifically, the EPD and C/EPG hardware consists of the following components: Power Section Assembly (PSA); Reactant Control Subsystem (RCS); Thermal Control Subsystem (TCS); Water Removal Subsystem (WRS); and Power Reactant Storage and Distribution System (PRSDS). The IOA analysis process utilized available EPD and C/EPG hardware drawings and schematics for defining hardware assemblies, components, and hardware items. Each level of hardware was evaluated and analyzed for possible failure modes and effects. Criticality was assigned based upon the severity of the effect for each failure mode.

  14. Application of Devices and Systems Designed for Power Quality Monitoring and Assessment

    Directory of Open Access Journals (Sweden)

    Wiesław Gil

    2014-03-01

    The paper presents the problems associated with the increasing demands on equipment and systems for power quality (PQ) assessment installed at power substations. Difficulties due to the current lack of standards defining test methodologies for measuring devices are signaled. The necessary device properties and the structure of a large real-time system designed to assess PQ are discussed. The usefulness of multi-channel analyzers featuring the identification and registration of transients is pointed out. The desirability of implementing synchrophasor assessment and of integrating devices with other SAS devices via the PN-EN 61850 standard is also justified.

  15. The influence of control group reproduction on the statistical power of the Environmental Protection Agency's Medaka Extended One Generation Reproduction Test (MEOGRT).

    Science.gov (United States)

    Flynn, Kevin; Swintek, Joe; Johnson, Rodney

    2017-02-01

    Because of various Congressional mandates to protect the environment from endocrine disrupting chemicals (EDCs), the United States Environmental Protection Agency (USEPA) initiated the Endocrine Disruptor Screening Program. In the context of this framework, the Office of Research and Development within the USEPA developed the Medaka Extended One Generation Reproduction Test (MEOGRT) to characterize the endocrine action of a suspected EDC. One important endpoint of the MEOGRT is the fecundity of medaka breeding pairs. Power analyses were conducted to determine the number of replicates needed in proposed test designs and to determine the effects that varying reproductive parameters (e.g. mean fecundity, variance, and days with no egg production) would have on the statistical power of the test. The MEOGRT Reproduction Power Analysis Tool (MRPAT) is a software tool developed to expedite these power analyses by both calculating estimates of the needed reproductive parameters (e.g. population mean and variance) and performing the power analysis under user-specified scenarios. Example scenarios are detailed that highlight the importance of the reproductive parameters for statistical power. When control fecundity is increased from 21 to 38 eggs per pair per day and the variance decreased from 49 to 20, the gain in power is equivalent to increasing replication by 2.5 times. On the other hand, if 10% of the breeding pairs, including controls, do not spawn, the power to detect a 40% decrease in fecundity drops to 0.54 from nearly 0.98 when all pairs have some level of egg production. Perhaps most importantly, MRPAT was used to inform the decision-making process that led to the final recommendation for the MEOGRT to have 24 control breeding pairs and 12 breeding pairs in each exposure group.
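
    MRPAT is not reproduced here; the following is a simplified Monte Carlo power calculation in the same spirit, using a normal approximation to pair-level fecundity and Welch's t-test. The parameter values echo the abstract; everything else is an assumption.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(3)

def power(mean_c=38.0, var_c=20.0, effect=0.40, p_nonspawn=0.0,
          n_control=24, n_treat=12, n_sim=2000, alpha=0.05):
    """Monte Carlo power for detecting a relative drop in mean fecundity.
    Pair-level fecundity uses a rough normal approximation; a fraction
    p_nonspawn of pairs (in all groups) produces no eggs at all."""
    sd = np.sqrt(var_c)
    hits = 0
    for _ in range(n_sim):
        c = rng.normal(mean_c, sd, n_control)
        t = rng.normal(mean_c * (1 - effect), sd, n_treat)
        c[rng.random(n_control) < p_nonspawn] = 0.0
        t[rng.random(n_treat) < p_nonspawn] = 0.0
        if ttest_ind(c, t, equal_var=False).pvalue < alpha:
            hits += 1
    return hits / n_sim

print("all pairs spawning:   ", power())
print("10% non-spawning:     ", power(p_nonspawn=0.10))
print("low, variable control:", power(mean_c=21.0, var_c=49.0))
```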

  16. AN ASSESSMENT OF FLYWHEEL HIGH POWER ENERGY STORAGE TECHNOLOGY FOR HYBRID VEHICLES

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, James Gerald [ORNL

    2012-02-01

    An assessment has been conducted for the DOE Vehicle Technologies Program to determine the state of the art of advanced flywheel high power energy storage systems to meet hybrid vehicle needs for high power energy storage and energy/power management. Flywheel systems can be implemented with either an electrical or a mechanical powertrain. The assessment elaborates upon flywheel rotor design issues of stress, materials and aspect ratio. Twelve organizations that produce flywheel systems submitted specifications for flywheel energy storage systems to meet minimum energy and power requirements for both light-duty and heavy-duty hybrid applications of interest to DOE. The most extensive experience operating flywheel high power energy storage systems in heavy-duty and light-duty hybrid vehicles is in Europe. Recent advances in Europe in a number of vehicle racing venues and also in road car advanced evaluations are discussed. As a frame of reference, nominal weight and specific power for non-energy storage components of Toyota hybrid electric vehicles are summarized. The most effective utilization of flywheels is in providing high power while providing just enough energy storage to accomplish the power assist mission effectively. Flywheels are shown to meet or exceed the USABC power related goals (discharge power, regenerative power, specific power, power density, weight and volume) for HEV and EV batteries and ultracapacitors. The greatest technical challenge facing the developer of vehicular flywheel systems remains the issue of safety and containment. Flywheel safety issues must be addressed during the design and testing phases to ensure that production flywheel systems can be operated with adequately low risk.

  17. Surface water quality assessment by the use of combination of multivariate statistical classification and expert information.

    Science.gov (United States)

    Tobiszewski, M; Tsakovski, S; Simeonov, V; Namieśnik, J

    2010-08-01

    The present study deals with the assessment of surface water quality in an industrial-urban region located in northern Poland near the city of Gdansk. Concentrations of thirteen chemicals, including total polycyclic aromatic hydrocarbons (PAHs), halogenated volatile organic compounds (HVOCs) and major ions, in the samples collected at five sampling points during six campaigns were used as variables throughout the study. The originality of the monitoring data treatment and interpretation lies in the combination of a traditional classification approach (self-organizing maps of Kohonen) with PAH diagnostic ratio expertise to achieve reliable pollution source identification. Thus, sampling points affected by pollution from traffic (petroleum combustion products), from crude oil processing (petroleum release related compounds), and from a phosphogypsum disposal site were properly discriminated. Additionally, it is shown that this original assessment approach can be useful in finding specific pollution source tracers.

  18. Local homogeneity combined with DCT statistics to blind noisy image quality assessment

    Science.gov (United States)

    Yang, Lingxian; Chen, Li; Chen, Heping

    2015-03-01

    In this paper a novel method for blind noisy image quality assessment is proposed. First, since the human visual system (HVS) is believed to be more sensitive to locally smooth areas in a noisy image, an adaptive local homogeneous block selection algorithm is proposed to construct a new homogeneous image, named homogeneity blocks (HB), based on computing characteristics of each pixel. Second, the discrete cosine transform (DCT) is applied to each HB and the high frequency components are used to evaluate the image noise level. Finally, a modified peak signal to noise ratio (MPSNR) image quality assessment approach is proposed, based on analyzing changes in the DCT kurtosis distributions and the noise level estimated above. Simulations show that the quality scores produced by the proposed algorithm are well correlated with human perception of quality and also have stable performance.
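
    Not the paper's MPSNR, but a sketch of the blockwise-DCT building block it relies on: the high-frequency DCT energy of 8x8 blocks rises with additive noise on otherwise smooth content. The synthetic image and noise level are assumptions.

```python
import numpy as np
from scipy.fft import dctn

def highfreq_energy(img, block=8):
    """Mean energy of high-frequency DCT coefficients over non-overlapping
    8x8 blocks; rises with additive noise on smooth content."""
    u, v = np.indices((block, block))
    hf_mask = (u + v) >= block          # keep roughly the upper-frequency half
    energies = []
    h, w = img.shape
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            coeffs = dctn(img[i:i + block, j:j + block], norm="ortho")
            energies.append(np.mean(coeffs[hf_mask] ** 2))
    return float(np.mean(energies))

rng = np.random.default_rng(0)
clean = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))   # smooth synthetic image
noisy = clean + rng.normal(0.0, 0.05, clean.shape)
print("clean:", round(highfreq_energy(clean), 6))
print("noisy:", round(highfreq_energy(noisy), 6))
```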

  19. Risk assessment of the extreme interplanetary shock of 23 July 2012 on low-latitude power networks

    Science.gov (United States)

    Zhang, J. J.; Wang, C.; Sun, T. R.; Liu, Y. D.

    2016-03-01

    Geomagnetic sudden commencements (SCs), characterized by a rapid enhancement in the rate of change of the geomagnetic field perturbation (dB/dt), are considered to be an important source of large geomagnetically induced currents (GICs) in middle- and low-latitude power grids. In this study, the extreme interplanetary shock of 23 July 2012 is simulated under the assumption that it had hit the Earth, with the result indicating that the shock-caused SC would be 123 nT. Based on statistics, the occurrence frequency of SCs with amplitudes larger than the simulated one is estimated to be approximately 0.2% during the past 147 years on the Earth. During this extreme event, the simulation indicates that dB/dt, which is usually used as a proxy for GICs, at a dayside low-latitude substation would exceed 100 nT/min; this is very large for low-latitude regions. We then assess the GIC threat level based on the simulated geomagnetic perturbations by using the method proposed by Marshall et al. (2011). The results indicate that the risk remains at the "low" level for the low-latitude power network from a global perspective. However, the GIC risk may reach "moderate" or even "high" levels for some equatorial power networks due to the influence of the equatorial electrojet. The results of this study have substantial implications for risk management, planning, and design of low-latitude electric power networks.
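
    A sketch of the dB/dt proxy computation on a hypothetical 1-minute magnetometer record with a sudden-commencement-like jump superimposed; the 100 nT/min comparison value comes from the abstract, everything else is synthetic.

```python
import numpy as np

# Hypothetical 1-minute record of the horizontal geomagnetic field (nT).
rng = np.random.default_rng(5)
minutes = np.arange(0, 180)
B = 30.0 * np.sin(2 * np.pi * minutes / 90.0) + rng.normal(0, 2.0, minutes.size)
B[60:] += 123.0            # superimpose a sudden-commencement-like step

# dB/dt as a proxy for the GIC drive: per-minute first difference.
dBdt = np.diff(B)          # nT/min
print("max |dB/dt|:", np.abs(dBdt).max().round(1), "nT/min")
print("exceeds 100 nT/min:", bool((np.abs(dBdt) > 100).any()))
```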

  20. Narratives from parents in England and Norway: Power and emotions in child protection assessments

    Directory of Open Access Journals (Sweden)

    Vibeke Samsonsen

    2015-07-01

    The framework for assessment in child protection, as well as the context of the welfare state, differs between England and Norway. Assessments in England are structured in terms of a set model (the triangle) and procedures to be followed, whereas in Norway there are few national guidelines and no set model for assessments. This underpins professional judgement as the most important component in Norway. This is a study of parents' experiences of assessment in these two contexts; patterns and themes of assessment experiences have been identified in the two countries through a narrative analysis of in-depth interviews with parents. When asked about their opinions of the current assessment framework, parents in both countries talk more about feelings than about the framework and procedures, as their experiences of assessment are similar in both countries. First and foremost, they experience strong emotions in a stressful situation, including anxiety, frustration and powerlessness, but also relief. These cross-national emotions might provide information about how assessment is a stressful situation for the parents involved. However, we find some differences in the way social work is carried out according to the national assessment framework and policy context. In England, the framework and procedures seem to provide clarity with regard to process and power within the system. In Norway, the assessment is characterized by professional judgement accompanied by more resources, which we find enables helpful decisions from a family perspective. However, this heavy reliance on relationships using professional judgement might also be viewed as a source of informal power. These findings are discussed in relation to theories of emotions and the concept of power. Regarding implications for practice, we would recommend a more explicit awareness of help and control in assessment among the social workers involved, together with clear communication on the topic

  1. Statistical Assessment of the Impact of Elevated Contents of Cu and Ni on the Properties of Austempered Ductile Iron

    Directory of Open Access Journals (Sweden)

    Nawrocki P.

    2016-12-01

    The article presents a statistical analysis of data collected from observation of the production of austempered ductile iron. Assessing the impact of the chemical composition, i.e. elevated contents of Cu and Ni, on the properties of isothermally tempered ductile iron is critical for finding the right chemical composition of austempered ductile iron. Based on the analyses, ranges of the percentages of Cu and Ni to be used in the cast iron were selected to obtain a material with high strength properties.

  2. Exergy and Environmental Impact Assessment between Solar Powered Gas Turbine and Conventional Gas Turbine Power Plant

    Directory of Open Access Journals (Sweden)

    Ali Rajaei

    2016-01-01

    A recuperator is a heat exchanger that is used in gas turbine power plants to recover energy from the outlet hot gases to heat up the air entering the combustion chamber. Similarly, the combustion chamber inlet air can be heated to temperatures of up to 1000 °C by a solar power tower (SPT) as a renewable and environmentally benign energy source. In this study, a comprehensive comparison between these two systems in terms of energy, exergy, and environmental impacts is carried out. Thermodynamic simulation of both cycles is conducted using a program developed in the MATLAB environment. The exergetic performances of both cycles and their emissions are compared and a parametric study is carried out. A new parameter (the renewable factor) is proposed to evaluate resource quality and measure how green an exergy loss or destruction, or a system as a whole, is. Nonrenewable exergy destruction and loss are reduced compared to the GT with recuperator cycle by 34.89% and 47.41%, respectively. Reductions in CO2, NOx, and CO compared to the GT with recuperator cycle, by 49.92%, 66.14%, and 39.77%, respectively, are in line with a renewable factor value of around 55.7, which demonstrates the ability of the proposed green measure to evaluate and compare the cycles' performances.

  3. Using integrated multivariate statistics to assess the hydrochemistry of surface water quality, Lake Taihu basin, China

    Directory of Open Access Journals (Sweden)

    Xiangyu Mu

    2014-09-01

    Natural factors and anthropogenic activities both contribute dissolved chemical loads to lakes and streams. Mineral solubility, geomorphology of the drainage basin, source strengths and climate all contribute to concentrations and their variability. Urbanization and agricultural wastewater in particular lead to aquatic environmental degradation. Major contaminant sources and controls on water quality can be assessed by analyzing the variability in proportions of major and minor solutes in water, coupled with multivariate statistical methods. The demand for freshwater needed for increasing crop production, population growth and industrialization occurs almost everywhere in China, and these conflicting needs have led to widespread water contamination. Because of heavy nutrient loadings from all of these sources, Lake Taihu (eastern China) notably suffers periodic hyper-eutrophication and drinking water deterioration, which has led to shortages of freshwater for the City of Wuxi and other nearby cities. This lake, the third largest freshwater body in China, has historically been considered a cultural treasure of China, and has supported long-term fisheries. There is increasing pressure to remediate the present contamination, which compromises both aquaculture and the prior economic base centered on tourism. However, remediation cannot be effectively done without first characterizing the broad nature of the non-point source pollution. To this end, we investigated the hydrochemical setting of Lake Taihu to determine how different land use types influence the variability of surface water chemistry in different water sources to the lake. We found that waters broadly show wide variability, ranging from a calcium-magnesium-bicarbonate hydrochemical facies type to a mixed sodium-sulfate-chloride type. Principal components analysis produced three principal components that explained 78% of the variance in the water quality and reflect three major types of water

  4. Statistical downscaling of the French Mediterranean climate: assessment for present and projection in an anthropogenic scenario

    Directory of Open Access Journals (Sweden)

    C. Lavaysse

    2012-03-01

    The Mediterranean basin is a particularly vulnerable region to climate change, featuring a sharply contrasted climate between the North and South and governed by a semi-enclosed sea with pronounced surrounding topography covering parts of the European, African and Asian regions. The physiographic specificities contribute to producing mesoscale atmospheric features that can evolve into high-impact weather systems such as heavy precipitation, wind storms, heat waves and droughts. The evolution of these meteorological extremes in the context of global warming is still an open question, partly because of the large uncertainty associated with existing estimates produced by global climate models (GCMs) with coarse horizontal resolution (~200 km). Downscaling climatic information to a local scale is thus needed to improve climate extreme prediction and to provide relevant information for vulnerability and adaptation studies. In this study, we investigate wind, temperature and precipitation distributions for the recent past climate and future scenarios at eight meteorological stations in the French Mediterranean region using one statistical downscaling model, referred to as the "Cumulative Distribution Function transform" (CDF-t) approach. A thorough analysis of the uncertainty associated with statistical downscaling and bi-linear interpolation of large-scale wind speed, temperature and rainfall from reanalyses (ERA-40) and three GCM historical simulations has been conducted and quantified in terms of Kolmogorov-Smirnov scores. CDF-t produces more accurate and reliable local wind speeds, temperatures and rainfall. Generally, the wind speed, temperature and rainfall CDFs obtained with CDF-t are significantly similar to the observed CDFs, even though CDF-t performance may vary from one station to another due to the sensitivity of the driving large-scale fields or local effects. CDF-t has then been applied to climate simulations of the 21st century under the B1 and A2 scenarios
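
    CDF-t itself is not reproduced here; the sketch below shows plain empirical quantile mapping, a close relative, applied to synthetic wind-speed samples. The distribution shapes and parameters are assumptions.

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_future):
    """Empirical quantile mapping: build the transfer function that maps the
    model's historical distribution onto the observed one, then apply it to
    the future model run. A simplified relative of the CDF-t approach."""
    ranks = np.searchsorted(np.sort(model_hist), model_future) / len(model_hist)
    ranks = np.clip(ranks, 0.001, 0.999)
    return np.quantile(obs_hist, ranks)

rng = np.random.default_rng(2)
obs_hist = rng.gamma(2.0, 2.5, 5000)        # local station wind speed (m/s)
model_hist = rng.gamma(2.0, 2.0, 5000)      # biased large-scale model, same era
model_future = rng.gamma(2.0, 2.2, 5000)    # model under a future scenario

corrected = quantile_map(model_hist, obs_hist, model_future)
print("future mean raw/corrected:",
      model_future.mean().round(2), corrected.mean().round(2))
```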

  5. Assessment of the Economic Potential of Microgrids for Reactive Power Supply

    Energy Technology Data Exchange (ETDEWEB)

    Appen, Jan von; Marnay, Chris; Stadler, Michael; Momber, Ilan; Klapp, David; Scheven, Alexander von

    2011-05-01

    As power generation from variable distributed energy resources (DER) grows, energy flows in the network are changing, increasing the requirements for ancillary services, including voltage support. With the appropriate power converter, DER can provide ancillary services such as frequency control and voltage support. This paper outlines the economic potential of DERs coordinated in a microgrid to provide reactive power and voltage support at its point of common coupling. The DER Customer Adoption Model assesses the costs of providing reactive power, given local utility rules. Depending on the installed DER, the cost minimizing solution for supplying reactive power locally is chosen. Costs include the variable cost of the additional losses and the investment cost of appropriately over-sizing converters or purchasing capacitors. A case study of a large health care building in San Francisco is used to evaluate different revenue possibilities of creating an incentive for microgrids to provide reactive power.

  6. Aging assessment of large electric motors in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Villaran, M.; Subudhi, M. [Brookhaven National Lab., Upton, NY (United States)

    1996-03-01

    Large electric motors serve as the prime movers to drive high capacity pumps, fans, compressors, and generators in a variety of nuclear plant systems. This study examined the stressors that cause degradation and aging in large electric motors operating in various plant locations and environments. The operating history of these machines in nuclear plant service was studied by review and analysis of failure reports in the NPRDS and LER databases. This was supplemented by a review of motor designs, and their nuclear and balance of plant applications, in order to characterize the failure mechanisms that cause degradation, aging, and failure in large electric motors. A generic failure modes and effects analysis for large squirrel cage induction motors was performed to identify the degradation and aging mechanisms affecting various components of these large motors, the failure modes that result, and their effects upon the function of the motor. The effects of large motor failures upon the systems in which they are operating, and on the plant as a whole, were analyzed from failure reports in the databases. The effectiveness of the industry's large motor maintenance programs was assessed based upon the failure reports in the databases and reviews of plant maintenance procedures and programs.

  7. Statistical assessment of DNA extraction reagent lot variability in real-time quantitative PCR

    Science.gov (United States)

    Bushon, R.N.; Kephart, C.M.; Koltun, G.F.; Francy, D.S.; Schaefer, F. W.; Lindquist, H.D. Alan

    2010-01-01

    Aims: The aim of this study was to evaluate the variability in lots of a DNA extraction kit using real-time PCR assays for Bacillus anthracis, Francisella tularensis and Vibrio cholerae. Methods and Results: Replicate aliquots of three bacteria were processed in duplicate with three different lots of a commercial DNA extraction kit. This experiment was repeated in triplicate. Results showed that cycle threshold values were statistically different among the different lots. Conclusions: Differences in DNA extraction reagent lots were found to be a significant source of variability for qPCR results. Steps should be taken to ensure the quality and consistency of reagents. Minimally, we propose that standard curves should be constructed for each new lot of extraction reagents, so that lot-to-lot variation is accounted for in data interpretation. Significance and Impact of the Study: This study highlights the importance of evaluating variability in DNA extraction procedures, especially when different reagent lots are used. Consideration of this variability in data interpretation should be an integral part of studies investigating environmental samples with unknown concentrations of organisms. ?? 2010 The Society for Applied Microbiology.
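
    A sketch of the kind of lot-to-lot comparison the abstract reports, as a one-way ANOVA on hypothetical cycle-threshold (Ct) replicates for three reagent lots; all values are made up.

```python
import numpy as np
from scipy.stats import f_oneway

# Hypothetical qPCR cycle-threshold (Ct) replicates for three reagent lots.
rng = np.random.default_rng(11)
lot_a = rng.normal(24.0, 0.3, 6)
lot_b = rng.normal(24.6, 0.3, 6)   # assumed lot shift of ~0.6 cycles
lot_c = rng.normal(24.1, 0.3, 6)

stat, p = f_oneway(lot_a, lot_b, lot_c)
print(f"one-way ANOVA across lots: F = {stat:.2f}, p = {p:.4f}")
```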

  8. Assessment of drug disposition in the perfused rat brain by statistical moment analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sakane, T.; Nakatsu, M.; Yamamoto, A.; Hashida, M.; Sezaki, H.; Yamashita, S.; Nadai, T. (Faculty of Pharmaceutical Sciences, Setsunan University, Osaka (Japan))

    1991-06-01

    Drug disposition in the brain was investigated by statistical moment analysis using an improved in situ brain perfusion technique. The right cerebral hemisphere of the rat was perfused in situ. The drug and inulin were injected into the right internal carotid artery as a rapid bolus and the venous outflow curve at the posterior facial vein was obtained. The infusion rate was adjusted to minimize the flow of perfusion fluid into the left hemisphere. The obtained disposition parameters were characteristic of, and considered to reflect, the physicochemical properties of each drug. Antipyrine showed a small degree of initial uptake; therefore, its apparent distribution volume (Vi) and apparent intrinsic clearance (CLint,i) were small. Diazepam showed large degrees of both influx and efflux and, thus, a large Vi. Water showed parameters intermediate between those of antipyrine and those of diazepam. Imipramine, desipramine, and propranolol showed a large CLint,i compared with those of the other drugs. The extraction ratio of propranolol significantly decreased with increasing concentrations of unlabeled propranolol in the perfusion fluid. These findings may be explained partly by the tissue binding of these drugs. In conclusion, the present method is useful for studying drug disposition in the brain.
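
    A sketch of statistical moment analysis applied to a hypothetical outflow curve: area under the curve, mean transit time and variance as the zeroth, first and second central moments; the curve shape is an assumption.

```python
import numpy as np
from scipy.integrate import trapezoid

# Hypothetical venous outflow concentration curve after a rapid bolus.
t = np.linspace(0.0, 60.0, 121)               # time (s)
c = (t / 8.0) * np.exp(-t / 8.0)              # arbitrary dilution-shaped curve

auc = trapezoid(c, t)                         # zeroth moment (AUC)
mtt = trapezoid(t * c, t) / auc               # first moment: mean transit time
var = trapezoid((t - mtt) ** 2 * c, t) / auc  # second central moment
print(f"AUC = {auc:.2f}, MTT = {mtt:.1f} s, variance = {var:.1f} s^2")
```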

  9. Statistical analysis to assess automated level of suspicion scoring methods in breast ultrasound

    Science.gov (United States)

    Galperin, Michael

    2003-05-01

    A well-defined rule-based system has been developed for scoring the Level of Suspicion (LOS) from 0 to 5 based on a qualitative lexicon describing the ultrasound appearance of a breast lesion. The purpose of the research is to assess and select one of the automated LOS scoring quantitative methods developed during preliminary studies on reducing benign biopsies. The study used a Computer Aided Imaging System (CAIS) to improve the uniformity and accuracy of applying the LOS scheme by automatically detecting, analyzing and comparing breast masses. The overall goal is to reduce biopsies on masses with lower levels of suspicion, rather than to increase the accuracy of diagnosis of cancers (which will require biopsy anyway). On complex cysts and fibroadenoma cases, experienced radiologists were up to 50% less certain about true negatives than CAIS. Full correlation analysis was applied to determine which of the proposed LOS quantification methods serves CAIS accuracy best. This paper presents current results of applying statistical analysis to automated LOS scoring quantification for breast masses with known biopsy results. It was found that the First Order Ranking method yielded the most accurate results. The CAIS system (Image Companion, Data Companion software) was developed by Almen Laboratories and was used to achieve the results.

  10. Statistical Assessment of Shapes and Magnetic Field Orientations in Molecular Clouds through Polarization Observations

    CERN Document Server

    Tassis, K; Hildebrand, R H; Kirby, L; Vaillancourt, J E

    2009-01-01

    We present a novel statistical analysis aimed at deriving the intrinsic shapes and magnetic field orientations of molecular clouds using dust emission and polarization observations by the Hertz polarimeter. Our observables are the aspect ratio of the projected plane-of-the-sky cloud image, and the angle between the mean direction of the plane-of-the-sky component of the magnetic field and the short axis of the cloud image. To overcome projection effects due to the unknown orientation of the line-of-sight, we combine observations from 24 clouds, assuming that line-of-sight orientations are random and all are equally probable. Through a weighted least-squares analysis, we find that the best-fit intrinsic cloud shape describing our sample is an oblate disk with only small degrees of triaxiality. The best-fit intrinsic magnetic field orientation is close to the direction of the shortest cloud axis, with small (~24 deg) deviations toward the long/middle cloud axes. However, due to the small number of observed clou...

  11. Hydrochemical assessment of Semarang area using multivariate statistics: A sample based dataset

    OpenAIRE

    Irawan, Dasapta Erwin; Putranto, Thomas Triadi

    2016-01-01

    The following paper describes in brief the data set related to our project "Hydrochemical assessment of Semarang Groundwater Quality". All 58 samples were taken in 1992, 1993, 2003, 2006, and 2007 using well point data from several reports from the Ministry of Energy and Mineral Resources and independent consultants. We provide 20 parameters for each sample (sample id, coord X, coord Y, well depth, water level, water elevation, TDS, pH, EC, K, Ca, Na, Mg, Cl, SO4, HCO3, ye...

  12. Assessment of uncertainty in full core reactor physics calculations using statistical methods

    Energy Technology Data Exchange (ETDEWEB)

    McEwan, C., E-mail: mcewac2@mcmaster.ca [McMaster Univ., Hamilton, Ontario (Canada)

    2012-07-01

    The best estimate method of safety analysis involves choosing a realistic set of input parameters for a proposed safety case and evaluating the uncertainty in the results. Determining the uncertainty in code outputs remains a challenge and is the subject of a benchmarking exercise proposed by the Organization for Economic Cooperation and Development. The work proposed in this paper will contribute to this benchmark by assessing the uncertainty in a depletion calculation of the final nuclide concentrations for an experiment performed in the Fukushima-2 reactor. This will be done using the lattice transport code DRAGON and a tool known as DINOSAUR. (author)

  13. Final Report: Assessment of Combined Heat and Power Premium Power Applications in California

    Energy Technology Data Exchange (ETDEWEB)

    Norwood, Zack; Lipman, Tim; Marnay, Chris; Kammen, Dan

    2008-09-30

    This report analyzes the current economic and environmental performance of combined heat and power (CHP) systems in power-interruption-intolerant commercial facilities. Through a series of three case studies, key trade-offs are analyzed with regard to the provision of black-out ride-through capability with the CHP systems and the resulting ability to avoid the need for at least some diesel backup generator capacity located at the case study sites. Each of the selected sites currently has a CHP or combined heating, cooling, and power (CCHP) system in addition to diesel backup generators. In all cases the CHP/CCHP system has a small fraction of the electrical capacity of the diesel generators. Although none of the selected sites currently has the ability to run the CHP systems as emergency backup power, all could be retrofitted to provide this blackout ride-through capability, and new CHP systems can be installed with this capability. The following three sites/systems were used for this analysis: (1) Sierra Nevada Brewery - Using 1 MW of installed molten carbonate fuel cells operating on a combination of digester gas (from the beer brewing process) and natural gas, this facility can produce electricity and heat for the brewery and attached bottling plant. The major thermal load on-site is to keep the brewing tanks at appropriate temperatures. (2) NetApp Data Center - Using 1.125 MW of Hess Microgen natural gas fired reciprocating engine-generators, with exhaust gas and jacket water heat recovery attached to over 300 tons of adsorption chillers, this combined cooling and power system provides electricity and cooling to a data center with a 1,200 kW peak electrical load. (3) Kaiser Permanente Hayward Hospital - With 180 kW of Tecogen natural gas fired reciprocating engine-generators, this CHP system generates steam for space heating, and hot water for a city hospital. For all sites, similar assumptions are made about the economic and technological constraints of the

  14. Determining the Suitability of Two Different Statistical Techniques in Shallow Landslide (Debris Flow) Initiation Susceptibility Assessment in the Western Ghats

    Directory of Open Access Journals (Sweden)

    M. V. Ninu Krishnan

    2015-01-01

    Full Text Available In the present study, the Information Value (InfoVal) and the Multiple Logistic Regression (MLR) methods, based on bivariate and multivariate statistical analysis, have been applied for shallow landslide initiation susceptibility assessment in a selected subwatershed in the Western Ghats, Kerala, India, to determine the suitability of geographical information systems (GIS) assisted statistical landslide susceptibility assessment methods in data-constrained regions. The different landslide conditioning terrain variables considered in the analysis are geomorphology, land use/land cover, soil thickness, slope, aspect, relative relief, plan curvature, profile curvature, drainage density, distance from drainages, lineament density and distance from lineaments. Landslide Susceptibility Index (LSI) maps were produced by integrating the weighted themes and divided into five landslide susceptibility zones (LSZ) by correlating the LSI with general terrain conditions. The predictive performances of the models were evaluated through success and prediction rate curves. The areas under the success rate curves (AUC) for the InfoVal- and MLR-generated susceptibility maps are 84.11% and 68.65%, respectively. The prediction rate curves show good to moderate correlation between the distribution of the validation group of landslides and the LSZ maps, with AUC values of 0.648 and 0.826 for the MLR- and InfoVal-produced LSZ maps, respectively. Considering the fit and suitability of the models in the study area by quantitative prediction accuracy, the LSZ map produced by the InfoVal technique shows higher accuracy (82.60%) than the MLR model, is more realistic when compared in the field, and is considered the best suited model for the assessment of landslide susceptibility in areas similar to the study area. The LSZ map produced for the area can be utilised for regional planning and assessment, incorporating the generalised rainfall conditions in the area.
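
    The success- and prediction-rate AUC values quoted above can, in principle, be reproduced from a susceptibility raster and a landslide inventory. A minimal sketch, with synthetic stand-in arrays rather than the paper's actual LSI maps:

```python
# A minimal sketch of an AUC check for a landslide susceptibility map, assuming
# per-cell susceptibility values and a binary landslide inventory (synthetic).
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
lsi = rng.random(10_000)                      # susceptibility index per cell
landslide = rng.random(10_000) < lsi * 0.3    # inventory biased toward high LSI

print(f"AUC = {roc_auc_score(landslide, lsi):.3f}")
```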

  15. A statistical assessment of the impact of land uses on surface water quality indexes.

    Science.gov (United States)

    Seeboonruang, Uma

    2012-06-30

    The release of wastewater from various land uses threatens the quality of surface water. Different land uses pose varying degrees of danger to water resources; the extent of the hazard posed by each activity depends on the amount and characteristics of its wastewater. The concept of the contamination potential index (CPI) of an activity is introduced and applied here. The index depends on the quantity of wastewater from a single source and on the chemicals in the waste whose concentrations exceed allowable standards. The CPI concept and the land use impact assessment are applied to surface water conditions in Nakhon Nayok Province in the central region of Thailand. The land uses considered in this study are residential areas, industrial zones, in-season and off-season rice farming, and swine and poultry livestock. Multiple linear regression analysis, fitted on the CPIs and previous water quality measurements, determines the impact of the CPIs of these land uses on certain water quality characteristics, i.e., total dissolved solids, electrical conductivity, phosphate, and chloride concentrations. The models are further verified against current CPIs and measured concentrations. The results of backward and forward modeling show that the land uses affecting water quality are off-season rice farming, poultry raising, and residential activity. They demonstrate that total dissolved solids and conductivity are reasonable parameters to apply in land use assessment.
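
    A hedged sketch of the regression step described above, regressing one water quality characteristic on the CPIs of several land uses; the column names and values are hypothetical, not from the Nakhon Nayok data set.

```python
# Multiple linear regression of a water quality parameter (TDS) on land-use
# contamination potential indices; data frame contents are invented.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "tds":          [310, 450, 290, 520, 610, 380, 440, 505, 355, 475],
    "cpi_rice_off": [0.2, 0.6, 0.1, 0.7, 0.9, 0.3, 0.5, 0.6, 0.2, 0.6],
    "cpi_poultry":  [0.1, 0.4, 0.2, 0.5, 0.7, 0.2, 0.4, 0.5, 0.2, 0.4],
    "cpi_resident": [0.3, 0.5, 0.2, 0.6, 0.8, 0.4, 0.5, 0.6, 0.3, 0.5],
})

model = smf.ols("tds ~ cpi_rice_off + cpi_poultry + cpi_resident", data=df).fit()
print(model.summary())
```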

  16. Perinatal Health Statistics as the Basis for Perinatal Quality Assessment in Croatia

    Directory of Open Access Journals (Sweden)

    Urelija Rodin

    2015-01-01

    Full Text Available Context. Perinatal mortality indicators are considered the most important measures of perinatal outcome. The indicators' reliability depends on the reporting and recording of births and deaths. Many publications focus on the underreporting and misclassification of perinatal deaths, which prevents proper international comparisons. Objective. Description of the key indicators for perinatal health care quality assessment in Croatia. Methods. Retrospective review of reports from all maternities from 2001 to 2014. Results. According to the reporting criterion of birth weight ≥500 g, perinatal mortality (PNM) was reduced by 31%, fetal mortality (FM) by 32%, and early neonatal mortality (ENM) by 29%. According to the reporting criterion of ≥1000 g, PNM was reduced by 43%, FM by 36%, and ENM by 54%. PNM at ≥22 weeks' (wks) gestational age (GA) was reduced by 28%, FM by 30%, and ENM by 26%. The proportion of FM at 32-36 wks GA and at term was the highest among all GA subgroups, as opposed to ENM, whose proportion was highest at 22-27 wks GA. Over the period, the maternal mortality ratio varied from 2.4 to 14.3/100,000 live births. The process indicators have increased in number by more than half since 2001, and caesarean deliveries rose from 11.9% in 2001 to 19.6% in 2014. Conclusions. Comprehensive perinatal health monitoring represents the basis for perinatal quality assessment.

  17. Assessing soil quality indicator under different land use and soil erosion using multivariate statistical techniques.

    Science.gov (United States)

    Nosrati, Kazem

    2013-04-01

    Soil degradation associated with soil erosion and land use is a critical problem in Iran, and there is little scientific information available for assessing soil quality indicators. In this study, factor analysis (FA) and discriminant analysis (DA) were used to identify the most sensitive indicators of soil quality for evaluating land use and soil erosion within the Hiv catchment in Iran, and the resulting assessment was subsequently compared with expert opinion based on the soil surface factors (SSF) form of the Bureau of Land Management (BLM) method. To this end, 19 soil physical, chemical, and biochemical properties were measured at 56 different sampling sites covering three land use/soil erosion categories (rangeland/surface erosion, orchard/surface erosion, and rangeland/stream bank erosion). FA identified four factors that explained 82% of the variation in soil properties. Three factors showed significant differences among the three land use/soil erosion categories. The results indicated that, based upon backward-mode DA, dehydrogenase, silt, and manganese allowed more than 80% of the samples to be correctly assigned to their land use and erosional status. Canonical scores of the discriminant functions were significantly correlated with the six soil surface indices derived from the BLM method. Stepwise linear regression revealed that the soil surface indices soil movement, surface litter, pedestalling, and the sum of SSF were also positively related to dehydrogenase and silt. This suggests that dehydrogenase and silt are the indicators most sensitive to land use and soil erosion.
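
    A minimal sketch of the two statistical steps named above, with the caveat that scikit-learn offers no backward/stepwise mode, so plain linear discriminant analysis on factor scores stands in for the paper's backward-mode DA. The data are random placeholders with the study's dimensions (56 sites, 19 properties, 3 categories).

```python
# Factor analysis to condense correlated soil properties, then discriminant
# analysis to classify land-use/erosion categories; data are synthetic.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
X = rng.normal(size=(56, 19))        # 56 sites x 19 soil properties
y = rng.integers(0, 3, size=56)      # three land-use/erosion categories

factors = FactorAnalysis(n_components=4).fit_transform(X)
lda = LinearDiscriminantAnalysis().fit(factors, y)
print("correct assignment rate:", lda.score(factors, y))
```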

  18. A statistical study of the spatial distribution of Co-operative UK Twin Located Auroral Sounding System (CUTLASS) backscatter power during EISCAT heater beam-sweeping experiments

    Science.gov (United States)

    Shergill, H.; Robinson, T. R.; Dhillon, R. S.; Lester, M.; Milan, S. E.; Yeoman, T. K.

    2010-05-01

    High-power electromagnetic waves can excite a variety of plasma instabilities in Earth's ionosphere. These lead to the growth of plasma waves and plasma density irregularities within the heated volume, including patches of small-scale field-aligned electron density irregularities. This paper reports a statistical study of intensity distributions in patches of these irregularities excited by the European Incoherent Scatter (EISCAT) heater during beam-sweeping experiments. The irregularities were detected by the Co-operative UK Twin Located Auroral Sounding System (CUTLASS) coherent scatter radar located in Finland. During these experiments the heater beam direction is steadily changed from northward to southward pointing. Comparisons are made between statistical parameters of CUTLASS backscatter power distributions and modeled heater beam power distributions provided by the EZNEC version 4 software. In general, good agreement between the statistical parameters and the modeled beam is observed, clearly indicating the direct causal connection between the heater beam and the irregularities, despite the sometimes seemingly unpredictable nature of unaveraged results. The results also give compelling evidence in support of the upper hybrid theory of irregularity excitation.

  19. Assessment of arsenic and heavy metal contents in cockles (Anadara granosa) using multivariate statistical techniques.

    Science.gov (United States)

    Abbas Alkarkhi, F M; Ismail, Norli; Easa, Azhar Mat

    2008-02-11

    Cockle (Anadara granosa) samples obtained from two rivers in the Penang State of Malaysia were analyzed for the content of arsenic (As) and heavy metals (Cr, Cd, Zn, Cu, Pb, and Hg), using a graphite furnace atomic absorption spectrometer (GF-AAS) for Cr, Cd, Zn, Cu, Pb, and As, and a cold vapor atomic absorption spectrometer (CV-AAS) for Hg. The two locations of interest, with 20 sampling points each, were Kuala Juru (Juru River) and Bukit Tambun (Jejawi River). Multivariate statistical techniques such as multivariate analysis of variance (MANOVA) and discriminant analysis (DA) were applied for analyzing the data. MANOVA showed a strong significant difference between the two rivers in terms of As and heavy metal contents in cockles. DA gave the best result for identifying the relative contribution of all parameters in discriminating (distinguishing) the two rivers. It provided an important data reduction, as it used only two parameters (Zn and Cd), affording more than 72% correct assignments. Results indicated that the two rivers differed in terms of As and heavy metal contents in cockles, and the major difference was due to the contribution of Zn and Cd. A positive correlation was found between the discriminant functions (DF) and Zn, Cd and Cr, whereas a negative correlation was exhibited with the other heavy metals. Therefore, DA allowed a reduction in the dimensionality of the data set, delineating a few indicator parameters responsible for large variations in heavy metal and arsenic content. Taking these results into account, it can be suggested that continuous monitoring of As and heavy metals in cockles be performed in these two rivers.
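
    A hedged sketch of the MANOVA step, testing whether metal concentrations differ jointly between the two rivers; the data frame below is synthetic and only mimics the study's structure.

```python
# One-way MANOVA on three metal concentrations across two rivers.
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

df = pd.DataFrame({
    "river": ["Juru"] * 5 + ["Jejawi"] * 5,
    "Zn":    [55.1, 60.3, 58.2, 57.5, 61.0, 42.4, 40.8, 44.1, 41.9, 43.3],
    "Cd":    [0.91, 0.87, 0.95, 0.90, 0.93, 0.52, 0.49, 0.55, 0.51, 0.53],
    "Cr":    [0.60, 0.64, 0.58, 0.61, 0.63, 0.71, 0.69, 0.74, 0.70, 0.72],
})

fit = MANOVA.from_formula("Zn + Cd + Cr ~ river", data=df)
print(fit.mv_test())   # Wilks' lambda, Pillai's trace, etc.
```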

  20. Statistical Bias Correction scheme for climate change impact assessment at a basin scale

    Science.gov (United States)

    Nyunt, C. T.

    2013-12-01

    Global climate models (GCMs) are the primary tool for understanding how the global climate may change in the future. GCM precipitation is characterized by underestimation of heavy precipitation, frequency errors with too many low-intensity drizzle days, and failure to capture inter-seasonal change compared to ground data. This study focuses on basin-scale climate change impact assessment; we propose a multi-model (GCM) selection method together with a statistical bias correction method that addresses the major GCM deficiencies for climate change impact studies at the basin level. The proposed method was tested for applicability in river basins under different climates: a semiarid region in Tunisia, a tropical monsoonal climate in the Philippines, and a temperate humid region in Japan. It performed well enough for basin-scale climate change impact studies and reproduced point-scale and basin-scale precipitation climatology very well in the historical simulation. We found that the GCM simulation dissipates baiu activity earlier than observed when compared to in-situ station data in Japan. For that case, the proposed bias correction was performed for each season to reduce the GCM bias for the impact study. The proposed bias correction method is being tested in further river basins around the world to check its applicability, and is now under development as a web interface, intended as a handy and efficient tool for end users from different parts of the world.
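
    The abstract does not spell out the algorithm; empirical quantile mapping is one common core of such statistical bias correction schemes, and the sketch below shows only that idea (the paper's method additionally treats rain-day frequency, extremes, and seasonality). The gamma-distributed series are synthetic.

```python
# Empirical quantile mapping: GCM values are mapped onto the observed
# distribution by matching quantiles. Daily precipitation (mm/day), synthetic.
import numpy as np

rng = np.random.default_rng(2)
obs = rng.gamma(shape=0.6, scale=12.0, size=3650)   # station observations
gcm = rng.gamma(shape=1.5, scale=3.0, size=3650)    # drizzle-heavy GCM output

def quantile_map(x, model_hist, observed):
    """Map values x from the model distribution onto the observed one."""
    q = np.searchsorted(np.sort(model_hist), x) / len(model_hist)
    return np.quantile(observed, np.clip(q, 0.0, 1.0))

corrected = quantile_map(gcm, gcm, obs)
print(f"raw GCM mean {gcm.mean():.2f} -> corrected {corrected.mean():.2f} "
      f"vs observed {obs.mean():.2f}")
```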

  1. Quantitative assessments of burn degree by high-frequency ultrasonic backscattering and statistical model

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Yi-Hsun; Wang, Shyh-Hau [Department of Computer Science and Information Engineering, and Institute of Medical Informatics, National Cheng Kung University, No 1, University Road, Tainan City 70101, Taiwan (China); Huang, Chih-Chung, E-mail: shyhhau@mail.ncku.edu.tw [Department of Electrical Engineering, Fu Jen Catholic University, 510, Chung Cheng Rd, Hsin Chuang, Taipei County 24205, Taiwan (China)

    2011-02-07

    An accurate and quantitative modality to assess the burn degree is crucial for determining further treatments to be properly applied to burn injury patients. Ultrasound with frequencies higher than 20 MHz has been applied to dermatological diagnosis due to its high resolution and noninvasive capability. Yet a substantial means of sensitively and quantitatively correlating the burn degree with ultrasonic measurements is still lacking. Thus, a 50 MHz ultrasound system was developed and implemented to measure ultrasonic signals backscattered from burned skin tissues. Various burn degrees were achieved by placing a 100 °C brass plate onto the dorsal skin of anesthetized rats for durations ranging from 5 to 20 s. The burn degrees were correlated with ultrasonic parameters, including integrated backscatter (IB) and the Nakagami parameter (m), calculated from ultrasonic signals acquired from the burned tissues over a 5 × 1.4 mm (width × depth) area. Results demonstrated that both IB and m decreased exponentially with increasing burn degree. Specifically, an IB of -79.0 ± 2.4 (mean ± standard deviation) dB for normal skin tissues tended to decrease to -94.0 ± 1.3 dB for those burned for 20 s, while the corresponding Nakagami parameter tended to decrease from 0.76 ± 0.08 to 0.45 ± 0.04. The variation of both IB and m was partially associated with changes in the properties of collagen fibers in the burned tissues, as verified by tissue histological sections. In particular, the m parameter may be more sensitive for differentiating burned skin, since it has a greater rate of change with respect to different burn durations. These ultrasonic parameters, in conjunction with high-frequency B-mode and Nakagami images, could have the potential to assess the burn degree quantitatively.
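
    The Nakagami parameter m can be estimated from the backscattered envelope by the moment (inverse normalized variance) method, which is presumably close to what was computed here; the sketch below uses a simulated envelope, not measured burn-tissue data.

```python
# Moment estimator of the Nakagami m parameter from an envelope signal:
# m = E[R^2]^2 / Var(R^2). The envelope is simulated with scipy.
import numpy as np
from scipy.stats import nakagami

envelope = nakagami.rvs(0.6, scale=1.0, size=5000, random_state=3)

r2 = envelope ** 2
m_hat = r2.mean() ** 2 / r2.var()    # inverse normalized variance estimator
print(f"estimated m = {m_hat:.3f}")  # should be near the true 0.6
```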

  2. Assessment of landslide risk using GIS and statistical methods in Kysuce region

    Directory of Open Access Journals (Sweden)

    Barančoková Mária

    2014-03-01

    Full Text Available The landslide susceptibility was assessed based on multivariate analysis. The input parameters were lithology, land use, slope inclination and average annual precipitation. These parameters were evaluated as independent variables, and the existing landslides as dependent variables. The individual input parameters were reclassified and spatially adjusted. Spatial analysis resulted in 15 988 combinations of input parameters, each representing a homogeneous condition unit (HCU). Based on the landslide density within individual units, the HCU polygons were classified according to landslide risk as stable, conditionally stable, or unstable (subdivided into low, medium and high landslide risk). A total of 2002 HCUs were affected by landslides, and the remaining 13 986 were not affected. The total HCU area affected by landslides is about 156.92 km2 (20.1%). Stable areas covered 623.01 km2 (79.8%), of which conditionally stable areas covered 228.77 km2 (29.33%). Unstable areas were divided into three levels of landslide risk: low, medium and high. An area of 111.19 km2 (14.3%) represents low landslide risk, 29.7 km2 (3.8%) medium risk, and 16.01 km2 (2%) high risk. Since the Zlín Formation lithological unit covers approximately one-third of the study area, it also influences the overall landslide risk assessment. This lithological formation covers the largest area within all landslide risk classes as well as in conditionally stable areas. The most frequent slope class was in the range of 14-19°. The higher susceptibility of the Zlín Formation to landslides is caused mainly by the differing geomorphological values of its claystone and sandstone sequences. The higher share of claystone results in higher susceptibility of this formation to exogenous degradation processes.

  3. Groundwater quality assessment using geospatial and statistical tools in Salem District, Tamil Nadu, India

    Science.gov (United States)

    Arulbalaji, P.; Gurugnanam, B.

    2016-11-01

    The water quality study of Salem district, Tamil Nadu has been carried out to assess the water quality for domestic and irrigation purposes. For this purpose, 59 groundwater samples were collected and analyzed for pH, electrical conductivity (EC), total dissolved solids (TDS), major anions (HCO3-, CO32-, F-, Cl-, NO2- + NO3-, and SO42-), major cations (Ca2+, Mg2+, Na+, and K+), alkalinity (ALK), and hardness (HAR). To assess the water quality, the following chemical parameters were calculated from the analytical results: the Piper plot, water quality index (WQI), sodium adsorption ratio (SAR), magnesium hazard (MH), Kelly index (KI), and residual sodium carbonate (RSC). The Wilcox diagram shows that 23% of the samples are excellent to good, 40% good to permissible, 10% permissible to doubtful, 24% doubtful to unsuitable, and only 3% unsuitable for irrigation. SAR values show that 52% of the samples indicate high-to-very-high and low-to-medium alkali water. KI values indicate good quality for 30% of samples and unsuitability for irrigation for the remaining 70%. RSC values indicate that 89% of samples are suitable for irrigation purposes. MH reveals that 17% of samples are suitable and 83% are not suitable for irrigation; for domestic purposes, the samples rate excellent (8%), good (48%), and poor (44%). Agricultural waste, fertilizer use, soil leaching, urban runoff, livestock waste, and sewage are the sources of poor water quality. Some samples are not suitable for irrigation purposes due to high salinity, hardness, and magnesium concentration. In general, the groundwater of the Salem district is affected by agricultural activities, anthropogenic activities, ion exchange, and weathering.
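
    For reference, the sodium adsorption ratio used above follows a standard formula; a minimal sketch with hypothetical ion concentrations (all in meq/L):

```python
# Sodium adsorption ratio: SAR = Na / sqrt((Ca + Mg) / 2), ions in meq/L.
import math

def sar(na, ca, mg):
    """Compute SAR from sodium, calcium and magnesium concentrations (meq/L)."""
    return na / math.sqrt((ca + mg) / 2.0)

print(f"SAR = {sar(na=8.7, ca=4.2, mg=3.1):.2f}")
```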

  4. Neural network uncertainty assessment using Bayesian statistics: a remote sensing application

    Science.gov (United States)

    Aires, F.; Prigent, C.; Rossow, W. B.

    2004-01-01

    Neural network (NN) techniques have proved successful for many regression problems, in particular for remote sensing; however, uncertainty estimates are rarely provided. In this article, a Bayesian technique to evaluate uncertainties of the NN parameters (i.e., synaptic weights) is first presented. In contrast to more traditional approaches based on point estimation of the NN weights, we assess uncertainties on such estimates to monitor the robustness of the NN model. These theoretical developments are illustrated by applying them to the problem of retrieving surface skin temperature, microwave surface emissivities, and integrated water vapor content from a combined analysis of satellite microwave and infrared observations over land. The weight uncertainty estimates are then used to compute analytically the uncertainties in the network outputs (i.e., error bars and correlation structure of these errors). Such quantities are very important for evaluating any application of an NN model. The uncertainties on the NN Jacobians are then considered in the third part of this article. Used for regression fitting, NN models can be used effectively to represent highly nonlinear, multivariate functions. In this situation, most emphasis is put on estimating the output errors, but almost no attention has been given to errors associated with the internal structure of the regression model. The complex structure of dependency inside the NN is the essence of the model, and assessing its quality, coherency, and physical character makes all the difference between a blackbox model with small output errors and a reliable, robust, and physically coherent model. Such dependency structures are described to the first order by the NN Jacobians: they indicate the sensitivity of one output with respect to the inputs of the model for given input data. We use a Monte Carlo integration procedure to estimate the robustness of the NN Jacobians. A regularization strategy based on principal component

  5. Establishing normative data for repeated cognitive assessment: a comparison of different statistical methods.

    Science.gov (United States)

    Van der Elst, Wim; Molenberghs, Geert; Van Boxtel, Martin P J; Jolles, Jelle

    2013-12-01

    Serial cognitive assessment is conducted to monitor changes in the cognitive abilities of patients over time. At present, mainly the regression-based change and the ANCOVA approaches are used to establish normative data for serial cognitive assessment. These methods are straightforward, but they have some severe drawbacks. For example, they can only consider the data of two measurement occasions. In this article, we propose three alternative normative methods that are not hampered by these problems, that is, the multivariate regression, the standard linear mixed model (LMM), and the linear mixed model combined with multiple imputation (LMM with MI) approaches. The multivariate regression method is primarily useful when a small number of repeated measurements are taken at fixed time points. When the data are more unbalanced, the standard LMM and the LMM with MI methods are more appropriate because they allow for a more adequate modeling of the covariance structure. The standard LMM has the advantage that it is easier to conduct and that it does not require a Monte Carlo component. The LMM with MI, on the other hand, has the advantage that it can flexibly deal with missing responses and missing covariate values at the same time. The different normative methods are illustrated on the basis of the data of a large longitudinal study in which a cognitive test (the Stroop Color Word Test) was administered at four measurement occasions (i.e., at baseline and 3, 6, and 12 years later). The results are discussed and suggestions for future research are provided.
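
    A hedged sketch of the standard LMM approach described above: fixed effects for age and measurement occasion plus a random intercept per subject, fitted with statsmodels. The synthetic data only mimic the four-occasion design (baseline, 3, 6, and 12 years).

```python
# Linear mixed model for repeated cognitive scores with a random intercept.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n, occ = 100, [0, 3, 6, 12]                    # subjects, years since baseline
df = pd.DataFrame({
    "subject": np.repeat(np.arange(n), len(occ)),
    "time": np.tile(occ, n),
    "age": np.repeat(rng.integers(45, 75, n), len(occ)),
})
df["score"] = (50 - 0.3 * df["age"] - 0.8 * df["time"]
               + rng.normal(0, 3, len(df))                 # residual noise
               + np.repeat(rng.normal(0, 4, n), len(occ))) # subject intercepts

lmm = smf.mixedlm("score ~ age + time", df, groups=df["subject"]).fit()
print(lmm.summary())
```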

  6. The European Flood Alert System EFAS – Part 2: Statistical skill assessment of probabilistic and deterministic operational forecasts

    Directory of Open Access Journals (Sweden)

    J. C. Bartholmes

    2009-02-01

    Full Text Available Since 2005 the European Flood Alert System (EFAS) has been producing probabilistic hydrological forecasts in pre-operational mode at the Joint Research Centre (JRC) of the European Commission. EFAS aims at increasing preparedness for floods in trans-national European river basins by providing medium-range deterministic and probabilistic flood forecasting information, from 3 to 10 days in advance, to national hydro-meteorological services.

    This paper is Part 2 of a study presenting the development and skill assessment of EFAS. In Part 1, the scientific approach adopted in the development of the system has been presented, as well as its basic principles and forecast products. In the present article, two years of existing operational EFAS forecasts are statistically assessed and the skill of EFAS forecasts is analysed with several skill scores. The analysis is based on the comparison of threshold exceedances between proxy-observed and forecasted discharges. Skill is assessed both with and without taking into account the persistence of the forecasted signal during consecutive forecasts.

    Skill assessment approaches are mostly adopted from meteorology and the analysis also compares probabilistic and deterministic aspects of EFAS. Furthermore, the utility of different skill scores is discussed and their strengths and shortcomings illustrated. The analysis shows the benefit of incorporating past forecasts in the probability analysis, for medium-range forecasts, which effectively increases the skill of the forecasts.
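
    One score commonly used in such meteorologically inspired skill assessments is the Brier score for threshold exceedance, referenced to climatology to yield a skill score; whether EFAS used exactly this variant is not stated in the abstract, so the sketch below is illustrative, with synthetic forecasts and observations.

```python
# Brier score and Brier skill score for probabilistic threshold exceedances.
import numpy as np

rng = np.random.default_rng(5)
obs = rng.random(500) < 0.1                                # exceedance occurred?
prob = np.clip(obs * 0.6 + rng.random(500) * 0.3, 0, 1)    # forecast probability

bs = np.mean((prob - obs) ** 2)               # Brier score of the forecast
bs_clim = np.mean((obs.mean() - obs) ** 2)    # Brier score of climatology
print(f"BS = {bs:.3f}, BSS = {1 - bs / bs_clim:.3f}")
```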

  7. Assessment of Economic Efficiency Pertaining to Application of Energy Storage Units in Power System

    Directory of Open Access Journals (Sweden)

    A. Chernetsky

    2013-01-01

    Full Text Available The paper considers some aspects pertaining to the application of energy storage technologies in electric power systems. A review of the technical and cost characteristics of energy storage units is given, covering both units that are currently available and units under development. The paper proposes an approach that permits assessment of the boundaries of economically reasonable application of energy storage systems for balancing the daily load curve of a power system.

  8. Non-equilibrium statistical field theory for classical particles: Linear and mildly non-linear evolution of cosmological density power spectra

    CERN Document Server

    Bartelmann, Matthias; Berg, Daniel; Kozlikin, Elena; Lilow, Robert; Viermann, Celia

    2014-01-01

    We use the non-equilibrium statistical field theory for classical particles, recently developed by Mazenko and by Das and Mazenko, together with the free generating functional we have previously derived for point sets initially correlated in phase space, to calculate the time evolution of power spectra in the free theory, i.e. neglecting particle interactions. We provide expressions taking linear and quadratic momentum correlations into account. Up to this point, the expressions are general with respect to the free propagator of the microscopic degrees of freedom. We then specialise the propagator to that expected for particles in cosmology treated within the Zel'dovich approximation and show that, to linear order in the momentum correlations, the linear growth of the cosmological power spectrum is reproduced. Quadratic momentum correlations return a first contribution to the non-linear evolution of the power spectrum, for which we derive a simple closed expression valid for arbitrary wave numbers. This expressio...

  9. Long-term assessment of natural attenuation: statistical approach on soils with aged PAH contamination.

    Science.gov (United States)

    Ouvrard, Stéphanie; Chenot, Elodie-Denise; Masfaraud, Jean-François; Schwartz, Christophe

    2013-07-01

    The valorization of natural attenuation processes for the remediation of PAH-contaminated soils has gained increasing interest from site owners. Misunderstanding of this method and the small amount of available data have not encouraged its development. However, monitored natural attenuation (MNA) offers a valuable, cheaper and environmentally friendly alternative to more classical options such as physico-chemical treatments (e.g., chemical oxidation, thermal desorption). The present work reports the results obtained during a long-term natural attenuation assessment of historically contaminated industrial soils under real climatic conditions. The study was performed over a 10-year natural attenuation period on 60 off-ground lysimeters filled with contaminated soils from different former industrial sites (coking industry, manufactured gas plants) whose initial PAH concentrations varied between 380 and 2,077 mg kg-1. The analysed parameters included leached water characterization, soil PAH concentrations, and evaluation of vegetation cover quality and quantity. Results showed good PAH dissipation efficiency and limited transfer of contaminants to the environment. They also highlighted the importance of the fine soil fractions in controlling PAH reactivity. PAH dissipation through water leaching was limited and did not present a significant risk for the environment. The PAH concentration in water appeared, however, to be a good indicator of the overall dissipation rate, illustrating the importance of pollutant availability in predicting its degradation potential.

  10. Damage assessment for wind turbine blades based on a multivariate statistical approach

    Science.gov (United States)

    García, David; Tcherniak, Dmitri; Trendafilova, Irina

    2015-07-01

    This paper presents a vibration-based structural health monitoring methodology for damage assessment of wind turbine blades made of composite laminates. Normally, wind turbine blades are manufactured from two half shells made of composite laminates which are glued together. This connection must be carefully controlled due to its high probability of disbonding, which might result in collapse of the whole structure. The delamination between both parts must be monitored not only for detection but also for localisation and severity determination. This investigation consists of a real-time monitoring methodology based on singular spectrum analysis (SSA) for damage and delamination detection. SSA is able to decompose the vibratory response into a number of components based on their covariance distribution. These components, known as Principal Components (PCs), contain information about the oscillatory patterns of the vibratory response. The PCs are used to create a new space where the data can be projected for better visualization and interpretation. The suggested method is applied herein to a wind turbine blade whose free-vibration responses were recorded and processed by the methodology. Damage scenarios of different sizes and locations were introduced on the blade. The results demonstrate clear damage detection and localization for all damage scenarios and sizes.
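
    A minimal sketch of the core SSA step described above: embedding the vibration response in a trajectory (Hankel) matrix and extracting principal components by singular value decomposition. The signal and window length are illustrative, not the blade measurements.

```python
# Singular spectrum analysis: embed, decompose by SVD, inspect variance shares.
import numpy as np

rng = np.random.default_rng(6)
t = np.linspace(0, 1, 1000)
x = np.sin(2 * np.pi * 12 * t) + 0.3 * rng.normal(size=t.size)  # blade response

L = 100                                         # embedding window length
K = x.size - L + 1
X = np.array([x[i:i + L] for i in range(K)])    # K x L trajectory matrix

U, s, Vt = np.linalg.svd(X, full_matrices=False)
pcs = X @ Vt.T                                  # principal components
explained = s ** 2 / np.sum(s ** 2)
print("variance captured by first 3 PCs:", explained[:3].round(3))
```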

  11. Integration of HIV in the Human Genome: Which Sites Are Preferential? A Genetic and Statistical Assessment

    Science.gov (United States)

    Gonçalves, Juliana; Moreira, Elsa; Sequeira, Inês J.; Rodrigues, António S.; Rueff, José; Brás, Aldina

    2016-01-01

    Chromosomal fragile sites (FSs) are loci where gaps and breaks may occur and are preferential integration targets for some viruses, for example, hepatitis B, Epstein-Barr virus, HPV16, HPV18, and MLV vectors. However, the integration of the human immunodeficiency virus (HIV) in Giemsa bands and in FSs is not yet completely clear. This in silico study aimed to assess the integration preferences of HIV in FSs and in Giemsa bands. HIV integration positions from Jurkat cells were used, and two nonparametric tests were applied to compare HIV integration in dark versus light bands and in FSs versus non-FSs (NFSs). The results show that light bands are preferential targets for integration of HIV-1 in Jurkat cells and that it integrates with equal intensity in FSs and in NFSs. The data indicate that HIV displays different preferences for FSs compared to other viruses. The aim was to develop and apply an approach to predict the conditions and constraints of HIV insertion in the human genome, which seems to adequately complement empirical data. PMID:27294106
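
    The abstract does not name the two nonparametric tests; a chi-square goodness-of-fit comparison of integration counts against the bands' genomic share is one plausible form such a test could take. The counts and the equal-coverage assumption below are invented for illustration.

```python
# Chi-square test: do integrations in light vs. dark bands depart from what
# the bands' (assumed equal) genomic coverage would predict?
from scipy.stats import chisquare

observed = [612, 388]                   # integrations in light vs. dark bands
expected = [0.5 * 1000, 0.5 * 1000]     # assumed equal genomic coverage
stat, p = chisquare(observed, f_exp=expected)
print(f"chi2 = {stat:.1f}, p = {p:.2e}")
```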

  12. Statistical assessment of seafront and beach water quality of Mumbai, India.

    Science.gov (United States)

    Vijay, Ritesh; Kamble, Swapnil R; Dhage, S S; Sohony, R A; Wate, S R

    2011-01-01

    The water quality of the seafronts and beaches of Mumbai is under pressure and deteriorating due to the discharge of partially treated sewage and wastewater through point and nonpoint sources. The objective of the study was to assess the water quality and to correlate physico-chemical and bacteriological parameters to establish their relationship, association and dependence on each other. The water quality parameters were selected as per the SW II standards specified by the Central Pollution Control Board, India, with nutrient parameters as strong indicators of sewage pollution. Box and whisker plots were generated to evaluate the spatio-temporal variation of water quality, suggesting the influence of organic pollution mostly at Mahim and Dadar in the form of outliers and extremes. Pearson's correlations were estimated between parameters, and significant correlations were found, indicating the influence of sewage on water quality. The water quality of beaches and seafronts was found unsafe for recreational purposes. The study suggested that the designated water quality can be achieved by restricting nonpoint sources through improvement of wastewater collection systems, an appropriate level of treatment and proper disposal.

  13. Reliability Analysis and Overload Capability Assessment of Oil-Immersed Power Transformers

    Directory of Open Access Journals (Sweden)

    Chen Wang

    2016-01-01

    Full Text Available Smart grids have been constructed in recent years so as to guarantee the security and stability of the power grid. Power transformers are among the most vital components in the complicated smart grid network. Any transformer failure can damage the whole power system, and failures caused by overloading cannot be ignored. This research gives a new insight into overload capability assessment of transformers. The hot-spot temperature of the winding is the most critical factor in measuring the overload capacity of power transformers. Thus, the hot-spot temperature is calculated to obtain the running duration of the power transformers under overloading conditions. The overloading probability is then fitted with the mature and widely accepted Weibull probability density function. To guarantee the accuracy of this fitting, a new objective function is proposed to obtain the desired parameters of the Weibull distribution. In addition, ten different mutation scenarios are adopted in the differential evolution algorithm to optimize the parameters of the Weibull distribution. The final comprehensive overload capability of the power transformer is assessed by the running duration as well as the overloading probability. Compared with previous studies that take no account of the overloading probability, the assessment results obtained in this research are much more reliable.
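
    A hedged sketch of the fitting idea: estimating Weibull shape and scale with differential evolution by minimizing a least-squares distance between the empirical and model CDFs. The objective function and data are illustrative stand-ins for the paper's proposed objective and transformer running times.

```python
# Fit Weibull (k, c) to running times by differential evolution on an SSE
# objective between empirical and model CDFs; data are synthetic.
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(7)
times = rng.weibull(2.2, size=200) * 40.0      # running times under overload (h)

xs = np.sort(times)
ecdf = (np.arange(xs.size) + 0.5) / xs.size    # empirical CDF

def sse(params):
    k, c = params                              # shape k, scale c
    cdf = 1.0 - np.exp(-((xs / c) ** k))
    return np.sum((cdf - ecdf) ** 2)

res = differential_evolution(sse, bounds=[(0.1, 10.0), (1.0, 200.0)], seed=7)
print("k, c =", res.x.round(3))
```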

  14. Validating Student Score Inferences with Person-Fit Statistic and Verbal Reports: A Person-Fit Study for Cognitive Diagnostic Assessment

    Science.gov (United States)

    Cui, Ying; Roberts, Mary Roduta

    2013-01-01

    The goal of this study was to investigate the usefulness of person-fit analysis in validating student score inferences in a cognitive diagnostic assessment. In this study, a two-stage procedure was used to evaluate person fit for a diagnostic test in the domain of statistical hypothesis testing. In the first stage, the person-fit statistic, the…

  15. A Meteorological Risk Assessment Method for Power Lines Based on GIS and Multi-Sensor Integration

    Science.gov (United States)

    Lin, Zhiyong; Xu, Zhimin

    2016-06-01

    Power lines, exposed to the natural environment, are vulnerable to various kinds of meteorological factors. Traditional research mainly deals with the influence of a single meteorological condition on the power line and lacks a comprehensive evaluation and analysis of the combined effects of multiple meteorological factors. In this paper, we use multiple meteorological monitoring data obtained by multi-sensors to implement meteorological risk assessment and early warning for power lines. Firstly, we generate meteorological raster maps from discrete meteorological monitoring data using spatial interpolation. Secondly, an expert-scoring-based analytic hierarchy process is used to compute the power line risk index for all kinds of meteorological conditions and to establish a mathematical model of meteorological risk. By adopting this model in the raster calculator of ArcGIS, we obtain a raster map showing the overall meteorological risk for the power line. Finally, by overlaying the power line buffer layer on that raster map, we can determine the exact risk index around a given section of the power line, which provides significant guidance for power line risk management. In the experiment, based on five kinds of observation data gathered from meteorological stations in Guizhou Province of China, including wind, lightning, rain, ice, and temperature, we carry out the meteorological risk analysis for real power lines, and the experimental results prove the feasibility and validity of our proposed method.
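
    A minimal sketch of the expert-scoring AHP step: priority weights for the five meteorological factors are taken from the principal eigenvector of a pairwise comparison matrix, with a consistency check. The judgment values below are invented, not the experts' actual scores.

```python
# AHP: derive factor weights from a pairwise comparison matrix and check
# consistency. Rows/cols: wind, lightning, rain, ice, temperature (invented).
import numpy as np

A = np.array([
    [1,   3,   4,   2,   5],
    [1/3, 1,   2,   1/2, 3],
    [1/4, 1/2, 1,   1/3, 2],
    [1/2, 2,   3,   1,   4],
    [1/5, 1/3, 1/2, 1/4, 1],
])

vals, vecs = np.linalg.eig(A)
i = np.argmax(vals.real)
w = np.abs(vecs[:, i].real)
w /= w.sum()                        # normalized priority weights

ci = (vals.real[i] - len(A)) / (len(A) - 1)
cr = ci / 1.12                      # Saaty random index RI = 1.12 for n = 5
print("weights:", w.round(3), "consistency ratio:", round(cr, 3))
```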

  16. The Statistical Power of Inclusive Composite Interval Mapping in Detecting Digenic Epistasis Showing Common F2 Segregation Ratios

    Institute of Scientific and Technical Information of China (English)

    Luyan Zhang; Huihui Li; Jiankang Wang

    2012-01-01

    Epistasis is a commonly observed genetic phenomenon and an important source of variation of complex traits, which could maintain additive variance and therefore assure the long-term genetic gain in breeding. Inclusive composite interval mapping (ICIM) is able to identify epistatic quantitative trait loci (QTLs) regardless of whether the two interacting QTLs have any additive effects. In this article, we conducted a simulation study to evaluate the detection power and false discovery rate (FDR) of ICIM epistatic mapping, considering F2 and doubled haploid (DH) populations, different F2 segregation ratios, and population sizes. Results indicated that estimations of QTL locations and effects were unbiased, and that the detection power of epistatic mapping was largely affected by population size, heritability of epistasis, and the amount and distribution of genetic effects. When the same logarithm of odds (LOD) threshold was used, the detection power of QTL was higher in the F2 population than in the DH population; meanwhile, the FDR in F2 was also higher than that in DH. The increase of marker density from 10 cM to 5 cM led to similar detection power but higher FDR. In simulated populations, ICIM achieved better mapping results than multiple interval mapping (MIM) in estimation of QTL positions and effects. Finally, we give the epistatic mapping results of ICIM in one actual population in rice (Oryza sativa L.).

  17. Adaptive ultra-short-term wind power prediction based on risk assessment

    DEFF Research Database (Denmark)

    Xue, Yusheng; Yu, Chen; Li, Kang;

    2016-01-01

    A risk assessment based adaptive ultra-short-term wind power prediction (USTWPP) method is proposed in this paper. The method first extracts features from the historical data, and splits every wind power time series (WPTS) into several subsets defined by their stationary patterns. A WPTS that does not match any of the stationary patterns is then included in a subset of non-stationary patterns. Every WPTS subset is then related to a USTWPP model which is specially selected and optimized offline based on the proposed risk assessment index. For on-line applications, the pattern of the last short...

  18. New online voltage stability margins and risk assessment for multi-bus smart power grids

    Science.gov (United States)

    Aldeen, M.; Saha, S.; Alpcan, T.; Evans, R. J.

    2015-07-01

    This paper presents a quantitative framework for assessment of voltage stability in smart power networks. First, new stability indices similar to gain and phase margins in linear time invariant control systems are introduced. These indices can be computed online in conjunction with load-flow calculations. Then, a novel risk assessment framework incorporating the new stability indices is developed to methodologically quantify the voltage stability risks a power system faces at any given operating condition. In contrast to existing local stability indices and qualitative risk approaches, the indices and framework introduced provide analytical and quantitative evaluation of voltage stability and associated risks. The results are illustrated with a numerical example.

  19. Development of creep damage assessment system for aged thermal power plant

    Energy Technology Data Exchange (ETDEWEB)

    Nonaka, Isamu [Ishikawajima-Harima Heavy Industries Co., Ltd., Tokyo (Japan); Umaki, Hideo [Ishikawajima-Harima Heavy Industries Co., Ltd., Tokyo (Japan); Nishida, Hidetaka [The Chugoku Electric Power Co., Inc., Hiroshima (Japan); Yamaguchi, Hiroshi [The Chugoku Electric Power Co., Inc., Hiroshima (Japan)

    1998-12-31

    Jointly with The Chugoku Electric Power Co., Inc., IHI has developed a Creep Damage Assessment System that identifies voids by processing an image observed with a small laser microscope using an advanced image processing technique. The result can be obtained immediately on the spot. Application tests of the system at the Unit No. 3 boiler of the Kudamatsu Power Station showed good operability, adaptability to the environment, and accuracy. The new system can easily indicate damage conditions in parts during periodical inspection, allowing rapid maintenance. Reduced assessment time and increased equipment reliability can also be achieved. (orig.)

  20. Towards a more consolidated approach to material data management in life assessment of power plant components

    Energy Technology Data Exchange (ETDEWEB)

    Jovanovic, A.; Maile, K. [MPA Stuttgart (Germany)

    1998-12-31

    The presentation discusses the necessity of having a more consolidated (unified, possibly 'European') framework for all (not only purely experimental) material data needed for optimized life management and assessment of high-temperature and other components in power and process plants. After setting out the main requirements for such a system, a description of efforts in this direction at MPA Stuttgart in the area of high-temperature components in power plants is given. Furthermore, reference to other relevant efforts elsewhere is made, and an example of practical application of the proposed solution is described (optimized material selection and life assessment of high-temperature piping). (orig.) 10 refs.

  1. OSIRIS and SOMBRERO Inertial Fusion Power Plant Designs, Volume 2: Designs, Assessments, and Comparisons

    Energy Technology Data Exchange (ETDEWEB)

    Meier, W. R.; Bieri, R. L.; Monsler, M. J.; Hendricks, C. D.; Laybourne, P.; Shillito, K. R.

    1992-03-01

    This is a comprehensive design study of two Inertial Fusion Energy (IFE) electric power plants. Conceptual designs are presented for a fusion reactor (called Osiris) using an induction-linac heavy-ion beam driver, and another (called SOMBRERO) using a KrF laser driver. The designs covered all aspects of IFE power plants, including the chambers, heat transport and power conversion systems, balance-of-plant facilities, target fabrication, target injection and tracking, as well as the heavy-ion and KrF drivers. The point designs were assessed and compared in terms of their environmental & safety aspects, reliability and availability, economics, and technology development needs.

  2. Test-bed Assessment of Communication Technologies for a Power-Balancing Controller

    DEFF Research Database (Denmark)

    Findrik, Mislav; Pedersen, Rasmus; Hasenleithner, Eduard

    2016-01-01

    Due to the growing need for sustainable energy, an increasing number of different renewable energy resources are being connected to distribution grids. In order to efficiently manage decentralized power generation units, the smart grid will rely on communication networks for information exchange and control. In this paper, we present a Smart Grid test-bed that integrates various communication technologies and deploys a power balancing controller for LV grids. The control performance of the introduced power balancing controller is subsequently investigated, and its robustness to communication network cross-traffic is evaluated. Various scenarios are demonstrated, assessing the impact of communication network performance on the quality of control.

  3. Dynamic Security Assessment of Western Danish Power System Based on Ensemble Decision Trees

    DEFF Research Database (Denmark)

    Liu, Leo; Bak, Claus Leth; Chen, Zhe

    2014-01-01

    With the increasing penetration of renewable energy resources and other forms of dispersed generation, more and more uncertainties will be brought to the dynamic security assessment (DSA) of power systems. This paper proposes an approach that uses ensemble decision trees (EDT) for online DSA. Fed …, the EDT combined with outlier identification shows high accuracy in the presence of variance and uncertainties due to wind power generation and other dispersed generation units. The performance of this approach is demonstrated on the operational model of the western Danish power system, with a scale of around 200 lines and 400...
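
    A hedged sketch of the ensemble-decision-tree idea: a tree ensemble is trained offline on simulated operating points labeled secure/insecure and then classifies new snapshots online. The features, labels, and the use of a random forest are illustrative assumptions, not the paper's exact configuration.

```python
# Train an ensemble of decision trees on synthetic operating points and
# evaluate hold-out classification accuracy for secure/insecure states.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(8)
X = rng.normal(size=(2000, 12))     # e.g., line flows, voltages, wind output
y = (X[:, 0] + 0.5 * X[:, 3] - 0.8 * X[:, 7] > 0.5).astype(int)  # security label

edt = RandomForestClassifier(n_estimators=200, random_state=0).fit(X[:1500], y[:1500])
print("hold-out accuracy:", edt.score(X[1500:], y[1500:]))
```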

  4. Using ultrasound backscattering signals and Nakagami statistical distribution to assess regional cataract hardness.

    Science.gov (United States)

    Caixinha, Miguel; Jesus, Danilo A; Velte, Elena; Santos, Mário J; Santos, Jaime B

    2014-12-01

    This study aims to analyze the spatial distribution of protein aggregates for different cataract degrees, to correlate this information with the acoustical parameters of the lens and, in this way, to assess regional cataract hardness. Different cataract degrees were induced ex vivo in porcine lenses. A 25 MHz ultrasonic transducer was used to obtain the acoustical parameters (velocity, attenuation, and backscattering signals). B-scan and Nakagami images were constructed. Also, lenses with different cataract degrees were sliced into two regions (nucleus and cortex) for fiber and collagen detection. A significant increase with cataract formation was found for the velocity, attenuation, brightness intensity of the B-scan images, and Nakagami m parameter. The acoustical parameters showed a good to moderate correlation with the m parameter for the different stages of cataract formation. A strong correlation was found between the protein aggregates in the cortex and the m parameter. Using a classification and regression tree, lenses without cataract are characterized by a mean brightness intensity ≤0.351, a variance of the B-scan brightness intensity ≤0.070, a velocity ≤1625 m/s, and an attenuation ≤0.415 dB/mm·MHz (sensitivity: 100%, specificity: 72.6%). To characterize different cataract degrees, the m parameter should be considered. Initial stages of cataract are characterized by a mean brightness intensity >0.351 and a variance of the m parameter >0.110. Advanced stages of cataract are characterized by a mean brightness intensity >0.351, a variance of the m parameter ≤0.110, and a mean m parameter >0.374. For initial and advanced stages of cataract, a sensitivity of 78.4% and a specificity of 86.5% are obtained.

  5. Tract-based spatial statistics to assess the neuroprotective effect of early erythropoietin on white matter development in preterm infants.

    Science.gov (United States)

    O'Gorman, Ruth L; Bucher, Hans U; Held, Ulrike; Koller, Brigitte M; Hüppi, Petra S; Hagmann, Cornelia F

    2015-02-01

    Despite improved survival, many preterm infants undergo subsequent neurodevelopmental impairment. To date, no neuroprotective therapies have been implemented into clinical practice. Erythropoietin, a haematopoietic cytokine used for treatment of anaemia of prematurity, has been shown to have neuroprotective and neuroregenerative effects on the brain in many experimental studies. The aim of the study was to assess the effect of recombinant human erythropoietin on the microstructural development of the cerebral white matter using tract-based spatial statistics performed at term equivalent age. A randomized, double-blind placebo-controlled, prospective multicentre study applying recombinant human erythropoietin in the first 42 h after preterm birth entitled 'Does erythropoietin improve outcome in preterm infant' was conducted in Switzerland (NCT00413946). Preterm infants were given recombinant human erythropoietin (3000 IU) or an equivalent volume of placebo (NaCl 0.9%) intravenously before 3 h of age after birth, at 12-18 h and at 36-42 h after birth. High resolution diffusion tensor imaging was obtained at 3 T in 58 preterm infants with mean (standard deviation) gestational age at birth 29.75 (1.44) weeks, and at scanning at 41.1 (2.09) weeks. Imaging was performed at a single centre. Voxel-wise statistical analysis of the fractional anisotropy data was carried out using tract-based spatial statistics to test for differences in fractional anisotropy between infants treated with recombinant human erythropoietin and placebo using a general linear model, covarying for the gestational age at birth and the corrected gestational age at the time of the scan. Preterm infants treated with recombinant human erythropoietin demonstrated increased fractional anisotropy in the genu and splenium of the corpus callosum, the anterior and posterior limbs of the internal capsule, and the corticospinal tract bilaterally. Mean fractional anisotropy was significantly higher in preterm

  6. Power assessment of lower limbs and strength asymmetry of soccer goalkeepers

    Directory of Open Access Journals (Sweden)

    František Zahálka

    2013-06-01

    Full Text Available BACKGROUND: Effective execution of a vertical jump depends mainly on the explosive power of the lower limbs and their symmetrical integration. Assessment of lower extremity bilateral asymmetries in soccer players is important for both injury prevention and performance. OBJECTIVE: The aim of this study was to identify and compare parameters of lower limb power in three different jump tests in elite soccer goalkeepers. A further aim was to describe and compare strength asymmetries of the force exerted by the lower limbs in the take-off phase in all tests. METHOD: The research group consisted of 25 elite soccer goalkeepers (age 26.5 ± 9.1 years, height 186.1 ± 7.8 cm, weight 86.7 ± 14.8 kg). Three types of vertical jump, countermovement jump with arms included (CMJFA), countermovement jump with arms excluded (CMJ) and squat jump (SJ), were performed on two force platforms. The following parameters were assessed: maximum force during the take-off phase Fmax (N) and its relative value Frel (N·kg-1), jump height h (m) and force asymmetry between limbs (∆Fmax). RM ANOVA was used for the statistical analysis. RESULTS: The type of jump had a significant effect on jump height (F(2, 48) = 109.66, p < .01, η² = .82). The highest jump was reached in CMJFA, higher by 11.1% (5.01 cm) than in CMJ and by 19.9% (8.98 cm) than in SJ. The type of jump significantly influenced Fmax (F(1.6, 38.7) = 44.29, p < .01, η² = .65) and Frel (F(2, 48) = 50.33, p < .01, η² = .68). Force asymmetry between limbs (∆Fmax) differed significantly with respect to the type of jump performed (F(1.3, 31.7) = 5.14, p < .05, η² = .18). The highest force asymmetry was found in the CMJFA test (∆Fmax = 8.61%), while the difference in the CMJ test was 7.06% and in the SJ test 3.95%. We found a significantly greater difference in ∆Fmax between CMJFA vs. SJ (p < .05) and CMJ vs. SJ (p < .01).
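
    The abstract does not give the formula behind ∆Fmax; expressing the between-limb difference relative to the stronger limb is one common convention, assumed in the sketch below with invented forces.

```python
# Between-limb force asymmetry index, relative to the stronger limb (assumed
# convention, not stated in the abstract); force values are invented.
def asymmetry_pct(f_left, f_right):
    return abs(f_left - f_right) / max(f_left, f_right) * 100.0

print(f"dFmax = {asymmetry_pct(f_left=1250.0, f_right=1148.0):.2f} %")
```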

  7. Predict! Teaching Statistics Using Informational Statistical Inference

    Science.gov (United States)

    Makar, Katie

    2013-01-01

    Statistics is one of the most widely used topics for everyday life in the school mathematics curriculum. Unfortunately, the statistics taught in schools focuses on calculations and procedures before students have a chance to see it as a useful and powerful tool. Researchers have found that a dominant view of statistics is as an assortment of tools…

  8. The Comparative Life Cycle Assessment of Power Generation from Lignocellulosic Biomass

    Directory of Open Access Journals (Sweden)

    Xinhua Shen

    2015-09-01

    Full Text Available In order to help solve the energy crisis and reduce emissions of greenhouse gases (GHG), renewable energy resources are being exploited for power generation. Because lignocellulosic biomass resources are abundant and renewable, various technologies are applied to derive biofuel and electricity from them. This paper focuses on power generation from lignocellulosic biomass and compares the effects of different feedstocks, transportation, and power generation technologies through life cycle assessment (LCA). The inputs and boundaries of the LCA vary with different feedstocks, such as forestry wood, agricultural residues, and fast-growing grass. For agricultural residues and fast-growing grass, the transportation cost from field to power plant is more critical. Three technologies for power generation are analyzed, both with and without pelletization of the lignocellulosic biomass. The GHG emissions also vary with different feedstocks and depend on burning technologies at different plant scales. The daily criteria pollutant emissions of power generation from different lignocellulosic biomass feedstocks were evaluated with the GREET.net 2014 life cycle assessment model. It is concluded that bio-power generation is critical given the urgency of the greenhouse effect.

  9. Outcomes of an international initiative for harmonization of low power and shutdown probabilistic safety assessment

    Directory of Open Access Journals (Sweden)

    Manna Giustino

    2010-01-01

    Full Text Available Many probabilistic safety assessment studies completed to date have demonstrated that the risk associated with low power and shutdown operation of nuclear power plants is often comparable with the risk of at-power operation, and that the main contributors to low power and shutdown risk often involve human factors. Since the beginning of nuclear power generation, human performance has been a very important factor in all phases of the plant lifecycle: design, commissioning, operation, maintenance, surveillance, modification, decommissioning and dismantling. The importance of this aspect has been confirmed by recent operating experience. This paper provides the insights and conclusions of a workshop organized in 2007 by the IAEA and the Joint Research Centre of the European Commission on the harmonization of low power and shutdown probabilistic safety assessment for WWER nuclear power plants. The major objective of the workshop was to compare the approaches and results of human reliability analyses and to gain insights into the enhanced handling of human factors.

  10. Some problems of the theory of quantum statistical systems with an energy spectrum of the fractional-power type

    Science.gov (United States)

    Alisultanov, Z. Z.; Meilanov, R. P.

    2012-10-01

    We consider the problem of the effective interaction potential in a quantum many-particle system leading to the fractional-power dispersion law. We show that passing to fractional-order derivatives is equivalent to introducing a pair interparticle potential. We consider the case of a degenerate electron gas. Using the van der Waals equation, we study the equation of state for systems with a fractional-power spectrum. We obtain a relation between the van der Waals constant and the phenomenological parameter α, the fractional-derivative order. We obtain a relation between energy, pressure, and volume for such systems: the coefficient of the thermal energy is a simple function of α. We consider Bose–Einstein condensation in a system with a fractional-power spectrum, comparing the critical condensation temperature with that of the ideal system, where α = 2.
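
    The abstract does not reproduce the calculation; for orientation, a standard textbook derivation (not taken from the paper) gives the condensation temperature of a three-dimensional Bose gas with dispersion ε = a p^α by fixing the particle density at zero chemical potential:

        n = \frac{4\pi}{(2\pi\hbar)^3} \int_0^\infty \frac{p^2 \, dp}{e^{a p^\alpha / k_B T_c} - 1}
          = \frac{4\pi}{\alpha (2\pi\hbar)^3} \left( \frac{k_B T_c}{a} \right)^{3/\alpha} \Gamma\!\left(\frac{3}{\alpha}\right) \zeta\!\left(\frac{3}{\alpha}\right),

        k_B T_c = a \left[ \frac{\alpha (2\pi\hbar)^3 \, n}{4\pi \, \Gamma(3/\alpha) \, \zeta(3/\alpha)} \right]^{\alpha/3} \propto n^{\alpha/3}.

    Setting α = 2 and a = 1/(2m) recovers the familiar ideal-gas result k_B T_c = (2πħ²/m)(n/ζ(3/2))^{2/3}, which is why α = 2 serves as the ideal-system reference above.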

  11. 75 FR 14637 - James A. FitzPatrick Nuclear Power Plant; Environmental Assessment and Finding of No Significant...

    Science.gov (United States)

    2010-03-26

    ... COMMISSION James A. FitzPatrick Nuclear Power Plant; Environmental Assessment and Finding of No Significant...), for the operation of the James A. FitzPatrick Nuclear Power Plant (JAFNPP) located in Oswego County... related to operation of James A. FitzPatrick Nuclear Power Plant Power Authority of the State of New...

  12. Investment risk analysis of China's wind power industry based on pre-assessment matrix

    Institute of Scientific and Technical Information of China (English)

    Yang Yong; Jiang Dongmei; Geng Jie; Fan Hua; Zhang Fashu

    2009-01-01

    Wind energy is clean and sustainable, and wind power does not rely on fossil fuels. So there is no fuel price risk, and of course no environmental costs such as carbon emissions. Because of these unique advantages, wind power has gradually become an important part of the strategy of sustainable development in China. Now, with the growing calls for global greenhouse gas emission reduction, wind power, as a clean and efficient energy, has huge potential in the face of climate change, energy security pressures and the need for energy. Wind power in China began to develop in the 1980s. In the first 20 years, the speed of development was slow, but since 2004 it has grown extremely rapidly. In order to study the development mechanism of China's wind power industry, this paper investigated and analyzed the status quo of the wind power industry in China and found that: (1) the development trend of the wind power industry in China shows exponential growth; (2) China's installed capacity of wind power is still smaller than that of some other countries; (3) new subsidy policies bring development opportunities to the wind power industry in China; (4) the sectors of the wind power industry are growing unevenly; (5) the owners of proposed wind farms are too optimistic, even though built wind farms have had many problems. In addition, using the methodology of game theory, this paper constructs a matrix for pre-assessing the risks of China's wind power industry, discussing potential risk factors (wind farm construction, wind turbine production, and the parts and components manufacturing industry) against risk indicators such as R&D, patents, domestic policy, international policy, product quality and market regulation, in order to provide a scientific assessment and self-assessment tool for investors and implementers and also to promote the further
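
    A minimal sketch of how such a pre-assessment matrix might be organized, scoring each industry segment against the risk indicators and weighting the indicators; all scores and weights below are hypothetical, not taken from the paper:

        import numpy as np

        factors = ["wind_farm_construction", "turbine_production", "components_manufacturing"]
        indicators = ["R&D", "patents", "domestic_policy", "intl_policy", "quality", "regulation"]
        # Hypothetical risk scores on a 1-5 scale (rows: factors, cols: indicators)
        scores = np.array([
            [3, 2, 1, 2, 4, 3],
            [4, 3, 2, 3, 3, 2],
            [5, 4, 2, 3, 4, 3],
        ])
        weights = np.array([0.20, 0.15, 0.20, 0.10, 0.20, 0.15])  # sums to 1
        for factor, risk in zip(factors, scores @ weights):
            print(f"{factor:26s} weighted risk = {risk:.2f}")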

  13. Preliminary environmental assessment for the satellite power system (SPS). Revision 1. Volume 1. Executive summary

    Energy Technology Data Exchange (ETDEWEB)

    1980-01-01

    A preliminary assessment of the environmental impacts of the proposed satellite power system (SPS) is summarized here. In this system, satellites would collect solar energy in space, convert it to microwaves, and transmit the microwaves to receiving antennas (rectennas) on earth. At the rectennas, the microwaves would be converted to electricity. The assessment considers microwave and nonmicrowave effects on the terrestrial environment and human health, atmospheric effects, and disruption of communications and other electromagnetic systems.

  14. The mediation of environmental assessment's influence: What role for power?

    Energy Technology Data Exchange (ETDEWEB)

    Cashmore, Matthew, E-mail: cashmore@plan.aau.dk [Danish Centre for Environmental Assessment, Department of Development and Planning, Aalborg University Copenhagen, A.C. Meyers Vaenge 15, DK-2450 Copenhagen SV (Denmark); Axelsson, Anna [Naturskyddsforeningen, Box 4625, 116 91 Stockholm (Sweden)

    2013-02-15

    Considerable empirical research has been conducted on why policy tools such as environmental assessment (EA) often appear to have 'little effect' (after Weiss) on policy decisions. This article revisits this debate but looks at a mediating factor that has received limited attention to date in the context of EA: political power. Using a tripartite analytical framework, a comparative analysis of the influence and significance of power in mediating environmental policy integration is undertaken. Power is analysed, albeit partially, through an exploration of institutions that underpin social order. Empirically, the research examines the case of a new approach to policy-level EA (essentially a form of Strategic Environmental Assessment) developed by the World Bank and its trial application to urban environmental governance and planning in Dhaka mega-city, Bangladesh. The research results demonstrate that power was intimately involved in mediating the influence of the policy EA approach, in both positive (enabling) and negative (constraining) ways. It is suggested that the policy EA approach was ultimately a manifestation of a corporate strategy to maintain the powerful position of the World Bank as a leading authority on international development which focuses on knowledge generation. Furthermore, as constitutive of an institution and reflecting the worldviews of its proponents, the development of a new approach to EA also represents a significant power play. This leads us to, firstly, emphasise the concepts of strategy and intentionality in theorising how and why EA tools are employed, succeed and fail; and secondly, reflect on the reasons why power has received such limited attention to date in EA scholarship. - Highlights: ► Conducts empirical research on the neglected issue of power. ► Employs an interpretation of power in which it is viewed as a productive phenomenon.

  15. Self-reported gait unsteadiness in mildly impaired neurological patients: an objective assessment through statistical gait analysis

    Directory of Open Access Journals (Sweden)

    Benedetti Maria

    2012-08-01

    Full Text Available Abstract Background Self-reported gait unsteadiness is often a problem in neurological patients without any clinical evidence of ataxia, because it leads to reduced activity and limitations in function. However, in the literature there are only a few papers that address this disorder. The aim of this study is to objectively identify subclinical abnormal gait strategies in these patients. Methods Eleven patients affected by self-reported unsteadiness during gait (4 TBI and 7 MS) and ten healthy subjects underwent gait analysis while walking back and forth along a 15-m corridor. Time-distance parameters, ankle sagittal motion, and muscular activity during gait were acquired by a wearable gait analysis system (Step32, DemItalia, Italy) over a high number of successive strides in the same walk and statistically processed. Both self-selected gait speed and high speed were tested under relatively unconstrained conditions. Non-parametric statistical analysis (Mann–Whitney and Wilcoxon tests) was carried out on the means of the data of the two examined groups. Results The main findings, with data adjusted for velocity of progression, show that increased double support and reduced velocity of progression are the main parameters discriminating patients with self-reported unsteadiness from healthy controls. The muscular intervals of activation showed a significant increase in the activity duration of the Rectus Femoris and Tibialis Anterior in patients with respect to the control group at high speed. Conclusions Patients with a subjective sensation of instability, not clinically documented, walk with altered strategies, especially at high gait speed. This is thought to depend on the mechanisms of postural control and coordination. The gait anomalies detected might explain the symptoms reported by the patients and allow for a more focused treatment design. The wearable gait analysis system used for long-distance statistical walking assessment was able to detect
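
    A minimal sketch of the nonparametric group comparison described above, on synthetic double-support durations (% of gait cycle); the group sizes match the study (11 patients, 10 controls) but the values are invented, and scipy stands in for whatever software was actually used:

        import numpy as np
        from scipy.stats import mannwhitneyu

        rng = np.random.default_rng(0)
        patients = rng.normal(28.0, 3.0, 11)  # hypothetically longer double support
        controls = rng.normal(24.0, 2.5, 10)

        stat, p = mannwhitneyu(patients, controls, alternative="two-sided")
        print(f"U = {stat:.1f}, p = {p:.4f}")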

  16. A probabilistic seismic risk assessment procedure for nuclear power plants: (II) Application

    Science.gov (United States)

    Huang, Y.-N.; Whittaker, A.S.; Luco, N.

    2011-01-01

    This paper presents the procedures and results of intensity- and time-based seismic risk assessments of a sample nuclear power plant (NPP) to demonstrate the risk-assessment methodology proposed in its companion paper. The intensity-based assessments include three sets of sensitivity studies to identify the impact of the following factors on the seismic vulnerability of the sample NPP, namely: (1) the description of fragility curves for primary and secondary components of NPPs, (2) the number of simulations of NPP response required for risk assessment, and (3) the correlation in responses between NPP components. The time-based assessment is performed as a series of intensity-based assessments. The studies illustrate the utility of the response-based fragility curves and the inclusion of the correlation in the responses of NPP components directly in the risk computation. © 2011 Published by Elsevier B.V.
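
    The abstract does not detail the risk integral; a generic sketch of how intensity-based results combine into a time-based (annualized) estimate is to convolve a component fragility curve with seismic hazard-curve increments. The fragility and hazard parameters below are hypothetical:

        import numpy as np
        from scipy.stats import norm

        # Hypothetical lognormal fragility: median capacity theta (g), dispersion beta
        theta, beta = 0.9, 0.4
        fragility = lambda im: norm.cdf(np.log(im / theta) / beta)

        # Hypothetical power-law hazard: annual frequency of exceeding intensity im
        hazard = lambda im: 4e-4 * im**-2.5

        im = np.linspace(0.05, 3.0, 300)        # intensity-measure grid (g)
        d_lambda = -np.diff(hazard(im))         # annual frequency in each im bin
        im_mid = 0.5 * (im[:-1] + im[1:])
        lam_f = np.sum(fragility(im_mid) * d_lambda)
        print(f"mean annual frequency of failure ~ {lam_f:.2e} / yr")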

  17. Analysis and assessment on heavy metal sources in the coastal soils developed from alluvial deposits using multivariate statistical methods.

    Science.gov (United States)

    Li, Jinling; He, Ming; Han, Wei; Gu, Yifan

    2009-05-30

    An investigation of heavy metal sources, i.e., Cu, Zn, Ni, Pb, Cr, and Cd in the coastal soils of Shanghai, China, was conducted using multivariate statistical methods (principal component analysis, clustering analysis, and correlation analysis). All the results of the multivariate analysis showed that: (i) Cu, Ni, Pb, and Cd had anthropogenic sources (e.g., overuse of chemical fertilizers and pesticides, industrial and municipal discharges, animal wastes, sewage irrigation, etc.); (ii) Zn and Cr were associated with parent materials and therefore had natural sources (e.g., the weathering of parent materials and subsequent pedogenesis in the alluvial deposits). The content of heavy metals in the soils was greatly affected by soil formation, atmospheric deposition, and human activities. These findings provide essential information on the possible sources of heavy metals, which should contribute to the monitoring and assessment of agricultural soils in regions worldwide.
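
    A minimal sketch of the PCA step of such a source apportionment, run on hypothetical soil concentration data; scikit-learn is assumed for convenience, as the study does not name its software:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(1)
        metals = ["Cu", "Zn", "Ni", "Pb", "Cr", "Cd"]
        X = rng.lognormal(mean=2.0, sigma=0.4, size=(50, 6))  # 50 hypothetical sites

        Z = StandardScaler().fit_transform(X)  # standardize before PCA
        pca = PCA(n_components=2).fit(Z)
        print("explained variance:", pca.explained_variance_ratio_.round(2))
        print("PC1 loadings:", dict(zip(metals, pca.components_[0].round(2))))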

  18. Assessment and improvement of statistical tools for comparative proteomics analysis of sparse data sets with few experimental replicates

    DEFF Research Database (Denmark)

    Schwämmle, Veit; León, Ileana R.; Jensen, Ole Nørregaard

    2013-01-01

    Large-scale quantitative analyses of biological systems are often performed with few replicate experiments, leading to multiple nonidentical data sets due to missing values. For example, mass spectrometry driven proteomics experiments are frequently performed with few biological or technical...... changes on the peptide level, for example, in phospho-proteomics experiments. In order to assess the extent of this problem and the implications for large-scale proteome analysis, we investigated and optimized the performance of three statistical approaches by using simulated and experimental data sets...... with varying numbers of missing values. We applied three tools, including standard t test, moderated t test, also known as limma, and rank products for the detection of significantly changing features in simulated and experimental proteomics data sets with missing values. The rank product method was improved...
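
    A minimal sketch of the rank product statistic mentioned above, applied to simulated log-ratios with missing values; taking the geometric mean of within-replicate ranks over the available (non-missing) replicates is one common way to handle NaNs, and all data here are synthetic:

        import numpy as np

        rng = np.random.default_rng(2)
        n_feat, n_rep = 100, 3
        data = rng.normal(0.0, 1.0, (n_feat, n_rep))
        data[:5] += 2.0                              # five truly up-regulated features
        data[rng.random(data.shape) < 0.2] = np.nan  # ~20% missing values

        # Rank features within each replicate (rank 1 = strongest up-regulation)
        ranks = np.full_like(data, np.nan)
        for k in range(n_rep):
            col = data[:, k]
            ok = ~np.isnan(col)
            order = np.argsort(-col[ok])
            r = np.empty(ok.sum())
            r[order] = np.arange(1, ok.sum() + 1)
            ranks[ok, k] = r

        # Geometric mean of ranks over available replicates; small = significant
        rp = np.full(n_feat, np.inf)
        has_obs = ~np.isnan(ranks).all(axis=1)
        rp[has_obs] = np.exp(np.nanmean(np.log(ranks[has_obs]), axis=1))
        print("top 5 features by rank product:", np.argsort(rp)[:5])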

  19. Assessment of ionospheric Joule heating by GUMICS-4 MHD simulation, AMIE, and satellite-based statistics: towards a synthesis

    Directory of Open Access Journals (Sweden)

    M. Palmroth

    2005-09-01

    Full Text Available We investigate the Northern Hemisphere Joule heating from several observational and computational sources with the purpose of calibrating a previously identified functional dependence between solar wind parameters and ionospheric total energy consumption computed from a global magnetohydrodynamic (MHD simulation (Grand Unified Magnetosphere Ionosphere Coupling Simulation, GUMICS-4. In this paper, the calibration focuses on determining the amount and temporal characteristics of Northern Hemisphere Joule heating. Joule heating during a substorm is estimated from global observations, including electric fields provided by Super Dual Auroral Network (SuperDARN and Pedersen conductances given by the ultraviolet (UV and X-ray imagers on board the Polar satellite. Furthermore, Joule heating is assessed from several activity index proxies, large statistical surveys, assimilative data methods (AMIE, and the global MHD simulation GUMICS-4. We show that the temporal and spatial variation of the Joule heating computed from the GUMICS-4 simulation is consistent with observational and statistical methods. However, the different observational methods do not give a consistent estimate for the magnitude of the global Joule heating. We suggest that multiplying the GUMICS-4 total Joule heating by a factor of 10 approximates the observed Joule heating reasonably well. The lesser amount of Joule heating in GUMICS-4 is essentially caused by weaker Region 2 currents and polar cap potentials. We also show by theoretical arguments that multiplying independent measurements of averaged electric fields and Pedersen conductances yields an overestimation of Joule heating.

    Keywords. Ionosphere (Auroral ionosphere; Modeling and forecasting; Electric fields and currents)
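
    One plausible reading of the closing statistical argument, with synthetic numbers: when the local Pedersen conductance and electric field are anticorrelated, <σE²> = <σ><E²> + Cov(σ, E²) < <σ><E²>, so multiplying separately averaged quantities overestimates the true average Joule heating. The anticorrelation model below is invented purely for illustration:

        import numpy as np

        rng = np.random.default_rng(3)
        E = rng.lognormal(0.0, 0.5, 100_000)  # electric field (arbitrary units)
        sigma = 5.0 / (1.0 + E)               # conductance, anticorrelated with E

        q_true = np.mean(sigma * E**2)                # average of actual Joule heating
        q_separate = np.mean(sigma) * np.mean(E**2)   # product of separate averages
        print(f"<sigma E^2>  = {q_true:.3f}")
        print(f"<sigma><E^2> = {q_separate:.3f}  (overestimate)")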

  20. Integrated Wind Power Planning Tool

    DEFF Research Database (Denmark)

    Rosgaard, Martin H.; Hahmann, Andrea N.; Nielsen, Torben S.;

    This poster presents the Public Service Obligation (PSO) funded project PSO 10464 "Integrated Wind Power Planning Tool". The project goal is to integrate a Numerical Weather Prediction (NWP) model with statistical tools in order to assess wind power fluctuations, with focus on short term...... forecasting for existing wind farms, as well as long term power system planning for future wind farms....