WorldWideScience

Sample records for sufficiency statistics

  1. Information geometry and sufficient statistics

    Czech Academy of Sciences Publication Activity Database

    Ay, N.; Jost, J.; Le, Hong-Van; Schwachhöfer, L.

    2015-01-01

    Roč. 162, 1-2 (2015), s. 327-364 ISSN 0178-8051 Institutional support: RVO:67985840 Keywords : Fisher quadratic form * Amari-Chentsov tensor * sufficient statistic Subject RIV: BA - General Mathematics Impact factor: 2.204, year: 2015 http://link.springer.com/article/10.1007/s00440-014-0574-8

  2. Role of sufficient statistics in stochastic thermodynamics and its implication to sensory adaptation

    Science.gov (United States)

    Matsumoto, Takumi; Sagawa, Takahiro

    2018-04-01

A sufficient statistic is a significant concept in statistics: a random variable that retains all of the information required for an inference task. We investigate the roles of sufficient statistics and related quantities in stochastic thermodynamics. Specifically, we prove that for general continuous-time bipartite networks, the existence of a sufficient statistic implies that an informational quantity called the sensory capacity attains its maximum. Since maximal sensory capacity imposes the constraint that the energetic efficiency cannot exceed one-half, our result implies that the existence of a sufficient statistic is inevitably accompanied by energetic dissipation. We also show that, in a particular parameter region of linear Langevin systems, there exists an optimal noise intensity at which the sensory capacity, the information-thermodynamic efficiency, and the total entropy production are optimized at the same time. We apply our general result to a model of sensory adaptation of E. coli and find that the sensory capacity is nearly maximal with experimentally realistic parameters.

  3. Information trimming: Sufficient statistics, mutual information, and predictability from effective channel states

    Science.gov (United States)

    James, Ryan G.; Mahoney, John R.; Crutchfield, James P.

    2017-06-01

One of the most basic characterizations of the relationship between two random variables, X and Y, is the value of their mutual information. Unfortunately, calculating it analytically and estimating it empirically are often stymied by the extremely large dimension of the variables. One might hope to replace such a high-dimensional variable by a smaller one that preserves its relationship with the other. It is well known that either X (or Y) can be replaced by its minimal sufficient statistic about Y (or X) while preserving the mutual information. While intuitively reasonable, it is not obvious or straightforward that both variables can be replaced simultaneously. We demonstrate that this is in fact possible: the information X's minimal sufficient statistic preserves about Y is exactly the information that Y's minimal sufficient statistic preserves about X. We call this procedure information trimming. As an important corollary, we consider the case where one variable is a stochastic process' past and the other its future. In this case, the mutual information is the channel transmission rate between the channel's effective states. That is, the past-future mutual information (the excess entropy) is the amount of information about the future that can be predicted using the past. Translating our result about minimal sufficient statistics, this is equivalent to the mutual information between the forward- and reverse-time causal states of computational mechanics. We close by discussing multivariate extensions to this use of minimal sufficient statistics.
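
    The core identity is easiest to see in the discrete case: replacing X by any lumping of its values that leaves the conditional distribution p(Y|x) unchanged (a sufficient statistic of X about Y) preserves I(X;Y). A minimal numerical sketch with a hypothetical joint table:

    ```python
    import numpy as np

    def mutual_information(p_xy):
        """I(X;Y) in bits for a discrete joint probability table p_xy."""
        px = p_xy.sum(axis=1, keepdims=True)
        py = p_xy.sum(axis=0, keepdims=True)
        nz = p_xy > 0
        return float((p_xy[nz] * np.log2(p_xy[nz] / (px @ py)[nz])).sum())

    # Hypothetical joint distribution: rows x0 and x1 share the same
    # conditional p(Y|x), so lumping them together is a sufficient
    # statistic of X about Y and must preserve the mutual information.
    p = np.array([[0.10, 0.10],
                  [0.20, 0.20],
                  [0.30, 0.10]])
    lumped = np.array([p[0] + p[1], p[2]])

    print(mutual_information(p), mutual_information(lumped))
    ```

    The two printed values agree to floating-point precision, illustrating the lossless "trimming" of X; the paper's result is that the same can be done to X and Y simultaneously.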

  4. Powerful Statistical Inference for Nested Data Using Sufficient Summary Statistics

    Science.gov (United States)

    Dowding, Irene; Haufe, Stefan

    2018-01-01

    Hierarchically-organized data arise naturally in many psychology and neuroscience studies. As the standard assumption of independent and identically distributed samples does not hold for such data, two important problems are to accurately estimate group-level effect sizes, and to obtain powerful statistical tests against group-level null hypotheses. A common approach is to summarize subject-level data by a single quantity per subject, which is often the mean or the difference between class means, and treat these as samples in a group-level t-test. This “naive” approach is, however, suboptimal in terms of statistical power, as it ignores information about the intra-subject variance. To address this issue, we review several approaches to deal with nested data, with a focus on methods that are easy to implement. With what we call the sufficient-summary-statistic approach, we highlight a computationally efficient technique that can improve statistical power by taking into account within-subject variances, and we provide step-by-step instructions on how to apply this approach to a number of frequently-used measures of effect size. The properties of the reviewed approaches and the potential benefits over a group-level t-test are quantitatively assessed on simulated data and demonstrated on EEG data from a simulated-driving experiment. PMID:29615885
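
    The gist of exploiting within-subject variance can be sketched with inverse-variance (precision) weighting of subject-level effect estimates. This is a generic illustration of the idea, not the authors' exact procedure, and all numbers are simulated:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical nested data: 20 subjects with unequal trial counts all
    # share a true effect of 0.5; individual trials are noisy (sd = 2.0).
    true_effect = 0.5
    subject_means, subject_vars = [], []
    for _ in range(20):
        n_trials = int(rng.integers(10, 200))
        trials = rng.normal(true_effect, 2.0, size=n_trials)
        subject_means.append(trials.mean())
        subject_vars.append(trials.var(ddof=1) / n_trials)  # variance of the subject mean

    m = np.array(subject_means)
    v = np.array(subject_vars)

    naive_estimate = m.mean()                    # "naive" approach: ignores precision
    w = 1.0 / v                                  # precision weights
    weighted_estimate = (w * m).sum() / w.sum()  # precision-weighted group effect
    weighted_se = np.sqrt(1.0 / w.sum())         # its standard error

    print(naive_estimate, weighted_estimate, weighted_se)
    ```

    Subjects measured with many trials (small within-subject variance) pull the weighted estimate more strongly, which is the source of the power gain over the naive group-level t-test.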

  5. Comparison of methods for calculating conditional expectations of sufficient statistics for continuous time Markov chains

    DEFF Research Database (Denmark)

    Tataru, Paula Cristina; Hobolth, Asger

    2011-01-01

BACKGROUND: Continuous time Markov chains (CTMCs) are a widely used model for describing the evolution of DNA sequences on the nucleotide, amino acid or codon level. The sufficient statistics for CTMCs are the time spent in a state and the number of changes between any two states. In applications, past evolutionary events (exact times and types of changes) are inaccessible and the past must be inferred from DNA sequence data observed in the present. RESULTS: We describe and implement three algorithms for computing linear combinations of expected values of the sufficient statistics, conditioned on the end-points of the chain. The implementation in R of the algorithms is available at www.birc.au.dk/~paula/. CONCLUSIONS: We use two different models to analyze the accuracy and eight experiments to investigate the speed of the three algorithms. We find that they have similar accuracy and that EXPM is the slowest method. Furthermore, we find that UNI is usually faster than EVD.

  6. Comparison of methods for calculating conditional expectations of sufficient statistics for continuous time Markov chains.

    Science.gov (United States)

    Tataru, Paula; Hobolth, Asger

    2011-12-05

Continuous time Markov chains (CTMCs) are a widely used model for describing the evolution of DNA sequences on the nucleotide, amino acid or codon level. The sufficient statistics for CTMCs are the time spent in a state and the number of changes between any two states. In applications, past evolutionary events (exact times and types of changes) are inaccessible and the past must be inferred from DNA sequence data observed in the present. We describe and implement three algorithms for computing linear combinations of expected values of the sufficient statistics, conditioned on the end-points of the chain, and compare their performance with respect to accuracy and running time. The first algorithm is based on an eigenvalue decomposition of the rate matrix (EVD), the second on uniformization (UNI), and the third on integrals of matrix exponentials (EXPM). The implementation in R of the algorithms is available at http://www.birc.au.dk/~paula/. We use two different models to analyze the accuracy and eight experiments to investigate the speed of the three algorithms. We find that they have similar accuracy and that EXPM is the slowest method. Furthermore, we find that UNI is usually faster than EVD.

  7. Comparison of methods for calculating conditional expectations of sufficient statistics for continuous time Markov chains

    Directory of Open Access Journals (Sweden)

    Tataru Paula

    2011-12-01

Full Text Available Abstract Background Continuous time Markov chains (CTMCs) are a widely used model for describing the evolution of DNA sequences on the nucleotide, amino acid or codon level. The sufficient statistics for CTMCs are the time spent in a state and the number of changes between any two states. In applications, past evolutionary events (exact times and types of changes) are inaccessible and the past must be inferred from DNA sequence data observed in the present. Results We describe and implement three algorithms for computing linear combinations of expected values of the sufficient statistics, conditioned on the end-points of the chain, and compare their performance with respect to accuracy and running time. The first algorithm is based on an eigenvalue decomposition of the rate matrix (EVD), the second on uniformization (UNI), and the third on integrals of matrix exponentials (EXPM). The implementation in R of the algorithms is available at http://www.birc.au.dk/~paula/. Conclusions We use two different models to analyze the accuracy and eight experiments to investigate the speed of the three algorithms. We find that they have similar accuracy and that EXPM is the slowest method. Furthermore, we find that UNI is usually faster than EVD.
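
    The quantity these algorithms compute can be sketched for a toy case. Writing Q = U diag(d) U^{-1} (the EVD idea), the expected time spent in state k on [0, T] given end-points a and b is (1/P_ab(T)) ∫_0^T [e^{Qs}]_{a,k} [e^{Q(T-s)}]_{k,b} ds. The paper's algorithms evaluate such integrals analytically; the sketch below, with a made-up 2-state rate matrix, approximates the integral by simple trapezoidal quadrature just to show the object being computed:

    ```python
    import numpy as np

    # Hypothetical 2-state rate matrix, interval length, and observed end-points.
    Q = np.array([[-0.3, 0.3],
                  [0.5, -0.5]])
    T, a, b = 2.0, 0, 1

    # EVD: Q = U diag(d) U^{-1}, so exp(Q*s) = U diag(exp(d*s)) U^{-1}.
    d, U = np.linalg.eig(Q)
    Uinv = np.linalg.inv(U)

    def P(s):
        """Transition probability matrix exp(Q*s) via the eigendecomposition."""
        return ((U * np.exp(d * s)) @ Uinv).real

    # E[time in state k | X_0 = a, X_T = b]
    #   = (1/P_ab(T)) * integral_0^T P(s)[a, k] * P(T - s)[k, b] ds,
    # approximated here with the trapezoidal rule on a fine grid.
    grid = np.linspace(0.0, T, 2001)
    dx = grid[1] - grid[0]
    p_ab = P(T)[a, b]
    expected_time = np.zeros(2)
    for k in range(2):
        vals = np.array([P(s)[a, k] * P(T - s)[k, b] for s in grid])
        expected_time[k] = (vals[:-1] + vals[1:]).sum() * dx / 2.0 / p_ab

    # Sanity check: the conditional occupation times over [0, T] must sum to T.
    print(expected_time, expected_time.sum())
    ```

    Summing the conditional occupation times over all states recovers the interval length T, a useful check on any implementation.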

  8. The Relationship between Organizational Support Perceptions and Self-Sufficiencies of Logistics Sector Employees

    Directory of Open Access Journals (Sweden)

    Sefer Gumus

    2016-01-01

Full Text Available This study was performed to examine the relationship between the organizational support perceptions and self-sufficiency levels of logistics sector employees, and to determine whether these differ according to certain characteristics. A questionnaire consisting of the perceived organizational support scale, the general self-sufficiency scale and a personal information form was administered to 124 employees of 3 separate logistics firms operating in Istanbul. The data obtained from the questionnaire were analyzed using the SPSS 17.0 statistical software package. In the assessment of the data, descriptive characteristics of employees were determined by frequency and percentage statistics, and self-sufficiency and perceived organizational support levels by mean and standard deviation statistics. The t-test, Tukey test and one-way ANOVA were used to determine how employees' self-sufficiency and perceived organizational support levels differ according to descriptive characteristics, and correlation analysis was used to determine the relationship between employees' self-sufficiency and perceived organizational support levels. In conclusion, a statistically significant relationship was found between the organizational support and self-sufficiency levels perceived by logistics sector employees: when employees' perceived organizational support levels increase, their self-sufficiency levels also increase, and when perceived organizational support levels decrease, self-sufficiency levels also decrease.

  9. Shot Group Statistics for Small Arms Applications

    Science.gov (United States)

    2017-06-01

…if its probability distribution is known with sufficient accuracy, then it can be used to make a sound statistical inference on the unknown population standard deviations of the x and y impact-point positions. The dispersion measures treated in this report…

  10. Sufficiency Grounded as Sufficiently Free: A Reply to Shlomi Segall

    DEFF Research Database (Denmark)

    Nielsen, Lasse

    2016-01-01

    be grounded on (i) any personal value, nor (ii) any impersonal value. Consequently, sufficientarianism is groundless. This article contains a rejoinder to this critique. Its main claim is that the value of autonomy holds strong potential for grounding sufficiency. It argues, firstly, that autonomy carries...... both personal value for its recipient as well as impersonal value, and that both of these values are suitable for grounding sufficiency. It thus follows that we should reject both (i) and (ii). Secondly, although autonomy is presumably the strongest candidate for grounding sufficiency, the article...... provides some counterargument to Segall’s rejection of the other candidates—the impersonal value of virtue; the personal value for the allocator; and the personal value for others. If the arguments are sound, they show that we need not worry about sufficientarianism being groundless....

  11. Entrepreneurship by any other name: self-sufficiency versus innovation.

    Science.gov (United States)

    Parker Harris, Sarah; Caldwell, Kate; Renko, Maija

    2014-01-01

    Entrepreneurship has been promoted as an innovative strategy to address the employment of people with disabilities. Research has predominantly focused on the self-sufficiency aspect without fully integrating entrepreneurship literature in the areas of theory, systems change, and demonstration projects. Subsequently there are gaps in services, policies, and research in this field that, in turn, have limited our understanding of the support needs and barriers or facilitators of entrepreneurs with disabilities. A thorough analysis of the literature in these areas led to the development of two core concepts that need to be addressed in integrating entrepreneurship into disability employment research and policy: clarity in operational definitions and better disability statistics and outcome measures. This article interrogates existing research and policy efforts in this regard to argue for a necessary shift in the field from focusing on entrepreneurship as self-sufficiency to understanding entrepreneurship as innovation.

  12. Statistical mechanics

    CERN Document Server

    Davidson, Norman

    2003-01-01

Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses. Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody…

  13. Retrocausal Effects as a Consequence of Quantum Mechanics Refined to Accommodate the Principle of Sufficient Reason

    Energy Technology Data Exchange (ETDEWEB)

    Stapp, Henry P.

    2011-05-10

    The principle of sufficient reason asserts that anything that happens does so for a reason: no definite state of affairs can come into being unless there is a sufficient reason why that particular thing should happen. This principle is usually attributed to Leibniz, although the first recorded Western philosopher to use it was Anaximander of Miletus. The demand that nature be rational, in the sense that it be compatible with the principle of sufficient reason, conflicts with a basic feature of contemporary orthodox physical theory, namely the notion that nature's response to the probing action of an observer is determined by pure chance, and hence on the basis of absolutely no reason at all. This appeal to pure chance can be deemed to have no rational fundamental place in reason-based Western science. It is argued here, on the basis of the other basic principles of quantum physics, that in a world that conforms to the principle of sufficient reason, the usual quantum statistical rules will naturally emerge at the pragmatic level, in cases where the reason behind nature's choice of response is unknown, but that the usual statistics can become biased in an empirically manifest way when the reason for the choice is empirically identifiable. It is shown here that if the statistical laws of quantum mechanics were to be biased in this way then the basically forward-in-time unfolding of empirical reality described by orthodox quantum mechanics would generate the appearances of backward-time-effects of the kind that have been reported in the scientific literature.

  14. Retrocausal Effects as a Consequence of Quantum Mechanics Refined to Accommodate the Principle of Sufficient Reason

    International Nuclear Information System (INIS)

    Stapp, Henry P.

    2011-01-01

    The principle of sufficient reason asserts that anything that happens does so for a reason: no definite state of affairs can come into being unless there is a sufficient reason why that particular thing should happen. This principle is usually attributed to Leibniz, although the first recorded Western philosopher to use it was Anaximander of Miletus. The demand that nature be rational, in the sense that it be compatible with the principle of sufficient reason, conflicts with a basic feature of contemporary orthodox physical theory, namely the notion that nature's response to the probing action of an observer is determined by pure chance, and hence on the basis of absolutely no reason at all. This appeal to pure chance can be deemed to have no rational fundamental place in reason-based Western science. It is argued here, on the basis of the other basic principles of quantum physics, that in a world that conforms to the principle of sufficient reason, the usual quantum statistical rules will naturally emerge at the pragmatic level, in cases where the reason behind nature's choice of response is unknown, but that the usual statistics can become biased in an empirically manifest way when the reason for the choice is empirically identifiable. It is shown here that if the statistical laws of quantum mechanics were to be biased in this way then the basically forward-in-time unfolding of empirical reality described by orthodox quantum mechanics would generate the appearances of backward-time-effects of the kind that have been reported in the scientific literature.

  15. Statistical correlations in an ideal gas of particles obeying fractional exclusion statistics.

    Science.gov (United States)

    Pellegrino, F M D; Angilella, G G N; March, N H; Pucci, R

    2007-12-01

After a brief discussion of the concepts of fractional exchange and fractional exclusion statistics, we report partly analytical and partly numerical results on thermodynamic properties of assemblies of particles obeying fractional exclusion statistics. The effect of dimensionality is one focal point, the ratio μ/(k_B T) of chemical potential to thermal energy being obtained numerically as a function of a scaled particle density. Pair correlation functions are also presented as a function of the statistical parameter, with Friedel oscillations developing close to the fermion limit, for sufficiently large density.

  16. Retrocausal Effects As A Consequence of Orthodox Quantum Mechanics Refined To Accommodate The Principle Of Sufficient Reason

    Science.gov (United States)

    Stapp, Henry P.

    2011-11-01

    The principle of sufficient reason asserts that anything that happens does so for a reason: no definite state of affairs can come into being unless there is a sufficient reason why that particular thing should happen. This principle is usually attributed to Leibniz, although the first recorded Western philosopher to use it was Anaximander of Miletus. The demand that nature be rational, in the sense that it be compatible with the principle of sufficient reason, conflicts with a basic feature of contemporary orthodox physical theory, namely the notion that nature's response to the probing action of an observer is determined by pure chance, and hence on the basis of absolutely no reason at all. This appeal to pure chance can be deemed to have no rational fundamental place in reason-based Western science. It is argued here, on the basis of the other basic principles of quantum physics, that in a world that conforms to the principle of sufficient reason, the usual quantum statistical rules will naturally emerge at the pragmatic level, in cases where the reason behind nature's choice of response is unknown, but that the usual statistics can become biased in an empirically manifest way when the reason for the choice is empirically identifiable. It is shown here that if the statistical laws of quantum mechanics were to be biased in this way then the basically forward-in-time unfolding of empirical reality described by orthodox quantum mechanics would generate the appearances of backward-time-effects of the kind that have been reported in the scientific literature.

  17. Introduction to Statistics for Biomedical Engineers

    CERN Document Server

    Ropella, Kristina

    2007-01-01

There are many books written about statistics, some brief, some detailed, some humorous, some colorful, and some quite dry. Each of these texts is designed for a specific audience. Too often, texts about statistics have been rather theoretical and intimidating for those not practicing statistical analysis on a routine basis. Thus, many engineers and scientists, who need to use statistics much more frequently than calculus or differential equations, lack sufficient knowledge of the use of statistics. The audience that is addressed in this text is the university-level biomedical engineering stud…

  18. Statistical mechanics

    CERN Document Server

    Jana, Madhusudan

    2015-01-01

This Statistical Mechanics text is self-contained, written in a lucid manner, and keeps in mind the examination system of the universities. The need to study this subject and its relation to thermodynamics are discussed in detail. Starting from the Liouville theorem, statistical mechanics is gradually and thoroughly developed. All three types of statistical distribution functions are derived separately, with their range of applications and limitations. Non-interacting ideal Bose and Fermi gases are discussed thoroughly. Properties of liquid He-II and the corresponding models are depicted. White dwarfs and condensed matter physics, and transport phenomena - thermal and electrical conductivity, the Hall effect, magnetoresistance, viscosity, diffusion, etc. - are discussed. A basic understanding of the Ising model is given to explain phase transitions. The book ends with a detailed coverage of the method of ensembles (namely microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar...

  19. The statistical-inference approach to generalized thermodynamics

    International Nuclear Information System (INIS)

    Lavenda, B.H.; Scherer, C.

    1987-01-01

Limit theorems, such as the central-limit theorem and the weak law of large numbers, are applicable to statistical thermodynamics for sufficiently large sample sizes of independent and identically distributed observations performed on extensive thermodynamic (chance) variables. The estimation of the intensive thermodynamic quantities is a problem in parametric statistical estimation. The normal approximation to the Gibbs distribution is justified by the analysis of large deviations. Statistical thermodynamics is generalized to include the statistical estimation of variance as well as mean values.

  20. Statistical utilitarianism

    OpenAIRE

    Pivato, Marcus

    2013-01-01

    We show that, in a sufficiently large population satisfying certain statistical regularities, it is often possible to accurately estimate the utilitarian social welfare function, even if we only have very noisy data about individual utility functions and interpersonal utility comparisons. In particular, we show that it is often possible to identify an optimal or close-to-optimal utilitarian social choice using voting rules such as the Borda rule, approval voting, relative utilitarianism, or a...
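
    The Borda rule mentioned above is simple to state: with k alternatives, each voter awards k-1 points to their top choice, k-2 to the next, and so on; the alternative with the highest total wins. A toy sketch with a hypothetical three-voter profile:

    ```python
    from collections import defaultdict

    def borda(rankings):
        """Borda rule: a voter's top choice gets k - 1 points, the next
        k - 2, ..., the last 0; the highest total score wins."""
        scores = defaultdict(int)
        for ranking in rankings:
            k = len(ranking)
            for pos, alt in enumerate(ranking):
                scores[alt] += k - 1 - pos
        winner = max(scores, key=scores.get)
        return winner, dict(scores)

    # Hypothetical profile of three voters over alternatives a, b, c.
    profile = [["a", "b", "c"], ["b", "a", "c"], ["b", "c", "a"]]
    winner, scores = borda(profile)
    print(winner, scores)  # b wins with 5 points (a: 3, c: 1)
    ```

    The paper's claim is statistical: under suitable regularity conditions on a large population, such simple rules track the utilitarian optimum well even from noisy utility data.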

  1. Determinants of 25(OH)D sufficiency in obese minority children: selecting outcome measures and analytic approaches.

    Science.gov (United States)

    Zhou, Ping; Schechter, Clyde; Cai, Ziyong; Markowitz, Morri

    2011-06-01

To highlight complexities in defining vitamin D sufficiency in children. Serum 25-(OH) vitamin D [25(OH)D] levels from 140 healthy obese children aged 6 to 21 years living in the inner city were compared with multiple health outcome measures, including bone biomarkers and cardiovascular risk factors. Several statistical analytic approaches were used, including Pearson correlation, analysis of covariance (ANCOVA), and "hockey stick" regression modeling. Potential threshold levels for vitamin D sufficiency varied by outcome variable and analytic approach. Only systolic blood pressure (SBP) was significantly correlated with 25(OH)D (r = -0.261; P = .038). ANCOVA revealed that SBP and triglyceride levels were statistically significant in the test groups [25(OH)D < 25 ng/mL]. ANCOVA also showed differences only in children with severe vitamin D deficiency [25(OH)D …]. "Hockey stick" model regression analyses found evidence of a threshold level in SBP, with a 25(OH)D breakpoint of 27 ng/mL, along with a 25(OH)D breakpoint of 18 ng/mL for triglycerides, but no relationship between 25(OH)D and parathyroid hormone. Defining vitamin D sufficiency should take into account different vitamin D-related health outcome measures and analytic methodologies. Copyright © 2011 Mosby, Inc. All rights reserved.
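
    A "hockey stick" (single-breakpoint, piecewise-linear) regression of the kind described can be sketched as a grid search over candidate breakpoints, refitting by least squares at each. The data below are simulated for illustration, not the study's:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Simulated data with a true breakpoint at x = 25: the outcome is
    # depressed in proportion to the shortfall below the breakpoint and
    # flat above it (the "hockey stick" shape).
    x = rng.uniform(5, 45, size=300)
    y = 120 - 0.8 * np.clip(25 - x, 0, None) + rng.normal(0, 1.0, size=300)

    def hockey_stick_fit(x, y, candidates):
        """Grid search for the breakpoint c in y ~ b0 + b1 * max(c - x, 0)."""
        best = None
        for c in candidates:
            X = np.column_stack([np.ones_like(x), np.clip(c - x, 0, None)])
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            rss = ((y - X @ beta) ** 2).sum()  # residual sum of squares
            if best is None or rss < best[0]:
                best = (rss, c, beta)
        return best[1], best[2]

    breakpoint_, beta = hockey_stick_fit(x, y, np.arange(10.0, 40.0, 0.5))
    print(breakpoint_, beta)
    ```

    The fitted breakpoint lands near the true value of 25, and the slope below the breakpoint comes out negative, mirroring the threshold behavior the study reports for SBP and triglycerides.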

  2. Statistical analysis of angular correlation measurements

    International Nuclear Information System (INIS)

    Oliveira, R.A.A.M. de.

    1986-01-01

Obtaining the multipole mixing ratio, δ, of γ transitions in angular correlation measurements is a statistical problem characterized by the small number of angles at which observations are made and by limited counting statistics. The nonexistence of a sufficient statistic for the estimator of δ is shown. Three different estimators for δ were constructed and their properties of consistency, bias and efficiency were tested. Tests were also performed on experimental results obtained in γ-γ directional correlation measurements. (Author) [pt

  3. EVALUATION OF FOOD SELF-SUFFICIENCY OF THE REPUBLIC OF TATARSTAN DISTRICTS

    Directory of Open Access Journals (Sweden)

    R. E. Mansurov

    2017-01-01

Full Text Available The article presents the author's method for estimating the level of food self-sufficiency for the main types of food products in the districts of the Republic of Tatarstan. The proposed method is based on the use of analytical methods and mathematical comparative analysis to compose a final rating, and can be used in the system of regional management of the agro-industrial complex at the federal and local levels. Relevance. The relevance of this work stems, on the one hand, from a hardening of foreign policy that may negatively impact national food security and, on the other hand, from the crisis state of the domestic agricultural sector. All this requires the development of new approaches to regional agribusiness management. Goal. To develop a methodology for assessing the level of food self-sufficiency, and to rate the level of self-sufficiency in the main types of foodstuffs in the districts of the Republic of Tatarstan. Materials and Methods. Statistical data on the results of the agro-industrial complex of the Republic of Tatarstan for 2016 were used for the study. Analytical methods, including mathematical analysis and comparison, were used. Results. Based on an analysis of the present situation for ensuring food security in Russia, it is shown that effective indicators identifying the level of regional self-sufficiency in basic foods now need to be developed, and that no such indicators currently exist in the system of regional agro-industrial complex management. From an analysis of existing approaches, the author's method of rating the level of self-sufficiency of regions is offered and tested on the example of the Republic of Tatarstan. Conclusions. The proposed method of rating estimation of self-sufficiency for basic foodstuffs can be used in the regional agro-industrial complex management system at the federal and local levels. It can be used to rank areas in terms of their self-sufficiency for basic foodstuffs. This…

  4. Improving Statistical Literacy in Schools in Australia

    OpenAIRE

    Trewin, Dennis

    2005-01-01

    We live in the information age. Statistical thinking is a life skill that all Australian children should have. The Statistical Society of Australia (SSAI) and the Australian Bureau of Statistics (ABS) have been working on a strategy to ensure Australian school children acquire a sufficient understanding and appreciation of how data can be acquired and used so they can make informed judgements in their daily lives, as children and then as adults. There is another motive for our work i...

  5. MSEBAG: a dynamic classifier ensemble generation based on `minimum-sufficient ensemble' and bagging

    Science.gov (United States)

    Chen, Lei; Kamel, Mohamed S.

    2016-01-01

    In this paper, we propose a dynamic classifier system, MSEBAG, which is characterised by searching for the 'minimum-sufficient ensemble' and bagging at the ensemble level. It adopts an 'over-generation and selection' strategy and aims to achieve a good bias-variance trade-off. In the training phase, MSEBAG first searches for the 'minimum-sufficient ensemble', which maximises the in-sample fitness with the minimal number of base classifiers. Then, starting from the 'minimum-sufficient ensemble', a backward stepwise algorithm is employed to generate a collection of ensembles. The objective is to create a collection of ensembles with a descending fitness on the data, as well as a descending complexity in the structure. MSEBAG dynamically selects the ensembles from the collection for the decision aggregation. The extended adaptive aggregation (EAA) approach, a bagging-style algorithm performed at the ensemble level, is employed for this task. EAA searches for the competent ensembles using a score function, which takes into consideration both the in-sample fitness and the confidence of the statistical inference, and averages the decisions of the selected ensembles to label the test pattern. The experimental results show that the proposed MSEBAG outperforms the benchmarks on average.
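
    The "over-generation and selection" idea can be illustrated with a much-simplified backward stepwise pruning of a majority-vote ensemble on validation data. This is a generic sketch of ensemble pruning, not the MSEBAG algorithm itself, and all data are synthetic:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Synthetic validation set: 200 binary labels, plus predictions from 15
    # base classifiers that each agree with the truth about 70% of the time.
    y = rng.integers(0, 2, size=200)
    preds = np.where(rng.random((15, 200)) < 0.7, y, 1 - y)

    def vote_accuracy(members):
        """Accuracy of the majority vote over the given ensemble members."""
        majority = (preds[members].mean(axis=0) > 0.5).astype(int)
        return float((majority == y).mean())

    # Backward stepwise pruning: repeatedly drop the member whose removal
    # hurts validation fitness least, while the pruned ensemble stays at
    # least as accurate as the full one.
    members = list(range(15))
    full_acc = vote_accuracy(members)
    while len(members) > 1:
        trials = [(vote_accuracy([m for m in members if m != j]), j) for j in members]
        best_acc, drop = max(trials)
        if best_acc < full_acc:
            break
        members = [m for m in members if m != drop]

    print(len(members), vote_accuracy(members))
    ```

    By construction the pruned ensemble never falls below the full ensemble's validation accuracy, which captures the "minimal number of base classifiers with maximal in-sample fitness" intuition behind the 'minimum-sufficient ensemble'.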

  6. Statistics for lawyers

    CERN Document Server

    Finkelstein, Michael O

    2015-01-01

    This classic text, first published in 1990, is designed to introduce law students, law teachers, practitioners, and judges to the basic ideas of mathematical probability and statistics as they have been applied in the law. The third edition includes over twenty new sections, including the addition of timely topics, like New York City police stops, exonerations in death-sentence cases, projecting airline costs, and new material on various statistical techniques such as the randomized response survey technique, rare-events meta-analysis, competing risks, and negative binomial regression. The book consists of sections of exposition followed by real-world cases and case studies in which statistical data have played a role. The reader is asked to apply the theory to the facts, to calculate results (a hand calculator is sufficient), and to explore legal issues raised by quantitative findings. The authors' calculations and comments are given in the back of the book. As with previous editions, the cases and case stu...

  7. Narrative Review of Statistical Reporting Checklists, Mandatory Statistical Editing, and Rectifying Common Problems in the Reporting of Scientific Articles.

    Science.gov (United States)

    Dexter, Franklin; Shafer, Steven L

    2017-03-01

    Considerable attention has been drawn to poor reproducibility in the biomedical literature. One explanation is inadequate reporting of statistical methods by authors and inadequate assessment of statistical reporting and methods during peer review. In this narrative review, we examine scientific studies of several well-publicized efforts to improve statistical reporting. We also review several retrospective assessments of the impact of these efforts. These studies show that instructions to authors and statistical checklists are not sufficient; no findings suggested that either improves the quality of statistical methods and reporting. Second, even basic statistics, such as power analyses, are frequently missing or incorrectly performed. Third, statistical review is needed for all papers that involve data analysis. A consistent finding in the studies was that nonstatistical reviewers (eg, "scientific reviewers") and journal editors generally poorly assess statistical quality. We finish by discussing our experience with statistical review at Anesthesia & Analgesia from 2006 to 2016.

  8. Statistics for experimentalists

    CERN Document Server

    Cooper, B E

    2014-01-01

    Statistics for Experimentalists aims to provide experimental scientists with a working knowledge of statistical methods and search approaches to the analysis of data. The book first elaborates on probability and continuous probability distributions. Discussions focus on properties of continuous random variables and normal variables, independence of two random variables, central moments of a continuous distribution, prediction from a normal distribution, binomial probabilities, and multiplication of probabilities and independence. The text then examines estimation and tests of significance. Topics include estimators and estimates, expected values, minimum variance linear unbiased estimators, sufficient estimators, methods of maximum likelihood and least squares, and the test of significance method. The manuscript ponders on distribution-free tests, Poisson process and counting problems, correlation and function fitting, balanced incomplete randomized block designs and the analysis of covariance, and experiment...

  9. Statistics and Title VII Proof: Prima Facie Case and Rebuttal.

    Science.gov (United States)

    Whitten, David

    1978-01-01

    The method and means by which statistics can raise a prima facie case of Title VII violation are analyzed. A standard is identified that can be applied to determine whether a statistical disparity is sufficient to shift the burden to the employer to rebut a prima facie case of discrimination. (LBH)

  10. Similar tests and the standardized log likelihood ratio statistic

    DEFF Research Database (Denmark)

    Jensen, Jens Ledet

    1986-01-01

    When testing an affine hypothesis in an exponential family, the 'ideal' procedure is to calculate the exact similar test, or an approximation to it, based on the conditional distribution given the minimal sufficient statistic under the null hypothesis. In contrast, there is a 'primitive' approach in which the marginal distribution of a test statistic is considered and any nuisance parameter appearing in the test statistic is replaced by an estimate. We show here that when using standardized likelihood ratio statistics, the 'primitive' procedure is in fact an 'ideal' procedure to order O(n -3...

  11. Statistical and Computational Techniques in Manufacturing

    CERN Document Server

    2012-01-01

    In recent years, interest in developing statistical and computational techniques for applied manufacturing engineering has increased. Today, due to the great complexity of manufacturing engineering and the high number of parameters used, conventional approaches are no longer sufficient. Statistical and computational techniques have therefore found several applications in manufacturing, namely, modelling and simulation of manufacturing processes, optimization of manufacturing parameters, monitoring and control, computer-aided process planning, etc. The present book aims to provide recent information on statistical and computational techniques applied in manufacturing engineering. The content is suitable for final undergraduate engineering courses or as a subject on manufacturing at the postgraduate level. This book serves as a useful reference for academics, statistical and computational science researchers, mechanical, manufacturing and industrial engineers, and professionals in industries related to manu...

  12. Sufficiency does energy consumption become a moral issue?

    Energy Technology Data Exchange (ETDEWEB)

    Muller, Adrian (Socio-economic Inst. and Univ. Research Priority Programme in Ethics, Univ. of Zuerich, Zuerich (Switzerland))

    2009-07-01

    Reducing the externalities from energy use is crucial for sustainability. There are basically four ways to reduce externalities from energy use: increasing technical efficiency ('energy input per unit energy service'), increasing economic efficiency ('internalising external costs'), using 'clean' energy sources with few externalities, or sufficiency ('identifying optimal energy service levels'). A combination of these strategies is most promising for sustainable energy systems. However, the debate on sustainable energy is dominated by efficiency and clean energy strategies, while sufficiency plays a minor role. Efficiency and clean energy face several problems, though. Thus, the current debate should be complemented with a critical discussion of sufficiency. In this paper, I develop a concept of sufficiency that is adequate for liberal societies. I focus on ethical foundations for sufficiency, as such discussion is missing or only cursory in the existing literature. I first show that many examples of sufficiency can be understood as (economic) efficiency, but that the two concepts do not coincide. I then show that sufficiency based on the moralization of actions can be understood as an implementation of the boundary conditions for social justice that come with notions of liberal societies, in particular the duty not to harm other people. By this, increasing sufficiency becomes a duty beyond individual taste. I further illustrate this in the context of the adverse effects of climate change as externalities from energy use.

  13. Statistical physics of pairwise probability models

    DEFF Research Database (Denmark)

    Roudi, Yasser; Aurell, Erik; Hertz, John

    2009-01-01

    (no Danish abstract available) Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data: knowledge of the means and correlations between pairs of elements in the system is sufficient. Not surprisingly, then, using pairwise models for studying neural data has been the focus of many studies in recent years. In this paper, we describe how tools from statistical physics can be employed for studying

  14. DYNAMIC SUFFICIENCY OF THE MAGNETICALLY SUSPENDED TRAIN

    Directory of Open Access Journals (Sweden)

    V. A. Polyakov

    2013-11-01

    Full Text Available Purpose. The basic criterion for the consumer evaluation of a magnetically suspended train is the quality of its mechanical motion. This motion takes place under unpredictable conditions and, to preserve its purposefulness, must adapt to them. Such adaptation is possible only within the limits of the system's dynamic sufficiency. Sufficiency is understood as the presence in the system of resources that allow it to realize the required motions without violating the actual constraints. The presence of such resources is therefore a necessary condition for preserving the required purposefulness of the train's dynamics, and verification of this sufficiency is a major component of its dynamic study. Methodology. Methods of set theory are used in the work. The desirable and actual reachability spaces of the train are found. The train is considered dynamically sufficient in the zones where these spaces overlap. Findings. Within the adopted treatment of the train's dynamic sufficiency, verification of its presence, as well as of a reserve (or deficiency) of it, can be carried out by searching for and then evaluating such overlap zones. Operatively (directly during motion) this can be realized on the train's ODC using, for example, the computer mathematics system Mathematica, which offers extensive possibilities for highly efficient information handling with comparatively small resource demands. The efficiency of the created technique is illustrated with an example of research on the vehicle's acceleration. The calculation is performed using the constructed computer model of the interaction between the vehicle's independent traction electromagnetic subsystem and its mechanical subsystem. Originality. A technique for verifying the dynamic sufficiency of a high-speed magnetically suspended train is developed. The technique is highly efficient, it provides sufficient presentation and demands an expense of the

  15. Self-sufficiency, free trade and safety.

    Science.gov (United States)

    Rautonen, Jukka

    2010-01-01

    The relationship between free trade, self-sufficiency and safety of blood and blood components has been a perennial discussion topic in the blood service community. Traditionally, national self-sufficiency has been perceived as the ultimate goal that would also maximize safety. However, very few countries are, or can be, truly self-sufficient when self-sufficiency is understood correctly to encompass the whole value chain from the blood donor to the finished product. This is most striking when plasma derived medicines are considered. Free trade of blood products, or competition, as such can have a negative or positive effect on blood safety. Further, free trade of equipment and reagents and several plasma medicines is actually necessary to meet the domestic demand for blood and blood derivatives in most countries. Opposing free trade due to dogmatic reasons is not in the best interest of any country and will be especially harmful for the developing world. Competition between blood services in the USA has been present for decades. The more than threefold differences in blood product prices between European blood services indicate that competition is long overdue in Europe, too. This competition should be welcomed but carefully and proactively regulated to avoid putting safe and secure blood supply at risk. Copyright 2009 The International Association for Biologicals. Published by Elsevier Ltd. All rights reserved.

  16. Use of Monte Carlo Bootstrap Method in the Analysis of Sample Sufficiency for Radioecological Data

    International Nuclear Information System (INIS)

    Silva, A. N. C. da; Amaral, R. S.; Araujo Santos Jr, J.; Wilson Vieira, J.; Lima, F. R. de A.

    2015-01-01

    There are operational difficulties in obtaining samples for radioecological studies. Population data may no longer be available during the study, and obtaining new samples may not be possible. These problems sometimes force the researcher to work with a small number of data points, and it is then difficult to know whether that number of samples is sufficient to estimate the desired parameter. Hence, an analysis of sample sufficiency is critical. Classical statistical methods are poorly suited to analyzing sample sufficiency in radioecology, because naturally occurring radionuclides are randomly distributed in soil and the data usually contain outliers and gaps with missing values. The present work applies the Monte Carlo bootstrap method to the analysis of sample sufficiency, with quantitative estimation of a single variable such as the specific activity of a natural radioisotope present in plants. The pseudo-population was a small sample of 14 values of the specific activity of 226 Ra in forage palm (Opuntia spp.). A computational procedure to calculate the number of sample values was implemented in the R software. The resampling process with replacement took the 14 values of the original sample and produced 10,000 bootstrap samples for each round. The estimated average θ was then calculated for samples with 2, 5, 8, 11 and 14 randomly selected values. The results showed that if the researcher works with only 11 sample values, the average parameter will lie within a confidence interval with 90% probability. (Author)
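
    The resampling scheme described above (10,000 bootstrap samples drawn with replacement, then a 90% confidence interval for the mean) can be sketched in a few lines. The activity values below are hypothetical stand-ins for the paper's 14-value 226 Ra sample, and the percentile-interval helper is an illustrative implementation, not the authors' R code.

    ```python
    import random
    import statistics

    # Hypothetical specific-activity values (Bq/kg) standing in for the
    # 14-value Ra-226 sample described in the abstract.
    sample = [12.1, 8.7, 15.3, 9.9, 11.4, 14.0, 7.8,
              10.5, 13.2, 9.1, 12.8, 8.3, 11.9, 10.2]

    def bootstrap_ci(data, n_boot=10_000, alpha=0.10, seed=42):
        """Percentile bootstrap CI for the mean: resample with replacement."""
        rng = random.Random(seed)
        means = sorted(
            statistics.mean(rng.choices(data, k=len(data))) for _ in range(n_boot)
        )
        lo = means[int(n_boot * alpha / 2)]
        hi = means[int(n_boot * (1 - alpha / 2))]
        return statistics.mean(data), (lo, hi)

    # Repeat for growing subsample sizes, as in the paper, to judge how
    # many values are "sufficient" for a stable estimate:
    for k in (2, 5, 8, 11, 14):
        mean, (lo, hi) = bootstrap_ci(sample[:k])
        print(f"n={k:2d}  mean={mean:.2f}  90% CI=({lo:.2f}, {hi:.2f})")
    ```

    The interval typically narrows as the subsample grows, which is the basis for declaring a given sample size sufficient.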

  17. Conformity and statistical tolerancing

    Science.gov (United States)

    Leblond, Laurent; Pillet, Maurice

    2018-02-01

    Statistical tolerancing was first proposed by Shewhart (Economic Control of Quality of Manufactured Product, (1931) reprinted 1980 by ASQC), in spite of this long history, its use remains moderate. One of the probable reasons for this low utilization is undoubtedly the difficulty for designers to anticipate the risks of this approach. The arithmetic tolerance (worst case) allows a simple interpretation: conformity is defined by the presence of the characteristic in an interval. Statistical tolerancing is more complex in its definition. An interval is not sufficient to define the conformance. To justify the statistical tolerancing formula used by designers, a tolerance interval should be interpreted as the interval where most of the parts produced should probably be located. This tolerance is justified by considering a conformity criterion of the parts guaranteeing low offsets on the latter characteristics. Unlike traditional arithmetic tolerancing, statistical tolerancing requires a sustained exchange of information between design and manufacture to be used safely. This paper proposes a formal definition of the conformity, which we apply successively to the quadratic and arithmetic tolerancing. We introduce a concept of concavity, which helps us to demonstrate the link between tolerancing approach and conformity. We use this concept to demonstrate the various acceptable propositions of statistical tolerancing (in the space decentring, dispersion).
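
    The contrast between arithmetic (worst-case) and quadratic (statistical) tolerancing can be made concrete with a small stack-up calculation; the four component tolerances below are invented for illustration.

    ```python
    import math

    # Hypothetical tolerance chain: four components, each with a
    # symmetric tolerance +/- t_i around its nominal dimension.
    tolerances = [0.10, 0.05, 0.08, 0.12]

    # Worst-case (arithmetic) stacking: tolerances add linearly.
    t_arithmetic = sum(tolerances)

    # Statistical (quadratic) stacking: independent, centred variations,
    # so the contributions add in quadrature (root sum of squares).
    t_statistical = math.sqrt(sum(t**2 for t in tolerances))

    print(f"arithmetic stack:  +/-{t_arithmetic:.3f}")
    print(f"statistical stack: +/-{t_statistical:.3f}")
    ```

    The quadratic stack is always tighter than the arithmetic one, which is why statistical tolerancing lets designers assign wider component tolerances for the same assembly requirement, at the price of the conformity subtleties the paper formalizes.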

  18. Post-Disaster Food and Nutrition from Urban Agriculture: A Self-Sufficiency Analysis of Nerima Ward, Tokyo.

    Science.gov (United States)

    Sioen, Giles Bruno; Sekiyama, Makiko; Terada, Toru; Yokohari, Makoto

    2017-07-10

    Background : Post-earthquake studies from around the world have reported that survivors relying on emergency food for prolonged periods of time experienced several dietary related health problems. The present study aimed to quantify the potential nutrient production of urban agricultural vegetables and the resulting nutritional self-sufficiency throughout the year for mitigating post-disaster situations. Methods : We estimated the vegetable production of urban agriculture throughout the year. Two methods were developed to capture the production from professional and hobby farms: Method I utilized secondary governmental data on agricultural production from professional farms, and Method II was based on a supplementary spatial analysis to estimate the production from hobby farms. Next, the weight of produced vegetables [t] was converted into nutrients [kg]. Furthermore, the self-sufficiency by nutrient and time of year was estimated by incorporating the reference consumption of vegetables [kg], recommended dietary allowance of nutrients per capita [mg], and population statistics. The research was conducted in Nerima, the second most populous ward of Tokyo's 23 special wards. Self-sufficiency rates were calculated with the registered residents. Results : The estimated total vegetable production of 5660 tons was equivalent to a weight-based self-sufficiency rate of 6.18%. The average nutritional self-sufficiencies of Methods I and II were 2.48% and 0.38%, respectively, resulting in an aggregated average of 2.86%. Fluctuations throughout the year were observed according to the harvest seasons of the available crops. Vitamin K (6.15%) had the highest self-sufficiency of selected nutrients, while calcium had the lowest (0.96%). Conclusions : This study suggests that depending on the time of year, urban agriculture has the potential to contribute nutrients to diets during post-disaster situations as disaster preparedness food. Emergency responses should be targeted

  19. Post-Disaster Food and Nutrition from Urban Agriculture: A Self-Sufficiency Analysis of Nerima Ward, Tokyo

    Directory of Open Access Journals (Sweden)

    Giles Bruno Sioen

    2017-07-01

    Full Text Available Background: Post-earthquake studies from around the world have reported that survivors relying on emergency food for prolonged periods of time experienced several dietary related health problems. The present study aimed to quantify the potential nutrient production of urban agricultural vegetables and the resulting nutritional self-sufficiency throughout the year for mitigating post-disaster situations. Methods: We estimated the vegetable production of urban agriculture throughout the year. Two methods were developed to capture the production from professional and hobby farms: Method I utilized secondary governmental data on agricultural production from professional farms, and Method II was based on a supplementary spatial analysis to estimate the production from hobby farms. Next, the weight of produced vegetables [t] was converted into nutrients [kg]. Furthermore, the self-sufficiency by nutrient and time of year was estimated by incorporating the reference consumption of vegetables [kg], recommended dietary allowance of nutrients per capita [mg], and population statistics. The research was conducted in Nerima, the second most populous ward of Tokyo’s 23 special wards. Self-sufficiency rates were calculated with the registered residents. Results: The estimated total vegetable production of 5660 tons was equivalent to a weight-based self-sufficiency rate of 6.18%. The average nutritional self-sufficiencies of Methods I and II were 2.48% and 0.38%, respectively, resulting in an aggregated average of 2.86%. Fluctuations throughout the year were observed according to the harvest seasons of the available crops. Vitamin K (6.15% had the highest self-sufficiency of selected nutrients, while calcium had the lowest (0.96%. Conclusions: This study suggests that depending on the time of year, urban agriculture has the potential to contribute nutrients to diets during post-disaster situations as disaster preparedness food. Emergency responses should be

  20. An audit of the statistics and the comparison with the parameter in the population

    Science.gov (United States)

    Bujang, Mohamad Adam; Sa'at, Nadiah; Joys, A. Reena; Ali, Mariana Mohamad

    2015-10-01

    The sample size needed to closely estimate the statistics for particular parameters has long been an issue. Although sample size may have been calculated with reference to the objective of the study, it is difficult to confirm whether the resulting statistics are close to the parameters of a particular population. A p-value of less than 0.05 is widely used as inferential evidence. Therefore, this study audited results computed from various subsamples and statistical analyses and compared them with the parameters of three different populations. Eight types of statistical analysis, and eight subsamples for each analysis, were examined. The results showed that the statistics were consistent and close to the parameters when the study sample covered at least 15% to 35% of the population. A larger sample size is needed to estimate parameters involving categorical variables than numerical variables. Sample sizes of 300 to 500 are sufficient to estimate the parameters for a medium-sized population.

  1. Statistical theory and inference

    CERN Document Server

    Olive, David J

    2014-01-01

    This text is for a one-semester graduate course in statistical theory and covers minimal and complete sufficient statistics, maximum likelihood estimators, the method of moments, bias and mean square error, uniform minimum variance estimators and the Cramer-Rao lower bound, an introduction to large sample theory, likelihood ratio tests, uniformly most powerful tests and the Neyman-Pearson Lemma. A major goal of this text is to make these topics much more accessible to students by using the theory of exponential families. Exponential families, indicator functions and the support of the distribution are used throughout the text to simplify the theory. More than 50 "brand name" distributions are used to illustrate the theory with many examples of exponential families, maximum likelihood estimators and uniformly minimum variance unbiased estimators. There are many homework problems with over 30 pages of solutions.
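
    The role exponential families play here can be illustrated with the simplest case: for Bernoulli data the likelihood depends on the sample only through its sum, so the sum is a sufficient statistic. A minimal numerical check, on hypothetical data:

    ```python
    from math import prod

    def bernoulli_likelihood(data, p):
        """L(p; x) = p**sum(x) * (1-p)**(n - sum(x)), computed termwise."""
        return prod(p if x else (1 - p) for x in data)

    # Two samples of the same length with the same sum...
    a = [1, 1, 0, 0, 1, 0, 0, 0]
    b = [0, 0, 0, 1, 0, 1, 0, 1]
    assert sum(a) == sum(b) == 3

    # ...have identical likelihood functions for every p: once sum(x) is
    # known, the data carry no further information about p.
    for p in (0.2, 0.5, 0.73):
        assert abs(bernoulli_likelihood(a, p) - bernoulli_likelihood(b, p)) < 1e-15
    print("likelihoods agree for all p: sum(x) is sufficient for p")
    ```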

  2. Annotations to quantum statistical mechanics

    CERN Document Server

    Kim, In-Gee

    2018-01-01

    This book is a rewritten and annotated version of Leo P. Kadanoff and Gordon Baym’s lectures that were presented in the book Quantum Statistical Mechanics: Green’s Function Methods in Equilibrium and Nonequilibrium Problems. The lectures were devoted to a discussion on the use of thermodynamic Green’s functions in describing the properties of many-particle systems. The functions provided a method for discussing finite-temperature problems with no more conceptual difficulty than ground-state problems, and the method was equally applicable to boson and fermion systems and equilibrium and nonequilibrium problems. The lectures also explained nonequilibrium statistical physics in a systematic way and contained essential concepts on statistical physics in terms of Green’s functions with sufficient and rigorous details. In-Gee Kim thoroughly studied the lectures during one of his research projects but found that the unspecialized method used to present them in the form of a book reduced their readability. He st...

  3. Statistical weighted A-summability with application to Korovkin’s type approximation theorem

    Directory of Open Access Journals (Sweden)

    Syed Abdul Mohiuddine

    2016-03-01

    Full Text Available Abstract We introduce the notion of statistical weighted A-summability of a sequence and establish its relation with weighted A-statistical convergence. We also define weighted regular matrix and obtain necessary and sufficient conditions for the matrix A to be weighted regular. As an application, we prove the Korovkin type approximation theorem through statistical weighted A-summability and using the BBH operator to construct an illustrative example in support of our result.

  4. Enough is as good as a feast - sufficiency as policy

    Energy Technology Data Exchange (ETDEWEB)

    Darby, Sarah [Lower Carbon Futures, Environmental Change Inst., Oxford Univ. Centre for the Environment (United Kingdom)

    2007-07-01

    The concept of sufficiency has a long history, related as it is to the timeless issues of how best to distribute and use resources. Where energy is concerned, absolute reductions in demand are increasingly seen as necessary in response to climate change and energy security concerns. There is an acknowledgement that, collectively if not individually, humans have gone beyond safe limits in their use of fuels. The relatively wealthy and industrialised nations urgently need to move beyond a primary focus on efficiency to the more contentious issues surrounding demand reduction and sufficiency. The paper considers definitions of energy sufficiency, looks at a recent attempt to model future energy use in terms of efficiency and sufficiency, and discusses quantitative and qualitative aspects of sufficiency and how they might become institutionalised. There are many arguments in favour of sufficiency but they often founder in the face of political requirements for market growth and the employment generated by it. Some options for 'sufficiency policy' are selected, including a focus on energy in relation to livelihoods, energy implications of our use of time and making energy use more transparent.

  5. Statistical Approaches to Assess Biosimilarity from Analytical Data.

    Science.gov (United States)

    Burdick, Richard; Coffey, Todd; Gutka, Hiten; Gratzl, Gyöngyi; Conlon, Hugh D; Huang, Chi-Ting; Boyne, Michael; Kuehne, Henriette

    2017-01-01

    Protein therapeutics have unique critical quality attributes (CQAs) that define their purity, potency, and safety. The analytical methods used to assess CQAs must be able to distinguish clinically meaningful differences in comparator products, and the most important CQAs should be evaluated with the most statistical rigor. High-risk CQA measurements assess the most important attributes that directly impact the clinical mechanism of action or have known implications for safety, while the moderate- to low-risk characteristics may have a lower direct impact and thereby may have a broader range to establish similarity. Statistical equivalence testing is applied for high-risk CQA measurements to establish the degree of similarity (e.g., highly similar fingerprint, highly similar, or similar) of selected attributes. Notably, some high-risk CQAs (e.g., primary sequence or disulfide bonding) are qualitative (e.g., the same as the originator or not the same) and therefore not amenable to equivalence testing. For biosimilars, an important step is the acquisition of a sufficient number of unique originator drug product lots to measure the variability in the originator drug manufacturing process and provide sufficient statistical power for the analytical data comparisons. Together, these analytical evaluations, along with PK/PD and safety data (immunogenicity), provide the data necessary to determine if the totality of the evidence warrants a designation of biosimilarity and subsequent licensure for marketing in the USA. In this paper, a case study approach is used to provide examples of analytical similarity exercises and the appropriateness of statistical approaches for the example data.
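
    The statistical equivalence testing mentioned for high-risk CQAs is commonly carried out as two one-sided tests (TOST). The sketch below uses invented potency measurements and a normal approximation in place of the t distribution; it illustrates the logic, not any specific regulatory procedure.

    ```python
    import math
    import statistics

    def tost_equivalence(x, y, margin, alpha=0.05):
        """Two one-sided tests (TOST) for mean equivalence within +/-margin,
        using a normal approximation to the t distribution."""
        diff = statistics.mean(x) - statistics.mean(y)
        se = math.sqrt(statistics.variance(x) / len(x)
                       + statistics.variance(y) / len(y))
        z_lower = (diff + margin) / se   # tests H0: diff <= -margin
        z_upper = (margin - diff) / se   # tests H0: diff >= +margin
        p_lower = 0.5 * (1 - math.erf(z_lower / math.sqrt(2)))
        p_upper = 0.5 * (1 - math.erf(z_upper / math.sqrt(2)))
        # Equivalence is declared only if BOTH one-sided nulls are rejected.
        p = max(p_lower, p_upper)
        return p, p < alpha

    # Hypothetical potency measurements (% of target) for two products:
    originator = [99.8, 101.2, 100.4, 99.1, 100.9, 100.2, 99.6, 100.7]
    biosimilar = [100.1, 99.7, 100.8, 100.3, 99.5, 100.6, 99.9, 100.4]
    p, equivalent = tost_equivalence(originator, biosimilar, margin=1.5)
    print(f"TOST p-value = {p:.4g}, equivalent within +/-1.5: {equivalent}")
    ```

    Note the asymmetry with an ordinary difference test: here the burden of proof is on showing the means are close, which is why sufficient originator lots (and hence statistical power) matter so much.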

  6. Exploring Societal Preferences for Energy Sufficiency Measures in Switzerland

    International Nuclear Information System (INIS)

    Moser, Corinne; Rösch, Andreas; Stauffacher, Michael

    2015-01-01

    Many countries are facing a challenging transition toward more sustainable energy systems, which produce more renewables and consume less energy. The latter goal can only be achieved through a combination of efficiency measures and changes in people’s lifestyles and routine behaviors (i.e., sufficiency). While research has shown that acceptance of technical efficiency is relatively high, there is a lack of research on societal preferences for sufficiency measures. However, this is an important prerequisite for designing successful interventions to change behavior. This paper analyses societal preferences for different energy-related behaviors in Switzerland. We use an online choice-based conjoint analysis (N = 150) to examine preferences for behaviors with high technical potentials for energy demand reduction in the following domains: mobility, heating, and food. Each domain comprises different attributes across three levels of sufficiency. Respondents were confronted with trade-off situations evoked through different fictional lifestyles that comprised different combinations of attribute levels. Through a series of trade-off decisions, participants were asked to choose their preferred lifestyle. The results revealed that a vegetarian diet was considered the most critical issue that respondents were unwilling to trade off, followed by distance to workplace and means of transportation. The highest willingness to trade off was found for adjustments in room temperature, holiday travel behaviors, and living space. Participants’ preferences for the most energy-sufficient lifestyles were rather low. However, the study showed that there were lifestyles with substantive energy-saving potentials that were well accepted among respondents. Our study results suggest that the success of energy-sufficiency interventions might depend strongly on the targeted behavior. We speculate that they may face strong resistance (e.g., vegetarian diet). Thus, it seems promising to

  7. Exploring Societal Preferences for Energy Sufficiency Measures in Switzerland

    Energy Technology Data Exchange (ETDEWEB)

    Moser, Corinne, E-mail: corinne.moser@zhaw.ch [Institute of Sustainable Development, School of Engineering, Zurich University of Applied Sciences, Winterthur (Switzerland); Natural and Social Science Interface, Institute for Environmental Decisions, Department of Environmental Systems Science, ETH Zürich, Zürich (Switzerland); Rösch, Andreas [Natural and Social Science Interface, Institute for Environmental Decisions, Department of Environmental Systems Science, ETH Zürich, Zürich (Switzerland); Stauffacher, Michael [Natural and Social Science Interface, Institute for Environmental Decisions, Department of Environmental Systems Science, ETH Zürich, Zürich (Switzerland); Transdisciplinarity Laboratory, Department of Environmental Systems Science, ETH Zürich, Zürich (Switzerland)

    2015-09-16

    Many countries are facing a challenging transition toward more sustainable energy systems, which produce more renewables and consume less energy. The latter goal can only be achieved through a combination of efficiency measures and changes in people’s lifestyles and routine behaviors (i.e., sufficiency). While research has shown that acceptance of technical efficiency is relatively high, there is a lack of research on societal preferences for sufficiency measures. However, this is an important prerequisite for designing successful interventions to change behavior. This paper analyses societal preferences for different energy-related behaviors in Switzerland. We use an online choice-based conjoint analysis (N = 150) to examine preferences for behaviors with high technical potentials for energy demand reduction in the following domains: mobility, heating, and food. Each domain comprises different attributes across three levels of sufficiency. Respondents were confronted with trade-off situations evoked through different fictional lifestyles that comprised different combinations of attribute levels. Through a series of trade-off decisions, participants were asked to choose their preferred lifestyle. The results revealed that a vegetarian diet was considered the most critical issue that respondents were unwilling to trade off, followed by distance to workplace and means of transportation. The highest willingness to trade off was found for adjustments in room temperature, holiday travel behaviors, and living space. Participants’ preferences for the most energy-sufficient lifestyles were rather low. However, the study showed that there were lifestyles with substantive energy-saving potentials that were well accepted among respondents. Our study results suggest that the success of energy-sufficiency interventions might depend strongly on the targeted behavior. We speculate that they may face strong resistance (e.g., vegetarian diet). Thus, it seems promising to

  8. Statistical density of nuclear excited states

    Directory of Open Access Journals (Sweden)

    V. M. Kolomietz

    2015-10-01

    Full Text Available A semi-classical approximation is applied to the calculation of single-particle and statistical level densities in excited nuclei. Landau's conception of quasi-particles with nucleon effective mass m* < m is used. The approach provides a correct description of the continuum contribution to the level density for realistic finite-depth potentials. It is shown that the continuum states do not significantly affect the thermodynamic calculations for sufficiently small temperatures T ≤ 1 MeV but strongly reduce the results for the excitation energy at high temperatures. Using a standard Woods-Saxon potential and a nucleon effective mass m* = 0.7m, the A-dependence of the statistical level density parameter K was evaluated, in good qualitative agreement with experimental data.

  9. Why Current Statistics of Complementary Alternative Medicine Clinical Trials is Invalid.

    Science.gov (United States)

    Pandolfi, Maurizio; Carreras, Giulia

    2018-06-07

    It is not sufficiently known that frequentist statistics cannot provide direct information on the probability that the research hypothesis tested is correct. The error resulting from this misunderstanding is compounded when the hypotheses under scrutiny have precarious scientific bases, as those of complementary alternative medicine (CAM) generally do. In such cases, it is mandatory to use inferential statistics that consider the prior probability that the hypothesis tested is true, such as Bayesian statistics. The authors show that, under such circumstances, no real statistical significance can be achieved in CAM clinical trials. In this respect, CAM trials involving human material are also hardly defensible from an ethical viewpoint.
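    The argument can be made concrete with a small illustrative calculation (all numbers are hypothetical, not from the paper): Bayes' rule gives the posterior probability that a hypothesis is true after a statistically significant result, as a function of its prior probability.

```python
# Illustrative sketch (numbers hypothetical): posterior probability that a
# tested hypothesis is true after a "significant" result, via Bayes' rule.
def posterior_given_significant(prior, alpha=0.05, power=0.8):
    """P(H true | p < alpha) for a test with given false-positive rate and power."""
    true_pos = power * prior          # significant result and H actually true
    false_pos = alpha * (1 - prior)   # significant result but H false
    return true_pos / (true_pos + false_pos)

# A well-grounded hypothesis vs. a scientifically precarious (CAM-like) one:
print(posterior_given_significant(prior=0.5))    # high posterior
print(posterior_given_significant(prior=0.01))   # still improbable despite p < 0.05
```

    With a prior of 0.5 the posterior exceeds 0.9, while with a prior of 0.01 a "significant" result still leaves the hypothesis improbable, which is the paper's central point.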

  10. Perception in statistical graphics

    Science.gov (United States)

    VanderPlas, Susan Ruth

    There has been quite a bit of research on statistical graphics and visualization, generally focused on new types of graphics, new software to create graphics, interactivity, and usability studies. Our ability to interpret and use statistical graphics hinges on the interface between the graph itself and the brain that perceives and interprets it, and there is substantially less research on the interplay between graph, eye, brain, and mind than is sufficient to understand the nature of these relationships. The goal of the work presented here is to further explore the interplay between a static graph, the translation of that graph from paper to mental representation (the journey from eye to brain), and the mental processes that operate on that graph once it is transferred into memory (mind). Understanding the perception of statistical graphics should allow researchers to create more effective graphs which produce fewer distortions and viewer errors while reducing the cognitive load necessary to understand the information presented in the graph. Taken together, these experiments should lay a foundation for exploring the perception of statistical graphics. There has been considerable research into the accuracy of numerical judgments viewers make from graphs, and these studies are useful, but it is more effective to understand how errors in these judgments occur so that the root cause of the error can be addressed directly. Understanding how visual reasoning relates to the ability to make judgments from graphs allows us to tailor graphics to particular target audiences. In addition, understanding the hierarchy of salient features in statistical graphics allows us to clearly communicate the important message from data or statistical models by constructing graphics which are designed specifically for the perceptual system.

  11. Rural Villagers’ Quality of Life Improvement by Economic Self-Reliance Practices and Trust in the Philosophy of Sufficiency Economy

    Directory of Open Access Journals (Sweden)

    Piyapong Janmaimool

    2016-08-01

    Full Text Available The concept of economic self-reliance, widely known by Thai people as the philosophy of sufficiency economy, has been widely promoted in rural Thai societies. By practicing this philosophy, it is expected that citizens’ quality of life and local environments can be sustainably improved. This study aims to explore the contribution of community practices of the sufficiency economy philosophy to rural villagers’ quality of life, and to investigate potential factors that determine villagers’ trust in the philosophy. With the aim of proposing strategies that could enhance trust and promote villagers’ practice of the philosophy, the study investigated the influence of three sets of factors on trust in the philosophy: factors related to cognition-based trust, factors related to emotion-based trust, and demographic characteristics. Questionnaire surveys and in-depth interviews with community leaders and local villagers were conducted in the Ban Jamrung community, in Thailand’s Rayong Province. The results of the statistical analysis revealed that residents who applied the sufficiency economy philosophy in their daily lives experienced a relatively better quality of life. Additionally, it was found that trust in the philosophy could be predicted more by rational factors than by emotional factors. These findings could be utilized to develop strategies to maintain and enhance people’s trust in the philosophy of sufficiency economy.

  12. Exclusive observables from a statistical simulation of energetic nuclear collisions

    International Nuclear Information System (INIS)

    Fai, G.

    1983-01-01

    Exclusive observables are calculated in the framework of a statistical model for medium-energy nuclear collisions. The collision system is divided into a few (participant/spectator) sources, that are assumed to disassemble independently. Sufficiently excited sources explode into pions, nucleons, and composite, possibly particle unstable, nuclei. The different final states compete according to their microcanonical weight. Less excited sources, and the unstable explosion products, deexcite via light-particle evaporation. The model has been implemented as a Monte Carlo computer code that is sufficiently efficient to permit generation of large event samples. Some illustrative applications are discussed. (author)

  13. View Discovery in OLAP Databases through Statistical Combinatorial Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Joslyn, Cliff A.; Burke, Edward J.; Critchlow, Terence J.

    2009-05-01

    The capability of OLAP database software systems to handle data complexity comes at a high price for analysts, presenting them with a combinatorially vast space of views of a relational database. We respond to the need for technologies that allow users to guide themselves to areas of local structure by casting the space of "views" of an OLAP database as a combinatorial object of all projections and subsets, and "view discovery" as a search process over that lattice. We equip the view lattice with statistical information-theoretical measures sufficient to support a combinatorial optimization process. We outline "hop-chaining" as a particular view discovery algorithm over this object, wherein users are guided across a permutation of the dimensions by searching for successive two-dimensional views, pushing seen dimensions into an increasingly large background filter in a "spiraling" search process. We illustrate this work in the context of data cubes recording summary statistics for radiation portal monitors at US ports.
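    As a rough illustration of scoring two-dimensional views with an information measure (a toy sketch only; the dimensions and rows below are invented, and the paper's measures and hop-chaining search are considerably richer):

```python
import math
from itertools import combinations
from collections import Counter

# Toy fact table with three invented dimensions (port, mode, alarm level).
rows = [
    ("portA", "truck", "low"),  ("portA", "truck", "low"),
    ("portA", "ship",  "high"), ("portB", "ship",  "high"),
    ("portB", "truck", "low"),  ("portB", "ship",  "low"),
]
dims = {"port": 0, "mode": 1, "alarm": 2}

def view_entropy(d1, d2):
    """Shannon entropy of the joint distribution in the 2-D view (d1, d2)."""
    counts = Counter((r[dims[d1]], r[dims[d2]]) for r in rows)
    n = sum(counts.values())
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# Pick the most informative 2-D view, the greedy step a lattice search repeats.
best = max(combinations(dims, 2), key=lambda v: view_entropy(*v))
print(best, view_entropy(*best))
```

    A lattice search such as hop-chaining would repeat a step like this, fixing the chosen dimensions into the background filter before scoring the next view.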

  14. Football goal distributions and extremal statistics

    Science.gov (United States)

    Greenhough, J.; Birch, P. C.; Chapman, S. C.; Rowlands, G.

    2002-12-01

    We analyse the distributions of the number of goals scored by home teams, away teams, and the total scored in the match, in domestic football games from 169 countries between 1999 and 2001. The probability density functions (PDFs) of goals scored are too heavy-tailed to be fitted over their entire ranges by Poisson or negative binomial distributions which would be expected for uncorrelated processes. Log-normal distributions cannot include zero scores and here we find that the PDFs are consistent with those arising from extremal statistics. In addition, we show that it is sufficient to model English top division and FA Cup matches in the seasons of 1970/71-2000/01 on Poisson or negative binomial distributions, as reported in analyses of earlier seasons, and that these are not consistent with extremal statistics.
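    The kind of tail mismatch described can be illustrated with a toy calculation (the goal sample below is invented, not the paper's data): fit a Poisson distribution by its mean and compare the modeled tail mass with the empirical one.

```python
import math

# Invented sample of goals per match (illustrative only).
goals = [0, 1, 1, 2, 0, 3, 1, 2, 4, 0,
         1, 2, 1, 0, 5, 1, 2, 3, 0, 1,
         7, 2, 1, 0, 1]

lam = sum(goals) / len(goals)  # maximum-likelihood Poisson rate

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam**k / math.factorial(k)

tail = 5
p_model = 1 - sum(poisson_pmf(k, lam) for k in range(tail))
p_empirical = sum(g >= tail for g in goals) / len(goals)
print(f"Poisson P(X>={tail}) = {p_model:.4f}, empirical = {p_empirical:.4f}")
# A heavy-tailed sample puts noticeably more mass in the tail than the
# fitted Poisson, which is the mismatch the paper reports over full ranges.
```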

  15. Improving the Perception of Self-Sufficiency towards Creative Drama

    Science.gov (United States)

    Pekdogan, Serpil; Korkmaz, Halil Ibrahim

    2016-01-01

    The purpose of this study is to investigate the effects of a Creative Drama Based Perception of Self-sufficiency Skills Training Program on 2nd grade bachelor degree students' (who are attending a preschool teacher training program) perception of self-sufficiency. This is a quasi-experimental study. A total of 50 students were equally divided into…

  16. Statistical analysis of dynamic parameters of the core

    International Nuclear Information System (INIS)

    Ionov, V.S.

    2007-01-01

    Transients of various types were investigated for the cores of zero-power critical facilities at RRC KI and at NPPs. The dynamic parameters of the neutron transients were explored with statistical analysis tools. The records have sufficient duration and comprise several channels for chamber currents and reactivity, as well as channels for technological parameters. From these values the inverse period, reactivity, neutron lifetime, reactivity coefficients and several reactivity effects are determined, and the measured dynamic parameters were reconstructed as a result of the analysis. The mathematical means of statistical analysis used were: approximation (A), filtration (F), rejection (R), estimation of descriptive statistics parameters (DSP), correlation characteristics (KK), regression analysis (KP), prognosis (P) and statistical criteria (SC). The calculation procedures were implemented in MATLAB. The sources of methodical and statistical errors are presented: inadequacy of the model, precision of the neutron-physical parameters, features of the registered processes, the mathematical model used in reactivity meters, the technique for processing the registered data, etc. Examples of the results of the statistical analysis are given. Problems of the validity of the methods used for the definition and certification of the values of statistical parameters and dynamic characteristics are considered (Authors)

  17. Reassessing Rogers' necessary and sufficient conditions of change.

    Science.gov (United States)

    Watson, Jeanne C

    2007-09-01

    This article reviews the impact of Carl Rogers' postulate about the necessary and sufficient conditions of therapeutic change on the field of psychotherapy. It is proposed that his article (see record 2007-14630-002) made an impact in two ways; first, by acting as a spur to researchers to identify the active ingredients of therapeutic change; and, second, by providing guidelines for therapeutic practice. The role of the necessary and sufficient conditions in process-experiential therapy, an emotion-focused therapy for individuals, and their limitations in terms of research and practice are discussed. It is proposed that although the conditions are necessary and important in promoting clients' affect regulation, they do not take sufficient account of other moderating variables that affect clients' response to treatment and may need to be balanced with more structured interventions. Notwithstanding, Rogers highlighted a way of interacting with clients that is generally acknowledged as essential to effective psychotherapy practice. (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  18. Deuterium-tritium fuel self-sufficiency in fusion reactors

    International Nuclear Information System (INIS)

    Abdou, M.A.; Vold, E.L.; Gung, C.Y.; Youssef, M.Z.; Shin, K.

    1986-01-01

    Conditions necessary to achieve deuterium-tritium fuel self-sufficiency in fusion reactors are derived through extensive modeling and calculations of the required and achievable tritium breeding ratios as functions of the many reactor parameters and candidate design concepts. It is found that the excess margin in the breeding potential is not sufficient to cover all present uncertainties. Thus, the goal of attaining fuel self-sufficiency significantly restricts the allowable parameter space and design concepts. For example, the required breeding ratio can be reduced by (A) attaining high tritium fractional burnup, >5%, in the plasma, (B) achieving very high reliability, >99%, and very short times, <1 day, to fix failures in the tritium processing system, and (C) ensuring that nonradioactive decay losses from all subsystems are extremely low, e.g., <0.1% for the plasma exhaust processing system. The uncertainties due to nuclear data and calculational methods are found to be significant, but they are substantially smaller than those due to uncertainties in system definition

  19. Long-Term Resource Adequacy, Long-Term Flexibility Requirements, and Revenue Sufficiency

    Energy Technology Data Exchange (ETDEWEB)

    Milligan, Michael [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Bloom, Aaron P [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Townsend, Aaron [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Ela, Erik [Electric Power Research Institute; Botterud, Audun [Argonne National Laboratory; Levin, Todd [Argonne National Laboratory

    2018-02-15

    Variable generation (VG) can reduce market prices over time and also the energy that other suppliers can sell in the market. The suppliers that are needed to provide capacity and flexibility to meet the long-term reliability requirements may, therefore, earn less revenue. This chapter discusses the topics of resource adequacy and revenue sufficiency - that is, determining and acquiring the quantity of capacity that will be needed at some future date and ensuring that those suppliers that offer the capacity receive sufficient revenue to recover their costs. The focus is on the investment time horizon and the installation of sufficient generation capability. First, the chapter discusses resource adequacy, including newer methods of determining adequacy metrics. The chapter then focuses on revenue sufficiency and how suppliers have sufficient opportunity to recover their total costs. The chapter closes with a description of the mechanisms traditionally adopted by electricity markets to mitigate the issues of resource adequacy and revenue sufficiency and discusses the most recent market design changes to address these issues.

  20. Sufficient conditions for Lagrange, Mayer, and Bolza optimization problems

    Directory of Open Access Journals (Sweden)

    Boltyanski V.

    2001-01-01

    Full Text Available The Maximum Principle [2,13] is a well-known necessary condition for optimality. This condition, generally, is not sufficient. In [3], the author proved that if there exists a regular synthesis of trajectories, the Maximum Principle is also a sufficient condition for time-optimality. In this article, we generalize this result to Lagrange, Mayer, and Bolza optimization problems.

  1. A novel approach for choosing summary statistics in approximate Bayesian computation.

    Science.gov (United States)

    Aeschbacher, Simon; Beaumont, Mark A; Futschik, Andreas

    2012-11-01

    The choice of summary statistics is a crucial step in approximate Bayesian computation (ABC). Since statistics are often not sufficient, this choice involves a trade-off between loss of information and reduction of dimensionality. The latter may increase the efficiency of ABC. Here, we propose an approach for choosing summary statistics based on boosting, a technique from the machine-learning literature. We consider different types of boosting and compare them to partial least-squares regression as an alternative. To mitigate the lack of sufficiency, we also propose an approach for choosing summary statistics locally, in the putative neighborhood of the true parameter value. We study a demographic model motivated by the reintroduction of Alpine ibex (Capra ibex) into the Swiss Alps. The parameters of interest are the mean and standard deviation across microsatellites of the scaled ancestral mutation rate (θ_anc = 4N_e·u) and the proportion of males obtaining access to matings per breeding season (ω). By simulation, we assess the properties of the posterior distribution obtained with the various methods. According to our criteria, ABC with summary statistics chosen locally via boosting with the L2-loss performs best. Applying that method to the ibex data, we estimate θ_anc ≈ 1.288 and find that most of the variation across loci of the ancestral mutation rate u is between 7.7 × 10^-4 and 3.5 × 10^-3 per locus per generation. The proportion of males with access to matings is estimated as ω ≈ 0.21, which is in good agreement with recent independent estimates.
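    For readers unfamiliar with ABC, the basic rejection scheme the paper builds on can be sketched as follows (a toy Gaussian model with invented parameters; the paper's contribution, choosing the summary statistics via boosting, is not reproduced here):

```python
import random

random.seed(0)

def simulate(theta, n=100):
    # Toy model: n Gaussian draws with unknown mean theta, unit variance.
    return [random.gauss(theta, 1.0) for _ in range(n)]

def summary(data):
    # Sample mean, which happens to be a sufficient statistic in this toy model.
    return sum(data) / len(data)

observed = simulate(2.0)          # pretend data with true theta = 2.0
s_obs = summary(observed)

# Rejection ABC: draw from the prior, keep draws whose simulated summary
# falls within a small distance of the observed summary.
accepted = []
for _ in range(10000):
    theta = random.uniform(-5, 5)                       # prior draw
    if abs(summary(simulate(theta)) - s_obs) < 0.1:     # distance threshold
        accepted.append(theta)

print(len(accepted), sum(accepted) / len(accepted))     # approximate posterior mean
```

    With an insufficient summary statistic the accepted sample loses information about the parameter, which is the trade-off the paper addresses.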

  2. A statistical mechanical interpretation of algorithmic information theory: Total statistical mechanical interpretation based on physical argument

    International Nuclear Information System (INIS)

    Tadaki, Kohtaro

    2010-01-01

    The statistical mechanical interpretation of algorithmic information theory (AIT, for short) was introduced and developed by our former works [K. Tadaki, Local Proceedings of CiE 2008, pp. 425-434, 2008] and [K. Tadaki, Proceedings of LFCS'09, Springer's LNCS, vol. 5407, pp. 422-440, 2009], where we introduced the notion of thermodynamic quantities, such as partition function Z(T), free energy F(T), energy E(T), statistical mechanical entropy S(T), and specific heat C(T), into AIT. We then discovered that, in this interpretation, the temperature T equals the partial randomness of the values of all these thermodynamic quantities, where the notion of partial randomness is a stronger representation of the compression rate by means of program-size complexity. Furthermore, we showed that this situation holds for the temperature T itself, which is one of the most typical thermodynamic quantities. Namely, we showed that, for each of the thermodynamic quantities Z(T), F(T), E(T), and S(T) above, the computability of its value at temperature T gives a sufficient condition for T ∈ (0,1) to satisfy the condition that the partial randomness of T equals T. In this paper, based on a physical argument on the same level of mathematical strictness as normal statistical mechanics in physics, we develop a total statistical mechanical interpretation of AIT which actualizes a perfect correspondence to normal statistical mechanics. We do this by identifying a microcanonical ensemble in the framework of AIT. As a result, we clarify the statistical mechanical meaning of the thermodynamic quantities of AIT.
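    For orientation, the thermodynamic quantities named in the abstract take roughly the following form in Tadaki's framework (a hedged sketch; notation assumed from his earlier papers, not quoted from this one):

```latex
% For an optimal prefix-free machine U with halting programs p in dom U:
%   Z(T) = \sum_{p \in \mathrm{dom}\,U} 2^{-|p|/T},          % partition function
%   F(T) = -T \log_2 Z(T),                                    % free energy
%   E(T) = \frac{1}{Z(T)} \sum_{p \in \mathrm{dom}\,U} |p|\, 2^{-|p|/T},
%   S(T) = \frac{E(T) - F(T)}{T}.                              % entropy
% At T = 1 the partition function Z(1) is Chaitin's halting probability \Omega.
```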

  3. Transforming the energy system: Why municipalities strive for energy self-sufficiency

    International Nuclear Information System (INIS)

    Engelken, Maximilian; Römer, Benedikt; Drescher, Marcus; Welpe, Isabell

    2016-01-01

    Despite evidence that a rising number of municipalities in Germany are striving for energy self-sufficiency, there is little understanding of the driving factors behind this development. We investigate economic, ecological, social and energy-system-related factors that drive municipalities to strive for energy self-sufficiency, with a focus on electricity supply. The empirical data for this study is based on insights generated through expert interviews (N = 19) with mayors, energy experts and scientists, as well as a quantitative study among mayors and energy officers (N = 109) of German municipalities. Results show that environmental awareness, tax revenues and greater independence from private utilities are positively related to the mayors’ attitude towards the realization of energy self-sufficiency. Furthermore, citizens, the political environment, the mayor's political power, and his/her financial resources are relevant factors for a municipality striving for energy self-sufficiency. Policymakers need to decide whether or not to support mayors in this development. For suitable policy interventions, the results suggest the importance of an integrated approach that considers a combination of the identified factors. Finally, we propose a morphological box to structure different aspects of energy self-sufficiency and categorize the present study. - Highlights: • Municipalities striving for energy self-sufficiency can play a key role in the transition of the energy system. • Tax revenues and environmental awareness are the main drivers behind mayors’ attitude towards energy self-sufficiency. • Citizens and the political environment are the main influences on mayors striving for energy self-sufficiency. • 19 expert interviews were analyzed for the framework of the study, based on the theory of planned behavior (TPB). • 109 mayors and energy officers participated in the quantitative main survey.

  4. Sufficient conditions for uniqueness of the weak value

    International Nuclear Information System (INIS)

    Dressel, J; Jordan, A N

    2012-01-01

    We review and clarify the sufficient conditions for uniquely defining the generalized weak value as the weak limit of a conditioned average using the contextual values formalism introduced in Dressel, Agarwal and Jordan (2010 Phys. Rev. Lett. http://dx.doi.org/10.1103/PhysRevLett.104.240401). We also respond to criticism of our work by Parrott (arXiv:1105.4188v1) concerning a proposed counter-example to the uniqueness of the definition of the generalized weak value. The counter-example does not satisfy our prescription in the case of an underspecified measurement context. We show that when the contextual values formalism is properly applied to this example, a natural interpretation of the measurement emerges and the unique definition in the weak limit holds. We also prove a theorem regarding the uniqueness of the definition under our sufficient conditions for the general case. Finally, a second proposed counter-example by Parrott (arXiv:1105.4188v6) is shown not to satisfy the sufficiency conditions for the provided theorem. (paper)

  5. Increasing urban water self-sufficiency: New era, new challenges

    DEFF Research Database (Denmark)

    Rygaard, Martin; Binning, Philip John; Albrechtsen, Hans-Jørgen

    2011-01-01

    and 15 in-depth case studies, solutions used to increase water self-sufficiency in urban areas are analyzed. The main drivers for increased self-sufficiency were identified to be direct and indirect lack of water, constrained infrastructure, high quality water demands and commercial and institutional...... pressures. Case studies demonstrate increases in self-sufficiency ratios to as much as 80% with contributions from recycled water, seawater desalination and rainwater collection. The introduction of alternative water resources raises several challenges: energy requirements vary by more than a factor of ten...... amongst the alternative techniques, wastewater reclamation can lead to the appearance of trace contaminants in drinking water, and changes to the drinking water system can meet tough resistance from the public. Public water-supply managers aim to achieve a high level of reliability and stability. We...

  6. Denmark. Self-sufficiency and reserves management

    International Nuclear Information System (INIS)

    Erceville, H. d'.

    1997-01-01

    Since 1991, Denmark has been self-sufficient in, and a net exporter of, petroleum and natural gas. Like all countries bordering the North Sea, it enjoys many advantages. However, Denmark exports and imports about a third of its hydrocarbons. This policy is a way to control its reserves for the future. (J.S.)

  7. Small-number statistics near the clustering transition in a compartmentalized granular gas

    NARCIS (Netherlands)

    Mikkelsen, René; van der Weele, Ko; van der Meer, Devaraj; van Hecke, Martin; Lohse, Detlef

    2005-01-01

    Statistical fluctuations are observed to profoundly influence the clustering behavior of granular material in a vibrated system consisting of two connected compartments. When the number of particles N is sufficiently large (N > 300 is sufficient), the clustering follows the lines of a standard

  8. 33 CFR 115.30 - Sufficiency of State authority for bridges.

    Science.gov (United States)

    2010-07-01

    ... for bridges. 115.30 Section 115.30 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY BRIDGES BRIDGE LOCATIONS AND CLEARANCES; ADMINISTRATIVE PROCEDURES § 115.30 Sufficiency of State authority for bridges. An opinion of the attorney general of the State as to the sufficiency of State...

  9. Enough is as good as a feast - sufficiency as policy. Volume 1

    International Nuclear Information System (INIS)

    Darby, Sarah

    2007-01-01

    The concept of sufficiency has a long history, related as it is to the timeless issues of how best to distribute and use resources. Where energy is concerned, absolute reductions in demand are increasingly seen as necessary in response to climate change and energy security concerns. There is an acknowledgement that, collectively if not individually, humans have gone beyond safe limits in their use of fuels. The relatively wealthy and industrialised nations urgently need to move beyond a primary focus on efficiency to the more contentious issues surrounding demand reduction and sufficiency. The paper considers definitions of energy sufficiency, looks at a recent attempt to model future energy use in terms of efficiency and sufficiency, and discusses quantitative and qualitative aspects of sufficiency and how they might become institutionalised. There are many arguments in favour of sufficiency but they often founder in the face of political requirements for market growth and the employment generated by it. Some options for 'sufficiency policy' are selected, including a focus on energy in relation to livelihoods, energy implications of our use of time and making energy use more transparent

  10. Enough is as good as a feast - sufficiency as policy. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Darby, Sarah [Lower Carbon Futures, Environmental Change Inst., Oxford Univ. Centre for the Environment (United Kingdom)

    2007-07-01

    The concept of sufficiency has a long history, related as it is to the timeless issues of how best to distribute and use resources. Where energy is concerned, absolute reductions in demand are increasingly seen as necessary in response to climate change and energy security concerns. There is an acknowledgement that, collectively if not individually, humans have gone beyond safe limits in their use of fuels. The relatively wealthy and industrialised nations urgently need to move beyond a primary focus on efficiency to the more contentious issues surrounding demand reduction and sufficiency. The paper considers definitions of energy sufficiency, looks at a recent attempt to model future energy use in terms of efficiency and sufficiency, and discusses quantitative and qualitative aspects of sufficiency and how they might become institutionalised. There are many arguments in favour of sufficiency but they often founder in the face of political requirements for market growth and the employment generated by it. Some options for 'sufficiency policy' are selected, including a focus on energy in relation to livelihoods, energy implications of our use of time and making energy use more transparent.

  11. On the statistical assessment of classifiers using DNA microarray data

    Directory of Open Access Journals (Sweden)

    Carella M

    2006-08-01

    Full Text Available Abstract Background In this paper we present a method for the statistical assessment of cancer predictors which make use of gene expression profiles. The methodology is applied to a new data set of microarray gene expression data collected in Casa Sollievo della Sofferenza Hospital, Foggia, Italy. The data set is made up of normal (22) and tumor (25) specimens extracted from 25 patients affected by colon cancer. We propose to give answers to some questions which are relevant for the automatic diagnosis of cancer such as: Is the size of the available data set sufficient to build accurate classifiers? What is the statistical significance of the associated error rates? In what ways can accuracy be considered dependent on the adopted classification scheme? How many genes are correlated with the pathology and how many are sufficient for an accurate colon cancer classification? The method we propose answers these questions whilst avoiding the potential pitfalls hidden in the analysis and interpretation of microarray data. Results We estimate the generalization error, evaluated through the Leave-K-Out Cross Validation error, for three different classification schemes by varying the number of training examples and the number of genes used. The statistical significance of the error rate is measured by using a permutation test. We provide a statistical analysis in terms of the frequencies of the genes involved in the classification. Using the whole set of genes, we found that the Weighted Voting Algorithm (WVA) classifier learns the distinction between normal and tumor specimens with 25 training examples, providing e = 21% (p = 0.045) as an error rate. This remains constant even when the number of examples increases. Moreover, Regularized Least Squares (RLS) and Support Vector Machines (SVM) classifiers can learn with only 15 training examples, with an error rate of e = 19% (p = 0.035) and e = 18% (p = 0.037) respectively. Moreover, the error rate
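    The significance procedure described (a cross-validated error rate checked against a permutation null) can be sketched on invented one-dimensional data; the paper's classifiers and microarray data are not reproduced here, and the nearest-centroid rule below is a stand-in:

```python
import random

random.seed(1)
# Invented 1-D "expression values": 20 normal and 20 tumor samples.
X = [[random.gauss(m, 1.0)] for m in [0.0] * 20 + [1.5] * 20]
y = [0] * 20 + [1] * 20

def loo_error(X, y):
    """Leave-one-out error of a nearest-centroid classifier."""
    errors = 0
    for i in range(len(X)):
        cents = {}
        for c in (0, 1):
            pts = [X[j][0] for j in range(len(X)) if j != i and y[j] == c]
            cents[c] = sum(pts) / len(pts)
        pred = min(cents, key=lambda c: abs(X[i][0] - cents[c]))
        errors += (pred != y[i])
    return errors / len(X)

e_obs = loo_error(X, y)
# Permutation test: shuffle class labels and see how often a random labeling
# yields an error at least as low as the observed one.
perms = [loo_error(X, random.sample(y, len(y))) for _ in range(100)]
p_value = sum(e <= e_obs for e in perms) / len(perms)
print(f"LOO error = {e_obs:.2f}, permutation p = {p_value:.2f}")
```

    A small p-value indicates that the observed error rate is unlikely under random labelings, which is how the paper attaches significance to its error estimates.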

  12. Sufficient Statistics for Divergence and the Probability of Misclassification

    Science.gov (United States)

    Quirein, J.

    1972-01-01

    One particular aspect of the feature selection problem is considered, which results from the transformation x = Bz, where B is a k by n matrix of rank k with k ≤ n. It is shown that, in general, such a transformation results in a loss of information. In terms of the divergence, this is equivalent to the fact that the average divergence computed using the variable x is less than or equal to the average divergence computed using the variable z. A loss of information in terms of the probability of misclassification is shown to be equivalent to the fact that the probability of misclassification computed using variable x is greater than or equal to the probability of misclassification computed using variable z. First, the necessary facts relating k-dimensional and n-dimensional integrals are derived. Then the mentioned results about the divergence and probability of misclassification are derived. Finally, it is shown that if no information is lost (in x = Bz) as measured by the divergence, then no information is lost as measured by the probability of misclassification.
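    The two inequalities described can be stated compactly (a hedged sketch in standard pattern-recognition notation, assumed rather than quoted from the report):

```latex
% For classes i, j with densities p_i, p_j, the divergence is
%   D_z(i,j) = \int \bigl(p_i(z) - p_j(z)\bigr)\,
%              \ln\frac{p_i(z)}{p_j(z)}\, dz .
% Under the linear feature reduction x = Bz (B of size k-by-n, rank k <= n):
%   D_x(i,j) \le D_z(i,j),
% with equality when no information is lost; correspondingly, for the
% probability of misclassification P_e,
%   P_e(x) \ge P_e(z).
```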

  13. Quantification is Neither Necessary Nor Sufficient for Measurement

    International Nuclear Information System (INIS)

    Mari, Luca; Maul, Andrew; Torres Irribarra, David; Wilson, Mark

    2013-01-01

    Being an infrastructural, widespread activity, measurement is laden with stereotypes. Some of these concern the role of measurement in the relation between quality and quantity. In particular, it is sometimes argued or assumed that quantification is necessary for measurement; it is also sometimes argued or assumed that quantification is sufficient for or synonymous with measurement. To assess the validity of these positions the concepts of measurement and quantitative evaluation should be independently defined and their relationship analyzed. We contend that the defining characteristic of measurement should be the structure of the process, not a feature of its results. Under this perspective, quantitative evaluation is neither sufficient nor necessary for measurement

  14. The level of energy sufficiency - why all the controversy?

    International Nuclear Information System (INIS)

    Maillard, D.

    2000-01-01

    It has become fashionable in certain circles to attempt to demolish the notion of energy sufficiency, a concept which is now seen as being archaic and unsuitable. To back up their claims, proponents of this standpoint take great pleasure in attacking the corresponding indicator - the rate of energy sufficiency calculated as a ratio of national primary energy production and the total consumption of primary energy (in the same unit and without climatic corrections). Confirming its precarious, conventional and debatable nature seems in their eyes to be the best means of ensuring that the word, the concept and the measuring method of energy sufficiency are all consigned to the dustbin of economic history. After having examined, with perhaps a certain irony, some of the usual criticisms, I intend to proceed with a re-examination of questions which in my eyes appear to be essential. (author)

  15. The Parallel C++ Statistical Library ‘QUESO’: Quantification of Uncertainty for Estimation, Simulation and Optimization

    KAUST Repository

    Prudencio, Ernesto E.; Schulz, Karl W.

    2012-01-01

    QUESO is a collection of statistical algorithms and programming constructs supporting research into the uncertainty quantification (UQ) of models and their predictions. It has been designed with three objectives: it should (a) be sufficiently

  16. Asymmetric beams and CMB statistical anisotropy

    International Nuclear Information System (INIS)

    Hanson, Duncan; Lewis, Antony; Challinor, Anthony

    2010-01-01

    Beam asymmetries result in statistically anisotropic cosmic microwave background (CMB) maps. Typically, they are studied for their effects on the CMB power spectrum; however, they more closely mimic anisotropic effects such as gravitational lensing and primordial power asymmetry. We discuss tools for studying the effects of beam asymmetry on general quadratic estimators of anisotropy, analytically for full-sky observations as well as in the analysis of realistic data. We demonstrate this methodology in application to a recently detected 9σ quadrupolar modulation effect in the WMAP data, showing that beams provide a complete and sufficient explanation for the anomaly.

  17. Error Bounds: Necessary and Sufficient Conditions

    Czech Academy of Sciences Publication Activity Database

    Outrata, Jiří; Kruger, A.Y.; Fabian, Marián; Henrion, R.

    2010-01-01

    Roč. 18, č. 2 (2010), s. 121-149 ISSN 1877-0533 R&D Projects: GA AV ČR IAA100750802 Institutional research plan: CEZ:AV0Z10750506; CEZ:AV0Z10190503 Keywords : Error bounds * Calmness * Subdifferential * Slope Subject RIV: BA - General Mathematics Impact factor: 0.333, year: 2010 http://library.utia.cas.cz/separaty/2010/MTR/outrata-error bounds necessary and sufficient conditions.pdf

  18. The Army Ethic-Inchoate but Sufficient

    Science.gov (United States)

    2015-06-12

    are constraints imposed by this thesis. Delimitations include the scope, jus ad bellum, cultural relativism, descriptive ethics, and implementation...politicians. Third, this thesis will not look in depth at cultural relativism and how changes in laws and society's philosophical and ethical ...THE ARMY ETHIC – INCHOATE BUT SUFFICIENT A thesis presented to the Faculty of the U.S. Army Command and General Staff College

  19. Innovations in Statistical Observations of Consumer Prices

    Directory of Open Access Journals (Sweden)

    Olga Stepanovna Oleynik

    2016-10-01

    Full Text Available This article analyzes the innovative changes in the methodology of statistical surveys of consumer prices. These changes are reflected in the “Official statistical methodology for the organization of statistical observation of consumer prices for goods and services and the calculation of the consumer price index”, approved by order of the Federal State Statistics Service of December 30, 2014 no. 734. The essence of the innovation is the use of mathematical methods in determining the range of surveyed objects of trade and services, and in calculating the sufficient number of observed price quotes based on price dispersion, the proportion of the observed product (service), a representative of consumer spending, as well as the indicator of the complexity of price registration. The authors analyzed the mathematical calculations of the required number of quotations for observation in the Volgograd region in 2016, and the results of these calculations are compared with the number of quotes included in the monitoring. The authors believe that the implementation of these mathematical models made it possible to substantially reduce the influence of the subjective factor in the organization of monitoring of consumer prices, and therefore to increase the objectivity of the resulting statistics on consumer prices and inflation. At the same time, the proposed methodology needs further improvement in terms of payment for goods, products (services) by representatives having a minor share in consumer expenditure.

  20. Solvability of a class of systems of infinite-dimensional integral equations and their application in statistical mechanics

    International Nuclear Information System (INIS)

    Gonchar, N.S.

    1986-01-01

    This paper presents a mathematical method developed for investigating a class of systems of infinite-dimensional integral equations which have application in statistical mechanics. Necessary and sufficient conditions are obtained for the uniqueness and bifurcation of the solution of this class of systems of equations. Problems of equilibrium statistical mechanics are considered on the basis of this method

  1. Statistical learning in a natural language by 8-month-old infants.

    Science.gov (United States)

    Pelucchi, Bruna; Hay, Jessica F; Saffran, Jenny R

    2009-01-01

    Numerous studies over the past decade support the claim that infants are equipped with powerful statistical language learning mechanisms. The primary evidence for statistical language learning in word segmentation comes from studies using artificial languages, continuous streams of synthesized syllables that are highly simplified relative to real speech. To what extent can these conclusions be scaled up to natural language learning? In the current experiments, English-learning 8-month-old infants' ability to track transitional probabilities in fluent infant-directed Italian speech was tested (N = 72). The results suggest that infants are sensitive to transitional probability cues in unfamiliar natural language stimuli, and support the claim that statistical learning is sufficiently robust to support aspects of real-world language acquisition.
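
    The forward transitional probability such studies rely on, TP(XY) = frequency(XY) / frequency(X), is easy to make concrete. A minimal sketch follows; the syllable stream below is invented for illustration, not the Italian stimuli used in the study:

```python
from collections import Counter

def transitional_probabilities(syllables):
    """Forward transitional probability TP(x->y) = freq(xy) / freq(x)."""
    pair_counts = Counter(zip(syllables, syllables[1:]))
    first_counts = Counter(syllables[:-1])
    return {(x, y): n / first_counts[x] for (x, y), n in pair_counts.items()}

# Hypothetical stream: "fu-ga" always co-occurs (a "word"), so its
# internal TP is 1.0; pairs spanning a word boundary score lower.
stream = ["fu", "ga", "bi", "do", "fu", "ga", "me", "lu", "fu", "ga"]
tps = transitional_probabilities(stream)
print(tps[("fu", "ga")])  # → 1.0 (within-word)
print(tps[("ga", "bi")])  # → 0.5 (across a boundary)
```

    A learner tracking these statistics can posit word boundaries wherever TP dips, which is the cue the infants in this study are argued to exploit.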

  2. Whose statistical reasoning is facilitated by a causal structure intervention?

    Science.gov (United States)

    McNair, Simon; Feeney, Aidan

    2015-02-01

    People often struggle when making Bayesian probabilistic estimates on the basis of competing sources of statistical evidence. Recently, Krynski and Tenenbaum (Journal of Experimental Psychology: General, 136, 430-450, 2007) proposed that a causal Bayesian framework accounts for people's errors in Bayesian reasoning and showed that, by clarifying the causal relations among the pieces of evidence, judgments on a classic statistical reasoning problem could be significantly improved. We aimed to understand whose statistical reasoning is facilitated by the causal structure intervention. In Experiment 1, although we observed causal facilitation effects overall, the effect was confined to participants high in numeracy. We did not find an overall facilitation effect in Experiment 2 but did replicate the earlier interaction between numerical ability and the presence or absence of causal content. This effect held when we controlled for general cognitive ability and thinking disposition. Our results suggest that clarifying causal structure facilitates Bayesian judgments, but only for participants with sufficient understanding of basic concepts in probability and statistics.
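
    The computation participants struggle with in such problems is a single application of Bayes' theorem. A sketch with the classic illustrative base-rate numbers (not figures taken from this study):

```python
def bayes_posterior(prior, sensitivity, false_positive_rate):
    """P(hypothesis | positive evidence) via Bayes' theorem."""
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_positive

# Illustrative numbers: 1% base rate, 80% hit rate, 9.6% false-positive
# rate. The normatively correct posterior is far below most intuitions.
posterior = bayes_posterior(0.01, 0.8, 0.096)
print(round(posterior, 3))  # → 0.078
```

    The typical error is to neglect the base rate and answer something near the 80% hit rate; clarifying the causal origin of the false positives is the intervention the study examines.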

  3. Sufficient Condition for Estimation in Designing H∞ Filter-Based SLAM

    Directory of Open Access Journals (Sweden)

    Nur Aqilah Othman

    2015-01-01

    Full Text Available Extended Kalman filter (EKF) is often employed in determining the position of mobile robot and landmarks in simultaneous localization and mapping (SLAM). Nonetheless, there are some disadvantages of using EKF, namely, the requirement of Gaussian distribution for the state and noises, as well as the fact that it requires the smallest possible initial state covariance. This has led researchers to find alternative ways to mitigate the aforementioned shortcomings. Therefore, this study is conducted to propose an alternative technique by implementing H∞ filter in SLAM instead of EKF. In implementing H∞ filter in SLAM, the parameters of the filter, especially γ, need to be properly defined to prevent finite escape time problem. Hence, this study proposes a sufficient condition for the estimation purposes. Two distinct cases of initial state covariance are analysed considering an indoor environment to ensure the best solution for SLAM problem exists along with considerations of process and measurement noises statistical behaviour. If the prescribed conditions are not satisfied, then the estimation would exhibit unbounded uncertainties and consequently results in erroneous inference about the robot and landmarks estimation. The simulation results have shown the reliability and consistency as suggested by the theoretical analysis and our previous findings.

  4. Statistical inference an integrated approach

    CERN Document Server

    Migon, Helio S; Louzada, Francisco

    2014-01-01

    Introduction Information The concept of probability Assessing subjective probabilities An example Linear algebra and probability Notation Outline of the book Elements of Inference Common statistical models Likelihood-based functions Bayes theorem Exchangeability Sufficiency and exponential family Parameter elimination Prior Distribution Entirely subjective specification Specification through functional forms Conjugacy with the exponential family Non-informative priors Hierarchical priors Estimation Introduction to decision theory Bayesian point estimation Classical point estimation Empirical Bayes estimation Comparison of estimators Interval estimation Estimation in the Normal model Approximating Methods The general problem of inference Optimization techniques Asymptotic theory Other analytical approximations Numerical integration methods Simulation methods Hypothesis Testing Introduction Classical hypothesis testing Bayesian hypothesis testing Hypothesis testing and confidence intervals Asymptotic tests Prediction...

  5. The bag-of-frames approach: A not so sufficient model for urban soundscapes

    Science.gov (United States)

    Lagrange, Mathieu; Lafay, Grégoire; Défréville, Boris; Aucouturier, Jean-Julien

    2015-11-01

    The "bag-of-frames" approach (BOF), which encodes audio signals as the long-term statistical distribution of short-term spectral features, is commonly regarded as an effective and sufficient way to represent environmental sound recordings (soundscapes) since its introduction in an influential 2007 article. The present paper describes a concep-tual replication of this seminal article using several new soundscape datasets, with results strongly questioning the adequacy of the BOF approach for the task. We show that the good accuracy originally re-ported with BOF likely result from a particularly thankful dataset with low within-class variability, and that for more realistic datasets, BOF in fact does not perform significantly better than a mere one-point av-erage of the signal's features. Soundscape modeling, therefore, may not be the closed case it was once thought to be. Progress, we ar-gue, could lie in reconsidering the problem of considering individual acoustical events within each soundscape.

  6. Mathematics Anxiety and Statistics Anxiety. Shared but Also Unshared Components and Antagonistic Contributions to Performance in Statistics

    Science.gov (United States)

    Paechter, Manuela; Macher, Daniel; Martskvishvili, Khatuna; Wimmer, Sigrid; Papousek, Ilona

    2017-01-01

    In many social science majors, e.g., psychology, students report high levels of statistics anxiety. However, these majors are often chosen by students who are less prone to mathematics and who might have experienced difficulties and unpleasant feelings in their mathematics courses at school. The present study investigates whether statistics anxiety is a genuine form of anxiety that impairs students' achievements or whether learners mainly transfer previous experiences in mathematics and their anxiety in mathematics to statistics. The relationship between mathematics anxiety and statistics anxiety, their relationship to learning behaviors and to performance in a statistics examination were investigated in a sample of 225 undergraduate psychology students (164 women, 61 men). Data were recorded at three points in time: At the beginning of term students' mathematics anxiety, general proneness to anxiety, school grades, and demographic data were assessed; 2 weeks before the end of term, they completed questionnaires on statistics anxiety and their learning behaviors. At the end of term, examination scores were recorded. Mathematics anxiety and statistics anxiety correlated highly but the comparison of different structural equation models showed that they had genuine and even antagonistic contributions to learning behaviors and performance in the examination. Surprisingly, mathematics anxiety was positively related to performance. It might be that students realized over the course of their first term that knowledge and skills in higher secondary education mathematics are not sufficient to be successful in statistics. Part of mathematics anxiety may then have strengthened positive extrinsic effort motivation by the intention to avoid failure and may have led to higher effort for the exam preparation. However, via statistics anxiety mathematics anxiety also had a negative contribution to performance. Statistics anxiety led to higher procrastination in the structural

  8. Mathematics Anxiety and Statistics Anxiety. Shared but Also Unshared Components and Antagonistic Contributions to Performance in Statistics

    Directory of Open Access Journals (Sweden)

    Manuela Paechter

    2017-07-01

    Full Text Available In many social science majors, e.g., psychology, students report high levels of statistics anxiety. However, these majors are often chosen by students who are less prone to mathematics and who might have experienced difficulties and unpleasant feelings in their mathematics courses at school. The present study investigates whether statistics anxiety is a genuine form of anxiety that impairs students' achievements or whether learners mainly transfer previous experiences in mathematics and their anxiety in mathematics to statistics. The relationship between mathematics anxiety and statistics anxiety, their relationship to learning behaviors and to performance in a statistics examination were investigated in a sample of 225 undergraduate psychology students (164 women, 61 men). Data were recorded at three points in time: At the beginning of term students' mathematics anxiety, general proneness to anxiety, school grades, and demographic data were assessed; 2 weeks before the end of term, they completed questionnaires on statistics anxiety and their learning behaviors. At the end of term, examination scores were recorded. Mathematics anxiety and statistics anxiety correlated highly but the comparison of different structural equation models showed that they had genuine and even antagonistic contributions to learning behaviors and performance in the examination. Surprisingly, mathematics anxiety was positively related to performance. It might be that students realized over the course of their first term that knowledge and skills in higher secondary education mathematics are not sufficient to be successful in statistics. Part of mathematics anxiety may then have strengthened positive extrinsic effort motivation by the intention to avoid failure and may have led to higher effort for the exam preparation. However, via statistics anxiety mathematics anxiety also had a negative contribution to performance. Statistics anxiety led to higher procrastination in

  9. New Sufficient LMI Conditions for Static Output Stabilization

    DEFF Research Database (Denmark)

    Adegas, Fabiano Daher

    2014-01-01

    This paper presents new linear matrix inequality conditions to the static output feedback stabilization problem. Although the conditions are only sufficient, numerical experiments show excellent success rates in finding a stabilizing controller....

  10. Implementing necessary and sufficient standards for radioactive waste management at LLNL

    International Nuclear Information System (INIS)

    Sims, J.M.; Ladran, A.; Hoyt, D.

    1995-01-01

    Lawrence Livermore National Laboratory (LLNL) and the U.S. Department of Energy, Oakland Field Office (DOE/OAK), are participating in a pilot program to evaluate the process to develop necessary and sufficient sets of standards for contractor activities. This concept of contractor and DOE jointly and locally deciding on what constitutes the set of standards that are necessary and sufficient to perform work safely and in compliance with federal, state, and local regulations, grew out of DOE's Department Standards Committee (Criteria for the Department's Standards Program, August 1994, DOE/EH/-0416). We have chosen radioactive waste management activities as the pilot program at LLNL. This pilot includes low-level radioactive waste, transuranic (TRU) waste, and the radioactive component of low-level and TRU mixed wastes. Guidance for the development and implementation of the necessary and sufficient set of standards is provided in "The Department of Energy Closure Process for Necessary and Sufficient Sets of Standards," March 27, 1995 (draft)

  11. Mathematical statistics essays on history and methodology

    CERN Document Server

    Pfanzagl, Johann

    2017-01-01

    This book presents a detailed description of the development of statistical theory. In the mid twentieth century, the development of mathematical statistics underwent an enduring change, due to the advent of more refined mathematical tools. New concepts like sufficiency, superefficiency, adaptivity etc. motivated scholars to reflect upon the interpretation of mathematical concepts in terms of their real-world relevance. Questions concerning the optimality of estimators, for instance, had remained unanswered for decades, because a meaningful concept of optimality (based on the regularity of the estimators, the representation of their limit distribution and assertions about their concentration by means of Anderson’s Theorem) was not yet available. The rapidly developing asymptotic theory provided approximate answers to questions for which non-asymptotic theory had found no satisfying solutions. In four engaging essays, this book presents a detailed description of how the use of mathematical methods stimulated...

  12. Sufficient conditions for positivity of non-Markovian master equations with Hermitian generators

    International Nuclear Information System (INIS)

    Wilkie, Joshua; Wong Yinmei

    2009-01-01

    We use basic physical motivations to develop sufficient conditions for positive semidefiniteness of the reduced density matrix for generalized non-Markovian integrodifferential Lindblad-Kossakowski master equations with Hermitian generators. We show that it is sufficient for the memory function to be the Fourier transform of a real positive symmetric frequency density function with certain properties. These requirements are physically motivated, and are more general and more easily checked than previously stated sufficient conditions. We also explore the decoherence dynamics numerically for some simple models using the Hadamard representation of the propagator. We show that the sufficient conditions are not necessary conditions. We also show that models exist in which the long time limit is in part determined by non-Markovian effects

  13. View discovery in OLAP databases through statistical combinatorial optimization

    Energy Technology Data Exchange (ETDEWEB)

    Hengartner, Nick W [Los Alamos National Laboratory]; Burke, John [PNNL]; Critchlow, Terence [PNNL]; Joslyn, Cliff [PNNL]; Hogan, Emilie [PNNL]

    2009-01-01

    OnLine Analytical Processing (OLAP) is a relational database technology providing users with rapid access to summary, aggregated views of a single large database, and is widely recognized for knowledge representation and discovery in high-dimensional relational databases. OLAP technologies provide intuitive and graphical access to the massively complex set of possible summary views available in large relational (SQL) structured data repositories. The capability of OLAP database software systems to handle data complexity comes at a high price for analysts, presenting them with a combinatorially vast space of views of a relational database. We respond to the need to deploy technologies sufficient to allow users to guide themselves to areas of local structure by casting the space of 'views' of an OLAP database as a combinatorial object of all projections and subsets, and 'view discovery' as a search process over that lattice. We equip the view lattice with statistical information-theoretic measures sufficient to support a combinatorial optimization process. We outline 'hop-chaining' as a particular view discovery algorithm over this object, wherein users are guided across a permutation of the dimensions by searching for successive two-dimensional views, pushing seen dimensions into an increasingly large background filter in a 'spiraling' search process. We illustrate this work in the context of data cubes recording summary statistics for radiation portal monitors at US ports.
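
    One way to make the idea of scoring candidate two-dimensional views with an information-theoretic measure concrete is to rank views by the mutual information between their two dimensions. This is a hedged sketch of the general idea, not the paper's specific measures; the fact table and dimension names below are invented:

```python
import math
from collections import Counter

def mutual_information(pairs):
    """I(X;Y) in bits for a list of (x, y) observations."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum((c / n) * math.log2(c * n / (px[x] * py[y]))
               for (x, y), c in pxy.items())

# Toy fact table: score every 2-D view (pair of dimensions) and keep
# the most structured one -- a stand-in for one "hop" of guided search.
rows = [("east", "A", "hi"), ("east", "B", "hi"),
        ("west", "A", "lo"), ("west", "B", "lo")]
dims = {"region": 0, "product": 1, "risk": 2}
scores = {(a, b): mutual_information([(r[i], r[j]) for r in rows])
          for a, i in dims.items() for b, j in dims.items() if i < j}
best = max(scores, key=scores.get)
print(best)  # → ('region', 'risk'): these two dimensions vary together
```

    A hop-chaining search would then fix the winning dimension pair, push it into the background filter, and repeat over the remaining dimensions.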

  14. Sufficient vitamin K status combined with sufficient vitamin D status is associated with better lower extremity function: a prospective analysis of two knee osteoarthritis cohorts.

    Science.gov (United States)

    Shea, M Kyla; Loeser, Richard F; McAlindon, Timothy E; Houston, Denise K; Kritchevsky, Stephen B; Booth, Sarah L

    2017-10-17

    Vitamins K and D are important for the function of vitamin K-dependent proteins in joint tissues. It is unclear if these nutrients are mutually important to functional outcomes related to knee osteoarthritis (OA). We evaluated the association of vitamin K and D sufficiency with lower-extremity function in the Health, Aging Body Composition Knee OA Sub-study (Health ABC) and conducted a replication analysis in an independent cohort, the Osteoarthritis Initiative (OAI). In Health ABC (60% female, 75±3 years) baseline nutrient status was measured using circulating vitamin K and 25(OH)D. Lower-extremity function was assessed using the short physical performance battery (SPPB) and usual 20-meter gait speed. In the OAI (58% female, 61±9 years), baseline nutrient intake was estimated by food frequency questionnaire. Lower-extremity function was assessed using usual 20-meter gait speed and chair stand completion time. Multivariate mixed models were used to evaluate the association of vitamin K and D status and intake with lower-extremity function over 4-5 years. Health ABC participants with sufficient plasma vitamin K (≥1.0 nmol/L) and serum 25(OH)D (≥50 nmol/L) generally had better SPPB scores and faster usual gait speed over follow-up (p≤0.002). In the OAI, sufficient vitamin K and vitamin D intake combined was associated with overall faster usual gait speed and chair stand completion time over follow-up (p≤0.029). Sufficient vitamin K status combined with sufficient vitamin D status was associated with better lower-extremity function in two knee OA cohorts. These findings merit confirmation in vitamin K and D co-supplementation trials. This article is protected by copyright. All rights reserved.

  15. Global Sufficient Optimality Conditions for a Special Cubic Minimization Problem

    Directory of Open Access Journals (Sweden)

    Xiaomei Zhang

    2012-01-01

    Full Text Available We present some sufficient global optimality conditions for a special cubic minimization problem with box constraints or binary constraints by extending the global subdifferential approach proposed by V. Jeyakumar et al. (2006). The present conditions generalize the results developed in the work of V. Jeyakumar et al. where a quadratic minimization problem with box constraints or binary constraints was considered. In addition, a special diagonal matrix is constructed, which is used to provide a convenient method for justifying the proposed sufficient conditions. Then, the reformulation of the sufficient conditions follows. It is worth noting that this reformulation is also applicable to the quadratic minimization problem with box or binary constraints considered in the works of V. Jeyakumar et al. (2006) and Y. Wang et al. (2010). Finally some examples demonstrate that our optimality conditions can effectively be used for identifying global minimizers of the certain nonconvex cubic minimization problem.

  16. Do sufficient vitamin D levels at the end of summer in children and adolescents provide an assurance of vitamin D sufficiency at the end of winter? A cohort study.

    Science.gov (United States)

    Shakeri, Habibesadat; Pournaghi, Seyed-Javad; Hashemi, Javad; Mohammad-Zadeh, Mohammad; Akaberi, Arash

    2017-10-26

    The changes in serum 25-hydroxyvitamin D (25(OH)D) in adolescents from summer to winter and optimal serum vitamin D levels in the summer to ensure adequate vitamin D levels at the end of winter are currently unknown. This study was conducted to address this knowledge gap. The study was conducted as a cohort study. Sixty-eight participants aged 7-18 years and who had sufficient vitamin D levels at the end of the summer in 2011 were selected using stratified random sampling. Subsequently, the participants' vitamin D levels were measured at the end of the winter in 2012. A receiver operating characteristic (ROC) curve was used to determine optimal cutoff points for vitamin D at the end of the summer to predict sufficient vitamin D levels at the end of the winter. The results indicated that 89.7% of all the participants had a decrease in vitamin D levels from summer to winter: 14.7% of them were vitamin D-deficient, 36.8% had insufficient vitamin D concentrations and only 48.5% were able to maintain sufficient vitamin D. The optimal cutoff point to provide assurance of sufficient serum vitamin D at the end of the winter was 40 ng/mL at the end of the summer. Sex, age and vitamin D levels at the end of the summer were significant predictors of non-sufficient vitamin D at the end of the winter. In this age group, a dramatic reduction in vitamin D was observed over the follow-up period. Sufficient vitamin D at the end of the summer did not guarantee vitamin D sufficiency at the end of the winter. We found 40 ng/mL as an optimal cutoff point.
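
    A common way to pick an ROC-based cutoff of the kind this record reports is the Youden index, J = sensitivity + specificity - 1. The sketch below is illustrative only: the helper function and the data are invented, not the study's analysis or measurements.

```python
def youden_optimal_cutoff(values, labels):
    """Cutoff maximizing Youden's J = sensitivity + specificity - 1.
    Label 1 = sufficient at follow-up, 0 = not sufficient."""
    pos = sum(labels)
    neg = len(labels) - pos
    best_cut, best_j = None, -1.0
    for cut in sorted(set(values)):
        tp = sum(1 for v, y in zip(values, labels) if v >= cut and y == 1)
        tn = sum(1 for v, y in zip(values, labels) if v < cut and y == 0)
        j = tp / pos + tn / neg - 1
        if j > best_j:
            best_cut, best_j = cut, j
    return best_cut

# Invented end-of-summer 25(OH)D values (ng/mL) and winter outcomes.
summer = [30, 35, 38, 40, 42, 45, 50, 55]
winter_ok = [0, 0, 0, 1, 1, 1, 1, 1]
print(youden_optimal_cutoff(summer, winter_ok))  # → 40
```

    In this toy data a cutoff of 40 ng/mL separates the two outcomes perfectly (J = 1); real data would trade sensitivity against specificity along the ROC curve.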

  17. POLITICAL ECONOMIC ANALYSIS OF RICE SELF-SUFFICIENCY IN INDONESIA

    Directory of Open Access Journals (Sweden)

    Sri Nuryanti

    2018-01-01

    Full Text Available Rice self-sufficiency is an important programme in Indonesia. The programme has four major targets, i.e. increasing production, stabilizing prices and reserve stocks, and minimizing imports. For that purpose, the government gave a mandate to a parastatal, namely the National Logistic Agency (Bulog), to implement the rice policies. Some studies found that involvement of such a parastatal could lead to government failure in budget allocation. The study aimed to estimate the social cost of the rice self-sufficiency programme based on the implementation of rice instrument policies by Bulog. The study used the national annual data of the 2002–2014 period. The method used was the political preference function model to estimate economic rent and dead-weight loss using rice price elasticity of demand and supply. The result showed that in terms of percentage of the food security budget, the average economic rent reached IDR 6.37 trillion per annum (18.54%), while the average dead-weight loss amounted to IDR 0.90 trillion per annum (2.34%). It proved that the rice self-sufficiency programme along with the involvement of Bulog was economically inefficient. The government should provide better agricultural infrastructure, review governmental procurement prices, and stop the rice import policy to remedy market failure.

  18. Pricing and crude oil self-sufficiency. [Canada]

    Energy Technology Data Exchange (ETDEWEB)

    1979-11-01

    How Canada should go about achieving crude oil self-sufficiency and who should develop Canada's petroleum resources are discussed. The degree of urgency and the level of commitment required by government, industry, and consumers are evaluated. What the price should be of Canadian crude oil and who should establish this price are also discussed. The economic aspects of investment, return, and taxation are also included. (DC)

  19. New applications of statistical tools in plant pathology.

    Science.gov (United States)

    Garrett, K A; Madden, L V; Hughes, G; Pfender, W F

    2004-09-01

    The series of papers introduced by this one address a range of statistical applications in plant pathology, including survival analysis, nonparametric analysis of disease associations, multivariate analyses, neural networks, meta-analysis, and Bayesian statistics. Here we present an overview of additional applications of statistics in plant pathology. An analysis of variance based on the assumption of normally distributed responses with equal variances has been a standard approach in biology for decades. Advances in statistical theory and computation now make it convenient to appropriately deal with discrete responses using generalized linear models, with adjustments for overdispersion as needed. New nonparametric approaches are available for analysis of ordinal data such as disease ratings. Many experiments require the use of models with fixed and random effects for data analysis. New or expanded computing packages, such as SAS PROC MIXED, coupled with extensive advances in statistical theory, allow for appropriate analyses of normally distributed data using linear mixed models, and discrete data with generalized linear mixed models. Decision theory offers a framework in plant pathology for contexts such as the decision about whether to apply or withhold a treatment. Model selection can be performed using Akaike's information criterion. Plant pathologists studying pathogens at the population level have traditionally been the main consumers of statistical approaches in plant pathology, but new technologies such as microarrays supply estimates of gene expression for thousands of genes simultaneously and present challenges for statistical analysis. Applications to the study of the landscape of the field and of the genome share the risk of pseudoreplication, the problem of determining the appropriate scale of the experimental unit and of obtaining sufficient replication at that scale.
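
    Model selection with Akaike's information criterion, AIC = 2k - 2 ln L, can be sketched in a few lines. The helpers and residuals below are invented for illustration and are not taken from the papers in this series:

```python
import math

def aic(log_likelihood, k):
    """Akaike's information criterion: 2k - 2 ln L (lower is better)."""
    return 2 * k - 2 * log_likelihood

def gaussian_loglik(residuals):
    """Maximized log-likelihood of i.i.d. Gaussian residuals."""
    n = len(residuals)
    sigma2 = sum(r * r for r in residuals) / n  # MLE of the variance
    return -0.5 * n * (math.log(2 * math.pi * sigma2) + 1)

# Invented disease-severity fits: the 2-parameter model barely improves
# the fit, so its extra parameter is penalized and the simpler model wins.
resid_simple = [0.5, -0.4, 0.3, -0.6, 0.2]
resid_complex = [0.5, -0.4, 0.3, -0.6, 0.19]
aic_simple = aic(gaussian_loglik(resid_simple), 1)
aic_complex = aic(gaussian_loglik(resid_complex), 2)
print(aic_simple < aic_complex)  # → True
```

    The penalty term 2k is what lets AIC reject a richer model whose improvement in likelihood does not pay for its added parameters.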

  20. Statistics and Informatics in Space Astrophysics

    Science.gov (United States)

    Feigelson, E.

    2017-12-01

The interest in statistical and computational methodology has seen rapid growth in space-based astrophysics, parallel to the growth seen in Earth remote sensing. There is widespread agreement that scientific interpretation of the cosmic microwave background, discovery of exoplanets, and classification of multiwavelength surveys are too complex to be accomplished with traditional techniques. NASA operates several well-functioning Science Archive Research Centers providing 0.5 PBy datasets to the research community. These databases are integrated with full-text journal articles in the NASA Astrophysics Data System (200K pageviews/day). Data products use interoperable formats and protocols established by the International Virtual Observatory Alliance. NASA supercomputers also support complex astrophysical models of systems such as accretion disks and planet formation. Academic researcher interest in methodology has grown significantly in areas such as Bayesian inference and machine learning, and statistical research is underway to treat problems such as irregularly spaced time series and astrophysical model uncertainties. Several scholarly societies have created interest groups in astrostatistics and astroinformatics. Improvements are needed on several fronts. Community education in advanced methodology is not sufficiently rapid to meet the research needs. Statistical procedures within NASA science analysis software are sometimes not optimal, and pipeline development may not use modern software engineering techniques. NASA offers few grant opportunities supporting research in astroinformatics and astrostatistics.

  1. [Vitamin-antioxidant sufficiency of winter sports athletes].

    Science.gov (United States)

    Beketova, N A; Kosheleva, O V; Pereverzeva, O G; Vrzhesinskaia, O A; Kodentsova, V M; Solntseva, T N; Khanfer'ian, R A

    2013-01-01

The sufficiency of 169 athletes (six disciplines: bullet shooting, biathlon, bobsleigh, skeleton, freestyle skiing, snowboarding) with vitamins A, E, C, B2, and beta-carotene was investigated in April-September 2013. All athletes (102 juniors, mean age 18.5 +/- 0.3 years, and 67 adult high-performance athletes, mean age 26.8 +/- 0.7 years) were sufficiently supplied with vitamin A (70.7 +/- 1.7 mcg/dl). The mean blood serum retinol level in biathletes was 15% higher than the upper limit of the norm (80 mcg/dl), while the median reached 90.9 mcg/dl. Blood serum levels of tocopherols (1.22 +/- 0.03 mg/dl), ascorbic acid (1.06 +/- 0.03 mg/dl), riboflavin (7.1 +/- 0.4 ng/ml), and beta-carotene (25.1 +/- 1.7 mcg/dl) were within the normal range, but the incidence of insufficiency of vitamins E, C, B2, and the carotenoid among athletes varied in the ranges of 0-25%, 0-17%, 15-67%, and 42-75%, respectively. 95% of adults and 80% of younger athletes were sufficiently provided with vitamin E. The vitamin E level in blood serum of juniors involved in skeleton and biathlon was lower by 51 and 72% (p antioxidants (beta-carotene and vitamins E and C). In other sports, the relative quantity of athletes sufficiently supplied with these essential nutrients did not exceed 56%. The quota of those supplied with all antioxidants among bullet shooters (31.1%) and bobsledders (23.5%) was significantly (p antioxidant (mainly beta-carotene) was most often recorded among persons engaged in bullet shooting (67%). The simultaneous lack of all three antioxidants was found only in freestylers and bobsledders (about 5%). A decreased level of antioxidants in blood serum was combined with vitamin B2 deficiency in 40% of athletes. The data obtained suggest the necessity to optimize the vitamin content of the diet of all athletes, taking into account age and gender differences. Contrary to prevailing stereotypes, the optimization must involve not only an increase in the consumption of vitamins (vitamins E, B group) and carotenoids, but

  2. Incorporation of systematic uncertainties in statistical decision rules

    International Nuclear Information System (INIS)

    Wichers, V.A.

    1994-02-01

The influence of systematic uncertainties on statistical hypothesis testing is an underexposed subject. Systematic uncertainties cannot be incorporated in hypothesis tests, but they deteriorate the performance of these tests. A wrong treatment of systematic uncertainties in verification applications in safeguards leads to a false assessment of the strength of the safeguards measure, and thus undermines the safeguards system. The effects of systematic uncertainties on decision errors in hypothesis testing are analyzed quantitatively for an example from safeguards practice (LEU-HEU verification of UF6 enrichment in centrifuge enrichment plants). It is found that the only proper way to tackle systematic uncertainties is reduction to sufficiently low levels; criteria for these are proposed. Although the conclusions were obtained from the study of a single practical application, it is believed that they hold generally: for all sources of systematic uncertainties, all statistical decision rules, and all applications. (orig./HP)
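The deterioration described above can be illustrated with a small Monte Carlo sketch (not the paper's LEU-HEU example): a one-sided test calibrated to a 5% false-alarm rate is applied to standardized measurements carrying an uncorrected systematic offset, and the realized error rate drifts away from the nominal level. All numbers are illustrative.

```python
import random

random.seed(0)

def false_alarm_rate(bias, n_trials=100_000, threshold=1.645):
    """One-sided test of H0: mean = 0 on a single standardized
    measurement; `bias` is a systematic offset that shifts every
    measurement but is not accounted for in the decision rule."""
    alarms = 0
    for _ in range(n_trials):
        z = random.gauss(0.0, 1.0) + bias
        if z > threshold:
            alarms += 1
    return alarms / n_trials

# Nominal level is 5%; a systematic shift inflates it well beyond that.
for b in (0.0, 0.25, 0.5):
    print(b, false_alarm_rate(b))
```

The same mechanism degrades the detection side as well; reducing the systematic component, as the abstract argues, is the only way to restore the nominal error rates.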

  3. Statistical mechanics of lattice Boson field theory

    International Nuclear Information System (INIS)

    1976-01-01

A lattice approximation to Euclidean, boson quantum field theory is expressed in terms of the thermodynamic properties of a classical statistical mechanical system near its critical point in a sufficiently general way to permit the inclusion of an anomalous dimension of the vacuum. Using the thermodynamic properties of the Ising model, one can begin to construct nontrivial (containing scattering) field theories in 2, 3 and 4 dimensions. It is argued that, depending on the choice of the bare coupling constant, there are three types of behavior to be expected: the perturbation theory region, the renormalization group fixed point region, and the Ising model region.

  4. Statistical mechanics of lattice boson field theory

    International Nuclear Information System (INIS)

    Baker, G.A. Jr.

    1977-01-01

    A lattice approximation to Euclidean, boson quantum field theory is expressed in terms of the thermodynamic properties of a classical statistical mechanical system near its critical point in a sufficiently general way to permit the inclusion of an anomalous dimension of the vacuum. Using the thermodynamic properties of the Ising model, one can begin to construct nontrivial (containing scattering) field theories in 2, 3, and 4 dimensions. It is argued that, depending on the choice of the bare coupling constant, there are three types of behavior to be expected: the perturbation theory region, the renormalization group fixed point region, and the Ising model region. 24 references

  5. Energy self-sufficiency in Northampton, Massachusetts

    Energy Technology Data Exchange (ETDEWEB)

    1979-10-01

    The study is not an engineering analysis but begins the process of exploring the potential for conservation and local renewable-resource development in a specific community, Northampton, Massachusetts, with the social, institutional, and environmental factors in that community taken into account. Section I is an extensive executive summary of the full study, and Section II is a detailed examination of the potential for increased local energy self-sufficiency in Northampton, including current and future demand estimates, the possible role of conservation and renewable resources, and a discussion of the economic and social implications of alternative energy systems. (MOW)

  6. Energy Self-Sufficient Island

    International Nuclear Information System (INIS)

    Bratic, S.; Krajacic, G.; Duic, N.; Cotar, A.; Jardas, D.

    2011-01-01

In order to analyze an energy self-sufficient island, the example of a smaller island connected by an undersea cable to the power system of a bigger island was taken. A 10/0.4 kV transformer substation is situated on the island, and for the moment the medium-voltage line provides enough electricity. It is assumed that the island is situated in the northern part of the Adriatic Sea. The most important problem on the island is the population decline that has been going on for many years; therefore, the standard of living needs to be improved and economic development encouraged immediately. Local authorities stimulate sustainable development on the island through different projects, to breathe new life into the island, create new jobs, and attract new people to come live there. Because of the planned development and increase of the population, the energy projects planned to support sustainable development, and later the achievement of energy self-sufficiency, are described in this paper. The application of the Rewisland methodology is described, taking into account three possible scenarios of energy development. Each scenario is calculated until the year 2030 and assumes 100% use of renewable energy sources in 2030. Scenario PTV, PP, EE - This scenario includes installation of solar photovoltaic modules and solar thermal collectors on the buildings' roofs, as well as implementation of energy efficiency measures on the island (replacement of street light bulbs with LED lighting, replacement of old windows and doors on the houses, and installation of thermal insulation).
Scenario PV island - This scenario, similarly to the previous one, includes installation of solar photovoltaic modules and solar thermal collectors on the residential buildings, as well as a 2 MW photovoltaic power plant and a ''Green Hotel'', a building that satisfies all of its energy needs completely from renewable energy sources

  7. Intellectual Freedom and Economic Sufficiency as Educational Entitlements.

    Science.gov (United States)

    Morse, Jane Fowler

    2001-01-01

    Using the theories of John Stuart Mill and Karl Marx, this article supports the educational entitlements of intellectual freedom and economic sufficiency. Explores these issues in reference to their implications for teaching, the teaching profession and its training. Concludes that ideas cannot be controlled by the interests of the dominant class.…

  8. Statistical inference involving binomial and negative binomial parameters.

    Science.gov (United States)

    García-Pérez, Miguel A; Núñez-Antón, Vicente

    2009-05-01

    Statistical inference about two binomial parameters implies that they are both estimated by binomial sampling. There are occasions in which one aims at testing the equality of two binomial parameters before and after the occurrence of the first success along a sequence of Bernoulli trials. In these cases, the binomial parameter before the first success is estimated by negative binomial sampling whereas that after the first success is estimated by binomial sampling, and both estimates are related. This paper derives statistical tools to test two hypotheses, namely, that both binomial parameters equal some specified value and that both parameters are equal though unknown. Simulation studies are used to show that in small samples both tests are accurate in keeping the nominal Type-I error rates, and also to determine sample size requirements to detect large, medium, and small effects with adequate power. Additional simulations also show that the tests are sufficiently robust to certain violations of their assumptions.
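The paper derives dedicated tests; the sketch below is only a generic likelihood-ratio version of the same comparison, assuming geometric (negative binomial with a single success) sampling before the first success and binomial sampling after it. The trial counts are made up, and the estimates are assumed to lie strictly between 0 and 1.

```python
import math

def loglik(p, succ, fail):
    """Bernoulli log-likelihood for `succ` successes and `fail` failures."""
    return succ * math.log(p) + fail * math.log(1 - p)

def lr_test(n_until_first, succ_after, m_after):
    """Likelihood-ratio test that the success probability is the same
    before the first success (geometric sampling: n_until_first trials
    ending in the single success) and after it (binomial sampling:
    succ_after successes in m_after trials). Assumes 0 < MLEs < 1."""
    p1 = 1.0 / n_until_first                           # geometric MLE
    p2 = succ_after / m_after                          # binomial MLE
    p0 = (1 + succ_after) / (n_until_first + m_after)  # pooled MLE under H0
    ll_free = (loglik(p1, 1, n_until_first - 1)
               + loglik(p2, succ_after, m_after - succ_after))
    ll_null = loglik(p0, 1 + succ_after,
                     (n_until_first - 1) + (m_after - succ_after))
    stat = 2 * (ll_free - ll_null)
    return stat, stat > 3.841  # chi-square(1), 5% critical value

print(lr_test(10, 50, 100))  # apparent jump from 0.1 to 0.5: rejected
print(lr_test(5, 22, 100))   # 0.2 vs 0.22: little evidence of change
```

The asymptotic chi-square reference used here is exactly what the paper improves upon for small samples, where its tailor-made tests keep the nominal Type-I error rate more accurately.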

  9. Redox self-sufficient whole cell biotransformation for amination of alcohols.

    Science.gov (United States)

    Klatte, Stephanie; Wendisch, Volker F

    2014-10-15

    Whole cell biotransformation is an upcoming tool to replace common chemical routes for functionalization and modification of desired molecules. In the approach presented here the production of various non-natural (di)amines was realized using the designed whole cell biocatalyst Escherichia coli W3110/pTrc99A-ald-adh-ta with plasmid-borne overexpression of genes for an l-alanine dehydrogenase, an alcohol dehydrogenase and a transaminase. Cascading alcohol oxidation with l-alanine dependent transamination and l-alanine dehydrogenase allowed for redox self-sufficient conversion of alcohols to the corresponding amines. The supplementation of the corresponding (di)alcohol precursors as well as amino group donor l-alanine and ammonium chloride were sufficient for amination and redox cofactor recycling in a resting buffer system. The addition of the transaminase cofactor pyridoxal-phosphate and the alcohol dehydrogenase cofactor NAD(+) was not necessary to obtain complete conversion. Secondary and cyclic alcohols, for example, 2-hexanol and cyclohexanol were not aminated. However, efficient redox self-sufficient amination of aliphatic and aromatic (di)alcohols in vivo was achieved with 1-hexanol, 1,10-decanediol and benzylalcohol being aminated best. Copyright © 2014 Elsevier Ltd. All rights reserved.

  10. CANDU: Meeting the demand for energy self-sufficiency

    International Nuclear Information System (INIS)

    Lawson, D.S.

    1985-01-01

The success of the CANDU program can be seen quickly by examining a comparison of typical electricity bills in various provinces of Canada. The provinces of Quebec and Manitoba benefit from extensive hydroelectric schemes, many of which were constructed years ago at low capital cost. In Ontario, economic growth has outstripped these low-cost sources of hydro power, and hence the province has to rely on thermal sources of electricity generation. The success of the CANDU program is shown by the fact that it can contribute over a third of the electricity in Ontario while keeping the total electricity rate comparable with that of the provinces that can rely on low-cost hydro sources. Energy self-sufficiency encompasses a spectrum of requirements. One consideration would be the reliable supply and control of fuel during the operating life of a power plant; a greater degree of self-sufficiency would be obtained by having an involvement in the building and engineering of the power plant prior to its operation.

  11. Vaccine procurement and self-sufficiency in developing countries.

    Science.gov (United States)

    Woodle, D

    2000-06-01

    This paper discusses the movement toward self-sufficiency in vaccine supply in developing countries (and countries in transition to new economic and political systems) and explains special supply concerns about vaccine as a product class. It traces some history of donor support and programmes aimed at self-financing, then continues with a discussion about self-sufficiency in terms of institutional capacity building. A number of deficiencies commonly found in vaccine procurement and supply in low- and middle-income countries are characterized, and institutional strengthening with procurement technical assistance is described. The paper also provides information about a vaccine procurement manual being developed by the United States Agency for International Development (USAID) and the World Health Organization (WHO) for use in this environment. Two brief case studies are included to illustrate the spectrum of existing capabilities and different approaches to technical assistance aimed at developing or improving vaccine procurement capability. In conclusion, the paper discusses the special nature of vaccine and issues surrounding potential integration and decentralization of vaccine supply systems as part of health sector reform.

  12. Why inferential statistics are inappropriate for development studies and how the same data can be better used

    OpenAIRE

    Ballinger, Clint

    2011-01-01

The purpose of this paper is twofold: 1) to highlight the widely ignored but fundamental problem of 'superpopulations' for the use of inferential statistics in development studies. We do not dwell on this problem, however, as it has been sufficiently discussed in older papers by statisticians that social scientists have nevertheless long chosen to ignore; the interested reader can turn to those for greater detail. 2) to show that descriptive statistics both avoid the problem of s...

  13. FOOD SELF-SUFFICIENCY OF THE EUROPEAN UNION COUNTRIES – ENERGETIC APPROACH

    Directory of Open Access Journals (Sweden)

    Arkadiusz Sadowski

    2016-06-01

Full Text Available The paper covers the issues of a basic social need, namely alimentation. The aim of the research is to evaluate energetic food self-sufficiency and its changes in the European Union countries. The research was conducted using the author’s methodology, based on the amount of energy produced and consumed in 1990-2009. The analyses proved that within the considered period the European Union became a net importer of the energy contained in agricultural products. An excess of produced energy was mainly observed in the countries of the European lowland. Moreover, in most of the countries a decrease in the analyzed factor was observed when compared with the 1990-1999 period. On the other hand, for the new member states an increase in food energetic self-sufficiency was observed. The conclusion has been drawn that, while general food self-sufficiency is mainly determined by environmental factors, its dynamics is primarily influenced by factors connected with agricultural policy.

  14. Statistical testing of association between menstruation and migraine.

    Science.gov (United States)

    Barra, Mathias; Dahl, Fredrik A; Vetvik, Kjersti G

    2015-02-01

To repair and refine a previously proposed method for statistical analysis of association between migraine and menstruation. Menstrually related migraine (MRM) affects about 20% of female migraineurs in the general population. The exact pathophysiological link from menstruation to migraine is hypothesized to be through fluctuations in female reproductive hormones, but the exact mechanisms remain unknown. Therefore, the main diagnostic criterion today is concurrency of migraine attacks with menstruation. Methods aiming to exclude spurious associations are wanted, so that further research into these mechanisms can be performed on a population with a true association. The statistical method is based on a simple two-parameter null model of MRM (which allows for simulation modeling), and Fisher's exact test (with mid-p correction) applied to standard 2 × 2 contingency tables derived from the patients' headache diaries. Our method is a corrected version of a previously published flawed framework. To our best knowledge, no other published methods for establishing a menstruation-migraine association by statistical means exist today. The probabilistic methodology shows good performance when subjected to receiver operating characteristic curve analysis. Quick reference cutoff values for the clinical setting were tabulated for assessing association given a patient's headache history. In this paper, we correct a proposed method for establishing association between menstruation and migraine by statistical methods. We conclude that the proposed standard of 3-cycle observations prior to setting an MRM diagnosis should be extended with at least one perimenstrual window to obtain sufficient information for statistical processing. © 2014 American Headache Society.
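A minimal pure-Python sketch of the test ingredient named above, Fisher's exact test with mid-p correction on a 2 × 2 table, is given below. The construction of the table from headache diaries follows the paper only loosely, and the counts are hypothetical.

```python
from math import comb

def hypergeom_pmf(k, row1, row2, col1):
    """P(top-left cell = k) with all margins of the 2x2 table fixed."""
    n = row1 + row2
    return comb(row1, k) * comb(row2, col1 - k) / comb(n, col1)

def fisher_midp_upper(a, b, c, d):
    """One-sided (upper-tail) Fisher exact test with mid-p correction
    for the 2x2 table [[a, b], [c, d]]: half weight on the observed
    table, full weight on strictly more extreme ones."""
    row1, row2, col1 = a + b, c + d, a + c
    k_max = min(row1, col1)
    p = 0.5 * hypergeom_pmf(a, row1, row2, col1)
    for k in range(a + 1, k_max + 1):
        p += hypergeom_pmf(k, row1, row2, col1)
    return p

# Hypothetical diary counts: days with/without attacks, inside vs.
# outside the perimenstrual window (rows: inside, outside).
print(fisher_midp_upper(8, 2, 10, 25))
```

The mid-p variant is slightly anti-conservative relative to the plain exact test, which is why it is commonly paired with exact small-sample methods of the kind the paper builds on.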

  15. Can New Zealand achieve self-sufficiency in its nursing workforce?

    Science.gov (United States)

    North, Nicola

    2011-01-01

This paper reviews impacts on the nursing workforce of health policy and reforms of the past two decades, and suggests reasons for current difficulties in retaining nurses in the workforce as well as measures to achieve short-term improvements. Difficulties in retaining nurses in the New Zealand workforce have contributed to nursing shortages, leading to a dependence on overseas recruitment. In a context of global shortages and having to compete in a global nursing labour market, an alternative to dependence on overseas nurses is self-sufficiency. Discursive paper. Analysis of nursing workforce data highlighted threats to self-sufficiency, including age structure, high rates of emigration of New Zealand nurses with reliance on overseas nurses and an annual output of nurses that is insufficient to replace both expected retiring nurses and emigrating nurses. A review of recent policy and other documents indicates that two decades of health reform and lack of a strategic focus on nursing has contributed to shortages. Recent strategic approaches to the nursing workforce have included workforce stocktakes, integrated health workforce development and nursing workforce projections, with a single authority now responsible for planning, education, training and development for all health professions and sectors. Current health and nursing workforce development strategies offer wide-ranging and ambitious approaches. An alternative approach is advocated: based on workforce data analysis, pressing threats to self-sufficiency and measures available are identified to achieve, in the short term, the maximum impact on retaining nurses. A human resources in health approach is recommended that focuses on employment conditions and professional nursing as well as recruitment and retention strategies. Nursing is identified as 'crucial' to meeting demands for health care. A shortage of nurses threatens delivery of health services and supports the case for self-sufficiency in the nursing

  16. How, When, and Where? Assessing Renewable Energy Self-Sufficiency at the Neighborhood Level.

    Science.gov (United States)

    Grosspietsch, David; Thömmes, Philippe; Girod, Bastien; Hoffmann, Volker H

    2018-02-20

    Self-sufficient decentralized systems challenge the centralized energy paradigm. Although scholars have assessed specific locations and technological aspects, it remains unclear how, when, and where energy self-sufficiency could become competitive. To address this gap, we develop a techno-economic model for energy self-sufficient neighborhoods that integrates solar photovoltaics (PV), conversion, and storage technologies. We assess the cost of 100% self-sufficiency for both electricity and heat, comparing different technical configurations for a stylized neighborhood in Switzerland and juxtaposing these findings with projections on market and technology development. We then broaden the scope and vary the neighborhood's composition (residential share) and geographic position (along different latitudes). Regarding how to design self-sufficient neighborhoods, we find two promising technical configurations. The "PV-battery-hydrogen" configuration is projected to outperform a fossil-fueled and grid-connected reference configuration when energy prices increase by 2.5% annually and cost reductions in hydrogen-related technologies by a factor of 2 are achieved. The "PV-battery" configuration would allow achieving parity with the reference configuration sooner, at 21% cost reduction. Additionally, more cost-efficient deployment is found in neighborhoods where the end-use is small commercial or mixed and in regions where seasonal fluctuations are low and thus allow for reducing storage requirements.

  17. Towards a sufficiency-driven business model : Experiences and opportunities

    NARCIS (Netherlands)

    Bocken, N.M.P.; Short, SW

    2016-01-01

    Business model innovation is an important lever for change to tackle pressing sustainability issues. In this paper, ‘sufficiency’ is proposed as a driver of business model innovation for sustainability. Sufficiency-driven business models seek to moderate overall resource consumption by curbing

  18. A new method to determine the number of experimental data using statistical modeling methods

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Jung-Ho; Kang, Young-Jin; Lim, O-Kaung; Noh, Yoojeong [Pusan National University, Busan (Korea, Republic of)

    2017-06-15

For analyzing the statistical performance of physical systems, statistical characteristics of physical parameters such as material properties need to be estimated by collecting experimental data. For accurate statistical modeling, many such experiments may be required, but data are usually quite limited owing to the cost and time constraints of experiments. In this study, a new method for determining a reasonable number of experimental data is proposed using an area metric, after obtaining statistical models using the information on the underlying distribution, the Sequential statistical modeling (SSM) approach, and the Kernel density estimation (KDE) approach. The area metric is used as a convergence criterion to determine the necessary and sufficient number of experimental data to be acquired. The proposed method is validated in simulations, using different statistical modeling methods, different true models, and different convergence criteria. An example data set with 29 data describing the fatigue strength coefficient of SAE 950X is used for demonstrating the performance of the obtained statistical models that use a pre-determined number of experimental data in predicting the probability of failure for a target fatigue life.
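The area metric itself is simple to sketch: the area between two empirical CDFs, used here as a convergence check as data are added in batches. The fatigue-strength-style numbers below are made up, not the SAE 950X data set, and the distribution-fitting (SSM/KDE) steps of the paper are not reproduced.

```python
import bisect

def ecdf(sample):
    """Right-continuous empirical CDF of a sample."""
    xs = sorted(sample)
    n = len(xs)
    return lambda x: bisect.bisect_right(xs, x) / n

def area_metric(sample_a, sample_b):
    """Area between two empirical CDFs. Both are step functions that
    are constant between consecutive jump points, so the integral of
    |F_a - F_b| reduces to a finite sum over the merged jump points."""
    fa, fb = ecdf(sample_a), ecdf(sample_b)
    grid = sorted(set(sample_a) | set(sample_b))
    return sum(abs(fa(u) - fb(u)) * (v - u) for u, v in zip(grid, grid[1:]))

# Hypothetical fatigue-strength style data: compare the model built
# from the first 10 points with the one from all 15; a small area
# suggests that adding data no longer changes the fitted distribution.
data = [530, 545, 552, 548, 560, 541, 539, 556, 550, 544,
        547, 553, 549, 551, 546]
prev, curr = data[:10], data[:15]
print(area_metric(prev, curr))
```

In the paper's scheme this comparison would be iterated, stopping when the area falls below a chosen tolerance; the tolerance plays the role of the convergence criterion varied in the validation study.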

  19. Online Learning in Higher Education: Necessary and Sufficient Conditions

    Science.gov (United States)

    Lim, Cher Ping

    2005-01-01

    The spectacular development of information and communication technologies through the Internet has provided opportunities for students to explore the virtual world of information. In this article, the author discusses the necessary and sufficient conditions for successful online learning in educational institutions. The necessary conditions…

  20. MRI of the small bowel: can sufficient bowel distension be achieved with small volumes of oral contrast?

    International Nuclear Information System (INIS)

    Kinner, Sonja; Kuehle, Christiane A.; Ladd, Susanne C.; Barkhausen, Joerg; Herbig, Sebastian; Haag, Sebastian; Lauenstein, Thomas C.

    2008-01-01

Sufficient luminal distension is mandatory for small bowel imaging. However, patients often are unable to ingest the volumes of currently applied oral contrast compounds. The aim of this study was to evaluate whether administration of low doses of an oral contrast agent with high osmolarity leads to sufficient and diagnostic bowel distension. Six healthy volunteers ingested on different occasions 150, 300 and 450 ml of a commercially available oral contrast agent (Banana Smoothie Readi-Cat, E-Z-EM; 194 mOsmol/l). Two-dimensional TrueFISP data sets were acquired at 5-min intervals up to 45 min after contrast ingestion. Small bowel distension was quantified using a visual five-grade ranking (5 = very good distension, 1 = collapsed bowel). Results were statistically compared using a Wilcoxon rank test. Ingestion of 450 ml and 300 ml resulted in significantly better distension than 150 ml. The overall average distension value for 450 ml amounted to 3.4 (300 ml: 3.0, 150 ml: 2.3) and diagnostic bowel distension could be found throughout the small intestine. Even 45 min after ingestion of 450 ml the jejunum and ileum could be reliably analyzed. Small bowel imaging with low doses of contrast leads to diagnostic distension values in healthy subjects when a high-osmolarity substance is applied. These findings may help to further refine small bowel MRI techniques, but need to be confirmed in patients with small bowel disorders. (orig.)
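The statistical comparison mentioned above can be sketched as an exact Wilcoxon signed-rank test on paired per-volunteer scores, which is appropriate at n = 6, where normal approximations are poor. The score differences below are hypothetical, not the study's data, and the function is only a sketch (zeros dropped, average ranks for ties).

```python
from itertools import product

def wilcoxon_exact(diffs):
    """Exact two-sided Wilcoxon signed-rank test for paired differences.
    Zeros are dropped and tied magnitudes get average ranks; the null
    distribution is enumerated over all 2^n sign assignments."""
    d = [x for x in diffs if x != 0]
    n = len(d)
    order = sorted(range(n), key=lambda i: abs(d[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:                       # average ranks for tied |d|
        j = i
        while j + 1 < n and abs(d[order[j + 1]]) == abs(d[order[i]]):
            j += 1
        avg = (i + j) / 2 + 1          # average of ranks i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    w_obs = sum(r for r, x in zip(ranks, d) if x > 0)
    count = low = high = 0
    for signs in product((0, 1), repeat=n):
        w = sum(r for r, s in zip(ranks, signs) if s)
        count += 1
        if w <= w_obs + 1e-9: low += 1
        if w >= w_obs - 1e-9: high += 1
    return min(1.0, 2 * min(low, high) / count)

# Hypothetical per-volunteer score differences, 450 ml minus 150 ml:
print(wilcoxon_exact([2, 1, 1, 1, 2, 1]))
```

With all six differences positive, the smallest attainable two-sided p-value for n = 6 (2/64 ≈ 0.031) is reached, mirroring the study's finding that the larger volume distends significantly better.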

  1. High Variability in Cellular Stoichiometry of Carbon, Nitrogen, and Phosphorus Within Classes of Marine Eukaryotic Phytoplankton Under Sufficient Nutrient Conditions.

    Science.gov (United States)

    Garcia, Nathan S; Sexton, Julie; Riggins, Tracey; Brown, Jeff; Lomas, Michael W; Martiny, Adam C

    2018-01-01

Current hypotheses suggest that cellular elemental stoichiometry of marine eukaryotic phytoplankton such as the ratios of cellular carbon:nitrogen:phosphorus (C:N:P) vary between phylogenetic groups. To investigate how phylogenetic structure, cell volume, growth rate, and temperature interact to affect the cellular elemental stoichiometry of marine eukaryotic phytoplankton, we examined the C:N:P composition in 30 isolates across 7 classes of marine phytoplankton that were grown with a sufficient supply of nutrients and nitrate as the nitrogen source. The isolates covered a wide range in cell volume (5 orders of magnitude), growth rate, and temperature (2-24°C). Our analysis indicates that C:N:P is highly variable, with statistical model residuals accounting for over half of the total variance and no relationship between phylogeny and elemental stoichiometry. Furthermore, our data indicated that variability in C:P, N:P, and C:N within Bacillariophyceae (diatoms) was as high as that among all of the isolates that we examined. In addition, a linear statistical model identified a positive relationship between diatom cell volume and C:P and N:P. Among all of the isolates that we examined, the statistical model identified temperature as a significant factor, consistent with the temperature-dependent translation efficiency model, but temperature only explained 5% of the total statistical model variance. While some of our results support data from previous field studies, the high variability of elemental ratios within Bacillariophyceae contradicts previous work that suggests that this cosmopolitan group of microalgae has consistently low C:P and N:P ratios in comparison with other groups.

  2. Assessing sufficient capability: A new approach to economic evaluation.

    Science.gov (United States)

    Mitchell, Paul Mark; Roberts, Tracy E; Barton, Pelham M; Coast, Joanna

    2015-08-01

Amartya Sen's capability approach has been discussed widely in the health economics discipline. Although measures have been developed to assess capability in economic evaluation, there has been much less attention paid to the decision rules that might be applied alongside. Here, new methods, drawing on the multidimensional poverty and health economics literature, are developed for conducting economic evaluation within the capability approach, focusing on an objective of achieving "sufficient capability". This objective more closely reflects the concern with equity that pervades the capability approach, and the method has the advantage of retaining the longitudinal aspect of estimating outcome that is associated with quality-adjusted life years (QALYs), whilst also drawing on notions of shortfall associated with assessments of poverty. Economic evaluation from this perspective is illustrated in an osteoarthritis patient group undergoing joint replacement, with capability wellbeing assessed using ICECAP-O. Recommendations for taking the sufficient capability approach forward are provided. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. A new assessment method for demonstrating the sufficiency of the safety assessment and the safety margins of the geological disposal system

    International Nuclear Information System (INIS)

    Ohi, Takao; Kawasaki, Daisuke; Chiba, Tamotsu; Takase, Toshio; Hane, Koji

    2013-01-01

    A new method for demonstrating the sufficiency of the safety assessment and safety margins of the geological disposal system has been developed. The method is based on an existing comprehensive sensitivity analysis method and can systematically identify the successful conditions, under which the dose rate does not exceed specified safety criteria, using analytical solutions for nuclide migration and the results of a statistical analysis. The successful conditions were identified using three major variables. Furthermore, the successful conditions at the level of factors or parameters were obtained using relational equations between the variables and the factors or parameters making up these variables. In this study, the method was applied to the safety assessment of the geological disposal of transuranic waste in Japan. Based on the system response characteristics obtained from analytical solutions and on the successful conditions, the classification of the analytical conditions, the sufficiency of the safety assessment and the safety margins of the disposal system were then demonstrated. A new assessment procedure incorporating this method into the existing safety assessment approach is proposed in this study. Using this procedure, it is possible to conduct a series of safety assessment activities in a logical manner. (author)

  4. Renyi statistics in equilibrium statistical mechanics

    International Nuclear Information System (INIS)

    Parvan, A.S.; Biro, T.S.

    2010-01-01

    The Renyi statistics in the canonical and microcanonical ensembles is examined both in general and in particular for the ideal gas. In the microcanonical ensemble the Renyi statistics is equivalent to the Boltzmann-Gibbs statistics. Using exact analytical results for the ideal gas, it is shown that in the canonical ensemble, in the thermodynamic limit, the Renyi statistics is also equivalent to the Boltzmann-Gibbs statistics. Furthermore, it satisfies the requirements of equilibrium thermodynamics, i.e. the thermodynamical potential of the statistical ensemble is a homogeneous function of first degree in its extensive variables of state. We conclude that in this limit the Renyi statistics arrives at the same thermodynamical relations as those stemming from the Boltzmann-Gibbs statistics.
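
    For reference, the Renyi entropy that underlies these statistics, and the Boltzmann-Gibbs (Shannon) form recovered in the limit of the order parameter, can be written as:

    ```latex
    S_q = \frac{1}{1-q}\,\ln \sum_i p_i^{\,q},
    \qquad
    \lim_{q \to 1} S_q = -\sum_i p_i \ln p_i .
    ```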

  5. Statistical physics of pairwise probability models

    Directory of Open Access Journals (Sweden)

    Yasser Roudi

    2009-11-01

    Full Text Available Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data: knowledge of the means and correlations between pairs of elements in the system is sufficient. Not surprisingly, then, using pairwise models for studying neural data has been the focus of many studies in recent years. In this paper, we describe how tools from statistical physics can be employed for studying and using pairwise models. We build on our previous work on the subject and study the relation between different methods for fitting these models and evaluating their quality. In particular, using data from simulated cortical networks we study how the quality of various approximate methods for inferring the parameters in a pairwise model depends on the time bin chosen for binning the data. We also study the effect of the size of the time bin on the model quality itself, again using simulated data. We show that using finer time bins increases the quality of the pairwise model. We offer new ways of deriving the expressions reported in our previous work for assessing the quality of pairwise models.
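
    As a concrete illustration of the fitting problem described above (a sketch of the general technique, not the authors' own method), the following fits a small maximum-entropy pairwise model by matching means and pairwise second moments, using exact enumeration that is feasible only for a handful of units. All names, sizes, and learning rates are hypothetical:

    ```python
    import itertools
    import numpy as np

    rng = np.random.default_rng(0)
    n = 3

    # "Data" moments from a hypothetical binary raster (e.g. binned spikes).
    data = rng.integers(0, 2, size=(5000, n)).astype(float)
    mean_d = data.mean(axis=0)
    corr_d = (data.T @ data) / len(data)  # pairwise second moments

    # Pairwise (Ising-like) model P(s) ~ exp(h.s + s'Js), fitted by gradient
    # ascent on the log-likelihood via exact enumeration of all 2^n states.
    states = np.array(list(itertools.product([0, 1], repeat=n)), dtype=float)
    h = np.zeros(n)
    J = np.zeros((n, n))  # only the upper triangle is used

    for _ in range(2000):
        energies = states @ h + np.einsum('si,ij,sj->s', states, J, states)
        p = np.exp(energies - energies.max())
        p /= p.sum()
        mean_m = p @ states                       # model means
        corr_m = states.T @ (p[:, None] * states) # model second moments
        h += 0.5 * (mean_d - mean_m)              # moment matching on means
        J += 0.5 * np.triu(corr_d - corr_m, 1)    # ... and on pairwise moments

    print(np.abs(mean_d - mean_m).max())  # near zero once the fit converges
    ```

    The gradient of the log-likelihood with respect to each parameter is exactly the difference between the data moment and the model moment, which is why simple moment matching converges here.
    
    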

  6. Refueling availability for alternative fuel vehicle markets: Sufficient urban station coverage

    International Nuclear Information System (INIS)

    Melaina, Marc; Bremson, Joel

    2008-01-01

    Alternative fuel vehicles can play an important role in addressing the challenges of climate change, energy security, urban air pollution and the continued growth in demand for transportation services. The successful commercialization of alternative fuels for vehicles is contingent upon a number of factors, including vehicle cost and performance. Among fuel infrastructure issues, adequate refueling availability is one of the most fundamental to successful commercialization. A commonly cited source reports 164,300 refueling stations in operation nationwide. However, from the perspective of refueling availability, this nationwide count tends to overstate the number of stations required to support the widespread deployment of alternative fuel vehicles. In terms of spatial distribution, the existing gasoline station networks in many urban areas are more than sufficient. We characterize a sufficient level of urban coverage based upon a subset of cities served by relatively low-density station networks, and estimate that some 51,000 urban stations would be required to provide this sufficient level of coverage to all major urban areas, 33 percent less than our estimate of total urban stations. This improved characterization will be useful for engineering, economic and policy analyses. (author)

  7. Effect size, confidence intervals and statistical power in psychological research.

    Directory of Open Access Journals (Sweden)

    Téllez A.

    2015-07-01

    Full Text Available Quantitative psychological research is focused on detecting the occurrence of certain population phenomena by analyzing data from a sample, and statistics is a particularly helpful mathematical tool that is used by researchers to evaluate hypotheses and make decisions to accept or reject such hypotheses. In this paper, the various statistical tools in psychological research are reviewed. The limitations of null hypothesis significance testing (NHST and the advantages of using effect size and its respective confidence intervals are explained, as the latter two measurements can provide important information about the results of a study. These measurements also can facilitate data interpretation and easily detect trivial effects, enabling researchers to make decisions in a more clinically relevant fashion. Moreover, it is recommended to establish an appropriate sample size by calculating the optimum statistical power at the moment that the research is designed. Psychological journal editors are encouraged to follow APA recommendations strictly and ask authors of original research studies to report the effect size, its confidence intervals, statistical power and, when required, any measure of clinical significance. Additionally, we must account for the teaching of statistics at the graduate level. At that level, students do not receive sufficient information concerning the importance of using different types of effect sizes and their confidence intervals according to the different types of research designs; instead, most of the information is focused on the various tools of NHST.
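
    To make the recommendation concrete, here is a minimal sketch of computing an effect size (Cohen's d with pooled standard deviation) together with a percentile-bootstrap confidence interval. The two groups and all numbers are hypothetical, and the percentile bootstrap is one simple interval choice among several:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def cohens_d(a, b):
        """Cohen's d: mean difference divided by the pooled standard deviation."""
        na, nb = len(a), len(b)
        pooled = np.sqrt(((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1))
                         / (na + nb - 2))
        return (a.mean() - b.mean()) / pooled

    # Two hypothetical treatment groups with a true effect of ~0.5 SD.
    a = rng.normal(0.5, 1.0, 60)
    b = rng.normal(0.0, 1.0, 60)
    d = cohens_d(a, b)

    # Percentile bootstrap CI for d (resampling each group with replacement).
    boots = [cohens_d(rng.choice(a, len(a)), rng.choice(b, len(b)))
             for _ in range(2000)]
    lo, hi = np.percentile(boots, [2.5, 97.5])
    print(f"d = {d:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
    ```

    Reporting the interval alongside d, as the article recommends, shows at a glance how precisely the effect has been estimated.
    
    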

  8. Economic Valuation of Sufficient and Guaranteed Irrigation Water Supply for Paddy Farms of Guilan Province

    Directory of Open Access Journals (Sweden)

    Mohammad Kavoosi Kalashami

    2014-08-01

    Full Text Available Cultivation of the strategic crop of rice depends strongly on the existence of a sufficient and guaranteed irrigation water supply, and water shortage stresses have irreparable effects on the yield and quality of production. The decrease in the inflow of the Sefidrud river in Guilan province, the main source of irrigation water for the 171 thousand hectares under rice cultivation in this province, has challenged the supply of sufficient and guaranteed irrigation water in many regions of the province. Hence, the present study estimates the value that paddy farmers place on a sufficient and guaranteed irrigation water supply. Economic valuation of sufficient and guaranteed irrigation water supply improves water resource management policies on the demand side. The requested data set was obtained through a survey of 224 paddy farms in rural regions that faced irrigation water shortages. Then, using an open-ended valuation approach and estimation of a Tobit model via ML and the two-stage Heckman approach, paddy farmers' willingness to pay for a sufficient and guaranteed irrigation water supply was elicited. Results revealed that farmers in the investigated regions are willing to pay 26.49 percent more than the present costs of providing irrigation water in order to have a sufficient and guaranteed supply.

  9. Sufficient education attainment for a decent standard of living in modern Australia

    Directory of Open Access Journals (Sweden)

    Emily Joy Callander

    2012-06-01

    Full Text Available Education attainment will impact upon an individual’s capacity to engage in the labour force, their living standards and hence their poverty status. As such, education should be included in measures of poverty. However, it is not known what level of education is sufficient for a decent standard of living. Using the 2003 Survey of Disability, Ageing and Carers, different levels of education attainment were tested for their association with labour force participation and income. Based upon this, it was concluded that Year 12 or higher is a sufficient level of education attainment for 15 to 64 year olds, and Year 10 or higher for people over the age of 65 years. This is in line with current government policies to improve Year 12 completion rates. Knowing what a ‘sufficient level of education attainment’ is allows education to be included in multidimensional measures of poverty that view education as a key dimension of disadvantage.

  10. Statistically rigorous calculations do not support common input and long-term synchronization of motor-unit firings

    Science.gov (United States)

    Kline, Joshua C.

    2014-01-01

    Over the past four decades, various methods have been implemented to measure synchronization of motor-unit firings. In this work, we provide evidence that prior reports of the existence of universal common inputs to all motoneurons and the presence of long-term synchronization are misleading, because they did not use sufficiently rigorous statistical tests to detect synchronization. We developed a statistically based method (SigMax) for computing synchronization and tested it with data from 17,736 motor-unit pairs containing 1,035,225 firing instances from the first dorsal interosseous and vastus lateralis muscles—a data set one order of magnitude greater than that reported in previous studies. Only firing data, obtained from surface electromyographic signal decomposition with >95% accuracy, were used in the study. The data were not subjectively selected in any manner. Because of the size of our data set and the statistical rigor inherent to SigMax, we have confidence that the synchronization values that we calculated provide an improved estimate of physiologically driven synchronization. Compared with three other commonly used techniques, ours revealed three types of discrepancies that result from failing to use sufficient statistical tests necessary to detect synchronization. 1) On average, the z-score method falsely detected synchronization at 16 separate latencies in each motor-unit pair. 2) The cumulative sum method missed one out of every four synchronization identifications found by SigMax. 3) The common input assumption method identified synchronization from 100% of motor-unit pairs studied. SigMax revealed that only 50% of motor-unit pairs actually manifested synchronization. PMID:25210152

  11. Understanding Statistics - Cancer Statistics

    Science.gov (United States)

    Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.

  12. Ideal energy self-sufficient bioclimatic house

    Energy Technology Data Exchange (ETDEWEB)

    Talamo, C.

    1990-04-01

    This paper points out some of the interesting architectural features of a conceptual house designed to be self-sufficient with respect to conventional energy sources. Brief notes are given on the following special design characteristics: the house's orientation and form - essentially a V-shaped, two-storey design oriented so as to maximize the surface area exposed to winter insolation; its special low-emissivity glazing equipped with nightfall insulating screens; the adoption of maximized insulation, whose cost benefits were assessed based on amortization over the entire life span of the house; and hybrid space heating and ventilation systems involving the integration of pumps and ventilators for air circulation, together with a varied mix of active and passive solar heating and cooling systems.

  13. Statistical approach for selection of biologically informative genes.

    Science.gov (United States)

    Das, Samarendra; Rai, Anil; Mishra, D C; Rai, Shesh N

    2018-05-20

    Selection of informative genes from high-dimensional gene expression data has emerged as an important research area in genomics. Many gene selection techniques proposed so far are based on either a relevancy or a redundancy measure. Further, the performance of these techniques has been adjudged through post-selection classification accuracy computed through a classifier using the selected genes. This performance metric may be statistically sound but may not be biologically relevant. A statistical approach, i.e. Boot-MRMR, was proposed based on a composite measure of maximum relevance and minimum redundancy, which is both statistically sound and biologically relevant for informative gene selection. For comparative evaluation of the proposed approach, we developed two biological sufficiency criteria, i.e. Gene Set Enrichment with QTL (GSEQ) and a biological similarity score based on Gene Ontology (GO). Further, a systematic and rigorous evaluation of the proposed technique against 12 existing gene selection techniques was carried out using five gene expression datasets. This evaluation was based on a broad spectrum of statistically sound (e.g. subject classification) and biologically relevant (based on QTL and GO) criteria under a multiple-criteria decision-making framework. The performance analysis showed that the proposed technique selects informative genes which are more biologically relevant. The proposed technique is also found to be quite competitive with the existing techniques with respect to subject classification and computational time. Our results also showed that under the multiple-criteria decision-making setup, the proposed technique is best for informative gene selection over the available alternatives. Based on the proposed approach, an R package, i.e. BootMRMR, has been developed and is available at https://cran.r-project.org/web/packages/BootMRMR. This study will provide a practical guide to select statistical techniques for selecting informative genes.

  14. Feature-Based Statistical Analysis of Combustion Simulation Data

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, J; Krishnamoorthy, V; Liu, S; Grout, R; Hawkes, E; Chen, J; Pascucci, V; Bremer, P T

    2011-11-18

    We present a new framework for feature-based statistical analysis of large-scale scientific data and demonstrate its effectiveness by analyzing features from Direct Numerical Simulations (DNS) of turbulent combustion. Turbulent flows are ubiquitous and account for transport and mixing processes in combustion, astrophysics, fusion, and climate modeling among other disciplines. They are also characterized by coherent structure or organized motion, i.e. nonlocal entities whose geometrical features can directly impact molecular mixing and reactive processes. While traditional multi-point statistics provide correlative information, they lack nonlocal structural information, and hence, fail to provide mechanistic causality information between organized fluid motion and mixing and reactive processes. Hence, it is of great interest to capture and track flow features and their statistics together with their correlation with relevant scalar quantities, e.g. temperature or species concentrations. In our approach we encode the set of all possible flow features by pre-computing merge trees augmented with attributes, such as statistical moments of various scalar fields, e.g. temperature, as well as length-scales computed via spectral analysis. The computation is performed in an efficient streaming manner in a pre-processing step and results in a collection of meta-data that is orders of magnitude smaller than the original simulation data. This meta-data is sufficient to support a fully flexible and interactive analysis of the features, allowing for arbitrary thresholds, providing per-feature statistics, and creating various global diagnostics such as Cumulative Density Functions (CDFs), histograms, or time-series. We combine the analysis with a rendering of the features in a linked-view browser that enables scientists to interactively explore, visualize, and analyze the equivalent of one terabyte of simulation data. We highlight the utility of this new framework for combustion

  15. The feasibility and challenges of energy self-sufficient wastewater treatment plants

    International Nuclear Information System (INIS)

    Gu, Yifan; Li, Yue; Li, Xuyao; Luo, Pengzhou; Wang, Hongtao; Robinson, Zoe P.; Wang, Xin; Wu, Jiang; Li, Fengting

    2017-01-01

    Highlights: •Various influencing factors of energy use in WWTPs are characterized. •Benchmark energy consumption in WWTPs in different countries are highlighted. •Energy recovery or saving technologies in WWTPs are summarized. •Recent advances in optimization of energy recovery technologies are highlighted. •Feasibility and challenges of energy self-sufficient WWTPs are explored. -- Abstract: Energy efficiency optimization is crucial for wastewater treatment plants (WWTPs) because of increasing energy costs and concerns about global climate change. Energy efficiency optimization can be achieved through a combination of energy recovery from the wastewater treatment process and energy saving-related technologies. Through these two approaches energy self-sufficiency of WWTPs is achievable, and research is underway to reduce operation costs and energy consumption and to achieve carbon neutrality. In this paper, we analyze energy consumption and recovery in WWTPs and characterize the factors that influence energy use in WWTPs, including treatment techniques, treatment capacities, and regional differences. Recent advances in the optimization of energy recovery technologies and theoretical analysis models for the analysis of different technological solutions are presented. Despite some challenges in implementation, such as technological barriers and high investment costs, particularly in developing countries, this paper highlights the potential for more energy self-sufficient WWTPs to be established in the future.

  16. Is Inferior Alveolar Nerve Block Sufficient for Routine Dental Treatment in 4- to 6-year-old Children?

    Science.gov (United States)

    Pourkazemi, Maryam; Erfanparast, Leila; Sheykhgermchi, Sanaz; Ghanizadeh, Milad

    2017-01-01

    Pain control is one of the most important aspects of behavior management in children. The most common way to achieve pain control is by using local anesthetics (LA). Many studies describe that the buccal nerve innervates the buccal gingiva and mucosa of the mandible for a variable extent from the vicinity of the lower third molar to the lower canine. Given the importance of appropriate and complete LA in child-behavior control, in this study we examined the frequency of buccal gingiva anesthesia of the primary mandibular molars and canine after inferior alveolar nerve block injection in 4- to 6-year-old children. In this descriptive cross-sectional study, 220 4- to 6-year-old children were randomly selected and entered into the study. The inferior alveolar nerve block was injected with the same method and standards for all children, and after ensuring the success of the block injection, anesthesia of the buccal mucosa of the primary molars and canine was examined using a stick test, with the child's reaction scored on the sound, eye, motor (SEM) scale. The data from the study were analyzed using descriptive statistics and the statistical software Statistical Package for the Social Sciences (SPSS) version 21. The area most often left unanesthetized was the distobuccal gingiva of the second primary molars, while the area least often left unanesthetized was the gingiva of the primary canine. According to this study, in 15 to 30% of cases, after inferior alveolar nerve block injection, the primary mandibular molars' buccal mucosa is not anesthetized. How to cite this article: Pourkazemi M, Erfanparast L, Sheykhgermchi S, Ghanizadeh M. Is Inferior Alveolar Nerve Block Sufficient for Routine Dental Treatment in 4- to 6-year-old Children? Int J Clin Pediatr Dent 2017;10(4):369-372.

  17. National Statistical Commission and Indian Official Statistics*

    Indian Academy of Sciences (India)

    IAS Admin

    a good collection of official statistics of that time. With more .... statistical agencies and institutions to provide details of statistical activities .... ing several training programmes. .... successful completion of Indian Statistical Service examinations, the.

  18. Statistical parity-time-symmetric lasing in an optical fibre network.

    Science.gov (United States)

    Jahromi, Ali K; Hassan, Absar U; Christodoulides, Demetrios N; Abouraddy, Ayman F

    2017-11-07

    Parity-time (PT)-symmetry in optics is a condition whereby the real and imaginary parts of the refractive index across a photonic structure are deliberately balanced. This balance can lead to interesting optical phenomena, such as unidirectional invisibility, loss-induced lasing, single-mode lasing from multimode resonators, and non-reciprocal effects in conjunction with nonlinearities. Because PT-symmetry has been thought of as fragile, experimental realisations to date have usually been restricted to on-chip micro-devices. Here, we demonstrate that certain features of PT-symmetry are sufficiently robust to survive the statistical fluctuations associated with a macroscopic optical cavity. We examine the lasing dynamics in optical fibre-based coupled cavities more than a kilometre in length with balanced gain and loss. Although fluctuations can detune the cavity by more than the free spectral range, the behaviour of the lasing threshold and the laser power is that expected from a PT-stable system. Furthermore, we observe a statistical symmetry breaking upon varying the cavity loss.

  19. Official Statistics and Statistics Education: Bridging the Gap

    Directory of Open Access Journals (Sweden)

    Gal Iddo

    2017-03-01

    Full Text Available This article aims to challenge official statistics providers and statistics educators to ponder on how to help non-specialist adult users of statistics develop those aspects of statistical literacy that pertain to official statistics. We first document the gap in the literature in terms of the conceptual basis and educational materials needed for such an undertaking. We then review skills and competencies that may help adults to make sense of statistical information in areas of importance to society. Based on this review, we identify six elements related to official statistics about which non-specialist adult users should possess knowledge in order to be considered literate in official statistics: (1) the system of official statistics and its work principles; (2) the nature of statistics about society; (3) indicators; (4) statistical techniques and big ideas; (5) research methods and data sources; and (6) awareness and skills for citizens’ access to statistical reports. Based on this ad hoc typology, we discuss directions that official statistics providers, in cooperation with statistics educators, could take in order to (1) advance the conceptualization of skills needed to understand official statistics, and (2) expand educational activities and services, specifically by developing a collaborative digital textbook and a modular online course, to improve public capacity for understanding of official statistics.

  20. Complete self-sufficiency planning: designing and building disaster-ready hospitals.

    Science.gov (United States)

    Brands, Chad K; Hernandez, Raquel G; Stenberg, Arnold; Carnes, Gary; Ellen, Jonathan; Epstein, Michael; Strouse, Timothy

    2013-01-01

    The need for healthcare systems and academic medical centers to be optimally prepared in the event of a disaster is well documented. Events such as Hurricane Katrina demonstrate a major gap in disaster preparedness for at-risk medical institutions. To address this gap, we outline the components of complete self-sufficiency planning in designing and building hospitals that will function at full operational capacity in the event of a disaster. We review the processes used and outcomes achieved in building a new critical access, freestanding children's hospital in Florida. Given that hurricanes are the most frequently occurring natural disaster in Florida, the executive leadership of our hospital determined that we should be prepared for worst-case scenarios in the design and construction of a new hospital. A comprehensive vulnerability assessment was performed. A building planning process that engaged all of the stakeholders was used during the planning and design phases. Subsequent executive-level review and discussions determined that a disaster would require the services of a fully functional hospital. Lessons learned from our own institution's previous experiences and those of medical centers involved in the Hurricane Katrina disaster were informative and incorporated into an innovative set of hospital design elements used for construction of a new hospital with full operational capacity in a disaster. A freestanding children's hospital was constructed using a new framework for disaster planning and preparedness that we have termed complete self-sufficiency planning. We propose the use of complete self-sufficiency planning as a best practice for disaster preparedness in the design and construction of new hospital facilities.

  1. The Statistical Properties of Host Load

    Directory of Open Access Journals (Sweden)

    Peter A. Dinda

    1999-01-01

    Full Text Available Understanding how host load changes over time is instrumental in predicting the execution time of tasks or jobs, such as in dynamic load balancing and distributed soft real‐time systems. To improve this understanding, we collected week‐long, 1 Hz resolution traces of the Digital Unix 5 second exponential load average on over 35 different machines including production and research cluster machines, compute servers, and desktop workstations. Separate sets of traces were collected at two different times of the year. The traces capture all of the dynamic load information available to user‐level programs on these machines. We present a detailed statistical analysis of these traces here, including summary statistics, distributions, and time series analysis results. Two significant new results are that load is self‐similar and that it displays epochal behavior. All of the traces exhibit a high degree of self‐similarity with Hurst parameters ranging from 0.73 to 0.99, strongly biased toward the top of that range. The traces also display epochal behavior in that the local frequency content of the load signal remains quite stable for long periods of time (150–450 s mean) and changes abruptly at epoch boundaries. Despite these complex behaviors, we have found that relatively simple linear models are sufficient for short‐range host load prediction.
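
    Hurst parameters like those cited above can be estimated in several ways. The sketch below implements one standard estimator (the aggregated-variance method), not necessarily the one used in the paper, and the white-noise input is a hypothetical stand-in for a real load trace:

    ```python
    import numpy as np

    def hurst_aggvar(x, block_sizes=(4, 8, 16, 32, 64)):
        """Aggregated-variance Hurst estimate: for a self-similar increment
        process, Var(mean over blocks of size m) scales as m**(2H - 2), so H
        is recovered from the slope of a log-log regression."""
        x = np.asarray(x, dtype=float)
        logs_m, logs_v = [], []
        for m in block_sizes:
            nblocks = len(x) // m
            means = x[:nblocks * m].reshape(nblocks, m).mean(axis=1)
            logs_m.append(np.log(m))
            logs_v.append(np.log(means.var()))
        slope = np.polyfit(logs_m, logs_v, 1)[0]
        return 1 + slope / 2

    rng = np.random.default_rng(2)
    white = rng.normal(size=100_000)  # uncorrelated increments: H should be ~0.5
    print(round(hurst_aggvar(white), 2))
    ```

    A genuinely self-similar load trace with long-range dependence would yield an estimate well above 0.5, as the paper reports for its machines.
    
    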

  2. Statistical Power in Plant Pathology Research.

    Science.gov (United States)

    Gent, David H; Esker, Paul D; Kriss, Alissa B

    2018-01-01

    In null hypothesis testing, failure to reject a null hypothesis may have two potential interpretations. One interpretation is that the treatments being evaluated do not have a significant effect, and a correct conclusion was reached in the analysis. Alternatively, a treatment effect may have existed but the conclusion of the study was that there was none. This is termed a Type II error, which is most likely to occur when studies lack sufficient statistical power to detect a treatment effect. In basic terms, the power of a study is the ability to identify a true effect through a statistical test. The power of a statistical test is 1 - (the probability of Type II errors), and depends on the size of treatment effect (termed the effect size), variance, sample size, and significance criterion (the probability of a Type I error, α). Low statistical power is prevalent in scientific literature in general, including plant pathology. However, power is rarely reported, creating uncertainty in the interpretation of nonsignificant results and potentially underestimating small, yet biologically significant relationships. The appropriate level of power for a study depends on the impact of Type I versus Type II errors and no single level of power is acceptable for all purposes. Nonetheless, by convention 0.8 is often considered an acceptable threshold and studies with power less than 0.5 generally should not be conducted if the results are to be conclusive. The emphasis on power analysis should be in the planning stages of an experiment. Commonly employed strategies to increase power include increasing sample sizes, selecting a less stringent threshold probability for Type I errors, increasing the hypothesized or detectable effect size, including as few treatment groups as possible, reducing measurement variability, and including relevant covariates in analyses. Power analysis will lead to more efficient use of resources and more precisely structured hypotheses, and may even
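
    The power concepts above can be illustrated with a Monte Carlo estimate for a hypothetical two-sample t-test; the effect size and sample size below are illustrative choices, not values taken from the article:

    ```python
    import numpy as np
    from scipy import stats

    def power_ttest_mc(effect_size, n_per_group, alpha=0.05, n_sims=4000, seed=0):
        """Monte Carlo power of a two-sample t-test: the fraction of simulated
        experiments, generated under the assumed effect size, that reject H0."""
        rng = np.random.default_rng(seed)
        rejections = 0
        for _ in range(n_sims):
            a = rng.normal(effect_size, 1.0, n_per_group)
            b = rng.normal(0.0, 1.0, n_per_group)
            if stats.ttest_ind(a, b).pvalue < alpha:
                rejections += 1
        return rejections / n_sims

    # A medium effect (d = 0.5) with n = 64 per group gives power near the
    # conventional 0.8 threshold mentioned above.
    print(round(power_ttest_mc(0.5, 64), 2))
    ```

    Rerunning the function with smaller n shows how quickly power falls below the 0.5 level the article flags as inconclusive, which is the point of doing this analysis at the planning stage.
    
    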

  3. Statistical issues in the estimation of assigned shares for carcinogenesis liability

    International Nuclear Information System (INIS)

    Cox, L.A. Jr.

    1987-01-01

    Congress is currently considering adopting a mathematical formula to assign shares in cancer causation to specific doses of radiation, for use in establishing liability and compensation awards. The proposed formula, if it were sound, would allow difficult problems in tort law and public policy to be resolved by reference to tabulated probabilities of causation. This article examines the statistical and conceptual bases for the proposed methodology. We find that the proposed formula is incorrect as an expression for probability and causation, that it implies hidden, debatable policy judgments in its treatment of factor interactions and uncertainties, and that it cannot in general be quantified with sufficient precision to be useful. Three generic sources of statistical uncertainty are identified--sampling variability, population heterogeneity, and error propagation--that prevent accurate quantification of assigned shares. These uncertainties arise whenever aggregate epidemiological or risk data are used to draw causal inferences about individual cases.
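
    For context, the kind of formula at issue is conventionally written as AS = (RR - 1)/RR, the excess relative risk expressed as a fraction of total risk. The sketch below implements that conventional definition only; it does not capture the interaction and uncertainty issues the article criticizes:

    ```python
    def assigned_share(relative_risk):
        """Assigned share ('probability of causation') as conventionally
        defined: AS = (RR - 1) / RR. The relative risk must exceed 1 for the
        exposure to be assigned a positive share."""
        if relative_risk <= 1.0:
            return 0.0
        return (relative_risk - 1.0) / relative_risk

    # A dose that doubles the background cancer rate (RR = 2) yields AS = 0.5,
    # the threshold often proposed for "more likely than not" causation.
    print(assigned_share(2.0))  # -> 0.5
    ```

    The article's point is that plugging a tabulated RR into this formula hides the sampling variability, population heterogeneity, and propagated errors behind that single number.
    
    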

  4. Sufficient condition for black-hole formation in spherical gravitational collapse

    International Nuclear Information System (INIS)

    Giambo, Roberto; Giannoni, Fabio; Magli, Giulio

    2002-01-01

    A sufficient condition for the validity of cosmic censorship in spherical gravitational collapse is formulated and proved. The condition relies on an attractive mathematical property of the apparent horizon, which holds if 'minimal' requirements of physical reasonableness are satisfied by the matter model. (letter to the editor)

  5. Stability of matrices with sufficiently strong negative-dominant-diagonal submatrices

    NARCIS (Netherlands)

    Nieuwenhuis, H.J.; Schoonbeek, L.

    A well-known sufficient condition for stability of a system of linear first-order differential equations is that the matrix of the homogeneous dynamics has a negative dominant diagonal. However, this condition cannot be applied to systems of second-order differential equations. In this paper we

  6. Sufficient Stochastic Maximum Principle in a Regime-Switching Diffusion Model

    Energy Technology Data Exchange (ETDEWEB)

    Donnelly, Catherine, E-mail: C.Donnelly@hw.ac.uk [Heriot-Watt University, Department of Actuarial Mathematics and Statistics (United Kingdom)

    2011-10-15

    We prove a sufficient stochastic maximum principle for the optimal control of a regime-switching diffusion model. We show the connection to dynamic programming and we apply the result to a quadratic loss minimization problem, which can be used to solve a mean-variance portfolio selection problem.

  7. Sufficient Stochastic Maximum Principle in a Regime-Switching Diffusion Model

    International Nuclear Information System (INIS)

    Donnelly, Catherine

    2011-01-01

    We prove a sufficient stochastic maximum principle for the optimal control of a regime-switching diffusion model. We show the connection to dynamic programming and we apply the result to a quadratic loss minimization problem, which can be used to solve a mean-variance portfolio selection problem.

  8. Lead–acid batteries coupled with photovoltaics for increased electricity self-sufficiency in households

    International Nuclear Information System (INIS)

    Oliveira e Silva de, Guilherme; Hendrick, Patrick

    2016-01-01

    Highlights: • Grid parity is reached for PV installations up to nearly 40% self-sufficiency. • Reaching beyond 40% self-sufficiency requires storage and support policies. • Peak consumption remains constant but load variability rises with self-sufficiency. • Changes in power plants portfolio and wholesale electricity prices are expected. • Limiting feed-in power is a promising solution for reducing load variability. - Abstract: With distributed generation of electricity growing in importance (especially with photovoltaics) and buildings being one of the main consumers of energy in modern societies, distributed storage of energy in buildings is expected to become increasingly present. This paper analyses the use of residential lead–acid energy storage coupled with photovoltaics and its possible interaction with the grid for different limits of feed-in power without any support policies. In the literature, these subjects are often treated independently and for very specific, non-optimised cases, thus motivating further research. Results show that reaching self-sufficiency values up to 40% is possible, close to grid parity values, and only with photovoltaics. Beyond 40%, energy storage must be used, strongly raising the cost of the electricity consumed and therefore the need for support policies for widespread adoption. Also, peak power consumption from the grid remains constant and load variability rises, suggesting that an increase in self-sufficiency would be accompanied by lower utilisation factors of power plants and, consequently, higher wholesale electricity prices during hours without sunshine. Limiting feed-in power attenuates the increased load variability and only slightly affects the economic viability of such installations. These results present a novel optimisation tool for developers and should be considered in future studies of distributed photovoltaics and energy storage as well as in energy policy.
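
The trade-off described here can be illustrated with a toy hourly energy balance (a hypothetical sketch, not the paper's optimisation model): PV surplus charges a battery, deficits are served first from the battery and then from the grid, and self-sufficiency is the share of consumption not imported.

```python
def self_sufficiency(load, pv, capacity_kwh, eff=0.9):
    """Toy hourly energy balance: charge the battery with PV surplus,
    discharge it to cover deficits, import the rest from the grid.
    Returns the self-sufficiency ratio = 1 - grid_import / total load."""
    soc = 0.0
    grid_import = 0.0
    for l, p in zip(load, pv):
        surplus = p - l
        if surplus >= 0:                      # charge with PV surplus
            soc = min(capacity_kwh, soc + surplus * eff)
        else:
            deficit = -surplus
            discharge = min(soc, deficit)     # battery covers what it can
            soc -= discharge
            grid_import += deficit - discharge
    return 1.0 - grid_import / sum(load)

# Two stylised days: constant 1 kW load, PV only around midday.
load = [1.0] * 48
pv = ([0] * 8 + [3.0] * 8 + [0] * 8) * 2
print(round(self_sufficiency(load, pv, capacity_kwh=5.0), 2))   # 0.54
```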

  9. Statistical tests to compare motif count exceptionalities

    Directory of Open Access Journals (Sweden)

    Vandewalle Vincent

    2007-03-01

    Full Text Available. Background: Finding over- or under-represented motifs in biological sequences is now a common task in genomics. Thanks to p-value calculation for motif counts, exceptional motifs are identified and represent candidate functional motifs. The present work addresses the related question of comparing the exceptionality of one motif in two different sequences. Simply comparing the motif count p-values in each sequence is not sufficient to decide whether this motif is significantly more exceptional in one sequence than in the other; a statistical test is required. Results: We develop and analyze two statistical tests, an exact binomial one and an asymptotic likelihood ratio test, to decide whether the exceptionality of a given motif is equivalent or significantly different in two sequences of interest. For that purpose, motif occurrences are modeled by Poisson processes, with special care for overlapping motifs. Both tests can take the sequence compositions into account. As an illustration, we compare the octamer exceptionalities in the Escherichia coli K-12 backbone versus variable strain-specific loops. Conclusion: The exact binomial test is particularly adapted to small counts. For large counts, we advise using the likelihood ratio test, which is asymptotic but strongly correlated with the exact binomial test and very simple to use.
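
The conditional construction behind such an exact binomial test can be sketched as follows (a hypothetical illustration of the general idea, not the authors' implementation; the expected counts e1 and e2 stand in for the Poisson-process intensities, and overlap corrections are ignored):

```python
from math import comb

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def exact_binomial_test(k1, k2, e1, e2):
    """Conditional test for equal motif exceptionality in two sequences.
    Under H0 the counts are Poisson with expectations e1 and e2, so given
    the total n = k1 + k2, k1 ~ Binomial(n, e1 / (e1 + e2)).  Two-sided
    p-value: sum over outcomes no more probable than the observed one."""
    n = k1 + k2
    p = e1 / (e1 + e2)
    observed = binom_pmf(k1, n, p)
    return sum(binom_pmf(k, n, p) for k in range(n + 1)
               if binom_pmf(k, n, p) <= observed + 1e-12)

# 12 vs 3 occurrences with equal expected counts: clearly unbalanced.
print(round(exact_binomial_test(12, 3, 10.0, 10.0), 4))   # 0.0352
```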

  10. Reliability and statistical power analysis of cortical and subcortical FreeSurfer metrics in a large sample of healthy elderly.

    Science.gov (United States)

    Liem, Franziskus; Mérillat, Susan; Bezzola, Ladina; Hirsiger, Sarah; Philipp, Michel; Madhyastha, Tara; Jäncke, Lutz

    2015-03-01

    FreeSurfer is a tool to quantify cortical and subcortical brain anatomy automatically and noninvasively. Previous studies have reported reliability and statistical power analyses in relatively small samples or only selected one aspect of brain anatomy. Here, we investigated reliability and statistical power of cortical thickness, surface area, volume, and the volume of subcortical structures in a large sample (N=189) of healthy elderly subjects (64+ years). Reliability (intraclass correlation coefficient) of cortical and subcortical parameters is generally high (cortical: ICCs>0.87, subcortical: ICCs>0.95). Surface-based smoothing increases reliability of cortical thickness maps, while it decreases reliability of cortical surface area and volume. Nevertheless, statistical power of all measures benefits from smoothing. When aiming to detect a 10% difference between groups, the number of subjects required to test effects with sufficient power over the entire cortex varies between cortical measures (cortical thickness: N=39, surface area: N=21, volume: N=81; 10mm smoothing, power=0.8, α=0.05). For subcortical regions this number is between 16 and 76 subjects, depending on the region. We also demonstrate the advantage of within-subject designs over between-subject designs. Furthermore, we publicly provide a tool that allows researchers to perform a priori power analysis and sensitivity analysis to help evaluate previously published studies and to design future studies with sufficient statistical power. Copyright © 2014 Elsevier Inc. All rights reserved.
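
The kind of a priori calculation such a power-analysis tool performs can be sketched with the usual normal-approximation formula for two independent groups (a simplified illustration with hypothetical numbers, not the authors' tool, and without the whole-cortex multiple-comparison correction their figures include):

```python
import math
from statistics import NormalDist

def n_per_group(effect_size, power=0.8, alpha=0.05):
    """Per-group N for a two-sample comparison of means (normal
    approximation): n = 2 * ((z_{1-alpha/2} + z_power) / d)**2, where d
    is Cohen's d, the group difference in units of the pooled SD."""
    z = NormalDist()
    n = 2 * ((z.inv_cdf(1 - alpha / 2) + z.inv_cdf(power)) / effect_size) ** 2
    return math.ceil(n)

# E.g. a 10% group difference against a hypothetical 12% SD gives d ~ 0.83.
print(n_per_group(10 / 12))   # 23 per group
print(n_per_group(0.5))       # 63 per group
```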

  11. Sufficient condition for black-hole formation in spherical gravitational collapse

    Energy Technology Data Exchange (ETDEWEB)

    Giambo, Roberto [Dipartimento di Matematica e Fisica, Universita di Camerino (Italy); Giannoni, Fabio [Dipartimento di Matematica e Fisica, Universita di Camerino (Italy); Magli, Giulio [Dipartimento di Matematica, Politecnico di Milano (Italy)

    2002-01-21

    A sufficient condition for the validity of cosmic censorship in spherical gravitational collapse is formulated and proved. The condition relies on an attractive mathematical property of the apparent horizon, which holds if 'minimal' requirements of physical reasonableness are satisfied by the matter model. (letter to the editor)

  12. Energy self-sufficient sensory ball screw drive; Energieautarker sensorischer Kugelgewindetrieb

    Energy Technology Data Exchange (ETDEWEB)

    Bertram, Oliver

    2012-07-01

    Nowadays the availability of machine tools plays a decisive role in the competition to increase productivity. The state of the art shows that, owing to abrasive wear, ball screw drives are the components of feed drives most prone to failure. Condition monitoring makes it possible to avoid unplanned machine failures and to increase the availability of production facilities, and the use of additional sensors allows the direct acquisition of wear-correlated measurements. To reduce the integration effort and to increase robustness, reliability and clarity in an industrial environment, energy-self-sufficient sensor systems can be applied. This thesis describes the development and investigation of an energy-self-sufficient sensory ball screw drive that directly measures the wear-correlated pretension for condition monitoring. The prototype measures the pretension with force sensors based on strain gauges. The sensor system includes microcontroller-based electronics for signal processing as well as wireless data transmission using the ZigBee standard. A hybrid system ensures the energy supply of the sensor system: a stepper-motor generator produces electrical energy from the motion of the ball screw drive, and an energy buffer based on supercapacitors is recharged at standstill by wireless energy transmission. A prototype system was built for verification. Measurements characterise the sensory and energetic behaviour of the energy-self-sufficient sensor system, and the functionality of the ball screw drive as well as the signal characteristics of the force sensors are examined for different pretensions. In addition, endurance trials established the pretension losses due to wear, so that timely maintenance can be planned.

  13. INSTITUTIONAL MANAGEMENT OF EUROPEAN STATISTICS AND OF THEIR QUALITY - CURRENT CONCERNS AT EUROPEAN LEVEL

    Directory of Open Access Journals (Sweden)

    Daniela ŞTEFĂNESCU

    2011-08-01

    Full Text Available. The quality and reliability of official statistics have become the main topics of debate concerning statistical governance in Europe. The Council welcomed the Commission Communication to the European Parliament and to the Council, «Towards robust quality management for European Statistics» (COM 211), appreciating that the approach and objective of the strategy would give the European Statistical System (ESS) a quality-management framework for the coordination of consolidated economic policies. The Council pointed out that the management of the European Statistical System has improved in recent years and that progress has been made in the production and dissemination of high-quality statistics within the European Union, but it also noted that, in the context of the recent financial crisis, certain weaknesses were identified, particularly in the general framework of quality management. The „Greece case" proved that this progress was not enough to guarantee the complete independence of national statistical institutes and showed the need to consolidate ESS governance further. Several undertakings are now in the preparatory stage, in accordance with the Commission Communication; these actions are welcome, but the question arises: are they sufficient to solve the problem definitively? The paper attempts to identify a different, innovative (courageous! long-run path towards an advanced institutional structure of the ESS: setting up a European System of Statistical Institutes, similar to the European System of Central Banks, which would require a change in the Treaty.

  14. Staple Food Self-Sufficiency of Farmers Household Level in The Great Solo

    Science.gov (United States)

    Darsono

    2017-04-01

    Analysis of food security at the household level is a novelty relative to the usual measurement standards, which cover the regional and national levels; the household approach is expected to provide a sharper basis for food-policy formulation. The purposes of this study are to identify the state of staple-food self-sufficiency and to find the main factors affecting its dynamics at the farm-household level in Great Solo. The study uses primary data from a sample of 50 farmers and secondary data for Great Solo (Surakarta city, Boyolali, Sukoharjo, Karanganyar, Wonogiri, Sragen and Klaten). The compiled panel data were analyzed with linear probability regression models. The results show that farm households in Great Solo have a surplus of the staple food (rice), with an average consumption of 96.8 kg/capita/year, lower than the national rate of 136.7 kg/capita/year. The main factors affecting food self-sufficiency at the farm-household level are rice production, rice consumption, land tenure, and the number of family members. The key recommendations of this study are to increase the scale of land cultivated for rice farming and to diversify consumption away from rice.

  15. The statistical decay of very hot nuclei: from sequential decay to multifragmentation

    International Nuclear Information System (INIS)

    Carlson, B.V.; Donangelo, R.; Universidad de la Republica, Montevideo; Souza, S.R.; Universidade Federal do Rio Grande do Sul; Lynch, W.G.; Steiner, A.W.; Tsang, M.B.

    2010-01-01

    Full text. At low excitation energies, the compound nucleus typically decays through the sequential emission of light particles. As the energy increases, the emission probability of heavier fragments increases until, at sufficiently high energies, several heavy complex fragments are emitted during the decay. The extent to which this fragment emission is simultaneous or sequential has been a subject of theoretical and experimental study for almost 30 years. The Statistical Multifragmentation Model, an equilibrium model of simultaneous fragment emission, uses the configurations of a statistical ensemble to determine the distribution of primary fragments of a compound nucleus. The primary fragments are then assumed to decay by sequential compound emission or Fermi breakup. As the first step toward a more unified model of these processes, we demonstrate the equivalence of a generalized Fermi breakup model, in which densities of excited states are taken into account, to the microcanonical version of the statistical multifragmentation model. We then establish a link between this unified Fermi breakup / statistical multifragmentation model and the well-known process of compound nucleus emission, which permits us to consider simultaneous and sequential emission on the same footing. Within this unified framework, we analyze the increasing importance of simultaneous, multifragment decay with increasing excitation energy and decreasing lifetime of the compound nucleus. (author)

  16. Statistical thermodynamics

    International Nuclear Information System (INIS)

    Lim, Gyeong Hui

    2008-03-01

    This book consists of 15 chapters, covering: the basic concepts and meaning of statistical thermodynamics; Maxwell-Boltzmann statistics; ensembles; thermodynamic functions and fluctuations; the statistical dynamics of independent-particle systems; ideal molecular systems; chemical equilibrium and chemical reaction rates in ideal gas mixtures; classical statistical thermodynamics; the ideal lattice model; lattice statistics and non-ideal lattice models; imperfect-gas theory applied to liquids; the theory of solutions; the statistical thermodynamics of interfaces; the statistical thermodynamics of macromolecular systems; and quantum statistics.

  17. Statistical process control: An approach to quality assurance in the production of vitrified nuclear waste

    International Nuclear Information System (INIS)

    Pulsipher, B.A.; Kuhn, W.L.

    1987-01-01

    Current planning for liquid high-level nuclear wastes existing in the United States includes processing in a liquid-fed ceramic melter to incorporate it into a high-quality glass, and placement in a deep geologic repository. The nuclear waste vitrification process requires assurance of a quality product with little or no final inspection. Statistical process control (SPC) is a quantitative approach to one quality assurance aspect of vitrified nuclear waste. This method for monitoring and controlling a process in the presence of uncertainties provides a statistical basis for decisions concerning product quality improvement. Statistical process control is shown to be a feasible and beneficial tool to help the waste glass producers demonstrate that the vitrification process can be controlled sufficiently to produce an acceptable product. This quantitative aspect of quality assurance could be an effective means of establishing confidence in the claims to a quality product
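
The core SPC idea can be sketched in a few lines (hypothetical data; for simplicity, sigma is estimated from the spread of the subgroup means rather than the usual within-subgroup R-bar/d2 estimate):

```python
from statistics import mean, stdev

def shewhart_limits(samples):
    """Centre line and 3-sigma control limits for an X-bar chart,
    with sigma estimated from the spread of the subgroup means."""
    xbars = [mean(s) for s in samples]
    centre = mean(xbars)
    sigma = stdev(xbars)
    return centre - 3 * sigma, centre, centre + 3 * sigma

def out_of_control(samples):
    """Indices of subgroups whose mean falls outside the control limits."""
    lo, _, hi = shewhart_limits(samples)
    return [i for i, s in enumerate(samples) if not lo <= mean(s) <= hi]

# Hypothetical glass-property subgroups: all means inside the limits,
# so the process would be judged in statistical control.
batches = [[10.1, 9.9], [10.0, 10.2], [9.8, 10.0], [10.1, 10.1]]
print(out_of_control(batches))   # []
```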

  18. Statistical process control: An approach to quality assurance in the production of vitrified nuclear waste

    International Nuclear Information System (INIS)

    Pulsipher, B.A.; Kuhn, W.L.

    1987-02-01

    Current planning for liquid high-level nuclear wastes existing in the US includes processing in a liquid-fed ceramic melter to incorporate it into a high-quality glass, and placement in a deep geologic repository. The nuclear waste vitrification process requires assurance of a quality product with little or no final inspection. Statistical process control (SPC) is a quantitative approach to one quality assurance aspect of vitrified nuclear waste. This method for monitoring and controlling a process in the presence of uncertainties provides a statistical basis for decisions concerning product quality improvement. Statistical process control is shown to be a feasible and beneficial tool to help the waste glass producers demonstrate that the vitrification process can be controlled sufficiently to produce an acceptable product. This quantitative aspect of quality assurance could be an effective means of establishing confidence in the claims to a quality product. 2 refs., 4 figs

  19. A solar house self-sufficient of energy. Experiences on the way to energy autarky

    International Nuclear Information System (INIS)

    Voss, K.; Dohlen, K. v.; Lehmberg, H.; Stahl, W.; Wittwer, C.; Goetzberger, A.

    1994-01-01

    The energy-self-sufficient solar house in Freiburg was completed in October 1992. After a long and complex planning phase, measuring and monitoring tasks as well as the realization of improvement measures are now to the fore. This article presents exemplary results of the first year of operation and compares them with the expectations. Self-sufficient operation of the building was attained between April and October 1993. Among other things, hydrogen was successfully produced by photovoltaic-supplied electrolysis and was largely used for thermal applications (cooking, heating). The fact that the energy supply was not self-sufficient all year round was due to the failure of the fuel cell used to generate electric power from the hydrogen. (orig./BWI) [de

  20. [Statistics for statistics?--Thoughts about psychological tools].

    Science.gov (United States)

    Berger, Uwe; Stöbel-Richter, Yve

    2007-12-01

    Statistical methods occupy a prominent place in psychologists' education. Known as difficult to understand and hard to learn, they are feared by students. Those who do not aspire to a research career at a university quickly forget the drilled content. Moreover, because it seems at first glance not to apply to work with patients and other target groups, the methodological education as a whole has often been questioned. For many psychological practitioners, statistical education makes sense only as a way of commanding respect from other professions, namely physicians. For their own practice, statistics is rarely taken seriously as a professional tool. The reason seems clear: statistics treats numbers, while psychotherapy treats subjects. So, is statistics an end in itself? In this article, we try to answer the question of whether and how statistical methods are represented in psychotherapeutic and psychological research. We therefore analyzed 46 original papers of a complete volume of the journal Psychotherapy, Psychosomatics, Psychological Medicine (PPmP). Within the volume, 28 different analysis methods were applied, 89 per cent of which were directly based on statistics. Being able to write and critically read original papers, the backbone of research, presumes a high degree of statistical education. To ignore statistics is to ignore research, and ultimately to leave one's own professional work open to arbitrariness.

  1. Stable dynamics in forced systems with sufficiently high/low forcing frequency.

    Science.gov (United States)

    Bartuccelli, M; Gentile, G; Wright, J A

    2016-08-01

    We consider parametrically forced Hamiltonian systems with one-and-a-half degrees of freedom and study the stability of the dynamics when the frequency of the forcing is relatively high or low. We show that, provided the frequency is sufficiently high, the Kolmogorov-Arnold-Moser (KAM) theorem may be applied even when the forcing amplitude is far away from the perturbation regime. A similar result is obtained for sufficiently low frequency, but in that case we need the amplitude of the forcing to be not too large; however, we are still able to consider amplitudes which are outside of the perturbation regime. In addition, we find numerically that the dynamics may be stable even when the forcing amplitude is very large, well beyond the range of validity of the analytical results, provided the frequency of the forcing is taken correspondingly low.

  2. When is there sufficient information from the Site Investigations?

    International Nuclear Information System (INIS)

    Andersson, Johan; Munier, Raymond; Stroem, Anders; Soederbaeck, Bjoern; Almen, Karl-Erik; Olsson, Lars

    2004-04-01

    SKB has started site investigations for a deep repository for spent nuclear fuel at two different sites in Sweden. The investigations should provide necessary information for a licence application aimed at starting underground exploration. The investigations and analyses of them are supposed to provide the broad knowledge base that is required to achieve the overall goals of the site investigation phase. The knowledge will be utilized to evaluate the suitability of investigated sites for the deep repository and must be comprehensive enough to: Show whether the selected site satisfies requirements on safety and technical aspects. Serve as a basis for adaptation of the deep repository to the characteristics of the site with an acceptable impact on society and the environment. Permit comparisons with other investigated sites. Furthermore, the investigations are discontinued when the reliability of the site description has reached such a level that the body of data for safety assessment and design is sufficient, or until the body of data shows that the rock does not satisfy the requirements. These objectives are valid, but do not provide sufficient and concrete guidance. For this reason SKB has conducted this project which should acquire concrete guidance on how to judge when the surface based Site Investigation Phase does not need to continue. After a general assessment of the problem, the following specific objectives of the current work were identified: Demonstrate concretely how the assessed uncertainties in a Site Description based on a specific level of investigations, together with expected feedback from Safety Assessment and Engineering, can be used to decide whether the site investigations are sufficient - or need to continue. This demonstration will be based on a practical application of relevant aspects of decision analysis tools. 
Highlight and make concrete the type of feedback to be expected from Safety Assessment and Engineering and show how this feedback

  3. When is there sufficient information from the Site Investigations?

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Johan [JA Streamflow AB, Aelvsjoe (Sweden); Munier, Raymond; Stroem, Anders; Soederbaeck, Bjoern [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden); Almen, Karl-Erik [KEA Geo-konsult (Sweden); Olsson, Lars [Geostatistik AB, Tumba (Sweden)

    2004-04-01

    SKB has started site investigations for a deep repository for spent nuclear fuel at two different sites in Sweden. The investigations should provide necessary information for a licence application aimed at starting underground exploration. The investigations and analyses of them are supposed to provide the broad knowledge base that is required to achieve the overall goals of the site investigation phase. The knowledge will be utilized to evaluate the suitability of investigated sites for the deep repository and must be comprehensive enough to: Show whether the selected site satisfies requirements on safety and technical aspects. Serve as a basis for adaptation of the deep repository to the characteristics of the site with an acceptable impact on society and the environment. Permit comparisons with other investigated sites. Furthermore, the investigations are discontinued when the reliability of the site description has reached such a level that the body of data for safety assessment and design is sufficient, or until the body of data shows that the rock does not satisfy the requirements. These objectives are valid, but do not provide sufficient and concrete guidance. For this reason SKB has conducted this project which should acquire concrete guidance on how to judge when the surface based Site Investigation Phase does not need to continue. After a general assessment of the problem, the following specific objectives of the current work were identified: Demonstrate concretely how the assessed uncertainties in a Site Description based on a specific level of investigations, together with expected feedback from Safety Assessment and Engineering, can be used to decide whether the site investigations are sufficient - or need to continue. This demonstration will be based on a practical application of relevant aspects of decision analysis tools. 
Highlight and make concrete the type of feedback to be expected from Safety Assessment and Engineering and show how this feedback

  4. Revenue Sufficiency and Reliability in a Zero Marginal Cost Future

    Energy Technology Data Exchange (ETDEWEB)

    Frew, Bethany A.

    2017-04-17

    Features of existing wholesale electricity markets, such as administrative pricing rules and policy-based reliability standards, can distort market incentives from allowing generators sufficient opportunities to recover both fixed and variable costs. Moreover, these challenges can be amplified by other factors, including (1) inelastic demand resulting from a lack of price signal clarity, (2) low- or near-zero marginal cost generation, particularly arising from low natural gas fuel prices and variable generation (VG), such as wind and solar, and (3) the variability and uncertainty of this VG. As power systems begin to incorporate higher shares of VG, many questions arise about the suitability of the existing marginal-cost-based price formation, primarily within an energy-only market structure, to ensure the economic viability of resources that might be needed to provide system reliability. This article discusses these questions and provides a summary of completed and ongoing modelling-based work at the National Renewable Energy Laboratory to better understand the impacts of evolving power systems on reliability and revenue sufficiency.

  5. Structural characterization and condition for measurement statistics preservation of a unital quantum operation

    International Nuclear Information System (INIS)

    Lee, Kai-Yan; Fung, Chi-Hang Fred; Chau, H F

    2013-01-01

    We investigate the necessary and sufficient condition for a convex cone of positive semidefinite operators to be fixed by a unital quantum operation ϕ acting on finite-dimensional quantum states. By reducing this problem to the problem of simultaneous diagonalization of the Kraus operators associated with ϕ, we can completely characterize the kinds of quantum states that are fixed by ϕ. Our work has several applications. It gives a simple proof of the structural characterization of a unital quantum operation that acts on finite-dimensional quantum states—a result not explicitly mentioned in earlier studies. It also provides a necessary and sufficient condition for determining what kind of measurement statistics is preserved by a unital quantum operation. Finally, our result clarifies and extends the work of Størmer by giving a proof of a reduction theorem on the unassisted and entanglement-assisted classical capacities, coherent information, and minimal output Renyi entropy of a unital channel acting on a finite-dimensional quantum state. (paper)
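
A unital quantum operation is one whose Kraus operators {K_i} satisfy Σ K_i K_i† = I, in addition to the trace-preservation condition Σ K_i† K_i = I. A small hypothetical check for a qubit phase-flip channel, which satisfies both:

```python
import math

def matmul(a, b):
    """2x2 complex matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def dagger(a):
    """Conjugate transpose."""
    return [[complex(a[j][i]).conjugate() for j in range(2)] for i in range(2)]

def kraus_sum(ks, unital=True):
    """Sum of K K-dagger (unitality) or K-dagger K (trace preservation)."""
    total = [[0j, 0j], [0j, 0j]]
    for k in ks:
        term = matmul(k, dagger(k)) if unital else matmul(dagger(k), k)
        total = [[total[i][j] + term[i][j] for j in range(2)] for i in range(2)]
    return total

def is_identity(m, tol=1e-12):
    return all(abs(m[i][j] - (1 if i == j else 0)) < tol
               for i in range(2) for j in range(2))

# Phase-flip channel: K0 = sqrt(1-p) I, K1 = sqrt(p) Z.
p = 0.3
K0 = [[math.sqrt(1 - p), 0], [0, math.sqrt(1 - p)]]
K1 = [[math.sqrt(p), 0], [0, -math.sqrt(p)]]
print(is_identity(kraus_sum([K0, K1], unital=True)),
      is_identity(kraus_sum([K0, K1], unital=False)))   # True True
```

By contrast, an amplitude-damping channel is trace-preserving but not unital, so it falls outside the class of operations the paper characterizes.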

  6. Pathogens in Sludge: A Case of Sufficient Challenge

    Energy Technology Data Exchange (ETDEWEB)

    Vesilind, P. Aarne

    2003-07-01

    There is increasing pressure in many countries to strengthen the regulations controlling the land disposal of wastewater sludges. In this paper I argue that although there is little doubt that sludges from wastewater treatment contain pathogenic organisms, not only are there no data to show that such disposal is a public health problem, but I want to suggest that small doses of pathogens in the environment provide a ''sufficient challenge'' that actually enhances public health. There therefore seems little reason, from a public health standpoint, to pass stricter sludge disposal regulations. (author)

  7. Operation of CANDU power reactor in thorium self-sufficient fuel cycle

    Indian Academy of Sciences (India)

    These disadvantages of thorium fuel cycle were seemingly the reasons why that ... According to the data of figure 2, maximum (equilibrium) content of 233U in ..... Self-sufficient mode is related with rather big effort in the extraction of isotopes of.

  8. Sufficient conditions for robust BIBO stabilization : given by the gap metric

    NARCIS (Netherlands)

    Zhu, S.Q.; Hautus, M.L.J.; Praagman, C.

    1988-01-01

    A relation between coprime fractions and the gap metric is presented. Using this result we provide some sufficient conditions for robust BIBO stabilization for a wide class of systems. These conditions allow the plant and the compensator to be disturbed simultaneously.

  9. Three-dimensional electromagnetic strong turbulence. I. Scalings, spectra, and field statistics

    International Nuclear Information System (INIS)

    Graham, D. B.; Robinson, P. A.; Cairns, Iver H.; Skjaeraasen, O.

    2011-01-01

    The first fully three-dimensional (3D) simulations of large-scale electromagnetic strong turbulence (EMST) are performed by numerically solving the electromagnetic Zakharov equations for electron thermal speeds v_e with v_e/c ≥ 0.025. The results of these simulations are presented, focusing on scaling behavior, energy density spectra, and field statistics of the Langmuir (longitudinal) and transverse components of the electric fields during steady-state strong turbulence, where multiple wave packets collapse simultaneously and the system is approximately statistically steady in time. It is shown that for v_e/c ≲ 0.17 strong turbulence is approximately electrostatic and can be explained using the electrostatic two-component model. For v_e/c ≳ 0.17 the power-law behaviors of the scalings, spectra, and field statistics differ from the electrostatic predictions and results because v_e/c is sufficiently high to allow transverse modes to become trapped in density wells. The results are compared with those of past 3D electrostatic strong turbulence (ESST) simulations and 2D EMST simulations. For number density perturbations, the scaling behavior, spectra, and field statistics are shown to be only weakly dependent on v_e/c, whereas the Langmuir and transverse scalings, spectra, and field statistics are shown to be strongly dependent on v_e/c. Three-dimensional EMST is shown to have features in common with 2D EMST, such as a two-component structure and trapping of transverse modes, both of which depend on v_e/c.

  10. Calculating statistical distributions from operator relations: The statistical distributions of various intermediate statistics

    International Nuclear Information System (INIS)

    Dai, Wu-Sheng; Xie, Mi

    2013-01-01

    In this paper, we give a general discussion of the calculation of the statistical distribution from a given operator relation of creation, annihilation, and number operators. Our result shows that as long as the relation between the number operator and the creation and annihilation operators can be expressed as a†b = Λ(N) or N = Λ⁻¹(a†b), where N, a†, and b denote the number, creation, and annihilation operators, i.e., N is a function of the quadratic product of the creation and annihilation operators, the corresponding statistical distribution is the Gentile distribution, a statistical distribution in which the maximum occupation number is an arbitrary integer. As examples, we discuss the statistical distributions corresponding to various operator relations. In particular, besides the Bose–Einstein and Fermi–Dirac cases, we discuss the statistical distributions for various schemes of intermediate statistics, especially various q-deformation schemes. Our result shows that the statistical distributions corresponding to various q-deformation schemes are various Gentile distributions with different maximum occupation numbers, determined by the deformation parameter q. This result shows that the results given in much of the literature on the q-deformation distribution are inaccurate or incomplete. -- Highlights: ► A general discussion of calculating the statistical distribution from relations of the creation, annihilation, and number operators. ► A systematic study of the statistical distributions corresponding to various q-deformation schemes. ► Argues that many results on q-deformation distributions in the literature are inaccurate or incomplete
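
The Gentile distribution mentioned here, for maximum occupation number q, has mean occupation ⟨n⟩ = Σ_{n=0}^{q} n xⁿ / Σ_{n=0}^{q} xⁿ with x = e^{−β(ε−μ)}; q = 1 recovers Fermi–Dirac and q → ∞ recovers Bose–Einstein. A small numerical sketch (illustrative parameter values only):

```python
import math

def gentile_occupation(beta, eps, mu, q):
    """Mean occupation for Gentile statistics with maximum occupancy q:
    <n> = sum_{n=0}^{q} n x**n / sum_{n=0}^{q} x**n,
    where x = exp(-beta * (eps - mu))."""
    x = math.exp(-beta * (eps - mu))
    num = sum(n * x**n for n in range(q + 1))
    den = sum(x**n for n in range(q + 1))
    return num / den

beta, eps, mu = 1.0, 1.0, 0.0
fd = 1.0 / (math.exp(beta * (eps - mu)) + 1)   # Fermi-Dirac
be = 1.0 / (math.exp(beta * (eps - mu)) - 1)   # Bose-Einstein
print(abs(gentile_occupation(beta, eps, mu, 1) - fd) < 1e-12)    # True: q = 1
print(abs(gentile_occupation(beta, eps, mu, 500) - be) < 1e-9)   # True: q large
```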

  11. Sufficient conditions for BIBO robust stabilization : given by the gap metric

    NARCIS (Netherlands)

    Zhu, S.Q.; Hautus, M.L.J.; Praagman, C.

    1987-01-01

    A relation between coprime fractions and the gap metric is presented. Using this result, we provide some sufficient conditions for BIBO robust stabilization for a very wide class of systems. These conditions allow the plant and compensator to be perturbed simultaneously. Keywords: Robust

  12. Necessary and Sufficient Condition for Local Exponential Synchronization of Nonlinear Systems

    NARCIS (Netherlands)

    Andrieu, Vincent; Jayawardhana, Bayu; Tarbouriech, Sophie

    2015-01-01

    Based on recent works on transverse exponential stability, some necessary and sufficient conditions for the existence of a (locally) exponential synchronizer are established. We show that the existence of a structured synchronizer is equivalent to the existence of a stabilizer for the individual

  13. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the

  14. Impact of Market Behavior, Fleet Composition, and Ancillary Services on Revenue Sufficiency

    Energy Technology Data Exchange (ETDEWEB)

    Frew, Bethany

    2016-04-26

    This presentation provides an overview of new and ongoing NREL research that aims to improve our understanding of reliability and revenue sufficiency challenges through modeling tools within a markets framework.

  15. The Effects of Simple Necessity and Sufficiency Relationships on Children's Causal Inferences

    Science.gov (United States)

    Siegler, Robert S.

    1976-01-01

    Attempted to determine (1) whether developmental differences existed in children's comprehension of simple necessity and simple sufficiency relationships, and (2) the source of developmental differences in children's causal reasoning. (SB)

  16. Application of statistical dynamical turbulence closures to data assimilation

    International Nuclear Information System (INIS)

    O'Kane, Terence J; Frederiksen, Jorgen S

    2010-01-01

    We describe the development of an accurate yet computationally tractable statistical dynamical closure theory for general inhomogeneous turbulent flows, coined the quasi-diagonal direct interaction approximation closure (QDIA), and its application to problems in data assimilation. The QDIA provides prognostic equations for evolving mean fields, covariances and higher-order non-Gaussian terms, all of which are also required in the formulation of data assimilation schemes for nonlinear geophysical flows. The QDIA is a generalization of the class of direct interaction approximation theories, initially developed by Kraichnan (1959 J. Fluid Mech. 5 497) for isotropic turbulence, to fully inhomogeneous flows and has been further generalized to allow for both inhomogeneous and non-Gaussian initial conditions and long integrations. A regularization procedure or empirical vertex renormalization that ensures correct inertial range spectra is also described. The aim of this paper is to provide a coherent mathematical description of the QDIA turbulence closure and closure-based data assimilation scheme we have labeled the statistical dynamical Kalman filter. The mathematical formalism presented has been synthesized from recent works of the authors with some additional material and is presented in sufficient detail that the paper is of a pedagogical nature.
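
    The closure-based assimilation scheme described here replaces ensemble-estimated forecast statistics with mean fields and covariances computed from the QDIA equations, but the analysis step itself follows the standard Kalman form. As a reference point only (the scalar numbers below are illustrative, not taken from the paper), the textbook update is:

```python
# Textbook Kalman analysis step for a scalar state, shown only to fix
# notation: in the statistical dynamical Kalman filter, the forecast
# mean and covariance would come from the QDIA closure rather than
# from an ensemble.  All numerical values are illustrative.
x_f, P_f = 1.2, 0.5    # forecast mean and forecast error covariance
y_obs, R = 1.5, 0.25   # observation and observation error covariance
H = 1.0                # observation operator

K = P_f * H / (H * P_f * H + R)    # Kalman gain
x_a = x_f + K * (y_obs - H * x_f)  # analysis (posterior) mean
P_a = (1.0 - K * H) * P_f          # analysis error covariance

print(x_a, P_a)
```

    The gain K weights the observation against the forecast according to their error covariances; the closure supplies exactly the P_f-type quantities that make this step well defined for turbulent flows.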

  17. Actual problems of accession in relation with library statistics

    Directory of Open Access Journals (Sweden)

    Tereza Poličnik-Čermelj

    2010-01-01

    Full Text Available Accession is the process of recording bibliographic units in an accession register. Typically, library materials are acquired by purchase, exchange, gift or legal deposit. However, the COBISS (Cooperative Online Bibliographic System and Services) Holdings software module includes some additional methods of acquisition, which cause problems in gathering and presenting statistical data on local holdings. The article explains how to record holdings of different types of library materials and how to record retrospective collections. It describes the procedures necessary when the codes that define the publication pattern of the holdings are changed, with special attention to integrating resources. Procedures for the accession and circulation of bound materials, supplementary materials, teaching sets, multi-part items, multimedia and collection-level catalogue records are described. Attention is given to errors in recording lost-item replacements and to problems in the circulation of certain types of library materials. The author also suggests how to record remote electronic resources. It is recommended that holdings data be verified before the accession register is generated. Relevant and credible statistical data on collection development can only be created by librarians with sufficient acquisition and cataloguing skills.

  18. Stupid statistics!

    Science.gov (United States)

    Tellinghuisen, Joel

    2008-01-01

    The method of least squares is probably the most powerful data analysis tool available to scientists. Toward a fuller appreciation of that power, this work begins with an elementary review of statistics fundamentals, and then progressively increases in sophistication as the coverage is extended to the theory and practice of linear and nonlinear least squares. The results are illustrated in application to data analysis problems important in the life sciences. The review of fundamentals includes the role of sampling and its connection to probability distributions, the Central Limit Theorem, and the importance of finite variance. Linear least squares is presented using matrix notation, and the significance of the key probability distributions (Gaussian, chi-square, and t) is illustrated with Monte Carlo calculations. The meaning of correlation is discussed, including its role in the propagation of error. When the data themselves are correlated, special methods are needed for the fitting, as they are also when fitting with constraints. Nonlinear fitting gives rise to nonnormal parameter distributions, but the 10% Rule of Thumb suggests that such problems will be insignificant when the parameter is sufficiently well determined. Illustrations include calibration with linear and nonlinear response functions, the dangers inherent in fitting inverted data (e.g., the Lineweaver-Burk equation), an analysis of the reliability of the van't Hoff analysis, the problem of correlated data in the Guggenheim method, and the optimization of isothermal titration calorimetry procedures using the variance-covariance matrix for experiment design. The work concludes with illustrations on assessing and presenting results.
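
    As a small illustration of the matrix formulation reviewed here (using synthetic calibration data, not the life-science examples of the work), ordinary linear least squares and the parameter variance-covariance matrix can be computed as:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic calibration data: y = a + b*x with Gaussian noise
a_true, b_true, sigma = 1.0, 2.5, 0.1
x = np.linspace(0.0, 10.0, 50)
y = a_true + b_true * x + rng.normal(0.0, sigma, x.size)

# Design matrix for the linear model y = a + b*x
X = np.column_stack([np.ones_like(x), x])

# Normal equations in matrix notation: beta = (X^T X)^{-1} X^T y
XtX_inv = np.linalg.inv(X.T @ X)
beta = XtX_inv @ X.T @ y

# Variance-covariance matrix of the parameters: s^2 (X^T X)^{-1},
# with s^2 the residual variance estimate
resid = y - X @ beta
dof = x.size - beta.size
s2 = resid @ resid / dof
cov = s2 * XtX_inv
stderr = np.sqrt(np.diag(cov))

print(beta, stderr)
```

    The off-diagonal elements of `cov` carry the parameter correlation that the review emphasizes for the propagation of error.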

  19. CaFe2O4 as a self-sufficient solar energy converter

    Science.gov (United States)

    Tablero, C.

    2017-10-01

    An ideal solar-energy-to-electricity or fuel converter should work without the use of any external bias potential. An analysis of self-sufficiency when CaFe2O4 is used to absorb sunlight is carried out based on the CaFe2O4 absorption coefficient. We first obtain this coefficient theoretically within the experimental bandgap range in order to fix the interval of possible values of the photocurrents, maximum absorption efficiencies, and photovoltages, and thus that of self-sufficiency, considering only the radiative processes. For single-gap CaFe2O4, we also evaluate an alternative for increasing the photocurrent and maximum absorption efficiency, based on inserting an intermediate band by means of high doping or alloying.

  20. Industrial commodity statistics yearbook 2001. Production statistics (1992-2001)

    International Nuclear Information System (INIS)

    2003-01-01

    This is the thirty-fifth in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to Industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics, by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics, while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title The Growth of World Industry, and the next eight editions under the title Yearbook of Industrial Statistics. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. The statistics refer to the ten-year period 1992-2001 for about 200 countries and areas

  1. Industrial commodity statistics yearbook 2002. Production statistics (1993-2002)

    International Nuclear Information System (INIS)

    2004-01-01

    This is the thirty-sixth in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to Industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics, by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics, while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title 'The Growth of World Industry' and the next eight editions under the title 'Yearbook of Industrial Statistics'. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. The statistics refer to the ten-year period 1993-2002 for about 200 countries and areas

  2. Industrial commodity statistics yearbook 2000. Production statistics (1991-2000)

    International Nuclear Information System (INIS)

    2002-01-01

    This is the thirty-third in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to Industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics, by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics, while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title The Growth of World Industry, and the next eight editions under the title Yearbook of Industrial Statistics. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. Most of the statistics refer to the ten-year period 1991-2000 for about 200 countries and areas

  3. Scientific Opinion on Statistical considerations for the safety evaluation of GMOs

    DEFF Research Database (Denmark)

    Sørensen, Ilona Kryspin

    in the experimental design of field trials, such as the inclusion of commercial varieties, in order to ensure sufficient statistical power and reliable estimation of natural variability. A graphical representation is proposed to allow the comparison of the GMO, its conventional counterpart and the commercial...... such estimates are unavailable may they be estimated from databases or literature. Estimated natural variability should be used to specify equivalence limits to test the difference between the GMO and the commercial varieties. Adjustments to these equivalence limits allow a simple graphical representation so...... in this opinion may be used, in certain cases, for the evaluation of GMOs other than plants....

  4. Statistical distributions of optimal global alignment scores of random protein sequences

    Directory of Open Access Journals (Sweden)

    Tang Jiaowei

    2005-10-01

    Full Text Available Background: The inference of homology from statistically significant sequence similarity is a central issue in sequence alignments. So far the statistical distribution function underlying optimal global alignments has not been completely determined. Results: In this study, random and real but unrelated sequences prepared in six different ways were selected as reference datasets to obtain their respective statistical distributions of global alignment scores. All alignments were carried out with the Needleman-Wunsch algorithm, and optimal scores were fitted to the Gumbel, normal and gamma distributions, respectively. The three-parameter gamma distribution performs best as the theoretical distribution function of global alignment scores, as it agrees very well with the observed distribution of alignment scores. The normal distribution also agrees well with the score frequencies when the shape parameter of the gamma distribution is sufficiently large, for in that regime the normal distribution is a good approximation of the gamma distribution. Conclusion: We have shown that the optimal global alignment scores of random protein sequences fit the three-parameter gamma distribution function. This is useful for the inference of homology between sequences whose relationship is unknown, through the evaluation of gamma-distribution significance between sequences.
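
    The moment relations of the three-parameter gamma make the fitting procedure easy to sketch. The example below uses synthetic gamma-distributed "scores" rather than actual Needleman-Wunsch output, so all parameter values are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-in for optimal global alignment scores of random sequence pairs:
# samples from a three-parameter gamma (shape, location, scale).
shape_t, loc_t, scale_t = 12.0, -30.0, 4.0
scores = loc_t + scale_t * rng.gamma(shape_t, 1.0, size=200_000)

# Method-of-moments fit of the three-parameter gamma, using
#   skewness = 2/sqrt(shape),  variance = shape*scale^2,
#   mean     = loc + shape*scale
m, v = scores.mean(), scores.var()
g = np.mean(((scores - m) / np.sqrt(v)) ** 3)  # sample skewness
shape_f = (2.0 / g) ** 2
scale_f = np.sqrt(v / shape_f)
loc_f = m - shape_f * scale_f

print(shape_f, loc_f, scale_f)
```

    Because the fitted shape parameter is large, the distribution is close to a normal with the same mean and variance, which is exactly the approximation regime noted in the Results.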

  5. Sufficient and necessary condition of separability for generalized Werner states

    International Nuclear Information System (INIS)

    Deng Dongling; Chen Jingling

    2009-01-01

    In a celebrated paper [Optics Communications 179, 447 (2000)], A. O. Pittenger and M. H. Rubin presented for the first time a sufficient and necessary condition of separability for the generalized Werner states. Inspired by their ideas, we generalize their method to a more general case. We obtain a sufficient and necessary condition for the separability of a specific class of states of N d-dimensional systems (qudits), namely the special generalized Werner states (SGWS): W_{d^N}(v) = (1-v) I^{(N)}/d^N + v |ψ_{d^N}⟩⟨ψ_{d^N}|, where |ψ_{d^N}⟩ = Σ_{i=0}^{d-1} α_i |i...i⟩ is an entangled pure state of the N-qudit system and the α_i satisfy two restrictions: (i) Σ_{i=0}^{d-1} α_i α_i* = 1; (ii) the matrix (1/d)(I^{(1)} + T Σ_{i≠j} α_i α_j* |i⟩⟨j|), where T = min_{i≠j} {1/|α_i α_j|}, is a density matrix. Our condition gives quite a simple and efficiently computable way to judge whether a given SGWS is separable or not, and previously known separability conditions are shown to be special cases of our approach
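
    The structure of the SGWS is easy to explore numerically. The sketch below (two qubits, with example amplitudes α_i chosen arbitrarily, not taken from the paper) builds W(v) and uses the Peres-Horodecki (PPT) criterion, which is itself necessary and sufficient for two qubits, as an independent separability check:

```python
import numpy as np

# Illustrative construction (N = 2, d = 2) of the special generalized
# Werner state W(v) = (1 - v) I^{(N)}/d^N + v |psi><psi| with
# |psi> = sum_i alpha_i |i...i>.  The amplitudes are example values.
d, N = 2, 2
alpha = np.array([np.sqrt(0.3), np.sqrt(0.7)])     # sum_i |alpha_i|^2 = 1

psi = np.zeros(d**N)
for i in range(d):
    psi[sum(i * d**k for k in range(N))] = alpha[i]  # position of |i i>

def werner(v):
    """Density matrix of the SGWS at mixing parameter v."""
    return (1 - v) * np.eye(d**N) / d**N + v * np.outer(psi, psi)

def min_pt_eigenvalue(v):
    """Smallest eigenvalue of the partial transpose (PPT criterion).

    For two qubits, PPT is necessary and sufficient for separability,
    so the sign of this eigenvalue decides entanglement."""
    W = werner(v).reshape(d, d, d, d)
    W_pt = W.transpose(0, 3, 2, 1).reshape(d**N, d**N)  # transpose on B
    return np.linalg.eigvalsh(W_pt).min()

# W(v) is a valid state (unit trace, positive semidefinite) ...
assert np.isclose(np.trace(werner(0.4)), 1.0)
assert np.linalg.eigvalsh(werner(0.4)).min() >= -1e-12

# ... and crosses from separable to entangled as v grows; here the
# threshold is v* = 1/(1 + 4*alpha_0*alpha_1) ~ 0.353.
print(min_pt_eigenvalue(0.30), min_pt_eigenvalue(0.40))
```

    For this family the smallest partial-transpose eigenvalue is (1−v)/4 − v·α_0α_1, so the state crosses from separable to entangled at v = 1/(1 + 4α_0α_1); the general condition of the paper is not implemented here.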

  6. PENENTUAN CADANGAN PREMI DENGAN METODE PREMIUM SUFFICIENCY PADA ASURANSI JIWA SEUMUR HIDUP JOINT LIFE

    OpenAIRE

    NI PUTU MIRAH PERMATASARI; I NYOMAN WIDANA; KARTIKA SARI

    2016-01-01

    The aim of this research was to derive the formula for premium reserves using the premium sufficiency method. The premium reserve is the amount of funds collected by the insurance company in preparation for the payment of claims. The premium sufficiency method is a gross premium calculation. To construct the formula, this research used the Tabel Mortalitas Indonesia (TMI) 2011, an interest rate of 2.5%, and a cost of alpha %. Based on the simulation results, the premium reserve value for men of age 1 of 56 years propotio...

  7. The effect of formal training of cardiopulmonary resuscitation (CPR skills on medical students perceived self-sufficiency

    Directory of Open Access Journals (Sweden)

    Shaghaghi A

    2004-07-01

    Full Text Available Background: Experience of cardiopulmonary resuscitation (CPR) in a real clinical setting is not easily possible for all medical students. Purpose: To assess medical students' perceived self-sufficiency in three procedural skills during internship courses after they had taken a training course in the clerkship period. Methods: Forty-three medical students who had attended a workshop on CPR, tracheal intubation and venipuncture answered questionnaires on their perceived self-sufficiency in performing these procedures after serving a few months as interns. Results: The mean score for perceived self-sufficiency (PSS) was 75.84 (±18.63). There was a high correlation between the score given for the applicability of training in real-life situations and the stress-reduction scores on first performing the procedure. Conclusion: The high degree of correlation between PSS scores and applicability scores may warrant the consideration of new methods in procedural skills training. Keywords: SKILL TRAINING, CPR TRAINING, PERCEIVED SELF-SUFFICIENCY

  8. Harmonic statistics

    International Nuclear Information System (INIS)

    Eliazar, Iddo

    2017-01-01

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.

  10. Statistical mechanics for a class of quantum statistics

    International Nuclear Information System (INIS)

    Isakov, S.B.

    1994-01-01

    Generalized statistical distributions for identical particles are introduced for the case in which the filling of a single-particle quantum state depends on the filling of states of different momenta. The system of one-dimensional bosons with a two-body potential that can be solved by means of the thermodynamic Bethe ansatz is shown to be thermodynamically equivalent to a system of free particles obeying statistical distributions of the above class. The quantum statistics arising in this way are completely determined by the two-particle scattering phases of the corresponding interacting systems. An equation determining the statistical distributions for these statistics is derived

  11. Operation of CANDU power reactor in thorium self-sufficient fuel cycle

    Indian Academy of Sciences (India)

    This paper presents the results of calculations for CANDU reactor operation in a thorium fuel cycle. Calculations are performed to estimate the feasibility of operating a heavy-water thermal-neutron power reactor in a self-sufficient thorium cycle. Parameters of the active core and the scheme of fuel reloading were considered to be the ...

  12. Sufficiency and Duality for Multiobjective Programming under New Invexity

    Directory of Open Access Journals (Sweden)

    Yingchun Zheng

    2016-01-01

    Full Text Available A class of multiobjective programming problems with inequality constraints is considered. To this aim, some new concepts of generalized (F, P)-type I and (F, P)-type II functions are introduced under differentiability assumptions by using the sublinear function F. These new functions are used to establish and prove sufficient optimality conditions for weak efficiency or efficiency of multiobjective programming problems. Moreover, two kinds of dual models are formulated. Weak, strong, and strict converse duality results are obtained for the aforesaid functions.

  13. LDR: A Package for Likelihood-Based Sufficient Dimension Reduction

    Directory of Open Access Journals (Sweden)

    R. Dennis Cook

    2011-03-01

    Full Text Available We introduce a new MATLAB software package that implements several recently proposed likelihood-based methods for sufficient dimension reduction. Current capabilities include estimation of reduced subspaces with a fixed dimension d, as well as estimation of d by use of likelihood-ratio testing, permutation testing and information criteria. The methods are suitable for preprocessing data for both regression and classification. Implementations of related estimators are also available. Although the software is oriented mainly to command-line operation, a graphical user interface is also provided for prototype computations.

  14. Statistics with JMP graphs, descriptive statistics and probability

    CERN Document Server

    Goos, Peter

    2015-01-01

    Peter Goos, Department of Statistics, University of Leuven, Faculty of Bio-Science Engineering, and University of Antwerp, Faculty of Applied Economics, Belgium. David Meintrup, Department of Mathematics and Statistics, University of Applied Sciences Ingolstadt, Faculty of Mechanical Engineering, Germany. A thorough presentation of introductory statistics and probability theory, with numerous examples and applications using JMP. Descriptive Statistics and Probability provides an accessible and thorough overview of the most important descriptive statistics for nominal, ordinal and quantitative data, with partic

  15. Process simulation and statistical approaches for validating waste form qualification models

    International Nuclear Information System (INIS)

    Kuhn, W.L.; Toland, M.R.; Pulsipher, B.A.

    1989-05-01

    This report describes recent progress toward one of the principal objectives of the Nuclear Waste Treatment Program (NWTP) at the Pacific Northwest Laboratory (PNL): to establish relationships between vitrification process control and glass product quality. During testing of a vitrification system, it is important to show that departures affecting the product quality can be sufficiently detected through process measurements to prevent an unacceptable canister from being produced. Meeting this goal is a practical definition of a successful sampling, data analysis, and process control strategy. A simulation model has been developed and preliminarily tested by applying it to approximate operation of the West Valley Demonstration Project (WVDP) vitrification system at West Valley, New York. Multivariate statistical techniques have been identified and described that can be applied to analyze large sets of process measurements. Information on components, tanks, and time is then combined to create a single statistic through which all of the information can be used at once to determine whether the process has shifted away from a normal condition
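
    The report does not name the single combined statistic, but a standard way to fold many correlated process measurements into one monitoring quantity is Hotelling's T². A minimal sketch with simulated (entirely hypothetical) in-control data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical in-control process measurements: p correlated variables
# (e.g. feed composition readings) observed over n runs.  Hotelling's
# T^2 is shown as a standard multivariate monitoring statistic, not as
# the specific statistic of the report.
p, n = 4, 500
A = rng.normal(size=(p, p))
cov_true = A @ A.T + p * np.eye(p)          # well-conditioned covariance
mean_true = np.array([10.0, 5.0, 2.0, 7.0])
X = rng.multivariate_normal(mean_true, cov_true, size=n)

# Baseline estimated from the in-control data
mu = X.mean(axis=0)
S_inv = np.linalg.inv(np.cov(X, rowvar=False))

def t_squared(x):
    """Hotelling's T^2 of a new measurement vector: its squared
    Mahalanobis distance from the in-control baseline."""
    d = x - mu
    return d @ S_inv @ d

normal_run = mean_true + 0.1                              # tiny drift
shifted_run = mean_true + 5 * np.sqrt(np.diag(cov_true))  # strong shift

print(t_squared(normal_run), t_squared(shifted_run))
```

    In practice T² would be compared against an F- or chi-square-based control limit to decide whether the process has shifted away from its normal condition.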

  16. ANALYSIS OF SUFFICIENCY OF THE BEARING CAPACITY OF BUILDING STRUCTURES OF OPERATING SITES OF MAIN BUILDINGS OF THERMAL POWER PLANTS

    Directory of Open Access Journals (Sweden)

    Alekseeva Ekaterina Leonidovna

    2012-10-01

    Full Text Available Based on the examination of eleven main buildings of power plants, an analysis of defects and damage to building structures was performed. The damageability of the principal bearing structures of the main buildings of thermal plants was then analyzed. It was found that the fastest-growing defects and damage were concentrated in the structures of operating sites. Research into the rate of development of the most frequent damage and defects led to the conclusion that internal corrosion of the reinforcing steel was the most dangerous defect for the reinforced concrete elements of operating sites. Methods of mathematical statistics were applied to identify the pattern of corrosion development of the reinforcing steel inside the reinforced concrete elements of the floors of operating sites. It was found that the probability of corrosion of reinforced concrete elements of operating sites is distributed in accordance with the exponential law. Based on these data, the strength of the reinforced concrete slabs and metal beams was calculated in terms of their regular sections, given the actual loads and the realistic condition of the structures. As a result, a dependence between the bearing-capacity reserve ratio and the corrosion development pattern was identified for the reinforced concrete slabs and metal beams of operating sites. In order to analyze the sufficiency of the bearing capacity of the building structures of operating sites in relation to their time in commission, equations were derived to identify the nature of the dependence between the sufficiency of the bearing capacity of the reinforced concrete slabs and metal beams of the operating sites and their time in commission.

  17. Prediction of noise in ships by the application of “statistical energy analysis.”

    DEFF Research Database (Denmark)

    Jensen, John Ødegaard

    1979-01-01

    If the noise level in the accommodation on board ships is to be reduced effectively, by introducing appropriate noise abatement measures already at an early design stage, it is quite essential that sufficiently accurate prediction methods are available to the naval architects...... or for a special noise abatement measure, e.g., increased structural damping. The paper discusses whether it might be possible to derive an alternative calculation model based on the "statistical energy analysis" (SEA) approach. By considering the hull of a ship to be constructed from plate elements connected
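
    A statistical energy analysis model of this kind reduces, in its simplest form, to a linear power balance among subsystem energies. The two-subsystem sketch below uses purely illustrative loss factors (none of them taken from the paper):

```python
import numpy as np

# Minimal two-subsystem SEA sketch (e.g. two coupled hull plate
# elements).  Power balance in a frequency band centred at omega:
#   P_i = omega * (eta_i*E_i + eta_ij*E_i - eta_ji*E_j)
# All loss factors below are illustrative values, not measured data.
omega = 2 * np.pi * 1000.0       # 1 kHz band centre
eta = np.array([0.01, 0.02])     # internal (damping) loss factors
eta12, eta21 = 0.003, 0.001      # coupling loss factors
P = np.array([1.0, 0.0])         # 1 W injected into subsystem 1 only

# Assemble the linear system  C @ E = P / omega  and solve for energies
C = np.array([
    [eta[0] + eta12, -eta21],
    [-eta12, eta[1] + eta21],
])
E = np.linalg.solve(C, P / omega)

ratio = E[1] / E[0]   # receiving-to-source energy ratio
print(E, ratio)
```

    In steady state the injected power equals the total dissipated power ω(η₁E₁ + η₂E₂), and the energy ratio η₁₂/(η₂ + η₂₁) shows directly how increased structural damping in the receiving subsystem limits noise transmission.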

  18. A blueprint for complete energy self-sufficiency in British Columbia

    International Nuclear Information System (INIS)

    2007-01-01

    The Endless Energy Project is a partnership between the Globe Foundation, BC Hydro, Day 4 Energy, the Power Technology Alliance, the National Research Council of Canada, and Western Economic Diversification. The purpose of the project is to examine British Columbia's potential to be energy self-sufficient from renewable sources by 2025. Background information on the Endless Energy Project was presented with reference to energy use in all sectors of the economy and energy supply from all sources indigenous to the province. The report discussed global drivers and scenarios as well as energy use trends specific to British Columbia. These trends were related to energy use for residential buildings; commercial sector; domestic transportation; gateway transportation; and industrial sources. The report also provided an outlook for each of these sectors. A large-scale supply outlook was also described for solar; geothermal; wind; hydro; biomass; forest waste to energy potential; ocean wave energy potential; and tidal current systems. The report concluded with a discussion of matching renewable energy supplies to demand. It was concluded that based on a combination of renewable energy supply, cleaner burning fuels, such as hydrogen and ethanol, and energy use reduction in homes, businesses, and public sector operations, British Columbia could reasonably achieve energy self-sufficiency by 2025. tabs., figs

  19. Revenue Sufficiency and Reliability in a Zero Marginal Cost Future: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Frew, Bethany A.; Milligan, Michael; Brinkman, Greg; Bloom, Aaron; Clark, Kara; Denholm, Paul

    2016-12-01

    Features of existing wholesale electricity markets, such as administrative pricing rules and policy-based reliability standards, can distort market incentives from allowing generators sufficient opportunities to recover both fixed and variable costs. Moreover, these challenges can be amplified by other factors, including (1) inelastic demand resulting from a lack of price signal clarity, (2) low- or near-zero marginal cost generation, particularly arising from low natural gas fuel prices and variable generation (VG), such as wind and solar, and (3) the variability and uncertainty of this VG. As power systems begin to incorporate higher shares of VG, many questions arise about the suitability of the existing marginal-cost-based price formation, primarily within an energy-only market structure, to ensure the economic viability of resources that might be needed to provide system reliability. This article discusses these questions and provides a summary of completed and ongoing modelling-based work at the National Renewable Energy Laboratory to better understand the impacts of evolving power systems on reliability and revenue sufficiency.

  20. Statistics Anxiety and Business Statistics: The International Student

    Science.gov (United States)

    Bell, James A.

    2008-01-01

    Does the international student suffer from statistics anxiety? To investigate this, the Statistics Anxiety Rating Scale (STARS) was administered to sixty-six beginning statistics students, including twelve international students and fifty-four domestic students. Due to the small number of international students, nonparametric methods were used to…

  1. Correlation, necessity, and sufficiency: Common errors in the scientific reasoning of undergraduate students for interpreting experiments.

    Science.gov (United States)

    Coleman, Aaron B; Lam, Diane P; Soowal, Lara N

    2015-01-01

    Gaining an understanding of how science works is central to an undergraduate education in biology and biochemistry. The reasoning required to design or interpret experiments that ask specific questions does not come naturally, and is an essential part of the science process skills that must be learned for an understanding of how scientists conduct research. Gaps in these reasoning skills make it difficult for students to become proficient in reading primary scientific literature. In this study, we assessed the ability of students in an upper-division biochemistry laboratory class to use the concepts of correlation, necessity, and sufficiency in interpreting experiments presented in a format and context that is similar to what they would encounter when reading a journal article. The students were assessed before and after completion of a laboratory module where necessary vs. sufficient reasoning was used to design and interpret experiments. The assessment identified two types of errors that were commonly committed by students when interpreting experimental data. When presented with an experiment that only establishes a correlation between a potential intermediate and a known effect, students frequently interpreted the intermediate as being sufficient (causative) for the effect. Also, when presented with an experiment that tests only necessity for an intermediate, they frequently made unsupported conclusions about sufficiency, and vice versa. Completion of the laboratory module and instruction in necessary vs. sufficient reasoning showed some promise for addressing these common errors. © 2015 The International Union of Biochemistry and Molecular Biology.

  2. Validation of statistical models for creep rupture by parametric analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bolton, J., E-mail: john.bolton@uwclub.net [65, Fisher Ave., Rugby, Warks CV22 5HW (United Kingdom)

    2012-01-15

    Statistical analysis is an efficient method for the optimisation of any candidate mathematical model of creep rupture data, and for the comparative ranking of competing models. However, when a series of candidate models has been examined and the best of the series has been identified, there is no statistical criterion to determine whether a yet more accurate model might be devised. Hence there remains some uncertainty that the best of any series examined is sufficiently accurate to be considered reliable as a basis for extrapolation. This paper proposes that models should be validated primarily by parametric graphical comparison to rupture data and rupture gradient data. It proposes that no mathematical model should be considered reliable for extrapolation unless the visible divergence between model and data is so small as to leave no apparent scope for further reduction. This study is based on the data for a 12% Cr alloy steel used in BS PD6605:1998 to exemplify its recommended statistical analysis procedure. The models considered in this paper include a) a relatively simple model, b) the PD6605 recommended model and c) a more accurate model of somewhat greater complexity. - Highlights: ▶ The paper discusses the validation of creep rupture models derived from statistical analysis. ▶ It demonstrates that models can be satisfactorily validated by a visual-graphic comparison of models to data. ▶ The method proposed utilises test data both as conventional rupture stress and as rupture stress gradient. ▶ The approach is shown to be more reliable than a well-established and widely used method (BS PD6605).
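The parametric-comparison idea can be sketched numerically: evaluate each candidate model against the rupture data and against finite-difference gradients of the data, and inspect the largest divergence. A minimal pure-Python sketch with hypothetical data and candidate models (not the PD6605 models or the 12% Cr data from the paper):

```python
import math

# Hypothetical rupture data: (stress, log10 time-to-rupture) at a fixed temperature
data = [(100, 4.80), (150, 4.22), (200, 3.81), (250, 3.49), (300, 3.23)]

def model_a(stress):
    """Candidate A: log t = a - b * stress (linear in stress)."""
    return 5.45 - 0.0076 * stress

def model_b(stress):
    """Candidate B: log t = a - b * log10(stress)."""
    return 11.36 - 3.28 * math.log10(stress)

def max_divergence(model, data):
    """Largest visible gap between the model curve and the data points."""
    return max(abs(model(s) - t) for s, t in data)

def max_gradient_divergence(model, data):
    """Same comparison applied to finite-difference gradients d(log t)/d(stress)."""
    gaps = []
    for (s1, t1), (s2, t2) in zip(data, data[1:]):
        data_grad = (t2 - t1) / (s2 - s1)
        model_grad = (model(s2) - model(s1)) / (s2 - s1)
        gaps.append(abs(model_grad - data_grad))
    return max(gaps)

for name, m in (("A", model_a), ("B", model_b)):
    print(name, round(max_divergence(m, data), 4),
          round(max_gradient_divergence(m, data), 5))
```

Here the data were generated from a law of model B's form, so B's divergence is small on both measures while A's is visibly larger, the kind of gap the paper argues should be inspected graphically before trusting a model for extrapolation.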

  3. Different doses of supplemental vitamin D maintain interleukin-5 without altering skeletal muscle strength: a randomized, double-blind, placebo-controlled study in vitamin D sufficient adults

    Directory of Open Access Journals (Sweden)

    Barker Tyler

    2012-03-01

    Full Text Available Abstract Background Supplemental vitamin D modulates inflammatory cytokines and skeletal muscle function, but results are inconsistent. It is unknown whether these inconsistencies depend on the supplemental dose of vitamin D. Therefore, the purpose of this study was to identify the influence of different doses of supplemental vitamin D on inflammatory cytokines and muscular strength in young adults. Methods Men (n = 15) and women (n = 15) received a daily placebo or vitamin D supplement (200 or 4000 IU) for 28 d during the winter. Serum 25-hydroxyvitamin D (25(OH)D), cytokine concentrations and muscular (leg) strength measurements were performed prior to and during supplementation. Statistical significance of the data was assessed with a two-way (time, treatment) analysis of variance (ANOVA) with repeated measures, followed by Tukey's Honestly Significant Difference test for multiple pairwise comparisons. Results Upon enrollment, 63% of the subjects were vitamin D sufficient (serum 25(OH)D ≥ 30 ng/ml). Serum 25(OH)D and interleukin (IL)-5 decreased significantly in the placebo group. Conclusion In young adults who were vitamin D sufficient prior to supplementation, we conclude that a low daily dose of supplemental vitamin D prevents decreases in serum 25(OH)D and IL-5 concentrations, and that muscular strength does not parallel the 25(OH)D increase induced by a high daily dose of supplemental vitamin D. Considering that IL-5 protects against viruses and bacterial infections, these findings could have a broad physiological importance regarding the ability of vitamin D sufficiency to mediate the immune system's protection against infection.

  4. Spreadsheets as tools for statistical computing and statistics education

    OpenAIRE

    Neuwirth, Erich

    2000-01-01

    Spreadsheets are a ubiquitous program category, and we will discuss their use in statistics and statistics education on various levels, ranging from very basic examples to extremely powerful methods. Since the spreadsheet paradigm is very familiar to many potential users, using it as the interface to statistical methods can make statistics more easily accessible.

  5. A Monte Carlo Simulation Comparing the Statistical Precision of Two High-Stakes Teacher Evaluation Methods: A Value-Added Model and a Composite Measure

    Science.gov (United States)

    Spencer, Bryden

    2016-01-01

    Value-added models are a class of growth models used in education to assign responsibility for student growth to teachers or schools. For value-added models to be used fairly, sufficient statistical precision is necessary for accurate teacher classification. Previous research indicated precision below practical limits. An alternative approach has…

  6. A comparison of statistical methods for identifying out-of-date systematic reviews.

    Directory of Open Access Journals (Sweden)

    Porjai Pattanittum

    Full Text Available BACKGROUND: Systematic reviews (SRs) can provide accurate and reliable evidence, typically about the effectiveness of health interventions. Evidence is dynamic, and if SRs are out-of-date this information may not be useful; it may even be harmful. This study aimed to compare five statistical methods to identify out-of-date SRs. METHODS: A retrospective cohort of SRs registered in the Cochrane Pregnancy and Childbirth Group (CPCG), published between 2008 and 2010, were considered for inclusion. For each eligible CPCG review, data were extracted and "3-years previous" meta-analyses were assessed for the need to update, given the data from the most recent 3 years. Each of the five statistical methods was used, with random effects analyses throughout the study. RESULTS: Eighty reviews were included in this study; most were in the area of induction of labour. The numbers of reviews identified as being out-of-date using the Ottawa, recursive cumulative meta-analysis (CMA), and Barrowman methods were 34, 7, and 7 respectively. No reviews were identified as being out-of-date using the simulation-based power method, or the CMA for sufficiency and stability method. The overall agreement among the three discriminating statistical methods was slight (Kappa = 0.14; 95% CI 0.05 to 0.23). The recursive cumulative meta-analysis, Ottawa, and Barrowman methods were practical according to the study criteria. CONCLUSION: Our study shows that three practical statistical methods could be applied to examine the need to update SRs.
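For agreement between two methods at a time, Cohen's kappa can be computed directly from paired out-of-date decisions. A minimal pure-Python sketch with illustrative data (not the study's reviews, which reported agreement across three methods):

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa: chance-corrected agreement between two raters/methods."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    # Observed proportion of agreement
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Expected agreement by chance, from the marginal label frequencies
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    labels = set(freq_a) | set(freq_b)
    expected = sum(freq_a[label] * freq_b[label] for label in labels) / n ** 2
    return (observed - expected) / (1 - expected)

# Illustrative: 10 reviews flagged out-of-date (1) or not (0) by two methods
method_1 = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]
method_2 = [1, 0, 1, 0, 0, 0, 0, 1, 0, 0]
print(round(cohens_kappa(method_1, method_2), 3))
```

A value near 0 indicates agreement no better than chance; the study's Kappa = 0.14 falls in the conventional "slight agreement" band.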

  7. Statistical mechanical analysis of LMFBR fuel cladding tubes

    International Nuclear Information System (INIS)

    Poncelet, J.-P.; Pay, A.

    1977-01-01

    The most important design requirement on fuel pin cladding for LMFBRs is its mechanical integrity. Disruptive factors include internal pressure from mixed oxide fuel fission gas release, thermal stresses and high-temperature creep, neutron-induced differential void-swelling as a source of stress in the cladding, irradiation creep of the stainless steel material, and corrosion by fission products. Under irradiation these load-restraining mechanisms are accentuated by stainless steel embrittlement and strength alterations. To account for the numerous uncertainties involved in the analysis by theoretical models and computer codes, statistical tools such as Monte Carlo simulation methods are unavoidably required. Thanks to these techniques, uncertainties in nominal characteristics, material properties and environmental conditions can be linked up in a correct way and used for a more accurate conceptual design. First, a thermal creep damage index is set up through a sufficiently sophisticated physical analysis of the clad, including arbitrary time dependence of power and neutron flux as well as effects of sodium temperature, burnup and steel mechanical behavior. Although this strain limit approach implies a more general but time-consuming model, in return the net output is improved and, e.g., clad temperature, stress and strain maxima may be easily assessed. A full spectrum of variables is statistically treated to account for their probability distributions. A creep damage probability may be obtained and can contribute to a quantitative estimation of fuel failure probability.
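The Monte Carlo approach described above can be sketched as follows: sample the uncertain inputs from assumed distributions, propagate them through a damage model, and estimate a damage probability as the fraction of trials exceeding a threshold. The damage index and the distributions below are hypothetical placeholders, not the clad model or data from the paper:

```python
import random

def creep_damage_index(stress_mpa, temp_k, time_h):
    """Hypothetical damage index: grows with stress, temperature and time.
    Illustrative only; not the physical model used in the study."""
    return (stress_mpa / 100.0) ** 2 * (temp_k / 900.0) ** 4 * (time_h / 10000.0)

def monte_carlo_damage_probability(n_trials=100_000, threshold=1.0, seed=42):
    """Fraction of sampled operating conditions whose damage index exceeds the limit."""
    rng = random.Random(seed)  # seeded for reproducibility
    exceedances = 0
    for _ in range(n_trials):
        # Sample uncertain inputs from assumed (hypothetical) distributions
        stress = rng.gauss(90.0, 10.0)    # MPa
        temp = rng.gauss(880.0, 15.0)     # K
        time = rng.uniform(8000, 12000)   # hours
        if creep_damage_index(stress, temp, time) > threshold:
            exceedances += 1
    return exceedances / n_trials

print(f"estimated damage probability: {monte_carlo_damage_probability():.3f}")
```

In a real analysis the input distributions would come from measured tolerances and material-property scatter, and the damage index from the thermal creep model described in the abstract.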

  8. PENENTUAN CADANGAN PREMI DENGAN METODE PREMIUM SUFFICIENCY PADA ASURANSI JIWA SEUMUR HIDUP JOINT LIFE

    Directory of Open Access Journals (Sweden)

    NI PUTU MIRAH PERMATASARI

    2016-08-01

    Full Text Available The aim of this research was to obtain a formula for premium reserves through the premium sufficiency method. The premium reserve is the amount of funds collected by the insurance company in preparation for the payment of claims. The premium sufficiency method is a gross premium calculation. To construct the formula, this research used the Tabel Mortalitas Indonesia (TMI) 2011, an interest rate of 2.5% and a cost of alpha %. Based on the simulation results for men, the premium reserve value from age 1 to 56 years is proportional to the insured period, but after 56 years the premium reserve value increases.
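The reserve idea can be sketched with the standard recursive reserve relation; the version below is a net-level-premium recursion with hypothetical inputs, not the gross-premium (premium sufficiency) formula or the TMI 2011 mortality table used in the paper:

```python
def reserve_path(premium, benefit, interest, q):
    """Recursive premium reserve:
        V_{t+1} = ((V_t + P) * (1 + i) - q_t * B) / (1 - q_t)
    where P is the annual premium, i the interest rate, q_t the mortality
    rate in year t, and B the death benefit. Hypothetical illustration."""
    v = 0.0
    path = [v]
    for q_t in q:
        v = ((v + premium) * (1 + interest) - q_t * benefit) / (1 - q_t)
        path.append(v)
    return path

# Hypothetical: benefit 1000, interest 2.5%, flat mortality 1%/year, premium 15
path = reserve_path(premium=15.0, benefit=1000.0, interest=0.025, q=[0.01] * 5)
print([round(v, 2) for v in path])
```

With a sufficient premium the reserve accumulates year on year; a gross-premium version would add expense loadings (the "alpha" cost in the abstract) to each year's outgo.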

  9. Register-based statistics statistical methods for administrative data

    CERN Document Server

    Wallgren, Anders

    2014-01-01

    This book provides a comprehensive and up-to-date treatment of theory and practical implementation in register-based statistics. It begins by defining the area, before explaining how to structure such systems, as well as detailing alternative approaches. It explains how to create statistical registers, how to implement quality assurance, and the use of IT systems for register-based statistics. Further to this, clear details are given about the practicalities of implementing such statistical methods, such as protection of privacy and the coordination and coherence of such an undertaking.

  10. Cancer Statistics

    Science.gov (United States)

    Cancer has a major impact on society in ... success of efforts to control and manage cancer. Statistics at a Glance: The Burden of Cancer in ...

  11. Energy Strategic Planning & Sufficiency Project

    Energy Technology Data Exchange (ETDEWEB)

    Retziaff, Greg

    2005-03-30

    This report provides information regarding options available, their advantages and disadvantages, and the costs for pursuing activities to advance Smith River Rancheria toward an energy program that reduces their energy costs, allows greater self-sufficiency and stimulates economic development and employment opportunities within and around the reservation. The primary subjects addressed in this report are as follows: (1) Baseline Assessment of Current Energy Costs--An evaluation of the historical energy costs for Smith River was conducted to identify the costs for each component of their energy supply to better assess changes that can be considered for energy cost reductions. (2) Research Viable Energy Options--This includes a general description of many power generation technologies and identification of their relative costs, advantages and disadvantages. Through this research the generation technology options that are most suited for this application were identified. (3) Project Development Considerations--The basic steps and associated challenges of developing a generation project utilizing the selected technologies are identified and discussed. This included items like selling to third parties, wheeling, electrical interconnections, fuel supply, permitting, standby power, and transmission studies. (4) Energy Conservation--The myriad of federal, state and utility programs offered for low-income weatherization and utility bill payment assistance are identified, their qualification requirements discussed, and the subsequent benefits outlined. (5) Establishing an Energy Organization--The report includes a high level discussion of formation of a utility to serve the Tribal membership. The value or advantages of such action is discussed along with some of the challenges. (6) Training--Training opportunities available to the Tribal membership are identified.

  12. Food Self-Sufficiency across scales: How local can we go?

    Science.gov (United States)

    Pradhan, Prajal; Lüdeke, Matthias K. B.; Reusser, Dominik E.; Kropp, Jürgen P.

    2013-04-01

    "Think global, act local" is a phrase often used in sustainability debates. Here, we explore the potential of regions to go for local supply in the context of sustainable food consumption, considering both the present state and plausible future scenarios. We analyze data on gridded crop calorie production, gridded livestock calorie production, gridded feed calorie use and gridded food calorie consumption at 5' resolution. We derived these gridded data from various sources: Global Agro-ecological Zones (GAEZ v3.0), Gridded Livestock of the World (GLW), FAOSTAT, and the Global Rural-Urban Mapping Project (GRUMP). For the scenario analysis, we considered changes in population, dietary patterns and the possibility of obtaining the maximum potential yield. We investigate food self-sufficiency at multiple spatial scales. We start from the 5' resolution (i.e. around 10 km x 10 km at the equator) and look at 8 levels of aggregation, ranging from the lowest plausible administrative level to the continental level. Results for the different spatial scales show that about 1.9 billion people live in 5'-resolution areas where enough calories can be produced to sustain their food consumption and feed use. At the country level, about 4.4 billion people can be sustained without international food trade. For about 1 billion people in Asia and Africa, cross-continental food trade is needed. However, if the maximum potential crop yield were achieved, about 2.6 billion people could be sustained within their 5'-resolution living area. Furthermore, Africa and Asia could be food self-sufficient by achieving their maximum potential crop yield, and only around 630 million people would depend on international food trade. However, the food self-sufficiency status might differ when future changes in population, dietary patterns and climatic conditions are considered. We provide an initial approach for investigating the
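The core self-sufficiency test (production must cover consumption within the chosen spatial unit) can be sketched at two aggregation levels with hypothetical grid cells; the numbers are illustrative, not the study's gridded data:

```python
# Each cell: (region_id, production_kcal, consumption_kcal) -- hypothetical values
cells = [
    ("A", 120, 100), ("A", 40, 90),
    ("B", 200, 80), ("B", 10, 70),
]

def self_sufficient_cells(cells):
    """Cell level: each grid cell must cover its own consumption."""
    return [c for c in cells if c[1] >= c[2]]

def self_sufficient_regions(cells):
    """Aggregated level: production and consumption are summed per region,
    so surplus cells can compensate deficit cells via within-region trade."""
    totals = {}
    for region, prod, cons in cells:
        p, c = totals.get(region, (0, 0))
        totals[region] = (p + prod, c + cons)
    return [r for r, (p, c) in totals.items() if p >= c]

print(len(self_sufficient_cells(cells)))  # cells covering their own demand
print(self_sufficient_regions(cells))     # regions self-sufficient after aggregation
```

Note how aggregation changes the answer in both directions: region B becomes self-sufficient thanks to one surplus cell, while region A's surplus cell cannot cover the regional deficit. This is the scale effect the study quantifies from 5' cells up to continents.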

  13. High spatial validity is not sufficient to elicit voluntary shifts of attention.

    Science.gov (United States)

    Pauszek, Joseph R; Gibson, Bradley S

    2016-10-01

    Previous research suggests that the use of valid symbolic cues is sufficient to elicit voluntary shifts of attention. The present study interpreted this previous research within a broader theoretical context which contends that observers will voluntarily use symbolic cues to orient their attention in space when the temporal costs of using the cues are perceived to be less than the temporal costs of searching without the aid of the cues. In this view, previous research has not addressed the sufficiency of valid symbolic cues, because the temporal cost of using the cues is usually incurred before the target display appears. To address this concern, 70%-valid spatial word cues were presented simultaneously with a search display. In addition, other research suggests that opposing cue-dependent and cue-independent spatial biases may operate in these studies and alter standard measures of orienting. After identifying and controlling these opposing spatial biases, the results of two experiments showed that the word cues did not elicit voluntary shifts of attention when the search task was relatively easy but did when the search task was relatively difficult. Moreover, the findings also showed that voluntary use of the word cues changed over the course of the experiment when the task was difficult, presumably because the temporal cost of searching without the cue lessened as the task got easier with practice. Altogether, the present findings suggested that the factors underlying voluntary control are multifaceted and contextual, and that spatial validity alone is not sufficient to elicit voluntary shifts of attention.

  14. Software Used to Generate Cancer Statistics - SEER Cancer Statistics

    Science.gov (United States)

    Videos that highlight topics and trends in cancer statistics and definitions of statistical terms. Also software tools for analyzing and reporting cancer statistics, which are used to compile SEER's annual reports.

  15. Cloud-Based Parameter-Driven Statistical Services and Resource Allocation in a Heterogeneous Platform on Enterprise Environment

    Directory of Open Access Journals (Sweden)

    Sungju Lee

    2016-09-01

    Full Text Available A fundamental need for enterprise users is a cloud-based parameter-driven statistical service, which has had a substantial impact on companies worldwide. In this paper, we demonstrate statistical analysis for certain criteria related to the data, applied on the cloud server for a comparison of results. In addition, we present a statistical analysis and cloud-based resource allocation method for a heterogeneous platform environment, performing a data and information analysis that takes the application workload and the server capacity into consideration, and subsequently propose a service prediction model using a polynomial regression model. In particular, our aim is to provide stable service in a given large-scale enterprise cloud computing environment. The virtual machines (VMs) for cloud-based services are assigned to each server with a special methodology to satisfy the uniform utilization distribution model. It is also implemented between users and the platform, which is a main idea of our cloud computing system. Based on the experimental results, we confirm that our prediction model can provide sufficient resources for statistical services to large-scale users while satisfying the uniform utilization distribution.
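A polynomial-regression prediction model of the kind mentioned can be sketched in pure Python via the normal equations of least squares; the workload-versus-demand numbers below are hypothetical, not the paper's measurements:

```python
def polyfit(xs, ys, degree):
    """Least-squares polynomial fit via the normal equations
    (V^T V) a = V^T y for the Vandermonde matrix V, solved by
    Gaussian elimination with partial pivoting."""
    n = degree + 1
    A = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(n)]
    # Forward elimination
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[pivot] = A[pivot], A[col]
        b[col], b[pivot] = b[pivot], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back substitution
    coeffs = [0.0] * n
    for i in reversed(range(n)):
        coeffs[i] = (b[i] - sum(A[i][j] * coeffs[j] for j in range(i + 1, n))) / A[i][i]
    return coeffs  # [a0, a1, ..., a_degree]

# Hypothetical workload (x: concurrent users) vs. resource demand y = 2 + 0.5x + 0.1x^2
xs = [1, 2, 3, 4, 5, 6]
ys = [2 + 0.5 * x + 0.1 * x * x for x in xs]
a0, a1, a2 = polyfit(xs, ys, 2)
print(round(a0, 3), round(a1, 3), round(a2, 3))
```

Once fitted, the coefficients predict resource demand for unseen workloads, which is the basis for deciding how many VMs to provision per server.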

  16. A Sufficient Condition for an Interval Matrix to have Full Column Rank

    Czech Academy of Sciences Publication Activity Database

    Rohn, Jiří

    2017-01-01

    Roč. 22, č. 2 (2017), s. 59-66 ISSN 1560-7534 Institutional support: RVO:67985807 Keywords : interval matrix * full column rank * sufficient condition * double condition Subject RIV: BA - General Mathematics OBOR OECD: Applied mathematics http://www.ict.nsc.ru/jct/annotation/1779?l=eng

  17. Understanding Statistics and Statistics Education: A Chinese Perspective

    Science.gov (United States)

    Shi, Ning-Zhong; He, Xuming; Tao, Jian

    2009-01-01

    In recent years, statistics education in China has made great strides. However, there still exists a fairly large gap with the advanced levels of statistics education in more developed countries. In this paper, we identify some existing problems in statistics education in Chinese schools and make some proposals as to how they may be overcome. We…

  18. Mathematical statistics

    CERN Document Server

    Pestman, Wiebe R

    2009-01-01

    This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.

  19. Financial Consumer Protection in the EU : Towards a Self-Sufficient European Contract Law for Consumer Financial Services?

    NARCIS (Netherlands)

    Cherednychenko, O.O.

    2014-01-01

    The rapid expansion of European contract law in the field of consumer financial services gives rise to the question to what extent it is self-sufficient. A self-sufficient European contract law presupposes the existence of an EU-made and EU-enforced contract-related legal order which is largely

  20. Sampling, Probability Models and Statistical Reasoning: Statistical Inference

    Indian Academy of Sciences (India)

    Sampling, Probability Models and Statistical Reasoning: Statistical Inference. Mohan Delampady and V R Padmawar. General Article, Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp. 49-58 ...

  1. Sonic hedgehog in the notochord is sufficient for patterning of the intervertebral discs.

    Science.gov (United States)

    Choi, Kyung-Suk; Lee, Chanmi; Harfe, Brian D

    2012-01-01

    The intervertebral discs, located between adjacent vertebrae, are required for stability of the spine and distributing mechanical load throughout the vertebral column. All cell types located in the middle regions of the discs, called nuclei pulposi, are derived from the embryonic notochord. Recently, it was shown that the hedgehog signaling pathway plays an essential role during formation of nuclei pulposi. However, during the time that nuclei pulposi are forming, Shh is expressed in both the notochord and the nearby floor plate. To determine the source of SHH protein sufficient for formation of nuclei pulposi we removed Shh from either the floor plate or the notochord using tamoxifen-inducible Cre alleles. Removal of Shh from the floor plate resulted in phenotypically normal intervertebral discs, indicating that Shh expression in this tissue is not required for disc patterning. In addition, embryos that lacked Shh in the floor plate had normal vertebral columns, demonstrating that Shh expression in the notochord is sufficient for pattering the entire vertebral column. Removal of Shh from the notochord resulted in the absence of Shh in the floor plate, loss of intervertebral discs and vertebral structures. These data indicate that Shh expression in the notochord is sufficient for patterning of the intervertebral discs and the vertebral column. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  2. Statistical distribution of hydrogen over three positions in the brucite Mg(OH)2 structure from electron diffractometry data

    International Nuclear Information System (INIS)

    Zhukhlistov, A.A.; Avilov, A.S.; Ferraris, D.; Zvyagin, B.B.; Plotnikov, V.P.

    1997-01-01

    The method of improved automatic electron diffractometry for measuring and recording the intensities of two-dimensionally distributed reflections of texture-type electron diffraction patterns has been used for the analysis of the brucite Mg(OH)2 structure. The experimental accuracy of the measured intensities proved to be sufficient for studying fine structural details of the statistical distribution of hydrogen atoms over three structural positions located around the threefold axis of the brucite structure.

  3. Statistics

    International Nuclear Information System (INIS)

    2005-01-01

    For the years 2004 and 2005 the figures shown in the tables of Energy Review are partly preliminary. The annual statistics published in Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004). The applied energy units and conversion coefficients are shown in the back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: changes in GDP, energy consumption and electricity consumption; carbon dioxide emissions from fossil fuel use; coal consumption; consumption of natural gas; peat consumption; domestic oil deliveries; import prices of oil; consumer prices of principal oil products; fuel prices in heat production; fuel prices in electricity production; price of electricity by type of consumer; average monthly spot prices at the Nord Pool power exchange; total energy consumption by source and CO2 emissions; supplies and total consumption of electricity (GWh); energy imports by country of origin in January-June 2003; energy exports by recipient country in January-June 2003; consumer prices of liquid fuels; consumer prices of hard coal, natural gas and indigenous fuels; price of natural gas by type of consumer; price of electricity by type of consumer; price of district heating by type of consumer; excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and energy taxes, precautionary stock fees and oil pollution fees.

  4. Frog Statistics

    Science.gov (United States)

    Whole Frog Project and Virtual Frog Dissection statistics: wwwstats output of web-server accesses, with duplicate or extraneous accesses excluded; note that these figures under-represent the bytes requested.

  5. [Effect of vitamin beverages on vitamin sufficiency of the workers of Pskov Hydroelectric Power-Plant].

    Science.gov (United States)

    Spiricheva, T V; Vrezhesinskaia, O A; Beketova, N A; Pereverzeva, O G; Kosheleva, O V; Kharitonchik, L A; Kodentsova, V M; Iudina, A V; Spirichev, V B

    2010-01-01

    The influence of vitamin complexes, taken in the form of a drink or kissel, on the vitamin sufficiency of working persons has been studied. Long-term inclusion (6.5 months) in the diet of vitamin drinks containing about 80% of the recommended daily intake of vitamins was accompanied by a significant improvement of vitamin C and B6 sufficiency and by prevention of the seasonal deterioration of beta-carotene status. As the subjects were initially well provided with vitamins A and E, no increase of their blood serum levels occurred.

  6. Economic efficiency or self-sufficiency: alternative strategies for oil consumers?

    International Nuclear Information System (INIS)

    Heal, D.W.

    1992-01-01

    The ideal energy source is low cost (efficient) and reliable (secure). The high price and perceived political unreliability of Middle East oil supplies prompted a nearly worldwide trend towards energy self-sufficiency. Gains in energy efficiency, which have been most marked in the OECD, are permanent and, prompted by environmental concern, probably progressive. But the opportunity that is still available to low cost oil suppliers to regain lost markets will only be realized if those supplies are demonstrably reliable. (author)

  7. Classification of Underlying Causes of Power Quality Disturbances: Deterministic versus Statistical Methods

    Directory of Open Access Journals (Sweden)

    Emmanouil Styvaktakis

    2007-01-01

    Full Text Available This paper presents the two main types of classification methods for power quality disturbances based on underlying causes: deterministic classification, giving an expert system as an example, and statistical classification, with support vector machines (a novel method) as an example. An expert system is suitable when one has a limited amount of data and sufficient power system expert knowledge; however, its application requires a set of threshold values. Statistical methods are suitable when a large amount of data is available for training. Two important issues that guarantee the effectiveness of a classifier, data segmentation and feature extraction, are discussed. Segmentation of a sequence of data recording is preprocessing that partitions the data into segments, each representing a duration containing either an event or a transition between two events. Extraction of features is applied to each segment individually. Some useful features and their effectiveness are then discussed. Some experimental results are included to demonstrate the effectiveness of both systems. Finally, conclusions are given together with a discussion of some future research directions.
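The expert-system branch can be illustrated with a tiny threshold rule applied to one segmented event; the thresholds are illustrative, loosely following common per-unit voltage definitions, and are not those of the paper's system:

```python
def classify_event(rms_pu):
    """Threshold-based expert rule for one segmented event, given its
    RMS voltage in per-unit (1.0 = nominal). Illustrative thresholds."""
    if rms_pu < 0.1:
        return "interruption"
    if rms_pu < 0.9:
        return "sag"
    if rms_pu > 1.1:
        return "swell"
    return "normal"

for v in (0.05, 0.6, 1.0, 1.2):
    print(v, classify_event(v))
```

This is exactly the kind of hand-set threshold table the abstract notes as the cost of the deterministic approach; a statistical classifier would instead learn the decision boundaries from labelled segments.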

  8. The usefulness of descriptive statistics in the interpretation of data on occupational physical activity of Poles

    Directory of Open Access Journals (Sweden)

    Elżbieta Biernat

    2014-12-01

    Full Text Available Background: The aim of this paper is to assess whether basic descriptive statistics are sufficient to interpret data on the physical activity of Poles within the occupational domain of life. Material and Methods: The study group consisted of 964 randomly selected Polish working professionals. The long version of the International Physical Activity Questionnaire (IPAQ) was used. Descriptive statistics included characteristics of variables using: mean (M), median (Me), maximal and minimal values (max–min), standard deviation (SD) and percentile values. Statistical inference was based on the comparison of variables with a significance level of 0.05 (Kruskal-Wallis and Pearson's Chi2 tests). Results: Occupational physical activity (OPA) was declared by 46.4% of respondents (vigorous – 23.5%, moderate – 30.2%, walking – 39.5%). The total OPA amounted to 2751.1 MET-min/week (Metabolic Equivalent of Task) with a very high standard deviation (SD = 5302.8) and max = 35 511 MET-min/week. It concerned different types of activities. Approximately 10% of respondents (above the 90th percentile) overstated the average. However, there was no significant difference depending on the character of the profession or the type of activity. The average time of sitting was 256 min/day. As many as 39% of the respondents met the World Health Organization standards on the basis of OPA alone (42.5% of white-collar workers, 38% of administrative and technical employees and only 37.9% of physical workers). Conclusions: In the data analysis it is necessary to report quantiles to provide a fuller picture of the distributions of OPA in MET-min/week. It is also crucial to update the guidelines for data processing and analysis of the long version of the IPAQ. It seems that 16 h of activity/day is not a sufficient criterion for excluding results from further analysis. Med Pr 2014;65(6):743–753
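The descriptive statistics the paper relies on (M, Me, SD, percentiles) can be reproduced with Python's standard statistics module; the OPA values below are hypothetical, chosen to show the right-skew that makes the mean far exceed the median, as in the study:

```python
import statistics

# Hypothetical weekly occupational activity (MET-min/week) for 12 workers
opa = [0, 0, 480, 600, 720, 990, 1200, 1800, 2400, 3600, 7200, 14400]

m = statistics.mean(opa)
me = statistics.median(opa)
sd = statistics.stdev(opa)                  # sample standard deviation
deciles = statistics.quantiles(opa, n=10)   # 10th, 20th, ..., 90th percentiles

print(f"M = {m:.1f}, Me = {me:.1f}, SD = {sd:.1f}")
print(f"90th percentile = {deciles[-1]:.1f}, max = {max(opa)}")
```

Because a few very active workers inflate the mean, the median and the upper percentiles tell a clearer story than M and SD alone, which is exactly the paper's argument for reporting quantiles.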

  9. Countrywide Evaluation of the Long-Term Family Self-Sufficiency Plan. Establishing the Baselines

    National Research Council Canada - National Science Library

    Schoeni, Robert

    2002-01-01

...) Plan on November 16, 1999. The LTFSS Plan consists of 46 projects whose goal is to promote self-sufficiency among families participating in the California Work Opportunity and Responsibility to Kids (CalWORKs...

  10. On the Necessary and Sufficient Assumptions for UC Computation

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Nielsen, Jesper Buus; Orlandi, Claudio

    2010-01-01

    for all of them. Perhaps most interestingly we show that: •  For even the minimal meaningful KRA, where we only assume that the secret key is a value which is hard to compute from the public key, one can UC securely compute any poly-time functionality if there exists a passive secure oblivious...... that in the KRA model one-way functions are sufficient for UC commitment and UC zero-knowledge. These are the first examples of UC secure protocols for non-trivial tasks which do not assume the existence of public-key primitives. In particular, the protocols show that non-trivial UC computation is possible...

  11. Analysis of statistical misconception in terms of statistical reasoning

    Science.gov (United States)

    Maryati, I.; Priatna, N.

    2018-05-01

Reasoning skill is needed by everyone to face the globalization era, because every person has to be able to manage and use information from all over the world, which can be obtained easily. Statistical reasoning skill is the ability to collect, group, process, and interpret information and to draw conclusions from it. This skill can be developed at various levels of education. However, the skill remains low because many people, students included, assume that statistics is merely the ability to count and apply formulas. Students also still have a negative attitude toward courses related to research. The purpose of this research is to analyze students' misconceptions in a descriptive statistics course in terms of statistical reasoning skill. The observation was done by analyzing the results of a misconception test and a statistical reasoning skill test, and by observing the effect of students' misconceptions on statistical reasoning skill. The sample of this research was 32 students of the mathematics education department who had taken the descriptive statistics course. The mean value of the misconception test was 49.7 with a standard deviation of 10.6, whereas the mean value of the statistical reasoning skill test was 51.8 with a standard deviation of 8.5. If the minimal value for meeting the standard achievement of course competence is 65, the students' mean values fall below the standard. The results of the misconception study indicate which subtopics should be given particular attention. Based on the assessment results, students' misconceptions occur in: 1) writing mathematical sentences and symbols well, 2) understanding basic definitions, 3) determining which concept to use in solving a problem. For statistical reasoning skill, the assessment measured reasoning about: 1) data, 2) representation, 3) statistical format, 4) probability, 5) samples, and 6) association.

  12. Statistical Inference at Work: Statistical Process Control as an Example

    Science.gov (United States)

    Bakker, Arthur; Kent, Phillip; Derry, Jan; Noss, Richard; Hoyles, Celia

    2008-01-01

    To characterise statistical inference in the workplace this paper compares a prototypical type of statistical inference at work, statistical process control (SPC), with a type of statistical inference that is better known in educational settings, hypothesis testing. Although there are some similarities between the reasoning structure involved in…

  13. A Statistical Primer: Understanding Descriptive and Inferential Statistics

    OpenAIRE

    Gillian Byrne

    2007-01-01

As libraries and librarians move more towards evidence‐based decision making, the data being generated in libraries is growing. Understanding the basics of statistical analysis is crucial for evidence‐based practice (EBP), in order to correctly design and analyze research as well as to evaluate the research of others. This article covers the fundamentals of descriptive and inferential statistics, from hypothesis construction to sampling to common statistical techniques including chi‐square, co...

  14. Solution of the statistical bootstrap with Bose statistics

    International Nuclear Information System (INIS)

    Engels, J.; Fabricius, K.; Schilling, K.

    1977-01-01

    A brief and transparent way to introduce Bose statistics into the statistical bootstrap of Hagedorn and Frautschi is presented. The resulting bootstrap equation is solved by a cluster expansion for the grand canonical partition function. The shift of the ultimate temperature due to Bose statistics is determined through an iteration process. We discuss two-particle spectra of the decaying fireball (with given mass) as obtained from its grand microcanonical level density

  15. Influence of Hydrogen-Based Storage Systems on Self-Consumption and Self-Sufficiency of Residential Photovoltaic Systems

    Directory of Open Access Journals (Sweden)

    Christian Pötzinger

    2015-08-01

Full Text Available This paper analyzes the behavior of residential solar-powered electrical energy storage systems. For this purpose, a simulation model based on MATLAB/Simulink is developed. Investigating both short-time and seasonal hydrogen-based storage systems, simulations on the basis of real weather data are processed on a timescale of 15 min for a consideration period of 3 years. A sensitivity analysis is conducted in order to identify the most important system parameters affecting the self-consumption ratio and the degree of self-sufficiency. To this end, the influences of storage capacity and storage efficiencies are discussed. A short-time storage system can increase the self-consumption ratio by up to 35 percentage points compared to a self-consumption system without storage. However, the seasonal storage system uses almost the entire energy produced by the photovoltaic (PV) system (nearly 100% self-consumption). Thereby, the energy drawn from the grid can be reduced and a degree of self-sufficiency of about 90% is achieved. Based on these findings, some scenarios to reach self-sufficiency are analyzed. The results show that full self-sufficiency will be possible with a seasonal hydrogen-based storage system if PV area and initial storage level are appropriate.
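The two ratios at the heart of this analysis can be sketched for a storage-free baseline as follows. The 24-hour profiles and values are hypothetical, not taken from the paper's simulation model:

```python
import numpy as np

# Toy one-day profiles at 15-min resolution: PV generation peaks at midday,
# household load has an evening peak (all values hypothetical, in kW).
t = np.arange(96) * 0.25                                    # hours of day
pv = np.clip(np.sin((t - 6) / 12 * np.pi), 0, None) * 3.0   # PV output
load = 0.5 + 0.4 * (np.abs(t - 19) < 2)                     # household load

direct = np.minimum(pv, load)                # PV energy used on site, no storage
self_consumption = direct.sum() / pv.sum()   # share of PV energy used locally
self_sufficiency = direct.sum() / load.sum() # share of load covered by PV

# A storage system raises both ratios by shifting midday PV surplus into the
# evening deficit; a seasonal store shifts summer surplus into winter.
print(round(float(self_consumption), 2), round(float(self_sufficiency), 2))
```
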

  16. Fermi-Dirac statistics and traffic in complex networks.

    Science.gov (United States)

    de Moura, Alessandro P S

    2005-06-01

    We propose an idealized model for traffic in a network, in which many particles move randomly from node to node, following the network's links, and it is assumed that at most one particle can occupy any given node. This is intended to mimic the finite forwarding capacity of nodes in communication networks, thereby allowing the possibility of congestion and jamming phenomena. We show that the particles behave like free fermions, with appropriately defined energy-level structure and temperature. The statistical properties of this system are thus given by the corresponding Fermi-Dirac distribution. We use this to obtain analytical expressions for dynamical quantities of interest, such as the mean occupation of each node and the transport efficiency, for different network topologies and particle densities. We show that the subnetwork of free nodes always fragments into small isolated clusters for a sufficiently large number of particles, implying a communication breakdown at some density for all network topologies. These results are compared to direct simulations.
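A minimal sketch of the Fermi-Dirac behavior described above: with hypothetical node energy levels and an effective temperature, the chemical potential is fixed by the particle count, and each node's mean occupation stays between 0 and 1:

```python
import numpy as np
from scipy.optimize import brentq

E = np.linspace(0.0, 1.0, 100)   # hypothetical effective node energy levels
T = 0.1                          # hypothetical effective temperature
N = 40                           # number of particles in the network

def occupation(mu):
    """Fermi-Dirac mean occupation n(E) = 1 / (exp((E - mu)/T) + 1)."""
    return 1.0 / (np.exp((E - mu) / T) + 1.0)

# Fix the chemical potential mu by solving sum_i n_i(mu) = N.
mu = brentq(lambda m: occupation(m).sum() - N, -2.0, 3.0)
n = occupation(mu)
print(round(float(mu), 3))
```
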

  17. Impact of Market Behavior, Fleet Composition, and Ancillary Services on Revenue Sufficiency

    Energy Technology Data Exchange (ETDEWEB)

    Frew, Bethany [National Renewable Energy Lab. (NREL), Golden, CO (United States); Gallo, Giulia [National Renewable Energy Lab. (NREL), Golden, CO (United States); Brinkman, Gregory [National Renewable Energy Lab. (NREL), Golden, CO (United States); Milligan, Michael [National Renewable Energy Lab. (NREL), Golden, CO (United States); Clark, Kara [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bloom, Aaron [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-06-01

    Revenue insufficiency, or the missing money problem, occurs when the revenues that generators earn from the market are not sufficient to cover both fixed and variable costs to remain in the market and/or justify investments in new capacity, which may be needed for reliability. The near-zero marginal cost of variable renewable generators further exacerbates these revenue challenges. Estimating the extent of the missing money problem in current electricity markets is an important, nontrivial task that requires representing both how the power system operates and how market participants behave. This paper explores the missing money problem using a production cost model that represented a simplified version of the Electric Reliability Council of Texas (ERCOT) energy-only market for the years 2012-2014. We evaluate how various market structures -- including market behavior, ancillary services, and changing fleet compositions -- affect net revenues in this ERCOT-like system. In most production cost modeling exercises, resources are assumed to offer their marginal capabilities at marginal costs. Although this assumption is reasonable for feasibility studies and long-term planning, it does not adequately consider the market behaviors that impact revenue sufficiency. In this work, we simulate a limited set of market participant strategic bidding behaviors by means of different sets of markups; these markups are applied to the true production costs of all gas generators, which are the most prominent generators in ERCOT. Results show that markups can help generators increase their net revenues overall, although net revenues may increase or decrease depending on the technology and the year under study. Results also confirm that conventional, variable-cost-based production cost simulations do not capture prices accurately, and this particular feature calls for proxies for strategic behaviors (e.g., markups) and more accurate representations of how electricity markets work. 
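The markup mechanism studied here can be illustrated with toy numbers. This is a deliberately simplified single-unit view with hypothetical prices and costs; real production cost models co-optimize many units, reserves, and constraints:

```python
# One generator in an energy-only market: it runs in hours where the clearing
# price covers its (marked-up) offer, earns price minus marginal cost in those
# hours, and must also recover its fixed costs.

def net_revenue(prices, marginal_cost, markup, fixed_cost):
    """Net revenue over a price series; the unit runs when price >= offer."""
    offer = marginal_cost * (1.0 + markup)
    margin = sum(p - marginal_cost for p in prices if p >= offer)
    return margin - fixed_cost

prices = [20, 25, 28, 35, 60, 120, 31, 26]   # $/MWh, hypothetical hours
mc = 30.0                                    # marginal cost, $/MWh
fixed = 50.0                                 # fixed cost over the period, $

# A markup forgoes some low-margin hours but keeps the high-priced ones, so
# net revenue can rise or fall depending on the price distribution.
print(net_revenue(prices, mc, markup=0.0, fixed_cost=fixed))   # 76.0
print(net_revenue(prices, mc, markup=0.2, fixed_cost=fixed))   # 70.0
```
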

  18. Statistical methods

    CERN Document Server

    Szulc, Stefan

    1965-01-01

Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field. Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then

  19. Statistical optics

    CERN Document Server

    Goodman, Joseph W

    2015-01-01

This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems. The book covers a range of statistical problems in optics, including both theory and applications. The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced i

  20. Six Sigma Quality Management System and Design of Risk-based Statistical Quality Control.

    Science.gov (United States)

    Westgard, James O; Westgard, Sten A

    2017-03-01

    Six sigma concepts provide a quality management system (QMS) with many useful tools for managing quality in medical laboratories. This Six Sigma QMS is driven by the quality required for the intended use of a test. The most useful form for this quality requirement is the allowable total error. Calculation of a sigma-metric provides the best predictor of risk for an analytical examination process, as well as a design parameter for selecting the statistical quality control (SQC) procedure necessary to detect medically important errors. Simple point estimates of sigma at medical decision concentrations are sufficient for laboratory applications. Copyright © 2016 Elsevier Inc. All rights reserved.
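The sigma-metric point estimate mentioned above is commonly computed as sigma = (TEa − |bias|) / CV, with all terms in percent at the medical decision concentration. A minimal sketch with hypothetical values:

```python
# Sigma-metric for an analytical examination process, as used in Six Sigma QC
# design. TEa is the allowable total error; bias and CV describe the method's
# observed performance at the decision concentration (all in percent).

def sigma_metric(tea_pct, bias_pct, cv_pct):
    return (tea_pct - abs(bias_pct)) / cv_pct

# Hypothetical example: TEa = 10%, bias = 1%, CV = 1.5%.
sigma = sigma_metric(10.0, 1.0, 1.5)
print(sigma)  # 6.0 -> a "six sigma" process needs only minimal SQC
```
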

  1. All of statistics a concise course in statistical inference

    CERN Document Server

    Wasserman, Larry

    2004-01-01

This book is for people who want to learn probability and statistics quickly. It brings together many of the main ideas in modern statistics in one place. The book is suitable for students and researchers in statistics, computer science, data mining and machine learning. This book covers a much wider range of topics than a typical introductory text on mathematical statistics. It includes modern topics like nonparametric curve estimation, bootstrapping and classification, topics that are usually relegated to follow-up courses. The reader is assumed to know calculus and a little linear algebra. No previous knowledge of probability and statistics is required. The text can be used at the advanced undergraduate and graduate level. Larry Wasserman is Professor of Statistics at Carnegie Mellon University. He is also a member of the Center for Automated Learning and Discovery in the School of Computer Science. His research areas include nonparametric inference, asymptotic theory, causality, and applications to astrophysics, bi...

  2. MQSA National Statistics

    Science.gov (United States)

Mammography Quality Standards Act and Program: MQSA national statistics. Archived scorecard statistics are available for 2016, 2017, and 2018.

  3. Scalar energy fluctuations in Large-Eddy Simulation of turbulent flames: Statistical budgets and mesh quality criterion

    Energy Technology Data Exchange (ETDEWEB)

    Vervisch, Luc; Domingo, Pascale; Lodato, Guido [CORIA - CNRS and INSA de Rouen, Technopole du Madrillet, BP 8, 76801 Saint-Etienne-du-Rouvray (France); Veynante, Denis [EM2C - CNRS and Ecole Centrale Paris, Grande Voie des Vignes, 92295 Chatenay-Malabry (France)

    2010-04-15

    Large-Eddy Simulation (LES) provides space-filtered quantities to compare with measurements, which usually have been obtained using a different filtering operation; hence, numerical and experimental results can be examined side-by-side in a statistical sense only. Instantaneous, space-filtered and statistically time-averaged signals feature different characteristic length-scales, which can be combined in dimensionless ratios. From two canonical manufactured turbulent solutions, a turbulent flame and a passive scalar turbulent mixing layer, the critical values of these ratios under which measured and computed variances (resolved plus sub-grid scale) can be compared without resorting to additional residual terms are first determined. It is shown that actual Direct Numerical Simulation can hardly accommodate a sufficiently large range of length-scales to perform statistical studies of LES filtered reactive scalar-fields energy budget based on sub-grid scale variances; an estimation of the minimum Reynolds number allowing for such DNS studies is given. From these developments, a reliability mesh criterion emerges for scalar LES and scaling for scalar sub-grid scale energy is discussed. (author)

  4. Generalized quantum statistics

    International Nuclear Information System (INIS)

    Chou, C.

    1992-01-01

    In the paper, a non-anyonic generalization of quantum statistics is presented, in which Fermi-Dirac statistics (FDS) and Bose-Einstein statistics (BES) appear as two special cases. The new quantum statistics, which is characterized by the dimension of its single particle Fock space, contains three consistent parts, namely the generalized bilinear quantization, the generalized quantum mechanical description and the corresponding statistical mechanics

  5. Statistical methods in the mechanical design of fuel assemblies

    Energy Technology Data Exchange (ETDEWEB)

    Radsak, C.; Streit, D.; Muench, C.J. [AREVA NP GmbH, Erlangen (Germany)

    2013-07-01

    The mechanical design of a fuel assembly is still being mainly performed in a de terministic way. This conservative approach is however not suitable to provide a realistic quantification of the design margins with respect to licensing criter ia for more and more demanding operating conditions (power upgrades, burnup increase,..). This quantification can be provided by statistical methods utilizing all available information (e.g. from manufacturing, experience feedback etc.) of the topic under consideration. During optimization e.g. of the holddown system certain objectives in the mechanical design of a fuel assembly (FA) can contradict each other, such as sufficient holddown forces enough to prevent fuel assembly lift-off and reducing the holddown forces to minimize axial loads on the fuel assembly structure to ensure no negative effect on the control rod movement.By u sing a statistical method the fuel assembly design can be optimized much better with respect to these objectives than it would be possible based on a deterministic approach. This leads to a more realistic assessment and safer way of operating fuel assemblies. Statistical models are defined on the one hand by the quanti le that has to be maintained concerning the design limit requirements (e.g. one FA quantile) and on the other hand by the confidence level which has to be met. Using the above example of the holddown force, a feasible quantile can be define d based on the requirement that less than one fuel assembly (quantile > 192/19 3 [%] = 99.5 %) in the core violates the holddown force limit w ith a confidence of 95%. (orig.)
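One standard way to implement such a quantile-plus-confidence criterion (a common technique, not necessarily the authors' exact method) is a one-sided normal tolerance bound: demonstrate with 95% confidence that at least 99.5% of assemblies stay below the holddown-force limit. The sample values below are hypothetical:

```python
import numpy as np
from scipy.stats import norm, nct

rng = np.random.default_rng(0)
forces = rng.normal(loc=100.0, scale=5.0, size=30)  # hypothetical holddown forces

n = forces.size
p, conf = 0.995, 0.95
# One-sided upper tolerance factor k from the noncentral t distribution:
# with confidence `conf`, mean + k*s covers at least the p-quantile.
k = nct.ppf(conf, df=n - 1, nc=norm.ppf(p) * np.sqrt(n)) / np.sqrt(n)
upper_bound = forces.mean() + k * forces.std(ddof=1)

# The design criterion passes if upper_bound stays below the force limit.
print(round(float(k), 2), round(float(upper_bound), 1))
```

Note that k exceeds the plain 99.5% z-quantile; the excess is the price of demonstrating the quantile with 95% confidence from a finite sample.
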

  6. Nature exposure sufficiency and insufficiency: The benefits of environmental preservation.

    Science.gov (United States)

    Reddon, John R; Durante, Salvatore B

    2018-01-01

    Increasing industrialization, urbanization, and a failure of many world leaders to appreciate the consequences of climate change are deleteriously impacting quality of life as well as diminishing the prospects for long term survival. Economic competitiveness and corporate profitability often pre-empt environmental concerns. The calving of an iceberg in Antarctica and the hurricane activity in the Caribbean during 2017 are unfortunate illustrations of the continuing escalation of environmental issues. We provide historical and current evidence for the importance of Nature Exposure (NE) and introduce the continuum Nature Exposure Sufficiency (NES) and Insufficiency (NEI). Insufficiency includes impoverished environments (e.g., slums and prisons) where nature exposure is very limited. Nature Exposure Sufficiency (NES) is an optimal amount of exposure to nature where many benefits such as reinvigoration can be obtained by everyone. NES also has several benefits for individuals with various health conditions such as arthritis, dementia, or depression. The benefits of NE are not just derivable from parks, forests, and other natural settings. Interiors of buildings and homes can be enhanced with plants and even pictures or objects from nature. Additionally, there is abundant evidence indicating that virtual and artificial environments depicting nature can provide substantial NE and therefore contribute to general wellbeing. Besides the difficulty in achieving cooperation amongst nations, corporations, and other collectives in developing and implementing long range plans to deal with climate change, there is also sometimes an aversion at the individual level whereby people are unwilling to experience nature due to insects and other discomforts. Such individuals are often averse to supplanting the comforts of home, even temporarily, with inadequate facilities that are seemingly less pleasant than their typical dwellings. We propose using the term Nature Exposure Aversion

  7. 76 FR 39115 - Notice of Proposed Information Collection: Transformation Initiative Family Self-Sufficiency...

    Science.gov (United States)

    2011-07-05

    ... Information Collection: Transformation Initiative Family Self-Sufficiency Demonstration Small Grants AGENCY... information: Title of Proposal: Notice of Funding Availability for the Transformation Initiative Family Self..., think tanks, consortia, Institutions of higher education accredited by a national or regional...

  8. Artificial Self-Sufficient P450 in Reversed Micelles

    Directory of Open Access Journals (Sweden)

    Teruyuki Nagamune

    2010-04-01

    Full Text Available Cytochrome P450s are heme-containing monooxygenases that require electron transfer proteins for their catalytic activities. They prefer hydrophobic compounds as substrates and it is, therefore, desirable to perform their reactions in non-aqueous media. Reversed micelles can stably encapsulate proteins in nano-scaled water pools in organic solvents. However, in the reversed micellar system, when multiple proteins are involved in a reaction they can be separated into different micelles and it is then difficult to transfer electrons between proteins. We show here that an artificial self-sufficient cytochrome P450, which is an enzymatically crosslinked fusion protein composed of P450 and electron transfer proteins, showed micelle-size dependent catalytic activity in a reversed micellar system. Furthermore, the presence of thermostable alcohol dehydrogenase promoted the P450-catalyzed reaction due to cofactor regeneration.

  9. Navigating behavioral energy sufficiency. Results from a survey in Swiss cities on potential behavior change.

    Science.gov (United States)

    Seidl, Roman; Moser, Corinne; Blumer, Yann

    2017-01-01

Many countries have some kind of energy-system transformation either planned or ongoing for various reasons, such as to curb carbon emissions or to compensate for the phasing out of nuclear energy. One important component of these transformations is the overall reduction in energy demand. It is generally acknowledged that the domestic sector represents a large share of total energy consumption in many countries. Increased energy efficiency is one factor that reduces energy demand, but behavioral approaches (known as "sufficiency") and their respective interventions also play important roles. In this paper, we address citizens' heterogeneity regarding both their current behaviors and their willingness to realize their sufficiency potentials, that is, to reduce their energy consumption through behavioral change. We collaborated with three Swiss cities for this study. A survey conducted in the three cities yielded thematic sets of energy-consumption behavior that various groups of participants rated differently. Using this data, we identified four groups of participants with different patterns of both current behaviors and sufficiency potentials. The paper discusses intervention types and addresses citizens' heterogeneity and behaviors from a city-based perspective.

  10. Statistics For Dummies

    CERN Document Server

    Rumsey, Deborah

    2011-01-01

The fun and easy way to get down to business with statistics. Stymied by statistics? No fear: this friendly guide offers clear, practical explanations of statistical ideas, techniques, formulas, and calculations, with lots of examples that show you how these concepts apply to your everyday life. Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tracks to a typical first semester statistics cou

  11. An immobilized and highly stabilized self-sufficient monooxygenase as biocatalyst for oxidative biotransformations

    NARCIS (Netherlands)

    Valencia, Daniela; Guillén, Marina; Fürst, Maximilian; Josep, López-Santín; Álvaro, Gregorio

    BACKGROUND The requirement of expensive cofactors that must be efficiently recycled is one of the major factors hindering the wide implementation of industrial biocatalytic oxidation processes. In this research, a sustainable approach based on immobilized self-sufficient Baeyer-Villiger

  12. Statistical study of undulator radiated power by a classical detection system in the mm-wave regime

    Directory of Open Access Journals (Sweden)

    A. Eliran

    2009-05-01

Full Text Available The statistics of FEL spontaneous emission power detected with a detector integration time much larger than the slippage time has been measured in many previous works at high frequencies. In such cases the quantum (shot) noise generated in the detection process is dominant. We have measured spontaneous emission in the Israeli electrostatic accelerator FEL (EA-FEL) operating at mm wavelengths. In this regime the detector is based on a diode rectifier, for which the detector quantum noise is negligible. The measurements were repeated numerous times in order to create a sample space with sufficient data for evaluating the statistical features of the radiated power. The probability density function of the radiated power was found and its moments were calculated. The results of analytical and numerical models are compared to those obtained in experimental measurements.
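Estimating moments from a sample space of repeated power readings can be sketched as follows. The data here are simulated from a negative-exponential density, the textbook model for the instantaneous power of single-mode thermal-like light, not the paper's measured distribution:

```python
import numpy as np
from scipy import stats

# Simulated detector readings of spontaneous-emission power (hypothetical mW).
rng = np.random.default_rng(1)
power = rng.exponential(scale=2.0, size=20000)

mean = power.mean()
var = power.var(ddof=1)
skew = stats.skew(power)

# For a negative-exponential density the moments obey variance = mean^2 and
# skewness = 2, so the sample estimates provide a consistency check on the PDF.
print(round(float(mean), 2), round(float(var), 2), round(float(skew), 2))
```
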

  13. Statistical prediction of immunity to placental malaria based on multi-assay antibody data for malarial antigens

    DEFF Research Database (Denmark)

    Siriwardhana, Chathura; Fang, Rui; Salanti, Ali

    2017-01-01

    Background Plasmodium falciparum infections are especially severe in pregnant women because infected erythrocytes (IE) express VAR2CSA, a ligand that binds to placental trophoblasts, causing IE to accumulate in the placenta. Resulting inflammation and pathology increases a woman’s risk of anemia...... to 28 malarial antigens and used the data to develop statistical models for predicting if a woman has sufficient immunity to prevent PM. Methods Archival plasma samples from 1377 women were screened in a bead-based multiplex assay for Ab to 17 VAR2CSA-associated antigens (full length VAR2CSA (FV2), DBL...... in the following seven statistical approaches: logistic regression full model, logistic regression reduced model, recursive partitioning, random forests, linear discriminant analysis, quadratic discriminant analysis, and support vector machine. Results The best and simplest model proved to be the logistic...
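A minimal sketch of the logistic-regression approach among the seven listed, using simulated antibody data in place of the study's antigen panel (feature weights and sample sizes are invented for illustration):

```python
import numpy as np

# Simulated data: antibody levels to d antigens as features, a binary
# "protected from placental malaria" outcome generated from a logistic model.
rng = np.random.default_rng(7)
n_obs, d = 300, 5
X = rng.normal(size=(n_obs, d))                  # standardized antibody levels
true_w = np.array([1.5, 0.8, 0.0, -0.5, 0.3])    # hypothetical effect sizes
y = (1 / (1 + np.exp(-(X @ true_w))) > rng.uniform(size=n_obs)).astype(float)

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Plain gradient ascent on the logistic log-likelihood (no penalty)."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1 / (1 + np.exp(-(X @ w)))
        w += lr * X.T @ (y - p) / len(y)
    return w

w = fit_logistic(X, y)
accuracy = np.mean(((X @ w) > 0) == (y > 0.5))   # in-sample accuracy
print(round(float(accuracy), 2))
```

In practice one would evaluate such a classifier with cross-validation, as the comparison across the seven approaches in the study requires out-of-sample performance.
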

  14. Descriptive statistics.

    Science.gov (United States)

    Nick, Todd G

    2007-01-01

    Statistics is defined by the Medical Subject Headings (MeSH) thesaurus as the science and art of collecting, summarizing, and analyzing data that are subject to random variation. The two broad categories of summarizing and analyzing data are referred to as descriptive and inferential statistics. This chapter considers the science and art of summarizing data where descriptive statistics and graphics are used to display data. In this chapter, we discuss the fundamentals of descriptive statistics, including describing qualitative and quantitative variables. For describing quantitative variables, measures of location and spread, for example the standard deviation, are presented along with graphical presentations. We also discuss distributions of statistics, for example the variance, as well as the use of transformations. The concepts in this chapter are useful for uncovering patterns within the data and for effectively presenting the results of a project.

  15. Necessary and sufficient conditions for the quantum Zeno and anti-Zeno effect

    International Nuclear Information System (INIS)

    Atmanspacher, Harald; Ehm, Werner; Gneiting, Tilmann

    2003-01-01

    A necessary and sufficient condition for the occurrence of the quantum Zeno effect is given, refining a recent conjecture of Luo, Wang and Zhang. An analogous condition is derived for the quantum anti-Zeno effect. Both results rely on a formal connection between the quantum (anti-)Zeno effect and the weak law of large numbers

  16. Statistics in Schools

    Science.gov (United States)

Educate your students about the value and everyday use of statistics. The Statistics in Schools program provides resources for teaching and learning with real-life data, including standards-aligned, classroom-ready math and history activities.

  17. AN ALTERNATIVE APPROACH TO MEET BEEF SELF-SUFFICIENCY IN WEST PAPUA

    Directory of Open Access Journals (Sweden)

    S. Hartono

    2011-09-01

Full Text Available The main objective of this research is to present an alternative approach to meeting beef self-sufficiency in West Papua, Indonesia. It mainly focuses on calculating the number of productive cows needed to enhance beef production in the province. Out of the total farmer households in Manokwari, Indonesia, 189 farmer-respondents were selected as samples of the study. Selection of the sample was based on the number of cattle kept in every age group (less than one (2 years old and the number of productive cows. Secondary data came from time series data on the number of slaughtered cattle vis-à-vis the population of all districts in West Papua Province from 1980-2008. Data were analyzed using the Partial Adjustment Model (PAM) and the Ordinary Least Squares (OLS) method. Results of the study showed that beef self-sufficiency in West Papua depends on the availability of productive cows to produce ready-to-slaughter bulls in the previous year. Particularly for West Papua, to produce one unit of bull in the t-th year, assuming cattle mortality of 4.92%, 2.38 animal units (AU) of productive cows must be provided in the previous two (2) years.

  18. Attended but unseen: visual attention is not sufficient for visual awareness.

    Science.gov (United States)

    Kentridge, R W; Nijboer, T C W; Heywood, C A

    2008-02-12

Does any one psychological process give rise to visual awareness? One candidate is selective attention: when we attend to something, it seems we always see it. But if attention can selectively enhance our response to an unseen stimulus, then attention cannot be a sufficient precondition for awareness. Kentridge, Heywood & Weiskrantz [Kentridge, R. W., Heywood, C. A., & Weiskrantz, L. (1999). Attention without awareness in blindsight. Proceedings of the Royal Society of London, Series B, 266, 1805-1811; Kentridge, R. W., Heywood, C. A., & Weiskrantz, L. (2004). Spatial attention speeds discrimination without awareness in blindsight. Neuropsychologia, 42, 831-835.] demonstrated just such a dissociation in the blindsight subject GY. Here, we test whether the dissociation generalizes to the normal population. We presented observers with pairs of coloured discs, each masked by the subsequent presentation of a coloured annulus. The discs acted as primes, speeding discrimination of the colour of the annulus when the two matched in colour and slowing it when they differed. We show that the location of attention modulated the size of this priming effect. However, the primes were rendered invisible by metacontrast masking and remained unseen despite being attended. Visual attention could therefore facilitate processing of an invisible target and cannot, therefore, be a sufficient precondition for visual awareness.

  19. Quantitative analysis of CT brain images: a statistical model incorporating partial volume and beam hardening effects

    International Nuclear Information System (INIS)

    McLoughlin, R.F.; Ryan, M.V.; Heuston, P.M.; McCoy, C.T.; Masterson, J.B.

    1992-01-01

    The purpose of this study was to construct and evaluate a statistical model for the quantitative analysis of computed tomographic brain images. Data were derived from standard sections in 34 normal studies. A model representing the intracranial pure tissue and partial volume areas, with allowance for beam hardening, was developed. The average percentage error in estimation of areas, derived from phantom tests using the model, was 28.47%. We conclude that our model is not sufficiently accurate to be of clinical use, even though allowance was made for partial volume and beam hardening effects. (author)

  20. Necessary and Sufficient Conditions for Pareto Optimality in Infinite Horizon Cooperative Differential Games

    NARCIS (Netherlands)

    Reddy, P.V.; Engwerda, J.C.

    2011-01-01

    In this article we derive necessary and sufficient conditions for the existence of Pareto optimal solutions for infinite horizon cooperative differential games. We consider games defined by non autonomous and discounted autonomous systems. The obtained results are used to analyze the regular

  1. Statistical analysis on failure-to-open/close probability of motor-operated valve in sodium system

    International Nuclear Information System (INIS)

    Kurisaka, Kenichi

    1998-08-01

    The objective of this work is to develop basic data for examining the efficiency of preventive maintenance and actuation tests from the standpoint of failure probability. The work consists of a statistical trend analysis of valve failure probability in the failure-to-open/close mode, as a function of time since installation and time since the last open/close action, based on field data of operating and failure experience. Both time-dependent and time-independent terms were considered in the failure probability. The linear ageing model was modified and applied to the former: in this model the failure probability contains two terms, proportional to time since installation and to time since the last open/close demand, respectively. To ensure a sufficient statistical population, motor-operated valves (MOVs) in sodium systems were selected for analysis from the CORDS database, which contains operating and failure data for components in fast reactors and sodium test facilities. From these data, the model parameters were statistically estimated to quantify the valve failure probability in the failure-to-open/close mode, with consideration of uncertainty. (J.P.N.)
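
    The modified linear ageing model described above can be written as p(t, τ) = p0 + a·t + b·τ, with t the time since installation and τ the time since the last open/close demand. A minimal sketch (all coefficient values are purely illustrative, not the CORDS estimates):

    ```python
    def demand_failure_prob(p0, a, b, t_install, t_demand):
        """Failure-to-open/close probability per demand: a constant term plus
        contributions proportional to time since installation (t_install) and
        time since the last open/close demand (t_demand)."""
        return p0 + a * t_install + b * t_demand

    # e.g. a valve 10 years after installation, 0.5 years after its last demand
    # (hypothetical coefficients chosen only to show the structure)
    p = demand_failure_prob(1e-4, 2e-6, 5e-5, 10.0, 0.5)
    print(p)  # ~1.45e-4
    ```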

  2. Statistical Mechanics of Turbulent Dynamos

    Science.gov (United States)

    Shebalin, John V.

    2014-01-01

    Incompressible magnetohydrodynamic (MHD) turbulence and magnetic dynamos, which occur in magnetofluids with large fluid and magnetic Reynolds numbers, will be discussed. When Reynolds numbers are large and energy decays slowly, the distribution of energy with respect to length scale becomes quasi-stationary and MHD turbulence can be described statistically. In the limit of infinite Reynolds numbers, viscosity and resistivity become zero and if these values are used in the MHD equations ab initio, a model system called ideal MHD turbulence results. This model system is typically confined in simple geometries with some form of homogeneous boundary conditions, allowing for velocity and magnetic field to be represented by orthogonal function expansions. One advantage to this is that the coefficients of the expansions form a set of nonlinearly interacting variables whose behavior can be described by equilibrium statistical mechanics, i.e., by a canonical ensemble theory based on the global invariants (energy, cross helicity and magnetic helicity) of ideal MHD turbulence. Another advantage is that truncated expansions provide a finite dynamical system whose time evolution can be numerically simulated to test the predictions of the associated statistical mechanics. If ensemble predictions are the same as time averages, then the system is said to be ergodic; if not, the system is nonergodic. Although it had been implicitly assumed in the early days of ideal MHD statistical theory development that these finite dynamical systems were ergodic, numerical simulations provided sufficient evidence that they were, in fact, nonergodic. Specifically, while canonical ensemble theory predicted that expansion coefficients would be (i) zero-mean random variables with (ii) energy that decreased with length scale, it was found that although (ii) was correct, (i) was not and the expected ergodicity was broken. The exact cause of this broken ergodicity was explained, after much

  3. PRIS-STATISTICS: Power Reactor Information System Statistical Reports. User's Manual

    International Nuclear Information System (INIS)

    2013-01-01

    The IAEA developed the Power Reactor Information System (PRIS)-Statistics application to assist PRIS end users with generating statistical reports from PRIS data. Statistical reports provide an overview of the status, specification and performance results of every nuclear power reactor in the world. This user's manual was prepared to facilitate the use of the PRIS-Statistics application and to provide guidelines and detailed information for each report in the application. Statistical reports support analyses of nuclear power development and strategies, and the evaluation of nuclear power plant performance. The PRIS database can be used for comprehensive trend analyses and benchmarking against best performers and industrial standards.

  4. Statistical physics

    CERN Document Server

    Sadovskii, Michael V

    2012-01-01

    This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity and the modern theory of critical phenomena. Beyond that attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.

  5. A cross-sectional pilot study to examine food sufficiency and assess nutrition among low-income patients with injection-related venous ulcers.

    Science.gov (United States)

    Pieper, Barbara; Templin, Thomas N

    2015-04-01

    Adequate nutrition has long been considered a critical component for wound healing, but literature regarding the relationship between nutrition and venous ulcer (VU) healing is limited. A person's nutrition is affected by the availability of food as well as his/her overall health. Food sufficiency and nutrition are important concerns in the care of persons of low income with injection-related VUs, which tend to be large and slow to heal. A cross-sectional pilot study was conducted to explore the relationship between food sufficiency/security and nutrition with regard to demographic, wound, quality-of-life, physical activity, falls, and fall risk variables. Nutrition was examined using 2 well-developed instruments that measure food sufficiency/security and assess nutrition--the United States Department of Agriculture's Adult Food Sufficiency Questionnaire (FSQ) and the Nestle Mini Nutritional Assessment (MNA). All participants (N = 31, 54% men, mean age 56.1 ± 3.6 years, all African American) were recruited from an outpatient clinic. All had injection-related VUs from a history of injecting illicit substances. In terms of food sufficiency/security, most participants (26, 84%) reported having enough food in the house, but 10 (32%) worried about running out of food. From 16% to 22.6% of participants expressed concern with food sufficiency/security in terms of cutting meal size, eating less, hunger, and weight loss. Food sufficiency/security was high for 19 (61.3%), but 12 (39%) had marginal or lower food sufficiency/security. MNA scores showed 16 participants (52%) were at risk of malnutrition or malnourished. Low food sufficiency/security was significantly (P nutrition assessment scores were significantly associated (P nutrition assessment are important to assess in low-income persons with injection-related VUs. A number of significant relationships of the FSQ and MNA to other variables was found but needs further investigation with a larger sample.

  6. 49 CFR 40.263 - What happens when an employee is unable to provide a sufficient amount of saliva for an alcohol...

    Science.gov (United States)

    2010-10-01

    ... sufficient amount of saliva for an alcohol screening test? (a) As the STT, you must take the following steps if an employee is unable to provide sufficient saliva to complete a test on a saliva screening device (e.g., the employee does not provide sufficient saliva to activate the device). (1) You must conduct...

  7. Statistical distribution for generalized ideal gas of fractional-statistics particles

    International Nuclear Information System (INIS)

    Wu, Y.

    1994-01-01

    We derive the occupation-number distribution in a generalized ideal gas of particles obeying fractional statistics, including mutual statistics, by adopting a state-counting definition. When there is no mutual statistics, the statistical distribution interpolates between bosons and fermions, and respects a fractional exclusion principle (except for bosons). Anyons in a strong magnetic field at low temperatures constitute such a physical system. Applications to the thermodynamic properties of quasiparticle excitations in the Laughlin quantum Hall fluid are discussed
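
    In the absence of mutual statistics, the occupation number takes the standard form derived in this line of work: n = 1/(w + g), where w solves the transcendental equation w^g (1 + w)^(1-g) = e^((ε-μ)/kT) and g is the exclusion-statistics parameter. A small sketch that solves this by bisection and checks the Bose (g = 0) and Fermi (g = 1) limits:

    ```python
    import math

    def wu_occupation(x, g):
        """Mean occupation for exclusion-statistics parameter g (0 = Bose,
        1 = Fermi), with x = (eps - mu)/kT > 0.  Solves the transcendental
        equation w^g (1+w)^(1-g) = e^x by log-space bisection, then n = 1/(w+g)."""
        if g == 0.0:
            return 1.0 / (math.exp(x) - 1.0)       # Bose-Einstein limit
        f = lambda w: g * math.log(w) + (1.0 - g) * math.log1p(w) - x
        lo, hi = 1e-15, 1e15                       # f(lo) < 0 < f(hi)
        for _ in range(200):
            mid = math.sqrt(lo * hi)               # bisect in log space
            if f(mid) < 0.0:
                lo = mid
            else:
                hi = mid
        return 1.0 / (0.5 * (lo + hi) + g)

    x = 1.0
    print(wu_occupation(x, 1.0))   # Fermi-Dirac: 1/(e^x + 1)
    print(wu_occupation(x, 0.0))   # Bose-Einstein: 1/(e^x - 1)
    print(wu_occupation(x, 0.5))   # semion case, between the two
    ```

    The g = 1/2 value lying strictly between the Fermi and Bose results illustrates the interpolation noted in the abstract.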

  8. Childhood Cancer Statistics

    Science.gov (United States)

    Childhood Cancer Statistics – Graphs and Infographics: Number of Diagnoses, Incidence Rates ...

  9. Nonparametric statistical inference

    CERN Document Server

    Gibbons, Jean Dickinson

    2010-01-01

    Overall, this remains a very fine book suitable for a graduate-level course in nonparametric statistics. I recommend it for all people interested in learning the basic ideas of nonparametric statistical inference.-Eugenia Stoimenova, Journal of Applied Statistics, June 2012… one of the best books available for a graduate (or advanced undergraduate) text for a theory course on nonparametric statistics. … a very well-written and organized book on nonparametric statistics, especially useful and recommended for teachers and graduate students.-Biometrics, 67, September 2011This excellently presente

  10. Statistics for Research

    CERN Document Server

    Dowdy, Shirley; Chilko, Daniel

    2011-01-01

    Praise for the Second Edition "Statistics for Research has other fine qualities besides superior organization. The examples and the statistical methods are laid out with unusual clarity by the simple device of using special formats for each. The book was written with great care and is extremely user-friendly."-The UMAP Journal Although the goals and procedures of statistical research have changed little since the Second Edition of Statistics for Research was published, the almost universal availability of personal computers and statistical computing application packages have made it possible f

  11. Head First Statistics

    CERN Document Server

    Griffiths, Dawn

    2009-01-01

    Wouldn't it be great if there were a statistics book that made histograms, probability distributions, and chi square analysis more enjoyable than going to the dentist? Head First Statistics brings this typically dry subject to life, teaching you everything you want and need to know about statistics through engaging, interactive, and thought-provoking material, full of puzzles, stories, quizzes, visual aids, and real-world examples. Whether you're a student, a professional, or just curious about statistical analysis, Head First's brain-friendly formula helps you get a firm grasp of statistics

  12. Mating motives are neither necessary nor sufficient to create the beauty premium.

    Science.gov (United States)

    Hafenbrädl, Sebastian; Dana, Jason

    2017-01-01

    Mating motives lead decision makers to favor attractive people, but this favoritism is not sufficient to create a beauty premium in competitive settings. Further, economic approaches to discrimination, when correctly characterized, could neatly accommodate the experimental and field evidence of a beauty premium. Connecting labor economics and evolutionary psychology is laudable, but mating motives do not explain the beauty premium.

  13. Sufficient Descent Conjugate Gradient Methods for Solving Convex Constrained Nonlinear Monotone Equations

    Directory of Open Access Journals (Sweden)

    San-Yang Liu

    2014-01-01

    Full Text Available Two unified frameworks of some sufficient descent conjugate gradient methods are considered. Combined with the hyperplane projection method of Solodov and Svaiter, they are extended to solve convex constrained nonlinear monotone equations. Their global convergence is proven under some mild conditions. Numerical results illustrate that these methods are efficient and can be applied to solve large-scale nonsmooth equations.
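
    A minimal sketch of the general scheme such methods share: a conjugate-gradient-type search direction with a sufficient-descent safeguard, a derivative-free backtracking line search, and the hyperplane projection step of Solodov and Svaiter. The direction update and parameter values below are illustrative choices, not the specific formulas of the paper, and the test mapping F(x) = x + sin(x) is a monotone equation with root at the origin (unconstrained case):

    ```python
    import numpy as np

    def F(x):
        """A monotone test mapping with unique root at the origin."""
        return x + np.sin(x)

    def cg_projection_solve(x, tol=1e-8, rho=0.5, sigma=1e-4, max_iter=500):
        Fx = F(x)
        d = -Fx
        for _ in range(max_iter):
            if np.linalg.norm(Fx) < tol:
                break
            # derivative-free backtracking line search along d
            alpha = 1.0
            while -F(x + alpha * d) @ d < sigma * alpha * (d @ d):
                alpha *= rho
            z = x + alpha * d
            Fz = F(z)
            if np.linalg.norm(Fz) < tol:
                x, Fx = z, Fz
                break
            # Solodov-Svaiter step: project x onto {y : F(z).(y - z) = 0}
            x = x - (Fz @ (x - z)) / (Fz @ Fz) * Fz
            Fx_new = F(x)
            # PRP-type direction with a sufficient-descent restart safeguard
            beta = Fx_new @ (Fx_new - Fx) / (Fx @ Fx)
            d = -Fx_new + beta * d
            if Fx_new @ d > -1e-10 * (Fx_new @ Fx_new):
                d = -Fx_new
            Fx = Fx_new
        return x

    x = cg_projection_solve(np.array([2.0, -1.5, 0.7]))
    print(np.linalg.norm(F(x)))  # near zero
    ```

    Because only values of F are used (no Jacobian), the scheme suits the large-scale nonsmooth setting the abstract mentions.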

  14. Hydrogen scrambling in ethane induced by intense laser fields: statistical analysis of coincidence events.

    Science.gov (United States)

    Kanya, Reika; Kudou, Tatsuya; Schirmel, Nora; Miura, Shun; Weitzel, Karl-Michael; Hoshina, Kennosuke; Yamanouchi, Kaoru

    2012-05-28

    Two-body Coulomb explosion processes of ethane (CH(3)CH(3)) and its isotopomers (CD(3)CD(3) and CH(3)CD(3)) induced by an intense laser field (800 nm, 1.0 × 10(14) W/cm(2)) with three different pulse durations (40 fs, 80 fs, and 120 fs) are investigated by a coincidence momentum imaging method. On the basis of statistical treatment of the coincidence data, the contributions from false coincidence events are estimated and the relative yields of the decomposition pathways are determined with sufficiently small uncertainties. The branching ratios of the two body decomposition pathways of CH(3)CD(3) from which triatomic hydrogen molecular ions (H(3)(+), H(2)D(+), HD(2)(+), D(3)(+)) are ejected show that protons and deuterons within CH(3)CD(3) are scrambled almost statistically prior to the ejection of a triatomic hydrogen molecular ion. The branching ratios were estimated by statistical Rice-Ramsperger-Kassel-Marcus calculations by assuming a transition state with a hindered-rotation of a diatomic hydrogen moiety. The hydrogen scrambling dynamics followed by the two body decomposition processes are discussed also by using the anisotropies in the ejection directions of the fragment ions and the kinetic energy distribution of the two body decomposition pathways.
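
    The "almost statistical" scrambling benchmark can be made concrete: if the three protons and three deuterons of CH(3)CD(3) are fully randomized before a triatomic hydrogen molecular ion is ejected, the branching ratios follow hypergeometric counting, giving H3(+) : H2D(+) : HD2(+) : D3(+) = 1 : 9 : 9 : 1:

    ```python
    from math import comb

    # Fully statistical scrambling of CH3CD3's three protons and three
    # deuterons: the ejected triatomic ion takes 3 of the 6 nuclei at
    # random, i.e. hypergeometric counting.
    total = comb(6, 3)                      # 20 equally likely triples
    labels = ["H3+", "H2D+", "HD2+", "D3+"]
    branching = {lab: comb(3, 3 - k) * comb(3, k) / total
                 for k, lab in enumerate(labels)}
    print(branching)  # {'H3+': 0.05, 'H2D+': 0.45, 'HD2+': 0.45, 'D3+': 0.05}
    ```

    This is the statistical limit against which the measured coincidence yields are compared.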

  15. A Nineteenth Century Statistical Society that Abandoned Statistics

    NARCIS (Netherlands)

    Stamhuis, I.H.

    2007-01-01

    In 1857, a Statistical Society was founded in the Netherlands. Within this society, statistics was considered a systematic, quantitative, and qualitative description of society. In the course of time, the society attracted a wide and diverse membership, although the number of physicians on its rolls

  16. TGF-β Signaling Is Necessary and Sufficient for Pharyngeal Arch Artery Angioblast Formation

    Directory of Open Access Journals (Sweden)

    Maryline Abrial

    2017-07-01

    Full Text Available The pharyngeal arch arteries (PAAs) are transient embryonic blood vessels that mature into critical segments of the aortic arch and its branches. Although defects in PAA development cause life-threatening congenital cardiovascular defects, the molecular mechanisms that orchestrate PAA morphogenesis remain unclear. Through small-molecule screening in zebrafish, we identified TGF-β signaling as indispensable for PAA development. Specifically, chemical inhibition of the TGF-β type I receptor ALK5 impairs PAA development because nkx2.5+ PAA progenitor cells fail to differentiate into tie1+ angioblasts. Consistent with this observation, we documented a burst of ALK5-mediated Smad3 phosphorylation within PAA progenitors that foreshadows angioblast emergence. Remarkably, premature induction of TGF-β receptor activity stimulates precocious angioblast differentiation, thereby demonstrating the sufficiency of this pathway for initiating the PAA progenitor to angioblast transition. More broadly, these data uncover TGF-β as a rare signaling pathway that is necessary and sufficient for angioblast lineage commitment.

  17. Flow prediction models using macroclimatic variables and multivariate statistical techniques in the Cauca River Valley

    International Nuclear Information System (INIS)

    Carvajal Escobar Yesid; Munoz, Flor Matilde

    2007-01-01

    This project centres on a review of the state of the art regarding the ocean-atmospheric phenomena that affect Colombian hydrology, especially the ENSO phenomenon, which has a socioeconomic impact of the first order in our country and has not been sufficiently studied. It is therefore important to address this theme by including the macroclimatic variables associated with ENSO in water-planning analyses. The analyses include a review of statistical techniques for checking the consistency of hydrological data, with the objective of assembling a reliable and homogeneous database of monthly flows of the Cauca River. Multivariate statistical methods, specifically principal component analysis, are used in the development of models for predicting monthly mean flows in the Cauca River, covering both linear approaches, such as the autoregressive models AR, ARX and ARMAX, and a nonlinear approach, artificial neural networks.
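
    The modelling chain described (principal components of macroclimatic variables feeding a linear ARX-type flow model) can be sketched as follows. The data are synthetic, and the "ENSO-like" index, lag structure, and coefficients are invented for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic monthly series: an "ENSO-like" index observed through
    # five noisy, correlated macroclimatic variables
    n = 240
    enso = np.sin(2 * np.pi * np.arange(n) / 48) + 0.3 * rng.normal(size=n)
    climate = np.column_stack([enso + 0.1 * rng.normal(size=n) for _ in range(5)])

    # First principal component of the standardized climate variables
    Z = (climate - climate.mean(0)) / climate.std(0)
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    pc1 = Z @ Vt[0]

    # ARX(1) flow model: q_t = c + a*q_{t-1} + b*pc1_{t-1} + e_t, fit by OLS
    flow = np.zeros(n)
    for t in range(1, n):
        flow[t] = 10 + 0.5 * flow[t - 1] - 2.0 * pc1[t - 1] + 0.2 * rng.normal()
    X = np.column_stack([np.ones(n - 1), flow[:-1], pc1[:-1]])
    coef, *_ = np.linalg.lstsq(X, flow[1:], rcond=None)
    print(coef)  # close to [10, 0.5, -2.0]
    ```

    Using the leading principal component rather than all raw indices reduces collinearity among the macroclimatic predictors, which is the usual motivation for combining PCA with AR/ARX/ARMAX models.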

  18. Statistics for economics

    CERN Document Server

    Naghshpour, Shahdad

    2012-01-01

    Statistics is the branch of mathematics that deals with real-life problems. As such, it is an essential tool for economists. Unfortunately, the way you and many other economists learn the concept of statistics is not compatible with the way economists think and learn. The problem is worsened by the use of mathematical jargon and complex derivations. Here's a book that proves none of this is necessary. All the examples and exercises in this book are constructed within the field of economics, thus eliminating the difficulty of learning statistics with examples from fields that have no relation to business, politics, or policy. Statistics is, in fact, not more difficult than economics. Anyone who can comprehend economics can understand and use statistics successfully within this field, including you! This book utilizes Microsoft Excel to obtain statistical results, as well as to perform additional necessary computations. Microsoft Excel is not the software of choice for performing sophisticated statistical analy...

  19. The necessary and sufficient conditions of therapeutic personality change: Reactions to Rogers' 1957 article.

    Science.gov (United States)

    Samstag, Lisa Wallner

    2007-09-01

    Carl Rogers' article (see record 2007-14639-002) on the necessary and sufficient conditions for personality change has had a significant impact on the field of psychotherapy and psychotherapy research. He emphasized the client as arbiter of his or her own subjective experience and tested his hypothesized therapist-offered conditions of change using recorded sessions. This aided in demystifying the therapeutic process and led to a radical shift in the listening stance of the therapist. I briefly outline my views regarding the influence of the ideas presented in this work, describe the intellectual and cultural context of the times, and discuss a number of ways in which the therapist-offered conditions for psychological transformation are neither necessary nor sufficient. (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  20. Arterial stiffening provides sufficient explanation for primary hypertension.

    Directory of Open Access Journals (Sweden)

    Klas H Pettersen

    2014-05-01

    Full Text Available Hypertension is one of the most common age-related chronic disorders, and by predisposing individuals for heart failure, stroke, and kidney disease, it is a major source of morbidity and mortality. Its etiology remains enigmatic despite intense research efforts over many decades. By use of empirically well-constrained computer models describing the coupled function of the baroreceptor reflex and mechanics of the circulatory system, we demonstrate quantitatively that arterial stiffening seems sufficient to explain age-related emergence of hypertension. Specifically, the empirically observed chronic changes in pulse pressure with age and the impaired capacity of hypertensive individuals to regulate short-term changes in blood pressure arise as emergent properties of the integrated system. The results are consistent with available experimental data from chemical and surgical manipulation of the cardio-vascular system. In contrast to widely held opinions, the results suggest that primary hypertension can be attributed to a mechanogenic etiology without challenging current conceptions of renal and sympathetic nervous system function.

  1. Necessary and sufficient liveness condition of GS3PR Petri nets

    Science.gov (United States)

    Liu, GaiYun; Barkaoui, Kamel

    2015-05-01

    Structural analysis is one of the most important and efficient methods to investigate the behaviour of Petri nets. Liveness is a significant behavioural property of Petri nets. Siphons, as structural objects of a Petri net, are closely related to its liveness. Many deadlock control policies for flexible manufacturing systems (FMS) modelled by Petri nets are implemented via siphon control. Most of the existing methods design liveness-enforcing supervisors by adding control places for siphons based on their controllability conditions. To compute a liveness-enforcing supervisor with as permissive behaviour as possible, it is both theoretically and practically significant to find an exact controllability condition for siphons. However, the existing conditions, max-, max′-, and max″-controllability of siphons, are all overly restrictive and generally sufficient only. This paper develops a new condition called max*-controllability of the siphons in generalised systems of simple sequential processes with resources (GS3PR), which are a net subclass that can model many real-world automated manufacturing systems. We show that a GS3PR is live if all its strict minimal siphons (SMS) are max*-controlled. Compared with the existing conditions, i.e., max-, max′-, and max″-controllability of siphons, max*-controllability of the SMS is not only sufficient but also necessary. An example is used to illustrate the proposed method.

  2. Statistical data analysis using SAS intermediate statistical methods

    CERN Document Server

    Marasinghe, Mervyn G

    2018-01-01

    The aim of this textbook (previously titled SAS for Data Analytics) is to teach the use of SAS for statistical analysis of data for advanced undergraduate and graduate students in statistics, data science, and disciplines involving analyzing data. The book begins with an introduction beyond the basics of SAS, illustrated with non-trivial, real-world, worked examples. It proceeds to SAS programming and applications, SAS graphics, statistical analysis of regression models, analysis of variance models, analysis of variance with random and mixed effects models, and then takes the discussion beyond regression and analysis of variance to conclude. Pedagogically, the authors introduce theory and methodological basis topic by topic, present a problem as an application, followed by a SAS analysis of the data provided and a discussion of results. The text focuses on applied statistical problems and methods. Key features include: end of chapter exercises, downloadable SAS code and data sets, and advanced material suitab...

  3. Neuroendocrine Tumor: Statistics

    Science.gov (United States)

    Approved by the Cancer.Net Editorial Board , 01/ ... the body. It is important to remember that statistics on the survival rates for people with a ...

  4. Design of fuel cell powered data centers for sufficient reliability and availability

    Science.gov (United States)

    Ritchie, Alexa J.; Brouwer, Jacob

    2018-04-01

    It is challenging to design a sufficiently reliable fuel cell electrical system for use in data centers, which require 99.9999% uptime. Such a system could lower emissions and increase data center efficiency, but the reliability and availability of such a system must be analyzed and understood. Currently, extensive backup equipment is used to ensure electricity availability. The proposed design alternative uses multiple fuel cell systems each supporting a small number of servers to eliminate backup power equipment provided the fuel cell design has sufficient reliability and availability. Potential system designs are explored for the entire data center and for individual fuel cells. Reliability block diagram analysis of the fuel cell systems was accomplished to understand the reliability of the systems without repair or redundant technologies. From this analysis, it was apparent that redundant components would be necessary. A program was written in MATLAB to show that the desired system reliability could be achieved by a combination of parallel components, regardless of the number of additional components needed. Having shown that the desired reliability was achievable through some combination of components, a dynamic programming analysis was undertaken to assess the ideal allocation of parallel components.
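
    The core of the reliability-block argument for parallel redundancy is that n independent units, each with availability a, are jointly unavailable only when all n are down, so combined availability is 1 - (1 - a)^n. A Python sketch of the MATLAB-style search described (the 99% single-unit availability is an illustrative figure, not from the study):

    ```python
    def parallel_availability(a_single, n):
        """Availability of n independent redundant units: 1 - P(all down)."""
        return 1.0 - (1.0 - a_single) ** n

    def units_needed(a_single, target=0.999999):
        """Smallest number of parallel units reaching the availability target."""
        n = 1
        while parallel_availability(a_single, n) < target:
            n += 1
        return n

    n = units_needed(0.99)   # hypothetical 99%-available single fuel cell system
    print(n, parallel_availability(0.99, n))
    ```

    The same search can be repeated per component of the reliability block diagram to allocate redundancy where it buys the most availability.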

  5. Fuel self-sufficient and low proliferation risk multi-recycling of spent fuel

    International Nuclear Information System (INIS)

    Cho, N. Z.; Hong, S. G.; Kim, T. H.; Greenspan, E.; Kastenberg, W. E.

    1998-01-01

    A preliminary feasibility study has been performed in search of promising nuclear energy systems which could make efficient use of the spent fuel from LWRs and be proliferation resistant. The energy systems considered consist of a dry process and a fuel-self-sufficient reactor, which are synergistic. D2O, H2O and Pb (or Pb-Bi) are considered for the coolant. The most promising system identified consists of Pb-cooled reactors with either an AIROX or an IFR-like reprocessing. H2O-cooled (possibly mixed with D2O) reactors can be designed to be fuel-self-sufficient and multi-recycle LWR spent fuel, provided they are accelerator driven. Moderator-free, D2O-cooled critical reactors can multi-recycle Th-233U fuel using IFR-type reprocessing; they are significantly more attractive than their thermal counterparts. H2O-cooled (possibly mixed with D2O), accelerator-driven reactors appear attractive for converting Th into denatured 233U using LWR spent fuel and the IFR process. The CANDU reactor technology appears highly synergistic with accelerator-driven systems. (author). 25 refs., 3 tabs., 6 figs

  6. Usage Statistics

    Science.gov (United States)

    ... this page: https://medlineplus.gov/usestatistics.html MedlinePlus Statistics ... Quarterly User Statistics (Quarter, Page Views, Unique Visitors): Oct-Dec-98 ...

  7. Semiconductor statistics

    CERN Document Server

    Blakemore, J S

    1962-01-01

    Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co

  8. Statistics without Tears: Complex Statistics with Simple Arithmetic

    Science.gov (United States)

    Smith, Brian

    2011-01-01

    One of the often overlooked aspects of modern statistics is the analysis of time series data. Modern introductory statistics courses tend to rush to probabilistic applications involving risk and confidence. Rarely does the first level course linger on such useful and fascinating topics as time series decomposition, with its practical applications…

  9. Stochastic simulations for the time evolution of systems which obey generalized statistics: fractional exclusion statistics and Gentile's statistics

    International Nuclear Information System (INIS)

    Nemnes, G A; Anghel, D V

    2010-01-01

    We present a stochastic method for the simulation of the time evolution in systems which obey generalized statistics, namely fractional exclusion statistics and Gentile's statistics. The transition rates are derived in the framework of canonical ensembles. This approach introduces a tool for describing interacting fermionic and bosonic systems in non-equilibrium as ideal FES systems, in a computationally efficient manner. The two types of statistics are analyzed comparatively, indicating their intrinsic thermodynamic differences and revealing key aspects related to the species size
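
    Gentile's statistics caps the occupation of each single-particle state at some maximum M, so the equilibrium mean occupation of a level follows from a truncated Boltzmann sum; M = 1 recovers Fermi-Dirac and large M approaches Bose-Einstein. A small sketch of this equilibrium benchmark (single-level canonical weights only; this is not the paper's transition-rate simulation):

    ```python
    import math

    def gentile_occupation(x, M):
        """Mean occupation of one level with x = (eps - mu)/kT when at most M
        particles may occupy it: a Boltzmann sum truncated at M, equal in
        closed form to 1/(e^x - 1) - (M+1)/(e^((M+1)x) - 1)."""
        weights = [math.exp(-k * x) for k in range(M + 1)]
        return sum(k * w for k, w in enumerate(weights)) / sum(weights)

    x = 0.5
    print(gentile_occupation(x, 1), 1.0 / (math.exp(x) + 1.0))    # M=1: Fermi-Dirac
    print(gentile_occupation(x, 500), 1.0 / (math.exp(x) - 1.0))  # large M: Bose-Einstein
    ```

    A stochastic simulation of the kind described in the abstract should relax toward these mean occupations, which makes them a convenient correctness check for the sampled dynamics.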

  10. Direct numerical simulation and statistical analysis of turbulent convection in lead-bismuth

    Energy Technology Data Exchange (ETDEWEB)

    Otic, I.; Grotzbach, G. [Forschungszentrum Karlsruhe GmbH, Institut fuer Kern-und Energietechnik (Germany)

    2003-07-01

    Improved turbulent heat flux models are required to develop and analyze the reactor concept of a lead-bismuth cooled Accelerator-Driven System. Because of the specific properties of many liquid metals, there are still no sensors for accurate measurements of the high-frequency velocity fluctuations. The development of the turbulent heat transfer models required in our CFD (computational fluid dynamics) tools therefore also needs data from direct numerical simulations of turbulent flows. We use new simulation results for the model problem of Rayleigh-Benard convection to show some peculiarities of turbulent natural convection in lead-bismuth (Pr = 0.025). Simulations of this flow at sufficiently large turbulence levels became feasible only recently because the flow requires the resolution of very small velocity scales while recording long-wave structures for the slow changes in the convective temperature field. The results are analyzed regarding the principal convection and heat transfer features. They are also used to perform statistical analyses showing that the currently available modeling is indeed not adequate for these fluids. Based on the knowledge of the details of the statistical features of turbulence in this convection type and using the two-point correlation technique, a proposal for an improved statistical turbulence model is developed which is expected to account better for the peculiarities of heat transfer in turbulent convection in low Prandtl number fluids. (authors)

  11. Can British Columbia Achieve Electricity Self-Sufficiency and Meet its Renewable Portfolio Standard?

    NARCIS (Netherlands)

    Sopinka, A.; Kooten, van G.C.; Wong, L.

    2012-01-01

    British Columbia’s energy policy is at a crossroads; the province has set a goal of electricity self-sufficiency, a 93% renewable portfolio standard, and a provincial natural gas strategy that could increase electricity consumption by 2,500-3,800 MW. To ascertain the reality of BC’s supply position, we

  12. Energy Strategic Planning & Self-Sufficiency Project

    Energy Technology Data Exchange (ETDEWEB)

    Greg Retzlaff

    2005-03-30

    This report provides information regarding options available, their advantages and disadvantages, and the costs for pursuing activities to advance Smith River Rancheria toward an energy program that reduces their energy costs, allows greater self-sufficiency and stimulates economic development and employment opportunities within and around the reservation. The primary subjects addressed in this report are as follows: (1) Baseline Assessment of Current Energy Costs--An evaluation of the historical energy costs for Smith River was conducted to identify the costs for each component of their energy supply to better assess changes that can be considered for energy cost reductions. (2) Research Viable Energy Options--This includes a general description of many power generation technologies and identification of their relative costs, advantages and disadvantages. Through this research the generation technology options that are most suited for this application were identified. (3) Project Development Considerations--The basic steps and associated challenges of developing a generation project utilizing the selected technologies are identified and discussed. This included items like selling to third parties, wheeling, electrical interconnections, fuel supply, permitting, standby power, and transmission studies. (4) Energy Conservation--The myriad of federal, state and utility programs offered for low-income weatherization and utility bill payment assistance are identified, their qualification requirements discussed, and the subsequent benefits outlined. (5) Establishing an Energy Organization--The report includes a high level discussion of formation of a utility to serve the Tribal membership. The value or advantages of such action are discussed along with some of the challenges. (6) Training--Training opportunities available to the Tribal membership are identified.

  13. The sufficiency assumption of the reasoned approach to action

    Directory of Open Access Journals (Sweden)

    David Trafimow

    2015-12-01

    Full Text Available The reasoned action approach to understanding and predicting behavior includes the sufficiency assumption. Although variables not included in the theory may influence behavior, these variables work through the variables in the theory. Once the reasoned action variables are included in an analysis, the inclusion of other variables will not increase the variance accounted for in behavioral intentions or behavior. Reasoned action researchers are very concerned with testing whether new variables account for variance (or how much variance the traditional variables account for), to see whether they are important in general or with respect to specific behaviors under investigation. But this approach tacitly assumes that accounting for variance is highly relevant to understanding the production of variance, which is what is really at issue. Based on the variance law, I question this assumption.
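
    The sufficiency assumption can be caricatured as a nested-regression exercise: once intentions are in the model, an external variable that works only through intentions should add (almost) no explained variance. The following is a toy simulation of that logic, not the author's analysis; the variable names, effect sizes, and data are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Toy data in which "attitude" influences behavior ONLY through "intention".
attitude = rng.normal(size=n)
intention = 0.8 * attitude + 0.2 * rng.normal(size=n)
behavior = 0.9 * intention + 0.3 * rng.normal(size=n)

def r_squared(y, predictors):
    """R^2 of an OLS fit with intercept."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

r2_base = r_squared(behavior, [intention])            # intention only
r2_full = r_squared(behavior, [intention, attitude])  # add external variable

# Adding a regressor never lowers R^2, but here the increment is tiny,
# consistent with the sufficiency assumption in this constructed example.
print(r2_base, r2_full, r2_full - r2_base)
```

    The point of the abstract, of course, is that a near-zero increment in accounted-for variance does not by itself settle how the variance is produced.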

  14. Reversible Statistics

    DEFF Research Database (Denmark)

    Tryggestad, Kjell

    2004-01-01

    The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work...... within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit...... in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved...

  15. Soil nuclide distribution coefficients and their statistical distributions

    International Nuclear Information System (INIS)

    Sheppard, M.I.; Beals, D.I.; Thibault, D.H.; O'Connor, P.

    1984-12-01

    Environmental assessments of the disposal of nuclear fuel waste in plutonic rock formations require analysis of the migration of nuclides from the disposal vault to the biosphere. Analyses of nuclide migration via groundwater through the disposal vault, the buffer and backfill, the plutonic rock, and the consolidated and unconsolidated overburden use models requiring distribution coefficients (Kd) to describe the interaction of the nuclides with the geological and man-made materials. This report presents element-specific soil distribution coefficients and their statistical distributions, based on a detailed survey of the literature. Radioactive elements considered were actinium, americium, bismuth, calcium, carbon, cerium, cesium, iodine, lead, molybdenum, neptunium, nickel, niobium, palladium, plutonium, polonium, protactinium, radium, samarium, selenium, silver, strontium, technetium, terbium, thorium, tin, uranium and zirconium. Stable elements considered were antimony, boron, cadmium, tellurium and zinc. Where sufficient data were available, distribution coefficients and their distributions are given for sand, silt, clay and organic soils. Our values are recommended for use in assessments for the Canadian Nuclear Fuel Waste Management Program
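
    A distribution coefficient is simply the ratio of the nuclide concentration sorbed on the solid phase to its concentration in solution, and soil-to-soil variability in Kd is commonly summarized with lognormal statistics, i.e. a geometric mean. The numbers below are invented for illustration, not values from the report.

```python
import math

def kd(c_solid, c_solution):
    """Distribution coefficient Kd [L/kg] from the sorbed concentration
    c_solid [Bq/kg of dry soil] and the solution concentration c_solution [Bq/L]."""
    return c_solid / c_solution

# Hypothetical batch-sorption result: 50 Bq/kg on the soil, 2 Bq/L in solution.
print(kd(50.0, 2.0))  # 25.0 L/kg

# Kd values measured across different soils often span orders of magnitude
# and are treated as lognormally distributed, so the geometric mean is the
# natural central value for assessment models.
def geometric_mean(values):
    return 10 ** (sum(math.log10(v) for v in values) / len(values))

print(geometric_mean([10.0, 100.0, 1000.0]))  # 100.0
```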

  16. 41 CFR 102-5.95 - Is the comfort and/or convenience of an employee considered sufficient justification to authorize...

    Science.gov (United States)

    2010-07-01

    ... convenience of an employee considered sufficient justification to authorize home-to-work transportation? 102-5...-to-Work Transportation § 102-5.95 Is the comfort and/or convenience of an employee considered sufficient justification to authorize home-to-work transportation? No, the comfort and/or convenience of an...

  17. Maximizing Health or Sufficient Capability in Economic Evaluation? A Methodological Experiment of Treatment for Drug Addiction.

    Science.gov (United States)

    Goranitis, Ilias; Coast, Joanna; Day, Ed; Copello, Alex; Freemantle, Nick; Frew, Emma

    2017-07-01

    Conventional practice within the United Kingdom and beyond is to conduct economic evaluations with "health" as the evaluative space and "health maximization" as the decision-making rule. However, there is increasing recognition that this evaluative framework may not always be appropriate, particularly within public health and social care contexts. This article presents a methodological case study designed to explore the impact of changing the evaluative space within an economic evaluation from health to capability well-being, and the decision-making rule from health maximization to the maximization of sufficient capability. Capability well-being is an evaluative space grounded in Amartya Sen's capability approach, which assesses well-being based on individuals' ability to do and be the things they value in life. Sufficient capability is an egalitarian approach to decision making that aims to ensure everyone in society achieves a normatively sufficient level of capability well-being. The case study is treatment for drug addiction, and the cost-effectiveness of 2 psychological interventions relative to usual care is assessed using data from a pilot trial. Analyses are undertaken from a health care and a government perspective. For the purpose of the study, quality-adjusted life years (measured using the EQ-5D-5L), years of full capability equivalent, and years of sufficient capability equivalent (the latter two measured using the ICECAP-A [ICEpop CAPability measure for Adults]) are estimated. The study concludes that different evaluative spaces and decision-making rules have the potential to offer opposing treatment recommendations. The implications for policy makers are discussed.
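
    The contrast between maximization and sufficiency can be made concrete with a toy calculation. In the sketch below, a "year of sufficient capability" credits capability only up to a sufficiency threshold, so gains above the threshold count for nothing extra; the threshold value and the scoring are illustrative assumptions, not the instrument's actual tariff.

```python
def qalys(utilities, years):
    """Quality-adjusted life years: sum of health utility * duration."""
    return sum(u * t for u, t in zip(utilities, years))

def years_of_sufficient_capability(capabilities, years, threshold=0.75):
    """Credit capability only up to the sufficiency threshold (assumed
    0.75 here), so improvements above it add nothing -- an egalitarian
    rule focused on bringing everyone up to 'sufficient'."""
    return sum(min(c, threshold) / threshold * t
               for c, t in zip(capabilities, years))

# Two hypothetical patients observed for one year each.
caps = [0.9, 0.5]
durations = [1.0, 1.0]
print(years_of_sufficient_capability(caps, durations))  # 1.0 + 0.666...
```

    A maximizing rule would reward raising the 0.9 patient further; the sufficiency rule would not, which is why the two frameworks can rank treatments differently.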

  18. A Sufficient Condition on Convex Relaxation of AC Optimal Power Flow in Distribution Networks

    DEFF Research Database (Denmark)

    Huang, Shaojun; Wu, Qiuwei; Wang, Jianhui

    2016-01-01

    This paper proposes a sufficient condition for the convex relaxation of AC Optimal Power Flow (OPF) in radial distribution networks as a second order cone program (SOCP) to be exact. The condition requires that the allowed reverse power flow is only reactive or active, or none. Under the proposed sufficient condition, the feasible sub-injection region (power injections of nodes excluding the root node) of the AC OPF is convex. The exactness of the convex relaxation under the proposed condition is proved through constructing a group of monotonic series with limits, which ensures that the optimal solution of the SOCP can be converted to an optimal solution of the original AC OPF. The efficacy of the convex relaxation to solve the AC OPF is demonstrated by case studies of an optimal multi-period planning problem of electric vehicles (EVs) in distribution networks.
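
    In the branch flow formulation typically used for such relaxations, each line carries the constraint l >= (P^2 + Q^2) / v, and the relaxation is called exact when this holds with equality at the optimum. The checker below illustrates that definition on hand-made per-unit numbers; it is a schematic of the exactness test, not the paper's proof technique.

```python
def socp_relaxation_exact(P, Q, v, l, tol=1e-9):
    """Branch flow SOCP relaxation constraint: l >= (P^2 + Q^2) / v,
    where P, Q are the branch power flows, v the squared voltage at the
    sending node, and l the squared current magnitude. The relaxation is
    exact on this branch when the inequality is tight (equality holds)."""
    return abs(l - (P**2 + Q**2) / v) <= tol

# Hand-made branch quantities (per-unit, illustrative only):
print(socp_relaxation_exact(P=0.3, Q=0.1, v=1.0, l=0.10))  # tight: exact
print(socp_relaxation_exact(P=0.3, Q=0.1, v=1.0, l=0.25))  # slack: not tight
```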

  19. Statistical Physics

    CERN Document Server

    Wannier, Gregory Hugh

    1966-01-01

    Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics.Designed for

  20. India's baseline plan for nuclear energy self-sufficiency

    International Nuclear Information System (INIS)

    Bucher, R.G.

    2009-01-01

    India's nuclear energy strategy has traditionally strived for energy self-sufficiency, driven largely by necessity following trade restrictions imposed by the Nuclear Suppliers Group (NSG) following India's 'peaceful nuclear explosion' of 1974. On September 6, 2008, the NSG agreed to create an exception opening nuclear trade with India, which may create opportunities for India to modify its baseline strategy. The purpose of this document is to describe India's 'baseline plan,' which was developed under constrained trade conditions, as a basis for understanding changes in India's path as a result of the opening of nuclear commerce. Note that this treatise is based upon publicly available information. No attempt is made to judge whether India can meet specified goals either in scope or schedule. In fact, the reader is warned a priori that India's delivery of stated goals has often fallen short or taken a significantly longer period to accomplish. It has been evident since the early days of nuclear power that India's natural resources would determine the direction of its civil nuclear power program. Its modest uranium but vast thorium reserves dictated that the country's primary objective would be thorium utilization. Estimates of India's natural deposits vary appreciably, but its uranium reserves are known to be extremely limited, totaling approximately 80,000 tons, on the order of 1% of the world's deposits; and nominally one-third of this ore is of very low uranium concentration. However, India's roughly 300,000 tons of thorium reserves account for approximately 30% of the world's total. Confronted with this reality, the future of India's nuclear power industry is strongly dependent on the development of a thorium-based nuclear fuel cycle as the only way to ensure a stable, sustainable, and autonomous program. The path to India's nuclear energy self-sufficiency was first outlined in a seminal paper by Drs. H. J. Bhabha and N. B. Prasad presented at the Second

  1. A sufficiency property arising from the characterization of extremes of Markov chains

    OpenAIRE

    Bortot, Paola; Coles, Stuart

    2000-01-01

    At extreme levels, it is known that for a particular choice of marginal distribution, transitions of a Markov chain behave like a random walk. For a broad class of Markov chains, we give a characterization for the step length density of the limiting random walk, which leads to an interesting sufficiency property. This representation also leads us to propose a new technique for kernel density estimation for this class of models.

  2. Breakthroughs in statistics

    CERN Document Server

    Johnson, Norman

    This is the third volume of a collection of seminal papers in the statistical sciences written during the past 110 years. These papers have each had an outstanding influence on the development of statistical theory and practice over the last century. Each paper is preceded by an introduction written by an authority in the field, providing background information and assessing its influence. Volume III concentrates on articles from the 1980s while including some earlier articles not included in Volumes I and II. Samuel Kotz is Professor of Statistics in the College of Business and Management at the University of Maryland. Norman L. Johnson is Professor Emeritus of Statistics at the University of North Carolina. Also available: Breakthroughs in Statistics Volume I: Foundations and Basic Theory Samuel Kotz and Norman L. Johnson, Editors 1993. 631 pp. Softcover. ISBN 0-387-94037-5 Breakthroughs in Statistics Volume II: Methodology and Distribution Samuel Kotz and Norman L. Johnson, Edi...

  3. Reducing statistics anxiety and enhancing statistics learning achievement: effectiveness of a one-minute strategy.

    Science.gov (United States)

    Chiou, Chei-Chang; Wang, Yu-Min; Lee, Li-Tze

    2014-08-01

    Statistical knowledge is widely used in academia; however, statistics teachers struggle with the issue of how to reduce students' statistics anxiety and enhance students' statistics learning. This study assesses the effectiveness of a "one-minute paper strategy" in reducing students' statistics-related anxiety and in improving students' statistics-related achievement. Participants were 77 undergraduates from two classes enrolled in applied statistics courses. An experiment was implemented according to a pretest/posttest comparison group design. The quasi-experimental design showed that the one-minute paper strategy significantly reduced students' statistics anxiety and improved students' statistics learning achievement. The strategy was a better instructional tool than the textbook exercise for reducing students' statistics anxiety and improving students' statistics achievement.

  4. The role of ensemble-based statistics in variational assimilation of cloud-affected observations from infrared imagers

    Science.gov (United States)

    Hacker, Joshua; Vandenberghe, Francois; Jung, Byoung-Jo; Snyder, Chris

    2017-04-01

    Effective assimilation of cloud-affected radiance observations from space-borne imagers, with the aim of improving cloud analysis and forecasting, has proven to be difficult. Large observation biases, nonlinear observation operators, and non-Gaussian innovation statistics present many challenges. Ensemble-variational data assimilation (EnVar) systems offer the benefits of flow-dependent background error statistics from an ensemble, and the ability of variational minimization to handle nonlinearity. The specific benefits of ensemble statistics, relative to the static background errors more commonly used in variational systems, have not been quantified for the problem of assimilating cloudy radiances. A simple experiment framework is constructed with a regional NWP model and an operational variational data assimilation system to provide a basis for understanding the importance of ensemble statistics in cloudy radiance assimilation. Restricting the observations to those corresponding to clouds in the background forecast leads to innovations that are more Gaussian. The number of large innovations is reduced compared to the more general case of all observations, but not eliminated. The Huber norm is investigated to handle the fat tails of the distributions and to allow more observations to be assimilated without the need for strict background checks that eliminate them. Comparing assimilation using only ensemble background error statistics with assimilation using only static background error statistics elucidates the importance of the ensemble statistics. Although the cost functions in both experiments converge to similar values after sufficient outer-loop iterations, the resulting cloud water, ice, and snow content are greater in the ensemble-based analysis. The subsequent forecasts from the ensemble-based analysis also retain more condensed water species, indicating that the local environment is more supportive of clouds. In this presentation we provide details that explain the
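
    The Huber norm mentioned above blends a quadratic cost for small innovations with a linear cost for large ones, which keeps fat-tailed outliers from dominating the minimization. The sketch below shows the standard Huber function; the transition threshold is a tuning parameter, chosen arbitrarily here.

```python
def huber(r, delta=2.0):
    """Standard Huber cost: quadratic for |r| <= delta, linear beyond.
    Large innovations are penalized linearly instead of quadratically,
    so outliers pull the analysis less than under a pure L2 (least
    squares) cost, without discarding the observations outright."""
    a = abs(r)
    if a <= delta:
        return 0.5 * r * r
    return delta * (a - 0.5 * delta)

# Small innovation: behaves exactly like least squares.
print(huber(1.0))   # 0.5
# Large innovation: grows linearly (a pure L2 cost would give 12.5).
print(huber(5.0))   # 8.0
```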

  5. Virtual water and water self-sufficiency in agricultural and livestock products in Brazil.

    Science.gov (United States)

    da Silva, Vicente de Paulo R; de Oliveira, Sonaly D; Braga, Célia C; Brito, José Ivaldo B; de Sousa, Francisco de Assis S; de Holanda, Romildo M; Campos, João Hugo B C; de Souza, Enio P; Braga, Armando César R; Rodrigues Almeida, Rafaela S; de Araújo, Lincoln E

    2016-12-15

    Virtual water trade is often considered a solution for restricted water availability in many regions of the world. Brazil is the world leader in the production and export of various agricultural and livestock products. The country is either a strong net importer or a strong net exporter of these products. The objective of this study is to determine the volume of virtual water contained in agricultural and livestock products imported/exported by Brazil from 1997 to 2012, and to define the water self-sufficiency index of agricultural and livestock products in Brazil. The indexes of water scarcity (WSI), water dependency (WDI) and water self-sufficiency (WSSI) were calculated for each Brazilian state. These indexes and the virtual water balance were calculated following the methodology developed by Chapagain and Hoekstra (2008) and Hoekstra and Hung (2005). The total water exports and imports embedded in agricultural and livestock products were 5.28 × 10¹⁰ and 1.22 × 10¹⁰ Gm³ yr⁻¹, respectively, which results in a positive virtual water balance of 4.05 × 10¹⁰ Gm³ yr⁻¹. Brazil is either a strong net importer or a strong net exporter of agricultural and livestock products among the Mercosur countries. Brazil has a positive virtual water balance of 1.85 × 10¹⁰ Gm³ yr⁻¹. The indexes used in this study reveal that Brazil is self-sufficient in food production, except for a few products such as wheat and rice. Horticultural products (tomato, onion, potato, cassava and garlic) make up a unique product group with negative virtual water balance in Brazil. Copyright © 2016 Elsevier Ltd. All rights reserved.
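
    The bookkeeping behind these quantities is simple arithmetic on water volumes. The sketch below uses invented volumes, and the self-sufficiency index follows the Hoekstra/Chapagain-style definition only in simplified form; treat both as assumptions for illustration.

```python
def virtual_water_balance(exports, imports):
    """Net virtual water export; positive means net exporter."""
    return exports - imports

def water_self_sufficiency_index(internal_wf, net_import):
    """Simplified WSSI: share of the national water footprint met from
    internal water resources. Net imports are clipped at zero so a net
    exporter scores 1.0 (fully self-sufficient)."""
    external = max(net_import, 0.0)
    return internal_wf / (internal_wf + external)

# Invented volumes in Gm^3/yr (NOT the study's figures):
exports, imports = 5.0e10, 1.2e10
print(virtual_water_balance(exports, imports))  # 3.8e10: net exporter
print(water_self_sufficiency_index(2.0e10, imports - exports))  # 1.0
```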

  6. Short-Term Solar Irradiance Forecasting Model Based on Artificial Neural Network Using Statistical Feature Parameters

    Directory of Open Access Journals (Sweden)

    Hongshan Zhao

    2012-05-01

    Full Text Available Short-term solar irradiance forecasting (STSIF) is of great significance for the optimal operation and power prediction of grid-connected photovoltaic (PV) plants. However, STSIF is very complex to handle due to the random and nonlinear characteristics of solar irradiance under changeable weather conditions. Artificial Neural Networks (ANNs) are suitable for STSIF modeling and many research works on this topic have been presented, but the conciseness and robustness of the existing models still need to be improved. After discussing the relation between weather variations and irradiance, the characteristics of the statistical feature parameters of irradiance under different weather conditions are figured out. A novel ANN model using statistical feature parameters (ANN-SFP) for STSIF is proposed in this paper. The input vector is reconstructed with several statistical feature parameters of irradiance and the ambient temperature. Thus sufficient information can be effectively extracted from relatively few inputs and the model complexity is reduced. The model structure is determined by cross-validation (CV), and the Levenberg-Marquardt algorithm (LMA) is used for the network training. Simulations are carried out to validate and compare the proposed model with the conventional ANN model using historical data series (ANN-HDS), and the results indicate that the forecast accuracy is obviously improved under variable weather conditions.
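
    The idea of feeding statistical feature parameters instead of a raw history can be sketched simply: summarize an irradiance curve by a few moments and extremes and use that short vector as the network input. The particular features below are a guess at the kind of summary used, not the paper's exact list, and the sample values are invented.

```python
import math

def feature_vector(irradiance):
    """Compress an irradiance series (W/m^2) into a few statistical
    features usable as a compact ANN input vector. The choice of
    features here is illustrative only."""
    n = len(irradiance)
    mean = sum(irradiance) / n
    var = sum((x - mean) ** 2 for x in irradiance) / n
    return {
        "mean": mean,
        "std": math.sqrt(var),  # variability: broken clouds => larger
        "max": max(irradiance),
        "range": max(irradiance) - min(irradiance),
    }

# One hypothetical stretch of 5-minute samples under broken clouds:
samples = [120.0, 300.0, 180.0, 420.0, 380.0]
print(feature_vector(samples))
```

    Four numbers replace a long sample history, which is the source of the reduced model complexity the abstract claims.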

  7. Statistical Anxiety and Attitudes Towards Statistics: Development of a Comprehensive Danish Instrument

    DEFF Research Database (Denmark)

    Nielsen, Tine; Kreiner, Svend

    Motivated by experience with students’ psychological barriers to learning statistics, we modified and extended the Statistical Anxiety Rating Scale (STARS) to develop a contemporary Danish measure of attitudes and relationship to statistics for use with higher education students...... with evidence of DIF in all cases: One TCA-item functioned differentially relative to age, one WS-item functioned differentially relative to statistics course (first or second), and two IA-items functioned differentially relative to statistics course and academic discipline (sociology, public health...

  8. Necessary and Sufficient Process leading to Work Smart Standards. Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-11-01

    The Necessary and Sufficient Process leading to Work Smart Standards is a Department of Energy initiative to assure adequate protection for workers, the public, and the environment. The Work Smart Standards initiative directs the Laboratory to develop a set of ES and H standards based on the work performed at the Laboratory and the hazards associated with the work. Berkeley Lab's set of Work Smart Standards includes required Federal, State and local laws and, additionally, national and international standards which represent the highest operating standards of industrial and commercial institutions.

  9. Necessary and Sufficient Process leading to Work Smart Standards. Final report

    International Nuclear Information System (INIS)

    1996-11-01

    The Necessary and Sufficient Process leading to Work Smart Standards is a Department of Energy initiative to assure adequate protection for workers, the public, and the environment. The Work Smart Standards initiative directs the Laboratory to develop a set of ES and H standards based on the work performed at the Laboratory and the hazards associated with the work. Berkeley Lab's set of Work Smart Standards includes required Federal, State and local laws and, additionally, national and international standards which represent the highest operating standards of industrial and commercial institutions

  10. Ethics in Statistics

    Science.gov (United States)

    Lenard, Christopher; McCarthy, Sally; Mills, Terence

    2014-01-01

    There are many different aspects of statistics. Statistics involves mathematics, computing, and applications to almost every field of endeavour. Each aspect provides an opportunity to spark someone's interest in the subject. In this paper we discuss some ethical aspects of statistics, and describe how an introduction to ethics has been…

  11. Osteocalcin is necessary and sufficient to maintain muscle mass in older mice

    Directory of Open Access Journals (Sweden)

    Paula Mera

    2016-10-01

    Full Text Available Objective: A decrease in muscle protein turnover and therefore in muscle mass is a hallmark of aging. Because the circulating levels of the bone-derived hormone osteocalcin decline steeply during aging in mice, monkeys and humans, we asked here whether this hormone might regulate muscle mass as mice age. Methods: We examined muscle mass and strength in mice lacking osteocalcin (Ocn−/−) or its receptor in all cells (Gprc6a−/−) or specifically in myofibers (Gprc6aMck−/−), as well as in 9-month-old WT mice receiving exogenous osteocalcin for 28 days. We also examined protein synthesis in WT and Gprc6a−/− mouse myotubes treated with osteocalcin. Results: We show that osteocalcin signaling in myofibers is necessary to maintain muscle mass in older mice, in part because it promotes protein synthesis in myotubes without affecting protein breakdown. We further show that treatment with exogenous osteocalcin for 28 days is sufficient to increase muscle mass of 9-month-old WT mice. Conclusion: This study uncovers that osteocalcin is necessary and sufficient to prevent age-related muscle loss in mice. Keywords: Osteocalcin, Muscle mass, Aging

  12. Statistics

    International Nuclear Information System (INIS)

    1999-01-01

    For the year 1998 and the year 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  13. Statistics

    International Nuclear Information System (INIS)

    2001-01-01

    For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  14. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the year 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g., Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  15. Guidelines for the Review of Environmental-Related Legislation Regarding the Realisation of the Right to Access to Sufficient Food

    Directory of Open Access Journals (Sweden)

    Inge Snyman

    2015-12-01

    Full Text Available The development of legislation for the progressive realisation of the right to access to sufficient food is labelled as an international and national objective. Section 27(2) of the Constitution of the Republic of South Africa, 1996 assigns a compulsory mandate to the South African government to take reasonable legislative and other measures, within its available resources, to achieve the progressive realisation of the right to access to sufficient food. The United Nations' Food and Agricultural Organization (FAO) proposes a three-level strategy for the implementation of the right to food on a national legislative level, namely through: constitutional recognition, the implementation of a food framework law and the reviewing of relevant sectoral legislation. This contribution focuses on the last level of legislative provisioning, namely the reviewing of relevant sectoral legislation which influences, or possibly can influence, the realisation of the right to access to sufficient food. The right to access to sufficient food has multidimensional, interdisciplinary and cross-sectoral characteristics and consequently various sectors are involved in the realisation of the right to access to sufficient food. The FAO determines that the intended purpose will be to identify and review all sectoral legislation that might influence the availability, stability, access and adequacy of food, by means of a proposed reviewing process. The suggested reviewing process of the FAO is comprehensive and diverse; therefore the focus of this contribution is on the reviewing of relevant environmental-related legislation only. The FAO does not make recommendations with regard to the specific aspects that need to be incorporated in environmental-related legislation to contribute to the progressive realisation of the right to access to sufficient food (in other words, the aspects against which environmental-related legislation can be evaluated). Therefore this

  16. Multiparametric statistics

    CERN Document Server

    Serdobolskii, Vadim Ivanovich

    2007-01-01

    This monograph presents the mathematical theory of statistical models described by an essentially large number of unknown parameters, comparable with the sample size but possibly much larger. In this sense, the proposed theory can be called "essentially multiparametric". It is developed on the basis of the Kolmogorov asymptotic approach, in which the sample size increases along with the number of unknown parameters. This theory opens a way to the solution of central problems of multivariate statistics which up until now have not been solved. Traditional statistical methods based on the idea of infinite sampling often break down in the solution of real problems and, depending on the data, can be inefficient, unstable and even inapplicable. In this situation, practical statisticians are forced to use various heuristic methods in the hope that they will find a satisfactory solution. The mathematical theory developed in this book presents a regular technique for implementing new, more efficient versions of statistical procedures. ...

  17. Principles of statistics

    CERN Document Server

    Bulmer, M G

    1979-01-01

    There are many textbooks which describe current methods of statistical analysis, while neglecting related theory. There are equally many advanced textbooks which delve into the far reaches of statistical theory, while bypassing practical applications. But between these two approaches is an unfilled gap, in which theory and practice merge at an intermediate level. Professor M. G. Bulmer's Principles of Statistics, originally published in 1965, was created to fill that need. The new, corrected Dover edition of Principles of Statistics makes this invaluable mid-level text available once again fo

  18. Industrial statistics with Minitab

    CERN Document Server

    Cintas, Pere Grima; Llabres, Xavier Tort-Martorell

    2012-01-01

    Industrial Statistics with MINITAB demonstrates the use of MINITAB as a tool for performing statistical analysis in an industrial context. This book covers introductory industrial statistics, exploring the most commonly used techniques alongside those that serve to give an overview of more complex issues. A plethora of examples in MINITAB are featured along with case studies for each of the statistical techniques presented. Industrial Statistics with MINITAB: Provides comprehensive coverage of user-friendly practical guidance to the essential statistical methods applied in industry. Explores

  19. ASYMPTOTIC COMPARISONS OF U-STATISTICS, V-STATISTICS AND LIMITS OF BAYES ESTIMATES BY DEFICIENCIES

    OpenAIRE

    Toshifumi, Nomachi; Hajime, Yamato; Graduate School of Science and Engineering, Kagoshima University:Miyakonojo College of Technology; Faculty of Science, Kagoshima University

    2001-01-01

    As estimators of estimable parameters, we consider three statistics which are U-statistic, V-statistic and limit of Bayes estimate. This limit of Bayes estimate, called LB-statistic in this paper, is obtained from Bayes estimate of estimable parameter based on Dirichlet process, by letting its parameter tend to zero. For the estimable parameter with non-degenerate kernel, the asymptotic relative efficiencies of LB-statistic with respect to U-statistic and V-statistic and that of V-statistic w...

  20. CONFIDENCE LEVELS AND/VS. STATISTICAL HYPOTHESIS TESTING IN STATISTICAL ANALYSIS. CASE STUDY

    Directory of Open Access Journals (Sweden)

    ILEANA BRUDIU

    2009-05-01

    Full Text Available Parameters estimated with confidence intervals and tests of statistical hypotheses are used in statistical analysis to draw conclusions about a population from an extracted sample. The case study presented in this paper aims to highlight the importance of the sample size used in a study and how it is reflected in the results obtained from confidence intervals and hypothesis testing. Whereas statistical hypothesis tests only give a "yes" or "no" answer to certain questions of statistical estimation, confidence intervals provide more information than a test statistic: they show the high degree of uncertainty arising from small samples and from findings that fall in the "marginally significant" or "almost significant" range (p very close to 0.05).
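The contrast the abstract draws (a test's yes/no answer versus the extra information in a confidence interval) can be sketched with a one-sample t procedure. The sample and hypothesised mean below are hypothetical, and the two-sided 5% critical value is hard-coded for df = 9:

```python
import math

def one_sample_t(data, mu0, t_crit):
    """Return (ci_low, ci_high, t) for a one-sample t procedure."""
    n = len(data)
    mean = sum(data) / n
    s = math.sqrt(sum((x - mean) ** 2 for x in data) / (n - 1))
    se = s / math.sqrt(n)
    return mean - t_crit * se, mean + t_crit * se, (mean - mu0) / se

# Hypothetical sample of n = 10; two-sided 5% critical value for df = 9 is 2.262.
sample = [2.1, 2.5, 1.9, 2.3, 2.8, 2.0, 2.4, 2.6, 1.8, 2.2]
lo, hi, t = one_sample_t(sample, mu0=2.0, t_crit=2.262)
# The test merely rejects (|t| > 2.262); the interval shows *how* marginal the
# rejection is: its lower end barely clears the hypothesised mean of 2.0.
print(f"95% CI = ({lo:.3f}, {hi:.3f}), t = {t:.2f}")  # → 95% CI = (2.031, 2.489), t = 2.57
```

The interval's lower limit sitting just above 2.0 conveys the "marginally significant" situation the abstract describes far better than the bare rejection does.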

  1. Identification of necessary and sufficient conditions for real non-negativeness of rational matrices

    International Nuclear Information System (INIS)

    Saeed, K.

    1982-12-01

    The necessary and sufficient conditions for real non-negativeness of rational matrices have been identified. A programmable algorithm is developed and is given with its computer flow chart. This algorithm can be used as a general solution to test the real non-negativeness of rational matrices. The computer program assures the feasibility of the suggested algorithm. (author)

  2. State Transportation Statistics 2014

    Science.gov (United States)

    2014-12-15

    The Bureau of Transportation Statistics (BTS) presents State Transportation Statistics 2014, a statistical profile of transportation in the 50 states and the District of Columbia. This is the 12th annual edition of State Transportation Statistics, a ...

  3. Analyzing the impact of price subsidy on rice self-sufficiency level in Malaysia: A preliminary finding

    Science.gov (United States)

    Rahim, Farah Hanim Abdul; Abidin, Norhaslinda Zainal; Hawari, Nurul Nazihah

    2017-11-01

    The Malaysian government has targeted for the rice industry to achieve 100% rice self-sufficiency, whereas Malaysia's rice self-sufficiency level (SSL) is currently at 65% to 75%. The government has therefore implemented several policies to increase rice production in Malaysia in order to meet the growing demand for rice. In this paper, the effect of price support on the rice production system in Malaysia is investigated. This study takes a system dynamics approach to the rice production system in Malaysia, in which the factors are interrelated and change dynamically through time. Scenario analysis was conducted with the system dynamics model by making changes to the price subsidy to see its effect on rice production and the rice SSL. The system dynamics model provides a framework for understanding the effect of price subsidy on the rice self-sufficiency level. The scenario analysis of the model shows that a 50% increase in the price subsidy leads to a substantial increase in demand as the rice price drops. Accordingly, local production increases by 15%. However, the SSL slightly decreases as local production is insufficient to meet the enlarged demand.
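The seemingly paradoxical scenario result (production up 15%, yet SSL slightly down) is just the ratio SSL = production/demand moving when subsidised prices lift demand faster than supply. A toy calculation with hypothetical baseline figures (the 70% starting SSL sits in the quoted 65-75% range; the 20% demand rise is an assumption, not a number from the paper):

```python
def ssl(production, demand):
    """Self-sufficiency level: percentage of demand met by local production."""
    return 100.0 * production / demand

# Hypothetical baseline: SSL = 70% (within the 65-75% range quoted).
prod0, dem0 = 70.0, 100.0
# After the 50% price-subsidy increase: local production rises 15%, but the
# cheaper rice lifts demand even more, say by an assumed 20%.
prod1, dem1 = prod0 * 1.15, dem0 * 1.20
print(ssl(prod0, dem0), ssl(prod1, dem1))  # SSL slips from 70.0 to ~67.1
```

Whenever demand grows faster than production, the SSL falls regardless of the production gain, which is the mechanism behind the scenario's outcome.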

  4. Scan Statistics

    CERN Document Server

    Glaz, Joseph

    2009-01-01

    Suitable for graduate students and researchers in applied probability and statistics, as well as for scientists in biology, computer science, pharmaceutical science and medicine, this title brings together a collection of chapters illustrating the depth and diversity of theory, methods and applications in the area of scan statistics.

  5. Forty Necessary and Sufficient Conditions for Regularity of Interval Matrices: A survey

    Czech Academy of Sciences Publication Activity Database

    Rohn, Jiří

    2009-01-01

    Roč. 18, - (2009), s. 500-512 E-ISSN 1081-3810 R&D Projects: GA ČR GA201/09/1957; GA ČR GC201/08/J020 Institutional research plan: CEZ:AV0Z10300504 Keywords : interval matrix * regularity * singularity * necessary and sufficient condition * algorithm Subject RIV: BA - General Mathematics Impact factor: 0.892, year: 2009 http://www.math.technion.ac.il/iic/ela/ela-articles/articles/vol18_pp500-512.pdf

  6. The Statistics of GPS

    National Research Council Canada - National Science Library

    Matsakis, Demetrios

    2007-01-01

    The Global Positioning System (GPS) is an extremely effective satellite-based system that broadcasts sufficient information for a user to determine time and position from any location on or near the Earth...

  7. Faculty Sufficiency and AACSB Accreditation Compliance within a Global University: A Mathematical Modeling Approach

    Science.gov (United States)

    Boronico, Jess; Murdy, Jim; Kong, Xinlu

    2014-01-01

    This manuscript proposes a mathematical model to address faculty sufficiency requirements towards assuring overall high quality management education at a global university. Constraining elements include full-time faculty coverage by discipline, location, and program, across multiple campus locations subject to stated service quality standards of…

  8. Statistics Clinic

    Science.gov (United States)

    Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James

    2014-01-01

    Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.

  9. Sufficiency of the Nuclear Fuel

    International Nuclear Information System (INIS)

    Pevec, D.; Knapp, V.; Matijevic, M.

    2008-01-01

    Estimation of the nuclear fuel sufficiency is required for rational decision making on long-term energy strategy. In the past an argument often invoked against nuclear energy was that uranium resources are inadequate. At present, when climate change associated with CO2 emission is a major concern, one novel strong argument for nuclear energy is that it can produce large amounts of energy without CO2 emission. Increased interest in nuclear energy is evident, and a new look into uranium resources is relevant. We examined three different scenarios of nuclear capacity growth. A low growth of 0.4 percent per year in nuclear capacity is assumed for the first scenario. A moderate growth of 1.5 percent per year in nuclear capacity, preserving the present share in total energy production, is assumed for the second scenario. We estimated draining-out time periods for conventional resources of uranium using the once-through fuel cycle for both scenarios. For the first and the second scenario we obtained draining-out time periods for conventional uranium resources of 154 years and 96 years, respectively. These results are, as expected, in agreement with usual evaluations. However, if nuclear energy is to make a major impact on CO2 emission it should contribute much more to total energy production than the present level of 6 percent. We therefore defined a third scenario which would increase the nuclear share in total energy production from 6 percent in year 2020 to 30 percent by year 2060, while the total world energy production would grow by 1.5 percent per year. We also looked into the uranium requirement for this scenario, determining the time window for introduction of uranium or thorium reprocessing and for better use of uranium than is the case in the once-through fuel cycle. The once-through cycle would in this scenario be sustainable up to about year 2060, provided most of the expected but undiscovered conventional uranium resources were turned
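The draining-out times for the first two scenarios follow from a simple cumulative-consumption loop. The resources-to-current-consumption ratio of 211 years used below is back-calculated so that both quoted figures come out; it is an illustrative assumption, not a number taken from the paper:

```python
def depletion_years(resource_ratio, growth):
    """Years until cumulative consumption exhausts a resource equal to
    `resource_ratio` years of current use, with use growing `growth` per year."""
    consumed, annual_use, years = 0.0, 1.0, 0
    while consumed < resource_ratio:
        consumed += annual_use
        annual_use *= 1.0 + growth
        years += 1
    return years

# Assumed: conventional uranium lasts ~211 years at constant current consumption.
print(depletion_years(211, 0.004))  # scenario 1: 0.4%/yr growth → 154 years
print(depletion_years(211, 0.015))  # scenario 2: 1.5%/yr growth → 96 years
```

Even modest growth compounds: raising growth from 0.4% to 1.5% per year cuts the depletion horizon from 154 to 96 years for the same resource base.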

  10. 100 statistical tests

    CERN Document Server

    Kanji, Gopal K

    2006-01-01

    This expanded and updated Third Edition of Gopal K. Kanji's best-selling resource on statistical tests covers all the most commonly used tests with information on how to calculate and interpret results with simple datasets. Each entry begins with a short summary statement about the test's purpose, and contains details of the test objective, the limitations (or assumptions) involved, a brief outline of the method, a worked example, and the numerical calculation. 100 Statistical Tests, Third Edition is the one indispensable guide for users of statistical materials and consumers of statistical information at all levels and across all disciplines.

  11. Early nutritional support and physiotherapy improved long-term self-sufficiency in acutely ill older patients.

    Science.gov (United States)

    Hegerová, Petra; Dědková, Zuzana; Sobotka, Luboš

    2015-01-01

    An acute disease is regularly associated with inflammation, decreased food intake, and low physical activity; the consequence is loss of muscle mass. However, the restoration of muscle tissue is problematic, especially in older patients. Loss of muscle mass leads to further decrease of physical activity which leads, together with recurring disease, to the progressive muscle mass loss accompanied by loss of self-sufficiency. Early nutrition support and physical activity could reverse this situation. Therefore, the aim of this study was to determine whether an active approach based on early nutritional therapy and exercise would influence the development of sarcopenia and impaired self-sufficiency during acute illness. Two hundred patients >78 y were admitted to a hospital internal medicine department and participated in a prospective, randomized controlled study. The patients were randomized to a control group receiving standard treatment (n = 100) or to an intervention group (n = 100). The intervention consisted of nutritional supplements (600 kcal, 20 g/d protein) added to a standard diet and a simultaneous intensive rehabilitation program. The tolerance of supplements and their influence on spontaneous food intake, self-sufficiency, muscle strength, and body composition were evaluated during the study period. The patients were then regularly monitored for 1 y post-discharge. The provision of nutritional supplements together with early rehabilitation led to increased total energy and protein intake while the intake of standard hospital food was not reduced. The loss of lean body mass and a decrease in self-sufficiency were apparent at discharge from the hospital and 3 mo thereafter in the control group. Nutritional supplementation and the rehabilitation program in the study group prevented these alterations. A positive effect of nutritional intervention and exercise during the hospital stay was apparent at 6 mo post-discharge. The early nutritional intervention

  12. Transport Statistics - Transport - UNECE

    Science.gov (United States)

  13. A knowledge-based T2-statistic to perform pathway analysis for quantitative proteomic data.

    Science.gov (United States)

    Lai, En-Yu; Chen, Yi-Hau; Wu, Kun-Pin

    2017-06-01

    Approaches to identify significant pathways from high-throughput quantitative data have been developed in recent years. Still, the analysis of proteomic data remains difficult because of limited sample size. This limitation also leads to the practice of using a competitive null as a common approach, which fundamentally treats genes or proteins as independent units. The independence assumption ignores the associations among biomolecules with similar functions or cellular localization, as well as the interactions among them manifested as changes in expression ratios. Consequently, these methods often underestimate the associations among biomolecules and cause false positives in practice. Some studies incorporate the sample covariance matrix into the calculation to address this issue. However, sample covariance may not be a precise estimate if the sample size is very limited, which is usually the case for data produced by mass spectrometry. In this study, we introduce a multivariate test under a self-contained null to perform pathway analysis for quantitative proteomic data. The covariance matrix used in the test statistic is constructed from the confidence scores retrieved from the STRING database or the HitPredict database. We also design an integrating procedure to retain pathways of sufficient evidence as a pathway group. The performance of the proposed T2-statistic is demonstrated using five published experimental datasets: the T-cell activation, the cAMP/PKA signaling, the myoblast differentiation, and the effect of dasatinib on the BCR-ABL pathway are proteomic datasets produced by mass spectrometry; and the protective effect of myocilin via the MAPK signaling pathway is a gene expression dataset of limited sample size. Compared with other popular statistics, the proposed T2-statistic yields more accurate descriptions in agreement with the discussion of the original publication. We implemented the T2-statistic into an R package T2GA, which is available at https
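The key move described above is replacing the sample covariance in a Hotelling-type T² with a covariance built from external knowledge. A minimal two-protein sketch (the log-ratios and the off-diagonal covariance entry standing in for an interaction confidence score are hypothetical, not values from STRING or the paper):

```python
def t2_statistic(xbar, mu0, cov, n):
    """Hotelling-type T2 = n * d' * inv(cov) * d for the 2-D case,
    with cov supplied from prior knowledge rather than estimated from data."""
    dx, dy = xbar[0] - mu0[0], xbar[1] - mu0[1]
    (a, b), (c, d) = cov
    det = a * d - b * c
    # Explicit 2x2 inverse: inv(cov) = [[d, -b], [-c, a]] / det
    return n * (d * dx * dx - (b + c) * dx * dy + a * dy * dy) / det

# Hypothetical pathway of two proteins: mean log-ratios tested against a null
# of no change; the 0.5 off-diagonal encodes an assumed interaction confidence.
t2 = t2_statistic(xbar=(0.3, 0.2), mu0=(0.0, 0.0),
                  cov=((1.0, 0.5), (0.5, 1.0)), n=10)
print(round(t2, 4))  # → 0.9333
```

Because the covariance comes from prior knowledge rather than the few available replicates, the statistic stays well defined even when the sample size is too small to estimate a covariance matrix reliably, which is the situation the abstract targets.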

  14. Requirements on qualification, competence and sufficient number of personnel for NPP operation

    International Nuclear Information System (INIS)

    Simon, M.

    2004-01-01

    The safe operation of NPPs presupposes qualified personnel on site in sufficient numbers. While the acquisition and preservation of technical expertise and the qualification of the shift personnel and other staff are well regulated by regulatory guidelines in Germany, there is a lack of such regulations - with the exception of shift personnel - for the minimum number of technical personnel required for the safe operation of an NPP. By order of the BMU, an attempt was made with this study to work out the requirements for qualification, competence and the number of personnel to be maintained at the plant, representing the minimum requirements for the safe operation of an NPP. The scope of the project was restricted to requirements for technical plant personnel. The aim was to work out requirements which would be as independent as possible of the existing organisation in a particular power plant. This study therefore does not assume a given organisational structure but is rather oriented on the work processes in an NPP which are the basis for planning and performing routine work in the plant. For the study, a work process model of typical tasks in an NPP had to be developed. Then, the tasks to be performed within the so-defined work processes were described (task profiles) on the basis of existing manuals for plant organisation. From these task profiles, such tasks were defined or selected which shall not be delegated to external personnel for specific reasons, and which were called vital competences. To keep these vital competences at the plant, an assessment and/or calculation of the necessary number of plant technical personnel was made using the task profiles for responsible personnel, but also by the evaluation of thousands of work orders for maintenance personnel. On the basis of these data, a proposal was made for the minimum number of technical personnel necessary to operate an NPP unit safely. Besides this number, general criteria were developed which should be

  15. Statistical Diversions

    Science.gov (United States)

    Petocz, Peter; Sowey, Eric

    2012-01-01

    The term "data snooping" refers to the practice of choosing which statistical analyses to apply to a set of data after having first looked at those data. Data snooping contradicts a fundamental precept of applied statistics, that the scheme of analysis is to be planned in advance. In this column, the authors shall elucidate the…

  16. Measuring what latent fingerprint examiners consider sufficient information for individualization determinations.

    Directory of Open Access Journals (Sweden)

    Bradford T Ulery

    Full Text Available Latent print examiners use their expertise to determine whether the information present in a comparison of two fingerprints (or palmprints) is sufficient to conclude that the prints were from the same source (individualization). When fingerprint evidence is presented in court, it is the examiner's determination--not an objective metric--that is presented. This study was designed to ascertain the factors that explain examiners' determinations of sufficiency for individualization. Volunteer latent print examiners (n = 170) were each assigned 22 pairs of latent and exemplar prints for examination, and annotated features, correspondence of features, and clarity. The 320 image pairs were selected specifically to control clarity and quantity of features. The predominant factor differentiating annotations associated with individualization and inconclusive determinations is the count of corresponding minutiae; other factors such as clarity provided minimal additional discriminative value. Examiners' counts of corresponding minutiae were strongly associated with their own determinations; however, due to substantial variation of both annotations and determinations among examiners, one examiner's annotation and determination on a given comparison is a relatively weak predictor of whether another examiner would individualize. The extensive variability in annotations also means that we must treat any individual examiner's minutia counts as interpretations of the (unknowable) information content of the prints: saying "the prints had N corresponding minutiae marked" is not the same as "the prints had N corresponding minutiae." More consistency in annotations, which could be achieved through standardization and training, should lead to process improvements and provide greater transparency in casework.

  17. The scientific way of thinking in statistics, statistical physics and quantum mechanics

    OpenAIRE

    Săvoiu, Gheorghe

    2008-01-01

    This paper focuses on the way of thinking in both classical and modern Physics and Statistics, Statistical Mechanics or Statistical Physics and Quantum Mechanics. These different statistical ways of thinking and their specific methods have generated new fields for new activities and new scientific disciplines, like Econophysics (between Economics and Physics), Sociophysics (between Sociology and Physics), Mediaphysics (between all media and communication sciences), etc. After describing some r...

  19. Statistical Diversions

    Science.gov (United States)

    Petocz, Peter; Sowey, Eric

    2008-01-01

    In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…

  20. Practical Statistics

    CERN Document Server

    Lyons, L.

    2016-01-01

    Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.

  1. Increased conjugated bilirubin is sufficient to initiate screening for biliary atresia

    DEFF Research Database (Denmark)

    Madsen, Stine Skipper; Kvist, Nina; Thorup, Jørgen

    2015-01-01

    . This percentage value has caused diagnostic trouble over the years. The objective of the present study was to investigate the possibility of changing the recommendations. METHODS: This was a retrospective analysis of the medical records of children operated for biliary atresia in the 1993-2012 period. RESULTS......: mean 129.7 μmol/l (42-334 μmol/l) and 73% (28-97%), respectively. CONCLUSION: The total amount of conjugated bilirubin above 20 μmol/l is sufficient to require further evaluation for biliary atresia. The percentage value is unnecessary and may cause confusion. FUNDING: none. TRIAL REGISTRATION...

  2. Statistical Reasoning Ability, Self-Efficacy, and Value Beliefs in a University Statistics Course

    Science.gov (United States)

    Olani, A.; Hoekstra, R.; Harskamp, E.; van der Werf, G.

    2011-01-01

    Introduction: The study investigated the degree to which students' statistical reasoning abilities, statistics self-efficacy, and perceived value of statistics improved during a reform based introductory statistics course. The study also examined whether the changes in these learning outcomes differed with respect to the students' mathematical…

  3. Lectures on algebraic statistics

    CERN Document Server

    Drton, Mathias; Sullivant, Seth

    2009-01-01

    How does an algebraic geometer studying secant varieties further the understanding of hypothesis tests in statistics? Why would a statistician working on factor analysis raise open problems about determinantal varieties? Connections of this type are at the heart of the new field of "algebraic statistics". In this field, mathematicians and statisticians come together to solve statistical inference problems using concepts from algebraic geometry as well as related computational and combinatorial techniques. The goal of these lectures is to introduce newcomers from the different camps to algebraic statistics. The introduction will be centered around the following three observations: many important statistical models correspond to algebraic or semi-algebraic sets of parameters; the geometry of these parameter spaces determines the behaviour of widely used statistical inference procedures; computational algebraic geometry can be used to study parameter spaces and other features of statistical models.

  4. The dynamic simulation model of soybean in Central Java to support food self sufficiency: A supply chain perspective

    Science.gov (United States)

    Oktyajati, Nancy; Hisjam, Muh.; Sutopo, Wahyudi

    2018-02-01

    Since food is one of the basic human needs, food sufficiency is very important. Food sufficiency in the soybean commodity in Central Java still depends on imported soybean, because there is a large gap between local soybean production and its demand. In the year 2016 the shortage of soybean supply was as much as 68.79%. Soybean is an important and strategic commodity after rice and corn. The increasing consumption of soybean is related to the increasing population, increasing incomes, and the changing healthy lifestyle. The aims of this study are to determine a dynamic model of soybean from a supply chain perspective, define the proper price of local soybean to trigger an increase in local production, and define alternative solutions to support food self-sufficiency. This study captures the real conditions in a dynamics model and then simulates a series of scenarios in a computer program to obtain the best results, conducted as follows: a first scenario with a government intervention policy and a second without a government intervention policy. The best of the alternatives can be used as a consideration for governmental policy. The results of the proposed scenarios show that self-sufficiency in soybean can be achieved after the next 20 years by increasing the planting area by 4% and land productivity by 1% per year.
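The 20-year claim can be checked with a minimal compounding loop: production grows with planting area (4%/yr) and land productivity (1%/yr) while demand is held flat. The starting SSL of 31.21% mirrors the quoted 68.79% supply shortage; freezing demand is an illustrative simplification that the full system dynamics model does not make:

```python
def years_to_self_sufficiency(ssl0, area_growth, yield_growth):
    """Years until the self-sufficiency level reaches 100%, with production
    compounding via planting area and productivity and demand held constant."""
    ssl, years = ssl0, 0
    while ssl < 100.0:
        ssl *= (1 + area_growth) * (1 + yield_growth)
        years += 1
    return years

# Starting SSL mirrors the quoted 68.79% shortage (100 - 68.79 = 31.21%).
print(years_to_self_sufficiency(31.21, 0.04, 0.01))  # → 24
```

Under these simplified assumptions self-sufficiency arrives in about year 24, consistent with the abstract's "after the next 20 years"; growing demand would push the horizon further out.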

  5. Computational Performance Optimisation for Statistical Analysis of the Effect of Nano-CMOS Variability on Integrated Circuits

    Directory of Open Access Journals (Sweden)

    Zheng Xie

    2013-01-01

    Full Text Available The intrinsic variability of nanoscale VLSI technology must be taken into account when analyzing circuit designs to predict likely yield. Monte Carlo (MC)- and quasi-MC (QMC)-based statistical techniques do this by analysing many randomised or quasirandomised copies of circuits. The randomisation must model forms of variability that occur in nano-CMOS technology, including “atomistic” effects without intradie correlation and effects with intradie correlation between neighbouring devices. A major problem is the computational cost of carrying out sufficient analyses to produce statistically reliable results. The use of principal components analysis, behavioural modeling, and an implementation of “Statistical Blockade” (SB) is shown to be capable of achieving significant reduction in the computational costs. A computation time reduction of 98.7% was achieved for a commonly used asynchronous circuit element. Replacing MC by QMC analysis can achieve further computation reduction, and this is illustrated for more complex circuits, with the results being compared with those of transistor-level simulations. The “yield prediction” analysis of SRAM arrays is taken as a case study, where the arrays contain up to 1536 transistors modelled using parameters appropriate to 35 nm technology. It is reported that savings of up to 99.85% in computation time were obtained.
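The Monte Carlo step that the paper accelerates is simple to state: sample randomised device parameters many times and count the fraction of circuit copies meeting spec. A toy stand-in for a circuit (a single Gaussian delay, not a nano-CMOS transistor model, with made-up nominal, sigma, and spec values):

```python
import random

def mc_yield(n_samples, nominal, sigma, spec, seed=1):
    """Fraction of randomised circuit copies whose delay meets the spec."""
    rng = random.Random(seed)
    passed = sum(1 for _ in range(n_samples)
                 if rng.gauss(nominal, sigma) < spec)
    return passed / n_samples

# Delay ~ N(10 ns, 0.5 ns), spec 11 ns: the analytic yield is Phi(2) ≈ 0.977.
y = mc_yield(100_000, nominal=10.0, sigma=0.5, spec=11.0)
print(y)  # close to 0.977; resolving rarer failures needs far more samples
```

The cost problem the paper attacks is visible here: estimating a failure probability of 1 in 10^4 to useful relative accuracy requires millions of such samples, which motivates variance-reduction schemes like Statistical Blockade and QMC sampling.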

  6. Intuitive introductory statistics

    CERN Document Server

    Wolfe, Douglas A

    2017-01-01

    This textbook is designed to give an engaging introduction to statistics and the art of data analysis. The unique scope includes, but also goes beyond, classical methodology associated with the normal distribution. What if the normal model is not valid for a particular data set? This cutting-edge approach provides the alternatives. It is an introduction to the world and possibilities of statistics that uses exercises, computer analyses, and simulations throughout the core lessons. These elementary statistical methods are intuitive. Counting and ranking features prominently in the text. Nonparametric methods, for instance, are often based on counts and ranks and are very easy to integrate into an introductory course. The ease of computation with advanced calculators and statistical software, both of which factor into this text, allows important techniques to be introduced earlier in the study of statistics. This book's novel scope also includes measuring symmetry with Walsh averages, finding a nonp...

  7. Business statistics for dummies

    CERN Document Server

    Anderson, Alan

    2013-01-01

    Score higher in your business statistics course? Easy. Business statistics is a common course for business majors and MBA candidates. It examines common data sets and the proper way to use such information when conducting research and producing informational reports such as profit and loss statements, customer satisfaction surveys, and peer comparisons. Business Statistics For Dummies tracks to a typical business statistics course offered at the undergraduate and graduate levels and provides clear, practical explanations of business statistical ideas, techniques, formulas, and calculations, w

  8. Introduction to Statistics

    Directory of Open Access Journals (Sweden)

    Mirjam Nielen

    2017-01-01

    Full Text Available Always wondered why research papers often present rather complicated statistical analyses? Or wondered how to properly analyse the results of a pragmatic trial from your own practice? This talk will give an overview of basic statistical principles and focus on the why of statistics, rather than on the how. This is a podcast of Mirjam's talk at the Veterinary Evidence Today conference, Edinburgh November 2, 2016.

  9. Vector field statistical analysis of kinematic and force trajectories.

    Science.gov (United States)

    Pataky, Todd C; Robinson, Mark A; Vanrenterghem, Jos

    2013-09-27

    When investigating the dynamics of three-dimensional multi-body biomechanical systems it is often difficult to derive spatiotemporally directed predictions regarding experimentally induced effects. A paradigm of 'non-directed' hypothesis testing has emerged in the literature as a result. Non-directed analyses typically consist of ad hoc scalar extraction, an approach which substantially simplifies the original, highly multivariate datasets (many time points, many vector components). This paper describes a commensurately multivariate method as an alternative to scalar extraction. The method, called 'statistical parametric mapping' (SPM), uses random field theory to objectively identify field regions which co-vary significantly with the experimental design. We compared SPM to scalar extraction by re-analyzing three publicly available datasets: 3D knee kinematics, a ten-muscle force system, and 3D ground reaction forces. Scalar extraction was found to bias the analyses of all three datasets by failing to consider sufficient portions of the dataset, and/or by failing to consider covariance amongst vector components. SPM overcame both problems by conducting hypothesis testing at the (massively multivariate) vector trajectory level, with random field corrections simultaneously accounting for temporal correlation and vector covariance. While SPM has been widely demonstrated to be effective for analyzing 3D scalar fields, the current results are the first to demonstrate its effectiveness for 1D vector field analysis. It was concluded that SPM offers a generalized, statistically comprehensive solution to scalar extraction's over-simplification of vector trajectories, thereby making it useful for objectively guiding analyses of complex biomechanical systems. © 2013 Published by Elsevier Ltd. All rights reserved.

  10. A necessary and sufficient condition for a real quadratic extension to have class number one

    International Nuclear Information System (INIS)

    Alemu, Y.

    1990-02-01

    We give a necessary and sufficient condition for a real quadratic extension to have class number one and discuss the applicability of the result to find the class number one fields with small discriminant. 9 refs, 3 tabs

  11. Perceived Statistical Knowledge Level and Self-Reported Statistical Practice Among Academic Psychologists

    Directory of Open Access Journals (Sweden)

    Laura Badenes-Ribera

    2018-06-01

    Introduction: Publications arguing against the null hypothesis significance testing (NHST) procedure and in favor of good statistical practices have increased. The most frequently mentioned alternatives to NHST are effect size statistics (ES), confidence intervals (CIs), and meta-analyses. A recent survey conducted in Spain found that academic psychologists have poor knowledge about effect size statistics, confidence intervals, and graphic displays for meta-analyses, which might lead to a misinterpretation of the results. In addition, it also found that, although the use of ES is becoming generalized, the same thing is not true for CIs. Finally, academics with greater knowledge about ES statistics presented a profile closer to good statistical practice and research design. Our main purpose was to analyze the extension of these results to a different geographical area through a replication study. Methods: For this purpose, we elaborated an on-line survey that included the same items as the original research, and we asked academic psychologists to indicate their level of knowledge about ES, CIs, and meta-analyses, and how they use them. The sample consisted of 159 Italian academic psychologists (54.09% women, mean age of 47.65 years). The mean number of years in the position of professor was 12.90 (SD = 10.21). Results: As in the original research, the results showed that, although the use of effect size estimates is becoming generalized, an under-reporting of CIs for ES persists. The most frequent ES statistics mentioned were Cohen's d and R2/η2, which can have outliers or show non-normality or violate statistical assumptions. In addition, academics showed poor knowledge about meta-analytic displays (e.g., forest plot and funnel plot) and quality checklists for studies. Finally, academics with higher-level knowledge about ES statistics seem to have a profile closer to good statistical practices. Conclusions: Changing statistical practice is not

  12. Statistical Physics of Neural Systems with Nonadditive Dendritic Coupling

    Directory of Open Access Journals (Sweden)

    David Breuer

    2014-03-01

    How neurons process their inputs crucially determines the dynamics of biological and artificial neural networks. In such neural and neural-like systems, synaptic input is typically considered to be merely transmitted linearly or sublinearly by the dendritic compartments. Yet, single-neuron experiments report pronounced supralinear dendritic summation of sufficiently synchronous and spatially close-by inputs. Here, we provide a statistical physics approach to study the impact of such nonadditive dendritic processing on single-neuron responses and the performance of associative-memory tasks in artificial neural networks. First, we compute the effect of random input to a neuron incorporating nonlinear dendrites. This approach is independent of the details of the neuronal dynamics. Second, we use those results to study the impact of dendritic nonlinearities on the network dynamics in a paradigmatic model for associative memory, both numerically and analytically. We find that dendritic nonlinearities maintain network convergence and increase the robustness of memory performance against noise. Interestingly, an intermediate number of dendritic branches is optimal for memory functionality.

  13. Statistics in a nutshell

    CERN Document Server

    Boslaugh, Sarah

    2013-01-01

    Need to learn statistics for your job? Want help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference for anyone new to the subject. Thoroughly revised and expanded, this edition helps you gain a solid understanding of statistics without the numbing complexity of many college texts. Each chapter presents easy-to-follow descriptions, along with graphics, formulas, solved examples, and hands-on exercises. If you want to perform common statistical analyses and learn a wide range of techniques without getting in over your head, this is your book.

  14. Statistics & probability for dummies

    CERN Document Server

    Rumsey, Deborah J

    2013-01-01

    Two complete eBooks for one low price! Created and compiled by the publisher, this Statistics I and Statistics II bundle brings together two math titles in one, e-only bundle. With this special bundle, you'll get the complete text of the following two titles: Statistics For Dummies, 2nd Edition  Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tra

  15. The power and statistical behaviour of allele-sharing statistics when ...

    Indian Academy of Sciences (India)

    , using seven statistics, of which five are implemented in the computer program SimWalk2, and two are implemented in GENEHUNTER. Unlike most previous reports which involve evaluations of the power of allele-sharing statistics for a single ...

  16. The Parallel C++ Statistical Library ‘QUESO’: Quantification of Uncertainty for Estimation, Simulation and Optimization

    KAUST Repository

    Prudencio, Ernesto E.

    2012-01-01

    QUESO is a collection of statistical algorithms and programming constructs supporting research into the uncertainty quantification (UQ) of models and their predictions. It has been designed with three objectives: it should (a) be sufficiently abstract in order to handle a large spectrum of models, (b) be algorithmically extensible, allowing an easy insertion of new and improved algorithms, and (c) take advantage of parallel computing, in order to handle realistic models. Such objectives demand a combination of an object-oriented design with robust software engineering practices. QUESO is written in C++, uses MPI, and leverages libraries already available to the scientific community. We describe some UQ concepts, present QUESO, and list planned enhancements.

  17. Statistical Physics An Introduction

    CERN Document Server

    Yoshioka, Daijiro

    2007-01-01

    This book provides a comprehensive presentation of the basics of statistical physics. The first part explains the essence of statistical physics and how it provides a bridge between microscopic and macroscopic phenomena, allowing one to derive quantities such as entropy. Here the author avoids going into details such as Liouville’s theorem or the ergodic theorem, which are difficult for beginners and unnecessary for the actual application of statistical mechanics. In the second part, statistical mechanics is applied to various systems which, although they look different, share the same mathematical structure. In this way readers can deepen their understanding of statistical physics. The book also features applications to quantum dynamics, thermodynamics, the Ising model and the statistical dynamics of free spins.

  18. SUSTAINING PADDY SELF-SUFFICIENCY AND LAND DEMANDS IN SABAH, MALAYSIA: A STRUCTURAL PADDY AND RICE ECONOMETRIC MODEL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Wong Kelly_Kai_Seng

    2017-01-01

    The objective of this study is to construct an econometric commodity model to forecast the long-term rice production performance of the state of Sabah, Malaysia. The baseline projection shows that Sabah's rice self-sufficiency is estimated to reach approximately 38% in the next 10 years, owing to the scarcity of suitable land allocated for paddy cultivation. To achieve the targeted 60% rice self-sufficiency level (SSL), the area of land under paddy cultivation in Sabah must be increased. Based on the scenario simulation projection, the expansion of the paddy cultivation area will contribute positively to industrial rice production and consequently achieve the expected 60% SSL by the end of 2024. In a nutshell, the state government of Sabah possesses autonomy over land management and thus plays a key role in promoting the local rice self-sufficiency level over the long term.

  19. The foundations of statistics

    CERN Document Server

    Savage, Leonard J

    1972-01-01

    Classic analysis of the foundations of statistics and development of personal probability, one of the greatest controversies in modern statistical thought. Revised edition. Calculus, probability, statistics, and Boolean algebra are recommended.

  20. Self-sufficiency of an autonomous reconfigurable modular robotic organism

    CERN Document Server

    Qadir, Raja Humza

    2015-01-01

    This book describes how the principle of self-sufficiency can be applied to a reconfigurable modular robotic organism. It shows the design considerations for a novel REPLICATOR robotic platform, both hardware and software, featuring the behavioral characteristics of social insect colonies. Following a comprehensive overview of some of the bio-inspired techniques already available, and of the state-of-the-art in re-configurable modular robotic systems, the book presents a novel power management system with fault-tolerant energy sharing, as well as its implementation in the REPLICATOR robotic modules. In addition, the book discusses, for the first time, the concept of “artificial energy homeostasis” in the context of a modular robotic organism, and shows its verification on a custom-designed simulation framework in different dynamic power distribution and fault tolerance scenarios. This book offers an ideal reference guide for both hardware engineers and software developers involved in the design and implem...

  1. Are emotions necessary and sufficient for making moral judgments?

    Directory of Open Access Journals (Sweden)

    Marco Aurélio Sousa Alves

    2013-07-01

    http://dx.doi.org/10.5007/1677-2954.2013v12n1p113 Jesse Prinz (2006, 2007) claimed that emotions are necessary and sufficient for moral judgments. First of all, I clarify what this claim amounts to. The view that he labels emotionism will then be critically assessed. Prinz marshals empirical findings to defend a series of increasingly strong theses about how emotions are essential for moral judgments. I argue that the empirical support upon which his arguments are based is not only insufficient, but even suggests otherwise, if properly interpreted. My criticism is then extended to his sentimentalist theory, which accounts for how emotions are integrated into moral judgments. The central problem is that Prinz’s view fails to capture the rational aspect of moral evaluation. I make this failure explicit and defend that some version of neosentimentalism is a more promising route.

  2. State Transportation Statistics 2012

    Science.gov (United States)

    2013-08-15

    The Bureau of Transportation Statistics (BTS), a part of the U.S. Department of Transportation's (USDOT) Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2012, a statistical profile of transportation ...

  3. State Transportation Statistics 2013

    Science.gov (United States)

    2014-09-19

    The Bureau of Transportation Statistics (BTS), a part of the U.S. Department of Transportation's (USDOT) Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2013, a statistical profile of transportatio...

  4. BTS statistical standards manual

    Science.gov (United States)

    2005-10-01

    The Bureau of Transportation Statistics (BTS), like other federal statistical agencies, establishes professional standards to guide the methods and procedures for the collection, processing, storage, and presentation of statistical data. Standards an...

  5. Guess LOD approach: sufficient conditions for robustness.

    Science.gov (United States)

    Williamson, J A; Amos, C I

    1995-01-01

    Analysis of genetic linkage between a disease and a marker locus requires specifying a genetic model describing both the inheritance pattern and the gene frequencies of the marker and trait loci. Misspecification of the genetic model is likely for etiologically complex diseases. In previous work we have shown through analytic studies that misspecifying the genetic model for disease inheritance does not lead to excess false-positive evidence for genetic linkage provided the genetic marker alleles of all pedigree members are known, or can be inferred without bias from the data. Here, under various selection or ascertainment schemes we extend these previous results to situations in which the genetic model for the marker locus may be incorrect. We provide sufficient conditions for the asymptotic unbiased estimation of the recombination fraction under the null hypothesis of no linkage, and also conditions for the limiting distribution of the likelihood ratio test for no linkage to be chi-squared. Through simulation studies we document some situations under which asymptotic bias can result when the genetic model is misspecified. Among those situations under which an excess of false-positive evidence for genetic linkage can be generated, the most common is failure to provide accurate estimates of the marker allele frequencies. We show that in most cases false-positive evidence for genetic linkage is unlikely to result solely from the misspecification of the genetic model for disease or trait inheritance.

  6. State transportation statistics 2009

    Science.gov (United States)

    2009-01-01

    The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2009, a statistical profile of transportation in the 50 states and the District ...

  7. Adrenal Gland Tumors: Statistics

    Science.gov (United States)

    Adrenal Gland Tumor: Statistics. Approved by the Cancer.Net Editorial Board. A primary adrenal gland tumor is very uncommon; exact statistics are not available for this type of tumor.

  8. Statistics on Lie groups: A need to go beyond the pseudo-Riemannian framework

    Science.gov (United States)

    Miolane, Nina; Pennec, Xavier

    2015-01-01

    Lie groups appear in many fields, from Medical Imaging to Robotics. In Medical Imaging, and particularly in Computational Anatomy, an organ's shape is often modeled as the deformation of a reference shape, in other words: as an element of a Lie group. In this framework, if one wants to model the variability of the human anatomy, e.g. to aid the diagnosis of diseases, one needs to perform statistics on Lie groups. A Lie group G is a manifold that carries an additional group structure. Statistics on Riemannian manifolds have been well studied since the pioneering work of Fréchet, Karcher and Kendall [1, 2, 3, 4], followed by others [5, 6, 7, 8, 9]. In order to use such a Riemannian structure for statistics on Lie groups, one needs to define a Riemannian metric that is compatible with the group structure, i.e. a bi-invariant metric. However, it is well known that general Lie groups which cannot be decomposed into the direct product of compact and abelian groups do not admit a bi-invariant metric. One may wonder if removing the positivity of the metric, thus asking only for a bi-invariant pseudo-Riemannian metric, would be sufficient for most of the groups used in Computational Anatomy. In this paper, we provide an algorithmic procedure that constructs bi-invariant pseudo-metrics on a given Lie group G. The procedure relies on a classification theorem of Medina and Revoy. However, in doing so, we prove that most Lie groups do not admit any bi-invariant (pseudo-)metric. We conclude that the (pseudo-)Riemannian setting is not the richest setting if one wants to perform statistics on Lie groups, and one may have to rely on another framework, such as affine connection spaces.
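    The compatibility requirement discussed in this abstract has a standard infinitesimal characterization (see e.g. Milnor's work on left-invariant metrics): an inner product ⟨·,·⟩ on the Lie algebra extends to a bi-invariant (pseudo-)metric on the group precisely when it is invariant under the adjoint action, i.e.

```latex
\langle [x, y],\, z \rangle + \langle y,\, [x, z] \rangle = 0
\qquad \text{for all } x, y, z \in \mathfrak{g}.
```

    For compact or abelian groups such an invariant positive-definite form always exists, which is why the obstruction described above only appears for groups that are not direct products of these two types.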

  9. Equilibrium statistical mechanics

    CERN Document Server

    Jackson, E Atlee

    2000-01-01

    Ideal as an elementary introduction to equilibrium statistical mechanics, this volume covers both classical and quantum methodology for open and closed systems. Introductory chapters familiarize readers with probability and microscopic models of systems, while additional chapters describe the general derivation of the fundamental statistical mechanics relationships. The final chapter contains 16 sections, each dealing with a different application, ordered according to complexity, from classical through degenerate quantum statistical mechanics. Key features include an elementary introduction t

  10. CMS Program Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — The CMS Office of Enterprise Data and Analytics has developed CMS Program Statistics, which includes detailed summary statistics on national health care, Medicare...

  11. State Transportation Statistics 2010

    Science.gov (United States)

    2011-09-14

    The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2010, a statistical profile of transportation in the 50 states and the District of Col...

  12. State Transportation Statistics 2011

    Science.gov (United States)

    2012-08-08

    The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2011, a statistical profile of transportation in the 50 states and the District of Col...

  13. Wind energy statistics

    International Nuclear Information System (INIS)

    Holttinen, H.; Tammelin, B.; Hyvoenen, R.

    1997-01-01

    The recording, analysis and publishing of wind energy production statistics has been reorganized in cooperation between VTT Energy, the Finnish Meteorological Institute (FMI Energy) and the Finnish Wind Energy Association (STY), supported by the Ministry of Trade and Industry (KTM). VTT Energy has developed a database that contains both monthly data and information on the wind turbines, sites and operators involved. The monthly production figures, together with component failure statistics, are collected from the operators by VTT Energy, who produces the final wind energy statistics to be published in Tuulensilmae and reported to energy statistics in Finland and abroad (Statistics Finland, Eurostat, IEA). To verify the annual and monthly wind energy production against the average wind climate, a production index is adopted. The index gives the expected wind energy production in various areas of Finland, calculated using real wind speed observations, air density and a power curve for a typical 500 kW wind turbine. FMI Energy has produced the average figures for four weather stations using data from 1985-1996, and produces the monthly figures. (orig.)
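    The production index described in this record can be sketched as follows. This is a minimal illustration only: the piecewise power curve (`simple_power_curve`, with hypothetical cut-in, rated and cut-out speeds) is an assumption, not the actual curve or air-density correction used by FMI Energy.

```python
def simple_power_curve(v, cut_in=3.0, rated_v=13.0, cut_out=25.0, rated_kw=500.0):
    """Illustrative turbine power curve (kW) for wind speed v (m/s)."""
    if v < cut_in or v >= cut_out:
        return 0.0                 # below cut-in or above cut-out: no output
    if v >= rated_v:
        return rated_kw            # at or above rated speed: rated output
    # cubic ramp between cut-in and rated speed (power scales roughly with v^3)
    return rated_kw * ((v - cut_in) / (rated_v - cut_in)) ** 3

def production_index(wind_speeds, power_curve, rated_kw=500.0):
    """Expected production as a percentage of rated output (capacity factor)."""
    mean_kw = sum(power_curve(v) for v in wind_speeds) / len(wind_speeds)
    return 100.0 * mean_kw / rated_kw
```

    Feeding a year of hourly wind speed observations from a weather station through such a curve yields an index that tells operators whether a month was windier or calmer than the long-term average.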

  14. Is there sufficient evidence regarding signage-based stair use interventions? A sequential meta-analysis.

    Science.gov (United States)

    Bauman, Adrian; Milton, Karen; Kariuki, Maina; Fedel, Karla; Lewicka, Mary

    2017-11-28

    The proliferation of studies using motivational signs to promote stair use continues unabated, with their oft-cited potential for increasing population-level physical activity participation. This study examined all stair use promotional signage studies since 1980, calculating pre-estimates and post-estimates of stair use. The aim of this project was to conduct a sequential meta-analysis to pool intervention effects, in order to determine when the evidence base was sufficient for population-wide dissemination. Using comparable data from 50 stair-promoting studies (57 unique estimates) we pooled data to assess the effect sizes of such interventions. At baseline, median stair usage across interventions was 8.1%, with an absolute median increase of 2.2% in stair use following signage-based interventions. The overall pooled OR indicated that participants were 52% more likely to use stairs after exposure to promotional signs (adjusted OR 1.52, 95% CI 1.37 to 1.70). Incremental (sequential) meta-analyses using z-score methods identified that sufficient evidence for stair use interventions has existed since 2006, with recent studies providing no further evidence on the effect sizes of such interventions. This analysis has important policy and practice implications. Researchers continue to publish stair use interventions without connection to policymakers' needs, and few stair use interventions are implemented at a population level. Researchers should move away from repeating short-term, small-scale, stair sign interventions, to investigating their scalability, adoption and fidelity. Only such research translation efforts will provide sufficient evidence of external validity to inform their scaling up to influence population physical activity. Published by the BMJ Publishing Group Limited.
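    The pooling step this abstract describes can be illustrated with a fixed-effect inverse-variance meta-analysis on the log odds ratio scale. The function and the 2×2 counts below are illustrative assumptions, not the study's actual data or model (the paper reports an adjusted OR and may have used random-effects weighting):

```python
import math

def pooled_or(studies):
    """Fixed-effect (inverse-variance) pooled odds ratio.

    studies: list of 2x2 counts (a, b, c, d), e.g.
    (stair users post, non-users post, stair users pre, non-users pre).
    """
    num = den = 0.0
    for a, b, c, d in studies:
        log_or = math.log((a * d) / (b * c))   # log odds ratio of one study
        var = 1.0/a + 1.0/b + 1.0/c + 1.0/d    # Woolf variance of the log OR
        w = 1.0 / var                          # inverse-variance weight
        num += w * log_or
        den += w
    return math.exp(num / den)                 # back-transform to the OR scale
```

    A sequential (cumulative) meta-analysis recomputes this pooled estimate, and its z-score, each time a newly published study is appended, and declares the evidence sufficient once the boundary for statistical significance is crossed and stays crossed, which is how the authors date sufficiency to 2006.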

  15. The effect of changes to question order on the prevalence of 'sufficient' physical activity in an Australian population survey.

    Science.gov (United States)

    Hanley, Christine; Duncan, Mitch J; Mummery, W Kerry

    2013-03-01

    Population surveys are frequently used to assess prevalence, correlates and health benefits of physical activity. However, non-sampling errors, such as question order effects, may lead to imprecision in self-reported physical activity. This study examined the impact of modified question order in a commonly used physical activity questionnaire on the prevalence of sufficient physical activity. Data were obtained from a telephone survey of adults living in Queensland, Australia. A total of 1243 adults participated in the computer-assisted telephone interview (CATI) survey conducted in July 2008, which included the Active Australia Questionnaire (AAQ) presented in traditional or modified order. Binary logistic regression analysis was used to examine relationships between question order and physical activity outcomes. Significant relationships were found between question order and sufficient activity, recreational walking, moderate activity, vigorous activity, and total activity. Respondents who received the AAQ in modified order were more likely to be categorized as sufficiently active (OR = 1.28, 95% CI 1.01-1.60). This study highlights the importance of question order for estimates of self-reported physical activity, showing that changes in question order can increase the proportion of participants classified as sufficiently active.

  16. Necessary and sufficient conditions for Hopf bifurcation in tri-neuron equation with a delay

    International Nuclear Information System (INIS)

    Liu Xiaoming; Liao Xiaofeng

    2009-01-01

    In this paper, we consider the delayed differential equations modeling three-neuron equations with only a time delay. Using the time delay as a bifurcation parameter, necessary and sufficient conditions for Hopf bifurcation to occur are derived. Numerical results indicate that for this model, Hopf bifurcation is likely to occur at suitable delay parameter values.

  17. The autonomous house: a bio-hydrogen based energy self-sufficient approach.

    Science.gov (United States)

    Chen, Shang-Yuan; Chu, Chen-Yeon; Cheng, Ming-Jen; Lin, Chiu-Yue

    2009-04-01

    In the wake of the greenhouse effect and global energy crisis, finding sources of clean, alternative energy and developing everyday life applications have become urgent tasks. This study proposes the development of an "autonomous house" emphasizing the use of modern green energy technology to reduce environmental load, achieve energy autonomy and use energy intelligently in order to create a sustainable, comfortable living environment. The house's two attributes are: (1) a self-sufficient energy cycle and (2) autonomous energy control to maintain environmental comfort. The autonomous house thus combines energy-conserving, carbon emission-reducing passive design with active elements needed to maintain a comfortable environment.

  18. The Autonomous House: A Bio-Hydrogen Based Energy Self-Sufficient Approach

    Science.gov (United States)

    Chen, Shang-Yuan; Chu, Chen-Yeon; Cheng, Ming-jen; Lin, Chiu-Yue

    2009-01-01

    In the wake of the greenhouse effect and global energy crisis, finding sources of clean, alternative energy and developing everyday life applications have become urgent tasks. This study proposes the development of an “autonomous house” emphasizing the use of modern green energy technology to reduce environmental load, achieve energy autonomy and use energy intelligently in order to create a sustainable, comfortable living environment. The house’s two attributes are: (1) a self-sufficient energy cycle and (2) autonomous energy control to maintain environmental comfort. The autonomous house thus combines energy-conserving, carbon emission-reducing passive design with active elements needed to maintain a comfortable environment. PMID:19440531

  19. Statistical theory a concise introduction

    CERN Document Server

    Abramovich, Felix

    2013-01-01

    Introduction: Preamble; Likelihood; Sufficiency; Minimal sufficiency; Completeness; Exponential family of distributions. Point Estimation: Introduction; Maximum likelihood estimation; Method of moments; Method of least squares; Goodness-of-estimation; Mean squared error; Unbiased estimation. Confidence Intervals, Bounds, and Regions: Introduction; Quoting the estimation error; Confidence intervals; Confidence bounds; Confidence regions. Hypothesis Testing: Introduction; Simple hypotheses; Composite hypotheses; Hypothesis testing and confidence intervals; Sequential testing. Asymptotic Analysis: Introduction; Convergence and consistency in MSE; Convergence and consistency in probability; Convergence in distribution; The central limit theorem; Asymptotically normal consistency; Asymptotic confidence intervals; Asymptotic normality of the MLE; Multiparameter case; Asymptotic distribution of the GLRT; Wilks' theorem. Bayesian Inference: Introduction; Choice of priors; Point estimation; Interval estimation; Credible sets; Hypothesis testing. Elements of Statisti...
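    The sufficiency chapters listed in this table of contents rest on the classical Fisher–Neyman factorization criterion: a statistic $T$ is sufficient for a parameter $\theta$ exactly when the joint density factors as

```latex
f(x \mid \theta) = g\bigl(T(x) \mid \theta\bigr)\, h(x),
```

    where $g$ depends on the data only through $T(x)$ and $h$ does not involve $\theta$. For example, for a sample $X_1, \dots, X_n \sim \mathrm{Bernoulli}(\theta)$ the likelihood is $\theta^{\sum x_i}(1-\theta)^{n-\sum x_i}$, so $T(x) = \sum_i x_i$ is sufficient: the count of successes carries all the information about $\theta$.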

  20. The sufficient condition for an extremum in the classical action integral as an eigenvalue problem

    International Nuclear Information System (INIS)

    Hussein, M.S.; Pereira, J.G.

    The sufficient condition for an extremum in the classical action integral is studied using Morse's theory. Applications to the classical harmonic and anharmonic oscillators are made. The analogy of the calculations to the quantum mechanical problems in one dimension is stressed. (Author) [pt

  1. Scheduling of Crude Oil Operations in Refinery without Sufficient Charging Tanks Using Petri Nets

    Directory of Open Access Journals (Sweden)

    Yan An

    2017-05-01

    A short-term schedule for crude oil operations in a refinery should define and sequence the activities in detail. Each activity involves both discrete-event and continuous variables. The combinatorial nature of the scheduling problem makes it difficult to solve. For such a scheduling problem, charging tanks are a critical type of resource. If the number of charging tanks is not sufficient, the scheduling problem is further complicated. This work studies the scheduling problem of crude oil operations without sufficient charging tanks. In this case, for a refinery to remain operable, a charging tank has to charge and feed a distiller simultaneously for some time, called the simultaneously-charging-and-feeding (SCF) mode, leading to disturbance of the oil distillation in the distillers. A hybrid Petri net model is developed to describe the behavior of the system. Then, a computationally efficient scheduling method is proposed to find a schedule in which the SCF mode is minimally used. An industrial case study is given to demonstrate the obtained results.

  2. Equilibrium statistical mechanics

    CERN Document Server

    Mayer, J E

    1968-01-01

    The International Encyclopedia of Physical Chemistry and Chemical Physics, Volume 1: Equilibrium Statistical Mechanics covers the fundamental principles and the development of theoretical aspects of equilibrium statistical mechanics. Statistical mechanical is the study of the connection between the macroscopic behavior of bulk matter and the microscopic properties of its constituent atoms and molecules. This book contains eight chapters, and begins with a presentation of the master equation used for the calculation of the fundamental thermodynamic functions. The succeeding chapters highlight t

  3. Contributions to statistics

    CERN Document Server

    Mahalanobis, P C

    1965-01-01

    Contributions to Statistics focuses on the processes, methodologies, and approaches involved in statistics. The book is presented to Professor P. C. Mahalanobis on the occasion of his 70th birthday. The selection first offers information on the recovery of ancillary information and combinatorial properties of partially balanced designs and association schemes. Discussions focus on combinatorial applications of the algebra of association matrices, sample size analogy, association matrices and the algebra of association schemes, and conceptual statistical experiments. The book then examines latt

  4. Statistics at a glance.

    Science.gov (United States)

    Ector, Hugo

    2010-12-01

    I still remember my first book on statistics: "Elementary Statistics with Applications in Medicine and the Biological Sciences" by Frederick E. Croxton. For me, it was the start of a pursuit of understanding statistics in daily life and in medical practice. It was the first volume in a long row of books. In his introduction, Croxton claims that "nearly everyone involved in any aspect of medicine needs to have some knowledge of statistics". The reality is that for many clinicians, statistics are limited to a "P value", and they have never had the opportunity to learn concise and clear descriptions of the key features of statistical methods. I have experienced how some authors can describe difficult methods in a well understandable language. Others fail completely. As a teacher, I tell my students that life is impossible without a basic knowledge of statistics. This feeling has resulted in an annual seminar of 90 minutes. This tutorial is the summary of that seminar, and a transcription of the best pages I have detected.

  5. Mental Illness Statistics

    Science.gov (United States)


  6. Caregiver Statistics: Demographics

    Science.gov (United States)


  7. Aortic Aneurysm Statistics

    Science.gov (United States)


  8. The statistical stability phenomenon

    CERN Document Server

    Gorban, Igor I

    2017-01-01

    This monograph investigates violations of statistical stability of physical events, variables, and processes and develops a new physical-mathematical theory taking into consideration such violations – the theory of hyper-random phenomena. There are five parts. The first describes the phenomenon of statistical stability and its features, and develops methods for detecting violations of statistical stability, in particular when data is limited. The second part presents several examples of real processes of different physical nature and demonstrates the violation of statistical stability over broad observation intervals. The third part outlines the mathematical foundations of the theory of hyper-random phenomena, while the fourth develops the foundations of the mathematical analysis of divergent and many-valued functions. The fifth part contains theoretical and experimental studies of statistical laws where there is violation of statistical stability. The monograph should be of particular interest to engineers...

  9. [The concept of nutritional self-sufficiency and the demographic equilibrium of Rwanda].

    Science.gov (United States)

    Habimana Nyirasafari, G

    1987-12-01

    Achieving food self-sufficiency is the basic strategy of Rwanda's 4th 5-year plan covering 1987-91. The population growth rate has increased from 3% in 1970 to 3.7% in 1983, with the population doubling between 1964 and 1985. Food production grew by about 4%/year between 1966-83, creating a slight increase in per capita food availability, but the 2171 calories available per capita is dangerously close to the theoretical minimum requirement of 2100 per day. The theoretical protein requirement is almost covered, but there is a serious shortage of oils. The increase in production since 1966 has been due almost exclusively to the extension of cultivated land. But the land supply is limited, and future production increases will need to be based on increased yields per unit cultivated. The National Office of Population has developed a simulation model that analyzes the parallel evolution of population and production so as to identify demographic and development policies that will assure food self-sufficiency and an improvement in living conditions. The population subsystem subjects the population divided by age and sex to the effects of fertility, migration, and mortality. Births are the result of 36 different fertility rates applied to the population of women aged 14-49 years. The agricultural subsystem is tied to the population subsystem by comparison of the volume of population to that of production, by estimation of the proportion of the population living exclusively by subsistence agriculture, by calculation of the potential emigration resulting from overpopulation of the countryside, and by estimation of the links between nutritional level, mortality, and duration of breastfeeding. 5 annexes contain subsystems showing effects of demographic growth on education, employment, and health. The model has various limitations, including the reliability of its data, but it is sufficiently precise for its main function of clarifying the choices facing policymakers.
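
    The core race between population and production that the National Office of Population's model analyzes can be sketched with a one-line recurrence (a hypothetical illustration using the figures quoted above; the slowed 2%/year production growth once land extension is exhausted is my assumption, and the real model tracks 36 fertility rates and several subsystems):

    ```python
    def years_until_deficit(calories=2171.0, minimum=2100.0,
                            pop_growth=0.037, food_growth=0.02, max_years=200):
        """Years until per-capita calories fall below the minimum, or None.

        Each year, per-capita availability is scaled by the ratio of food
        production growth to population growth.
        """
        for year in range(1, max_years + 1):
            calories *= (1 + food_growth) / (1 + pop_growth)
            if calories < minimum:
                return year
        return None
    ```

    With the quoted 2171 calories, a 3.7% population growth rate, and production growth assumed to slow to 2%, the margin above the 2100-calorie minimum vanishes within a few years; if production keeps growing faster than population, no deficit occurs.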

  10. Youth Sports Safety Statistics

    Science.gov (United States)


  11. Applying contemporary statistical techniques

    CERN Document Server

    Wilcox, Rand R

    2003-01-01

    Applying Contemporary Statistical Techniques explains why traditional statistical methods are often inadequate or outdated when applied to modern problems. Wilcox demonstrates how new and more powerful techniques address these problems far more effectively, making these modern robust methods understandable, practical, and easily accessible.* Assumes no previous training in statistics * Explains how and why modern statistical methods provide more accurate results than conventional methods* Covers the latest developments on multiple comparisons * Includes recent advanc

  12. Interactive statistics with ILLMO

    NARCIS (Netherlands)

    Martens, J.B.O.S.

    2014-01-01

    Progress in empirical research relies on adequate statistical analysis and reporting. This article proposes an alternative approach to statistical modeling that is based on an old but mostly forgotten idea, namely Thurstone modeling. Traditional statistical methods assume that either the measured

  13. Comparison of Tsallis statistics with the Tsallis-factorized statistics in the ultrarelativistic pp collisions

    International Nuclear Information System (INIS)

    Parvan, A.S.

    2016-01-01

    The Tsallis statistics was applied to describe the experimental data on the transverse momentum distributions of hadrons. We considered the energy dependence of the parameters of the Tsallis-factorized statistics, which is now widely used for the description of the experimental transverse momentum distributions of hadrons, and the Tsallis statistics for the charged pions produced in pp collisions at high energies. We found that the results of the Tsallis-factorized statistics deviate from the results of the Tsallis statistics only at low NA61/SHINE energies when the value of the entropic parameter is close to unity. At higher energies, when the value of the entropic parameter deviates essentially from unity, the Tsallis-factorized statistics satisfactorily recovers the results of the Tsallis statistics. (orig.)
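
    The transverse-momentum distributions compared in such analyses are commonly written as a q-exponential in the transverse mass. The sketch below uses one common Tsallis-factorized convention with exponent -1/(q-1); other papers use -q/(q-1), or mT instead of mT - m, so treat the precise form as an assumption:

    ```python
    import math

    def tsallis_pt_density(pt, mass, T, q):
        """Unnormalized Tsallis-factorized transverse-momentum density
        (one common convention; normalization and exponent conventions vary)."""
        mt = math.sqrt(pt * pt + mass * mass)   # transverse mass
        return pt * (1.0 + (q - 1.0) * (mt - mass) / T) ** (-1.0 / (q - 1.0))
    ```

    As the entropic parameter q approaches 1, the density reduces to the Boltzmann form pt * exp(-(mT - m)/T), which is why the two statistics agree in that limit, as the abstract reports.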

  14. Application of pedagogy reflective in statistical methods course and practicum statistical methods

    Science.gov (United States)

    Julie, Hongki

    2017-08-01

    The subjects Elementary Statistics, Statistical Methods, and Statistical Methods Practicum aim to equip Mathematics Education students with descriptive and inferential statistics. Understanding descriptive and inferential statistics is important for students in the Mathematics Education Department, especially for those whose final projects involve quantitative research. In quantitative research, students are required to present and describe quantitative data in an appropriate manner, to draw conclusions from their quantitative data, and to establish relationships between the independent and dependent variables defined in their research. In fact, when students carried out final projects involving quantitative research, it was not rare to find them making mistakes in drawing conclusions and in choosing the hypothesis-testing procedure, so they reached incorrect conclusions. This is a fatal mistake for anyone doing quantitative research. The implementation of reflective pedagogy in the teaching and learning process of the Statistical Methods and Statistical Methods Practicum courses yielded the following: 1. Twenty-two students passed the course and one student did not. 2. The highest grade, an A, was achieved by 18 students. 3. According to all students, the course allowed them to develop a critical stance and to build care for each other through the learning process. 4. All students agreed that through the learning process they underwent in the course, they could build care for each other.

  15. Statistical Pattern Recognition

    CERN Document Server

    Webb, Andrew R

    2011-01-01

    Statistical pattern recognition relates to the use of statistical techniques for analysing data measurements in order to extract information and make justified decisions.  It is a very active area of study and research, which has seen many advances in recent years. Applications such as data mining, web searching, multimedia data retrieval, face recognition, and cursive handwriting recognition, all require robust and efficient pattern recognition techniques. This third edition provides an introduction to statistical pattern theory and techniques, with material drawn from a wide range of fields,

  16. Evolutionary Statistical Procedures

    CERN Document Server

    Baragona, Roberto; Poli, Irene

    2011-01-01

    This proposed text appears to be a good introduction to evolutionary computation for use in applied statistics research. The authors draw from a vast base of knowledge about the current literature in both the design of evolutionary algorithms and statistical techniques. Modern statistical research is on the threshold of solving increasingly complex problems in high dimensions, and the generalization of its methodology to parameters whose estimators do not follow mathematically simple distributions is underway. Many of these challenges involve optimizing functions for which analytic solutions a

  17. Statistical Reporting Errors and Collaboration on Statistical Analyses in Psychological Science.

    Science.gov (United States)

    Veldkamp, Coosje L S; Nuijten, Michèle B; Dominguez-Alvarez, Linda; van Assen, Marcel A L M; Wicherts, Jelte M

    2014-01-01

    Statistical analysis is error prone. A best practice for researchers using statistics would therefore be to share data among co-authors, allowing double-checking of executed tasks just as co-pilots do in aviation. To document the extent to which this 'co-piloting' currently occurs in psychology, we surveyed the authors of 697 articles published in six top psychology journals and asked them whether they had collaborated on four aspects of analyzing data and reporting results, and whether the described data had been shared between the authors. We acquired responses for 49.6% of the articles and found that co-piloting on statistical analysis and reporting results is quite uncommon among psychologists, while data sharing among co-authors seems reasonably but not completely standard. We then used an automated procedure to study the prevalence of statistical reporting errors in the articles in our sample and examined the relationship between reporting errors and co-piloting. Overall, 63% of the articles contained at least one p-value that was inconsistent with the reported test statistic and the accompanying degrees of freedom, and 20% of the articles contained at least one p-value that was inconsistent to such a degree that it may have affected decisions about statistical significance. Overall, the probability that a given p-value was inconsistent was over 10%. Co-piloting was not found to be associated with reporting errors.
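
    The automated consistency check described above (comparing a reported p-value against one recomputed from the test statistic) can be mimicked for the simplest case. This sketch uses a z statistic rather than the t and F statistics the authors checked, to keep the recomputation stdlib-only:

    ```python
    from statistics import NormalDist

    def p_from_z(z):
        """Two-sided p-value for a z statistic."""
        return 2.0 * (1.0 - NormalDist().cdf(abs(z)))

    def is_consistent(z, reported_p, decimals=2):
        """True if the reported p matches the recomputed p within the
        rounding implied by the number of reported decimals."""
        return abs(p_from_z(z) - reported_p) <= 0.5 * 10 ** -decimals
    ```

    For example, z = 1.96 with a reported p of .05 is consistent, while z = 1.2 with a reported p of .01 is the kind of gross inconsistency that could change a significance decision.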

  18. Annual rainfall statistics for stations in the Top End of Australia: normal and log-normal distribution analysis

    International Nuclear Information System (INIS)

    Vardavas, I.M.

    1992-01-01

    A simple procedure is presented for the statistical analysis of measurement data where the primary concern is the determination of the value corresponding to a specified average exceedance probability. The analysis employs the normal and log-normal frequency distributions together with a χ²-test and an error analysis. The error analysis introduces the concept of a counting error criterion, or ζ-test, to test whether the data are sufficient to make the χ²-test reliable. The procedure is applied to the analysis of annual rainfall data recorded at stations in the tropical Top End of Australia where the Ranger uranium deposit is situated. 9 refs., 12 tabs., 9 figs
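
    The central computation, the value corresponding to a specified average exceedance probability under a fitted distribution, can be sketched for the log-normal case. This is an illustration only (moment fitting on log values), not the paper's exact procedure, which adds the χ²- and ζ-tests:

    ```python
    import math
    from statistics import NormalDist, mean, stdev

    def lognormal_exceedance_value(data, exceed_prob):
        """Value exceeded with probability `exceed_prob` under a
        log-normal fit to positive `data` (needs at least two points)."""
        logs = [math.log(x) for x in data]
        mu, sigma = mean(logs), stdev(logs)
        # P(X > v) = p  <=>  log v is the (1 - p) quantile of N(mu, sigma)
        return math.exp(NormalDist(mu, sigma).inv_cdf(1.0 - exceed_prob))
    ```

    At an exceedance probability of 0.5 this returns the geometric mean of the data; smaller exceedance probabilities give larger (rarer) annual rainfall values.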

  19. Statistics

    Science.gov (United States)

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  20. Uterine Cancer Statistics

    Science.gov (United States)


  1. Modern applied U-statistics

    CERN Document Server

    Kowalski, Jeanne

    2008-01-01

    A timely and applied approach to the newly discovered methods and applications of U-statisticsBuilt on years of collaborative research and academic experience, Modern Applied U-Statistics successfully presents a thorough introduction to the theory of U-statistics using in-depth examples and applications that address contemporary areas of study including biomedical and psychosocial research. Utilizing a "learn by example" approach, this book provides an accessible, yet in-depth, treatment of U-statistics, as well as addresses key concepts in asymptotic theory by integrating translational and cross-disciplinary research.The authors begin with an introduction of the essential and theoretical foundations of U-statistics such as the notion of convergence in probability and distribution, basic convergence results, stochastic O's, inference theory, generalized estimating equations, as well as the definition and asymptotic properties of U-statistics. With an emphasis on nonparametric applications when and where applic...
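
    The definition at the heart of the book, an order-m U-statistic as the average of a symmetric kernel over all m-subsets of the sample, is small enough to state in code (an illustrative sketch; the classic example below recovers the unbiased sample variance):

    ```python
    from itertools import combinations
    from statistics import variance

    def u_statistic(data, kernel, m):
        """Order-m U-statistic: average of the kernel over all m-subsets."""
        subsets = list(combinations(data, m))
        return sum(kernel(*args) for args in subsets) / len(subsets)

    # The unbiased sample variance is the U-statistic of order 2 with
    # symmetric kernel h(x, y) = (x - y)**2 / 2.
    ```

    Averaging the kernel over every pair makes the estimator unbiased by construction, which is the property that drives the asymptotic theory the book develops.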

  2. PPARalpha siRNA-treated expression profiles uncover the causal sufficiency network for compound-induced liver hypertrophy.

    Directory of Open Access Journals (Sweden)

    Xudong Dai

    2007-03-01

    Uncovering pathways underlying drug-induced toxicity is a fundamental objective in the field of toxicogenomics. Developing mechanism-based toxicity biomarkers requires the identification of such novel pathways and the order of their sufficiency in causing a phenotypic response. Genome-wide RNA interference (RNAi) phenotypic screening has emerged as an effective tool in unveiling the genes essential for specific cellular functions and biological activities. However, eliciting the relative contribution of and sufficiency relationships among the genes identified remains challenging. In the rodent, the most widely used animal model in preclinical studies, it is unrealistic to exhaustively examine all potential interactions by RNAi screening. Application of existing computational approaches to infer regulatory networks with biological outcomes in the rodent is limited by the requirements for a large number of targeted permutations. Therefore, we developed a two-step relay method that requires only one targeted perturbation for genome-wide de novo pathway discovery. Using expression profiles in response to small interfering RNAs (siRNAs) against the gene for peroxisome proliferator-activated receptor alpha (Ppara), our method unveiled the potential causal sufficiency order network for liver hypertrophy in the rodent. The validity of the inferred 16 causal transcripts or 15 known genes for PPARalpha-induced liver hypertrophy is supported by their ability to predict non-PPARalpha-induced liver hypertrophy with 84% sensitivity and 76% specificity. Simulation shows that the probability of achieving such predictive accuracy without the inferred causal relationship is exceedingly small (p < 0.005). Five of the most sufficient causal genes have been previously disrupted in mouse models; the resulting phenotypic changes in the liver support the inferred causal roles in liver hypertrophy. 
Our results demonstrate the feasibility of defining pathways mediating drug

  3. High-dimensional statistical inference: From vector to matrix

    Science.gov (United States)

    Zhang, Anru

    Statistical inference for sparse signals or low-rank matrices in high-dimensional settings is of significant interest in a range of contemporary applications. It has attracted significant recent attention in many fields including statistics, applied mathematics and electrical engineering. In this thesis, we consider several problems in including sparse signal recovery (compressed sensing under restricted isometry) and low-rank matrix recovery (matrix recovery via rank-one projections and structured matrix completion). The first part of the thesis discusses compressed sensing and affine rank minimization in both noiseless and noisy cases and establishes sharp restricted isometry conditions for sparse signal and low-rank matrix recovery. The analysis relies on a key technical tool which represents points in a polytope by convex combinations of sparse vectors. The technique is elementary while leads to sharp results. It is shown that, in compressed sensing, for any ε > 0, the conditions δ_k^A < 1/3 + ε, δ_k^A + θ_{k,k}^A < 1 + ε, or δ_{tk}^A < √((t - 1)/t) + ε are not sufficient to guarantee the exact recovery of all k-sparse signals for large k. A similar result also holds for matrix recovery. In addition, the conditions δ_k^A < 1/3, δ_k^A + θ_{k,k}^A < 1, δ_{tk}^A < √((t - 1)/t) and δ_r^M < 1/3, δ_r^M + θ_{r,r}^M < 1, δ_{tr}^M < √((t - 1)/t) are also shown to be sufficient respectively for stable recovery of approximately sparse signals and low-rank matrices in the noisy case. For the second part of the thesis, we introduce a rank-one projection model for low-rank matrix recovery and propose a constrained nuclear norm minimization method for stable recovery of low-rank matrices in the noisy case. The procedure is adaptive to the rank and robust against small perturbations. Both upper and lower bounds for the estimation accuracy under the Frobenius norm loss are obtained. The proposed estimator is shown to be rate-optimal under certain conditions. The
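
    The restricted isometry constant δ_k that these conditions bound is the worst-case spectral deviation of any k-column Gram submatrix from the identity. For tiny k it can be computed by brute force, as in this didactic sketch (feasible only for k ≤ 2 here; verifying RIP for general matrices is computationally intractable):

    ```python
    import math
    from itertools import combinations

    def rip_delta(cols, k):
        """Restricted isometry constant δ_k by brute force over k-column
        submatrices (k = 1 or 2 only; columns given as lists of floats)."""
        def dot(u, v):
            return sum(a * b for a, b in zip(u, v))
        worst = 0.0
        for S in combinations(range(len(cols)), k):
            if k == 1:
                eigs = [dot(cols[S[0]], cols[S[0]])]
            else:  # eigenvalues of the 2x2 Gram matrix in closed form
                a = dot(cols[S[0]], cols[S[0]])
                b = dot(cols[S[1]], cols[S[1]])
                c = dot(cols[S[0]], cols[S[1]])
                disc = math.sqrt((a - b) ** 2 + 4 * c * c)
                eigs = [(a + b + disc) / 2, (a + b - disc) / 2]
            worst = max(worst, max(abs(e - 1.0) for e in eigs))
        return worst
    ```

    Orthonormal columns give δ_2 = 0, while two unit columns with inner product c give δ_2 = |c|, matching the intuition that correlated columns degrade the isometry.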

  4. Understanding advanced statistical methods

    CERN Document Server

    Westfall, Peter

    2013-01-01

    Introduction: Probability, Statistics, and Science; Reality, Nature, Science, and Models; Statistical Processes: Nature, Design and Measurement, and Data; Models; Deterministic Models; Variability; Parameters; Purely Probabilistic Statistical Models; Statistical Models with Both Deterministic and Probabilistic Components; Statistical Inference; Good and Bad Models; Uses of Probability Models; Random Variables and Their Probability Distributions: Introduction; Types of Random Variables: Nominal, Ordinal, and Continuous; Discrete Probability Distribution Functions; Continuous Probability Distribution Functions; Some Calculus: Derivatives and Least Squares; More Calculus: Integrals and Cumulative Distribution Functions; Probability Calculation and Simulation: Introduction; Analytic Calculations, Discrete and Continuous Cases; Simulation-Based Approximation; Generating Random Numbers; Identifying Distributions: Introduction; Identifying Distributions from Theory Alone; Using Data: Estimating Distributions via the Histogram; Quantiles: Theoretical and Data-Based Estimate...

  5. Bayesian-statistical decision threshold, detection limit, and confidence interval in nuclear radiation measurement

    International Nuclear Information System (INIS)

    Weise, K.

    1998-01-01

    When a contribution of a particular nuclear radiation is to be detected (for instance, a spectral line of interest for some purpose of radiation protection), and quantities with uncertainties that cannot be determined by repeated measurements or by counting nuclear radiation events (such as influence quantities) must be taken into account, then conventional statistics of event frequencies is not sufficient for defining the decision threshold, the detection limit, and the limits of a confidence interval. These characteristic limits are therefore redefined on the basis of Bayesian statistics for a wider applicability and in such a way that the usual practice remains as far as possible unaffected. The principle of maximum entropy is applied to establish probability distributions from available information. Quantiles of these distributions are used for defining the characteristic limits. But such a distribution must not be interpreted as a distribution of event frequencies such as the Poisson distribution. It rather expresses the actual state of incomplete knowledge of a physical quantity. The different definitions and interpretations and their quantitative consequences are presented and discussed with two examples. The new approach provides a theoretical basis for the DIN 25482-10 standard presently in preparation for general applications of the characteristic limits. (orig.) [de
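
    For the simple counting case with no influence quantities, the characteristic limits reduce to familiar formulas. The sketch below follows the ISO 11929-style Gaussian approximation for a net count rate from gross and background counting; it illustrates the idea only and is not the DIN 25482-10 procedure itself:

    ```python
    from statistics import NormalDist

    def characteristic_limits(n_b, t_b, t_g, alpha=0.05, beta=0.05):
        """Decision threshold y* and detection limit y# for a net count
        rate, given background counts n_b over time t_b and a gross
        counting time t_g (simplified Gaussian counting model)."""
        k_a = NormalDist().inv_cdf(1 - alpha)
        k_b = NormalDist().inv_cdf(1 - beta)
        r_b = n_b / t_b                          # background count rate
        u0 = (r_b / t_g + r_b / t_b) ** 0.5      # net-rate std at true value 0
        y_star = k_a * u0                        # decision threshold
        y = y_star                               # iterate y# = y* + k_b * u(y#)
        for _ in range(50):
            u_y = (y / t_g + r_b / t_g + r_b / t_b) ** 0.5
            y = y_star + k_b * u_y
        return y_star, y
    ```

    The detection limit is found by fixed-point iteration because the uncertainty itself grows with the assumed true net rate.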

  6. Statistical Monitoring of Innovation Capacities of the Serbian Firms as Decision- Making Tool

    Directory of Open Access Journals (Sweden)

    Marija Mosurović Ružičić

    2016-12-01

    The subject of this paper is to underline the importance of using data obtained via official statistical reports based on the Oslo Manual methodology (Community Innovation Survey) for strategic decision making, both at the national level and at the level of the company. These data enable monitoring and evaluating the innovation capacity of firms with the aim of improving it. The paper also points out the importance of assessing a firm's innovation capacity as a driver of economic development based on knowledge. With the data obtained by this methodology, national decision makers can clearly comprehend and improve the direction of innovation policy and its integration into the wider policy framework that encourages economic development based on innovation. At the firm level, the use of these data implies the development of professional management of the innovative firm that can respond to problem situations of the modern economy through the formulation of appropriate strategies. The paper analyzed data from three statistical periods during which the Oslo Manual methodology had been applied in Serbia. The analysis has shown that the data obtained in this way are not sufficiently used by decision makers when assessing the innovation capacity of enterprises.

  7. UN Data- Environmental Statistics: Waste

    Data.gov (United States)

    World Wide Human Geography Data Working Group — The Environment Statistics Database contains selected water and waste statistics by country. Statistics on water and waste are based on official statistics supplied...

  9. Statistics 101 for Radiologists.

    Science.gov (United States)

    Anvari, Arash; Halpern, Elkan F; Samir, Anthony E

    2015-10-01

    Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
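
    The sensitivity, specificity, accuracy, and likelihood-ratio measures reviewed above all follow from a 2x2 contingency table of test results against disease status. A minimal sketch (the function and dictionary key names are my own):

    ```python
    def diagnostic_metrics(tp, fp, fn, tn):
        """Basic test-performance measures from a 2x2 contingency table
        (tp, fp, fn, tn = true/false positives and negatives)."""
        sens = tp / (tp + fn)                 # P(test+ | disease)
        spec = tn / (tn + fp)                 # P(test- | no disease)
        return {
            "sensitivity": sens,
            "specificity": spec,
            "accuracy": (tp + tn) / (tp + fp + fn + tn),
            "LR+": sens / (1 - spec),         # positive likelihood ratio
            "LR-": (1 - sens) / spec,         # negative likelihood ratio
        }
    ```

    For example, 90 true positives, 10 false negatives, 80 true negatives, and 20 false positives give a sensitivity of 0.90, a specificity of 0.80, and a positive likelihood ratio of 4.5.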

  10. Energy statistics yearbook 2002

    International Nuclear Information System (INIS)

    2005-01-01

    The Energy Statistics Yearbook 2002 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-sixth in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities

  11. Energy statistics yearbook 2001

    International Nuclear Information System (INIS)

    2004-01-01

    The Energy Statistics Yearbook 2001 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-fifth in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities

  12. Energy statistics yearbook 2000

    International Nuclear Information System (INIS)

    2002-01-01

    The Energy Statistics Yearbook 2000 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-third in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities

  13. Baseline Statistics of Linked Statistical Data

    NARCIS (Netherlands)

    Scharnhorst, Andrea; Meroño-Peñuela, Albert; Guéret, Christophe

    2014-01-01

    We are surrounded by an ever increasing ocean of information, everybody will agree to that. We build sophisticated strategies to govern this information: design data models, develop infrastructures for data sharing, building tool for data analysis. Statistical datasets curated by National

  14. Statistics for Finance

    DEFF Research Database (Denmark)

    Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard

    Statistics for Finance develops students’ professional skills in statistics with applications in finance. Developed from the authors’ courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics...

  15. Predator confusion is sufficient to evolve swarming behaviour.

    Science.gov (United States)

    Olson, Randal S; Hintze, Arend; Dyer, Fred C; Knoester, David B; Adami, Christoph

    2013-08-06

    Swarming behaviours in animals have been extensively studied owing to their implications for the evolution of cooperation, social cognition and predator-prey dynamics. An important goal of these studies is discerning which evolutionary pressures favour the formation of swarms. One hypothesis is that swarms arise because the presence of multiple moving prey in swarms causes confusion for attacking predators, but it remains unclear how important this selective force is. Using an evolutionary model of a predator-prey system, we show that predator confusion provides a sufficient selection pressure to evolve swarming behaviour in prey. Furthermore, we demonstrate that the evolutionary effect of predator confusion on prey could in turn exert pressure on the structure of the predator's visual field, favouring the frontally oriented, high-resolution visual systems commonly observed in predators that feed on swarming animals. Finally, we provide evidence that when prey evolve swarming in response to predator confusion, there is a change in the shape of the functional response curve describing the predator's consumption rate as prey density increases. Thus, we show that a relatively simple perceptual constraint, predator confusion, could have pervasive evolutionary effects on prey behaviour, predator sensory mechanisms and the ecological interactions between predators and prey.

  16. Fermi–Dirac Statistics

    Indian Academy of Sciences (India)

    IAS Admin

    Pauli exclusion principle, Fermi–Dirac statistics, identical and indistinguishable particles, Fermi gas. Fermi–Dirac Statistics: Derivation and Consequences. S Chaturvedi and Shyamal Biswas. Subhash Chaturvedi is at the University of Hyderabad; his current research interests include phase space descriptions.

  17. Statistical symmetries in physics

    International Nuclear Information System (INIS)

    Green, H.S.; Adelaide Univ., SA

    1994-01-01

    Every law of physics is invariant under some group of transformations and is therefore the expression of some type of symmetry. Symmetries are classified as geometrical, dynamical or statistical. At the most fundamental level, statistical symmetries are expressed in the field theories of the elementary particles. This paper traces some of the developments from the discovery of Bose statistics, one of the two fundamental symmetries of physics. A series of generalizations of Bose statistics is described. A supersymmetric generalization accommodates fermions as well as bosons, and further generalizations, including parastatistics, modular statistics and graded statistics, accommodate particles with properties such as 'colour'. A factorization of elements of gl(n_b, n_f) can be used to define truncated boson operators. A general construction is given for q-deformed boson operators, and explicit constructions of the same type are given for various 'deformed' algebras. A summary is given of some of the applications and potential applications. 39 refs., 2 figs.

  18. Vital statistics

    CERN Document Server

    MacKenzie, Dana

    2004-01-01

    The drawbacks of using 19th-century mathematics in physics and astronomy are illustrated. To continue expanding our knowledge of the cosmos, scientists will have to come to terms with modern statistics. Some researchers have deliberately started importing techniques that are used in medical research. However, physicists need to identify the brand of statistics that will be suitable for them, and make a choice between the Bayesian and frequentist approaches. (Edited abstract).

  19. Statistical mechanics

    CERN Document Server

    Schwabl, Franz

    2006-01-01

    The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechan...

  20. Statistical optics

    Science.gov (United States)

    Goodman, J. W.

    This book is based on the thesis that some training in the area of statistical optics should be included as a standard part of any advanced optics curriculum. Random variables are discussed, taking into account definitions of probability and random variables, distribution functions and density functions, an extension to two or more random variables, statistical averages, transformations of random variables, sums of real random variables, Gaussian random variables, complex-valued random variables, and random phasor sums. Other subjects examined are related to random processes, some first-order properties of light waves, the coherence of optical waves, some problems involving high-order coherence, effects of partial coherence on imaging systems, imaging in the presence of randomly inhomogeneous media, and fundamental limits in photoelectric detection of light. Attention is given to deterministic versus statistical phenomena and models, the Fourier transform, and the fourth-order moment of the spectrum of a detected speckle image.
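    The "random phasor sums" covered in the book have a classic consequence worth illustrating: the resultant of many unit-amplitude phasors with uniformly random phases is circular-Gaussian, so its amplitude follows a Rayleigh distribution. The following minimal simulation is our own illustrative sketch, not code from the book; the phasor count and trial count are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)
n_phasors, n_trials = 500, 20000

# unit-amplitude phasors with uniformly random phases, normalized by sqrt(n)
phases = rng.uniform(0.0, 2.0 * np.pi, size=(n_trials, n_phasors))
resultant = np.exp(1j * phases).sum(axis=1) / np.sqrt(n_phasors)
amplitude = np.abs(resultant)

# Each quadrature tends to N(0, 1/2), so the amplitude is Rayleigh with
# sigma = 1/sqrt(2) and mean amplitude sigma * sqrt(pi/2) = sqrt(pi)/2
print(amplitude.mean())
```

The sample mean should sit near sqrt(pi)/2 ≈ 0.886, the Rayleigh prediction.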

  1. [The research protocol VI: How to choose the appropriate statistical test. Inferential statistics].

    Science.gov (United States)

    Flores-Ruiz, Eric; Miranda-Novales, María Guadalupe; Villasís-Keever, Miguel Ángel

    2017-01-01

    The statistical analysis can be divided into two main components: descriptive analysis and inferential analysis. Inference consists of drawing conclusions from the tests performed on the data obtained from a sample of a population. Statistical tests are used in order to establish the probability that a conclusion obtained from a sample is applicable to the population from which it was obtained. However, choosing the appropriate statistical test generally poses a challenge for novice researchers. To choose the statistical test it is necessary to take into account three aspects: the research design, the number of measurements and the scale of measurement of the variables. Statistical tests are divided into two sets, parametric and nonparametric. Parametric tests can only be used if the data show a normal distribution. Choosing the right statistical test will make it easier for readers to understand and apply the results.
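    The decision rule this abstract describes (check the data's distribution, then pick a parametric or nonparametric test) can be sketched in a few lines with SciPy. This is an illustrative sketch, not code from the article; the synthetic data, the Shapiro-Wilk normality check, and the 0.05 threshold are our assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
group_a = rng.normal(10.0, 2.0, size=40)  # synthetic measurements, group A
group_b = rng.normal(11.0, 2.0, size=40)  # synthetic measurements, group B

alpha = 0.05
# normality check on each group (Shapiro-Wilk)
normal = (stats.shapiro(group_a).pvalue > alpha and
          stats.shapiro(group_b).pvalue > alpha)

if normal:
    # parametric: independent-samples t-test
    result = stats.ttest_ind(group_a, group_b)
    test_name = "t-test"
else:
    # nonparametric fallback: Mann-Whitney U
    result = stats.mannwhitneyu(group_a, group_b)
    test_name = "Mann-Whitney U"

print(test_name, result.pvalue)
```

The same branching logic extends to paired designs (paired t-test vs. Wilcoxon) and to more than two groups (ANOVA vs. Kruskal-Wallis).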

  2. The research protocol VI: How to choose the appropriate statistical test. Inferential statistics

    Directory of Open Access Journals (Sweden)

    Eric Flores-Ruiz

    2017-10-01

    Full Text Available The statistical analysis can be divided into two main components: descriptive analysis and inferential analysis. Inference consists of drawing conclusions from the tests performed on the data obtained from a sample of a population. Statistical tests are used in order to establish the probability that a conclusion obtained from a sample is applicable to the population from which it was obtained. However, choosing the appropriate statistical test generally poses a challenge for novice researchers. To choose the statistical test it is necessary to take into account three aspects: the research design, the number of measurements and the scale of measurement of the variables. Statistical tests are divided into two sets, parametric and nonparametric. Parametric tests can only be used if the data show a normal distribution. Choosing the right statistical test will make it easier for readers to understand and apply the results.

  3. Business Statistics Education: Content and Software in Undergraduate Business Statistics Courses.

    Science.gov (United States)

    Tabatabai, Manouchehr; Gamble, Ralph

    1997-01-01

    Survey responses from 204 of 500 business schools identified the topics most often covered in business statistics I and II courses. The most popular software at both levels was Minitab. Most schools required both statistics I and II. (SK)

  4. Statistics for non-statisticians

    CERN Document Server

    Madsen, Birger Stjernholm

    2016-01-01

    This book was written for those who need to know how to collect, analyze and present data. It is meant to be a first course for practitioners, a book for private study or brush-up on statistics, and supplementary reading for general statistics classes. The book is untraditional, both with respect to the choice of topics and the presentation: Topics were determined by what is most useful for practical statistical work, and the presentation is as non-mathematical as possible. The book contains many examples using statistical functions in spreadsheets. In this second edition, new topics have been included e.g. within the area of statistical quality control, in order to make the book even more useful for practitioners working in industry. .

  5. Safety by statistics? A critical view on statistical methods applied in health physics

    International Nuclear Information System (INIS)

    Kraut, W.

    2016-01-01

    The only proper way to describe uncertainties in health physics is by statistical means. But statistics can never replace your personal evaluation of an effect, nor can it transmute randomness into certainty like an 'uncertainty laundry'. The paper discusses these problems in routine practical work.

  6. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g., Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 2000, Energy exports by recipient country in January-June 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products.

  7. Generalized $L$-, $M$-, and $R$-Statistics

    OpenAIRE

    Serfling, Robert J.

    1984-01-01

    A class of statistics generalizing $U$-statistics and $L$-statistics, and containing other varieties of statistic as well, such as trimmed $U$-statistics, is studied. Using the differentiable statistical function approach, differential approximations are obtained and the influence curves of these generalized $L$-statistics are derived. These results are employed to establish asymptotic normality for such statistics. Parallel generalizations of $M$- and $R$-statistics are noted. Strong converg...

  8. Infant Statistical Learning

    Science.gov (United States)

    Saffran, Jenny R.; Kirkham, Natasha Z.

    2017-01-01

    Perception involves making sense of a dynamic, multimodal environment. In the absence of mechanisms capable of exploiting the statistical patterns in the natural world, infants would face an insurmountable computational problem. Infant statistical learning mechanisms facilitate the detection of structure. These abilities allow the infant to compute across elements in their environmental input, extracting patterns for further processing and subsequent learning. In this selective review, we summarize findings that show that statistical learning is both a broad and flexible mechanism (supporting learning from different modalities across many different content areas) and input specific (shifting computations depending on the type of input and goal of learning). We suggest that statistical learning not only provides a framework for studying language development and object knowledge in constrained laboratory settings, but also allows researchers to tackle real-world problems, such as multilingualism, the role of ever-changing learning environments, and differential developmental trajectories. PMID:28793812

  9. Community Service and University Roles: An Action Research Based on the Philosophy of Sufficiency Economy

    Science.gov (United States)

    Nuangchalerm, Prasart; Chansirisira, Pacharawit

    2012-01-01

    This study employs action research to develop community service through university roles by applying the philosophy of sufficiency economy of His Majesty King Bhumibol Adulyadej to fulfill villagers' way of life. Participatory learning, seminars, field trips and supervision were employed for the strategic plan. Data were collected by participatory…

  10. Statistical inference

    CERN Document Server

    Rohatgi, Vijay K

    2003-01-01

    Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth

  11. Understanding Computational Bayesian Statistics

    CERN Document Server

    Bolstad, William M

    2011-01-01

    A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic

  12. Philosophy of sufficiency economy for community-based adaptation to climate change: Lessons learned from Thai case studies

    Directory of Open Access Journals (Sweden)

    Kulvadee Kansuntisukmongkol

    2017-01-01

    Full Text Available Major components within the philosophy of a sufficiency economy include moderation, prudence, and self-immunity together with knowledge and morality. These components were proposed to safeguard local communities from adverse changes and crises. Climatic crises due to global warming can impact upon local agricultural production and consumption systems. Yet, it is still questionable whether communities following the sufficiency economy philosophy can cope with climate change. The objective of this research was to study the coping and adaptive capacity to climate change of local agricultural communities following the sufficiency economy philosophy and to analyze the success factors of adaptation to climate change. The research found five adaptive strategies leading to a resilient livelihood: (1) self-evaluation, (2) diversity dependency, (3) storage and reserve, (4) cooperation, and (5) mobility over space and time. These strategies help to reduce exposure and sensitivity, while increasing adaptive capacity to climate change with the aims of sustainability and adaptation for survival, and protecting natural resource bases for food and settlement security. Moderation, prudence, and self-immunity are critical success factors of adaptation measures, whereas local ecological knowledge with morality is a core enabling factor for adapting to climate change. These factors can be applied in community-based climate change adaptation in the National Adaptation Plan.

  13. Children's Sleep Needs: Is There Sufficient Evidence to Recommend Optimal Sleep for Children?

    OpenAIRE

    Matricciani, Lisa; Blunden, Sarah; Rigney, Gabrielle; Williams, Marie T.; Olds, Tim S.

    2013-01-01

    It is widely recognized that sleep is important for children's health and well-being and that short sleep duration is associated with a wide range of negative health outcomes. Recently, there has been much interest in whether or not there are sufficient data to support the specific recommendations made for how much sleep children need. In this article we explore concepts related to children's sleep need, discuss the theory, rationale, and empirical evidence for contemporary sleep recommendati...

  14. Statistical Group Comparison

    CERN Document Server

    Liao, Tim Futing

    2011-01-01

    An incomparably useful examination of statistical methods for comparison. The nature of doing science, be it natural or social, inevitably calls for comparison. Statistical methods are at the heart of such comparison, for they not only help us gain understanding of the world around us but often define how our research is to be carried out. The need to compare between groups is best exemplified by experiments, which have clearly defined statistical methods. However, true experiments are not always possible. What complicates the matter more is a great deal of diversity in factors that are not inde

  15. Statistical Analysis of Hubble/WFC3 Transit Spectroscopy of Extrasolar Planets

    Science.gov (United States)

    Fu, Guangwei; Deming, Drake; Knutson, Heather; Madhusudhan, Nikku; Mandell, Avi; Fraine, Jonathan

    2018-01-01

    Transmission spectroscopy provides a window to study exoplanetary atmospheres, but that window is fogged by clouds and hazes. Clouds and haze introduce a degeneracy between the strength of gaseous absorption features and planetary physical parameters such as abundances. One way to break that degeneracy is via statistical studies. We collect all published HST/WFC3 transit spectra for 1.1-1.65 micron water vapor absorption, and perform a statistical study on potential correlations between the water absorption feature and planetary parameters. We fit the observed spectra with a template calculated for each planet using the Exo-Transmit code. We express the magnitude of the water absorption in scale heights, thereby removing the known dependence on temperature, surface gravity, and mean molecular weight. We find that the absorption in scale heights has a positive baseline correlation with planetary equilibrium temperature; our hypothesis is that decreasing cloud condensation with increasing temperature is responsible for this baseline slope. However, the observed sample is also intrinsically degenerate in the sense that equilibrium temperature correlates with planetary mass. We compile the distribution of absorption in scale heights, and we find that this distribution is closer to log-normal than Gaussian. However, we also find that the distribution of equilibrium temperatures for the observed planets is similarly log-normal. This indicates that the absorption values are affected by observational bias, whereby observers have not yet targeted a sufficient sample of the hottest planets.
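    The normalization step the paper describes, expressing a transit-depth feature in pressure scale heights H = kT/(mu * m_H * g), can be sketched numerically. The planet and star parameters and the feature amplitude below are illustrative round numbers we assume for a hot Jupiter, not figures taken from the paper's sample.

```python
# physical constants
k_B = 1.380649e-23      # Boltzmann constant, J/K
m_H = 1.6735575e-27     # hydrogen atom mass, kg

def scale_height(T_eq, mu, g):
    """Pressure scale height H = k_B * T / (mu * m_H * g), in metres."""
    return k_B * T_eq / (mu * m_H * g)

# assumed hot-Jupiter-like values (illustrative only)
T_eq = 1200.0   # K, equilibrium temperature
mu = 2.3        # mean molecular weight, H2/He-dominated atmosphere
g = 21.0        # m/s^2, surface gravity

H = scale_height(T_eq, mu, g)

# assumed transit-depth amplitude of a water feature, fractional units
delta_depth = 2.0e-4
R_p, R_s = 8.0e7, 5.6e8   # assumed planet / star radii, metres

# depth change ~ 2 * R_p * dR / R_s^2, so invert for the radius change dR,
# then express dR in scale heights
dR = delta_depth * R_s**2 / (2.0 * R_p)
print(H / 1e3, "km per scale height;", dR / H, "scale heights")
```

Dividing out H in this way removes the explicit dependence on temperature, gravity, and mean molecular weight, which is what lets absorption amplitudes be compared across planets.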

  16. Statistical Analysis of Hubble /WFC3 Transit Spectroscopy of Extrasolar Planets

    Energy Technology Data Exchange (ETDEWEB)

    Fu, Guangwei; Deming, Drake [Department of Astronomy, University of Maryland, College Park, MD 20742 (United States); Knutson, Heather [Division of Geological and Planetary Sciences, California Institute of Technology, Pasadena, CA 91125 (United States); Madhusudhan, Nikku [Institute of Astronomy, University of Cambridge, Madingley Road, Cambridge CB3 0HA (United Kingdom); Mandell, Avi [Planetary Systems Laboratory, NASA Goddard Space Flight Center, Greenbelt, MD 20771 (United States); Fraine, Jonathan, E-mail: gfu@astro.umd.edu [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21218 (United States)

    2017-10-01

    Transmission spectroscopy provides a window to study exoplanetary atmospheres, but that window is fogged by clouds and hazes. Clouds and haze introduce a degeneracy between the strength of gaseous absorption features and planetary physical parameters such as abundances. One way to break that degeneracy is via statistical studies. We collect all published HST/WFC3 transit spectra for 1.1–1.65 μm water vapor absorption and perform a statistical study on potential correlations between the water absorption feature and planetary parameters. We fit the observed spectra with a template calculated for each planet using the Exo-Transmit code. We express the magnitude of the water absorption in scale heights, thereby removing the known dependence on temperature, surface gravity, and mean molecular weight. We find that the absorption in scale heights has a positive baseline correlation with planetary equilibrium temperature; our hypothesis is that decreasing cloud condensation with increasing temperature is responsible for this baseline slope. However, the observed sample is also intrinsically degenerate in the sense that equilibrium temperature correlates with planetary mass. We compile the distribution of absorption in scale heights, and we find that this distribution is closer to log-normal than Gaussian. However, we also find that the distribution of equilibrium temperatures for the observed planets is similarly log-normal. This indicates that the absorption values are affected by observational bias, whereby observers have not yet targeted a sufficient sample of the hottest planets.

  17. Statistical Analysis of Hubble /WFC3 Transit Spectroscopy of Extrasolar Planets

    International Nuclear Information System (INIS)

    Fu, Guangwei; Deming, Drake; Knutson, Heather; Madhusudhan, Nikku; Mandell, Avi; Fraine, Jonathan

    2017-01-01

    Transmission spectroscopy provides a window to study exoplanetary atmospheres, but that window is fogged by clouds and hazes. Clouds and haze introduce a degeneracy between the strength of gaseous absorption features and planetary physical parameters such as abundances. One way to break that degeneracy is via statistical studies. We collect all published HST/WFC3 transit spectra for 1.1–1.65 μm water vapor absorption and perform a statistical study on potential correlations between the water absorption feature and planetary parameters. We fit the observed spectra with a template calculated for each planet using the Exo-Transmit code. We express the magnitude of the water absorption in scale heights, thereby removing the known dependence on temperature, surface gravity, and mean molecular weight. We find that the absorption in scale heights has a positive baseline correlation with planetary equilibrium temperature; our hypothesis is that decreasing cloud condensation with increasing temperature is responsible for this baseline slope. However, the observed sample is also intrinsically degenerate in the sense that equilibrium temperature correlates with planetary mass. We compile the distribution of absorption in scale heights, and we find that this distribution is closer to log-normal than Gaussian. However, we also find that the distribution of equilibrium temperatures for the observed planets is similarly log-normal. This indicates that the absorption values are affected by observational bias, whereby observers have not yet targeted a sufficient sample of the hottest planets.

  18. Statistical physics

    CERN Document Server

    Guénault, Tony

    2007-01-01

    In this revised and enlarged second edition of an established text Tony Guénault provides a clear and refreshingly readable introduction to statistical physics, an essential component of any first degree in physics. The treatment itself is self-contained and concentrates on an understanding of the physical ideas, without requiring a high level of mathematical sophistication. A straightforward quantum approach to statistical averaging is adopted from the outset (easier, the author believes, than the classical approach). The initial part of the book is geared towards explaining the equilibrium properties of a simple isolated assembly of particles. Thus, several important topics, for example an ideal spin-½ solid, can be discussed at an early stage. The treatment of gases gives full coverage to Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein statistics. Towards the end of the book the student is introduced to a wider viewpoint and new chapters are included on chemical thermodynamics, interactions in, for exam...

  19. Exclusion statistics and integrable models

    International Nuclear Information System (INIS)

    Mashkevich, S.

    1998-01-01

    The definition of exclusion statistics, as given by Haldane, allows for a statistical interaction between distinguishable particles (multi-species statistics). The thermodynamic quantities for such statistics can be evaluated exactly. The explicit expressions for the cluster coefficients are presented. Furthermore, single-species exclusion statistics is realized in one-dimensional integrable models. The interesting questions of generalizing this correspondence to the higher-dimensional and the multi-species cases remain essentially open

  20. The Relationship Between Maternal Attachment, Perceived Social Support and Breast-Feeding Sufficiency

    International Nuclear Information System (INIS)

    Cinar, N.; Kose, D.; Altinkaynak, S.

    2015-01-01

    Objective: To determine the relationship between maternal attachment, perceived social support and breast-feeding sufficiency. Study Design: Descriptive correlational design. Place and Duration of Study: A state hospital and two family health centers in Sakarya, Turkey, between June to December 2011. Methodology: The sample was 122 voluntary mothers who had healthy babies of 1 - 2 months old. The data were collected by a Personal Information Form, Maternal Attachment Inventory (MAI), Multidimensional Scale of Perceived Social Support (MSPSS) and Breast-feeding Self-Efficacy Scale-Short Form (BSES-SF). The data collected were analysed by percentage distribution, mean square, independent sample t-test, Mann-Whitney U, Kruskall-Wallis and Pearson correlation. Results: The mean age of the mothers was 25.01 ± 2.2 years, and 48.4% of them were primary school graduates. BSES-SF was 61.02 ± 8.44 (16 - 70), MAI was 99.07 ± 7.19 (56 - 100) and MSPSS was 66.40 ± 13.58 (37 - 84). There was a positive, medium level, significant relationship between the total scores of BSES-SF and MAI (r=0.370, p < 0.001). There was a positive, medium level, significant relationship between the total score of BSES-SF and the score from family subdimension of MSPSS (r=0.255, p < 0.01). There was a positive, medium level, significant relationship between the total score of MAI and the total score of family subdimension of MSPSS (r=0.339, p < 0.001). Conclusion: Successful maternal attachment and familial support positively affected the breast-feeding sufficiency of the mother. (author)
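    The correlation analysis this study reports is a standard Pearson test between two scale totals. A minimal sketch with SciPy follows; the scores are synthetic values we generate to roughly match the reported means and spreads, not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 122  # sample size matching the study

# synthetic scale totals (illustrative, not the study data):
# MAI-like scores, plus BSES-SF-like scores correlated with them
mai = rng.normal(99.0, 7.2, size=n)
bses = 0.43 * (mai - mai.mean()) + rng.normal(61.0, 7.8, size=n)

# Pearson product-moment correlation and its two-sided p-value
r, p = stats.pearsonr(mai, bses)
print(f"r = {r:.3f}, p = {p:.4f}")
```

With n = 122 paired observations, even a medium-level r of about 0.3-0.4 yields a very small p-value, consistent with the significance levels quoted in the abstract.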

  1. Bridging Expectations: Extension Agents’ Perception of a Gap between Expectations and Experience when Implementing the Indonesian Beef Self-Sufficiency Programme

    DEFF Research Database (Denmark)

    Gayatri, Siwi; Vaarst, Mette

    2016-01-01

    Beef self-sufficiency programme (BSSP) was launched in Indonesia in 2004 in response to the massive import of beef from other countries. The objective of the present article is to explore and discuss how Indonesian extension agents perceived the practical implementation of the programme, including ... of the programme, the future of self-sufficiency regarding beef production in the country, and how this learning could be captured and used for the future...

  2. Fabrication of Hyperbranched Block-Statistical Copolymer-Based Prodrug with Dual Sensitivities for Controlled Release.

    Science.gov (United States)

    Zheng, Luping; Wang, Yunfei; Zhang, Xianshuo; Ma, Liwei; Wang, Baoyan; Ji, Xiangling; Wei, Hua

    2018-01-17

    Dendrimer with hyperbranched structure and multivalent surface is regarded as one of the most promising candidates close to the ideal drug delivery systems, but the clinical translation and scale-up production of dendrimer has been hampered significantly by the synthetic difficulties. Therefore, there is considerable scope for the development of novel hyperbranched polymer that can not only address the drawbacks of dendrimer but maintain its advantages. The reversible addition-fragmentation chain transfer self-condensing vinyl polymerization (RAFT-SCVP) technique has enabled facile preparation of segmented hyperbranched polymer (SHP) by using chain transfer monomer (CTM)-based double-head agent during the past decade. Meanwhile, the design and development of block-statistical copolymers has been proven in our recent studies to be a simple yet effective way to address the extracellular stability vs intracellular high delivery efficacy dilemma. To integrate the advantages of both hyperbranched and block-statistical structures, we herein reported the fabrication of hyperbranched block-statistical copolymer-based prodrug with pH and reduction dual sensitivities using RAFT-SCVP and post-polymerization click coupling. The external homo oligo(ethylene glycol methyl ether methacrylate) (OEGMA) block provides sufficient extracellular colloidal stability for the nanocarriers by steric hindrance, and the interior OEGMA units incorporated by the statistical copolymerization promote intracellular drug release by facilitating the permeation of GSH and H+ for the cleavage of the reduction-responsive disulfide bond and pH-labile carbonate link as well as weakening the hydrophobic encapsulation of drug molecules. The delivery efficacy of the target hyperbranched block-statistical copolymer-based prodrug was evaluated in terms of in vitro drug release and cytotoxicity studies, which confirms both acidic pH and reduction-triggered drug release for inhibiting proliferation of He

  3. Parameter sampling capabilities of sequential and simultaneous data assimilation: II. Statistical analysis of numerical results

    International Nuclear Information System (INIS)

    Fossum, Kristian; Mannseth, Trond

    2014-01-01

    We assess and compare parameter sampling capabilities of one sequential and one simultaneous Bayesian, ensemble-based, joint state-parameter (JS) estimation method. In the companion paper, part I (Fossum and Mannseth 2014 Inverse Problems 30 114002), analytical investigations lead us to propose three claims, essentially stating that the sequential method can be expected to outperform the simultaneous method for weakly nonlinear forward models. Here, we assess the reliability and robustness of these claims through statistical analysis of results from a range of numerical experiments. Samples generated by the two approximate JS methods are compared to samples from the posterior distribution generated by a Markov chain Monte Carlo method, using four approximate measures of distance between probability distributions. Forward-model nonlinearity is assessed from a stochastic nonlinearity measure allowing for sufficiently large model dimensions. Both toy models (with low computational complexity, and where the nonlinearity is fairly easy to control) and two-phase porous-media flow models (corresponding to down-scaled versions of problems to which the JS methods have been frequently applied recently) are considered in the numerical experiments. Results from the statistical analysis show strong support of all three claims stated in part I. (paper)
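    The comparison the paper describes, measuring how far an approximate sampler's draws sit from reference posterior draws, can be illustrated with one concrete distance measure. The two-sample Kolmogorov-Smirnov statistic below is our illustrative choice, not necessarily one of the paper's four measures, and the samples are synthetic stand-ins for MCMC and ensemble-method output.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# stand-in for reference posterior draws (e.g. from MCMC)
reference = rng.normal(0.0, 1.0, size=5000)
# stand-in for an approximate sampler's draws, slightly shifted and widened
candidate = rng.normal(0.2, 1.1, size=5000)

# two-sample KS statistic: max vertical gap between the empirical CDFs,
# 0 for identical samples, 1 for disjoint supports
ks = stats.ks_2samp(reference, candidate)
print(ks.statistic, ks.pvalue)
```

Repeating such a comparison over many experimental configurations, as the paper does, turns "sampler A outperforms sampler B" into a statistically checkable statement rather than a visual impression.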

  4. Energy self-sufficient sewage wastewater treatment plants: is optimized anaerobic sludge digestion the key?

    Science.gov (United States)

    Jenicek, P; Kutil, J; Benes, O; Todt, V; Zabranska, J; Dohanyos, M

    2013-01-01

    The anaerobic digestion of primary and waste activated sludge generates biogas that can be converted into energy to power the operation of a sewage wastewater treatment plant (WWTP). But can the biogas generated by anaerobic sludge digestion ever completely satisfy the electricity requirements of a WWTP with 'standard' energy consumption (i.e. industrial pollution not treated, no external organic substrate added)? With this question in mind, we optimized biogas production at Prague's Central Wastewater Treatment Plant in the following ways: enhanced primary sludge separation; thickened waste activated sludge; implemented a lysate centrifuge; increased operational temperature; improved digester mixing. With these optimizations, biogas production increased significantly to 12.5 m(3) per population equivalent per year. In turn, this led to an equally significant increase in specific energy production from approximately 15 to 23.5 kWh per population equivalent per year. We compared these full-scale results with those obtained from WWTPs that are already energy self-sufficient, but have exceptionally low energy consumption. Both our results and our analysis suggest that, with the correct optimization of anaerobic digestion technology, even WWTPs with 'standard' energy consumption can either attain or come close to attaining energy self-sufficiency.
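As a back-of-the-envelope illustration of how specific biogas production translates into electricity, one can multiply biogas volume by its energy content and a generator efficiency. The assumed constants below (biogas energy density and CHP electrical efficiency) are typical literature values, not figures taken from the study:

```python
# Illustrative conversion of specific biogas production to electricity.
# Assumed values (typical ranges, NOT from the paper):
BIOGAS_ENERGY_KWH_PER_M3 = 6.5    # lower heating value at ~65% methane
CHP_ELECTRICAL_EFFICIENCY = 0.33  # combined heat and power unit

def electricity_kwh_per_pe_year(biogas_m3_per_pe_year):
    """Electric output per population equivalent per year."""
    return (biogas_m3_per_pe_year
            * BIOGAS_ENERGY_KWH_PER_M3
            * CHP_ELECTRICAL_EFFICIENCY)

# With the paper's reported 12.5 m3 of biogas per population equivalent per year:
print(round(electricity_kwh_per_pe_year(12.5), 1))  # 26.8
```

The exact figure depends strongly on the assumed methane content and conversion efficiency, which is why the paper's reported specific energy production differs from this rough estimate.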

  5. Cancer Statistics Animator

    Science.gov (United States)

    This tool allows users to animate cancer trends over time by cancer site and cause of death, race, and sex. Provides access to incidence, mortality, and survival. Select the type of statistic, variables, format, and then extract the statistics in a delimited format for further analyses.

  6. Sufficient condition for confinement of static quarks by a vortex condensation mechanism

    International Nuclear Information System (INIS)

    Mack, G.; Petkova, V.B.

    1978-11-01

    We derive a sufficient condition for confinement of static quarks by a vortex condensation mechanism. It admits vortices that are thick at all times, at the cost of constraining them to a finite volume Λi whose complement is not simply connected. The confining potential V(L) is estimated in terms of the change of free energy of a system enclosed in Λi which is induced by a change in vorticity (= a singular gauge transformation applied to boundary conditions on ∂Λi). For Abelian gauge theories in 3 dimensions the confining Coulomb potential is reproduced as a lower bound. (orig.)

  7. Statistics in a Nutshell

    CERN Document Server

    Boslaugh, Sarah

    2008-01-01

    Need to learn statistics as part of your job, or want some help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference that's perfect for anyone with no previous background in the subject. This book gives you a solid understanding of statistics without being too simple, yet without the numbing complexity of most college texts. You get a firm grasp of the fundamentals and a hands-on understanding of how to apply them before moving on to the more advanced material that follows. Each chapter presents you with easy-to-follow descriptions illustrat

  8. On necessary and sufficient conditions for some Higgs potentials to be bounded from below

    International Nuclear Information System (INIS)

    Klimenko, K.G.

    1984-01-01

    The necessary and sufficient (NS) conditions have been obtained to make the Higgs potentials be bounded from below. Here these potentials are constructed from: (i) two doublets, as well as two doublets and a singlet of SU(2)-group; (ii) adjoint and vector representations of SO(n). For the potential constructed from the adjoint and fundamental SU(n) multiplets, the problem of NS conditions is solved partially

  9. Evaluating the sufficiency of protected lands for maintaining wildlife population connectivity in the northern Rocky Mountains

    Science.gov (United States)

    Samuel A. Cushman; Erin L. Landguth; Curtis H. Flather

    2012-01-01

    Aim: The goal of this study was to evaluate the sufficiency of the network of protected lands in the U.S. northern Rocky Mountains in providing protection for habitat connectivity for 105 hypothetical organisms. A large proportion of the landscape...

  10. SOCR: Statistics Online Computational Resource

    Directory of Open Access Journals (Sweden)

    Ivo D. Dinov

    2006-10-01

    Full Text Available The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result, a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build students' intuition and enhance their learning.

  11. Statistics

    International Nuclear Information System (INIS)

    2004-01-01

    For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time-series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption; Carbon dioxide emissions from fossil fuel use; Coal consumption; Consumption of natural gas; Peat consumption; Domestic oil deliveries; Import prices of oil; Consumer prices of principal oil products; Fuel prices in heat production; Fuel prices in electricity production; Price of electricity by type of consumer; Average monthly spot prices at the Nord Pool power exchange; Total energy consumption by source and CO2 emissions; Supplies and total consumption of electricity (GWh); Energy imports by country of origin in January-March 2004; Energy exports by recipient country in January-March 2004; Consumer prices of liquid fuels; Consumer prices of hard coal, natural gas and indigenous fuels; Price of natural gas by type of consumer; Price of electricity by type of consumer; Price of district heating by type of consumer; Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and Excise taxes and precautionary stock fees and oil pollution fees

  12. Statistics

    International Nuclear Information System (INIS)

    2003-01-01

    For the year 2002, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time-series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption; Carbon dioxide emissions from fossil fuel use; Coal consumption; Consumption of natural gas; Peat consumption; Domestic oil deliveries; Import prices of oil; Consumer prices of principal oil products; Fuel prices in heat production; Fuel prices in electricity production; Price of electricity by type of consumer; Average monthly spot prices at the Nord Pool power exchange; Total energy consumption by source and CO2 emissions; Supply and total consumption of electricity (GWh); Energy imports by country of origin in January-June 2003; Energy exports by recipient country in January-June 2003; Consumer prices of liquid fuels; Consumer prices of hard coal, natural gas and indigenous fuels; Price of natural gas by type of consumer; Price of electricity by type of consumer; Price of district heating by type of consumer; Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and Excise taxes and precautionary stock fees and oil pollution fees on energy products

  13. Advances in statistics

    Science.gov (United States)

    Howard Stauffer; Nadav Nur

    2005-01-01

    The papers included in the Advances in Statistics section of the Partners in Flight (PIF) 2002 Proceedings represent a small sample of statistical topics of current importance to Partners In Flight research scientists: hierarchical modeling, estimation of detection probabilities, and Bayesian applications. Sauer et al. (this volume) examines a hierarchical model...

  14. THE GROWTH POINTS OF STATISTICAL METHODS

    OpenAIRE

    Orlov A. I.

    2014-01-01

    On the basis of a new paradigm of applied mathematical statistics, data analysis and economic-mathematical methods, we discuss five topical areas in which modern applied statistics is developing, i.e. five "growth points": nonparametric statistics, robustness, computer-statistical methods, statistics of interval data, and statistics of non-numeric data

  15. The Euclid Statistical Matrix Tool

    Directory of Open Access Journals (Sweden)

    Curtis Tilves

    2017-06-01

    Full Text Available Stataphobia, a term used to describe the fear of statistics and research methods, can result from a lack of proper training in statistical methods. Poor statistical methods training can have an effect on health policy decision making and may play a role in the low research productivity seen in developing countries. One way to reduce Stataphobia is to intervene in the teaching of statistics in the classroom; however, such an intervention must tackle several obstacles, including student interest in the material, multiple ways of learning materials, and language barriers. We present here the Euclid Statistical Matrix, a tool for combatting Stataphobia on a global scale. This free tool comprises popular statistical YouTube channels and web sources that teach and demonstrate statistical concepts in a variety of presentation methods. Working with international teams in Iran, Japan, Egypt, Russia, and the United States, we have also developed the Statistical Matrix in multiple languages to address language barriers to learning statistics. By utilizing already-established large networks, we are able to disseminate our tool to thousands of Farsi-speaking university faculty and students in Iran and the United States. Future dissemination of the Euclid Statistical Matrix throughout Central Asia and support from local universities may help to combat low research productivity in this region.

  16. Experimental Mathematics and Computational Statistics

    Energy Technology Data Exchange (ETDEWEB)

    Bailey, David H.; Borwein, Jonathan M.

    2009-04-30

    The field of statistics has long been noted for techniques to detect patterns and regularities in numerical data. In this article we explore connections between statistics and the emerging field of 'experimental mathematics'. These include both applications of experimental mathematics in statistics and statistical methods applied to computational mathematics.

  17. Basics of modern mathematical statistics

    CERN Document Server

    Spokoiny, Vladimir

    2015-01-01

    This textbook provides a unified and self-contained presentation of the main approaches to and ideas of mathematical statistics. It collects the basic mathematical ideas and tools needed as a basis for more serious studies or even independent research in statistics. The majority of existing textbooks in mathematical statistics follow the classical asymptotic framework. Yet, as modern statistics has changed rapidly in recent years, new methods and approaches have appeared. The emphasis is on finite sample behavior, large parameter dimensions, and model misspecifications. The present book provides a fully self-contained introduction to the world of modern mathematical statistics, collecting the basic knowledge, concepts and findings needed for doing further research in the modern theoretical and applied statistics. This textbook is primarily intended for graduate and postdoc students and young researchers who are interested in modern statistical methods.

  18. [''R"--project for statistical computing

    DEFF Research Database (Denmark)

    Dessau, R.B.; Pipper, Christian Bressen

    2008-01-01

    An introduction to the R project for statistical computing (www.R-project.org) is presented. The main topics are: 1. To make the professional community aware of "R" as a potent and free software for graphical and statistical analysis of medical data; 2. Simple well-known statistical tests are fairly easy to perform in R, but more complex modelling requires programming skills; 3. R is seen as a tool for teaching statistics and implementing complex modelling of medical data among medical professionals. Publication date: 2008/1/28

  19. Statistical tables 2003

    International Nuclear Information System (INIS)

    2003-01-01

    The energy statistical tables are a selection of statistical data on energy sources and countries from 1997 to 2002. They cover petroleum, natural gas, coal, electric power, production, external trade, consumption by sector, the 2002 energy balance, and graphs of long-term forecasts. (A.L.B.)

  20. Statistical reporting errors and collaboration on statistical analyses in psychological science

    NARCIS (Netherlands)

    Veldkamp, C.L.S.; Nuijten, M.B.; Dominguez Alvarez, L.; van Assen, M.A.L.M.; Wicherts, J.M.

    2014-01-01

    Statistical analysis is error prone. A best practice for researchers using statistics would therefore be to share data among co-authors, allowing double-checking of executed tasks just as co-pilots do in aviation. To document the extent to which this ‘co-piloting’ currently occurs in psychology, we

  1. National Statistical Commission and Indian Official Statistics

    Indian Academy of Sciences (India)

    Author Affiliations. T J Rao1. C. R. Rao Advanced Institute of Mathematics, Statistics and Computer Science (AIMSCS) University of Hyderabad Campus Central University Post Office, Prof. C. R. Rao Road Hyderabad 500 046, AP, India.

  2. Introduction to Bayesian statistics

    CERN Document Server

    Bolstad, William M

    2017-01-01

    There is a strong upsurge in the use of Bayesian methods in applied statistical analysis, yet most introductory statistics texts only present frequentist methods. Bayesian statistics has many important advantages that students should learn about if they are going into fields where statistics will be used. In this Third Edition, four newly added chapters address topics that reflect the rapid advances in the field of Bayesian statistics. The author continues to provide a Bayesian treatment of introductory statistical topics, such as scientific data gathering, discrete random variables, robust Bayesian methods, and Bayesian approaches to inference for discrete random variables, binomial proportion, Poisson, normal mean, and simple linear regression. In addition, newly developing topics in the field are presented in four new chapters: Bayesian inference with unknown mean and variance; Bayesian inference for Multivariate Normal mean vector; Bayesian inference for Multiple Linear Regression Model; and Computati...
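The Bayesian treatment of a binomial proportion mentioned above can be sketched with a conjugate Beta prior: if the prior is Beta(a, b) and we observe s successes in n trials, the posterior is Beta(a + s, b + n - s). This is a minimal illustration of the standard conjugate update, not an example taken from the book:

```python
# Conjugate Beta-Binomial update: prior Beta(a, b), data = s successes in n trials.
def beta_binomial_posterior(a, b, successes, n):
    """Return posterior shape parameters (a', b') and the posterior mean."""
    a_post = a + successes
    b_post = b + (n - successes)
    mean = a_post / (a_post + b_post)
    return a_post, b_post, mean

# Uniform prior Beta(1, 1), then observe 7 successes in 10 trials:
a_post, b_post, mean = beta_binomial_posterior(1, 1, 7, 10)
print(a_post, b_post)  # 8 4
print(round(mean, 4))  # 0.6667
```

The posterior mean (8/12 ≈ 0.667) sits between the prior mean (0.5) and the sample proportion (0.7), a shrinkage behavior characteristic of Bayesian estimators.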

  3. Cancer Data and Statistics Tools

    Science.gov (United States)

    Cancer Statistics Tools: United States Cancer Statistics Data Visualizations ...

  4. Lancaster Postgraduate Statistics Centre – creating enterprise and innovation in teaching statistics across disciplines.

    OpenAIRE

    Lancaster, Gillian; Francis, Brian; Allen, Ruth

    2009-01-01

    The Lancaster Postgraduate Statistics Centre (PSC) encompasses all aspects of postgraduate teaching and learning within the Mathematics and Statistics department. It is the only UK HEFCE-funded Centre for Excellence in Teaching and Learning that uniquely specialises in postgraduate statistics, and rewards the research and teaching excellence of the Statistics Group. The award-winning purpose-built PSC building opened in February 2008, and features many modern state-of-the-art facilities. Our ...

  5. Order-specific fertility estimates based on perinatal statistics and statistics on out-of-hospital births

    OpenAIRE

    Kreyenfeld, Michaela; Peters, Frederik; Scholz, Rembrandt; Wlosnewski, Ines

    2014-01-01

    Until 2008, German vital statistics did not provide information on biological birth order. We have tried to close part of this gap by providing order-specific fertility rates generated from Perinatal Statistics and statistics on out-of-hospital births for the period 2001-2008. This investigation was published in Comparative Population Studies (CPoS) (see Kreyenfeld, Scholz, Peters and Wlosnewski 2010). The CPoS paper describes how data from the Perinatal Statistics and statistics on out...

  6. Exclusion statistics and integrable models

    International Nuclear Information System (INIS)

    Mashkevich, S.

    1998-01-01

    The definition of exclusion statistics that was given by Haldane admits a 'statistical interaction' between distinguishable particles (multispecies statistics). For such statistics, thermodynamic quantities can be evaluated exactly; explicit expressions are presented here for cluster coefficients. Furthermore, single-species exclusion statistics is realized in one-dimensional integrable models of the Calogero-Sutherland type. The interesting questions of generalizing this correspondence to the higher-dimensional and the multispecies cases remain essentially open; however, our results provide some hints as to searches for the models in question

  7. Mathematical statistics and stochastic processes

    CERN Document Server

    Bosq, Denis

    2013-01-01

    Generally, books on mathematical statistics are restricted to the case of independent identically distributed random variables. In this book, however, both this case AND the case of dependent variables, i.e. statistics for discrete and continuous time processes, are studied. This second case is very important for today's practitioners. Mathematical Statistics and Stochastic Processes is based on decision theory and asymptotic statistics and contains up-to-date information on the relevant topics of theory of probability, estimation, confidence intervals, non-parametric statistics and rob

  8. Two independent pivotal statistics that test location and misspecification and add-up to the Anderson-Rubin statistic

    NARCIS (Netherlands)

    Kleibergen, F.R.

    2002-01-01

    We extend the novel pivotal statistics for testing the parameters in the instrumental variables regression model. We show that these statistics result from a decomposition of the Anderson-Rubin statistic into two independent pivotal statistics. The first statistic is a score statistic that tests

  9. Domain IV voltage-sensor movement is both sufficient and rate limiting for fast inactivation in sodium channels.

    Science.gov (United States)

    Capes, Deborah L; Goldschen-Ohm, Marcel P; Arcisio-Miranda, Manoel; Bezanilla, Francisco; Chanda, Baron

    2013-08-01

    Voltage-gated sodium channels are critical for the generation and propagation of electrical signals in most excitable cells. Activation of Na(+) channels initiates an action potential, and fast inactivation facilitates repolarization of the membrane by the outward K(+) current. Fast inactivation is also the main determinant of the refractory period between successive electrical impulses. Although the voltage sensor of domain IV (DIV) has been implicated in fast inactivation, it remains unclear whether the activation of DIV alone is sufficient for fast inactivation to occur. Here, we functionally neutralize each specific voltage sensor by mutating several critical arginines in the S4 segment to glutamines. We assess the individual role of each voltage-sensing domain in the voltage dependence and kinetics of fast inactivation upon its specific inhibition. We show that movement of the DIV voltage sensor is the rate-limiting step for both development and recovery from fast inactivation. Our data suggest that activation of the DIV voltage sensor alone is sufficient for fast inactivation to occur, and that activation of DIV before channel opening is the molecular mechanism for closed-state inactivation. We propose a kinetic model of sodium channel gating that can account for our major findings over a wide voltage range by postulating that DIV movement is both necessary and sufficient for fast inactivation.

  10. Statistics in the pharmacy literature.

    Science.gov (United States)

    Lee, Charlene M; Soin, Herpreet K; Einarson, Thomas R

    2004-09-01

    Research in statistical methods is essential for maintenance of high quality of the published literature. To update previous reports of the types and frequencies of statistical terms and procedures in research studies of selected professional pharmacy journals. We obtained all research articles published in 2001 in 6 journals: American Journal of Health-System Pharmacy, The Annals of Pharmacotherapy, Canadian Journal of Hospital Pharmacy, Formulary, Hospital Pharmacy, and Journal of the American Pharmaceutical Association. Two independent reviewers identified and recorded descriptive and inferential statistical terms/procedures found in the methods, results, and discussion sections of each article. Results were determined by tallying the total number of times, as well as the percentage, that each statistical term or procedure appeared in the articles. One hundred forty-four articles were included. Ninety-eight percent employed descriptive statistics; of these, 28% used only descriptive statistics. The most common descriptive statistical terms were percentage (90%), mean (74%), standard deviation (58%), and range (46%). Sixty-nine percent of the articles used inferential statistics, the most frequent being chi(2) (33%), Student's t-test (26%), Pearson's correlation coefficient r (18%), ANOVA (14%), and logistic regression (11%). Statistical terms and procedures were found in nearly all of the research articles published in pharmacy journals. Thus, pharmacy education should aim to provide current and future pharmacists with an understanding of the common statistical terms and procedures identified to facilitate the appropriate appraisal and consequential utilization of the information available in research articles.
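The tallying procedure described, counting the number of articles in which each statistical term appears and expressing that count as a percentage of all articles reviewed, can be sketched as follows (the per-article term sets here are hypothetical, not the study's data):

```python
from collections import Counter

# Hypothetical per-article sets of statistical terms/procedures identified.
articles = [
    {"percentage", "mean", "standard deviation", "chi-square"},
    {"percentage", "mean", "t-test"},
    {"percentage", "range", "logistic regression"},
]

# For each term, count the number of articles using it at least once,
# then report that count as a percentage of all articles.
term_counts = Counter(term for article in articles for term in article)
for term, count in term_counts.most_common():
    print(f"{term}: {100 * count / len(articles):.0f}% of articles")
```

Using sets per article (rather than raw occurrence counts) matches the study's unit of analysis: the fraction of articles employing a term, not how often the term appears.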

  11. Freeze-thaw stress of Alhydrogel ® alone is sufficient to reduce the immunogenicity of a recombinant hepatitis B vaccine containing native antigen.

    Science.gov (United States)

    Clapp, Tanya; Munks, Michael W; Trivedi, Ruchit; Kompella, Uday B; Braun, LaToya Jones

    2014-06-24

    Preventing losses in vaccine potency due to accidental freezing has recently become a topic of interest for improving vaccines. All vaccines with aluminum-containing adjuvants are susceptible to such potency losses. Recent studies have described excipients that protect the antigen from freeze-induced inactivation, prevent adjuvant agglomeration and retain potency. Although these strategies have demonstrated success, they do not provide a mechanistic understanding of freeze-thaw (FT) induced potency losses. In the current study, we investigated how adjuvant frozen in the absence of antigen affects vaccine immunogenicity and whether preventing damage to the freeze-sensitive recombinant hepatitis B surface antigen (rHBsAg) was sufficient for maintaining vaccine potency. The final vaccine formulation or Alhydrogel(®) alone was subjected to three FT cycles. The vaccines were characterized for antigen adsorption, rHBsAg tertiary structure, particle size and charge, adjuvant elemental content and in-vivo potency. Particle agglomeration of either vaccine particles or adjuvant was observed following FT-stress. In vivo studies demonstrated no statistical differences in IgG responses between vaccines with FT-stressed adjuvant and no adjuvant. Adsorption of rHBsAg was achieved regardless of adjuvant treatment, suggesting that the similar responses were not due to soluble antigen in the frozen adjuvant-containing formulations. All vaccines with adjuvant, including the non-frozen controls, yielded similar, blue-shifted fluorescence emission spectra. Immune response differences could not be traced to differences in the tertiary structure of the antigen in the formulations. Zeta potential measurements and elemental content analyses suggest that FT-stress resulted in a significant chemical alteration of the adjuvant surface. These data provide evidence that protecting a freeze-labile antigen from subzero exposure is insufficient to maintain vaccine potency. Future studies should

  12. Students' Perceptions of Statistics: An Exploration of Attitudes, Conceptualizations, and Content Knowledge of Statistics

    Science.gov (United States)

    Bond, Marjorie E.; Perkins, Susan N.; Ramirez, Caroline

    2012-01-01

    Although statistics education research has focused on students' learning and conceptual understanding of statistics, researchers have only recently begun investigating students' perceptions of statistics. The term perception describes the overlap between cognitive and non-cognitive factors. In this mixed-methods study, undergraduate students…

  13. Statistical concepts a second course

    CERN Document Server

    Lomax, Richard G

    2012-01-01

    Statistical Concepts consists of the last 9 chapters of An Introduction to Statistical Concepts, 3rd ed. Designed for the second course in statistics, it is one of the few texts that focuses just on intermediate statistics. The book highlights how statistics work and what they mean to better prepare students to analyze their own data and interpret SPSS and research results. As such it offers more coverage of non-parametric procedures used when standard assumptions are violated since these methods are more frequently encountered when working with real data. Determining appropriate sample sizes

  14. Bayesian statistical inference

    Directory of Open Access Journals (Sweden)

    Bruno De Finetti

    2017-04-01

    Full Text Available This work was translated into English and published in the volume: Bruno De Finetti, Induction and Probability, Biblioteca di Statistica, eds. P. Monari, D. Cocchi, Clueb, Bologna, 1993. Bayesian Statistical Inference is one of the last fundamental philosophical papers in which we can find De Finetti's essential approach to statistical inference.

  15. Resource use and food self-sufficiency at farm scale within two agro-ecological zones of Rwanda

    NARCIS (Netherlands)

    Bucagu, C.; Vanlauwe, B.; Wijk, van M.T.; Giller, K.E.

    2014-01-01

    Resource use and management are major determinants of the food self-sufficiency of smallholder farmers in sub-Saharan Africa. A study was conducted in Rwanda in two contrasting agro-ecological zones (Central plateau and Buberuka) to characterise farms, quantify their resource flows, and evaluate the

  16. Modelling cereal crops to assess future climate risk for family food self-sufficiency in southern Mali

    NARCIS (Netherlands)

    Traore, Bouba; Descheemaeker, Katrien; Wijk, van Mark T.; Corbeels, Marc; Supit, Iwan; Giller, Ken E.

    2017-01-01

    Future climate change will have far reaching consequences for smallholder farmers in sub-Saharan Africa, the majority of whom depend on agriculture for their livelihoods. Here we assessed the farm-level impact of climate change on family food self-sufficiency and evaluated potential adaptation

  17. Basic elements of computational statistics

    CERN Document Server

    Härdle, Wolfgang Karl; Okhrin, Yarema

    2017-01-01

    This textbook on computational statistics presents tools and concepts of univariate and multivariate statistical data analysis with a strong focus on applications and implementations in the statistical software R. It covers mathematical, statistical as well as programming problems in computational statistics and contains a wide variety of practical examples. In addition to the numerous R snippets presented in the text, all computer programs (quantlets) and data sets for the book are available on GitHub and referred to in the book. This enables the reader to fully reproduce as well as modify and adjust all examples to their needs. The book is intended for advanced undergraduate and first-year graduate students as well as for data analysts new to the job who would like a tour of the various statistical tools in a data analysis workshop. The experienced reader with a good knowledge of statistics and programming might skip some sections on univariate models and enjoy the various mathematical roots of multivariate ...

  18. Statistical Physics

    CERN Document Server

    Mandl, Franz

    1988-01-01

    The Manchester Physics Series General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips Department of Physics and Astronomy, University of Manchester Properties of Matter B. H. Flowers and E. Mendoza Optics Second Edition F. G. Smith and J. H. Thomson Statistical Physics Second Edition F. Mandl Electromagnetism Second Edition I. S. Grant and W. R. Phillips Statistics R. J. Barlow Solid State Physics Second Edition J. R. Hook and H. E. Hall Quantum Mechanics F. Mandl Particle Physics Second Edition B. R. Martin and G. Shaw The Physics of Stars Second Edition A. C. Phillips Computing for Scient

  19. On necessity and sufficiency in counseling and psychotherapy (revisited).

    Science.gov (United States)

    Lazarus, Arnold A

    2007-09-01

    It seems to me that Carl Rogers (see record 2007-14639-002) was far too ambitious in trying to specify general conditions of necessity and sufficiency that would be relevant to the entire spectrum of problems and the diverse expectancies and personalities of the people who seek our help. Rogers' position and orientation almost totally overlook the array of problems under the rubric of "response deficits" that stem from misinformation and missing information and call for active correction, training, and retraining. Rogers also paid scant attention to problems with significant biological determinants. Nevertheless, as exemplified by his seminal 1957 article and many other articles and books, Rogers made major contributions within the domain of the therapeutic alliance. Today, the scientific emphasis looks at accountability, the need to establish various treatments of choice, and the need to understand their presumed mechanisms. Treatment efficacy and generalizability across different methodologies are now considered key issues. The efficacy narrowing and clinically self-limiting consequences of adhering to one particular school of thought are now self-evident to most. (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  20. Experimental statistics for biological sciences.

    Science.gov (United States)

    Bang, Heejung; Davidian, Marie

    2010-01-01

    In this chapter, we cover fundamental principles and methods in statistics - from "What are Data and Statistics?" to "ANOVA and linear regression" - which are the basis of any statistical thinking and undertaking. Readers can easily find the selected topics in most introductory statistics textbooks, but we have tried to assemble and structure them in a succinct and reader-friendly manner in a stand-alone chapter. This text has long been used in real classroom settings for both undergraduate and graduate students, whether or not they major in the statistical sciences. We hope that from this chapter readers will understand the key statistical concepts and terminology, how to design a study (experimental or observational), how to analyze the data (e.g., describe the data and/or estimate the parameter(s) and make inferences), and how to interpret the results. The text is most useful as supplemental material while readers take their own statistics courses, or as a self-teaching reference to accompany the manual of any statistical software package.

  1. Hindbrain ghrelin receptor signaling is sufficient to maintain fasting glucose.

    Directory of Open Access Journals (Sweden)

    Michael M Scott

    Full Text Available The neuronal coordination of metabolic homeostasis requires the integration of hormonal signals with multiple interrelated central neuronal circuits to produce appropriate levels of food intake, energy expenditure and fuel availability. Ghrelin, a peripherally produced peptide hormone, circulates at high concentrations during nutrient scarcity. Ghrelin promotes food intake, an action lost in ghrelin receptor null mice, and also helps maintain fasting blood glucose levels, ensuring an adequate supply of nutrients to the central nervous system. To better understand the mechanisms of ghrelin action, we have examined the role of ghrelin receptor (GHSR) expression in the mouse hindbrain. Notably, selective hindbrain ghrelin receptor expression was not sufficient to restore ghrelin-stimulated food intake. In contrast, the lowered fasting blood glucose levels observed in ghrelin receptor-deficient mice were returned to wild-type levels by selective re-expression of the ghrelin receptor in the hindbrain. Our results demonstrate the distributed nature of the neurons mediating ghrelin action.

  2. Analyzing the Impacts of Increased Wind Power on Generation Revenue Sufficiency: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Qin; Wu, Hongyu; Tan, Jin; Hodge, Bri-Mathias; Li, Wanning; Luo, Cheng

    2016-08-01

    The Revenue Sufficiency Guarantee (RSG), as part of make-whole (or uplift) payments in electricity markets, is designed to recover the generation resources' offer-based production costs that are not otherwise covered by their market revenues. Increased penetrations of wind power will significantly affect RSG payments in these markets; however, literature on this topic is sparse. This paper first reviews the industry practices for implementing RSG in major U.S. independent system operators (ISOs) and regional transmission organizations (RTOs) and then develops a general RSG calculation method. Finally, an 18-bus test system is adopted to demonstrate the impacts of increased wind power on RSG payments.
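    The make-whole logic described in the abstract can be sketched as a simple shortfall calculation. This is an illustrative sketch only; the cost components and single-settlement structure below are assumptions, not any ISO's actual tariff formula.

```python
def rsg_payment(offer_costs, market_revenues):
    """Revenue Sufficiency Guarantee (make-whole) payment for one unit over
    one commitment period: the shortfall, if any, between the unit's
    offer-based production costs and its market revenues."""
    shortfall = sum(offer_costs) - sum(market_revenues)
    return max(0.0, shortfall)

# Hypothetical unit: startup cost plus two hours of energy offers,
# partially covered by two hours of energy-market revenue.
costs = [500.0, 300.0, 300.0]           # $ offer-based costs
revenues = [250.0, 350.0]               # $ market revenues
payment = rsg_payment(costs, revenues)  # 1100 - 600 = 500.0
```

    A unit whose revenues meet or exceed its offer-based costs receives no RSG payment, which is why higher wind penetrations, by depressing energy prices and revenues, tend to increase total uplift.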

  3. Statistical Yearbook of Norway 2012

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-07-01

    The Statistical Yearbook of Norway 2012 contains statistics on Norway and main figures for the Nordic countries and other countries selected from international statistics. The international overviews are integrated with the other tables and figures. The selection of tables in this edition is mostly the same as in the 2011 edition. The yearbook's 480 tables and figures present the main trends in official statistics in most areas of society. The list of tables and figures and an index at the back of the book provide easy access to relevant information. In addition, source information and Internet addresses below the tables make the yearbook a good starting point for those who are looking for more detailed statistics. The statistics are based on data gathered in statistical surveys and from administrative data, which, in cooperation with other public institutions, have been made available for statistical purposes. Some tables have been prepared in their entirety by other public institutions. The statistics follow approved principles, standards and classifications that are in line with international recommendations and guidelines. Content: 00. General subjects; 01. Environment; 02. Population; 03. Health and social conditions; 04. Education; 05. Personal economy and housing conditions; 06. Labour market; 07. Recreational, cultural and sporting activities; 08. Prices and indices; 09. National Economy and external trade; 10. Industrial activities; 11. Financial markets; 12. Public finances; Geographical survey.

  5. Practical statistics in pain research.

    Science.gov (United States)

    Kim, Tae Kyun

    2017-10-01

    Pain is subjective, while the statistics used in pain research are objective. This review was written to help researchers involved in pain research make sound statistical decisions. The main issues concern the levels of measurement scales often used in pain research, the choice between parametric and nonparametric statistical methods, and the problems that arise from repeated measurements. In the field of pain research, parametric statistics have often been applied erroneously, which is closely related to the scales of the data and to repeated measurements. The levels of measurement are the nominal, ordinal, interval, and ratio scales, and the level of a scale affects the choice between parametric and non-parametric methods. The most frequently used pain assessment scale is ordinal, as exemplified by the visual analogue scale (VAS). Another view, however, considers the VAS to be an interval or ratio scale, so that the use of parametric statistics may be acceptable in practice in some cases. Repeated measurements on the same subjects always complicate the statistics: the measurements are inevitably correlated with each other, which precludes the application of one-way ANOVA, in which independence between measurements is required. Repeated-measures ANOVA (RM-ANOVA), however, permits comparisons between correlated measurements as long as the sphericity assumption is satisfied. In conclusion, parametric statistical methods should be used only when the assumptions of parametric statistics, such as normality and sphericity, are established.
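    To make the parametric-versus-nonparametric choice concrete, the rank-based Mann-Whitney U statistic can be computed for ordinal data such as VAS scores without any normality assumption. The following is a minimal self-contained sketch (ties get average ranks); it returns only the U statistic, not a p-value.

```python
def mann_whitney_u(a, b):
    """U statistic for sample `a` versus sample `b`, with ties assigned
    average ranks.  Suitable for ordinal data (e.g. VAS pain scores)
    where the normality assumption of a t-test is doubtful."""
    values = sorted(list(a) + list(b))
    # average rank for each distinct value (ranks are 1-based)
    avg_rank = {}
    i = 0
    while i < len(values):
        j = i
        while j < len(values) and values[j] == values[i]:
            j += 1
        avg_rank[values[i]] = (i + 1 + j) / 2   # mean of ranks i+1 .. j
        i = j
    rank_sum_a = sum(avg_rank[v] for v in a)
    n_a = len(a)
    return rank_sum_a - n_a * (n_a + 1) / 2

# Clearly separated groups: every value in the first sample ranks
# below the second, so U = 0 (the minimum possible value).
u = mann_whitney_u([1, 2, 3], [4, 5, 6])   # 0.0
```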

  6. Statistics for nuclear engineers and scientists. Part 1. Basic statistical inference

    Energy Technology Data Exchange (ETDEWEB)

    Beggs, W.J.

    1981-02-01

    This report is intended for the use of engineers and scientists working in the nuclear industry, especially at the Bettis Atomic Power Laboratory. It serves as the basis for several Bettis in-house statistics courses. The objectives of the report are to introduce the reader to the language and concepts of statistics and to provide a basic set of techniques to apply to problems of the collection and analysis of data. Part 1 covers subjects of basic inference. The subjects include: descriptive statistics; probability; simple inference for normally distributed populations, and for non-normal populations as well; comparison of two populations; the analysis of variance; quality control procedures; and linear regression analysis.

  7. Brain Aneurysm Statistics and Facts

    Science.gov (United States)

    Brain Aneurysm Statistics and Facts: An estimated 6 million people in ...

  8. National Center for Health Statistics

    Science.gov (United States)

    CDC National Center for Health Statistics: ... Survey of Family Growth, Vital Records, National Vital Statistics System, National Death Index, Vital Statistics Rapid Release ...

  9. Recreational Boating Statistics 2012

    Data.gov (United States)

    Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...

  10. Recreational Boating Statistics 2013

    Data.gov (United States)

    Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...

  11. Recreational Boating Statistics 2011

    Data.gov (United States)

    Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...

  12. Statistics for Finance

    DEFF Research Database (Denmark)

    Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard

    Statistics for Finance develops students’ professional skills in statistics with applications in finance. Developed from the authors’ courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics that rarely connect concepts to data and books on econometrics and time series analysis that do not cover specific problems related to option valuation. The book discusses applications of financial derivatives pertaining to risk assessment and elimination. The authors cover various statistical and mathematical techniques, including linear and nonlinear time series analysis, stochastic calculus models, stochastic differential equations, Itō’s formula, the Black–Scholes model, the generalized method of moments, and the Kalman filter. They explain how these tools are used to price financial derivatives...
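    Of the techniques listed, the Black–Scholes call price is compact enough to sketch directly. This standalone example uses the standard formula with a normal CDF built from `math.erf`; the numerical inputs are illustrative and unrelated to the book's own examples.

```python
import math

def norm_cdf(x):
    """Standard normal cumulative distribution function via erf."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black_scholes_call(S, K, r, sigma, T):
    """Black-Scholes price of a European call option.
    S: spot price, K: strike, r: risk-free rate (continuously compounded),
    sigma: volatility, T: time to expiry in years."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

# At-the-money call, 20% volatility, 5% rate, one year to expiry.
price = black_scholes_call(100.0, 100.0, 0.05, 0.2, 1.0)   # about 10.45
```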

  13. Lessons learned in the implementation of Integrated Safety Management at DOE Order Compliance Sites vs Necessary and Sufficient Sites

    International Nuclear Information System (INIS)

    Hill, R.L.

    2000-01-01

    This paper summarizes the development and implementation of Integrated Safety Management (ISM) at an Order Compliance Site (Savannah River Site) and a Necessary and Sufficient Site (Nevada Test Site). A discussion of each core safety function of ISM is followed by an example from an Order Compliance Site and a Necessary and Sufficient Site. The Savannah River Site was the first DOE site to have a DOE Headquarters-validated and approved ISM System. The NTS is beginning the process of verification and validation. This paper defines successful strategies for integrating Environment, Safety, and Health management into work under various scenarios

  14. Principles of applied statistics

    National Research Council Canada - National Science Library

    Cox, D. R; Donnelly, Christl A

    2011-01-01

    David Cox and Christl Donnelly distil decades of scientific experience into usable principles for the successful application of statistics, showing how good statistical strategy shapes every stage of an investigation...

  15. A necessary and sufficient condition for the convergence of an AOR iterative method

    International Nuclear Information System (INIS)

    Hu Jiagan

    1992-01-01

    In this paper, a necessary and sufficient condition for the convergence of an AOR iterative method is given under the condition that the coefficient matrix A is consistently ordered and the eigenvalues of the Jacobi matrix of A are all real. By the same method, the condition for the convergence of the extrapolated Gauss-Seidel (EGS) method is also obtained. As an example, the conditions for the model problem are given. The rate of convergence of the EGS method is about twice that of the GS method.
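    To make the AOR iteration concrete, the sketch below implements a common form of the method for A x = b with the splitting A = D - L - U, where omega is the overrelaxation parameter and r the acceleration parameter; omega = r = 1 reduces to Gauss-Seidel, and r = 0 gives extrapolated Jacobi. The 2x2 test system is an illustrative diagonally dominant example, not the model problem analyzed in the paper.

```python
def aor_solve(A, b, omega=1.0, r=1.0, iters=100):
    """Accelerated Overrelaxation (AOR) iteration for A x = b.

    Uses the splitting A = D - L - U (L, U strictly lower/upper) and the
    scheme (D - rL) x_new = [(1-omega)D + (omega-r)L + omega*U] x + omega*b,
    solved row by row by forward substitution."""
    n = len(b)
    x = [0.0] * n
    for _ in range(iters):
        x_new = x[:]
        for i in range(n):
            Lx_new = -sum(A[i][j] * x_new[j] for j in range(i))
            Lx_old = -sum(A[i][j] * x[j] for j in range(i))
            Ux_old = -sum(A[i][j] * x[j] for j in range(i + 1, n))
            x_new[i] = (r * Lx_new
                        + (1.0 - omega) * A[i][i] * x[i]
                        + (omega - r) * Lx_old
                        + omega * Ux_old
                        + omega * b[i]) / A[i][i]
        x = x_new
    return x

# Diagonally dominant system with exact solution (1/11, 7/11).
A = [[4.0, 1.0], [1.0, 3.0]]
x = aor_solve(A, [1.0, 2.0], omega=1.0, r=1.0)
```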

  16. INDONESIAN FOOD POLICY: THE PROGRAMS FOR STRENGTHENING FOOD SELF-SUFFICIENCY IN REFORMATION ERA

    Directory of Open Access Journals (Sweden)

    Kamrussamad

    2018-04-01

    Full Text Available The 2012 decree #18 on food states that the objective of food implementation is to meet basic human needs and to provide fair, equitable, and sustainable benefits based on food sovereignty, food self-sufficiency, and national food security. Food sovereignty, self-sufficiency, and security are fundamental and support the implementation of policies related to food in Indonesia. The 2012 decree #18 also states that food implementation aims to improve the ability to produce food independently, to provide a variety of foods, and to meet the requirements of safety, quality, and nutrition for public consumption.

  17. Mineral industry statistics 1975

    Energy Technology Data Exchange (ETDEWEB)

    1978-01-01

    Production, consumption and marketing statistics are given for solid fuels (coal, peat), liquid fuels and gases (oil, natural gas), iron ore, bauxite and other minerals quarried in France in 1975. Accident statistics are also included. Production statistics are presented for the Overseas Departments and Territories (French Guiana, New Caledonia, New Hebrides). An account of changes in the mining field in 1975 is given, and concessions, exploitation permits, and permits solely for prospecting for mineral products are discussed. (In French)

  18. A statistical method for model extraction and model selection applied to the temperature scaling of the L–H transition

    International Nuclear Information System (INIS)

    Peluso, E; Gelfusa, M; Gaudio, P; Murari, A

    2014-01-01

    Access to the H mode of confinement in tokamaks is characterized by an abrupt transition, which has been the subject of continuous investigation for decades. Various theoretical models have been developed and multi-machine databases of experimental data have been collected. In this paper, a new methodology is reviewed for investigating the scaling laws for the temperature threshold to access the H mode. The approach is based on symbolic regression via genetic programming and first allows the extraction of the most statistically reliable models from the available experimental data. Nonlinear fitting is then applied to the mathematical expressions found by symbolic regression; this second step makes it easy to compare the quality of the data-driven scalings with the most widely accepted theoretical models. The application of a complete set of statistical indicators shows that the data-driven scaling laws are qualitatively better than the theoretical models. The main limitation of the theoretical models is that they are all expressed as power laws, which are too rigid to fit the available experimental data and to extrapolate to ITER. The proposed method is fully general and can be applied to the extraction of scaling laws from any experimental database of sufficient statistical relevance. (paper)
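    The power-law fitting step can be illustrated with the simplest possible case: a single-variable scaling y = C x^a fitted by ordinary least squares in log space. This is a sketch under strong simplifying assumptions; the paper's scalings involve several plasma parameters and genuinely nonlinear fits, for which log-linear regression is only a starting point.

```python
import math

def fit_power_law(x, y):
    """Fit y = C * x**a by linear least squares on (log x, log y).
    Returns (C, a).  Requires strictly positive data."""
    lx = [math.log(v) for v in x]
    ly = [math.log(v) for v in y]
    n = len(lx)
    mx = sum(lx) / n
    my = sum(ly) / n
    a = (sum((u - mx) * (v - my) for u, v in zip(lx, ly))
         / sum((u - mx) ** 2 for u in lx))
    C = math.exp(my - a * mx)
    return C, a

# Noiseless synthetic data generated from y = 2 * x**1.5 is recovered
# (up to floating-point rounding) by the log-space fit.
xs = [1.0, 2.0, 4.0, 8.0]
ys = [2.0 * v ** 1.5 for v in xs]
C, a = fit_power_law(xs, ys)   # C ~ 2.0, a ~ 1.5
```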

  19. Relation between Statistics of Radiowave Reception at South Pole Station and Auroral Oval Characteristics: Data and Monte Carlo Simulations

    Science.gov (United States)

    Labelle, J.; Noonan, K.

    2006-12-01

    Despite their remote location, radio receivers at South Pole Station regularly detect AM broadcast band signals propagating from transmitters thousands of kilometers away. Statistical analysis of received radiowave power at South Pole during 2004 and 2005, integrated over the frequency range of AM broadcast stations, reveals a distinctive time-of-day (UT) dependence: a broad maximum in received power centered at 1500 UT corresponds to magnetic daytime, and signal levels are lower during magnetic nighttime except for a weak peak near magnetic midnight. Expected received power was calculated based on two contributions: daytime D-region absorption and auroral absorption. The latter varies with day of year and magnetic local time in a complex fashion due to the asymmetric shape and varying size of the auroral oval and the offset of South Pole from the geomagnetic pole. The Monte Carlo simulations confirm that the enhanced absorption of AM broadcast signals during magnetic nighttime results from auroral absorption. Furthermore, the simulations predict that a weak (<0.5 dB) peak near magnetic midnight, similar to that observed in the data, arises from including in the statistical database intervals when the auroral oval is contracted. These results suggest that ground-based radio observations at a sufficiently remote high-latitude site such as South Pole may effectively monitor auroral oval characteristics, at least on a statistical basis.

  20. Statistical analyses to support guidelines for marine avian sampling. Final report

    Science.gov (United States)

    Kinlan, Brian P.; Zipkin, Elise; O'Connell, Allan F.; Caldow, Chris

    2012-01-01

    Interest in development of offshore renewable energy facilities has led to a need for high-quality, statistically robust information on marine wildlife distributions. A practical approach is described to estimate the amount of sampling effort required to have sufficient statistical power to identify species-specific “hotspots” and “coldspots” of marine bird abundance and occurrence in an offshore environment divided into discrete spatial units (e.g., lease blocks), where “hotspots” and “coldspots” are defined relative to a reference (e.g., regional) mean abundance and/or occurrence probability for each species of interest. For example, a location with average abundance or occurrence that is three times larger than the mean (3x effect size) could be defined as a “hotspot,” and a location that is three times smaller than the mean (1/3x effect size) as a “coldspot.” The choice of the effect size used to define hotspots and coldspots will generally depend on a combination of ecological and regulatory considerations. A method is also developed for testing the statistical significance of possible hotspots and coldspots. Both methods are illustrated with historical seabird survey data from the USGS Avian Compendium Database. Our approach consists of five main components: 1. A review of the primary scientific literature on statistical modeling of animal group size and avian count data to develop a candidate set of statistical distributions that have been used or may be useful to model seabird counts. 2. Statistical power curves for one-sample, one-tailed Monte Carlo significance tests of differences of observed small-sample means from a specified reference distribution. These curves show the power to detect "hotspots" or "coldspots" of occurrence and abundance at a range of effect sizes, given assumptions which we discuss. 3. A model selection procedure, based on maximum likelihood fits of models in the candidate set, to determine an appropriate statistical