WorldWideScience

Sample records for leeds chi-squared statistic

  1. Your Chi-Square Test Is Statistically Significant: Now What?

    Science.gov (United States)

    Sharpe, Donald

    2015-01-01

    Applied researchers have employed chi-square tests for more than one hundred years. This paper addresses the question of how one should follow a statistically significant chi-square test result in order to determine the source of that result. Four approaches were evaluated: calculating residuals, comparing cells, ransacking, and partitioning. Data…
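    The first of the four follow-up approaches named above, calculating residuals, can be sketched as follows. The contingency table is invented for illustration, and scipy/numpy are assumed available; cells with large adjusted standardized residuals are the likely sources of a significant overall result.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x3 contingency table
obs = np.array([[20, 30, 50],
                [40, 25, 15]])
chi2_stat, p, dof, expected = chi2_contingency(obs)

# Adjusted standardized residuals: cells with |residual| > 2 point to the
# cells driving a significant overall chi-square result
n = obs.sum()
row_p = obs.sum(axis=1, keepdims=True) / n
col_p = obs.sum(axis=0, keepdims=True) / n
resid = (obs - expected) / np.sqrt(expected * (1 - row_p) * (1 - col_p))
```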

  2. Double asymptotics for the chi-square statistic.

    Science.gov (United States)

    Rempała, Grzegorz A; Wesołowski, Jacek

    2016-12-01

    Consider the distributional limit of the Pearson chi-square statistic when the number of classes m_n increases with the sample size n and [Formula: see text]. Under mild moment conditions, the limit is Gaussian for λ = ∞, Poisson for finite λ > 0, and degenerate for λ = 0.
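    A quick numerical check of the classical fixed-m regime that this double asymptotics generalizes (sample size, class count, and seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 10_000, 50                       # sample size and number of classes
counts = rng.multinomial(n, [1.0 / m] * m)
expected = n / m
pearson = ((counts - expected) ** 2 / expected).sum()
# In the classical regime (m fixed, n -> infinity) the statistic is approximately
# chi-square with m - 1 degrees of freedom (mean 49 here); the paper studies
# what happens when the number of classes m_n grows with n as well.
```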

  3. Filter Tuning Using the Chi-Squared Statistic

    Science.gov (United States)

    Lilly-Salkowski, Tyler

    2017-01-01

    The Goddard Space Flight Center (GSFC) Flight Dynamics Facility (FDF) performs orbit determination (OD) for the Aqua and Aura satellites. Both satellites are located in low Earth orbit (LEO) and are part of what is considered the A-Train satellite constellation. Both spacecraft are currently in the science phase of their respective missions. The FDF has recently been tasked with delivering definitive covariance for each satellite. The main source of orbit determination used for these missions is the Orbit Determination Toolkit developed by Analytical Graphics Inc. (AGI). This software uses an Extended Kalman Filter (EKF) to estimate the states of both spacecraft. The filter incorporates force modelling, ground station and space network measurements to determine spacecraft states. It also generates a covariance at each measurement. This covariance can be useful for evaluating the overall performance of the tracking data measurements and the filter itself. An accurate covariance is also useful for covariance propagation, which is utilized in collision avoidance operations. It is also valuable when attempting to determine whether the current orbital solution will meet mission requirements in the future. This paper examines the use of the chi-square statistic as a means of evaluating filter performance. The chi-square statistic is calculated to determine the realism of a covariance based on the prediction accuracy and the covariance values at a given point in time. Once calculated, it is the distribution of this statistic that provides insight into the accuracy of the covariance. For the EKF to correctly calculate the covariance, error models associated with tracking data measurements must be accurately tuned. Overestimating or underestimating these error values can have detrimental effects on the overall filter performance. The filter incorporates ground station measurements, which can be tuned based on the accuracy of the individual ground stations. It also includes
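    The covariance-realism idea can be sketched numerically: if the filter covariance P is realistic, normalized prediction errors follow a chi-square distribution with degrees of freedom equal to the state dimension. The matrix and errors below are synthetic stand-ins, not FDF data.

```python
import numpy as np

rng = np.random.default_rng(1)
P = np.diag([4.0, 1.0, 0.25])           # hypothetical 3x3 filter covariance
err = rng.multivariate_normal(np.zeros(3), P, size=1000)  # prediction errors

# Covariance-realism statistic: eps_k = e_k^T P^{-1} e_k ~ chi2(3) if P is realistic
eps = np.einsum('ij,jk,ik->i', err, np.linalg.inv(P), err)
mean_eps = eps.mean()                   # should sit near 3, the state dimension
```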

  4. Components of the Pearson-Fisher chi-squared statistic

    Directory of Open Access Journals (Sweden)

    G. D. Raynery

    2002-01-01

    interpretation of the corresponding test statistic components has not previously been investigated. This paper provides the necessary details, as well as an overview of the decomposition options available, and revisits two published examples.

  5. Performance of the S-χ² Statistic for Full-Information Bifactor Models

    Science.gov (United States)

    Li, Ying; Rupp, Andre A.

    2011-01-01

    This study investigated the Type I error rate and power of the multivariate extension of the S-χ² statistic using unidimensional and multidimensional item response theory (UIRT and MIRT, respectively) models as well as full-information bifactor (FI-bifactor) models through simulation. Manipulated factors included test length, sample…

  6. FREQFIT: Computer program which performs numerical regression and statistical chi-squared goodness of fit analysis

    International Nuclear Information System (INIS)

    Hofland, G.S.; Barton, C.C.

    1990-01-01

    The computer program FREQFIT is designed to perform regression and statistical chi-squared goodness of fit analysis on one-dimensional or two-dimensional data. The program features an interactive user dialogue, numerous help messages, an option for screen or line printer output, and the flexibility to use practically any commercially available graphics package to create plots of the program's results. FREQFIT is written in Microsoft QuickBASIC, for IBM-PC compatible computers. A listing of the QuickBASIC source code for the FREQFIT program, a user manual, and sample input data, output, and plots are included. 6 refs., 1 fig

  7. Ensuring Positiveness of the Scaled Difference Chi-square Test Statistic.

    Science.gov (United States)

    Satorra, Albert; Bentler, Peter M

    2010-06-01

    A scaled difference test statistic [Formula: see text] that can be computed from standard software of structural equation models (SEM) by hand calculations was proposed in Satorra and Bentler (2001). The statistic [Formula: see text] is asymptotically equivalent to the scaled difference test statistic T̄(d) introduced in Satorra (2000), which requires more involved computations beyond standard output of SEM software. The test statistic [Formula: see text] has been widely used in practice, but in some applications it is negative due to negativity of its associated scaling correction. Using the implicit function theorem, this note develops an improved scaling correction leading to a new scaled difference statistic T̄(d) that avoids negative chi-square values.

  8. Noncentral Chi-Square versus Normal Distributions in Describing the Likelihood Ratio Statistic: The Univariate Case and Its Multivariate Implication

    Science.gov (United States)

    Yuan, Ke-Hai

    2008-01-01

    In the literature of mean and covariance structure analysis, noncentral chi-square distribution is commonly used to describe the behavior of the likelihood ratio (LR) statistic under alternative hypothesis. Due to the inaccessibility of the rather technical literature for the distribution of the LR statistic, it is widely believed that the…

  9. A DoS/DDoS Attack Detection System Using Chi-Square Statistic Approach

    Directory of Open Access Journals (Sweden)

    Fang-Yie Leu

    2010-04-01

    Full Text Available Nowadays, users can easily access and download network attack tools from the Internet, which often provide friendly interfaces and easily operated features. Therefore, even a naive hacker can launch a large-scale DoS or DDoS attack to prevent a system, i.e., the victim, from providing Internet services. In this paper, we propose an agent-based intrusion detection architecture, which is a distributed detection system, to detect DoS/DDoS attacks by invoking a statistical approach that compares source IP addresses' normal and current packet statistics to discriminate whether there is a DoS/DDoS attack. It first collects all source IPs' packet statistics so as to create their normal packet distribution. Once some IPs' current packet distribution suddenly changes, very often it is an attack. Experimental results show that this approach can effectively detect DoS/DDoS attacks.
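    The comparison of normal versus current packet statistics can be sketched with a plain chi-square goodness-of-fit test; the baseline shares and window counts below are invented, and scipy's `chisquare` is used as a generic stand-in for the paper's statistic.

```python
import numpy as np
from scipy.stats import chisquare

baseline = np.array([0.6, 0.3, 0.1])    # hypothetical normal shares of TCP/UDP/ICMP
current = np.array([900, 200, 900])     # observed packet counts in a time window

expected = baseline * current.sum()     # counts implied by the normal distribution
stat, p = chisquare(current, f_exp=expected)
attack_suspected = p < 0.01             # large deviation from the normal profile
```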

  10. Application of the modified chi-square ratio statistic in a stepwise procedure for cascade impactor equivalence testing.

    Science.gov (United States)

    Weber, Benjamin; Lee, Sau L; Delvadia, Renishkumar; Lionberger, Robert; Li, Bing V; Tsong, Yi; Hochhaus, Guenther

    2015-03-01

    Equivalence testing of aerodynamic particle size distribution (APSD) through multi-stage cascade impactors (CIs) is important for establishing bioequivalence of orally inhaled drug products. Recent work demonstrated that the median of the modified chi-square ratio statistic (MmCSRS) is a promising metric for APSD equivalence testing of test (T) and reference (R) products as it can be applied to a reduced number of CI sites that are more relevant for lung deposition. This metric is also less sensitive to the increased variability often observed for low-deposition sites. A method to establish critical values for the MmCSRS is described here. This method considers the variability of the R product by employing a reference variance scaling approach that allows definition of critical values as a function of the observed variability of the R product. A stepwise CI equivalence test is proposed that integrates the MmCSRS as a method for comparing the relative shapes of CI profiles and incorporates statistical tests for assessing equivalence of single actuation content and impactor sized mass. This stepwise CI equivalence test was applied to 55 published CI profile scenarios, which were classified as equivalent or inequivalent by members of the Product Quality Research Institute working group (PQRI WG). The results of the stepwise CI equivalence test using a 25% difference in MmCSRS as an acceptance criterion provided the best matching with those of the PQRI WG as decisions of both methods agreed in 75% of the 55 CI profile scenarios.

  11. Personalised news filtering and recommendation system using Chi-square statistics-based K-nearest neighbour (χ2SB-KNN) model

    Science.gov (United States)

    Adeniyi, D. A.; Wei, Z.; Yang, Y.

    2017-10-01

    The recommendation problem has been extensively studied by researchers in the fields of data mining, databases and information retrieval. This study presents the design and realisation of an automated, personalised news recommendation system based on a Chi-square statistics-based K-nearest neighbour (χ2SB-KNN) model. The proposed χ2SB-KNN model has the potential to overcome computational complexity and information-overloading problems, and it reduces runtime and speeds up the execution process through the use of the critical value of the χ2 distribution. The proposed recommendation engine can alleviate scalability challenges through combined online pattern discovery and pattern matching for real-time recommendations. This work also showcases the development of a novel method of feature selection referred to as the Data Discretisation-Based feature selection method, which is used for selecting the best features for the proposed χ2SB-KNN algorithm at the preprocessing stage of the classification procedures. The implementation of the proposed χ2SB-KNN model is achieved through an in-house Java program developed on an experimental website called the OUC newsreaders' website. Finally, we compared the performance of our system with two baseline methods, namely the traditional Euclidean distance K-nearest neighbour and Naive Bayesian techniques. The results show a significant improvement of our method over the baseline methods studied.

  12. Chi-square test and its application in hypothesis testing

    Directory of Open Access Journals (Sweden)

    Rakesh Rana

    2015-01-01

    Full Text Available In medical research, studies often collect data on categorical variables that can be summarized as a series of counts. These counts are commonly arranged in a tabular format known as a contingency table. The chi-square test statistic can be used to evaluate whether there is an association between the rows and columns in a contingency table. More specifically, this statistic can be used to determine whether there is any difference between the study groups in the proportions of the risk factor of interest. The chi-square test and the logic of hypothesis testing were developed by Karl Pearson. This article describes in detail what a chi-square test is, the type of data it is used on, the assumptions associated with its application, how to calculate it manually and how to make use of an online calculator for calculating the chi-square statistic and its associated P-value.
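    The manual calculation the article describes can be sketched for a hypothetical 2x2 table (the counts are invented for illustration): expected counts come from the row and column totals, and the statistic sums the squared deviations scaled by the expected counts.

```python
import numpy as np
from scipy.stats import chi2

obs = np.array([[30, 70],      # hypothetical counts: exposed, disease yes/no
                [10, 90]])     # unexposed
row = obs.sum(axis=1, keepdims=True)
col = obs.sum(axis=0, keepdims=True)
n = obs.sum()

exp = row @ col / n                        # expected counts under independence
stat = ((obs - exp) ** 2 / exp).sum()      # Pearson chi-square statistic
dof = (obs.shape[0] - 1) * (obs.shape[1] - 1)
p_value = chi2.sf(stat, dof)               # upper-tail P-value
```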

  13. Chi-squared goodness of fit tests with applications

    CERN Document Server

    Balakrishnan, N; Nikulin, MS

    2013-01-01

    Chi-Squared Goodness of Fit Tests with Applications provides a thorough and complete context for the theoretical basis and implementation of Pearson's monumental contribution and its wide applicability for chi-squared goodness of fit tests. The book is ideal for researchers and scientists conducting statistical analysis in the processing of experimental data, as well as for students and practitioners with a good mathematical background who use statistical methods. The historical context, especially Chapter 7, provides great insight into the importance of this subject, with an authoritative author team.

  14. How-To-Do-It: Snails, Pill Bugs, Mealworms, and Chi-Square? Using Invertebrate Behavior to Illustrate Hypothesis Testing with Chi-Square.

    Science.gov (United States)

    Biermann, Carol

    1988-01-01

    Described is a study designed to introduce students to the behavior of common invertebrate animals and to the use of the chi-square statistical technique. Discusses activities with snails, pill bugs, and mealworms. Provides an abbreviated chi-square table and instructions for performing the experiments and statistical tests. (CW)

  15. Chi-squared: A simpler evaluation function for multiple-instance learning

    National Research Council Canada - National Science Library

    McGovern, Amy; Jensen, David

    2003-01-01

    ...) but finds the best concept using the chi-square statistic. This approach is simpler than diverse density and allows us to search more extensively by using properties of the contingency table to prune in a guaranteed manner...

  16. Chi-square tests for comparing weighted histograms

    International Nuclear Information System (INIS)

    Gagunashvili, N.D.

    2010-01-01

    Weighted histograms in Monte Carlo simulations are often used for the estimation of probability density functions. They are obtained as a result of random experiments with random events that have weights. In this paper, the bin contents of a weighted histogram are considered as a sum of random variables with a random number of terms. Generalizations of the classical chi-square test for comparing weighted histograms are proposed. Numerical examples illustrate an application of the tests for histograms with different statistics of events and different weight functions. The proposed tests can be used for the comparison of experimental data histograms with simulated data histograms, as well as for two simulated data histograms.
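    For unweighted histograms, the classical homogeneity test that the paper generalizes can be sketched as follows; the bin counts are hypothetical, and the weighted-histogram generalizations themselves are beyond this sketch.

```python
import numpy as np
from scipy.stats import chi2

h1 = np.array([50, 60, 45, 30, 15])   # hypothetical event counts, histogram 1
h2 = np.array([40, 70, 50, 25, 15])   # histogram 2
N1, N2 = h1.sum(), h2.sum()

# Classical chi-square for comparing two unweighted histograms
stat = ((N2 * h1 - N1 * h2) ** 2 / (h1 + h2)).sum() / (N1 * N2)
p_value = chi2.sf(stat, len(h1) - 1)  # bins - 1 degrees of freedom
```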

  17. Measures of effect size for chi-squared and likelihood-ratio goodness-of-fit tests.

    Science.gov (United States)

    Johnston, Janis E; Berry, Kenneth J; Mielke, Paul W

    2006-10-01

    A fundamental shift in editorial policy for psychological journals was initiated when the fourth edition of the Publication Manual of the American Psychological Association (1994) placed emphasis on reporting measures of effect size. This paper presents measures of effect size for the chi-squared and likelihood-ratio goodness-of-fit tests.
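    The paper presents its own effect-size measures; as a generic illustration of pairing an effect size with a chi-squared test, Cramér's V (a standard measure, not necessarily one of those proposed here) can be computed from a hypothetical table:

```python
import numpy as np
from scipy.stats import chi2_contingency

obs = np.array([[25, 15],      # hypothetical 2x2 table of counts
                [10, 30]])
chi2_stat, p, dof, _ = chi2_contingency(obs, correction=False)
n = obs.sum()

# Cramer's V: effect size on [0, 1], independent of sample size
cramers_v = np.sqrt(chi2_stat / (n * (min(obs.shape) - 1)))
```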

  18. Clarification of the use of chi-square and likelihood functions in fits to histograms

    International Nuclear Information System (INIS)

    Baker, S.; Cousins, R.D.

    1984-01-01

    We consider the problem of fitting curves to histograms in which the data obey multinomial or Poisson statistics. Techniques commonly used by physicists are examined in light of standard results found in the statistics literature. We review the relationship between multinomial and Poisson distributions, and clarify a sufficient condition for equality of the area under the fitted curve and the number of events on the histogram. Following the statisticians, we use the likelihood ratio test to construct a general χ² statistic, χ²_λ, which yields parameter and error estimates identical to those of the method of maximum likelihood. The χ²_λ statistic is further useful for testing goodness-of-fit since the value of its minimum asymptotically obeys a classical chi-square distribution. One should be aware, however, of the potential for statistical bias, especially when the number of events is small. (orig.)
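    For Poisson-distributed bins, the likelihood-ratio chi-square has a well-known closed form; a minimal sketch, with hypothetical bin contents and fitted expectations:

```python
import numpy as np

def chi2_poisson(n, mu):
    """Likelihood-ratio chi-square for Poisson-distributed histogram bins:
    chi2 = 2 * sum( mu_i - n_i + n_i * ln(n_i / mu_i) ),
    with the log term dropped for empty bins."""
    n = np.asarray(n, dtype=float)
    mu = np.asarray(mu, dtype=float)
    terms = mu - n
    nz = n > 0
    terms[nz] += n[nz] * np.log(n[nz] / mu[nz])
    return 2.0 * terms.sum()

print(chi2_poisson([5, 8, 3], [5, 8, 3]))   # perfect fit -> 0.0
```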

  19. One implementation of the Chi Square Test with SPSS

    OpenAIRE

    Tinoco Gómez, Oscar

    2014-01-01

    This paper illustrates the use of the statistical software SPSS applied to the chi-square test of independence between two variables. The application is carried out in the evaluation of the impact generated by the educational web page of the Faculty of Administrative Sciences of the National University Federico Villarreal, in relation to the use of some information and communication technology tools in the process of professional training.

  20. Pearson's chi-square test and rank correlation inferences for clustered data.

    Science.gov (United States)

    Shih, Joanna H; Fay, Michael P

    2017-09-01

    Pearson's chi-square test has been widely used in testing for association between two categorical responses. Spearman rank correlation and Kendall's tau are often used for measuring and testing association between two continuous or ordered categorical responses. However, the established statistical properties of these tests are only valid when each pair of responses are independent, where each sampling unit has only one pair of responses. When each sampling unit consists of a cluster of paired responses, the assumption of independent pairs is violated. In this article, we apply the within-cluster resampling technique to U-statistics to form new tests and rank-based correlation estimators for possibly tied clustered data. We develop large sample properties of the new proposed tests and estimators and evaluate their performance by simulations. The proposed methods are applied to a data set collected from a PET/CT imaging study for illustration. Published 2017. This article is a U.S. Government work and is in the public domain in the USA.
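    The within-cluster resampling idea can be sketched on synthetic clustered binary pairs (cluster sizes and responses invented, not the PET/CT data): draw one pair per cluster, compute Pearson's chi-square on the resulting table, and average over resamples.

```python
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(4)
# Synthetic clustered data: each cluster holds several (x, y) binary pairs
sizes = rng.integers(2, 6, size=100)
clusters = [list(zip(rng.integers(0, 2, size=s), rng.integers(0, 2, size=s)))
            for s in sizes]

stats = []
for _ in range(200):                     # within-cluster resampling
    picks = [c[rng.integers(len(c))] for c in clusters]  # one pair per cluster
    tbl = np.zeros((2, 2))
    for x, y in picks:
        tbl[x, y] += 1
    stats.append(chi2_contingency(tbl, correction=False)[0])
mean_stat = float(np.mean(stats))        # resampling-based chi-square statistic
```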

  1. Use of Pearson's Chi-Square for Testing Equality of Percentile Profiles across Multiple Populations.

    Science.gov (United States)

    Johnson, William D; Beyl, Robbie A; Burton, Jeffrey H; Johnson, Callie M; Romer, Jacob E; Zhang, Lei

    2015-08-01

    In large sample studies where distributions may be skewed and not readily transformed to symmetry, it may be of greater interest to compare different distributions in terms of percentiles rather than means. For example, it may be more informative to compare two or more populations with respect to their within population distributions by testing the hypothesis that their corresponding respective 10th, 50th, and 90th percentiles are equal. As a generalization of the median test, the proposed test statistic is asymptotically distributed as Chi-square with degrees of freedom dependent upon the number of percentiles tested and constraints of the null hypothesis. Results from simulation studies are used to validate the nominal 0.05 significance level under the null hypothesis, and asymptotic power properties that are suitable for testing equality of percentile profiles against selected profile discrepancies for a variety of underlying distributions. A pragmatic example is provided to illustrate the comparison of the percentile profiles for four body mass index distributions.
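    The median test that the proposed statistic generalizes can be sketched directly for the single-percentile (50th) case, using synthetic skewed samples: count observations above and below the pooled median and run Pearson's chi-square on the table.

```python
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(2)
a = rng.exponential(scale=1.0, size=500)   # skewed samples, population A
b = rng.exponential(scale=1.2, size=500)   # population B

# Median test: 2x2 table of counts above / at-or-below the pooled median
grand_median = np.median(np.concatenate([a, b]))
table = np.array([[(a > grand_median).sum(), (a <= grand_median).sum()],
                  [(b > grand_median).sum(), (b <= grand_median).sum()]])
stat, p, dof, _ = chi2_contingency(table, correction=False)
```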

  2. Reconnaissance On Chi-Square Test Procedure For Determining Two Species Association

    Science.gov (United States)

    Marisa, Hanifa

    2008-01-01

    Determining the association of two species using a chi-square test has been published. Applying this procedure to plant species at a certain location shows that the procedure can report an association that is not "ecologically" meaningful. Tens of sampling units have been made to record some weed species in Indralaya, South Sumatera. The chi-square test, χ² = N[|ad − bc| − (N/2)]²/(mnrs) (Eq. 1), on two weed species (Cleome sp. and Eleusine indica) shows a positive association, while ecologically, in nature, there is no relationship between them. Some alternatives are proposed for this problem: simplify the chi-square test steps, carry out further study to find out whether the association is ecological, or, at last, ignore it.
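    Equation 1 can be evaluated directly from a 2x2 presence/absence table; the counts below are invented for illustration, with a = units where both species occur, b and c = units with only one species, and d = units with neither.

```python
# Hypothetical 2x2 co-occurrence counts from the sampling units
a, b, c, d = 4, 2, 3, 1
N = a + b + c + d
m, n, r, s = a + b, c + d, a + c, b + d   # marginal totals

# Yates-corrected chi-square from Eq. 1: N[|ad - bc| - N/2]^2 / (mnrs)
chi_sq = N * (abs(a * d - b * c) - N / 2) ** 2 / (m * n * r * s)
```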

  3. Calibration of Self-Efficacy for Conducting a Chi-Squared Test of Independence

    Science.gov (United States)

    Zimmerman, Whitney Alicia; Goins, Deborah D.

    2015-01-01

    Self-efficacy and knowledge, both concerning the chi-squared test of independence, were examined in education graduate students. Participants rated statements concerning self-efficacy and completed a related knowledge assessment. After completing a demographic survey, participants completed the self-efficacy and knowledge scales a second time.…

  4. False star detection and isolation during star tracking based on improved chi-square tests.

    Science.gov (United States)

    Zhang, Hao; Niu, Yanxiong; Lu, Jiazhen; Yang, Yanqiang; Su, Guohua

    2017-08-01

    The star sensor is a precise attitude measurement device for a spacecraft, and star tracking is its main and key working mode. However, during star tracking, false stars become an inevitable interference for star sensor applications, which may result in reduced measurement accuracy. A false star detection and isolation algorithm for star tracking based on improved chi-square tests is proposed in this paper. Two estimates are established, based on a Kalman filter and on a priori information, respectively. False star detection is performed by applying the global state chi-square test in the Kalman filter, and false star isolation is achieved using a local state chi-square test. Semi-physical experiments under different trajectories with various false stars are designed for verification. Experimental results show that various false stars can be detected and isolated from navigation stars during star tracking, and that the attitude measurement accuracy is hardly influenced by false stars. The proposed algorithm is shown to have excellent performance in terms of speed, stability, and robustness.
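    A minimal sketch of a chi-square test on a Kalman filter innovation, the building block behind such detection schemes; the threshold, covariance, and measurements below are illustrative values, not the paper's tuned parameters.

```python
import numpy as np

def innovation_chi2_ok(z, z_pred, S, threshold=9.21):
    """Chi-square test on a filter innovation; 9.21 is the chi-square critical
    value for 2 degrees of freedom at the 1% significance level."""
    nu = z - z_pred                        # innovation (measurement residual)
    d2 = nu @ np.linalg.inv(S) @ nu        # normalized innovation squared
    return bool(d2 <= threshold)           # False flags a suspect measurement

S = 0.01 * np.eye(2)                       # hypothetical innovation covariance
ok = innovation_chi2_ok(np.array([1.00, 2.00]), np.array([1.01, 2.00]), S)
bad = innovation_chi2_ok(np.array([2.00, 2.00]), np.array([1.01, 2.00]), S)
```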

  5. Double-Bottom Chaotic Map Particle Swarm Optimization Based on Chi-Square Test to Determine Gene-Gene Interactions

    Science.gov (United States)

    Yang, Cheng-Hong; Chang, Hsueh-Wei

    2014-01-01

    Gene-gene interaction studies focus on the investigation of the association between the single nucleotide polymorphisms (SNPs) of genes for disease susceptibility. Statistical methods are widely used to search for a good model of gene-gene interaction for disease analysis, and the previously determined models have successfully explained the effects between SNPs and diseases. However, the huge numbers of potential combinations of SNP genotypes limit the use of statistical methods for analysing high-order interaction, and finding an available high-order model of gene-gene interaction remains a challenge. In this study, an improved particle swarm optimization with double-bottom chaotic maps (DBM-PSO) was applied to assist statistical methods in the analysis of associated variations to disease susceptibility. A big data set was simulated using the published genotype frequencies of 26 SNPs amongst eight genes for breast cancer. Results showed that the proposed DBM-PSO successfully determined two- to six-order models of gene-gene interaction for the risk association with breast cancer (odds ratio > 1.0; P value <0.05). Analysis results supported that the proposed DBM-PSO can identify good models and provide higher chi-square values than conventional PSO. This study indicates that DBM-PSO is a robust and precise algorithm for determination of gene-gene interaction models for breast cancer. PMID:24895547

  6. Practical Statistics for Particle Physics Analyses: Chi-Squared and Goodness of Fit (2/4)

    CERN Multimedia

    CERN. Geneva; Moneta, Lorenzo

    2016-01-01

    This will be a 4-day series of 2-hour sessions as part of CERN's Academic Training Course. Each session will consist of a 1-hour lecture followed by one hour of practical computing, with exercises based on that day's lecture. While it is possible to follow just the lectures or just the computing exercises, we highly recommend that, because of the way this course is designed, participants come to both parts. In order to follow the hands-on exercise sessions, students need to bring their own laptops. The exercises will be run on a dedicated CERN web notebook service, SWAN (swan.cern.ch), which is open to everybody holding a CERN computing account. The requirement for using the SWAN service is to have a CERN account and also access to CERNBox, the shared storage service at CERN. New users are invited to activate CERNBox beforehand by simply connecting to https://cernbox.cern.ch. A basic prior knowledge of ROOT and C++ is also recommended for participation in the practical session....

  7. Jet pairing algorithm for the 6-jet Higgs channel via energy chi-square criterion

    International Nuclear Information System (INIS)

    Magallanes, J.B.; Arogancia, D.C.; Gooc, H.C.; Vicente, I.C.M.; Bacala, A.M.; Miyamoto, A.; Fujii, K.

    2002-01-01

    Study and discovery of the Higgs boson at the JLC (Joint Linear Collider) is one of the tasks of the ACFA (Asian Committee for Future Accelerators)-JLC Group. The mode of Higgs production at the JLC is e+e− → Z0H0. In this paper, studies are concentrated on the Higgsstrahlung process and the selection of its signals by finding the right jet-pairing algorithm for the 6-jet final state at 300 GeV, assuming that the Higgs boson mass is 120 GeV and the luminosity is 500 fb−1. The total decay width Γ(H0 → all) and the efficiency of the signals at the JLC are studied utilizing the 6-jet channel. Out of the 91,500 Higgsstrahlung events, 4,174 6-jet events are selected. The PYTHIA Monte Carlo generator generates the 6-jet Higgsstrahlung channel according to the Standard Model. The generated events are then simulated by Quick Simulator using the JLC parameters. After tagging all 6 quarks which correspond to the 6-jet final state of the Higgsstrahlung, the mean energies of the Z, H, and W's are obtained. Having calculated this information, the event energy chi-square is defined, and it is found that correct combinations generally have smaller values. This criterion can be used to find the correct jet-pairing algorithm and as one of the cuts against background signals later on. Other chi-square definitions are also proposed. (S. Funahashi)

  8. Association between litterers' profile and littering behavior: A chi-square approach

    Science.gov (United States)

    Asmui, Mas'udah; Zaki, Suhanom Mohd; Wahid, Sharifah Norhuda Syed; Mokhtar, Noorsuraya Mohd; Harith, Siti Suhaila

    2017-05-01

    Littering is not a new problem, yet it remains a prolonged one. Solutions have been discussed for a long time; however, the issue still remains unresolved. Littering is commonly associated with littering behavior and awareness, and littering behavior is normally influenced by the litterer's profile, such as gender, family income, education level and age. Jengka Street market, which is located in Pahang, is popularly known as a trade market. It offers a diversity of wet and dry goods and is visited by local residents and tourists. This study analyzes the association between litterers' profiles and littering behavior. Littering behavior is measured based on factors of trash bin facilities, awareness campaigns and public littering behavior. 114 respondents were involved in this study, of whom 62 (54.39%) were female, aged more than 18 years old, and the majority of these female respondents were diploma holders. In addition, 78.95% of the respondents have family incomes below RM3,000.00 per month. Based on the data analysis, it was found that first-time visitors littered more than frequent visitors, that a lack of trash bin facilities contributes to littering behavior, and that there is a significant association between litterers' age and littering behavior, using the chi-square approach.

  9. Intrusion detection model using fusion of chi-square feature selection and multi class SVM

    Directory of Open Access Journals (Sweden)

    Ikram Sumaiya Thaseen

    2017-10-01

    Full Text Available Intrusion detection is a promising area of research in the security domain, given the rapid development of the internet in everyday life. Many intrusion detection systems (IDS) employ a sole classifier algorithm for classifying network traffic as normal or abnormal. Due to the large amount of data, these sole-classifier models fail to achieve a high attack detection rate with a reduced false alarm rate. However, by applying dimensionality reduction, data can be efficiently reduced to an optimal set of attributes without loss of information and then classified accurately using a multi-class modeling technique for identifying the different network attacks. In this paper, we propose an intrusion detection model using chi-square feature selection and a multi-class support vector machine (SVM). A parameter tuning technique is adopted for optimization of the Radial Basis Function kernel parameter, namely gamma, represented by 'γ', and the overfitting constant 'C'. These are the two important parameters required for the SVM model. The main idea behind this model is to construct a multi-class SVM, which has not been adopted for IDS so far, to decrease the training and testing time and increase the individual classification accuracy of the network attacks. The investigational results on the NSL-KDD dataset, which is an enhanced version of the KDDCup 1999 dataset, show that our proposed approach results in a better detection rate and a reduced false alarm rate. An experiment on the computational time required for training and testing is also carried out for usage in time-critical applications.
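    The chi-square-selection-then-SVM pipeline can be sketched with scikit-learn (assumed available); the data are synthetic, the labels are binary for brevity (the paper's model is multi-class), and the gamma and C values are defaults, not the tuned ones.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.svm import SVC

rng = np.random.default_rng(3)
X = rng.integers(0, 10, size=(200, 20)).astype(float)  # non-negative features
y = (X[:, 0] + X[:, 1] > 9).astype(int)                # labels tied to two features

# Chi-square feature selection, then an RBF-kernel SVM with gamma and C
X_sel = SelectKBest(chi2, k=5).fit_transform(X, y)
clf = SVC(kernel='rbf', gamma='scale', C=1.0).fit(X_sel, y)
acc = clf.score(X_sel, y)                              # training accuracy
```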

  10. Chi-Square Test of Word of Mouth Marketing with Impact on the Evaluation of Patients' Hospital and Services: An Application in Teaching and Research Hospital

    Directory of Open Access Journals (Sweden)

    Yelda ŞENER

    2014-12-01

    Full Text Available The purpose of this study, using data from 223 inpatients in a teaching and research hospital, is to explain the effect of word-of-mouth marketing on hospital preference. For this purpose, the word-of-mouth marketing process is evaluated in terms of the provision of information about the hospital and the patient's level of intimacy with the information provider, the levels of expertise of both the patient and the information provider with respect to the hospital and its services, the patient's perceived level of risk for the hospital and its services, and the provided information's level of impact on the patient being treated in the hospital. The obtained data were first evaluated by frequency distributions; the impact of these factors on word-of-mouth marketing is then demonstrated by descriptive statistics, chi-square analysis and Pearson's correlation analysis. As a result of this study, it is concluded that word-of-mouth marketing has a significant impact on patients' preference for the teaching and research hospital.

  11. Landslide susceptibility mapping using decision-tree based CHi-squared automatic interaction detection (CHAID) and Logistic regression (LR) integration

    International Nuclear Information System (INIS)

    Althuwaynee, Omar F; Pradhan, Biswajeet; Ahmad, Noordin

    2014-01-01

    This article uses methodology based on chi-squared automatic interaction detection (CHAID), a multivariate method with an automatic classification capacity, to analyse large numbers of landslide conditioning factors. This new algorithm was developed to overcome the subjectivity of the manual categorization of scale data for landslide conditioning factors, and to predict a rainfall-induced susceptibility map for Kuala Lumpur city and surrounding areas using a geographic information system (GIS). The main objective of this article is to use the CHAID method to perform the best classification fit for each conditioning factor, and then to combine it with logistic regression (LR). The LR model was used to find the corresponding coefficients of the best-fitting function that assesses the optimal terminal nodes. A cluster pattern of landslide locations was extracted in a previous study using the nearest neighbour index (NNI), which was then used to identify the clustered landslide location range. Clustered locations were used as model training data with 14 landslide conditioning factors, such as topographically derived parameters, lithology, NDVI, and land use and land cover maps. The Pearson chi-squared value was used to find the best classification fit between the dependent variable and the conditioning factors. Finally, the relationships between the conditioning factors were assessed and the landslide susceptibility map (LSM) was produced. An area under the curve (AUC) was used to test the model reliability and prediction capability with the training and validation landslide locations, respectively. This study proved the efficiency and reliability of the decision tree (DT) model in landslide susceptibility mapping. It also provided a valuable scientific basis for spatial decision making in planning and urban management studies.

  12. Landslide susceptibility mapping using decision-tree based chi-squared automatic interaction detection (CHAID) and logistic regression (LR) integration

    Science.gov (United States)

    Althuwaynee, Omar F.; Pradhan, Biswajeet; Ahmad, Noordin

    2014-06-01

    This article uses a methodology based on chi-squared automatic interaction detection (CHAID), a multivariate method with an automatic classification capacity, to analyse large numbers of landslide conditioning factors. This algorithm was developed to overcome the subjectivity of manual categorization of scale data for landslide conditioning factors, and to predict a rainfall-induced susceptibility map for Kuala Lumpur city and surrounding areas using a geographic information system (GIS). The main objective of this article is to use the chi-squared automatic interaction detection (CHAID) method to find the best classification fit for each conditioning factor, and then to combine it with logistic regression (LR). The LR model was used to find the coefficients of the best-fitting function that assesses the optimal terminal nodes. A cluster pattern of landslide locations was extracted in a previous study using the nearest neighbour index (NNI), and was then used to identify the range of clustered landslide locations. The clustered locations were used as model training data together with 14 landslide conditioning factors, such as topographically derived parameters, lithology, NDVI, and land use and land cover maps. The Pearson chi-squared value was used to find the best classification fit between the dependent variable and the conditioning factors. Finally, the relationships between the conditioning factors were assessed and the landslide susceptibility map (LSM) was produced. The area under the curve (AUC) was used to test the model's reliability and prediction capability with the training and validation landslide locations, respectively. This study proved the efficiency and reliability of the decision tree (DT) model in landslide susceptibility mapping, and provides a valuable scientific basis for spatial decision making in planning and urban management studies.
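
    The CHAID step these records describe, scoring each candidate conditioning factor by a Pearson chi-squared statistic against the outcome and keeping the best split, can be sketched in a few lines; the factors and data below are illustrative, not the paper's Kuala Lumpur dataset.

```python
import numpy as np
from scipy.stats import chi2_contingency

def chi2_score(factor, outcome):
    """Pearson chi-squared statistic of a categorical factor vs. a binary outcome."""
    classes = np.unique(factor)
    table = np.array([[np.sum((factor == c) & (outcome == o))
                       for o in (0, 1)] for c in classes])
    stat, p, dof, _ = chi2_contingency(table)
    return stat, p

rng = np.random.default_rng(0)
outcome = rng.integers(0, 2, 500)   # 0 = stable, 1 = landslide (synthetic)
# "slope" class shifts upward for landslide cells, so it tracks the outcome;
# "lithology" is generated independently of it.
slope = np.where(outcome == 1, rng.integers(1, 4, 500), rng.integers(0, 3, 500))
lithology = rng.integers(0, 3, 500)

s_slope, p_slope = chi2_score(slope, outcome)
s_lith, p_lith = chi2_score(lithology, outcome)
# The factor with the larger chi-squared (smaller p) gives the better split.
assert s_slope > s_lith
```

A CHAID tree repeats this scoring recursively inside each node; the sketch shows only the single-factor ranking step.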

  13. Chi-squared Automatic Interaction Detection Decision Tree Analysis of Risk Factors for Infant Anemia in Beijing, China.

    Science.gov (United States)

    Ye, Fang; Chen, Zhi-Hua; Chen, Jie; Liu, Fang; Zhang, Yong; Fan, Qin-Ying; Wang, Lin

    2016-05-20

    In the past decades, studies on infant anemia have mainly focused on rural areas of China. With the increasing heterogeneity of the population in recent years, available information on infant anemia is inconclusive in large cities of China, especially regarding comparisons between native residents and the floating population. This population-based cross-sectional study was implemented to determine the anemic status of infants as well as the risk factors in a representative downtown area of Beijing. As useful methods to build a predictive model, chi-squared automatic interaction detection (CHAID) decision tree analysis and logistic regression analysis were introduced to explore risk factors of infant anemia. A total of 1091 infants aged 6-12 months together with their parents/caregivers living at Heping Avenue Subdistrict of Beijing were surveyed from January 1, 2013 to December 31, 2014. The prevalence of anemia was 12.60%, with a range of 3.47%-40.00% across subgroup characteristics. The CHAID decision tree model demonstrated multilevel interaction among risk factors through stepwise pathways to detect anemia. Besides the three predictors identified by the logistic regression model (maternal anemia during pregnancy, exclusive breastfeeding in the first 6 months, and floating population), CHAID decision tree analysis identified a fourth risk factor, maternal educational level, with higher overall classification accuracy and a larger area under the receiver operating characteristic curve. The infant anemic status in a metropolis is complex and should be carefully considered by basic health care practitioners. CHAID decision tree analysis demonstrated better performance in hierarchical analysis of a population with great heterogeneity. The risk factors identified by this study might be meaningful for the early detection and prompt treatment of infant anemia in large cities.

  14. GammaCHI: a package for the inversion and computation of the gamma and chi-square cumulative distribution functions (central and noncentral)

    NARCIS (Netherlands)

    A. Gil (Amparo); J. Segura (Javier); N.M. Temme (Nico)

    2015-01-01

    A Fortran 90 module GammaCHI for computing and inverting the gamma and chi-square cumulative distribution functions (central and noncentral) is presented. The main novelty of this package is its reliable and accurate inversion routines for the noncentral cumulative distribution functions.
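
    GammaCHI itself is a Fortran 90 package; the same mathematics, evaluating and inverting the central and noncentral chi-square CDFs, can be cross-checked with SciPy. This is an analogous sketch, not the package's own routines.

```python
from scipy.stats import chi2, ncx2

# Central chi-square: CDF and its inverse (quantile) round-trip.
p = chi2.cdf(3.0, df=5)
assert abs(chi2.ppf(p, df=5) - 3.0) < 1e-8

# Noncentral chi-square with noncentrality parameter nc = 2.5: the
# distribution shifts right, so the CDF at the same point is smaller.
q = ncx2.cdf(3.0, df=5, nc=2.5)
assert q < p
assert abs(ncx2.ppf(q, df=5, nc=2.5) - 3.0) < 1e-6   # inversion recovers x
```

The inversion accuracy claimed for GammaCHI matters precisely in this ppf step, where naive root-finding can lose digits in the distribution tails.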

  15. A prediction rule for the development of delirium among patients in medical wards: Chi-Square Automatic Interaction Detector (CHAID) decision tree analysis model.

    Science.gov (United States)

    Kobayashi, Daiki; Takahashi, Osamu; Arioka, Hiroko; Koga, Shinichiro; Fukui, Tsuguya

    2013-10-01

    To predict the development of delirium among patients in medical wards with a Chi-Square Automatic Interaction Detector (CHAID) decision tree model. This was a retrospective cohort study of all adult patients admitted to medical wards at a large community hospital. The subject patients were randomly assigned to either a derivation or a validation group (2:1) by computed random number generation. Baseline data and clinically relevant factors were collected from the electronic chart. The primary outcome was the development of delirium during hospitalization. All potential predictors were included in a forward stepwise logistic regression model. CHAID decision tree analysis was also performed to build another prediction model with the same group of patients. Receiver operating characteristic curves were drawn, and the areas under the curves (AUCs) were calculated for both models. In the validation group, these receiver operating characteristic curves and AUCs were calculated based on the rules from the derivation. A total of 3,570 patients were admitted: 2,400 patients assigned to the derivation group and 1,170 to the validation group. A total of 91 and 51 patients, respectively, developed delirium. Statistically significant predictors were delirium history, age, underlying malignancy, and activities-of-daily-living impairment in the CHAID decision tree model, resulting in six distinctive groups by level of risk. The AUC was 0.82 in derivation and 0.82 in validation with the CHAID model, and 0.78 in derivation and 0.79 in validation with the logistic model. We propose a validated CHAID decision tree prediction model to predict the development of delirium among medical patients. Copyright © 2013 American Association for Geriatric Psychiatry. Published by Elsevier Inc. All rights reserved.

  16. A Search for WIMP Dark Matter Using an Optimized Chi-square Technique on the Final Data from the Cryogenic Dark Matter Search Experiment (CDMS II)

    Energy Technology Data Exchange (ETDEWEB)

    Manungu Kiveni, Joseph [Syracuse Univ., NY (United States)

    2012-12-01

    This dissertation describes the results of a WIMP search using CDMS II data sets accumulated at the Soudan Underground Laboratory in Minnesota. Results from the original analysis of these data were published in 2009; two events were observed in the signal region with an expected leakage of 0.9 events. Further investigation revealed an issue with the ionization-pulse reconstruction algorithm leading to a software upgrade and a subsequent reanalysis of the data. As part of the reanalysis, I performed an advanced discrimination technique to better distinguish (potential) signal events from backgrounds using a 5-dimensional chi-square method. This data-analysis technique combines the event information recorded for each WIMP-search event to derive a background-discrimination parameter capable of reducing the expected background to less than one event, while maintaining high efficiency for signal events. Furthermore, optimizing the cut positions of this 5-dimensional chi-square parameter for the 14 viable germanium detectors yields an improved expected sensitivity to WIMP interactions relative to previous CDMS results. This dissertation describes my improved (and optimized) discrimination technique and the results obtained from a blind application to the reanalyzed CDMS II WIMP-search data.
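
    A multidimensional chi-square discrimination parameter of the kind described here is, in spirit, a quadratic form in the deviations of an event's measured quantities from the signal expectation; a generic sketch under Gaussian assumptions (illustrative, not the CDMS II implementation):

```python
import numpy as np

def chi2_param(event, mean, cov):
    """(x - mu)^T C^{-1} (x - mu): chi-square distributed with 5 degrees of
    freedom when the 5 event quantities are Gaussian about the signal mean."""
    d = event - mean
    return d @ np.linalg.solve(cov, d)

rng = np.random.default_rng(1)
mean = np.zeros(5)                        # signal expectation in 5 variables
cov = np.diag([1.0, 0.5, 2.0, 1.0, 1.5])  # per-variable variances
signal_like = rng.multivariate_normal(mean, cov)
background_like = mean + 8.0 * np.sqrt(np.diag(cov))  # 8 sigma off in every variable

# Cutting on this parameter keeps signal-like events and rejects outliers.
assert chi2_param(background_like, mean, cov) > chi2_param(signal_like, mean, cov)
```

Optimizing the cut position per detector, as the dissertation does, then trades signal efficiency against expected background leakage.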

  17. Titanic: A Statistical Exploration.

    Science.gov (United States)

    Takis, Sandra L.

    1999-01-01

    Uses the available data about the Titanic's passengers to interest students in exploring categorical data and the chi-square distribution. Describes activities incorporated into a statistics class and gives additional resources for collecting information about the Titanic. (ASK)
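
    The classroom exercise this record describes, testing whether survival was independent of passenger class, is a standard chi-square test of independence on a contingency table; the counts below are illustrative placeholders, not the actual Titanic manifest.

```python
from scipy.stats import chi2_contingency

# Rows: 1st, 2nd, 3rd class; columns: survived, died (illustrative counts).
table = [[200, 120],
         [120, 160],
         [180, 530]]
stat, p, dof, expected = chi2_contingency(table)
assert dof == 2        # (3 rows - 1) * (2 columns - 1)
assert p < 0.001       # with these counts, survival clearly depends on class
```

`expected` holds the counts implied by independence, which is where the classroom discussion of observed-versus-expected naturally starts.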

  18. ATLAS particle detector CSC ROD software design and implementation, and, Addition of K physics to chi-squared analysis of FDQM

    CERN Document Server

    Hawkins, Donovan Lee

    2005-01-01

    In this thesis I present a software framework for use on the ATLAS muon CSC readout driver. This C++ framework uses plug-in Decoders incorporating hand-optimized assembly language routines to perform sparsification and data formatting. The software is designed with both flexibility and performance in mind, and runs on a custom 9U VME board using Texas Instruments TMS320C6203 digital signal processors. I describe the requirements of the software, the methods used in its design, and the results of testing the software with simulated data. I also present modifications to a chi-squared analysis of the Standard Model and Four Down Quark Model (FDQM) originally done by Dr. Dennis Silverman. The addition of four new experiments to the analysis has little effect on the Standard Model but provides important new restrictions on the FDQM. The method used to incorporate these new experiments is presented, and the consequences of their addition are reviewed.

  19. Principal components in the discrimination of outliers: A study in simulation sample data corrected by Pearson's and Yates's chi-square distance

    Directory of Open Access Journals (Sweden)

    Manoel Vitor de Souza Veloso

    2016-04-01

    The current study employs Monte Carlo simulation to build a significance test indicating the principal components that best discriminate outliers. Different sample sizes were generated by multivariate normal distribution with different numbers of variables and correlation structures. Corrections by the chi-square distances of Pearson and Yates were provided for each sample size. Pearson's chi-square correction showed the best performance. As the number of variables increased, significance probabilities in favor of hypothesis H0 were reduced. To illustrate the proposed method, it was applied to a multivariate time series of sales volume rates in the state of Minas Gerais, obtained in different market segments.

  20. Risk Factors Analysis and Death Prediction in Some Life-Threatening Ailments Using Chi-Square Case-Based Reasoning (χ2 CBR) Model.

    Science.gov (United States)

    Adeniyi, D A; Wei, Z; Yang, Y

    2018-01-30

    A wealth of data is available within the health care system; however, effective analysis tools for exploring the hidden patterns in these datasets are lacking. To alleviate this limitation, this paper proposes a simple but promising hybrid predictive model that suitably combines the chi-square distance measurement with the case-based reasoning technique. The study presents the realization of an automated risk calculator and death prediction in some life-threatening ailments using a chi-square case-based reasoning (χ2 CBR) model. The proposed predictive engine reduces runtime and speeds up the execution process through the use of the critical χ2 distribution value. This work also showcases the development of a novel feature selection method referred to as the frequent-item-based rule (FIBR) method, used for selecting the best feature for the proposed χ2 CBR model at the preprocessing stage of the predictive procedures. The implementation of the proposed risk calculator is achieved through an in-house PHP program hosted on an Apache HTTP server (XAMPP), and data acquisition and case-base development are implemented using MySQL. Performance comparison between this system and the NBY, ED-KNN, ANN, SVM, Random Forest and traditional CBR techniques shows that the quality of predictions produced by the proposed system outperforms the baseline methods studied. The experiments show that the precision rate and predictive quality of the system are in most cases equal to or greater than 70%, and that the proposed system executes faster than the baseline methods studied. The proposed risk calculator is therefore capable of providing useful, consistent, fast, accurate and efficient risk-level prediction to both patients and physicians at any time, online and on a real-time basis.
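
    The retrieval step such a χ2 CBR engine implies, ranking stored cases by a chi-square distance between feature profiles and reusing the nearest case, can be sketched as below; the case base and feature encoding are hypothetical, not the authors' PHP/MySQL implementation.

```python
import numpy as np

def chi2_distance(a, b, eps=1e-10):
    """Chi-square distance between two non-negative feature vectors."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return 0.5 * np.sum((a - b) ** 2 / (a + b + eps))

# Hypothetical case base: each case is a normalised risk-factor profile.
case_base = {
    "case_low_risk":  [0.9, 0.1, 0.0],
    "case_mid_risk":  [0.4, 0.4, 0.2],
    "case_high_risk": [0.1, 0.2, 0.7],
}
query = [0.2, 0.2, 0.6]   # new patient's profile (illustrative)

# Retrieve the stored case closest to the query in chi-square distance.
best = min(case_base, key=lambda k: chi2_distance(case_base[k], query))
assert best == "case_high_risk"
```

In a full CBR cycle the retrieved case's outcome would then be adapted and stored back; the sketch covers only retrieval.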

  1. Leeds under a cloud

    International Nuclear Information System (INIS)

    1988-01-01

    Before 26 April 1986 few people in the west had heard of Chernobyl. Then Chernobyl experienced the world's worst nuclear power station accident. In the wake of the disaster radioactivity fell on Britain and much of Europe. There was confusion and rumour on the television, in the papers and amongst ordinary people. What would the effect of the Chernobyl accident be? Was it safe to go out of doors? Was it safe to eat fresh vegetables? What was a safe level of radiation? What was a becquerel, a milliSievert or any of the other scientific terms with which we were bombarded by scientists and other experts? This booklet sets out to help answer these questions by looking at a hypothetical disaster at the nuclear power station at Heysham, near Morecambe in Lancashire. Using this scenario it shows what the worst consequences of a nuclear accident might be for the citizens of Leeds. It also explains in a straightforward way the meaning of many technical terms which will help you to understand the advice and comments of experts and to make your own judgement of what they say. (author)

  2. Figure-of-merit (FOM), an improved criterion over the normalized chi-squared test for assessing goodness-of-fit of gamma-ray spectral peaks

    International Nuclear Information System (INIS)

    Garo Balian, H.; Eddy, N.W.

    1977-01-01

    A careful experimenter knows that in order to choose the best curve fits of peaks from a gamma-ray spectrum for such purposes as energy or intensity calibration, half-life determination, etc., the application of the normalized chi-squared test, χ²_N = χ²/(n - m), is insufficient. One must normally verify the goodness-of-fit with plots, detailed scans of residuals, etc. Because of different techniques of application, variations in backgrounds, in peak sizes and shapes, etc., quotation of the χ²_N value associated with an individual peak fit conveys very little information unless accompanied by considerable ancillary data. (This is not to say that the traditional χ² formula should not be used as the source of the normal equations in the least squares fitting procedure. But after the fitting, it is unreliable as a criterion for comparison with other fits.) The authors present a formula designated figure-of-merit (FOM) which greatly improves on the uncertainty and fluctuations of the χ²_N formula. An FOM value of less than 2.5% indicates a good fit (in the authors' judgement) irrespective of background conditions and variations in peak sizes and shapes. Furthermore, the authors feel the FOM formula is less subject to fluctuations resulting from different techniques of application. (Auth.)
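
    The abstract does not reproduce the FOM formula itself, but the normalized chi-squared it criticises, χ²_N = χ²/(n - m), is easy to compute for any peak fit; a sketch with a Gaussian peak on a flat background and Poisson weights (illustrative spectra, not the authors' data):

```python
import numpy as np

def reduced_chi2(counts, model, n_params):
    """chi2_N = chi2 / (n - m), with Poisson variance sigma^2 ~ counts."""
    resid = counts - model
    chi2 = np.sum(resid ** 2 / np.maximum(counts, 1.0))
    return chi2 / (counts.size - n_params)

x = np.arange(50, dtype=float)
truth = 40.0 + 500.0 * np.exp(-0.5 * ((x - 25.0) / 3.0) ** 2)  # bg + peak
rng = np.random.default_rng(2)
counts = rng.poisson(truth).astype(float)

good = reduced_chi2(counts, truth, n_params=4)                   # ~1 expected
bad = reduced_chi2(counts, np.full_like(x, counts.mean()), n_params=1)
assert bad > good   # a flat model misses the peak badly
```

The record's point is that a single χ²_N value like `good` is hard to compare across spectra with different backgrounds and peak shapes, which is what the FOM is meant to remedy.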

  3. A chi-square goodness-of-fit test for non-identically distributed random variables: with application to empirical Bayes

    International Nuclear Information System (INIS)

    Conover, W.J.; Cox, D.D.; Martz, H.F.

    1997-12-01

    When using parametric empirical Bayes estimation methods for estimating the binomial or Poisson parameter, the validity of the assumed beta or gamma conjugate prior distribution is an important diagnostic consideration. Chi-square goodness-of-fit tests of the beta or gamma prior hypothesis are developed for use when the binomial sample sizes or Poisson exposure times vary. Nine examples illustrate the application of the methods, using real data from such diverse applications as the loss of feedwater flow rates in nuclear power plants, the probability of failure to run on demand and the failure rates of the high pressure coolant injection systems at US commercial boiling water reactors, the probability of failure to run on demand of emergency diesel generators in US commercial nuclear power plants, the rate of failure of aircraft air conditioners, baseball batting averages, the probability of testing positive for toxoplasmosis, and the probability of tumors in rats. The tests are easily applied in practice by means of corresponding Mathematica® computer programs, which are provided.

  4. Quantitative analysis of urban sprawl in Tripoli using Pearson's Chi-Square statistics and urban expansion intensity index

    International Nuclear Information System (INIS)

    Al-sharif, Abubakr A A; Pradhan, Biswajeet; Shafri, Helmi Zulhaidi Mohd; Mansor, Shattri

    2014-01-01

    Urban expansion is a spatial phenomenon that reflects the increased importance of metropolises. Remotely sensed data and GIS have been widely used to study and analyze the process of urban expansion and its patterns. The capital of Libya (Tripoli) was selected for this study to examine its urban growth patterns. Four satellite images of the study area at different dates (1984, 1996, 2002 and 2010) were used to conduct this research. The main goal of this work is the identification and analysis of the urban sprawl of the Tripoli metropolitan area. The urban expansion intensity index (UEII) and a degree-of-freedom test were used to analyze and assess urban expansion in the study area. The results show that Tripoli has a sprawled urban expansion pattern and a high urban expansion intensity index, and that its urban development had a high degree of freedom over the period 1984-2010. Moreover, the proposed hypothesis used for zone division provided very good insight into the direction of urban expansion and the effect of distance from the central business district (CBD).
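
    The urban expansion intensity index is commonly defined as the annualised share of a zone's total area converted to urban land over the study interval; a hedged sketch of that common definition (the paper's exact variant may differ), with made-up areas:

```python
def ueii(urban_start, urban_end, total_area, years):
    """Urban expansion intensity index: percent of a zone's total area
    converted to urban land per year (common textbook definition)."""
    return 100.0 * (urban_end - urban_start) / (years * total_area)

# Illustrative zone: 12 km^2 urban in 1984 growing to 30 km^2 by 2010
# within a 100 km^2 analysis zone.
index = ueii(12.0, 30.0, 100.0, 2010 - 1984)
assert 0.6 < index < 0.8   # roughly 0.69% of the zone urbanised per year
```

Zones are then ranked or classed by this index to label expansion as slow, medium or rapid.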

  5. Model selection for contingency tables with algebraic statistics

    NARCIS (Netherlands)

    Krampe, A.; Kuhnt, S.; Gibilisco, P.; Riccimagno, E.; Rogantin, M.P.; Wynn, H.P.

    2009-01-01

    Goodness-of-fit tests based on chi-square approximations are commonly used in the analysis of contingency tables. Results from algebraic statistics combined with MCMC methods provide alternatives to the chi-square approximation. However, within a model selection procedure usually a large number of

  6. LEED (Low Energy Electron Diffraction)

    International Nuclear Information System (INIS)

    Aberdam, M.

    1973-01-01

    The various types of systems studied by LEED, and for which the geometry of diffraction patterns is exploited, are reviewed, intensity profiles being another source of information. Two representative approaches to the scattering phenomenon are examined: the band structure theory and the T-matrix approach.

  7. A goodness of fit statistic for the geometric distribution

    OpenAIRE

    Ferreira, J.A.

    2003-01-01

    We propose a goodness of fit statistic for the geometric distribution and compare it in terms of power, via simulation, with the chi-square statistic. The statistic is based on the Lau-Rao theorem and can be seen as a discrete analogue of the total time on test statistic. The results suggest that the test based on the new statistic is generally superior to the chi-square test.
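
    The chi-square comparison baseline mentioned here, a goodness-of-fit test of the geometric distribution with an estimated parameter, can be sketched as follows (the paper's own Lau-Rao-based statistic is not reproduced):

```python
import numpy as np
from scipy.stats import chisquare, geom

rng = np.random.default_rng(3)
sample = geom.rvs(0.3, size=1000, random_state=rng)

# Bin the support as {1, 2, 3, 4, 5+} so expected counts stay comfortably large.
observed = [np.sum(sample == k) for k in (1, 2, 3, 4)] + [np.sum(sample >= 5)]
p_hat = 1.0 / sample.mean()                  # MLE of the geometric parameter
probs = [geom.pmf(k, p_hat) for k in (1, 2, 3, 4)] + [geom.sf(4, p_hat)]

# ddof=1 accounts for the one estimated parameter: dof = 5 - 1 - 1 = 3.
stat, p_value = chisquare(observed, f_exp=np.asarray(probs) * sample.size, ddof=1)
assert stat >= 0.0 and 0.0 <= p_value <= 1.0
```

The binning choice is the weak point the record alludes to: power depends on how the infinite support is collapsed into cells.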

   8. An intrusion detection model for computer systems, performing feature selection with chi square and training and classification with GHSOM

    Directory of Open Access Journals (Sweden)

    Johan Mardini

    2017-07-01

    Since information has become one of the most valuable assets of organizations, it must be safeguarded through different protection strategies in order to prevent intrusive access or any type of incident that causes its deterioration or misuse. To that end, this article evaluates the efficiency of a network intrusion detection model, using metrics of sensitivity, specificity, precision and accuracy, through a simulation process based on the NSL-KDD DARPA dataset and, specifically, on the most relevant features selected with CHI SQUARE. Classification was performed by a neural network that uses an unsupervised learning algorithm based on hierarchical self-organizing maps (GHSOM), which classified the bi-class network traffic automatically. The results show that the GHSOM classifier used with the CHI SQUARE technique yields its best result with 15 features in terms of precision, sensitivity, specificity and accuracy.
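
    The CHI SQUARE feature-selection step this record describes scores each candidate feature by a chi-squared statistic against the class label and keeps the top-ranked ones; a minimal NumPy version for binary features (names and data illustrative, not the NSL-KDD pipeline):

```python
import numpy as np

def chi2_feature_scores(X, y):
    """Chi-squared score of each binary feature against binary labels,
    as used in filter-style feature selection."""
    scores = []
    for j in range(X.shape[1]):
        # 2x2 observed table of feature value vs. label.
        obs = np.array([[np.sum((X[:, j] == a) & (y == b))
                         for b in (0, 1)] for a in (0, 1)], float)
        row, col = obs.sum(1, keepdims=True), obs.sum(0, keepdims=True)
        exp = row * col / obs.sum()          # counts expected under independence
        scores.append(np.sum((obs - exp) ** 2 / exp))
    return np.array(scores)

rng = np.random.default_rng(4)
y = rng.integers(0, 2, 400)
informative = (y ^ (rng.random(400) < 0.1)).astype(int)  # tracks the label
noise = rng.integers(0, 2, 400)                          # independent of it
scores = chi2_feature_scores(np.column_stack([informative, noise]), y)
assert scores[0] > scores[1]   # the informative feature ranks first
```

Keeping the 15 highest-scoring features, as the study reports, is then a single argsort over these scores.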

  9. Advances on surface structural determination by LEED

    International Nuclear Information System (INIS)

    Soares, Edmar A; De Carvalho, Vagner E; De Castilho, Caio M C

    2011-01-01

    In the last 40 years, low energy electron diffraction (LEED) has proved to be the most reliable quantitative technique for surface structural determination. In this review, recent developments related to the theory that gives support to LEED structural determination are discussed under a critical analysis of the main theoretical approximation, the muffin-tin calculation. The search methodologies aimed at identifying the best matches between theoretical and experimental intensity versus voltage curves are also considered, with the most recent procedures being reviewed in detail. (topical review)

  10. Federal Participation in LEED in 2005

    Energy Technology Data Exchange (ETDEWEB)

    Payne, Christopher; Dyer, Beverly

    2005-11-01

    The federal government is an active participant in promoting sustainable design, construction and operations and in the use of USGBC's Leadership in Energy and Environmental Design (LEED) Green Building Rating System. This paper presents an overview of sustainable construction activities in the federal sector in 2005.

  11. A goodness of fit statistic for the geometric distribution

    NARCIS (Netherlands)

    J.A. Ferreira

    2003-01-01

    We propose a goodness of fit statistic for the geometric distribution and compare it in terms of power, via simulation, with the chi-square statistic. The statistic is based on the Lau-Rao theorem and can be seen as a discrete analogue of the total time on test statistic. The results

  12. Head First Statistics

    CERN Document Server

    Griffiths, Dawn

    2009-01-01

    Wouldn't it be great if there were a statistics book that made histograms, probability distributions, and chi square analysis more enjoyable than going to the dentist? Head First Statistics brings this typically dry subject to life, teaching you everything you want and need to know about statistics through engaging, interactive, and thought-provoking material, full of puzzles, stories, quizzes, visual aids, and real-world examples. Whether you're a student, a professional, or just curious about statistical analysis, Head First's brain-friendly formula helps you get a firm grasp of statistics.

  13. Reliability of contemporary data-acquisition techniques for LEED analysis

    International Nuclear Information System (INIS)

    Noonan, J.R.; Davis, H.L.

    1980-10-01

    It is becoming clear that one of the principal limitations in LEED structure analysis is the quality of the experimental I-V profiles. This limitation is discussed, and data acquisition procedures are described which, for simple systems, seem to enhance the quality of agreement between the results of theoretical model calculations and experimental LEED spectra. By employing such procedures to obtain data from Cu(100), excellent agreement between computed and measured profiles has been achieved. 7 figures

  14. Residents’ Support in Major Local Events: Leeds Pride.

    OpenAIRE

    Pappas, Nikolaos

    2016-01-01

    This article examines the extent to which community participation and perceived impacts have an influence on residents' support of major events, more specifically, the Leeds Pride celebration. The research examines the perspectives of 400 Leeds permanent residents. The study tests a structural equation model, which has its theoretical basis in social exchange theory. It examines the constructs of community participation, perceived positive and negative impacts, and community support, including...

  15. Cost analysis of LEED certified United States navy buildings

    OpenAIRE

    Kirar, Carl V.

    2011-01-01

    CIVINS (Civilian Institutions) Thesis document. A study was completed at UW-Madison in 2010 that reviewed the energy consumption of US Navy buildings which earned Leadership in Energy and Environmental Design (LEED) certification by the United States Green Building Council (USGBC). The research compared LEED certified buildings to a commercial counterpart within the US Navy inventory against Executive Order (EO) 13423. The EO mandated that all federal agencies meet a 30 percent reduction of...

  16. Advanced Categorical Statistics: Issues and Applications in Communication Research.

    Science.gov (United States)

    Denham, Bryan E.

    2002-01-01

    Discusses not only the procedures, assumptions, and applications of advanced categorical statistics, but also covers some common misapplications, from which a great deal can be learned. Addresses the use and limitations of cross-tabulation and chi-square analysis, as well as issues such as observation independence and artificial inflation of a…

  17. Practical statistics for particle physicists

    CERN Multimedia

    CERN. Geneva

    2006-01-01

    Learning to love the error matrix: introductory remarks; conditional probability; statistical and systematic errors; combining results; binomial, Poisson and 1-D Gaussian; 2-D Gaussian and the error matrix; understanding the covariance; using the error matrix; estimating the error matrix; combining correlated measurements. Parameter determination by likelihood, do's and don'ts: introduction to likelihood; error estimate; simple examples: (1) Breit-Wigner, (2) lifetime; binned and unbinned likelihood; several parameters; extended maximum likelihood; common misapprehensions: normalisation, delta(lnL) = 1/2 rule and coverage, integrating the likelihood, unbinned L_max as goodness of fit, Punzi effect. Chi-squared and hypothesis testing: basic idea; error estimates; several parameters; correlated errors on y; errors on x and y; goodness of fit; degrees of freedom; why assympt...

  18. Statistics II essentials

    CERN Document Server

    Milewski, Emil G

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Statistics II discusses sampling theory, statistical inference, independent and dependent variables, correlation theory, experimental design, count data, chi-square test, and time se

  19. Designing healthy communities: A walkability analysis of LEED-ND

    Directory of Open Access Journals (Sweden)

    Adriana A. Zuniga-Teran

    2016-12-01

    Prevailing city design in many countries has created sedentary societies that depend on automobile use. Consequently, architects, urban designers, and land planners have developed new urban design theories, which have been incorporated into the Leadership in Energy and Environmental Design for Neighborhood Development (LEED-ND) certification system. The LEED-ND includes design elements that improve human well-being by facilitating walking and biking, a concept known as walkability. Despite these positive developments, relevant research findings from other fields of study have not been fully integrated into the LEED-ND. According to Zuniga-Teran (2015), relevant walkability research findings from multiple disciplines were organized into a walkability framework (WF) that organizes design elements related to physical activity into nine categories, namely, connectivity, land use, density, traffic safety, surveillance, parking, experience, greenspace, and community. In this study, we analyze walkability in the LEED-ND through the lens of the nine WF categories. Through quantitative and qualitative analyses, we identify gaps and strengths in the LEED-ND and propose potential enhancements to this certification system that reflect what is known about enhancing walkability more comprehensively through neighborhood design analysis. This work seeks to facilitate the translation of research into practice, which can ultimately lead to more active and healthier societies.

  20. Extending the range of low energy electron diffraction (LEED) surface structure determination: Co-adsorbed molecules, incommensurate overlayers and alloy surface order studied by new video and electron counting LEED techniques

    International Nuclear Information System (INIS)

    Ogletree, D.F.

    1986-11-01

    LEED multiple scattering theory is briefly summarized, and aspects of electron scattering with particular significance to experimental measurements such as electron beam coherence, instrument response and phonon scattering are analyzed. Diffuse LEED experiments are discussed. New techniques that enhance the power of LEED are described, including a real-time video image digitizer applied to LEED intensity measurements, along with computer programs to generate I-V curves. The first electron counting LEED detector using a "wedge and strip" position sensitive anode and digital electronics is described. This instrument uses picoampere incident beam currents, and its sensitivity is limited only by statistics and counting times. Structural results on new classes of surface systems are presented. The structure of the c(4 x 2) phase of carbon monoxide adsorbed on Pt(111) has been determined, showing that carbon monoxide molecules adsorb in both top and bridge sites, 1.85 ± 0.10 Å and 1.55 ± 0.10 Å above the metal surface, respectively. The structure of an incommensurate graphite overlayer on Pt(111) is analyzed. The graphite layer is 3.70 ± 0.05 Å above the metal surface, with intercalated carbon atoms located 1.25 ± 0.10 Å above hollow sites supporting it. The (2√3 x 4)-rectangular phase of benzene and carbon monoxide coadsorbed on Pt(111) is analyzed. Benzene molecules adsorb in bridge sites parallel to and 2.10 ± 0.10 Å above the surface. The carbon ring is expanded, with an average C-C bond length of 1.72 ± 0.15 Å. The carbon monoxide molecules also adsorb in bridge sites. The structure of the (√3 x √3) reconstruction on the (111) face of the α-CuAl alloy has been determined.

  2. Performance or marketing benefits? The case of LEED certification.

    Science.gov (United States)

    Matisoff, Daniel C; Noonan, Douglas S; Mazzolini, Anna M

    2014-01-01

    Green building adoption is driven by both performance-based and marketing-based benefits. Performance-based benefits are those that improve performance or lower operating costs for the building or its users. Marketing benefits stem from the consumer response to green certification. This study illustrates the relative importance of the marketing-based benefits that accrue to Leadership in Energy and Environmental Design (LEED) buildings through green signaling mechanisms tied to the certification itself. Of course, all participants in the LEED certification scheme seek marketing benefits. But even among LEED participants, the interest in green signaling is pronounced. The green signaling mechanism that operates at the certification thresholds shifts building patterns from just below to just above each threshold level and motivates builders to cluster buildings just above each threshold. Results are consistent across subsamples, though nonprofit organizations appear to build greener buildings and engage in more green signaling than for-profit entities. Using nonparametric regression discontinuity, signaling is observed across different building types. Marketing benefits from LEED certification drive organizations to build "greener" buildings by upgrading buildings at the thresholds to reach certification levels.

  3. A comparison of three methods of assessing differential item functioning (DIF) in the Hospital Anxiety Depression Scale: ordinal logistic regression, Rasch analysis and the Mantel chi-square procedure.

    Science.gov (United States)

    Cameron, Isobel M; Scott, Neil W; Adler, Mats; Reid, Ian C

    2014-12-01

    It is important for clinical practice and research that measurement scales of well-being and quality of life exhibit only minimal differential item functioning (DIF). DIF occurs where different groups of people endorse items in a scale to different extents after being matched on the intended scale attribute. We investigate the equivalence or otherwise of common methods of assessing DIF. Three methods of measuring age- and sex-related DIF (ordinal logistic regression, Rasch analysis and the Mantel χ² procedure) were applied to Hospital Anxiety Depression Scale (HADS) data pertaining to a sample of 1,068 patients consulting primary care practitioners. Three items were flagged by all three approaches as having either age- or sex-related DIF with a consistent direction of effect; a further three items identified did not meet stricter criteria for important DIF using at least one method. When applying strict criteria for significant DIF, ordinal logistic regression was slightly less sensitive. Ordinal logistic regression, Rasch analysis and contingency table methods yielded consistent results when identifying DIF in the HADS depression and HADS anxiety scales. Regardless of the methods applied, investigators should use a combination of statistical significance, magnitude of the DIF effect and investigator judgement when interpreting the results.

  4. Critical review of LEED system for rating sustainability of architecture of commercial interiors

    Directory of Open Access Journals (Sweden)

    Stevanović Sanja

    2010-01-01

    The LEED rating system for sustainability of architecture has gained large marketing potential in the USA and has become one of the main ways American builders address ecological challenges. In this paper the LEED rating system for commercial interiors is critically reviewed, pointing out both its positive impact (its focus on the integrated design process) and its negative ones (low thresholds for the highest ratings, and a tendency to seek a LEED rating with projects that barely pass the thresholds, largely neglecting the principles of energy efficiency). Based on a few prominent LEED Platinum examples, the beginnings of a LEED style of designing interiors in historic landmark buildings are also pointed out.

  5. Residuals and the Residual-Based Statistic for Testing Goodness of Fit of Structural Equation Models

    Science.gov (United States)

    Foldnes, Njal; Foss, Tron; Olsson, Ulf Henning

    2012-01-01

    The residuals obtained from fitting a structural equation model are crucial ingredients in obtaining chi-square goodness-of-fit statistics for the model. The authors present a didactic discussion of the residuals, obtaining a geometrical interpretation by recognizing the residuals as the result of oblique projections. This sheds light on the…

  6. An Item Fit Statistic Based on Pseudocounts from the Generalized Graded Unfolding Model: A Preliminary Report.

    Science.gov (United States)

    Roberts, James S.

    Stone and colleagues (C. Stone, R. Ankenman, S. Lane, and M. Liu, 1993; C. Stone, R. Mislevy and J. Mazzeo, 1994; C. Stone, 2000) have proposed a fit index that explicitly accounts for the measurement error inherent in an estimated theta value, here called χ²_i*. The elements of this statistic are natural…

  7. Pivotal statistics for testing subsets of structural parameters in the IV Regression Model

    NARCIS (Netherlands)

    Kleibergen, F.R.

    2000-01-01

    We construct a novel statistic to test hypotheses on subsets of the structural parameters in an Instrumental Variables (IV) regression model. We derive the chi-squared limiting distribution of the statistic and show that it has a degrees-of-freedom parameter that is equal to the number of structural

  8. Quantum chi-squared and goodness of fit testing

    Energy Technology Data Exchange (ETDEWEB)

    Temme, Kristan [IQIM, California Institute of Technology, Pasadena, California 91125 (United States); Verstraete, Frank [Fakultät für Physik, Universität Wien, Boltzmanngasse 5, 1090 Wien, Austria and Faculty of Science, Ghent University, B-9000 Ghent (Belgium)

    2015-01-15

    A quantum mechanical hypothesis test is presented for the hypothesis that a certain setup produces a given quantum state. Although the classical and the quantum problems are very much related to each other, the quantum problem is much richer due to the additional optimization over the measurement basis. A goodness of fit test for i.i.d. quantum states is developed and a max-min characterization for the optimal measurement is introduced. We find the quantum measurement which leads to both the maximal Pitman and Bahadur efficiencies, and determine the associated divergence rates. We discuss the relationship of the quantum goodness of fit test to the problem of estimating multiple parameters from a density matrix. These problems are found to be closely related and we show that the largest error of an optimal strategy, determined by the smallest eigenvalue of the Fisher information matrix, is given by the divergence rate of the goodness of fit test.

  9. Nonparametric statistics for social and behavioral sciences

    CERN Document Server

    Kraska-MIller, M

    2013-01-01

    Introduction to Research in Social and Behavioral Sciences; Basic Principles of Research; Planning for Research; Types of Research Designs; Sampling Procedures; Validity and Reliability of Measurement Instruments; Steps of the Research Process; Introduction to Nonparametric Statistics; Data Analysis; Overview of Nonparametric Statistics and Parametric Statistics; Overview of Parametric Statistics; Overview of Nonparametric Statistics; Importance of Nonparametric Methods; Measurement Instruments; Analysis of Data to Determine Association and Agreement; Pearson Chi-Square Test of Association and Independence; Contingency

  10. Achieving LEED credit for ergonomics: Laying the foundation.

    Science.gov (United States)

    Lynch, Mallory

    2014-01-01

    Despite guidance from the United States Green Building Council (USGBC) on the requirements for earning a Leadership in Energy and Environmental Design (LEED) ergonomics credit in the Innovation in Design and Innovation in Operations category, few projects have received the credit. The University of California, Berkeley ergonomics program, Ergonomics@Work, has aligned the ergonomics strategy to those of the USGBC and LEED to achieve the ergonomics credit in several new buildings. This article describes the steps needed to obtain the credit and highlights the opportunities it creates to partner with the project team to promote ergonomics. As a profession it is up to ergonomists to create the road map that incorporates ergonomics into the green building design.

  11. Green roofs and the LEED green building rating system

    Energy Technology Data Exchange (ETDEWEB)

    Kula, R. [Sustainable Solutions Inc., Wagoner, OK (United States)

    2005-07-01

    The sustainable building industry is becoming increasingly aware of the host of public and private benefits that green roofs can provide in built environments. In dense urban environments, green roofs function to reduce stormwater runoff, urban heat island effects, and particulate matter (PM) pollution. The emerging green roof industry is now poised to support the efforts of green building networks in North America. This paper discussed the general benefits of green roofs, and their recognition within the Leadership in Energy and Environmental Design (LEED) Green Building Rating System. A case study of Mountain Equipment Co-op's Winnipeg site was presented. The building's green roof was directly responsible for earning 5 credits and contributing to the achievement of an additional 2 credits under the LEED certification process. Credits were earned for reduced site disturbance; landscape design to reduce heat islands; and water efficiency. The green roof at the site provided the vast majority of the building's cooling needs through an evaporative cooling trough. A photovoltaic pump was used to feed the building's irrigation system, as well as to pump ground water through cooling valances. It was concluded that the rise of sustainable building practices and the LEED Green Building Rating System will revolutionize the way new buildings are constructed.

  12. Validation of the Malayalam version of Leeds assessment of neuropathic symptoms and signs pain scale in cancer patients in the Regional Cancer Centre, Thiruvananthapuram, Kerala, India

    Directory of Open Access Journals (Sweden)

    Shoukkathali Anzar

    2017-01-01

    Objective: The Self-administered Leeds Assessment of Neuropathic Symptoms and Signs (S-LANSS) is a 7-item self-report scale developed to identify pain of predominantly neuropathic origin. The aim of this study was to develop a Malayalam version of the S-LANSS and to test its validity and reliability in chronic pain patients. Methodology: We enrolled 101 Malayalam-speaking chronic pain patients who visited the Division of Palliative Medicine, Regional Cancer Centre, Thiruvananthapuram, Kerala, India. The translated version of the S-LANSS was constructed by standard means. Fifty-one neuropathic pain and fifty nociceptive pain patients were identified by an independent pain physician and were administered the new pain scale by a palliative care nurse who was blinded to the diagnosis. The "gold standard" diagnosis was the one made by the physician after clinical examination. Validity, sensitivity, specificity, and positive and negative predictive values were determined. Results: The agreement by Cohen's kappa was 0.743 (chi-square test, P < 0.001), with sensitivity 89.58, specificity 84.91, positive predictive value 84.31, negative predictive value 90.00, accuracy 87.13, and likelihood ratio 5.94. Conclusion: The Malayalam version of the S-LANSS pain scale is a validated screening tool for identifying neuropathic pain in chronic pain patients in Malayalam-speaking regions.
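The reported screening metrics all derive from a single 2×2 confusion matrix. As a sketch (the cell counts below are inferred from the published percentages, not taken from the paper), the quantities can be computed as:

```python
# Hedged reconstruction: these cell counts are NOT stated in the abstract; they
# are the integer 2x2 table for n = 101 that reproduces the reported values.
tp, fn = 43, 5    # gold-standard neuropathic: 48 patients
fp, tn = 8, 45    # gold-standard nociceptive: 53 patients

sensitivity = tp / (tp + fn)                 # true-positive rate, 43/48
specificity = tn / (tn + fp)                 # true-negative rate, 45/53
ppv = tp / (tp + fp)                         # positive predictive value, 43/51
npv = tn / (tn + fn)                         # negative predictive value, 45/50
accuracy = (tp + tn) / (tp + fn + fp + tn)   # 88/101
lr_positive = sensitivity / (1 - specificity)

print(round(100 * sensitivity, 2),   # 89.58
      round(100 * specificity, 2),   # 84.91
      round(100 * ppv, 2),           # 84.31
      round(100 * npv, 2),           # 90.0
      round(100 * accuracy, 2))      # 87.13
```

With these counts the positive likelihood ratio evaluates to about 5.93, matching the reported 5.94 up to rounding.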

  13. Application of a LEED apparatus provided with a lens to the study of vicinal surfaces

    International Nuclear Information System (INIS)

    Laydevant, Louis; Dupuy, J.C.

    1979-01-01

    The presence of steps on vicinal surfaces changes the low energy electron diffraction (LEED) pattern: a system of regularly spaced steps causes some spots to be split. Using a high voltage LEED apparatus allows an easy explanation of the patterns: the spot position does not depend on energy, so some crystallographic parameters can be easily measured [fr]

  14. Introducing comparative analysis to the LEED system: A case forrational and regional application

    Energy Technology Data Exchange (ETDEWEB)

    Eijadi, David; Vaidya, Prausad; Reinertsen, James; Kumar, Satish

    2002-06-01

    The LEED(TM) system awards points for prescriptive and performance-based environmental strategies, rightly giving more weight to decisions affecting building operations, since environmental impacts over the life of a building exceed the one-time environmental impacts of the building's construction. The environmental benefits of LEED(TM) strategies are considered implicit and the point system is not a metric of environmental performance. Thus, guideline strategies that achieve the same points may not have analogous environmental performance. This paper draws from our LEED(TM) project experience as certified consultants to a number of design teams. We applied analysis to those experiences and argue that: the relative environmental value of the same LEED(TM) strategy may vary by geographical region and by building type; scoring successive LEED(TM) points beyond a 'standard practice design' significantly increases design effort and capital costs for construction; without comparative analysis of the costs of alternate LEED(TM) strategies and their corresponding environmental benefit, designers will not necessarily invest capital in strategies that most profoundly minimize the environmental impacts of a building; and for design teams and owners interested in the least expensive LEED(TM) certification, gaming the point system could drive investment away from sound environmental performance strategies such as energy efficiency. Using these arguments, this paper makes a case to enhance the LEED(TM) system by categorizing LEED(TM) strategies by their direct or indirect value towards Environmental Benefit, Healthy Buildings (Places), and Profitability; reformulating prescriptive requirements into performance-based requirements wherever possible; and customizing LEED(TM) guidelines by region.

  15. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the

  16. A statistically self-consistent type Ia supernova data analysis

    International Nuclear Information System (INIS)

    Lago, B.L.; Calvao, M.O.; Joras, S.E.; Reis, R.R.R.; Waga, I.; Giostri, R.

    2011-01-01

    The type Ia supernovae are one of the main cosmological probes nowadays and are used as standardized candles in distance measurements. The standardization processes, among which SALT2 and MLCS2k2 are the most used, are based on empirical relations and leave room for a residual dispersion in the light curves of the supernovae. This dispersion enters the chi-squared used to fit the parameters of the model through the expression for the variance of the data, as an attempt to quantify our ignorance in modeling the supernovae properly. The procedure used to assign a value to this dispersion is statistically inconsistent and excludes the possibility of comparing different cosmological models. In addition, the SALT2 light curve fitter introduces parameters in the model for the variance that are also used in the model for the data. In the chi-squared statistics context, the minimization of such a quantity yields, in the best case scenario, a bias. An iterative method has been developed in order to perform the minimization of this chi-squared, but it is not well grounded, although it is used by several groups. We propose an analysis of the type Ia supernovae data that is based on the likelihood itself and makes it possible to address both inconsistencies mentioned above in a straightforward way. (author)
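The inconsistency described above can be reproduced in a toy model (a minimal sketch with assumed numbers, not the authors' analysis): when an intrinsic dispersion enters the variance, the chi-squared alone keeps decreasing as the dispersion grows, while the full likelihood, which retains the log-variance term, has an interior minimum.

```python
import numpy as np

# Toy illustration (assumed values, not the paper's data): y_i ~ N(mu, var),
# with var = sigma_obs^2 + sigma_int^2 and sigma_int an intrinsic dispersion.
rng = np.random.default_rng(0)
sigma_obs, sigma_int_true = 0.10, 0.15
y = rng.normal(0.0, np.hypot(sigma_obs, sigma_int_true), size=500)

def chi2(sigma_int):
    var = sigma_obs**2 + sigma_int**2
    return np.sum((y - y.mean())**2 / var)

def neg2_log_like(sigma_int):
    var = sigma_obs**2 + sigma_int**2
    # -2 ln L keeps the log-variance term that the bare chi^2 drops
    return np.sum((y - y.mean())**2 / var + np.log(2.0 * np.pi * var))

grid = np.linspace(0.01, 1.0, 500)
# chi^2 decreases monotonically in sigma_int: minimizing it alone is ill-posed
assert chi2(grid[-1]) < chi2(grid[0])
# the full likelihood penalizes large variances and recovers the dispersion
best = grid[np.argmin([neg2_log_like(s) for s in grid])]
print(best)  # an interior minimum near the true value 0.15
```

This is the core of the likelihood-based proposal: the variance model is fit jointly with the data model instead of being tuned by hand.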

  17. Radiofrequency electromagnetic fields in the Cookridge area of Leeds

    CERN Document Server

    Fuller, K; Judd, P M; Lowe, A J; Shaw, J

    2002-01-01

    On 8 and 9 May 2002, representatives of the National Radiological Protection Board (NRPB) performed a radiofrequency electromagnetic field survey in the Cookridge area of Leeds in order to assess exposure to radio signals from transmitters mounted on a water tower, a lattice tower and a radio station tower. Guidelines on limiting exposure to radio signals have been published by NRPB and the International Commission on Non-Ionizing Radiation Protection (ICNIRP). These guidelines are designed to prevent established adverse effects on human health. During this survey, the total exposures due to all radio signals from 30 MHz to 18000 MHz (18 GHz) were measured. This frequency range was chosen as it includes mobile phone base station transmissions, which are at around 900 and 1800 MHz, and super high frequency (SHF) transmissions from most of the large microwave dish antennas mounted on the towers. In addition, other major sources of radiofrequency electromagnetic fields in the environment such as broadcast radio...

  18. Statistics

    Science.gov (United States)

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  19. Impact of different LEED versions for green building certification and energy efficiency rating system: A Multifamily Midrise case study

    International Nuclear Information System (INIS)

    Rastogi, Ankush; Choi, Jun-Ki; Hong, Taehoon; Lee, Minhyun

    2017-01-01

    Highlights: •Energy consumption changes from applying different LEED versions were investigated. •Four analysis scenarios were compared using different versions of ASHRAE Standard. •A case study of a mid-rise multi-family building was conducted using energy simulation. •Residential buildings could benefit from LEED v4 due to the low prerequisite. •Renovation buildings are highly incentivized regardless of the LEED version used. -- Abstract: Various versions of the Leadership in Energy and Environmental Design (LEED®) have been introduced with the addition of more stringent sustainability parameters and credit scoring schemes over the past decade. Such changes in LEED versions strongly affect the energy performance and LEED scores of the target building in the LEED certification process. Therefore, to validate and improve the current LEED version, it is crucial to investigate and compare the impact of different LEED versions on the building energy performance and scoring scheme. However, research comparing the sustainability metrics for mid-rise multi-family buildings is rare. Therefore, this paper investigates the potential changes in the energy performance resulting from applying different LEED versions (i.e., LEED v3 and v4) for the Energy and Atmosphere (EA) category. Towards this end, a case study was carried out with energy modeling and simulation using TRACE 700 to compare the changes in the energy performance of four analysis scenarios applied to an existing mid-rise multi-family building located in Ohio. Results showed notable changes in LEED points when different versions of LEED using different ASHRAE Standards (i.e., ASHRAE Standards 90.1-2007 and 90.1-2010) are applied for the building energy analysis. In particular, mid-rise multi-family buildings could benefit from LEED v4 in terms of LEED credits as the prerequisite for the minimum energy performance improvement in the EA category became significantly more lenient compared to LEED v3. On the

  20. Statistics

    International Nuclear Information System (INIS)

    2005-01-01

    For the years 2004 and 2005 the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics published in the Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004). The applied energy units and conversion coefficients are shown on the back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity in GWh, Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes, precautionary stock fees and oil pollution fees

  1. Statistics

    International Nuclear Information System (INIS)

    2001-01-01

    For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  2. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  3. Statistics

    International Nuclear Information System (INIS)

    1999-01-01

    For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  4. Empirical Correction to the Likelihood Ratio Statistic for Structural Equation Modeling with Many Variables.

    Science.gov (United States)

    Yuan, Ke-Hai; Tian, Yubin; Yanagihara, Hirokazu

    2015-06-01

    Survey data typically contain many variables. Structural equation modeling (SEM) is commonly used in analyzing such data. The most widely used statistic for evaluating the adequacy of an SEM model is T_ML, a slight modification to the likelihood ratio statistic. Under the normality assumption, T_ML approximately follows a chi-square distribution when the number of observations (N) is large and the number of items or variables (p) is small. However, in practice, p can be rather large while N is always limited due to not having enough participants. Even with a relatively large N, empirical results show that T_ML rejects the correct model too often when p is not too small. Various corrections to T_ML have been proposed, but they are mostly heuristic. Following the principle of the Bartlett correction, this paper proposes an empirical approach to correct T_ML so that the mean of the resulting statistic approximately equals the degrees of freedom of the nominal chi-square distribution. Results show that empirically corrected statistics follow the nominal chi-square distribution much more closely than previously proposed corrections to T_ML, and they control type I errors reasonably well whenever N ≥ max(50, 2p). The formulations of the empirically corrected statistics are further used to predict type I errors of T_ML as reported in the literature, and they perform well.
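The Bartlett-correction principle the abstract builds on can be sketched generically (an illustrative mock-up with simulated draws, not the paper's actual formula for T_ML): rescale the statistic so that its empirical mean under a correct model equals the nominal degrees of freedom.

```python
import numpy as np

# Generic sketch of a Bartlett-type correction (illustrative only). We mimic
# the small-sample inflation of T_ML by scaling chi-square draws by 1.25, then
# rescale so the corrected statistic's mean equals the nominal df.
rng = np.random.default_rng(1)
df = 10
t_ml = 1.25 * rng.chisquare(df, size=20_000)   # "inflated" replications of T_ML

c = t_ml.mean() / df        # empirical Bartlett factor
t_corrected = t_ml / c      # mean now equals df by construction

crit = 18.307               # 95th percentile of chi-square with 10 df
print(t_corrected.mean())              # equals df
print((t_ml > crit).mean())            # inflated type I error, well above 0.05
print((t_corrected > crit).mean())     # close to the nominal 0.05
```

In the paper the factor is estimated from (N, p) rather than from replications under the true model, but the target is the same: a corrected statistic whose mean matches the reference chi-square.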

  5. Lies, Damned Lies, and Statistics (in Geology)

    Science.gov (United States)

    Vermeesch, Pieter

    2009-11-01

    According to Karl Popper's epistemology of critical rationalism, scientists should formulate falsifiable hypotheses rather than produce ad hoc answers to empirical observations. In other words, we should predict and test rather than merely explain [Popper, 1959]. Sometimes, statistical tests such as chi-square, t, or Kolmogorov-Smirnov are used to make deductions more “objective.” Such tests are used in a wide range of geological subdisciplines [see Reimann and Filzmoser, 2000; Anderson and Johnson, 1999; Lørup et al., 1998; Sircombe and Hazelton, 2004].
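A minimal predict-then-test example in the Popperian spirit (the class counts below are invented for illustration): state a uniform hypothesis over six classes first, then compute Pearson's chi-square statistic and compare it against the critical value.

```python
# Illustration with invented counts: test the pre-stated hypothesis that
# observations fall uniformly across six classes, then try to falsify it.
observed = [18, 22, 25, 20, 15, 20]
expected = [20] * 6                      # uniform hypothesis, n = 120 total

chi2_stat = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
dof = len(observed) - 1                  # 5 degrees of freedom

print(chi2_stat)  # 2.9
# The 5% critical value of chi-square with 5 df is about 11.07, so this
# attempt at falsification fails: the uniform hypothesis survives.
print(chi2_stat < 11.07)  # True
```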

  6. Universidad de Leeds - Gran Bretaña

    Directory of Open Access Journals (Sweden)

    Chamberlin, -

    1977-11-01

    Located 1,500 m from the center of the city, the Leeds University complex actively participates in city life. Designed in the 60's and built later on, this architectonic complex is outstanding because it offers an «ideal» city, perfectly integrated in the «real» city and conditioned to its own needs, to a great extent. In the beginning, this challenge of converting the university complex, with a capacity for 10,000 students, into an architectonically attractive urban center met with difficulties arising from the traffic and parking problems of a city as large as the one projected; this obstacle was overcome by adequate organization of underground and overhead traffic arteries, which reserved large garden areas exclusively for pedestrians, freeing them from traffic congestion and offering the pleasant and relaxed atmosphere required. The large «campus» is sub-divided into different garden areas, connected one to the other, and in the center of each one there is a varied and complementary architecture, which breaks with the conventional monolithic style.

  7. Statistics

    International Nuclear Information System (INIS)

    2003-01-01

For the year 2002, part of the figures shown in the tables of the Energy Review are preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption; Carbon dioxide emissions from fossil fuel use; Coal consumption; Consumption of natural gas; Peat consumption; Domestic oil deliveries; Import prices of oil; Consumer prices of principal oil products; Fuel prices in heat production; Fuel prices in electricity production; Price of electricity by type of consumer; Average monthly spot prices at the Nord Pool power exchange; Total energy consumption by source and CO2 emissions; Supply and total consumption of electricity (GWh); Energy imports by country of origin in January-June 2003; Energy exports by recipient country in January-June 2003; Consumer prices of liquid fuels; Consumer prices of hard coal, natural gas and indigenous fuels; Price of natural gas by type of consumer; Price of electricity by type of consumer; Price of district heating by type of consumer; Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and Excise taxes, precautionary stock fees and oil pollution fees on energy products

  8. Statistics

    International Nuclear Information System (INIS)

    2004-01-01

For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption; Carbon dioxide emissions from fossil fuel use; Coal consumption; Consumption of natural gas; Peat consumption; Domestic oil deliveries; Import prices of oil; Consumer prices of principal oil products; Fuel prices in heat production; Fuel prices in electricity production; Price of electricity by type of consumer; Average monthly spot prices at the Nord Pool power exchange; Total energy consumption by source and CO2 emissions; Supply and total consumption of electricity (GWh); Energy imports by country of origin in January-March 2004; Energy exports by recipient country in January-March 2004; Consumer prices of liquid fuels; Consumer prices of hard coal, natural gas and indigenous fuels; Price of natural gas by type of consumer; Price of electricity by type of consumer; Price of district heating by type of consumer; Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and Excise taxes, precautionary stock fees and oil pollution fees

  9. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside back cover of the Review shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption; Changes in the volume of GNP and electricity consumption; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Consumer prices of principal oil products; Fuel prices for heat production; Fuel prices for electricity production; Carbon dioxide emissions; Total energy consumption by source and CO2 emissions; Electricity supply; Energy imports by country of origin in January-June 2000; Energy exports by recipient country in January-June 2000; Consumer prices of liquid fuels; Consumer prices of hard coal, natural gas and indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer; Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and Energy taxes and precautionary stock fees on oil products

  10. LEED conformity inside and outside. Headquarters of the Deutsche Boerse in LEED® Platinum; Aussen und innen LEED-konform. Zentrale der Deutschen Boerse in LEED® Platin

    Energy Technology Data Exchange (ETDEWEB)

    Bloedorn, Heike

    2011-07-01

Due to its sustainable building design, the new corporate headquarters of the Deutsche Boerse Group in Frankfurt (Federal Republic of Germany), built with an investment of nearly 230 million Euro, has been distinguished as the first skyscraper in Germany with the LEED Platinum certification (the highest category of the U.S. Green Building Council). The 21-story building was designed by the architects KSP Engel Juergen Architekten GmbH (Frankfurt, Federal Republic of Germany). The project was developed by Gross Partner and Grundstuecksentwicklungsgesellschaft mbH (Frankfurt, Federal Republic of Germany) and Lang and Cie. Real Estate (Frankfurt, Federal Republic of Germany). The innovative energy concept was realized by Lenz Weber Ingenieure (Frankfurt, Federal Republic of Germany) in cooperation with EB-Partner GmbH and Co. KG (Frankfurt/Nuernberg, Federal Republic of Germany) and TP Electrical Plan (Gaggenau, Federal Republic of Germany). The stringent sustainability criteria in the interior construction were met by feco wood materials in the form of movable system partition walls and sound-absorbing doors.

  11. Study of the local structure of binary surfaces by electron diffraction (XPS, LEED)

    OpenAIRE

    Gereová, Katarína

    2006-01-01

The local structure of a binary surface is studied using an ultra-thin cerium film deposited on a Pd(111) single-crystal surface. X-ray photoelectron spectroscopy and diffraction (XPS, XPD), angle-resolved UV photoemission spectroscopy (ARUPS) and low-energy electron diffraction (LEED) were used for the investigations. The LEED and X-ray-excited photoemission intensity results reveal the surface geometrical structure. As well, mapping of ultra-violet photoelectron intensities as a...

  12. Radiofrequency electromagnetic fields in the Cookridge area of Leeds

    International Nuclear Information System (INIS)

    Fuller, K.; Gulson, A.D.; Judd, P.M.; Lowe, A.J.; Shaw, J.

    2002-01-01

    On the 8 and 9 May 2002 representatives of the National Radiological Protection Board (NRPB) performed a radiofrequency electromagnetic field survey in the Cookridge area of Leeds in order to assess exposure to radio signals from transmitters mounted on a water tower/a lattice tower and a radio station tower. Guidelines on limiting exposure to radio signals have been published by NRPB and the International Commission on Non-Ionizing Radiation Protection (ICNIRP). These guidelines are designed to prevent established adverse effects on human health. During this survey, the total exposures due to all radio signals from 30 MHz to 18000 MHz (18 GHz) were measured. This frequency range was chosen as it includes mobile phone base station transmissions, which are at around 900 and 1800 MHz and super high frequency (SHF) transmissions from most of the large microwave dish antennas mounted on the towers. In addition, other major sources of radiofrequency electromagnetic fields in the environment such as broadcast radio and television transmissions are included in this range. Measurements of power density were made at eight locations in the vicinity of the transmitter sites. Comparison of the measurements with the guidelines showed that the total exposure from radio signals measured between 30 MHz and 18 GHz ranged from 0.26 millionths (0.000026%) to 190 millionths (0.019%) of the NRPB investigation level and from 1.6 millionths (0.00016%) to 1400 millionths (0.14%) of the ICNIRP reference level for exposure of the general public. All the measured exposures are therefore many times below guideline levels and are not considered hazardous. (author)

  13. Stupid statistics!

    Science.gov (United States)

    Tellinghuisen, Joel

    2008-01-01

The method of least squares is probably the most powerful data analysis tool available to scientists. Toward a fuller appreciation of that power, this work begins with an elementary review of statistics fundamentals, and then progressively increases in sophistication as the coverage is extended to the theory and practice of linear and nonlinear least squares. The results are illustrated in application to data analysis problems important in the life sciences. The review of fundamentals includes the role of sampling and its connection to probability distributions, the Central Limit Theorem, and the importance of finite variance. Linear least squares are presented using matrix notation, and the significance of the key probability distributions (Gaussian, chi-square, and t) is illustrated with Monte Carlo calculations. The meaning of correlation is discussed, including its role in the propagation of error. When the data themselves are correlated, special methods are needed for the fitting, as they are also when fitting with constraints. Nonlinear fitting gives rise to non-normal parameter distributions, but the 10% Rule of Thumb suggests that such problems will be insignificant when the parameter is sufficiently well determined. Illustrations include calibration with linear and nonlinear response functions, the dangers inherent in fitting inverted data (e.g., Lineweaver-Burk equation), an analysis of the reliability of the van't Hoff analysis, the problem of correlated data in the Guggenheim method, and the optimization of isothermal titration calorimetry procedures using the variance-covariance matrix for experiment design. The work concludes with illustrations on assessing and presenting results.
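The link the review draws between least-squares residuals and the chi-square distribution can be illustrated with a short Monte Carlo run (a hedged sketch, not code from the paper; the straight-line model, noise level, and sample sizes are all assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, sigma = 20, 2, 0.5            # data points, fitted parameters, known noise
x = np.linspace(0.0, 1.0, n)
A = np.vstack([np.ones(n), x]).T    # design matrix for the model y = a + b*x

chi2_samples = []
for _ in range(2000):
    y = 1.0 + 2.0 * x + rng.normal(0.0, sigma, n)   # synthetic data
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)    # linear least-squares fit
    resid = y - A @ coef
    chi2_samples.append(np.sum(resid**2) / sigma**2)

dof = n - p
# With correctly stated errors, the weighted residual sum follows a
# chi-square distribution with n - p = 18 degrees of freedom, so the
# Monte Carlo mean of chi2_samples should be close to 18.
```

Comparing the empirical mean and variance of `chi2_samples` against the chi-square values `dof` and `2*dof` is a quick sanity check of the error model.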

  14. Estimating the parameters of stochastic differential equations using a criterion function based on the Kolmogorov-Smirnov statistic

    OpenAIRE

    McDonald, A. David; Sandal, Leif Kristoffer

    1998-01-01

    Estimation of parameters in the drift and diffusion terms of stochastic differential equations involves simulation and generally requires substantial data sets. We examine a method that can be applied when available time series are limited to less than 20 observations per replication. We compare and contrast parameter estimation for linear and nonlinear first-order stochastic differential equations using two criterion functions: one based on a Chi-square statistic, put forward by Hurn and Lin...
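A simulation-based criterion of this kind can be sketched for an Ornstein-Uhlenbeck process: simulate paths under trial parameters and score them with a Kolmogorov-Smirnov distance to the observed series. This is an illustrative sketch only; the OU model, the numpy-based KS distance, and the pooling of simulated paths are my assumptions, not the construction used in the paper:

```python
import numpy as np

def simulate_ou(theta, mu, sigma, x0, dt, n, rng):
    """Euler-Maruyama path of dX = theta*(mu - X) dt + sigma dW."""
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = x[i-1] + theta * (mu - x[i-1]) * dt + sigma * np.sqrt(dt) * rng.normal()
    return x

def ks_distance(a, b):
    """Two-sample Kolmogorov-Smirnov distance between empirical CDFs."""
    grid = np.sort(np.concatenate([a, b]))
    cdf_a = np.searchsorted(np.sort(a), grid, side='right') / len(a)
    cdf_b = np.searchsorted(np.sort(b), grid, side='right') / len(b)
    return np.max(np.abs(cdf_a - cdf_b))

def criterion(params, data, dt, rng, n_rep=50):
    """KS-based criterion: distance between the observed series and
    observations pooled from n_rep simulated paths at the trial parameters."""
    theta, mu, sigma = params
    sims = np.concatenate([simulate_ou(theta, mu, sigma, data[0], dt, len(data), rng)
                           for _ in range(n_rep)])
    return ks_distance(data, sims)
```

In use, the criterion would be minimized over `params` by a grid or general-purpose optimizer; with fewer than 20 observations per replication, pooling many simulated paths keeps the comparison distribution smooth.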

  15. Quality assurance and statistical control

    DEFF Research Database (Denmark)

    Heydorn, K.

    1991-01-01

    In scientific research laboratories it is rarely possible to use quality assurance schemes, developed for large-scale analysis. Instead methods have been developed to control the quality of modest numbers of analytical results by relying on statistical control: Analysis of precision serves...... to detect analytical errors by comparing the a priori precision of the analytical results with the actual variability observed among replicates or duplicates. The method relies on the chi-square distribution to detect excess variability and is quite sensitive even for 5-10 results. Interference control...... serves to detect analytical bias by comparing results obtained by two different analytical methods, each relying on a different detection principle and therefore exhibiting different influence from matrix elements; only 5-10 sets of results are required to establish whether a regression line passes...
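The precision check described above, comparing the observed scatter among replicates with the a priori precision via the chi-square distribution, might look like this (a hypothetical sketch; the replicate values and the tabulated 95% critical value for 9 degrees of freedom are illustrative assumptions, not data from the paper):

```python
import numpy as np

def excess_variability_stat(results, sigma_apriori):
    """Chi-square statistic comparing the observed scatter of replicate
    results with the claimed a priori precision sigma_apriori."""
    r = np.asarray(results, dtype=float)
    return np.sum((r - r.mean()) ** 2) / sigma_apriori ** 2

# Hypothetical replicate determinations with a claimed precision of 0.05
replicates = [1.02, 0.98, 1.05, 0.97, 1.01, 1.00, 0.95, 1.04, 0.99, 1.03]
t = excess_variability_stat(replicates, 0.05)

# Under the null hypothesis, t follows chi-square with 9 degrees of freedom;
# the tabulated 95% critical value is 16.92.
print(t > 16.92)  # False: no evidence of excess variability in these data
```

With only ten replicates the test already has useful power, which matches the abstract's claim that 5-10 results suffice.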

  16. BrightStat.com: free statistics online.

    Science.gov (United States)

    Stricker, Daniel

    2008-10-01

Powerful software for statistical analysis is expensive. Here I present BrightStat, a statistical software package running on the Internet that is free of charge. BrightStat's goals and its main capabilities and functionalities are outlined. Three different sample runs, a Friedman test, a chi-square test, and a stepwise multiple regression, are presented. The results obtained by BrightStat are compared with results computed by SPSS, one of the global leaders in statistical software, and by VassarStats, a collection of scripts for data analysis running on the Internet. Elementary statistics is an inherent part of academic education, and BrightStat is an alternative to commercial products.

  17. Statistical insights from Romanian data on higher education

    Directory of Open Access Journals (Sweden)

    Andreea Ardelean

    2015-09-01

Full Text Available This paper aims to use cluster analysis to make a comparative analysis of Romanian higher education at the regional level. The evolution of higher education in the post-communist period is also presented using quantitative traits. Although the focus is on university education, references to education as a whole are included for comparison. Then, to highlight the importance of higher education, the chi-square test is applied to check whether there is an association between statistical regions and the education level of the unemployed.
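The association test mentioned above is a standard chi-square test of independence on a contingency table; a sketch with hypothetical counts (not the Romanian data from the paper) follows:

```python
import numpy as np

# Hypothetical counts of unemployed persons by statistical region (rows)
# and education level (columns); invented for illustration.
observed = np.array([[120,  80,  30],
                     [ 90, 100,  40],
                     [ 60,  70,  50]])

row = observed.sum(axis=1, keepdims=True)
col = observed.sum(axis=0, keepdims=True)
expected = row @ col / observed.sum()          # counts expected under independence
chi2 = ((observed - expected) ** 2 / expected).sum()
dof = (observed.shape[0] - 1) * (observed.shape[1] - 1)

# chi2 is about 23.56 with dof = 4; the 5% critical value is 9.49, so for
# these made-up counts region and education level would be associated.
```

The same computation is available as `scipy.stats.chi2_contingency` when a p-value is needed directly.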

  18. R for statistics

    CERN Document Server

    Cornillon, Pierre-Andre; Husson, Francois; Jegou, Nicolas; Josse, Julie; Kloareg, Maela; Matzner-Lober, Eric; Rouviere, Laurent

    2012-01-01

An Overview of R; Main Concepts; Installing R; Work Session; Help; R Objects; Functions; Packages; Exercises; Preparing Data; Reading Data from File; Exporting Results; Manipulating Variables; Manipulating Individuals; Concatenating Data Tables; Cross-Tabulation; Exercises; R Graphics; Conventional Graphical Functions; Graphical Functions with lattice; Exercises; Making Programs with R; Control Flows; Predefined Functions; Creating a Function; Exercises; Statistical Methods; Introduction to the Statistical Methods; A Quick Start with R; Installing R; Opening and Closing R; The Command Prompt; Attribution, Objects, and Function; Selection; Other Rcmdr Package; Importing (or Inputting) Data; Graphs; Statistical Analysis; Hypothesis Test; Confidence Intervals for a Mean; Chi-Square Test of Independence; Comparison of Two Means; Testing Conformity of a Proportion; Comparing Several Proportions; The Power of a Test; Regression; Simple Linear Regression; Multiple Linear Regression; Partial Least Squares (PLS) Regression; Analysis of Variance and Covariance; One-Way Analysis of Variance; Multi-Way Analysis of Varian...

  19. Integrating Building Information Modeling and Green Building Certification: The BIM-LEED Application Model Development

    Science.gov (United States)

    Wu, Wei

    2010-01-01

    Building information modeling (BIM) and green building are currently two major trends in the architecture, engineering and construction (AEC) industry. This research recognizes the market demand for better solutions to achieve green building certification such as LEED in the United States. It proposes a new strategy based on the integration of BIM…

  20. Regional Variations of Credits Obtained by LEED 2009 Certified Green Buildings—A Country Level Analysis

    Directory of Open Access Journals (Sweden)

    Peng Wu

    2017-12-01

Full Text Available Leadership in Energy and Environmental Design (LEED) is one of the most widely recognized green building rating systems. With more than 20% of the projects certified in non-United States (US) countries, LEED's global impact has been increasing and it is critically important for developers and regulatory authorities to understand LEED's performance at the country level to facilitate global implementation. This study therefore aims to investigate the credit achievement pattern of LEED 2009, which is one of the well-developed versions of LEED, by using 4021 certified projects in the US, China, Turkey, and Brazil. The results show that significant differences can be identified on most rating categories, including sustainable sites, water efficiency, energy and atmosphere, indoor environmental quality, and innovation in design. Using a post hoc analysis, country-specific credit allocation patterns are also identified to help developers to understand existing country-specific green building practices. In addition, it is also found that there is unbalanced achievement of regional priority credits. The study offers a useful reference and benchmark for international developers and contractors to understand the regional variations of LEED 2009 and for regulatory authorities, such as the U.S. Green Building Council, to improve the rating system, especially on designing regional priority credits.

  1. Evaluation of the Work-Place Cooperative Project in Geography Degrees at the University of Leeds.

    Science.gov (United States)

    Hogg, James

    1998-01-01

    Describes the context and objective of a Work-Place Cooperative Project (WPCP) established in the School of Geography at the University of Leeds in 1995. The project presents students with business, commerce, industry, and environmental research issues that have geographical dimensions. Includes a number of examples from the WPCP. (MJP)

  2. A Statistical Toolkit for Data Analysis

    International Nuclear Information System (INIS)

    Donadio, S.; Guatelli, S.; Mascialino, B.; Pfeiffer, A.; Pia, M.G.; Ribon, A.; Viarengo, P.

    2006-01-01

The present project aims to develop an open-source and object-oriented software Toolkit for statistical data analysis. Its statistical testing component contains a variety of Goodness-of-Fit tests, from Chi-squared to Kolmogorov-Smirnov, to lesser-known but generally much more powerful tests such as Anderson-Darling, Goodman, Fisz-Cramer-von Mises, Kuiper and Tiku. Thanks to the component-based design and the usage of standard abstract interfaces for data analysis, this tool can be used by other data analysis systems or integrated in experimental software frameworks. This Toolkit has been released and is downloadable from the web. In this paper we describe the statistical details of the algorithms, the computational features of the Toolkit, and the code validation
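For intuition about how such goodness-of-fit statistics differ, the two most familiar ones, Pearson chi-squared and Kolmogorov-Smirnov, can be computed side by side with plain numpy. This is a generic sketch against a Uniform(0,1) hypothesis and is unrelated to the Toolkit's own implementation:

```python
import numpy as np

def ks_statistic_uniform(sample):
    """One-sample Kolmogorov-Smirnov distance to the Uniform(0,1) CDF."""
    x = np.sort(sample)
    n = len(x)
    ecdf_hi = np.arange(1, n + 1) / n   # ECDF just after each data point
    ecdf_lo = np.arange(0, n) / n       # ECDF just before each data point
    return max(np.max(ecdf_hi - x), np.max(x - ecdf_lo))

def chi2_statistic_uniform(sample, bins=10):
    """Pearson chi-squared statistic against equiprobable Uniform(0,1) bins."""
    obs, _ = np.histogram(sample, bins=bins, range=(0.0, 1.0))
    exp = len(sample) / bins
    return np.sum((obs - exp) ** 2 / exp)

rng = np.random.default_rng(42)
u = rng.random(500)                     # data actually drawn from Uniform(0,1)
d, x2 = ks_statistic_uniform(u), chi2_statistic_uniform(u)
```

The KS statistic uses every observation unbinned, while chi-squared throws away within-bin information, which is one reason the abstract calls some of the alternatives "much more powerful".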

  3. Academic Training: Practical Statistics for Particle Physicists

    CERN Multimedia

    2006-01-01

2006-2007 ACADEMIC TRAINING PROGRAMME LECTURE SERIES 9, 10, 11, 12, 13 October from 11:00 to 12:00 - Main Auditorium, bldg. 500, TH Auditorium, bldg 4, 3rd floor, on 13 October Practical Statistics for Particle Physicists L. LYONS, University of Oxford, GB Lecture 1: Learning to love the error matrix. Introductory remarks; conditional probability; statistical and systematic errors; combining results; binomial, Poisson and 1-D Gaussian; 2-D Gaussian and the error matrix; understanding the covariance; using the error matrix; estimating the error matrix; combining correlated measurements. Lecture 2: Parameter determination by likelihood: do's and don'ts. Introduction to likelihood; error estimate; simple examples: (1) Breit-Wigner, (2) lifetime; binned and unbinned likelihood; several parameters; extended maximum likelihood. Common misapprehensions: normalisation; delta(lnL) = 1/2 rule and coverage; integrating the likelihood; unbinned L_max as goodness of fit; Punzi effect. Lecture 3: Chi-squared and hypothesis test...

  4. The power of joint application of LEED and DFT in quantitative surface structure determination

    International Nuclear Information System (INIS)

    Heinz, K; Hammer, L; Mueller, S

    2008-01-01

It is demonstrated for several cases that the joint application of low-energy electron diffraction (LEED) and structural calculations using density functional theory (DFT) can retrieve the correct surface structure even when single application of either method fails. On the experimental side (LEED), the failure can be due to the simultaneous presence of weak and very strong scatterers, or to an insufficient database leaving different structures with the same quality of fit between experimental data and calculated model intensities. On the theory side (DFT), it can be difficult to predict the coverage of an adsorbate, or two different structures may have almost the same total energy while only one of them is assumed in experiment due to formation kinetics. It is demonstrated how, in the different cases, the joint application of both methods, which yield about the same structural precision, offers a way out of the dilemma

  5. Using statistical process control for monitoring the prevalence of hospital-acquired pressure ulcers.

    Science.gov (United States)

    Kottner, Jan; Halfens, Ruud

    2010-05-01

    Institutionally acquired pressure ulcers are used as outcome indicators to assess the quality of pressure ulcer prevention programs. Determining whether quality improvement projects that aim to decrease the proportions of institutionally acquired pressure ulcers lead to real changes in clinical practice depends on the measurement method and statistical analysis used. To examine whether nosocomial pressure ulcer prevalence rates in hospitals in the Netherlands changed, a secondary data analysis using different statistical approaches was conducted of annual (1998-2008) nationwide nursing-sensitive health problem prevalence studies in the Netherlands. Institutions that participated regularly in all survey years were identified. Risk-adjusted nosocomial pressure ulcers prevalence rates, grade 2 to 4 (European Pressure Ulcer Advisory Panel system) were calculated per year and hospital. Descriptive statistics, chi-square trend tests, and P charts based on statistical process control (SPC) were applied and compared. Six of the 905 healthcare institutions participated in every survey year and 11,444 patients in these six hospitals were identified as being at risk for pressure ulcers. Prevalence rates per year ranged from 0.05 to 0.22. Chi-square trend tests revealed statistically significant downward trends in four hospitals but based on SPC methods, prevalence rates of five hospitals varied by chance only. Results of chi-square trend tests and SPC methods were not comparable, making it impossible to decide which approach is more appropriate. P charts provide more valuable information than single P values and are more helpful for monitoring institutional performance. Empirical evidence about the decrease of nosocomial pressure ulcer prevalence rates in the Netherlands is contradictory and limited.
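The P-chart comparison described above can be sketched as follows; the annual counts are invented for illustration and the three-sigma limits follow the standard SPC formula for proportion charts (this is not the authors' dataset):

```python
import numpy as np

def p_chart_limits(p_bar, n):
    """Three-sigma control limits for a proportion (P) chart with subgroup size n."""
    se = np.sqrt(p_bar * (1.0 - p_bar) / n)
    return max(0.0, p_bar - 3.0 * se), min(1.0, p_bar + 3.0 * se)

# Hypothetical annual nosocomial pressure ulcer counts and patients at risk
cases   = np.array([22, 18, 25, 20, 17, 24, 19, 21, 23, 16, 20])
at_risk = np.array([160, 150, 170, 155, 140, 165, 150, 158, 162, 145, 152])

p_bar = cases.sum() / at_risk.sum()   # overall proportion across all years
signals = 0
for c, n in zip(cases, at_risk):
    lcl, ucl = p_chart_limits(p_bar, n)
    if not (lcl <= c / n <= ucl):
        signals += 1                  # a point outside the limits flags special-cause variation
print(signals)  # 0: all years vary by chance around the common proportion
```

Unlike a chi-square trend test, which only asks whether rates changed overall, the chart localizes which individual years, if any, depart from common-cause variation.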

  6. Assessment of Energy Credits in LEED-Certified Buildings Based on Certification Levels and Project Ownership

    Directory of Open Access Journals (Sweden)

    Asli Pelin Gurgun

    2018-02-01

    Full Text Available Compared to other categories, the Energy and Atmosphere category contributes the most to the maximum obtainable points in the Leadership in Energy and Environmental Design (LEED certification system. The objective of the study was to identify the extent to which project teams take advantage of the credits in the Energy and Atmosphere category of LEED. This study analyzes the performance of practitioners in achieving points in the Energy and Atmosphere credits of LEED-New Construction (NC 2009 for 1500 buildings that received LEED certification in the US. For a better understanding of the credit patterns, the differences in the performance of practitioners are investigated relative to certification levels and project ownership. Achievement in credits is calculated in terms of percent of maximum points (PMP, since the maximum achievable points differ for each credit. Practitioners’ achievements in the credits were ranked as follows: (1 enhanced commissioning, (2 optimized energy performance, (3 enhanced refrigerant management, (4 green power, (5 measurement and verification, and (6 on-site renewable energy. The largest achievement differences were observed in the on-site renewable energy credit. Concerning building ownership, investors were found to optimize mostly energy efficiency and on-site renewable energy, but to mostly skip enhanced refrigerant management. Performance in the measurement and verification credit was similar for all owner types, whereas investors performed differently from corporations, and government agencies in the enhanced commissioning credit. Practitioners who recognize these priorities and differences are expected to be better positioned to make sustainability-related decisions in building design and construction.

  7. Indoor environmental quality differences between office types in LEED-certified buildings in the US

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Young S. [School of Planning, Design, and Construction, Michigan State University, East Lansing, MI 48823 (United States); Guerin, Denise A. [College of Design, University of Minnesota, Twin Cities, MN 55108 (United States)

    2010-05-15

The study compared IAQ, thermal quality, and lighting quality between 5 different office types in LEED-certified buildings in relation to employees' environmental satisfaction and their job performance. The aim was to provide workplaces where workers in each specific office environment could be given appropriate office settings regarding these IEQ criteria when organizations comply with LEED standards. The five office types were private enclosed, private shared, open-plan with high cubicles over 5', open-plan with low cubicles lower than 5', and open-plan with no partitions (bullpen). The study found that IAQ enhanced workers' job performance in enclosed private offices more than in both high-cubicle and low-cubicle offices. All four other office types had higher satisfaction with the amount of light and the visual comfort of light, as well as greater enhancement of job performance due to lighting quality, than high cubicles. There was no difference in thermal quality between the five office types. IAQ and lighting quality did not differ between enclosed private, enclosed shared, and bullpen office types either. The study findings suggest careful workplace design, considering the height of partitions in LEED-certified buildings, to improve employees' environmental satisfaction and job performance. (author)

  8. The association between the geography of fast food outlets and childhood obesity rates in Leeds, UK.

    Science.gov (United States)

    Fraser, Lorna K; Edwards, Kimberley L

    2010-11-01

To analyse the association between childhood overweight and obesity and the density and proximity of fast food outlets in relation to the child's residential postcode. This was an observational study using individual-level height/weight data and geographic information systems methodology. Leeds in West Yorkshire, UK; this area consists of 476 lower super-output areas. Children aged 3-14 years who lived within the Leeds metropolitan boundaries (n=33,594). The number of fast food outlets per area and the distance to the nearest fast food outlet from the child's home address. The weight status of the child: overweight, obese or neither. 27.1% of the children were overweight or obese, with 12.6% classified as obese. There was a significant positive correlation between the number of fast food outlets and higher deprivation. A higher density of fast food outlets was significantly associated (p=0.02) with the child being obese (or overweight/obese) in the generalised estimating equation model, which also included sex, age and deprivation. No significant association between distance to the nearest fast food outlet and overweight or obese status was found. There is a positive relationship between the density of fast food outlets per area and the obesity status of children in Leeds. There is also a significant association between fast food outlet density and areas of higher deprivation. Copyright © 2010 Elsevier Ltd. All rights reserved.

  9. Investigation of reordered (001) Au surfaces by positive ion channeling spectroscopy, LEED and AES

    International Nuclear Information System (INIS)

    Appleton, B.R.; Noggle, T.S.; Miller, J.W.; Schow, O.E. III; Zehner, D.M.; Jenkins, L.H.; Barrett, J.H.

    1974-01-01

    As a consequence of the channeling phenomenon of positive ions in single crystals, the yield of ions Rutherford scattered from an oriented single crystal surface is dependent on the density of surface atoms exposed to the incident ion beam. Thus, the positive ion channeling spectroscopy (PICS) technique should provide a useful tool for studying reordered surfaces. This possibility was explored by examining the surfaces of epitaxially grown thin Au single crystals with the combined techniques of LEED-AES and PICS. The LEED and AES investigations showed that when the (001) surface was sputter cleaned in ultra-high vacuum, the normal (1 x 1) symmetry of the (001) surfaces reordered into a structure which gave a complex (5 x 20) LEED pattern. The yield and energy distributions of 1 MeV He ions scattered from the Au surfaces were used to determine the number of effective monolayers contributing to the normal and reordered surfaces. These combined measurements were used to characterize the nature of the reordered surface. The general applicability of the PICS technique for investigations of surface and near surface regions is discussed

  10. Walkable new urban LEED_Neighborhood-Development (LEED-ND) community design and children's physical activity: selection, environmental, or catalyst effects?

    Directory of Open Access Journals (Sweden)

    Stevens, Robert B

    2011-12-01

Full Text Available Abstract Background Interest is growing in physical activity-friendly community designs, but few tests exist of communities explicitly designed to be walkable. We test whether students living in a new urbanist community that is also a pilot LEED_ND (Leadership in Energy and Environmental Design-Neighborhood Development) community have greater accelerometer-measured moderate-to-vigorous physical activity (MVPA) across particular time periods compared to students from other communities. We test various time/place periods to see if the data best conform to one of three explanations for MVPA. Environmental effects suggest that MVPA occurs when individuals are exposed to activity-friendly settings; selection effects suggest that walkable community residents prefer MVPA, which leads to both their choice of a walkable community and their high levels of MVPA; catalyst effects occur when walking to school creates more MVPA, beyond the school commute, on schooldays but not weekends. Methods Fifth graders (n = 187) were sampled from two schools representing three communities: (1) a walkable community, Daybreak, designed with new urbanist and LEED-ND pilot design standards; (2) a mixed community (where students lived in a less walkable community but attended the walkable school so that part of the route to school was walkable), and (3) a less walkable community. Selection threats were addressed through controlling for parental preferences for their child to walk to school as well as comparing in-school MVPA for the walkable and mixed groups. Results Minutes of MVPA were tested with 3 × 2 (Community by Gender) analyses of covariance (ANCOVAs). Community walkability related to more MVPA during the half hour before and after school and, among boys only, more MVPA after school. Boys were more active than girls, except during the half hour after school. Students from the mixed and walkable communities, who attended the same school, had similar in-school MVPA levels, and

  11. Walkable new urban LEED_Neighborhood-Development (LEED-ND) community design and children's physical activity: selection, environmental, or catalyst effects?

    Science.gov (United States)

    2011-01-01

    Background Interest is growing in physical activity-friendly community designs, but few tests exist of communities explicitly designed to be walkable. We test whether students living in a new urbanist community that is also a pilot LEED_ND (Leadership in Energy and Environmental Design-Neighborhood Development) community have greater accelerometer-measured moderate-to-vigorous physical activity (MVPA) across particular time periods compared to students from other communities. We test various time/place periods to see if the data best conform to one of three explanations for MVPA. Environmental effects suggest that MVPA occurs when individuals are exposed to activity-friendly settings; selection effects suggest that walkable community residents prefer MVPA, which leads to both their choice of a walkable community and their high levels of MVPA; catalyst effects occur when walking to school creates more MVPA, beyond the school commute, on schooldays but not weekends. Methods Fifth graders (n = 187) were sampled from two schools representing three communities: (1) a walkable community, Daybreak, designed with new urbanist and LEED-ND pilot design standards; (2) a mixed community (where students lived in a less walkable community but attended the walkable school so that part of the route to school was walkable), and (3) a less walkable community. Selection threats were addressed through controlling for parental preferences for their child to walk to school as well as comparing in-school MVPA for the walkable and mixed groups. Results Minutes of MVPA were tested with 3 × 2 (Community by Gender) analyses of covariance (ANCOVAs). Community walkability related to more MVPA during the half hour before and after school and, among boys only, more MVPA after school. Boys were more active than girls, except during the half hour after school. Students from the mixed and walkable communities--who attended the same school--had similar in-school MVPA levels, and community groups

  12. Data analysis using the Gnu R system for statistical computation

    Energy Technology Data Exchange (ETDEWEB)

    Simone, James; /Fermilab

    2011-07-01

    R is a language system for statistical computation. It is widely used in statistics, bioinformatics, machine learning, data mining, quantitative finance, and the analysis of clinical drug trials. Among the advantages of R are: it has become the standard language for developing statistical techniques, it is being actively developed by a large and growing global user community, it is open source software, it is highly portable (Linux, OS-X and Windows), it has a built-in documentation system, it produces high quality graphics and it is easily extensible with over four thousand extension library packages available covering statistics and applications. This report gives a very brief introduction to R with some examples using lattice QCD simulation results. It then discusses the development of R packages designed for chi-square minimization fits for lattice n-pt correlation functions.
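The report's fitting packages are written in R; as a language-neutral illustration, the correlated chi-square minimization it describes can be sketched in Python. The single-exponential correlator data, 2% errors, and diagonal covariance below are invented simplifications, not the report's setup:

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic 2-pt correlator data: C(t) = A * exp(-m * t) with 2% noise.
rng = np.random.default_rng(0)
t = np.arange(1, 11)
A_true, m_true = 1.5, 0.4
y_true = A_true * np.exp(-m_true * t)
cov = np.diag((0.02 * y_true) ** 2)          # diagonal covariance for simplicity
y = y_true + rng.multivariate_normal(np.zeros(len(t)), cov)
cov_inv = np.linalg.inv(cov)

def chi2(params):
    """Correlated chi-square: r^T C^-1 r for residual vector r."""
    A, m = params
    r = y - A * np.exp(-m * t)
    return r @ cov_inv @ r

fit = minimize(chi2, x0=[1.0, 0.5], method="Nelder-Mead")
dof = len(t) - len(fit.x)                    # chi2/dof near 1 indicates a good fit
```

In a real lattice analysis the covariance matrix is estimated from the ensemble of gauge configurations (often with jackknife or bootstrap resampling) and is generally not diagonal.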

  13. The Leeds Evaluation of Efficacy of Detoxification Study (LEEDS) project: An open-label pragmatic randomised control trial comparing the efficacy of differing therapeutic agents for primary care detoxification from either street heroin or methadone [ISRCTN07752728]

    Directory of Open Access Journals (Sweden)

    Sheard Laura

    2004-04-01

    Full Text Available Abstract Background Heroin is a synthetic opioid with an extensive illicit market leading to large numbers of people becoming addicted. Heroin users often present to community treatment services requesting detoxification, and in the UK various agents are used to control symptoms of withdrawal. Dissatisfaction with methadone detoxification has led to the use of clonidine, lofexidine, buprenorphine and dihydrocodeine; however, there remains limited evaluative research. In Leeds, a city of 700,000 people in the North of England, dihydrocodeine is the detoxification agent of choice. Sublingual buprenorphine, however, is being introduced. The comparative value of these two drugs for helping people successfully and comfortably withdraw from heroin has never been assessed in a randomised trial. Additionally, there is a paucity of research evaluating interventions among drug users in the primary care setting. This study seeks to address this by randomising drug users presenting in primary care to receive either dihydrocodeine or buprenorphine. Methods/design The Leeds Evaluation of Efficacy of Detoxification Study (LEEDS) project is a pragmatic randomised trial which will compare the open use of buprenorphine with dihydrocodeine for illicit opiate detoxification, in the UK primary care setting. The LEEDS project will involve consenting adults and will be run in specialist general practice surgeries throughout Leeds. The primary outcome will be the results of a urine opiate screening at the end of the detoxification regimen. Adverse effects and limited data to three and six months will be acquired.

  14. The Use of SPSS (Statistical Package for the Social Sciences) in Political Science: a brief introduction

    Directory of Open Access Journals (Sweden)

    Mauro Meirelles

    2014-06-01

    Full Text Available This paper deals with some basic concepts of statistics, a few analysis methods and their usefulness for researchers from various fields. In particular, we focus our argument on how SPSS can be used with a view to carrying out a large number of inferences from a certain set of quantitative data. Above all, through these basic concepts and the handling of the software's routines, we seek to equip, at least minimally, those who wish to venture into quantitative analysis. In particular, we are concerned with the use of Pearson's Correlation Test, the Chi-Square Test and Simple and Multiple Regression Analysis.
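The Pearson correlation and chi-square procedures that the paper walks through in SPSS have direct open-source counterparts; a minimal sketch with SciPy, using invented illustrative data:

```python
import numpy as np
from scipy import stats

# Pearson correlation on paired quantitative measurements.
x = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
y = np.array([1.9, 4.2, 5.8, 8.3, 9.9])
r, p_r = stats.pearsonr(x, y)

# Chi-square test of independence on a 2x2 contingency table
# (e.g. party preference by region; the counts are illustrative).
table = np.array([[30, 10],
                  [20, 40]])
chi2_stat, p_c, dof, expected = stats.chi2_contingency(table)
```

For the simple and multiple regression the paper mentions, `scipy.stats.linregress` and the `statsmodels` OLS interface play the corresponding role.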

  15. Mining Association Rules Between Credits in the Leadership in Energy and Environmental Design for New Construction (LEED-NC) Green Building Assessment System

    National Research Council Canada - National Science Library

    Thomas, Benjamin J

    2008-01-01

    .... Taking this vision into account, the individual credits that comprise LEED are designed to reward design teams for employing sustainable design strategies that reduce the total environmental impact...

  16. Statistical analysis of Thematic Mapper Simulator data for the geobotanical discrimination of rock types in southwest Oregon

    Science.gov (United States)

    Morrissey, L. A.; Weinstock, K. J.; Mouat, D. A.; Card, D. H.

    1984-01-01

    An evaluation of Thematic Mapper Simulator (TMS) data for the geobotanical discrimination of rock types based on vegetative cover characteristics is addressed in this research. A methodology for accomplishing this evaluation utilizing univariate and multivariate techniques is presented. TMS data acquired with a Daedalus DEI-1260 multispectral scanner were integrated with vegetation and geologic information for subsequent statistical analyses, which included a chi-square test, an analysis of variance, stepwise discriminant analysis, and Duncan's multiple range test. Results indicate that ultramafic rock types are spectrally separable from nonultramafics based on vegetative cover through the use of statistical analyses.

  17. Darning, doylies and dancing: the work of the Leeds Association of Girls' Clubs (1904-1913).

    Science.gov (United States)

    Jones, Helen M F

    2011-01-01

    The Leeds Association of Girls' Clubs (LAGC) was set up by a group of women, including Hilda Hargrove, Dr Lucy Buckley and Mary and Margaret Harvey, to promote collaboration between the city's girls' clubs. The organisation epitomised women working in partnership whilst reflecting their differing philanthropic and political interests. However, LAGC's collaborative approach resulted in a liberal consensus which downplayed the significance of girls' working conditions. Throughout the decade LAGC's focus was its annual competitions. These featured utilitarian and decorative handicrafts (darning and doylies) enshrining both frugality and aspiration, alongside dance and drill which channelled girls' vigour. Nevertheless, LAGC's resilience resulted in an organisation which is still in existence.

  18. Learning Word Embeddings with Chi-Square Weights for Healthcare Tweet Classification

    Directory of Open Access Journals (Sweden)

    Sicong Kuang

    2017-08-01

    Full Text Available Twitter is a popular source for the monitoring of healthcare information and public disease. However, there exists much noise in the tweets. Even though appropriate keywords appear in the tweets, they do not guarantee the identification of a truly health-related tweet. Thus, the traditional keyword-based classification task is largely ineffective. Algorithms for word embeddings have proved to be useful in many natural language processing (NLP) tasks. We introduce two algorithms based on an existing word embedding learning algorithm: the continuous bag-of-words model (CBOW). We apply the proposed algorithms to the task of recognizing healthcare-related tweets. In the CBOW model, the vector representation of words is learned from their contexts. To simplify the computation, the context is represented by an average of all words inside the context window. However, not all words in the context window contribute equally to the prediction of the target word. Greedily incorporating all the words in the context window will largely limit the contribution of the useful semantic words and bring noisy or irrelevant words into the learning process. Existing word embedding algorithms that learn a weighted CBOW model base their weights on pre-defined syntactic rules and ignore the task for which the embedding is learned. We propose learning weights based on the words’ relative importance in the classification task. Our intuition is that such learned weights place more emphasis on words that have comparatively more to contribute to the later task. We evaluate the embeddings learned from our algorithms on two healthcare-related datasets. The experimental results demonstrate that embeddings learned from the proposed algorithms outperform existing techniques by a relative accuracy improvement of over 9%.
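The core idea, weighting context words by their chi-square association with the class label, can be sketched without any NLP library. The toy tweets, labels, and the presence/absence contingency construction below are invented for illustration, not the paper's method in full:

```python
import numpy as np

# Toy corpus: label 1 = health-related tweet, 0 = not (invented examples).
docs = ["flu shot clinic open today",
        "got my flu vaccine",
        "great game last night",
        "watching the game tonight"]
labels = np.array([1, 1, 0, 0])
vocab = sorted({w for d in docs for w in d.split()})

def chi_square_weight(word):
    """2x2 chi-square of word presence/absence vs. class label."""
    present = np.array([word in d.split() for d in docs])
    obs = np.array([[(present & (labels == c)).sum(),
                     (~present & (labels == c)).sum()] for c in (0, 1)])
    row = obs.sum(axis=1, keepdims=True)
    col = obs.sum(axis=0, keepdims=True)
    exp = row * col / obs.sum()
    return ((obs - exp) ** 2 / exp).sum()

weights = {w: chi_square_weight(w) for w in vocab}
# Class-discriminating words ("flu", "game") receive the largest weights.
top = sorted(weights, key=weights.get, reverse=True)[:2]
```

In the paper's setting such weights would then rescale each context word's contribution inside the CBOW average, rather than the uniform weighting of the standard model.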

  19. Computation of the Percentage Points of the Chi-Square Distribution

    Science.gov (United States)

    1977-04-01


  20. Results from the Cryogenic Dark Matter Search Using a Chi Squared Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sander, Joel [UC, Santa Barbara

    2007-12-01

    Most of the mass-energy density of the universe remains undetected and is only understood through its effects on visible, baryonic matter. The visible, baryonic matter accounts for only about half of a percent of the universe's total mass-energy budget, while the remainder of the mass-energy of the universe remains dark or undetected. About a quarter of the dark mass-energy density of the universe is comprised of massive particles that do not interact via the strong or electromagnetic forces. If these particles interact via the weak force, they are termed weakly interacting massive particles, or WIMPs, and their interactions with baryonic matter could be detectable. The CDMS II experiment attempts to detect WIMP interactions in the Soudan Underground Laboratory using germanium and silicon detectors. A WIMP can interact with detector nuclei, causing the nuclei to recoil. A nuclear recoil is distinguished from background electron recoils by comparing the deposited ionization and phonon energies. Electron recoils occurring near detector surfaces are more difficult to reject. This thesis describes the results of a χ2 analysis designed to reject events occurring near detector surfaces. Because no WIMP signal was observed, separate limits using the germanium and silicon detectors are set on the WIMP cross section under standard astrophysical assumptions.

  1. A study of the effects of computer animation on college students’ learning of Leadership in Energy and Environmental Design - LEED

    Directory of Open Access Journals (Sweden)

    Razieh Nilforooshan

    2013-10-01

    Full Text Available This paper presents ongoing research aimed at investigating the efficacy of computer animations in improving college students’ learning of building sustainability concepts and practices. The use of animations in educational contexts is not new; however, scientific evidence that supports their effectiveness as educational materials is still limited. This paper reports an experiment that explored the impact of an educational digital animation, called “LEED-ERS”, on college students’ learning of the Leadership in Energy and Environmental Design (LEED) rating system. Specifically, the animation focused on the LEED category of Sustainable Sites. Results of a study with 68 students show that viewing the animation led to an increase in subjects’ declarative knowledge by 15%. Compared to traditional learning methods (e.g. reading assignments with static images), viewing the animation led to significantly higher declarative knowledge gains.

  2. LEED crystallography studies of the structure of clean and adsorbate-covered Ir, Pt and Rh crystal surfaces

    International Nuclear Information System (INIS)

    Koestner, R.J.

    1982-08-01

    There have only been a few Low Energy Electron Diffraction (LEED) intensity analyses carried out to determine the structure of molecules adsorbed on metal surfaces; most surface crystallography studies have concentrated on the structure of clean unreconstructed or atomic adsorbate-covered transition metal faces. The few molecular adsorption systems already investigated by dynamical LEED are CO on Ni(100), Cu(100) and Pd(100), as well as C2H2 and C2H4 adsorbed on Pt(111). The emphasis of this thesis research has been to extend the applicability of LEED crystallography to the more complicated unit cells found in molecular overlayers on transition metals or in the reconstructed surfaces of clean transition metals.

  3. LEED crystallography studies of the structure of clean and adsorbate-covered Ir, Pt and Rh crystal surfaces

    Energy Technology Data Exchange (ETDEWEB)

    Koestner, R.J.

    1982-08-01

    There have only been a few Low Energy Electron Diffraction (LEED) intensity analyses carried out to determine the structure of molecules adsorbed on metal surfaces; most surface crystallography studies have concentrated on the structure of clean unreconstructed or atomic adsorbate-covered transition metal faces. The few molecular adsorption systems already investigated by dynamical LEED are CO on Ni(100), Cu(100) and Pd(100), as well as C2H2 and C2H4 adsorbed on Pt(111). The emphasis of this thesis research has been to extend the applicability of LEED crystallography to the more complicated unit cells found in molecular overlayers on transition metals or in the reconstructed surfaces of clean transition metals.

  4. Characterization of Si(112) and In/Si(112) studied by SPA-LEED

    Energy Technology Data Exchange (ETDEWEB)

    Hoecker, Jan; Speckmann, Moritz; Schmidt, Thomas; Falta, Jens [Institute of Solid State Physics, University of Bremen, 28359 Bremen (Germany)

    2010-07-01

    High index surfaces are of strong interest in today's research because of the possibility to grow low-dimensional structures. It has, for instance, already been shown that the adsorption of Ga can induce the formation of 1D metal chains on Si(112) (cf. Snijders et al., PRB 72, 2005). In this work we investigated the clean Si(112) surface and the adsorption of In on Si(112) to establish an analogy to Ga/Si(112) using spot profile analyzing low energy electron diffraction (SPA-LEED). By means of reciprocal space mapping we determined the bare Si(112) surface to be decomposed into alternating (5 5 12) and (111) facets in [1 anti 10] direction with (2 x 1) and (7 x 7) reconstruction, respectively (cf. Baski et al., Surf. Sci. 392, 1997). With SPA-LEED we were able to observe the decreasing intensity of the facet spots in-situ while depositing In on Si(112) and thus reveal the smoothening of the surface due to the deposition of In. At saturation coverage we found a (3.x x 1) reconstruction, where x is dependent on the deposition temperature and changes from x=7 at 400 C to x=5 at 500 C. This leads us to the assumption that the reconstruction is not incommensurate but a mixture of (3 x 1) and (4 x 1) building blocks, which is very similar to the super structure of Ga on Si(112).

  5. Test for the statistical significance of differences between ROC curves

    International Nuclear Information System (INIS)

    Metz, C.E.; Kronman, H.B.

    1979-01-01

    A test for the statistical significance of observed differences between two measured Receiver Operating Characteristic (ROC) curves has been designed and evaluated. The set of observer response data for each ROC curve is assumed to be independent and to arise from a ROC curve having a form which, in the absence of statistical fluctuations in the response data, graphs as a straight line on double normal-deviate axes. To test the significance of an apparent difference between two measured ROC curves, maximum likelihood estimates of the two parameters of each curve and the associated parameter variances and covariance are calculated from the corresponding set of observer response data. An approximate Chi-square statistic with two degrees of freedom is then constructed from the differences between the parameters estimated for each ROC curve and from the variances and covariances of these estimates. This statistic is known to be truly Chi-square distributed only in the limit of large numbers of trials in the observer performance experiments. Performance of the statistic for data arising from a limited number of experimental trials was evaluated. Independent sets of rating scale data arising from the same underlying ROC curve were paired, and the fraction of differences found (falsely) significant was compared to the significance level, α, used with the test. Although test performance was found to be somewhat dependent on both the number of trials in the data and the position of the underlying ROC curve in the ROC space, the results for various significance levels showed the test to be reliable under practical experimental conditions
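The statistic described above can be sketched numerically: given maximum likelihood estimates of the two binormal parameters for each ROC curve, together with their covariance matrices, the test forms a quadratic form in the parameter differences with two degrees of freedom. The parameter values and covariances below are illustrative, not from the paper:

```python
import numpy as np
from scipy.stats import chi2

# Binormal ROC parameter estimates (intercept a, slope b) for two curves,
# with their 2x2 covariance matrices (illustrative numbers).
p1 = np.array([1.20, 0.90])
cov1 = np.array([[0.040, 0.004],
                 [0.004, 0.020]])
p2 = np.array([0.95, 1.05])
cov2 = np.array([[0.050, 0.006],
                 [0.006, 0.025]])

# Approximate chi-square statistic with 2 degrees of freedom for the
# difference between the two fitted curves (independent data sets assumed).
d = p1 - p2
stat = d @ np.linalg.inv(cov1 + cov2) @ d
p_value = chi2.sf(stat, df=2)                # small p => curves differ
```

As the abstract cautions, the statistic is only asymptotically chi-square distributed, so p-values from small observer studies should be read with care.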

  6. A statistical method for testing epidemiological results, as applied to the Hanford worker population

    International Nuclear Information System (INIS)

    Brodsky, A.

    1979-01-01

    Some recent reports of Mancuso, Stewart and Kneale claim findings of radiation-produced cancer in the Hanford worker population. These claims are based on statistical computations that use small differences in accumulated exposures between groups dying of cancer and groups dying of other causes; actual mortality and longevity were not reported. This paper presents a statistical method for evaluation of actual mortality and longevity longitudinally over time, as applied in a primary analysis of the mortality experience of the Hanford worker population. Although available, this method was not utilized in the Mancuso-Stewart-Kneale paper. The author's preliminary longitudinal analysis shows that the gross mortality experience of persons employed at Hanford during the 1943-70 interval did not differ significantly from that of certain controls, when both employees and controls were selected from families with two or more offspring and comparisons were matched by age, sex, race and year of entry into employment. This result is consistent with findings reported by Sanders (Health Phys. vol.35, 521-538, 1978). The method utilizes an approximate chi-square (1 D.F.) statistic for testing population subgroup comparisons, as well as the cumulation of chi-squares (1 D.F.) for testing the overall result of a particular type of comparison. The method is available for computer testing of the Hanford mortality data, and could also be adapted to morbidity or other population studies. (author)
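The cumulation of 1-d.f. chi-squares mentioned here has a simple form: each matched subgroup comparison yields a squared standardized difference, and the sum of k independent such statistics is chi-square with k degrees of freedom. A sketch with invented standardized differences:

```python
import numpy as np
from scipy.stats import chi2

# Each matched subgroup comparison yields a standardized difference z,
# so z**2 is an approximate 1-d.f. chi-square (values invented).
z = np.array([0.8, -1.2, 0.5, 1.6, -0.3])
stats_1df = z ** 2

# Summing k independent 1-d.f. statistics gives a k-d.f. chi-square,
# testing the overall result of this type of comparison.
overall = stats_1df.sum()
p_overall = chi2.sf(overall, df=len(z))
```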

  7. The two-glass building in Ratingen. LEED Platinum for the Coca-Cola headquarters; Das Zwei-Scheiben-Haus in Ratingen. LEED-Platin fuer die Coca-Cola-Zentrale

    Energy Technology Data Exchange (ETDEWEB)

    Zerres, Eberhard

    2011-07-01

    In order to achieve the LEED eco-label in the Platinum category, good planning is essential. In the construction of a new administration building in Ratingen (Federal Republic of Germany), many details were considered, down to the use of ecologically sound building materials. These details proved very purposeful.

  8. Which statistics should tropical biologists learn?

    Science.gov (United States)

    Loaiza Velásquez, Natalia; González Lutz, María Isabel; Monge-Nájera, Julián

    2011-09-01

    Tropical biologists study the richest and most endangered biodiversity on the planet, and in these times of climate change and mega-extinctions, the need for efficient, good quality research is more pressing than in the past. However, the statistical component in research published by tropical authors sometimes suffers from poor quality in data collection, mediocre or bad experimental design, and a rigid and outdated view of data analysis. To suggest improvements in their statistical education, we listed all the statistical tests and other quantitative analyses used in two leading tropical journals, the Revista de Biología Tropical and Biotropica, during a year. The 12 most frequent tests in the articles were: Analysis of Variance (ANOVA), Chi-Square Test, Student's T Test, Linear Regression, Pearson's Correlation Coefficient, Mann-Whitney U Test, Kruskal-Wallis Test, Shannon's Diversity Index, Tukey's Test, Cluster Analysis, Spearman's Rank Correlation Test and Principal Component Analysis. We conclude that statistical education for tropical biologists must abandon the old syllabus based on the mathematical side of statistics and concentrate on the correct selection of these and other procedures and tests, on their biological interpretation and on the use of reliable and friendly freeware. We think that their time will be better spent understanding and protecting tropical ecosystems than trying to learn the mathematical foundations of statistics: in most cases, a well designed one-semester course should be enough for their basic requirements.

  9. A tool for creating value in the realization of sustainable buildings: LEED certification

    Directory of Open Access Journals (Sweden)

    S. Rick Fedrizzi

    2014-06-01

    Full Text Available This paper aims to outline the key aspects of sustainability in the building sector, focusing on the LEED® certification system as a "universal" tool supporting the construction, management and assessment of sustainable buildings. The first part of the paper describes the rapid spread of LEED certification in the recent past as a direct consequence of this rating tool's ability to adapt both to specific building types and to the climatic and morphological diversity of sites. The second part presents and analyses the economic and financial aspects of sustainable buildings, with reference both to the applicable valuation methodologies and to data from the literature. Starting from international experience with sustainability, the Italian situation is then described, highlighting market perception and future development opportunities.

  10. “THE LEEDS IDEA”: AN HISTORICAL ACCOUNT OF THE SPONDARTHRITIS CONCEPT

    Directory of Open Access Journals (Sweden)

    J.M.H. Moll

    2011-09-01

    Full Text Available SUMMARY In the 1960s, Professor Verna Wright became increasingly interested in possible relationships between certain seronegative “variants of rheumatoid arthritis”, as they were then generally known. At the Rheumatism Research Unit, a department within the division of medicine at Leeds University, he gathered around him a succession of research workers, whom he inspired to study aspects of these relationships. The focus was on family studies, as it was thought that genetic factors could be important. The striking association previously noted between sacroiliitis or full-blown ankylosing spondylitis and several of the disorders to be studied - e.g., psoriatic arthritis, ulcerative colitis, and the arthritis associated with Crohn’s disease - was to be central for each of these studies. As a provisional collective name for these possibly related conditions, the term “Spondarthritides” was chosen. These were the days before HLA B27, and so the research tools were simply clinical, radiological (for sacroiliitis) and serological (for rheumatoid factor). The research programme confirmed not only links between the primary disorders and ankylosing spondylitis, but also links between the disorders themselves. Over subsequent years, the spondarthritis concept (dubbed by some “The Leeds Idea”) has gained further strength from HLA studies internationally, and membership of the group of conditions fulfilling spondarthritis criteria has grown substantially. It is hoped that this now consolidated framework of spondylitis-related entities will pave the way for further research, with exciting prospects of gene-based prevention and/or cure through the increasing sophistication of molecular biology. Key words: seronegative spondarthritides, psoriatic arthritis, ankylosing spondylitis

  11. Statistical analysis of surface lineaments and fractures for characterizing naturally fractured reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Genliang; George, S.A.; Lindsey, R.P.

    1997-08-01

    Thirty-six sets of surface lineaments and fractures mapped from satellite images and/or aerial photos from parts of the Mid-continent and Colorado Plateau regions were collected, digitized, and statistically analyzed in order to obtain the probability distribution functions of natural fractures for characterizing naturally fractured reservoirs. The orientations and lengths of the surface linear features were calculated using the digitized coordinates of the two end points of each individual linear feature. The spacing data of the surface linear features within an individual set were obtained using a new analytical sampling technique. Statistical analyses were then performed to find the best-fit probability distribution functions for the orientation, length, and spacing of each data set. Twenty-five hypothesized probability distribution functions were used to fit each data set. A chi-square goodness-of-fit test was used to rank the significance of each fit. A distribution which provides the lowest chi-square goodness-of-fit value was considered the best-fit distribution. The orientations of surface linear features were best-fitted by triangular, normal, or logistic distributions; the lengths were best-fitted by PearsonVI, PearsonV, lognormal2, or extreme-value distributions; and the spacing data were best-fitted by lognormal2, PearsonVI, or lognormal distributions. These probability functions can be used to stochastically characterize naturally fractured reservoirs.
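The ranking procedure, fitting each candidate distribution and keeping the one with the lowest chi-square goodness-of-fit value, can be sketched with SciPy. The lognormal sample and the two-candidate shortlist below are illustrative stand-ins for the paper's 25 hypothesized distributions:

```python
import numpy as np
from scipy import stats

# Synthetic "fracture length" sample (lognormal, as such data often are).
rng = np.random.default_rng(1)
sample = rng.lognormal(mean=1.0, sigma=0.5, size=500)

def chi2_gof(sample, dist, params, bins=10):
    """Chi-square GOF using equal-probability bins of the fitted distribution."""
    edges = dist.ppf(np.linspace(0, 1, bins + 1)[1:-1], *params)
    observed = np.bincount(np.searchsorted(edges, sample), minlength=bins)
    expected = len(sample) / bins
    return ((observed - expected) ** 2 / expected).sum()

results = {}
for name, dist in [("lognorm", stats.lognorm), ("norm", stats.norm)]:
    params = dist.fit(sample)                # maximum likelihood fit
    results[name] = chi2_gof(sample, dist, params)

best = min(results, key=results.get)         # lowest chi-square ranks first
```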

  12. Statistical analysis of natural radiation levels inside the UNICAMP campus through the use of Geiger-Muller counter

    International Nuclear Information System (INIS)

    Fontolan, Juliana A.; Biral, Antonio Renato P.

    2013-01-01

    It is known that counting random and unrelated events in fixed time intervals leads to the Poisson distribution. This work aims to study the distribution, over time intervals, of events resulting from the radioactive decay of atoms present in UNICAMP environments where activities involving the use of ionizing radiation are performed. The proposal is that surveys of the distribution of these events over intervals, in different locations of the university, are carried out through the use of a Geiger-Mueller tube. In a next step, the obtained distributions will be evaluated using non-parametric statistics (chi-square and Kolmogorov-Smirnov tests). For analyses involving correlations we intend to use the ANOVA (Analysis of Variance) statistical tool. Measurements were performed in six different places within the campus, using a Geiger-Mueller counter in count mode with a time window of 20 seconds. Through the chi-square and Kolmogorov-Smirnov statistical tests, using the EXCEL program, it was observed that the distributions do in fact follow a Poisson distribution. Finally, the next step is to perform analyses involving correlations using the ANOVA statistical tool.
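The chi-square check that counts in fixed windows follow a Poisson distribution can be sketched as follows; the synthetic counts stand in for 20-second Geiger-Mueller windows, and the rate and tail-pooling cutoff are illustrative choices:

```python
import numpy as np
from scipy.stats import poisson, chisquare

# Synthetic counts per 20-second window (stand-in for Geiger-Mueller data).
rng = np.random.default_rng(2)
counts = rng.poisson(lam=4.0, size=300)
lam = counts.mean()                          # estimated rate per window

# Pool the tail at k >= kmax so every expected frequency stays reasonably large.
kmax = 8
observed = np.bincount(np.minimum(counts, kmax), minlength=kmax + 1)
expected = poisson.pmf(np.arange(kmax), lam) * len(counts)
expected = np.append(expected, len(counts) - expected.sum())   # pooled tail

# One extra degree of freedom is lost because lambda was estimated from the data.
stat, p = chisquare(observed, expected, ddof=1)
```

A large p-value here is consistent with the Poisson hypothesis, matching the abstract's conclusion.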

  13. PLEASE: The Python Low-energy Electron Analysis SuitE – Enabling Rapid Analysis of LEEM and LEED Data

    Directory of Open Access Journals (Sweden)

    Maxwell Grady

    2018-02-01

    Full Text Available PLEASE, the Python Low-energy Electron Analysis SuitE, provides an open source and cross-platform graphical user interface (GUI) for rapid analysis and visualization of low energy electron microscopy (LEEM) data sets. LEEM and the associated technique, selected area micro-spot low energy electron diffraction (μ-LEED), are powerful tools for analysis of the surface structure of many novel materials. Specifically, these tools are uniquely suited for the characterization of two-dimensional materials. PLEASE offers a user-friendly point-and-click method for extracting intensity-voltage curves from LEEM and LEED data sets. Analysis of these curves provides insight into the atomic structure of the target material surface with unparalleled resolution.

  14. Mining Association Rules Between Credits in the Leadership in Energy and Environmental Design for New Construction (LEED-NC) Green Building Assessment System

    National Research Council Canada - National Science Library

    Thomas, Benjamin J

    2008-01-01

    The Leadership in Energy and Environmental Design (LEED) Building Assessment System is a performance-based tool for determining the environmental impact of a facility from the whole-building perspective...

  15. Critical analysis of adsorption data statistically

    Science.gov (United States)

    Kaushal, Achla; Singh, S. K.

    2017-10-01

    Experimental data can be presented, computed, and critically analysed in different ways using statistics. A variety of statistical tests are used to make decisions about the significance and validity of the experimental data. In the present study, adsorption was carried out to remove zinc ions from contaminated aqueous solution using mango leaf powder. The experimental data were analysed statistically by hypothesis testing, applying the t test, paired t test and Chi-square test, to (a) test the optimum value of the process pH, (b) verify the success of the experiment and (c) study the effect of adsorbent dose on zinc ion removal from aqueous solutions. Comparison of the calculated and tabulated values of t and χ2 showed the results in favour of the data collected from the experiment, and this has been shown on probability charts. The K value for the Langmuir isotherm was 0.8582 and the m value obtained for the Freundlich adsorption isotherm was 0.725, both for mango leaf powder.
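The paired hypothesis-testing step can be sketched with SciPy; the before/after zinc concentrations below are invented illustrative numbers, not the paper's measurements:

```python
import numpy as np
from scipy import stats

# Paired t-test: zinc concentration (mg/L) in the same solutions before
# and after adsorption on mango leaf powder (illustrative values).
before = np.array([25.1, 24.8, 25.4, 25.0, 24.9, 25.3])
after = np.array([6.2, 5.9, 6.8, 6.1, 6.0, 6.5])
t_stat, p = stats.ttest_rel(before, after)   # small p => significant removal
```

The same pattern with `stats.ttest_1samp` or `stats.chisquare` covers the one-sample t and chi-square comparisons the abstract mentions.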

  16. In the spotlight. Interview with Kenneth Lee, Health Economist, University of Leeds, U.K.. Interview by Johannes Stoelwinder.

    Science.gov (United States)

    Lee, K

    1984-01-01

    Ken Lee was appointed to the staff of the Nuffield Centre, University of Leeds, as Lecturer in Health Economics in 1970. He is now Senior Lecturer and Director of the Master's Programme in Health Service Studies. His main teaching interests are in health planning and health economics, and he has carried out research and written extensively on approaches to health economics, health planning and management, care of the elderly, primary health care, health financing, and emergency health services.

  17. Translating research into practice in Leeds and Bradford (TRiPLaB: a protocol for a programme of research

    Directory of Open Access Journals (Sweden)

    Bibby John

    2010-05-01

    Background The National Institute for Health Research (NIHR) has funded nine Collaborations for Leadership in Applied Health Research and Care (CLAHRCs). Each CLAHRC is a partnership between higher education institutions (HEIs) and the NHS in nine UK regional health economies. The CLAHRC for Leeds, York, and Bradford comprises two 'research themes' and three 'implementation themes.' One of these implementation themes is Translating Research into Practice in Leeds and Bradford (TRiPLaB). TRiPLaB aims to develop, implement, and evaluate methods for inducing and sustaining the uptake of research knowledge into practice in order to improve the quality of health services for the people of Leeds and Bradford. Methods TRiPLaB is built around a three-stage, sequential approach using separate, longitudinal case studies conducted with collaborating NHS organisations. TRiPLaB will select robust innovations to implement, conduct a theory-informed exploration of the local context using a variety of data collection and analytic methods, and synthesise the information collected to identify the key factors influencing the uptake and adoption of targeted innovations. This synthesis will inform the development of tailored, multifaceted interventions designed to increase the translation of research findings into practice. Mixed research methods, including time series analysis, quasi-experimental comparison, and qualitative process evaluation, will be used to evaluate the impact of the implementation strategies deployed. Conclusion TRiPLaB is a theory-informed, systematic, mixed methods approach to developing and evaluating tailored implementation strategies aimed at increasing the translation of research-based findings into practice in one UK health economy. Through active collaboration with its local NHS, TRiPLaB aims to improve the quality of health services for the people of Leeds and Bradford and to contribute to research knowledge regarding the

  18. Proceedings of the International Workshop on Computational Electronics Held at Leeds University (United Kingdom) on August 11-13 1993

    Science.gov (United States)

    1993-08-01

    Sponsored by UK SERC and MIA-COM (USA). Contents fragments include a paper on field effect transistors by R Drury, R E Miles and C M Snowden, University of Leeds; a Poster Session II contribution, 'Determination of diffusion coefficients and...'; and 'Device simulation by means of a direct solution of the coupled Poisson/Boltzmann Transport equations' by Conor J. Donnelly and Colin Lyden.

  19. Analysis of secondary electron emission for conducting materials using 4-grid LEED/AES optics

    International Nuclear Information System (INIS)

    Patino, M I; Wirz, R E; Raitses, Y; Koel, B E

    2015-01-01

    A facility utilizing 4-grid optics for LEED/AES (low energy electron diffraction/Auger electron spectroscopy) was developed to measure the total secondary electron yield and secondary electron energy distribution function for conducting materials. The facility and experimental procedure were validated with measurements of 50–500 eV primary electrons impacting graphite. The total yield was calculated from measurements of the secondary electron current (i) from the sample and (ii) from the collection assembly, by biasing each surface. Secondary electron yield results from both methods agreed well with each other and were within the spread of previous results for the total yield from graphite. Additionally, measurements of the energy distribution function of secondary electrons from graphite are provided for a wider range of incident electron energies. These results can be used in modeling plasma-wall interactions in plasmas bounded by graphite walls, such as are found in plasma thrusters, and divertors and limiters of magnetic fusion devices. (paper)

  20. Office layout affecting privacy, interaction, and acoustic quality in LEED-certified buildings

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Young S. [School of Planning, Design, and Construction, Michigan State University, East Lansing, MI 48823 (United States)

    2010-07-15

    The study investigated differences in worker satisfaction and perceived job performance regarding privacy, interaction, and acoustic quality issues in personal workspaces between five office types in LEED-certified buildings. It finds that people in high cubicles showed significantly lower satisfaction and job performance in relation to visual privacy and interaction with co-workers than both enclosed private and enclosed shared office types. They also showed significantly lower satisfaction with noise level and sound privacy and lower job performance perceived by acoustic quality than enclosed private, enclosed shared, and bullpen types. The bullpen type, open-plan office without partitions, presented significantly higher satisfaction with noise level and higher performance perceived by acoustic quality than both high and low cubicles. Considering the bullpen type also showed higher satisfaction with sound privacy than the high cubicle type, high partitions don't seem to contribute to creating workspaces where people can have a secure conversation. The bullpen type didn't show any difference from the enclosed shared type in all privacy, interaction, and acoustic quality questions, indicating it may be a good option for a small office space instead of the enclosed shared type. (author)

  1. Validation study of the Leeds Dyspepsia Questionnaire in a multi-ethnic Asian population.

    Science.gov (United States)

    Mahadeva, Sanjiv; Chan, Wah-Kheong; Mohazmi, Mohammed; Sujarita, Ramanujam; Goh, Khean-Lee

    2011-11-01

    Outcome measures for clinical trials in dyspepsia require an assessment of symptom response. There is a lack of validated instruments assessing dyspepsia symptoms in the Asian region. We aimed to translate and validate the Leeds Dyspepsia Questionnaire (LDQ) in a multi-ethnic Asian population. A Malay and a culturally adapted English version of the LDQ were developed according to established protocols. Psychometric evaluation was performed by assessing the validity, internal consistency, test-retest reliability and responsiveness of the instruments in both primary and secondary care patients. Between April and September 2010, both Malay (n=166) and Malaysian English (n=154) versions were assessed in primary and secondary care patients. Both language versions were found to be reliable (internal consistency was 0.80 and 0.74 (Cronbach's α) for Malay and English, respectively; Spearman's correlation coefficient for test-retest reliability was 0.98 for both versions), valid (area under the receiver operating curve for accuracy of diagnosing dyspepsia was 0.71 and 0.77 for the Malay and English versions, respectively), and discriminative (median LDQ score discriminated between primary and secondary care patients in Malay (11.0 vs 20.0, PAsian population with dyspepsia. © 2011 Journal of Gastroenterology and Hepatology Foundation and Blackwell Publishing Asia Pty Ltd.

  2. LEED, Its Efficacy and Fallacy in a Regional Context—An Urban Heat Island Case in California

    Directory of Open Access Journals (Sweden)

    Min Ho Shin

    2017-09-01

    The use of energy in the building sector has increased rapidly over the past two decades. Accordingly, various building assessment methods have been developed in green building practices. However, questions remain about how positively green buildings affect their regional surroundings. This study investigates the possible relationship between LEED-certified buildings and the urban heat island effect. Using GIS with spatial regression, the study found that constructing an LEED building in a 30-m boundary could possibly lower the temperature of the surrounding environment by 0.35 °C. Also, having a higher certification level, such as Gold or Platinum, increased the lowering effect by 0.48 °C, while a lower certification level, such as Certified or Silver, had a lowering effect of 0.26 °C. Although LEED has gained a substantial amount of interest and skepticism at the same time, the study results could be a potential sign that the Sustainable Sites Credits or energy-efficient materials play a positive role in lowering the temperature.

  3. Beneficios económicos de la certificación LEED. Edificio centro Ático: caso de estudio

    Directory of Open Access Journals (Sweden)

    Óscar Ribero

    This article examines the economic benefits of applying the LEED sustainable construction certification program to the Centro Ático building in Bogotá, Colombia. First, the water and electricity consumption and the construction and operating costs of the building under its original design (built without regard to the parameters established by LEED) are determined. Next, strategies are proposed for the Centro Ático building to achieve LEED GOLD New Construction V3 2009 certification, and the cost increases associated with them are calculated. Likewise, the new water and electricity consumption under these strategies (the modified design) and the corresponding operating costs are calculated. Finally, the economic merit indicators of the investment are determined through a cash flow analysis.

  4. Carbon Footprint of Housing in the Leeds City Region - A Best Practice Scenario Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Barrett, John; Dawkins, Elena (Stockholm Environment Inst. (Sweden))|(Univ. of York, Heslington, York YO10 5DD (United Kingdom))

    2008-06-15

    The Stockholm Environment Institute (SEI) was commissioned by the Environment Agency to carry out a carbon footprint analysis of the housing sector, using the Leeds City Region (LCR) as an example. The aim was to determine our ability to meet the 80 per cent by 2050 challenge of energy efficiency in the housing sector. The study relates specifically to LCR but its findings will help any planning and development teams make the right decisions and gain the resources necessary to meet carbon budgets at regional and local levels. With a growing population and an additional 263,000 housing units to be built within LCR by 2026, the housing sector would need to reduce its expected total carbon dioxide emissions by 38 million tonnes between 2010 and 2026 to be on track for 80 per cent savings in 2050. The report outlines the most detailed analysis to date of the required measures to deliver a growth-based regional housing strategy, alongside reducing carbon emissions. If the city region's new and existing housing is to attain the levels of energy efficiency necessary to deliver these carbon savings, big changes will be required in the way we build, maintain and run our homes over the next 20 years. There are pockets of good practice already in the region and the study shows that by combining innovative measures on construction standards, improvements to existing housing, low and zero carbon technologies and changing behaviour of householders, LCR can achieve the necessary savings to meet its carbon budget.

  5. Social determinants of male health: a case study of Leeds, UK.

    Science.gov (United States)

    White, Alan; Seims, Amanda; Cameron, Ian; Taylor, Tim

    2018-01-19

    The social determinants of health have a disproportionate impact on mortality in men. A study into the state of health of the male population in Leeds was undertaken to guide public health commissioning decisions. This paper reports on the data relating to the social lives of men. A cross-sectional study was undertaken, comprising descriptive analysis of data relating to educational attainment, housing, employment (including benefit claimants), marital status and relationships. Data was considered for the whole city and localised at the Middle Super Output Area (MSOA) level and mapped against the Index of Deprivation. Boys' educational attainment was found to be lagging behind girls' from their earliest assessments (Early Years Foundation Stage Profile, 46% vs. 60%, P = 0.00) to GCSEs (53% vs. 63%, P = 0.00), leaving many men with no qualifications. There were 68% more men than women identified as being unemployed, with more men claiming benefits. Men living in social housing are more likely to be housed in high-rise flats. Almost 50% of men aged 16-64 are single, with 2254 lone fathers. There appears to be a lack of sex/gender analysis of current cross city data. In areas of deprivation a complex picture of multiple social problems emerges, with marked gender differences in the social determinants of health, with males seeming to be more negatively affected. There is a need for more focused planning for reaching out and targeting boys and men in the most deprived inner city areas, so that greater efficiency in service delivery can be obtained.

  6. Allbutt of Leeds and Duchenne de Boulogne: Newly discovered insights on Duchenne by a British neuropsychiatrist.

    Science.gov (United States)

    Reynolds, E H; Broussolle, E

    2018-02-01

    It is well-established that Guillaume-Benjamin-Amand Duchenne de Boulogne (1806-1875), and Jean-Martin Charcot (1825-1893) were the founding fathers of Parisian and French neurology during the second half of the 19th century, although much more is known about Charcot than about his "master" Duchenne. In Britain, Thomas Clifford Allbutt (1836-1925) was Leeds' most distinguished physician of the 19th century, eventually becoming Regius Professor of Physic at Cambridge. Allbutt's 1860-1861 year of postgraduate study in Paris and his friendship with Duchenne profoundly influenced his own contributions to nervous system and mental diseases, partly in collaboration with his colleague James Crichton-Browne (1840-1938) at the nearby West Riding Lunatic Asylum in Wakefield, Yorkshire. The present report briefly recalls the careers of Duchenne and Allbutt, and also presents a unique account by Allbutt of Duchenne in action at the height of his powers, investigating and defining the previously uncharted field of neuromuscular diseases with the aid of his localized electrization techniques. This account is discussed in relation to: Duchenne's personality and pioneering neurological achievements; the origins of French neurology; and the development of Anglo-French neurological relationships during the 19th century. Interestingly, both Duchenne and Crichton-Browne separately made important and much-appreciated contributions to the third major book by Charles Darwin (1809-1882), The Expression of the Emotions in Man and Animals, published in 1872. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  7. Strategic energy planning within local authorities in the UK: A study of the city of Leeds

    International Nuclear Information System (INIS)

    Bale, Catherine S.E.; Foxon, Timothy J.; Hannon, Matthew J.; Gale, William F.

    2012-01-01

    This paper considers the development of a strategic energy body in a local authority in the UK and looks at the perceived need for, and possible roles of, such a body. Historically, energy provision and management has not usually been a strategic priority for UK local authorities. Yet energy considerations are implicit in key local authority responsibilities such as transport, waste management, planning, and the provision of housing services. In addition, recent UK central government policies support the move to localism and provide incentives for low-carbon energy generation. A study was undertaken to assess the potential (including both the perceived benefits and actual capacity to deliver) for Leeds City Council to develop a strategic body to execute delivery of city-level energy decision-making. We examine the perceived benefits to a range of main stakeholders, using data drawn from interviews with managers responsible for low-carbon and renewable energy projects across the city. Through participant observation we explore the capacity of a local authority to deliver a strategic energy body, and we briefly examine the possible forms of delivery. We conclude with recommendations for national policy that would enable the development of strategic energy bodies across local governments in the UK. - Highlights: ► Strategic energy planning is currently not a priority for UK local authorities. ► We present an empirical study of strategic energy planning in local authorities. ► Results from stakeholder interviews suggest support for a strategic energy body. ► We identify the capacity barriers to implementing a strategic energy body. ► We make recommendations for ways forward and support needed from national policy.

  8. Assessing and Developing the Application of LEED Green Building Rating System as a Sustainable Project Management and Market Tool in the Italian Context

    Directory of Open Access Journals (Sweden)

    Walaa S. E. Ismaee

    2016-07-01

    The paper discusses the recent introduction of the LEED system into the Italian context in order to assess its role in promoting a sustainable building process there, pointing out its potentials on the one hand and its gaps and limitations on the other, and suggests means for its future development. The study discusses the application of LEED as a 'Sustainable Project Management tool' to guide sustainable building performance. This requires investigating the following: its structure, tools, and assessment criteria, along with its benchmarks and references. It also discusses the application of LEED as a 'Sustainable Building Certification and Market tool'. This investigates the role and value of the LEED certification in the Italian green market. The research method comprises three parts. The first part is a comparative analysis of LEED categories against Italian national initiatives for sustainability. The comparison showed that most LEED categories are already mandated by national norms and directives, but they may differ in their stringency, creating some areas of precedence of the LEED system or drawbacks. This streamlines the adaptation process of the LEED system to the Italian context. The second part investigates LEED projects' market analysis. The result showed that the shift towards a sustainable building process is occurring slowly and on a vertical scale, focusing on some building sectors rather than others. Its market diffusion in the Italian context faces challenges regarding the insufficient availability of green materials and products satisfying its requirements, as well as the high soft cost of sustainability tests and expertise required. The third part presents a practical review, citing the methodology and results of a survey conducted by the researchers in mid-2012. It is composed of a web-based questionnaire and interviews among a sample of LEED professionals in Italy. The result shows that LEED systems needs

  9. A study of the National Physical Laboratory microdosimetry research programme in collaboration with the University of Leeds

    International Nuclear Information System (INIS)

    Menzel, H.G.

    1987-11-01

    A study has been made of the present programme of work carried out by the National Physical Laboratory and the University of Leeds. The study is based on the use of the tissue-equivalent proportional counter in microdosimetric techniques in radiation protection for monoenergetic neutrons or reference radionuclide neutron sources. This report comments on the programme as a whole and provides recommendations for future research work, taking into account the research programmes carried out at other institutions. It also attempts to summarise the present state of knowledge and experience associated with the application of this technique to radiation fields met in routine radiation protection. (author)

  10. The use of Leeds Test Objects in the assessment of the performance of radiological imaging systems: an introduction

    International Nuclear Information System (INIS)

    Cowen, A.R.

    1986-01-01

    Over the preceding decade the Leeds Radiological Imaging Group have developed a range of test objects with which to assess the performance of radiological imaging systems. The types of imaging equipment which can be assessed include X-ray image intensifier television systems, small-format 100mm/105mm fluorography systems and radiographic screen-film combinations. We have recently extended our interest to the evaluation of digital radiological imaging equipment including digital subtraction fluorography and digital (greyscale) radiographic imaging systems. These test objects were initially developed for the purpose of evaluating imaging performance under laboratory conditions but they have also proved useful under field (clinical) conditions. (author)

  11. Do we need statistics when we have linguistics?

    Directory of Open Access Journals (Sweden)

    Cantos Gómez Pascual

    2002-01-01

    Statistics is known to be a quantitative approach to research. However, most of the research done in the fields of language and linguistics is of a different kind, namely qualitative. Succinctly, qualitative analysis differs from quantitative analysis in that in the former no attempt is made to assign frequencies, percentages and the like to the linguistic features found or identified in the data. In quantitative research, linguistic features are classified and counted, and even more complex statistical models are constructed in order to explain these observed facts. In qualitative research, however, we use the data only for identifying and describing features of language usage and for providing real occurrences/examples of particular phenomena. In this paper, we shall try to show how quantitative methods and statistical techniques can supplement qualitative analyses of language. We shall attempt to present some mathematical and statistical properties of natural languages, and introduce some of the quantitative methods which are of the most value in working empirically with texts and corpora, illustrating the various issues with numerous examples and moving from the most basic descriptive techniques (frequency counts and percentages) to decision-taking techniques (chi-square and z-score) and to more sophisticated statistical language models (Type-Token/Lemma-Token/Lemma-Type formulae, cluster analysis and discriminant function analysis).
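As a concrete instance of the 'decision-taking techniques' mentioned in this abstract, a chi-square statistic for a 2x2 contingency table can test whether a word is over-represented in one corpus relative to another. The word counts below are hypothetical.

```python
# Chi-square for a 2x2 contingency table [[a, b], [c, d]] via the standard
# shortcut formula n(ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d)).
# The token counts below are invented for illustration.

def chi_square_2x2(a, b, c, d):
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Occurrences of a target word vs. all other tokens in two corpora:
#            word   other
# corpus A:   120    9880
# corpus B:    60    9940
stat = chi_square_2x2(120, 9880, 60, 9940)

# With 1 degree of freedom the 5% critical value is 3.841, so a statistic
# this large indicates a significant difference in usage between corpora.
print(stat > 3.841)
```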

  12. Statistical analysis of natural radiation levels inside the UNICAMP campus through the use of Geiger-Muller counter; Analise estatistica dos niveis de radiacao natural dentro da UNICAMP atraves do uso de contador Geiger-Muller

    Energy Technology Data Exchange (ETDEWEB)

    Fontolan, Juliana A.; Biral, Antonio Renato P., E-mail: fontolanjuliana@gmail.com.br, E-mail: biral@ceb.unicamp.br [Hospital das Clinicas (CEB/UNICAMP), Campinas, SP (Brazil). Centro de Engenharia Biomedica

    2013-07-01

    It is known that the distribution of time intervals between random, unrelated events follows a Poisson distribution. This work aims to study the distribution in time intervals of events resulting from the radioactive decay of atoms present in UNICAMP environments where activities involving the use of ionizing radiation are performed. The proposal is that surveys of the distribution in intervals of these events at different locations of the university be carried out using a Geiger-Müller tube. In a next step, the distributions obtained will be evaluated using non-parametric statistics (chi-square and Kolmogorov-Smirnov tests). For analyses involving correlations, we intend to use the ANOVA (Analysis of Variance) statistical tool. Measurements were performed in six different places within the Campinas campus, using a Geiger-Müller counter in count mode with a time window of 20 seconds. Through the chi-square and Kolmogorov-Smirnov statistical tests, using the EXCEL program, it was observed that the distributions do indeed follow a Poisson distribution. Finally, the next step is to perform analyses involving correlations using the ANOVA statistical tool.
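The chi-square check of the Poisson hypothesis described in this abstract can be sketched as follows. The histogram of counts per 20-second window is invented, and the mean rate is assumed known rather than estimated from the data, to keep the example short.

```python
# Chi-square goodness-of-fit of counting data against a Poisson distribution.
# The observed histogram is hypothetical, not the UNICAMP measurements.
import math

def poisson_pmf(k, lam):
    """P(N = k) for a Poisson distribution with mean lam."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

# Number of 20-second windows in which k counts were registered
# (k = 0..8; all higher counts pooled into the last bin).
observed = [6, 21, 44, 59, 60, 46, 31, 18, 9, 6]
n = sum(observed)
lam = 4.0  # assumed mean counts per window

expected = [n * poisson_pmf(k, lam) for k in range(9)]
# Pool the tail P(N >= 9) into the final bin so expectations sum to n.
expected.append(n * (1.0 - sum(poisson_pmf(k, lam) for k in range(9))))

chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Compare against the tabulated chi-square value for the chosen significance
# level; 16.92 corresponds to 9 degrees of freedom at alpha = 0.05.
print(chi2 < 16.92)
```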

  13. ADOÇÃO DA CERTIFICAÇÃO LEED EM MEIOS DE HOSPEDAGEM: ESVERDEANDO A HOTELARIA?

    Directory of Open Access Journals (Sweden)

    Mirna de Lima Medeiros

    2012-03-01

    Full Text Available The research intended to analyze the adoption process of the green certification “Leadership in Energy and Environmental Design” (LEED from the hotel sector establishments that has already adopted it. For its concretization it was proceeded a bibliographical research, secondary fact-gathering in journals, institutional sites and documentaries, and primary fact-gathering by means of semi structured interviews carried out with responsible people of the certified hotels and of the responsible entity of the certification in Brazil (Green Building Council Brazil. There were 21 interviewee, being 02 of the GBC Brazil and 19 of means of lodging (31% of the certified. For data analysis, it was utilized content analysis technique with the aid of ATLAS.ti software. The results permitted to identify the chronology of the processes of certification and the profile of the hotel categories that adopt the LEED program. Beyond that, the interviews enabled the discussion of the initial motivations for seeking the certification, as well the advantages and the obstacles perceived regarding its adoption.

  14. Designs and Methods for Association Studies and Population Size Inference in Statistical Genetics

    DEFF Research Database (Denmark)

    Waltoft, Berit Lindum

    method provides a simple goodness of fit test by comparing the observed SFS with the expected SFS under a given model of population size changes. By the use of Monte Carlo estimation the expected time between coalescent events can be estimated and the expected SFS can thereby be evaluated. Using...... The OR is interpreted as the effect of an exposure on the probability of being diseased at the end of follow-up, while the interpretation of the IRR is the effect of an exposure on the probability of becoming diseased. Through a simulation study, the OR from a classical case-control study is shown to be an inconsistent...... the classical chi-square statistics we are able to infer single parameter models. Multiple parameter models, e.g. multiple epochs, are harder to identify. By introducing the inference of population size back in time as an inverse problem, the second procedure applies the theory of smoothing splines to infer

  15. Statistical hypothesis tests of some micrometeorological observations

    International Nuclear Information System (INIS)

    SethuRaman, S.; Tichler, J.

    1977-01-01

    A chi-square goodness-of-fit test is used to test the hypothesis that the medium scale of turbulence in the atmospheric surface layer is normally distributed. Coefficients of skewness and excess are computed from the data. If the data are not normal, these coefficients are used in Edgeworth's asymptotic expansion of the Gram-Charlier series to determine an alternate probability density function. The observed data are then compared with the modified probability densities and the new chi-square values computed. Seventy percent of the data analyzed was either normal or approximately normal. The coefficient of skewness g1 has a good correlation with the chi-square values. Events with |g1| < 0.43 were approximately normal. Intermittency associated with the formation and breaking of internal gravity waves in surface-based inversions over water is thought to be the reason for the non-normality
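The screening step this abstract describes (flagging a sample as approximately normal when the coefficient of skewness is small in magnitude) can be sketched as below. The samples are synthetic, and the 0.43 threshold is the one reported in the abstract.

```python
# Moment coefficient of skewness g1 = m3 / m2^(3/2), used here to screen
# samples for approximate normality. The two samples are synthetic.

def skewness(xs):
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n  # second central moment
    m3 = sum((x - mean) ** 3 for x in xs) / n  # third central moment
    return m3 / m2 ** 1.5

symmetric = [-2.0, -1.0, 0.0, 1.0, 2.0]
skewed = [0.0, 0.0, 0.0, 1.0, 10.0]

print(abs(skewness(symmetric)) < 0.43)  # treated as approximately normal
print(abs(skewness(skewed)) < 0.43)     # fails the screen; a Gram-Charlier
                                        # correction would be applied instead
```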

  16. Gaussian statistics of the cosmic microwave background: Correlation of temperature extrema in the COBE DMR two-year sky maps

    Science.gov (United States)

    Kogut, A.; Banday, A. J.; Bennett, C. L.; Hinshaw, G.; Lubin, P. M.; Smoot, G. F.

    1995-01-01

    We use the two-point correlation function of the extrema points (peaks and valleys) in the Cosmic Background Explorer (COBE) Differential Microwave Radiometers (DMR) 2 year sky maps as a test for non-Gaussian temperature distribution in the cosmic microwave background anisotropy. A maximum-likelihood analysis compares the DMR data to n = 1 toy models whose random-phase spherical harmonic components a(sub lm) are drawn from either Gaussian, chi-square, or log-normal parent populations. The likelihood of the 53 GHz (A+B)/2 data is greatest for the exact Gaussian model. There is less than 10% chance that the non-Gaussian models tested describe the DMR data, limited primarily by type II errors in the statistical inference. The extrema correlation function is a stronger test for this class of non-Gaussian models than topological statistics such as the genus.

  17. Statistical analyses of scatterplots to identify important factors in large-scale simulations, 1: Review and comparison of techniques

    International Nuclear Information System (INIS)

    Kleijnen, J.P.C.; Helton, J.C.

    1999-01-01

    Procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses are described and illustrated. These procedures attempt to detect increasingly complex patterns in scatterplots and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. A sequence of example analyses with a large model for two-phase fluid flow illustrates how the individual procedures can differ in the variables that they identify as having effects on particular model outcomes. The example analyses indicate that the use of a sequence of procedures is a good analysis strategy and provides some assurance that an important effect is not overlooked
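The first two procedures in the sequence above (linear relationships via correlation coefficients, monotonic relationships via rank correlation coefficients) can be contrasted on a small synthetic scatterplot:

```python
# Pearson (linear) vs. Spearman (rank/monotonic) correlation on synthetic
# data: a strictly monotonic but nonlinear relationship.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def ranks(xs):
    """Ranks 1..n of the values in xs (assumes no tied values)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for rank, i in enumerate(order, start=1):
        r[i] = float(rank)
    return r

def spearman(xs, ys):
    return pearson(ranks(xs), ranks(ys))

x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
y = [xi ** 5 for xi in x]  # strongly monotonic, far from linear

# The rank correlation detects the monotonic trend perfectly, while the
# linear correlation understates it.
print(spearman(x, y) > pearson(x, y))
```

The later procedures in the sequence (Kruskal-Wallis, chi-square) follow the same pattern, each probing a weaker form of structure in the scatterplot.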

  18. Effects of Contract Delivery Method on the LEED(trademark) Score of U.S. Navy Military Construction Projects (Fiscal Years 2004-2006) (CD-ROM)

    National Research Council Canada - National Science Library

    Carpenter, Deanna S

    2005-01-01

    ...: 1 CD-ROM; 4 3/4 in.; 484 KB. ABSTRACT: This research study focused on determining the effects that the two major contract delivery methods had on the LEED score of projects over the design and construction time horizon...

  19. Surface structure and electronic states of epitaxial β-FeSi.sub.2./sub.(100)/Si(001) thin films: Combined quantitative LEED, ab initio DFT, and STM study

    Czech Academy of Sciences Publication Activity Database

    Romanyuk, Olexandr; Hattori, K.; Someta, M.; Daimon, H.

    2014-01-01

    Roč. 90, č. 15 (2014), "155305-1"-"155305-9" ISSN 1098-0121 Grant - others:AVČR(CZ) M100101201; Murata Science Foundation(JP) Project n. 00295 Institutional support: RVO:68378271 Keywords : iron silicide * LEED I-V * DFT * STM * surface reconstruction * surface states Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 3.736, year: 2014

  20. Analysis of Chi-square Automatic Interaction Detection (CHAID) and Classification and Regression Tree (CRT) for Classification of Corn Production

    Science.gov (United States)

    Susanti, Yuliana; Zukhronah, Etik; Pratiwi, Hasih; Respatiwulan; Sri Sulistijowati, H.

    2017-11-01

    Achieving food resilience in Indonesia requires food diversification that exploits the potential of local foods. Corn is an alternative staple food in Javanese society, so corn production needs to be improved by taking the influencing factors into account. CHAID and CRT are data-mining methods that can be used to classify the influencing variables. The present study seeks to characterize the potential local availability of corn in the regencies and cities of Java Island. CHAID analysis yields four classifications with an accuracy of 78.8%, while CRT analysis yields seven classifications with an accuracy of 79.6%.
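
    The chi-square criterion at the heart of CHAID can be sketched in a few lines: for each categorical predictor, build its contingency table against the class label and keep the predictor with the largest Pearson chi-square. The toy records below are invented for illustration (the study's corn data are not reproduced), and real CHAID additionally merges categories and applies Bonferroni-adjusted p-values, which this sketch omits.

```python
from collections import Counter

# CHAID's core step: choose the categorical predictor whose contingency with
# the class label yields the largest Pearson chi-square. Toy (climate, soil)
# -> production-class records, all invented; real CHAID also merges
# categories and applies Bonferroni-adjusted p-values.

records = [
    ("wet", "loam", "high"), ("wet", "loam", "high"),
    ("wet", "sand", "high"), ("wet", "sand", "high"),
    ("dry", "loam", "low"),  ("dry", "loam", "low"),
    ("dry", "sand", "low"),  ("dry", "sand", "low"),
]

def chi_square(pairs):
    """Pearson chi-square of the two-way table built from (x, y) pairs."""
    n = len(pairs)
    joint = Counter(pairs)
    xm = Counter(x for x, _ in pairs)   # row margins
    ym = Counter(y for _, y in pairs)   # column margins
    return sum(
        (joint[(x, y)] - xm[x] * ym[y] / n) ** 2 / (xm[x] * ym[y] / n)
        for x in xm for y in ym
    )

scores = {
    "climate": chi_square([(r[0], r[2]) for r in records]),
    "soil":    chi_square([(r[1], r[2]) for r in records]),
}
best = max(scores, key=scores.get)   # predictor for the first split
print(best, round(scores[best], 1))
```

    In these invented records climate predicts the class perfectly while soil carries no information, so the split is made on climate.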

  1. On the Existence of Uniformly Most Powerful Bayesian Tests With Application to Non-Central Chi-Squared Tests

    OpenAIRE

    Nikooienejad, Amir; Johnson, Valen E.

    2018-01-01

    Uniformly most powerful Bayesian tests (UMPBTs) are an objective class of Bayesian hypothesis tests that can be considered the Bayesian counterpart of classical uniformly most powerful tests. Unfortunately, UMPBTs have so far been derived only for one-parameter exponential family models. The purpose of this article is to describe methodology for deriving UMPBTs for a larger class of tests. Specifically, we introduce sufficient conditions for the existence of UMPBTs and propose a unifi...

  2. Chi-square spectral fitting for concentration retrieval, automatic local calibration, quality control, and water type detection

    NARCIS (Netherlands)

    Hommersom, A.; Peters, S.W.M.; van der Woerd, H.J.; Eleveld, M.A.; de Boer, J.

    2011-01-01

    In this study, the inverse bio-optical model HYDROPT was calibrated with regional specific inherent optical properties (SIOPs) and various local SIOPs to examine the effect of these calibrations on the retrievals. The study area, the Wadden Sea, is an estuary and tidal flat area with very high

  3. Producing 'internal suspect bodies': divisive effects of UK counter-terrorism measures on Muslim communities in Leeds and Bradford.

    Science.gov (United States)

    Abbas, Madeline-Sophie

    2018-04-06

    Research on UK government counter-terrorism measures has claimed that Muslims are treated as a 'suspect community'. However, there is limited research exploring the divisive effects that membership of a 'suspect community' has on relations within Muslim communities. Drawing on interviews with British Muslims living in Leeds or Bradford, I address this gap by explicating how the co-option of Muslim community members to counter extremism fractures relations within Muslim communities. I reveal how community members internalize fears of state targeting, which precipitates internal disciplinary measures. I contribute the category of the 'internal suspect body', which is materialized through two intersecting conditions within preventative counter-terrorism: the suspected extremist, whom Muslims are to look out for, and the suspected informer, who might report fellow Muslims. I argue that the suspect community operates through a network of relations by which the terrors of counter-terrorism are reproduced within Muslim communities with divisive effects. © London School of Economics and Political Science 2018.

  4. Challenging Racist Violence and Racist Hostility in 'Post-Racial' Times: Research and Action in Leeds, UK, 2006–2012

    Directory of Open Access Journals (Sweden)

    Ian Law

    2013-04-01

    Full Text Available Despite increasing understanding, more information and an official commitment to challenging them, racist hostility and violence continue to have an enduring presence in urban and rural life in the UK. This indicates the paradoxical nature of this racial crisis and the challenges it poses for antiracism as a political project. This paper charts how these issues play out at the local level through an examination of a five-year process, from problem identification through research, response, action and aftermath, from 2006 to 2012 in the city of Leeds, UK, with a focus on two predominantly white working-class social housing estates in the city. We explore how embedded tensions and antagonisms can begin to be challenged, while examining how the contemporary climate of austerity and cuts in services, together with prevailing post-racial thinking, makes the likelihood of such concerted action in the UK increasingly remote.

  5. Energy Provisions of the ICC-700, LEED for Homes, and ENERGY STAR Mapped to the 2009 IECC

    Energy Technology Data Exchange (ETDEWEB)

    Britt, Michelle L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sullivan, Robin S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kora, Angela R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Makela, Eric J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Makela, Erin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2011-05-01

    This document provides the results of a comparison of building energy efficient elements of the ICC-700 National Green Building Standard, LEED for Homes, and ENERGY STAR versions 2, 2.5, and 3.0 to the 2009 International Energy Conservation Code (2009 IECC). This comparison will provide a tool for states and local municipalities as they consider adoption of these programs. The comparison is presented in a series of appendices. The first appendix provides a summary chart that visually represents the comprehensive comparison of the programs to the 2009 IECC topic areas. Next there are a series of individual tables (one appendix for each program) that include the specific program mapping to the 2009 IECC elements with comments that briefly discuss how well the elements mapped. Finally, a comprehensive table is included that shows all five of the programs mapped to the 2009 IECC elements to allow a detailed comparison.

  6. Life Cycle Assessment and Optimization-Based Decision Analysis of Construction Waste Recycling for a LEED-Certified University Building

    Directory of Open Access Journals (Sweden)

    Murat Kucukvar

    2016-01-01

    Full Text Available The current waste management literature lacks a comprehensive LCA of the recycling of construction materials that considers both process- and supply chain-related impacts as a whole. Furthermore, no previous work has provided an optimization-based decision support framework offering a quantifiable understanding of the potential savings and implications of recycling construction materials from a life cycle perspective. The aim of this research is to present a multi-criteria optimization model developed to propose economically sound and environmentally benign construction waste management strategies for a LEED-certified university building. First, an economic input-output-based hybrid life cycle assessment model is built to quantify the total environmental impacts of various waste management options: recycling, conventional landfilling and incineration. After quantifying the net environmental pressures associated with these waste treatment alternatives, a compromise programming model is used to determine the optimal recycling strategy, considering environmental and economic impacts simultaneously. The results show that recycling of ferrous and non-ferrous metals contributed significantly to reductions in the total carbon footprint of waste management. On the other hand, recycling of asphalt and concrete increased the overall carbon footprint because of high fuel consumption and emissions during the crushing process. Based on the multi-criteria optimization results, 100% recycling of ferrous and non-ferrous metals, cardboard, plastic and glass is suggested to maximize environmental and economic savings simultaneously. We believe that these results will facilitate better decision making in treating construction and demolition waste for LEED-certified green buildings by combining environmental LCA with multi-objective optimization modeling.

  7. Potential errors and misuse of statistics in studies on leakage in endodontics.

    Science.gov (United States)

    Lucena, C; Lopez, J M; Pulgar, R; Abalos, C; Valderrama, M J

    2013-04-01

    To assess the quality of the statistical methodology used in studies of leakage in Endodontics, and to compare the results found using appropriate versus inappropriate inferential statistical methods. The search strategy used the descriptors 'root filling', 'microleakage', 'dye penetration', 'dye leakage', 'polymicrobial leakage' and 'fluid filtration' for the time interval 2001-2010 in journals within the categories 'Dentistry, Oral Surgery and Medicine' and 'Materials Science, Biomaterials' of the Journal Citation Report. All retrieved articles were reviewed to find potential pitfalls in statistical methodology that may be encountered during study design, data management or data analysis. The database included 209 papers. In all the studies reviewed, the statistical methods used were appropriate for the category attributed to the outcome variable, but in 41% of the cases the chi-square test or parametric methods were subsequently applied inappropriately. In 2% of the papers, no statistical test was used. In 99% of cases, a statistically 'significant' or 'not significant' effect was reported as a main finding, whilst only 1% also presented an estimate of the magnitude of the effect. When appropriate statistical methods were applied in the studies with originally inappropriate data analysis, the conclusions changed in 19% of the cases. Statistical deficiencies in leakage studies may affect their results and interpretation and might be one of the reasons for the poor agreement amongst the reported findings. Therefore, more effort should be made to standardize statistical methodology. © 2012 International Endodontic Journal.

  8. Understanding Statistics - Cancer Statistics

    Science.gov (United States)

    Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.

  9. Statistical Analysis And Treatment Of Accident Black Spots: A Case Study Of Nandyal Mandal

    Science.gov (United States)

    Sudharshan Reddy, B.; Vishnu Vardhan Reddy, L.; Sreenivasa Reddy, G., Dr

    2017-08-01

    Background: Increased economic activity has raised consumption levels across the country, creating scope for growth in travel and transportation. The increase in vehicles over the last 10 years has put considerable pressure on the existing roads, ultimately resulting in road accidents. Nandyal Mandal is located in the Kurnool district of Andhra Pradesh and, after Kurnool, is well developed in both the agricultural and industrial sectors. The 567 accidents that occurred at 143 locations over the last seven years show the severity of the accident problem in Nandyal Mandal, and work is needed to improve the accident black spots and reduce accidents. Methods: Seven years (2010-2016) of accident data were collected from police stations. The Weighted Severity Index (WSI), a scientific method, was used to identify the accident black spots. Statistical analysis of the collected data used the chi-square test of independence to determine whether accidents are independent of other attributes, and a chi-square goodness-of-fit test to examine whether accidents occur by chance or follow a pattern. Results: WSI values were determined for the 143 locations, and locations with high WSI values were treated as accident black spots. Five black spots were selected for field study; after field observations and interaction with the public, improvements were suggested for these black spots. No relationship was found between accident severity and attributes such as month, season, day, hour of day or age group, except for vehicle type. Road accidents are distributed throughout the year, month and season, but are not distributed uniformly throughout the day.
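
    The goodness-of-fit step described above can be sketched directly. The day-of-week counts below are hypothetical (the study's accident records are not reproduced here), though they are chosen to total the 567 accidents reported; 12.592 is the standard chi-square table value for alpha = 0.05 with 6 degrees of freedom.

```python
# Chi-square goodness-of-fit test for accident counts, as in the study's
# day/season analyses. The day-of-week counts below are hypothetical, chosen
# only so that they total the 567 accidents reported for 2010-2016.
observed = [85, 78, 90, 80, 75, 82, 77]             # Mon..Sun (invented)
total = sum(observed)                               # 567
expected = [total / len(observed)] * len(observed)  # uniform: 81 per day

# Pearson goodness-of-fit statistic: sum of (O - E)^2 / E over the cells
chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))

df = len(observed) - 1   # 7 categories -> 6 degrees of freedom
critical_05 = 12.592     # chi-square critical value for alpha = 0.05, df = 6

# chi2 below the critical value: no evidence against a uniform distribution
print(round(chi2, 3), chi2 < critical_05)
```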

  10. A Third Moment Adjusted Test Statistic for Small Sample Factor Analysis.

    Science.gov (United States)

    Lin, Johnny; Bentler, Peter M

    2012-01-01

    Goodness-of-fit testing in factor analysis is based on the assumption that the test statistic is asymptotically chi-square, but this property may not hold in small samples even when the factors and errors are normally distributed in the population. Robust methods such as Browne's asymptotically distribution-free method and Satorra-Bentler's mean scaling statistic were developed under the presumption of non-normality in the factors and errors. This paper finds a new application for them in the case where factors and errors are normally distributed in the population but the skewness of the obtained test statistic is still high due to sampling error in the observed indicators. An extension of Satorra-Bentler's statistic is proposed that not only scales the mean but also adjusts the degrees of freedom based on the skewness of the obtained test statistic, in order to improve its robustness in small samples. A simple simulation study shows that this third-moment-adjusted statistic asymptotically performs on par with previously proposed methods, and at very small sample sizes offers superior Type I error rates under a properly specified model. Data from Mardia, Kent and Bibby's study of students tested for their ability in five content areas, either open or closed book, are used to illustrate the real-world performance of this statistic.
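
    The idea of adjusting both the scale and the degrees of freedom from the statistic's moments can be illustrated with a minimal sketch. This is not the paper's estimator: it simply matches the variance and skewness of an observed statistic to those of a scaled chi-square variable a·chi2(b), using the textbook facts that chi2(b) has variance 2b and skewness sqrt(8/b).

```python
from math import sqrt

# Moment-matching sketch (not the paper's estimator): approximate a test
# statistic T by a * chi2(b) using the textbook facts that chi2(b) has
# variance 2b and skewness sqrt(8/b).

def scaled_chi2_fit(var, skew):
    """Return (a, b) such that T ~ a * chi2(b) matches var and skew."""
    b = 8.0 / skew ** 2        # invert skewness = sqrt(8/b)
    a = sqrt(var / (2.0 * b))  # variance of a*chi2(b) is 2*a^2*b
    return a, b

# Sanity check with a statistic that is exactly chi-square on 10 df
a, b = scaled_chi2_fit(var=20.0, skew=sqrt(8.0 / 10.0))
implied_mean = a * b   # compare with the observed mean to gauge the fit
print(round(a, 3), round(b, 3), round(implied_mean, 3))
```

    For an exactly chi-square statistic the fit recovers scale 1 and the true degrees of freedom; a skewed small-sample statistic would instead yield a fractional, skewness-adjusted df.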

  11. Polarizing a stored proton beam by spin flip? - A high statistic reanalysis

    International Nuclear Information System (INIS)

    Oellers, Dieter

    2011-01-01

    Prompted by recent, conflicting calculations, we have carried out a measurement of the spin-flip cross section in low-energy electron-proton scattering. The experiment uses the cooling electron beam at COSY as an electron target. A reanalysis of the data leads to reduced statistical errors, resulting in a factor-of-4 smaller upper limit for the spin-flip cross section. The measured cross sections are too small for spin flip to be a viable tool for polarizing a stored beam.

  12. LEED AND THE DESIGN/BUILD EXPERIENCE: A SHELTER FOR HOMELESS FAMILIES RETURNING TO POST-KATRINA NEW ORLEANS.

    Directory of Open Access Journals (Sweden)

    Stephen Verderber

    2011-03-01

    Full Text Available Hurricane Katrina displaced nearly one million citizens from the New Orleans metro region in 2005. Five years after the catastrophe, in August of 2010, more than 150,000 citizens remained scattered across the United States; Katrina caused the largest diaspora in the nation's history. More than 125,000 homes were damaged or destroyed by Katrina's devastation. An award-winning case study is presented of a unique partnership forged between academia, a local social service agency, professional architectural and engineering firms, and a national humanitarian aid organization whose mission is to provide affordable housing for homeless persons in transition. This collaboration resulted in a sustainable design/build project that originated in a research-based university design studio. The facility is a 38-bed family shelter for homeless mothers and their children seeking to rebuild their lives in post-Katrina New Orleans. The site for this 4,400 facility did not flood when the city's federally built levee system failed in 2005. The case study is presented from inception through programming and design, construction, occupancy, and the post-occupancy assessment of the completed building. This facility is the first LEED-certified (Silver) building in New Orleans. Project limitations, lessons learned, and recommendations for future initiatives of this type are discussed, particularly in the context of any inner urban community coping with the aftermath of an urban disaster.

  13. Trends in statistical methods in articles published in Archives of Plastic Surgery between 2012 and 2017.

    Science.gov (United States)

    Han, Kyunghwa; Jung, Inkyung

    2018-05-01

    This review article presents an assessment of trends in statistical methods and an evaluation of their appropriateness in articles published in the Archives of Plastic Surgery (APS) from 2012 to 2017. We reviewed 388 original articles published in APS between 2012 and 2017. We categorized the articles that used statistical methods according to the type of statistical method, the number of statistical methods, and the type of statistical software used. We checked whether there were errors in the description of statistical methods and results. A total of 230 articles (59.3%) published in APS between 2012 and 2017 used one or more statistical methods. Within these articles, there were 261 applications of statistical methods with continuous or ordinal outcomes, and 139 applications of statistical methods with categorical outcomes. The Pearson chi-square test (17.4%) and the Mann-Whitney U test (14.4%) were the most frequently used methods. Errors in describing statistical methods and results were found in 133 of the 230 articles (57.8%). Inadequate description of P-values was the most common error (39.1%). Among the 230 articles that used statistical methods, 71.7% provided details about the statistical software programs used for the analyses; SPSS was predominantly used. We found that the use of statistical methods in APS has increased over the last 6 years, and it seems that researchers have been paying more attention to the proper use of statistics in recent years. It is expected that these positive trends will continue in APS.

  14. Methods in pharmacoepidemiology: a review of statistical analyses and data reporting in pediatric drug utilization studies.

    Science.gov (United States)

    Sequi, Marco; Campi, Rita; Clavenna, Antonio; Bonati, Maurizio

    2013-03-01

    To evaluate the quality of data reporting and statistical methods performed in drug utilization studies in the pediatric population. Drug utilization studies evaluating all drug prescriptions to children and adolescents published between January 1994 and December 2011 were retrieved and analyzed. For each study, information on measures of exposure/consumption, the covariates considered, descriptive and inferential analyses, statistical tests, and methods of data reporting was extracted. An overall quality score was created for each study using a 12-item checklist that took into account the presence of outcome measures, covariates of measures, descriptive measures, statistical tests, and graphical representation. A total of 22 studies were reviewed and analyzed. Of these, 20 studies reported at least one descriptive measure. The mean was the most commonly used measure (18 studies), but only five of these also reported the standard deviation. Statistical analyses were performed in 12 studies, with the chi-square test being the most commonly performed test. Graphs were presented in 14 papers. Sixteen papers reported the number of drug prescriptions and/or packages, and ten reported the prevalence of the drug prescription. The mean quality score was 8 (median 9). Only seven of the 22 studies received a score of ≥10, while four studies received markedly lower scores. Few of the studies applied statistical methods and reported data in a satisfactory manner. We therefore conclude that the methodology of drug utilization studies needs to be improved.

  15. Statistical Signal Process in R Language in the Pharmacovigilance Programme of India.

    Science.gov (United States)

    Kumar, Aman; Ahuja, Jitin; Shrivastava, Tarani Prakash; Kumar, Vipin; Kalaiselvan, Vivekanandan

    2018-05-01

    The Ministry of Health & Family Welfare, Government of India, initiated the Pharmacovigilance Programme of India (PvPI) in July 2010. The purpose of the PvPI is to collect data on adverse reactions due to medications, analyze them, and use them to recommend informed regulatory interventions, besides communicating risk to health care professionals and the public. The goal of the present study was to apply statistical tools in R programming to find relationships between drugs and ADRs for signal detection. Four statistical parameters were proposed for quantitative signal detection: IC025, PRR and PRRlb, chi-square, and N11; these four values were calculated using R programming. We analyzed 78,983 distinct drug-ADR combinations, with a total combination count of 420,060. The calculation of each statistical parameter uses three variables: (1) N11 (the observed count), (2) N1. (the drug margin), and (3) N.1 (the ADR margin). The structure and calculation of these four statistical parameters in R are easily understandable. On the basis of the IC value (IC > 0), 8,667 of the 78,983 drug-ADR combinations were found to be significantly associated. Calculating statistical parameters in R is time saving and allows new signals to be identified easily in the Indian ICSR (Individual Case Safety Reports) database.
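
    A minimal sketch of the disproportionality arithmetic, shown here in Python rather than R, using a hypothetical 2x2 contingency table (the counts are not from the PvPI database). It computes PRR, the Pearson chi-square for the table, and a point information component; note that the IC025 used by PvPI is the lower credibility bound of a shrinkage-based IC, which this sketch does not implement.

```python
from math import log2

# Hypothetical 2x2 counts for one drug-ADR pair (NOT from the PvPI database)
N   = 10_000   # total reports
n1_ = 200      # reports mentioning the drug     (drug margin, "N1.")
n_1 = 400      # reports mentioning the reaction (ADR margin, "N.1")
n11 = 40       # reports mentioning both         ("N11")

n12 = n1_ - n11            # drug without the reaction
n21 = n_1 - n11            # reaction without the drug
n22 = N - n11 - n12 - n21  # neither

# Proportional reporting ratio: ADR rate with the drug vs. without it
prr = (n11 / n1_) / (n21 / (N - n1_))

# Pearson chi-square for the 2x2 table (shortcut formula, no Yates correction)
chi2 = N * (n11 * n22 - n12 * n21) ** 2 / (n1_ * (N - n1_) * n_1 * (N - n_1))

# Point information component; IC > 0 flags a potential signal. (The IC025
# used by PvPI is a shrinkage-based lower credibility bound, omitted here.)
ic = log2(n11 * N / (n1_ * n_1))

print(round(prr, 2), round(chi2, 1), round(ic, 2))
```

    With these invented counts the pair reports about five times more often than expected, so all three measures point the same way.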

  16. Improving the Crossing-SIBTEST Statistic for Detecting Non-uniform DIF.

    Science.gov (United States)

    Chalmers, R Philip

    2018-06-01

    This paper demonstrates that, after applying a simple modification to Li and Stout's (Psychometrika 61(4):647-677, 1996) CSIBTEST statistic, an improved variant of the statistic could be realized. It is shown that this modified version of CSIBTEST has a more direct association with the SIBTEST statistic presented by Shealy and Stout (Psychometrika 58(2):159-194, 1993). In particular, the asymptotic sampling distributions and general interpretation of the effect size estimates are the same for SIBTEST and the new CSIBTEST. Given the more natural connection to SIBTEST, it is shown that Li and Stout's hypothesis testing approach is insufficient for CSIBTEST; thus, an improved hypothesis testing procedure is required. Based on the presented arguments, a new chi-squared-based hypothesis testing approach is proposed for the modified CSIBTEST statistic. Positive results from a modest Monte Carlo simulation study strongly suggest the original CSIBTEST procedure and randomization hypothesis testing approach should be replaced by the modified statistic and hypothesis testing method.

  17. A shape-based statistical method to retrieve 2D TRUS-MR slice correspondence for prostate biopsy

    Science.gov (United States)

    Mitra, Jhimli; Srikantha, Abhilash; Sidibé, Désiré; Martí, Robert; Oliver, Arnau; Lladó, Xavier; Ghose, Soumya; Vilanova, Joan C.; Comet, Josep; Meriaudeau, Fabrice

    2012-02-01

    This paper presents a method based on shape context and statistical measures to match an interventional 2D transrectal ultrasound (TRUS) slice acquired during prostate biopsy to a 2D magnetic resonance (MR) slice of a pre-acquired prostate volume. Accurate biopsy tissue sampling requires translation of the MR slice information onto the TRUS-guided biopsy slice. However, this translation or fusion requires knowledge of the spatial position of the TRUS slice, which is only possible with an electromagnetic (EM) tracker attached to the TRUS probe. Since the use of an EM tracker is not common in clinical practice and 3D TRUS is not used during biopsy, we propose an analysis based on shape and information theory to get close enough to the actual MR slice, as validated by experts. The Bhattacharyya distance is used to find point correspondences between shape-context representations of the prostate contours. Thereafter, the chi-square distance is used to find the MR slices in which the prostate most closely matches that of the TRUS slice. Normalized mutual information (NMI) values of the TRUS slice with each of the axial MR slices are computed after rigid alignment, and a subsequent strategic elimination based on a set of rules combining the chi-square distances and the NMI values leads to the required MR slice. We validated our method on TRUS axial slices of 15 patients: 11 results matched at least one expert's validation, and the remaining 4 were at most one slice away from the expert validations.
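
    The chi-square distance step can be sketched as follows; the histogram values are invented for illustration (in the paper, each histogram is a log-polar shape-context descriptor of a prostate contour point).

```python
# Chi-square distance between two normalized shape-context histograms.
# The bin values are invented for illustration; in the paper each histogram
# is a log-polar shape-context descriptor of a prostate contour point.

def chi_square_distance(h, g, eps=1e-12):
    """0.5 * sum_i (h_i - g_i)^2 / (h_i + g_i); eps guards empty bin pairs."""
    return 0.5 * sum((a - b) ** 2 / (a + b + eps) for a, b in zip(h, g))

trus_hist = [0.10, 0.25, 0.30, 0.20, 0.15]  # hypothetical TRUS-slice bins
mr_hist   = [0.12, 0.22, 0.28, 0.23, 0.15]  # hypothetical MR-slice bins

# A small distance marks this MR slice as a candidate match for the TRUS slice
d = chi_square_distance(trus_hist, mr_hist)
print(round(d, 4))
```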

  18. Analysis of Statistical Methods and Errors in the Articles Published in the Korean Journal of Pain

    Science.gov (United States)

    Yim, Kyoung Hoon; Han, Kyoung Ah; Park, Soo Young

    2010-01-01

    Background: Statistical analysis is essential for obtaining objective reliability in medical research. However, medical researchers do not have enough statistical knowledge to properly analyze their study data. To help understand and potentially alleviate this problem, we have analyzed the statistical methods and errors of articles published in the Korean Journal of Pain (KJP), with the intention of improving the statistical quality of the journal. Methods: All the articles, except case reports and editorials, published from 2004 to 2008 in the KJP were reviewed. The types of applied statistical methods and the errors in the articles were evaluated. Results: One hundred and thirty-nine original articles were reviewed. Inferential statistics and descriptive statistics were used in 119 and 20 papers, respectively. Only 20.9% of the papers were free from statistical errors. The most commonly adopted statistical method was the t-test (21.0%), followed by the chi-square test (15.9%). Errors of omission were encountered 101 times in 70 papers; among them, "no statistics used even though statistical methods were required" was the most common (40.6%). Errors of commission were encountered 165 times in 86 papers, among which "parametric inference for nonparametric data" was the most common (33.9%). Conclusions: We found various types of statistical errors in the articles published in the KJP. This suggests that meticulous attention should be given not only to applying statistical procedures but also to the reviewing process, in order to improve the value of the articles. PMID:20552071

  19. Sokol Blosser Barrel Aging Cellar: green roofs and LEED(TM) buildings in the rural context

    Energy Technology Data Exchange (ETDEWEB)

    Cravens, L.L. [Sera Architects Inc., Portland, OR (United States)

    2004-07-01

    An earth-covered structure that stores 900 barrels of wine at the Sokol Blosser Winery, located in Yamhill Valley southeast of Portland, Oregon, was presented. The owners' decision to build the barrel aging cellar as sustainably as possible was reinforced by their involvement in the Oregon Natural Step Network, a non-profit organization that promotes sustainability principles in any endeavor. The sustainable design solution led by SERA Architects met the winery's requirements for an underground structure capable of storing 900 barrels of wine in three chambers; natural daylight throughout; control over temperature and humidity; natural ventilation; the use of sustainable, and minimal, materials; the use of local products; preservation of the maximum existing open area; and minimal construction demolition and waste. The Leadership in Energy and Environmental Design (LEED) green building rating system criteria were used to measure the green construction practices. Burying the building brought many benefits: the cooling system was eliminated, which reduced the cost of the mechanical system, removed a major draw on energy, and eliminated any use of ozone-depleting refrigerants. The roof's waterproofing system was provided by Tremco; combined with a non-engineered earth cover, the manufacturer provided a warranty of 20 years but predicted a 60-year life for the roof. The roof sandwich structure, from the top down, was described in detail and illustrated. The final calculations indicate $750 in annual energy savings compared with a traditional space. 6 figs.

  20. South Asians are Under-Represented in a Clinic Treating Atrial Fibrillation in a Multicultural City in the UK.

    Science.gov (United States)

    Tayebjee, M H; Tyndall, K; Holding, S; Russell, C; Graham, L N; Pepper, C B

    2012-01-01

    The Leeds rapid access atrial fibrillation (AF) clinic was set up to streamline and standardise the management of patients with newly diagnosed AF. Anecdotal evidence suggests that south Asians are under-represented in these clinics. All patient attendances between June 2007 and June 2011 were documented and combined with ethnicity data from patient administration records. Local population demographics for 2009 were obtained from the Office for National Statistics and used to estimate the expected prevalence of AF across the different ethnic groups in Leeds, taking age into account. One thousand two hundred and ten patients were referred. The study sample included 992 patients, and the number of south Asians attending was 88% less than expected on chi-squared analysis, despite Leeds' cosmopolitan population. Potential reasons for this discrepancy include barriers to accessing treatment for this population, or a lower prevalence of AF in south Asians due to an as yet unidentified genetic factor.

  1. Statistical modeling of road contribution as emission sources to total suspended particles (TSP) under MCF model downtown Medellin - Antioquia - Colombia, 2004

    International Nuclear Information System (INIS)

    Gomez, Miryam; Saldarriaga, Julio; Correa, Mauricio; Posada, Enrique; Castrillon M, Francisco Javier

    2007-01-01

    Sand fields, construction sites, carbon boilers, roads, and biological sources, among others, contribute air contaminants in downtown Valle de Aburra. The distribution of road-contribution data to total suspended particles, according to the source-receptor model MCF (source correlation modeling), closely follows a gamma distribution. A chi-square goodness-of-fit test is used for the statistical modeling; this test also allows the parameters of the distribution to be estimated by the maximum likelihood method, with the expectation-maximization algorithm used as the convergence procedure. The mean of the road-contribution data under the MCF model is straightforward to obtain and validates the road contribution factor to the atmospheric pollution of the zone under study.
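
    A gamma fit of the kind described can be sketched with method-of-moments estimators (the abstract fits by maximum likelihood via an EM-style iteration, which this simpler sketch replaces); the data values below are invented for illustration.

```python
# Method-of-moments fit of a gamma distribution to road-contribution data.
# The abstract fits by maximum likelihood (with an EM-style iteration); this
# simpler sketch uses moment estimators, and the data values are invented.

contributions = [12.1, 8.4, 15.3, 9.9, 11.0, 7.2, 13.8, 10.5, 9.1, 12.6]

n = len(contributions)
mean = sum(contributions) / n
var = sum((x - mean) ** 2 for x in contributions) / (n - 1)  # sample variance

# For Gamma(shape=k, scale=theta): mean = k*theta, variance = k*theta^2
shape = mean ** 2 / var   # k
scale = var / mean        # theta

print(round(shape, 2), round(scale, 2))
```

    By construction the fitted mean shape*scale reproduces the sample mean; a chi-square goodness-of-fit test on binned data would then check the fitted distribution, as in the abstract.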

  2. STATISTICS, Program System for Statistical Analysis of Experimental Data

    International Nuclear Information System (INIS)

    Helmreich, F.

    1991-01-01

    1 - Description of problem or function: The package is composed of 83 routines, the most important of which are the following: BINDTR: Binomial distribution; HYPDTR: Hypergeometric distribution; POIDTR: Poisson distribution; GAMDTR: Gamma distribution; BETADTR: Beta-1 and Beta-2 distributions; NORDTR: Normal distribution; CHIDTR: Chi-square distribution; STUDTR: Distribution of Student's t; FISDTR: Distribution of F; EXPDTR: Exponential distribution; WEIDTR: Weibull distribution; FRAKTIL: Calculation of the fractiles of the normal, chi-square, Student's t, and F distributions; VARVGL: Test for equality of variance for several sample observations; ANPAST: Kolmogorov-Smirnov test and chi-square test of goodness of fit; MULIRE: Multiple linear regression analysis for a dependent variable and a set of independent variables; STPRG: Stepwise multiple linear regression analysis for a dependent variable and a set of independent variables. At each step, the variable entered into the regression equation is the one which has the greatest amount of variance between it and the dependent variable. Any independent variable can be forced into or deleted from the regression equation, irrespective of its contribution to the equation. LTEST: Tests the hypothesis of linearity of the data. SPRANK: Calculates the Spearman rank correlation coefficient. 2 - Method of solution: VARVGL: Bartlett's test, Cochran's test, and Hartley's test are performed in the program. MULIRE: The Gauss-Jordan method is used in the solution of the normal equations. STPRG: The abbreviated Doolittle method is used to (1) determine variables to enter into the regression, and (2) complete the regression coefficient calculation. 3 - Restrictions on the complexity of the problem: VARVGL: Hartley's test is only performed if the sample observations are all of the same size.
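
    A routine such as CHIDTR can be sketched in a few lines via the regularized lower incomplete gamma function, since the chi-square CDF with k degrees of freedom is P(k/2, x/2); the series expansion below is a standard textbook formula, not the package's actual implementation.

```python
from math import exp, log, lgamma

def lower_gamma_P(a, x):
    """Regularized lower incomplete gamma P(a, x) via its series expansion."""
    if x <= 0.0:
        return 0.0
    term = 1.0 / a
    total = term
    n = 0
    while abs(term) > abs(total) * 1e-15:
        n += 1
        term *= x / (a + n)   # next series term: x^n / (a(a+1)...(a+n))
        total += term
    return total * exp(-x + a * log(x) - lgamma(a))

def chi2_cdf(x, df):
    """CDF of the chi-square distribution with df degrees of freedom."""
    return lower_gamma_P(df / 2.0, x / 2.0)

# 5.991 and 12.592 are the familiar 95% critical values for 2 and 6 df,
# so both CDF values should come out near 0.95.
print(round(chi2_cdf(5.991, 2), 4), round(chi2_cdf(12.592, 6), 4))
```

    Inverting this CDF numerically (e.g. by bisection) would give the fractiles computed by a routine like FRAKTIL.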

  3. WATER POLO GAME-RELATED STATISTICS IN WOMEN'S INTERNATIONAL CHAMPIONSHIPS: DIFFERENCES AND DISCRIMINATORY POWER

    Directory of Open Access Journals (Sweden)

    Yolanda Escalante

    2012-09-01

    Full Text Available The aims of this study were (i) to compare women's water polo game-related statistics by match outcome (winning and losing teams) and phase (preliminary, classificatory, and semi-final/bronze medal/gold medal), and (ii) to identify characteristics that discriminate performances for each phase. The game-related statistics of the 124 women's matches played in five International Championships (World and European Championships) were analyzed. Differences between winning and losing teams in each phase were determined using the chi-squared statistic. A discriminant analysis was then performed according to context in each of the three phases. It was found that the game-related statistics differentiate the winning from the losing teams in each phase of an international championship. The differentiating variables were both offensive (centre goals, power-play goals, counterattack goals, assists, offensive fouls, steals, blocked shots, and won sprints) and defensive (goalkeeper-blocked shots, goalkeeper-blocked inferiority shots, and goalkeeper-blocked 5-m shots). The discriminant analysis showed the game-related statistics to discriminate performance in all phases: preliminary, classificatory, and final phases (92%, 90%, and 83%, respectively). Two variables were discriminatory by match outcome (winning or losing teams) in all three phases: goals and goalkeeper-blocked shots
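
    The chi-squared comparison of winning and losing teams reduces to a Pearson test on a contingency table. A minimal pure-Python sketch (the match counts below are invented for illustration, not taken from the study):

```python
def chi_square_contingency(table):
    """Pearson chi-square statistic for an r x c contingency table;
    degrees of freedom are (r - 1) * (c - 1)."""
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    total = sum(row_tot)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_tot[i] * col_tot[j] / total  # expected count under H0
            stat += (obs - exp) ** 2 / exp
    return stat

# invented match counts: did the team score 3+ counterattack goals?
#                 yes  no
table = [[30, 20],   # winners
         [12, 38]]   # losers
stat = chi_square_contingency(table)   # compare with chi-square, df = 1
```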

  4. Statistical thermodynamics

    International Nuclear Information System (INIS)

    Lim, Gyeong Hui

    2008-03-01

    This book consists of 15 chapters, which are basic conception and meaning of statistical thermodynamics, Maxwell-Boltzmann's statistics, ensemble, thermodynamics function and fluctuation, statistical dynamics with independent particle system, ideal molecular system, chemical equilibrium and chemical reaction rate in ideal gas mixture, classical statistical thermodynamics, ideal lattice model, lattice statistics and nonideal lattice model, imperfect gas theory on liquid, theory on solution, statistical thermodynamics of interface, statistical thermodynamics of a high molecule system and quantum statistics

  5. IMPLEMENTATION AND VALIDATION OF STATISTICAL TESTS IN RESEARCH'S SOFTWARE HELPING DATA COLLECTION AND PROTOCOLS ANALYSIS IN SURGERY.

    Science.gov (United States)

    Kuretzki, Carlos Henrique; Campos, Antônio Carlos Ligocki; Malafaia, Osvaldo; Soares, Sandramara Scandelari Kusano de Paula; Tenório, Sérgio Bernardo; Timi, Jorge Rufino Ribas

    2016-03-01

    Information technology is widely applied in healthcare. With regard to scientific research, SINPE(c) - Integrated Electronic Protocols was created as a tool to support researchers, offering clinical data standardization. Until then, SINPE(c) lacked automatically computed statistical tests. The aim was to add to SINPE(c) features for automatic execution of the main statistical methods used in medicine. The study was divided into four topics: checking the interest of users in the implementation of the tests; surveying the frequency of their use in health care; carrying out the implementation; and validating the results with researchers and their protocols. It was applied to a group of users of this software writing their theses in stricto sensu master's and doctoral degrees in one postgraduate program in surgery. To assess the reliability of the statistics, the data obtained automatically by SINPE(c) were compared with those computed manually by a statistician experienced in this type of study. There was interest in the use of automatic statistical tests, with good acceptance. The chi-square, Mann-Whitney, Fisher and Student's t tests were those most frequently used by participants in medical studies. These methods were implemented and subsequently validated as expected. The automatic statistical analysis incorporated into SINPE(c) proved reliable and equal to the manual analysis, validating its use as a tool for medical research.
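
    Of the four tests, Fisher's exact test is the trickiest to implement from scratch. The fragment below is a minimal pure-Python sketch of a two-sided test for a 2x2 table (an illustration of the method, not SINPE(c)'s actual code):

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher's exact test p-value for the 2x2 table
    [[a, b], [c, d]], summing hypergeometric probabilities that do
    not exceed that of the observed table."""
    n = a + b + c + d
    r1, c1 = a + b, a + c              # first row / first column totals
    def p(x):                          # P(first cell == x) under H0
        return comb(c1, x) * comb(n - c1, r1 - x) / comb(n, r1)
    p_obs = p(a)
    support = range(max(0, r1 + c1 - n), min(r1, c1) + 1)
    # small tolerance guards against float ties on symmetric tables
    return sum(p(x) for x in support if p(x) <= p_obs * (1 + 1e-9))
```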

  6. A cross sectional study investigating the association between exposure to food outlets and childhood obesity in Leeds, UK.

    Science.gov (United States)

    Griffiths, Claire; Frearson, Anna; Taylor, Adam; Radley, Duncan; Cooke, Carlton

    2014-12-06

    Current UK policy in relation to the influence of the 'food environment' on childhood obesity appears to be driven largely by assumptions or speculations, because empirical evidence is lacking and findings from studies are inconsistent. The aim of this study was to investigate the number of food outlets and the proximity of food outlets in the same sample of children, without solely focusing on fast food. Cross sectional study over 3 years (n = 13,291 data aggregated). Body mass index (BMI) was calculated for each participant; overweight and obesity were defined as a BMI above the 85th (sBMI 1.04) and 95th (sBMI 1.64) percentiles respectively (UK90 growth charts). Home and school neighbourhoods were defined as circular buffers with a 2 km Euclidean radius, centred on these locations. Commuting routes were calculated using the shortest straight-line distance, with a 2 km buffer to capture varying routes. Data on food outlet locations were sourced from Leeds City Council covering the study area and mapped against postcode. Food outlets were categorised into three groups: supermarkets, takeaways, and retail. Proximity to the nearest food outlet in the home and school environmental domains was also investigated. Age, gender, ethnicity and deprivation (IDACI) were included as covariates in all models. There is no evidence of an association between the number of food outlets and childhood obesity in any of these environments: home Q4 vs. Q1 OR = 1.11 (95% CI = 0.95-1.30); school Q4 vs. Q1 OR = 1.00 (95% CI 0.87-1.16); commute Q4 vs. Q1 OR = 1.00 (95% CI 0.83-1.20). Similarly, there is no evidence of an association between the proximity to the nearest food outlet and childhood obesity in the home (OR = 0.77 [95% CI = 0.61-0.98]) or the school (OR = 1.01 [95% CI 0.84-1.23]) environment. This study provides little support for the notion that exposure to food outlets in the home, school and commuting neighbourhoods increases the risk of obesity
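
    The quartile comparisons above are reported as odds ratios with confidence intervals, which can be reproduced from a 2x2 table of counts via the Wald interval on the log odds ratio. A sketch in pure Python (the counts in the example are hypothetical, not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for the 2x2 table
       [[a, b],    # exposed:   obese / not obese
        [c, d]]    # unexposed: obese / not obese"""
    odds_ratio = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(odds_ratio) - z * se)
    hi = math.exp(math.log(odds_ratio) + z * se)
    return odds_ratio, lo, hi

# hypothetical counts for a Q4 vs. Q1 comparison (not the study's data)
or_, lo, hi = odds_ratio_ci(120, 880, 110, 890)
```

    An interval straddling 1, as here, corresponds to the "no evidence of an association" conclusions quoted above.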

  7. Statistical Analyses of Scatterplots to Identify Important Factors in Large-Scale Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Kleijnen, J.P.C.; Helton, J.C.

    1999-04-01

    The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (1) linear relationships with correlation coefficients, (2) monotonic relationships with rank correlation coefficients, (3) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (4) trends in variability as defined by variances and interquartile ranges, and (5) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from analysis include: (1) Type I errors are unavoidable, (2) Type II errors can occur when inappropriate analysis procedures are used, (3) physical explanations should always be sought for why statistical procedures identify variables as being important, and (4) the identification of important variables tends to be stable for independent Latin hypercube samples.
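
    The first two pattern-detection steps, linear relationships via correlation coefficients and monotonic relationships via rank correlation coefficients, can be sketched in a few lines of pure Python (tie handling is omitted for brevity):

```python
from statistics import mean

def pearson(x, y):
    """Pearson correlation coefficient (detects linear relationships)."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) *
           sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def ranks(v):
    """Ranks 1..n (ties not handled in this sketch)."""
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0.0] * len(v)
    for pos, i in enumerate(order):
        r[i] = pos + 1.0
    return r

def spearman(x, y):
    """Rank correlation coefficient (detects monotonic relationships)."""
    return pearson(ranks(x), ranks(y))
```

    A monotonic but nonlinear scatterplot gives a rank correlation of 1 while the Pearson coefficient falls short of 1, which is exactly why the procedures are applied in order of increasing complexity.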

  8. Statistical Diagnosis of the Best Weibull Methods for Wind Power Assessment for Agricultural Applications

    Directory of Open Access Journals (Sweden)

    Abul Kalam Azad

    2014-05-01

    Full Text Available The best Weibull distribution methods for the assessment of wind energy potential at different altitudes in desired locations are statistically diagnosed in this study. Seven different methods, namely the graphical method (GM), method of moments (MOM), standard deviation method (STDM), maximum likelihood method (MLM), power density method (PDM), modified maximum likelihood method (MMLM) and equivalent energy method (EEM), were used to estimate the Weibull parameters, and six statistical tools, namely relative percentage of error, root mean square error (RMSE), mean percentage of error, mean absolute percentage of error, chi-square error and analysis of variance, were used to precisely rank the methods. The statistical fits of the measured and calculated wind speed data are assessed to justify the performance of the methods. The capacity factor and total energy generated by a small model wind turbine are calculated by numerical integration using trapezoidal sums and Simpson's rule. The results show that MOM and MLM are the most efficient methods for determining the values of k and c to fit Weibull distribution curves.
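
    As an illustration of two of the ingredients above, the following sketch estimates the Weibull shape k and scale c by the method of moments (solving for the coefficient of variation by bisection) and provides a composite trapezoidal integrator of the kind used for the capacity-factor calculation. The wind-speed sample is a hypothetical quantile grid, not the study's data:

```python
import math
from statistics import mean, stdev

def weibull_mom(speeds):
    """Method-of-moments estimates of the Weibull shape k and scale c.
    Solves cv(k) = stdev/mean for k by bisection (cv decreases in k)."""
    m, s = mean(speeds), stdev(speeds)
    target = s / m
    def cv(k):
        g1 = math.gamma(1 + 1 / k)
        g2 = math.gamma(1 + 2 / k)
        return math.sqrt(g2 / g1 ** 2 - 1)
    lo, hi = 0.5, 20.0
    for _ in range(80):
        mid = (lo + hi) / 2
        if cv(mid) > target:
            lo = mid              # spread still too large -> raise k
        else:
            hi = mid
    k = (lo + hi) / 2
    return k, m / math.gamma(1 + 1 / k)

def trapezoid(f, a, b, n=1000):
    """Composite trapezoidal rule, as used for the capacity factor."""
    h = (b - a) / n
    return h * (f(a) / 2 + sum(f(a + i * h) for i in range(1, n)) + f(b) / 2)

# hypothetical wind speeds: a quantile grid of a Weibull(k=2, c=8) law
u = [(i + 0.5) / 200 for i in range(200)]
speeds = [8.0 * (-math.log(1 - ui)) ** 0.5 for ui in u]
k, c = weibull_mom(speeds)
```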

  9. Cancer Statistics

    Science.gov (United States)

    ... What Is Cancer? Cancer Statistics Cancer Disparities Cancer Statistics Cancer has a major impact on society in ... success of efforts to control and manage cancer. Statistics at a Glance: The Burden of Cancer in ...

  10. ''House in park'' gold plated. Subsequent certification for the new office building; ''Haus im Park'' vergoldet. Nachtraegliche LEED-Zertifizierung fuer neues Buerogebaeude

    Energy Technology Data Exchange (ETDEWEB)

    Herzog, Kati; Wildhack, Alice [Bilfinger Berger AG, Mannheim (Germany). Abt. Nachhaltigkeit/Energieeffizienz

    2011-07-01

    In March 2011, the ''Haus im Park'' (''house in the park'') received the LEED® ''Gold'' certification of the U.S. Green Building Council (Washington, D.C., U.S.A.). Overall, it is the second German project to be awarded ''Gold'' under the certification version ''New Construction, version 2009''. The special challenge: in the planning phase, no LEED-relevant aspects of planning and execution had been implemented. There are currently 15 LEED-certified buildings in Germany (worldwide: 7,894), and 115 projects are registered (worldwide: 23,238). The trend is increasing, both nationally and internationally.

  11. To Be or Not to Be Associated: Power study of four statistical modeling approaches to identify parasite associations in cross-sectional studies

    Directory of Open Access Journals (Sweden)

    Elise eVaumourin

    2014-05-01

    Full Text Available A growing number of studies are reporting simultaneous infections by parasites in many different hosts. Detecting whether these parasites are significantly associated is important in medicine and epidemiology. Numerous approaches to detect associations are available, but only a few provide statistical tests. Furthermore, they generally test for an overall detection of association and do not identify which parasite is associated with which other one. Here, we developed a new approach, the association screening approach, to detect both the overall pattern and the detail of multi-parasite associations. We studied the power of this new approach and of three other known ones (i.e., the generalized chi-square, the network, and the multinomial GLM approaches) to identify parasite associations either due to parasite interactions or to confounding factors. We applied these four approaches to detect associations within two populations of multi-infected hosts: (1) rodents infected with Bartonella sp., Babesia microti and Anaplasma phagocytophilum, and (2) a bovine population infected with Theileria sp. and Babesia sp. We found that the best power is obtained with the screening model and the generalized chi-square test. Differentiating between associations due to confounding factors and those due to parasite interactions was not possible. The screening approach significantly identified associations between Bartonella doshiae and B. microti, and between T. parva, T. mutans and T. velifera. Thus, the screening approach was relevant to test the overall presence of parasite associations and to identify the parasite combinations that are significantly over- or under-represented. Unravelling whether the associations are due to real biological interactions or confounding factors should be further investigated. Nevertheless, in the age of genomics and the advent of new technologies, it is a considerable asset to speed up research focusing on the mechanisms driving interactions
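
    For a single pair of parasites, the chi-square ingredient shared by several of these approaches can be sketched directly from presence/absence data (a toy example, not the authors' screening model):

```python
def cooccurrence_chi_square(pres_a, pres_b):
    """Pearson chi-square (1 df) for association between two parasites,
    given parallel presence/absence (1/0) records over the same hosts."""
    t = [[0, 0], [0, 0]]               # rows: A present/absent; cols: B
    for a, b in zip(pres_a, pres_b):
        t[1 - a][1 - b] += 1
    n = len(pres_a)
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = sum(t[i]) * (t[0][j] + t[1][j]) / n
            stat += (t[i][j] - expected) ** 2 / expected
    return stat

# ten hypothetical hosts in which the two parasites tend to co-occur
a = [1, 1, 1, 1, 0, 0, 0, 0, 1, 0]
b = [1, 1, 1, 0, 0, 0, 0, 1, 1, 0]
stat = cooccurrence_chi_square(a, b)
```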

  12. Realities of and perspectives for languages in the globalised world: Can language teaching survive the inadequacies of policies implemented today at Leeds Beckett University?

    Directory of Open Access Journals (Sweden)

    Saadia Gamir

    2017-06-01

    Full Text Available Various newspaper articles report that British ministers, university representatives, exam chiefs and business bodies agree that foreign language skills in primary, secondary and tertiary UK education are in crisis. Lower funding and policy changes have caused language skills deficiencies that are felt gravely in the business sector. Funding and support initiatives pledged by policy makers appear to be election-driven, barely outliving newly elected governments. Others blame the secondary school language curriculum for failing to inspire students to take up a language when they reach 13 or 14. Others still argue that severe A-level examination marking deters students from taking up a foreign language at 6th form level, producing fewer prospective language learners for university departments. Community languages are also undervalued, as small-entry languages could soon be axed from GCSE and A-level examinations. In a world increasingly interconnected, it is essential that the importance of language learning be reinstated in all our educational institutions. This paper reviews two decades of the conditions of language provision in the UK in general, with an emphasis on Leeds Beckett University. It also attempts to answer two questions emerging from the author's personal teaching experience and reflections: What are the realities and challenges language teaching faces at Leeds Beckett University? And how may we support language learners in fulfilling their ambition to acquire the skills required to communicate effectively in this globalised world?

  13. Establishing daily quality control (QC) in screen-film mammography using leeds tor (max) phantom at the breast imaging unit of USTH-Benavides Cancer Institute

    Science.gov (United States)

    Acaba, K. J. C.; Cinco, L. D.; Melchor, J. N.

    2016-03-01

    Daily QC tests performed on screen film mammography (SFM) equipment are essential to ensure that both the SFM unit and the film processor are working in a consistent manner. The Breast Imaging Unit of USTH-Benavides Cancer Institute has been conducting QC following the test protocols in the IAEA Human Health Series No. 2 manual. However, the availability of the Leeds breast phantom (CRP E13039) in the facility made the task easier. Instead of carrying out separate tests for AEC constancy and light sensitometry, only one exposure of the phantom is needed to accomplish the two tests. It was observed that measurements made on mAs output and optical densities (ODs) using the Leeds TOR (MAX) phantom are comparable with those obtained from the usual conduct of tests, taking into account the attenuation characteristic of the phantom. Image quality parameters such as low-contrast and high-contrast details were also evaluated from the phantom image. The authors recognize the usefulness of the phantom in determining technical factors that will help improve detection of the smallest pathological details on breast images. The phantom is also convenient for daily QC monitoring, and economical, since fewer films are expended.

  14. To be certain about the uncertainty: Bayesian statistics for 13C metabolic flux analysis.

    Science.gov (United States)

    Theorell, Axel; Leweke, Samuel; Wiechert, Wolfgang; Nöh, Katharina

    2017-11-01

    13C Metabolic Flux Analysis (13C MFA) remains the most powerful approach to determine intracellular metabolic reaction rates. Decisions on strain engineering and experimentation heavily rely upon the certainty with which these fluxes are estimated. For uncertainty quantification, the vast majority of 13C MFA studies relies on confidence intervals from the paradigm of Frequentist statistics. However, it is well known that the confidence intervals for a given experimental outcome are not uniquely defined. As a result, confidence intervals produced by different methods can be different, but nevertheless equally valid. This is of high relevance to 13C MFA, since practitioners regularly use three different approximate approaches for calculating confidence intervals. By means of a computational study with a realistic model of the central carbon metabolism of E. coli, we provide strong evidence that confidence intervals used in the field depend strongly on the technique with which they were calculated and, thus, their use leads to misinterpretation of the flux uncertainty. In order to provide a better alternative to confidence intervals in 13C MFA, we demonstrate that credible intervals from the paradigm of Bayesian statistics give more reliable flux uncertainty quantifications which can be readily computed with high accuracy using Markov chain Monte Carlo. In addition, the widely applied chi-square test, as a means of testing whether the model reproduces the data, is examined more closely. © 2017 Wiley Periodicals, Inc.
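
    A credible interval of the kind advocated here can be read off Markov chain Monte Carlo draws as empirical quantiles. The sketch below runs a random-walk Metropolis sampler on a toy one-dimensional log-posterior (a standard normal, purely for illustration; real 13C MFA posteriors are high-dimensional and constrained):

```python
import math
import random

def metropolis(log_post, x0, n=20000, step=1.0, seed=42):
    """Random-walk Metropolis sampler; returns the chain as a list."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    chain = []
    for _ in range(n):
        prop = x + rng.gauss(0.0, step)
        lp_prop = log_post(prop)
        # accept with probability min(1, post(prop) / post(x))
        if math.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        chain.append(x)
    return chain

def credible_interval(draws, level=0.95):
    """Equal-tailed credible interval from empirical quantiles."""
    s = sorted(draws)
    lo_i = int((1 - level) / 2 * len(s))
    hi_i = int((1 + level) / 2 * len(s)) - 1
    return s[lo_i], s[hi_i]

# toy 1-D "posterior": standard normal log-density up to a constant
draws = metropolis(lambda x: -0.5 * x * x, 0.0)[2000:]   # drop burn-in
lo, hi = credible_interval(draws)
```

    Unlike a confidence interval, this interval is defined directly by the posterior draws, so different practitioners computing it from the same posterior agree by construction.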

  15. Usage Statistics

    Science.gov (United States)

    ... this page: https://medlineplus.gov/usestatistics.html MedlinePlus Statistics To use the sharing features on this page, ... By Quarter View image full size Quarterly User Statistics Quarter Page Views Unique Visitors Oct-Dec-98 ...

  16. Mathematical statistics

    CERN Document Server

    Pestman, Wiebe R

    2009-01-01

    This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.

  17. Frog Statistics

    Science.gov (United States)

    Whole Frog Project and Virtual Frog Dissection Statistics wwwstats output for January 1 through duplicate or extraneous accesses. For example, in these statistics, while a POST requesting an image is as well. Note that this under-represents the bytes requested. Starting date for following statistics

  18. Statistical analyses of scatterplots to identify important factors in large-scale simulations, 2: robustness of techniques

    International Nuclear Information System (INIS)

    Kleijnen, J.P.C.; Helton, J.C.

    1999-01-01

    The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from analysis include: (i) Type I errors are unavoidable, (ii) Type II errors can occur when inappropriate analysis procedures are used, (iii) physical explanations should always be sought for why statistical procedures identify variables as being important, and (iv) the identification of important variables tends to be stable for independent Latin hypercube samples

  19. Statistical physics

    CERN Document Server

    Sadovskii, Michael V

    2012-01-01

    This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity and the modern theory of critical phenomena. Beyond that attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.

  20. Which statistics should tropical biologists learn?

    Directory of Open Access Journals (Sweden)

    Natalia Loaiza Velásquez

    2011-09-01

    Full Text Available Tropical biologists study the richest and most endangered biodiversity on the planet, and in these times of climate change and mega-extinctions, the need for efficient, good quality research is more pressing than in the past. However, the statistical component in research published by tropical authors sometimes suffers from poor quality in data collection, mediocre or bad experimental design, and a rigid and outdated view of data analysis. To suggest improvements in their statistical education, we listed all the statistical tests and other quantitative analyses used in two leading tropical journals, the Revista de Biología Tropical and Biotropica, during one year. The 12 most frequent tests in the articles were: Analysis of Variance (ANOVA), Chi-Square Test, Student's T Test, Linear Regression, Pearson's Correlation Coefficient, Mann-Whitney U Test, Kruskal-Wallis Test, Shannon's Diversity Index, Tukey's Test, Cluster Analysis, Spearman's Rank Correlation Test and Principal Component Analysis. We conclude that statistical education for tropical biologists must abandon the old syllabus based on the mathematical side of statistics and concentrate on the correct selection of these and other procedures and tests, on their biological interpretation and on the use of reliable and friendly freeware. We think that their time will be better spent understanding and protecting tropical ecosystems than trying to learn the mathematical foundations of statistics: in most cases, a well designed one-semester course should be enough for their basic requirements. Rev. Biol. Trop. 59 (3): 983-992. Epub 2011 September 01.

  1. Statistical optics

    CERN Document Server

    Goodman, Joseph W

    2015-01-01

    This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems This book covers a variety of statistical problems in optics, including both theory and applications.  The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced i

  2. Harmonic statistics

    Energy Technology Data Exchange (ETDEWEB)

    Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il

    2017-05-15

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.

  3. Harmonic statistics

    International Nuclear Information System (INIS)

    Eliazar, Iddo

    2017-01-01

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.

  4. Statistical methods

    CERN Document Server

    Szulc, Stefan

    1965-01-01

    Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field.Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then

  5. Histoplasmosis Statistics

    Science.gov (United States)

    ... Testing Treatment & Outcomes Health Professionals Statistics More Resources Candidiasis Candida infections of the mouth, throat, and esophagus Vaginal candidiasis Invasive candidiasis Definition Symptoms Risk & Prevention Sources Diagnosis ...

  6. Development of an unbiased statistical method for the analysis of unigenic evolution

    Directory of Open Access Journals (Sweden)

    Shilton Brian H

    2006-03-01

    Full Text Available Abstract Background Unigenic evolution is a powerful genetic strategy involving random mutagenesis of a single gene product to delineate functionally important domains of a protein. This method involves selection of variants of the protein which retain function, followed by statistical analysis comparing expected and observed mutation frequencies of each residue. Resultant mutability indices for each residue are averaged across a specified window of codons to identify hypomutable regions of the protein. As originally described, the effect of changes to the length of this averaging window was not fully elucidated. In addition, it was unclear when sufficient functional variants had been examined to conclude that residues conserved in all variants have important functional roles. Results We demonstrate that the length of the averaging window dramatically affects identification of individual hypomutable regions and delineation of region boundaries. Accordingly, we devised a region-independent chi-square analysis that eliminates loss of information incurred during window averaging and removes the arbitrary assignment of window length. We also present a method to estimate the probability that conserved residues have not been mutated simply by chance. In addition, we describe an improved estimation of the expected mutation frequency. Conclusion Overall, these methods significantly extend the analysis of unigenic evolution data over existing methods to allow comprehensive, unbiased identification of domains and possibly even individual residues that are essential for protein function.
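
    Two of the quantities described, the per-residue chi-square contribution and the probability that a residue is conserved purely by chance, have simple closed forms. The sketch below assumes a uniform per-variant mutation probability (our simplification for illustration, not necessarily the authors' exact estimator):

```python
def prob_conserved_by_chance(p_mut, n_variants):
    """Probability that a residue is unmutated in all n functional
    variants purely by chance, assuming a uniform per-variant mutation
    probability p_mut (a simplifying assumption)."""
    return (1.0 - p_mut) ** n_variants

def residue_chi_square(observed, expected):
    """Per-residue chi-square contribution, with no window averaging."""
    return (observed - expected) ** 2 / expected
```

    For example, with a 5% per-variant mutation probability, about 60 functional variants are needed before an unmutated residue becomes significant at the 5% level.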

  7. Sample Size and Statistical Conclusions from Tests of Fit to the Rasch Model According to the Rasch Unidimensional Measurement Model (Rumm) Program in Health Outcome Measurement.

    Science.gov (United States)

    Hagell, Peter; Westergren, Albert

    Sample size is a major factor in statistical null hypothesis testing, which is the basis for many approaches to testing Rasch model fit. Few sample size recommendations for testing fit to the Rasch model concern the Rasch Unidimensional Measurement Models (RUMM) software, which features chi-square and ANOVA/F-ratio based fit statistics, including Bonferroni and algebraic sample size adjustments. This paper explores the occurrence of Type I errors with RUMM fit statistics, and the effects of algebraic sample size adjustments. Data simulated to fit the Rasch model, for 25-item dichotomous scales and sample sizes ranging from N = 50 to N = 2500, were analysed with and without algebraically adjusted sample sizes. Results suggest the occurrence of Type I errors with N less than or equal to 500, and that Bonferroni correction as well as downward algebraic sample size adjustment are useful to avoid such errors, whereas upward adjustment of smaller samples falsely signals misfit. Our observations suggest that sample sizes around N = 250 to N = 500 may provide a good balance for the statistical interpretation of the RUMM fit statistics studied here with respect to Type I errors and under the assumption of Rasch model fit within the examined frame of reference (i.e., about 25 item parameters well targeted to the sample).
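
    The two corrections discussed can be sketched in a couple of lines. Note that the size adjustment below captures only the generic idea, rescaling a chi-square statistic that grows roughly linearly with N, and is not RUMM's exact algebraic adjustment:

```python
def bonferroni_alpha(alpha, n_tests):
    """Per-test significance level under Bonferroni correction."""
    return alpha / n_tests

def size_adjusted_chi2(chi2, n, n_adjusted):
    """Rescale a chi-square fit statistic to a nominal sample size.
    Generic idea only (not RUMM's exact formula): the statistic grows
    roughly linearly with N, so large samples are downweighted to
    temper Type I errors."""
    return chi2 * n_adjusted / n
```

    For a 25-item scale tested at alpha = 0.05, the Bonferroni-corrected per-item level is 0.002; rescaling a statistic from N = 2500 down to a nominal N = 500 divides it by five.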

  8. Differences and discriminatory power of water polo game-related statistics in men in international championships and their relationship with the phase of the competition.

    Science.gov (United States)

    Escalante, Yolanda; Saavedra, Jose M; Tella, Victor; Mansilla, Mirella; García-Hermoso, Antonio; Domínguez, Ana M

    2013-04-01

    The aims of this study were (a) to compare water polo game-related statistics by context (winning and losing teams) and phase (preliminary, classification, and semifinal/bronze medal/gold medal), and (b) identify characteristics that discriminate performances for each phase. The game-related statistics of the 230 men's matches played in World Championships (2007, 2009, and 2011) and European Championships (2008 and 2010) were analyzed. Differences between contexts (winning or losing teams) in each phase (preliminary, classification, and semifinal/bronze medal/gold medal) were determined using the chi-squared statistic, also calculating the effect sizes of the differences. A discriminant analysis was then performed after the sample-splitting method according to context (winning and losing teams) in each of the 3 phases. It was found that the game-related statistics differentiate the winning from the losing teams in each phase of an international championship. The differentiating variables are both offensive and defensive, including action shots, sprints, goalkeeper-blocked shots, and goalkeeper-blocked action shots. However, the number of discriminatory variables decreases as the phase becomes more demanding and the teams become more equally matched. The discriminant analysis showed the game-related statistics to discriminate performance in all phases (preliminary, classificatory, and semifinal/bronze medal/gold medal phase) with high percentages (91, 90, and 73%, respectively). Again, the model selected both defensive and offensive variables.

  9. Statistical Diversions

    Science.gov (United States)

    Petocz, Peter; Sowey, Eric

    2012-01-01

    The term "data snooping" refers to the practice of choosing which statistical analyses to apply to a set of data after having first looked at those data. Data snooping contradicts a fundamental precept of applied statistics, that the scheme of analysis is to be planned in advance. In this column, the authors shall elucidate the…

  10. Statistical Diversions

    Science.gov (United States)

    Petocz, Peter; Sowey, Eric

    2008-01-01

    In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…

  11. Scan Statistics

    CERN Document Server

    Glaz, Joseph

    2009-01-01

    Suitable for graduate students and researchers in applied probability and statistics, as well as for scientists in biology, computer science, pharmaceutical science and medicine, this title brings together a collection of chapters illustrating the depth and diversity of theory, methods and applications in the area of scan statistics.

  12. Practical Statistics

    CERN Document Server

    Lyons, L.

    2016-01-01

    Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.

  13. Descriptive statistics.

    Science.gov (United States)

    Nick, Todd G

    2007-01-01

    Statistics is defined by the Medical Subject Headings (MeSH) thesaurus as the science and art of collecting, summarizing, and analyzing data that are subject to random variation. The two broad categories of summarizing and analyzing data are referred to as descriptive and inferential statistics. This chapter considers the science and art of summarizing data where descriptive statistics and graphics are used to display data. In this chapter, we discuss the fundamentals of descriptive statistics, including describing qualitative and quantitative variables. For describing quantitative variables, measures of location and spread, for example the standard deviation, are presented along with graphical presentations. We also discuss distributions of statistics, for example the variance, as well as the use of transformations. The concepts in this chapter are useful for uncovering patterns within the data and for effectively presenting the results of a project.
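As a minimal illustration of the measures of location and spread discussed in this chapter, Python's standard statistics module computes them directly (the data values are arbitrary):

```python
import statistics

# Arbitrary example measurements.
data = [4.2, 5.1, 3.8, 6.0, 5.5, 4.9, 5.2]

print("mean  :", round(statistics.mean(data), 3))
print("median:", round(statistics.median(data), 3))
print("stdev :", round(statistics.stdev(data), 3))  # sample standard deviation
```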

  14. Semiconductor statistics

    CERN Document Server

    Blakemore, J S

    1962-01-01

    Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co...

  15. Statistical Physics

    CERN Document Server

    Wannier, Gregory Hugh

    1966-01-01

    Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics.Designed for

  16. Statistics Clinic

    Science.gov (United States)

    Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James

    2014-01-01

    Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.

  17. Image Statistics

    Energy Technology Data Exchange (ETDEWEB)

    Wendelberger, Laura Jean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-08

    In large datasets, it is time consuming or even impossible to pick out interesting images. Our proposed solution is to find statistics to quantify the information in each image and use those to identify and pick out images of interest.

  18. Accident Statistics

    Data.gov (United States)

    Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...

  19. CMS Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — The CMS Center for Strategic Planning produces an annual CMS Statistics reference booklet that provides a quick reference for summary information about health...

  20. WPRDC Statistics

    Data.gov (United States)

    Allegheny County / City of Pittsburgh / Western PA Regional Data Center — Data about the usage of the WPRDC site and its various datasets, obtained by combining Google Analytics statistics with information from the WPRDC's data portal.

  1. Multiparametric statistics

    CERN Document Server

    Serdobolskii, Vadim Ivanovich

    2007-01-01

    This monograph presents a mathematical theory of statistical models described by an essentially large number of unknown parameters, comparable with the sample size but possibly much larger. In this sense, the proposed theory can be called "essentially multiparametric". It is developed on the basis of the Kolmogorov asymptotic approach, in which the sample size increases along with the number of unknown parameters. This theory opens a way to the solution of central problems of multivariate statistics which until now have not been solved. Traditional statistical methods based on the idea of infinite sampling often break down in the solution of real problems and, depending on the data, can be inefficient, unstable, and even inapplicable. In this situation, practical statisticians are forced to use various heuristic methods in the hope that they will find a satisfactory solution. The mathematical theory developed in this book presents a regular technique for implementing new, more efficient versions of statistical procedures. ...

  2. Gonorrhea Statistics

    Science.gov (United States)


  3. Reversible Statistics

    DEFF Research Database (Denmark)

    Tryggestad, Kjell

    2004-01-01

    The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved...

  4. Vital statistics

    CERN Document Server

    MacKenzie, Dana

    2004-01-01

    The drawbacks of using 19th-century mathematics in physics and astronomy are illustrated. To continue expanding our knowledge of the cosmos, scientists will have to come to terms with modern statistics. Some researchers have deliberately started importing techniques that are used in medical research. However, physicists need to identify the brand of statistics that will suit them, and make a choice between the Bayesian and the frequentist approaches. (Edited abstract).

  5. Factors Contributing to Successful Employment Outcomes for Hispanic Women Who Are Deaf: Utilization of Chi-Squared Automatic Interaction Detector and Logistic Regression Analysis

    Science.gov (United States)

    Feist, Amber M.

    2013-01-01

    Hispanic women who are deaf constitute a heterogeneous group of individuals with varying vocational needs. To understand the unique needs of this population, it is important to analyze how consumer characteristics, presence of public supports, and type of services provided influence employment outcomes for Hispanic women who are deaf. The purpose…

  6. Statistical methods for determination of background levels for naturally occurring radionuclides in soil at a RCRA facility

    International Nuclear Information System (INIS)

    Guha, S.; Taylor, J.H.

    1996-01-01

    It is critical that summary statistics on background data, or background levels, be computed based on standardized and defensible statistical methods because background levels are frequently used in subsequent analyses and comparisons performed by separate analysts over time. The final background for naturally occurring radionuclide concentrations in soil at a RCRA facility, and the associated statistical methods used to estimate these concentrations, are presented. The primary objective is to describe, via a case study, the statistical methods used to estimate 95% upper tolerance limits (UTL) on radionuclide background soil data sets. A 95% UTL on background samples can be used as a screening level concentration in the absence of definitive soil cleanup criteria for naturally occurring radionuclides. The statistical methods are based exclusively on EPA guidance. This paper includes an introduction, a discussion of the analytical results for the radionuclides and a detailed description of the statistical analyses leading to the determination of 95% UTLs. Soil concentrations reported are based on validated data. Data sets are categorized as surficial soil; samples collected at depths from zero to one-half foot; and deep soil, samples collected from 3 to 5 feet. These data sets were tested for statistical outliers and underlying distributions were determined by using the chi-squared test for goodness-of-fit. UTLs for the data sets were then computed based on the percentage of non-detects and the appropriate best-fit distribution (lognormal, normal, or non-parametric). For data sets containing greater than approximately 50% nondetects, nonparametric UTLs were computed
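The distribution-fitting step described above — a chi-squared goodness-of-fit test before choosing normal, lognormal, or nonparametric limits — can be sketched as follows. This is a simplified illustration, not the facility's actual procedure: it uses fixed standardized bin edges so that only the standard library is needed, and the data are hypothetical.

```python
import math
import random

def normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def chi2_gof_normal(sample, edges=(-1.0, -0.5, 0.0, 0.5, 1.0)):
    """Chi-squared goodness-of-fit of `sample` against a fitted normal,
    binned at fixed z-score edges. Returns (statistic, df)."""
    n = len(sample)
    mu = sum(sample) / n
    sd = math.sqrt(sum((x - mu) ** 2 for x in sample) / (n - 1))
    z = [(x - mu) / sd for x in sample]
    bounds = (-math.inf,) + tuple(edges) + (math.inf,)
    stat = 0.0
    for lo, hi in zip(bounds, bounds[1:]):
        p = normal_cdf(hi) - normal_cdf(lo)        # bin probability under the fit
        observed = sum(1 for v in z if lo <= v < hi)
        stat += (observed - n * p) ** 2 / (n * p)
    # degrees of freedom: bins - 1, minus 2 estimated parameters (mean, sd)
    df = (len(edges) + 1) - 1 - 2
    return stat, df

# Hypothetical background data: if the statistic exceeds the 5% critical
# value for 3 df (7.815), the normal model would be rejected.
random.seed(7)
sample = [random.gauss(10.0, 2.0) for _ in range(500)]
stat, df = chi2_gof_normal(sample)
print(f"chi2 = {stat:.2f} on {df} df (5% critical value: 7.815)")
```

In practice the same test would be repeated against the lognormal model (by taking logarithms first), with the best-fitting distribution then used to compute the 95% UTL.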

  7. Statistical optics

    Science.gov (United States)

    Goodman, J. W.

    This book is based on the thesis that some training in the area of statistical optics should be included as a standard part of any advanced optics curriculum. Random variables are discussed, taking into account definitions of probability and random variables, distribution functions and density functions, an extension to two or more random variables, statistical averages, transformations of random variables, sums of real random variables, Gaussian random variables, complex-valued random variables, and random phasor sums. Other subjects examined are related to random processes, some first-order properties of light waves, the coherence of optical waves, some problems involving high-order coherence, effects of partial coherence on imaging systems, imaging in the presence of randomly inhomogeneous media, and fundamental limits in photoelectric detection of light. Attention is given to deterministic versus statistical phenomena and models, the Fourier transform, and the fourth-order moment of the spectrum of a detected speckle image.

  8. Statistical mechanics

    CERN Document Server

    Schwabl, Franz

    2006-01-01

    The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechan...

  9. Statistical mechanics

    CERN Document Server

    Jana, Madhusudan

    2015-01-01

    Statistical mechanics is self sufficient, written in a lucid manner, keeping in mind the exam system of the universities. Need of study this subject and its relation to Thermodynamics is discussed in detail. Starting from Liouville theorem gradually, the Statistical Mechanics is developed thoroughly. All three types of Statistical distribution functions are derived separately with their periphery of applications and limitations. Non-interacting ideal Bose gas and Fermi gas are discussed thoroughly. Properties of Liquid He-II and the corresponding models have been depicted. White dwarfs and condensed matter physics, transport phenomenon - thermal and electrical conductivity, Hall effect, Magneto resistance, viscosity, diffusion, etc. are discussed. Basic understanding of Ising model is given to explain the phase transition. The book ends with a detailed coverage to the method of ensembles (namely Microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar...

  10. Statistical physics

    CERN Document Server

    Guénault, Tony

    2007-01-01

    In this revised and enlarged second edition of an established text Tony Guénault provides a clear and refreshingly readable introduction to statistical physics, an essential component of any first degree in physics. The treatment itself is self-contained and concentrates on an understanding of the physical ideas, without requiring a high level of mathematical sophistication. A straightforward quantum approach to statistical averaging is adopted from the outset (easier, the author believes, than the classical approach). The initial part of the book is geared towards explaining the equilibrium properties of a simple isolated assembly of particles. Thus, several important topics, for example an ideal spin-½ solid, can be discussed at an early stage. The treatment of gases gives full coverage to Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein statistics. Towards the end of the book the student is introduced to a wider viewpoint and new chapters are included on chemical thermodynamics, interactions in, for exam...

  11. Statistical Physics

    CERN Document Server

    Mandl, Franz

    1988-01-01

    The Manchester Physics Series General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips Department of Physics and Astronomy, University of Manchester Properties of Matter B. H. Flowers and E. Mendoza Optics Second Edition F. G. Smith and J. H. Thomson Statistical Physics Second Edition E. Mandl Electromagnetism Second Edition I. S. Grant and W. R. Phillips Statistics R. J. Barlow Solid State Physics Second Edition J. R. Hook and H. E. Hall Quantum Mechanics F. Mandl Particle Physics Second Edition B. R. Martin and G. Shaw The Physics of Stars Second Edition A. C. Phillips Computing for Scient

  12. Statistical inference

    CERN Document Server

    Rohatgi, Vijay K

    2003-01-01

    Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth

  13. AP statistics

    CERN Document Server

    Levine-Wissing, Robin

    2012-01-01

    All Access for the AP® Statistics Exam Book + Web + Mobile Everything you need to prepare for the Advanced Placement® exam, in a study system built around you! There are many different ways to prepare for an Advanced Placement® exam. What's best for you depends on how much time you have to study and how comfortable you are with the subject matter. To score your highest, you need a system that can be customized to fit you: your schedule, your learning style, and your current level of knowledge. This book, and the online tools that come with it, will help you personalize your AP® Statistics prep

  14. Statistical mechanics

    CERN Document Server

    Davidson, Norman

    2003-01-01

    Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses.Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody

  15. Statistical Computing

    Indian Academy of Sciences (India)

    inference and finite population sampling. Sudhakar Kunte. Elements of statistical computing are discussed in this series. ... which captain gets an option to decide whether to field first or bat first ... may of course not be fair, in the sense that the team which wins ... describe two methods of drawing a random number between 0.

  16. Statistical thermodynamics

    CERN Document Server

    Schrödinger, Erwin

    1952-01-01

    Nobel Laureate's brilliant attempt to develop a simple, unified standard method of dealing with all cases of statistical thermodynamics - classical, quantum, Bose-Einstein, Fermi-Dirac, and more.The work also includes discussions of Nernst theorem, Planck's oscillator, fluctuations, the n-particle problem, problem of radiation, much more.

  17. Low energy electron diffraction (LEED) and sum frequency generation (SFG) vibrational spectroscopy studies of solid-vacuum, solid-air and solid-liquid interfaces

    Energy Technology Data Exchange (ETDEWEB)

    Hoffer, Saskia [Univ. of California, Berkeley, CA (United States)

    2002-01-01

    Electron based surface probing techniques can provide detailed information about surface structure or chemical composition in vacuum environments. The development of new surface techniques has made possible in situ molecular level studies of solid-gas interfaces and more recently, solid-liquid interfaces. The aim of this dissertation is two-fold. First, by using novel sample preparation, Low Energy Electron Diffraction (LEED) and other traditional ultra high vacuum (UHV) techniques are shown to provide new information on the insulator/vacuum interface. The surface structure of the classic insulator NaCl has been determined using these methods. Second, using sum frequency generation (SFG) surface specific vibrational spectroscopy studies were performed on both the biopolymer/air and electrode/electrolyte interfaces. The surface structure and composition of polyetherurethane-silicone copolymers were determined in air using SFG, atomic force microscopy (AFM), and X-ray photoelectron spectroscopy (XPS). SFG studies of the electrode (platinum, gold and copper)/electrolyte interface were performed as a function of applied potential in an electrochemical cell.

  18. Environmental Assessment Methodologies for Commercial Buildings: An Elicitation Study of U.S. Building Professionals’ Beliefs on Leadership in Energy and Environmental Design (LEED)

    Directory of Open Access Journals (Sweden)

    Jasmin Kientzel

    2011-12-01

    Voluntary environmental programs (VEPs) have become increasingly popular around the world to address energy efficiency issues that mandatory building codes have not been able to tackle. Even though the utility of voluntary schemes is widely debated, they have become a de facto reality for many professionals in the building and construction sector. One topic that is neglected, however, in both academic and policy discussions is how professionals (architects, engineers, real estate developers, etc.) perceive the rise of voluntary rating schemes. In order to fill this gap in the literature, the present study investigates beliefs underlying adoption behavior regarding one of the most prominent voluntary assessment and certification programs in the U.S. building industry, the Leadership in Energy and Environmental Design (LEED) scheme. An elicitation study, based on 14 semi-structured interviews with building professionals in the North East of the United States, was conducted to analyze this question. Building on the Reasoned Action Approach, this paper shows that, in addition to more conventional factors such as financial calculations and marketing aspects, understanding the beliefs held by building professionals offers important insights into their decisions to work with voluntary environmental assessment and rating programs.

  19. Cross-cultural Adaptation and Linguistic Validation of the Korean Version of the Leeds Assessment of Neuropathic Symptoms and Signs Pain Scale

    Science.gov (United States)

    Park, Cholhee; Lee, Youn-Woo; Yoon, Duck Mi; Kim, Do Wan; Nam, Da Jeong

    2015-01-01

    Distinction between neuropathic pain and nociceptive pain helps facilitate appropriate management of pain; however, diagnosis of neuropathic pain remains a challenge. The aim of this study was to develop a Korean version of the Leeds Assessment of Neuropathic Symptoms and Signs (LANSS) pain scale and assess its reliability and validity. The translation and cross-cultural adaptation of the original LANSS pain scale into Korean was established according to the published guidelines. The Korean version of the LANSS pain scale was applied to a total of 213 patients who were expertly diagnosed with neuropathic (n = 113) or nociceptive pain (n = 100). The Korean version of the scale had good reliability (Cronbach's α coefficient = 0.815, Guttman split-half coefficient = 0.800). The area under the receiver operating characteristic curve was 0.928 with a 95% confidence interval of 0.885-0.959 (P < 0.001), suggesting good discriminative value. With a cut-off score ≥ 12, sensitivity was 72.6%, specificity was 98.0%, and the positive and negative predictive values were 98% and 76%, respectively. The Korean version of the LANSS pain scale is a useful, reliable, and valid instrument for screening neuropathic pain from nociceptive pain. PMID:26339176
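The screening accuracy figures quoted above follow directly from a 2×2 confusion matrix. In the sketch below, the cell counts are back-calculated from the reported percentages and group sizes (an inference, not taken from the paper) and reproduce the published values to rounding.

```python
def screening_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity, and predictive values from a confusion matrix."""
    return {
        "sensitivity": tp / (tp + fn),   # true positives among neuropathic cases
        "specificity": tn / (tn + fp),   # true negatives among nociceptive cases
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# Hypothetical counts consistent with the abstract: of 113 neuropathic
# patients, 82 scored >= 12; of 100 nociceptive patients, 2 scored >= 12.
m = screening_metrics(tp=82, fn=31, tn=98, fp=2)
for name, value in m.items():
    print(f"{name}: {value:.1%}")
```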

  20. Cross-Cultural Psychometric Assessment of the Leeds Assessment of Neuropathic Symptoms and Signs (LANSS) Pain Scale in the Portuguese Population.

    Science.gov (United States)

    Barbosa, Margarida; Bennett, Michael I; Verissimo, Ramiro; Carvalho, Davide

    2014-09-01

    Chronic pain is a well-known phenomenon. The differential diagnosis between neuropathic and nociceptive pain syndromes is a challenge. Consequently, assessment instruments that can distinguish between these conditions in a standardized way are of the utmost importance. The Leeds Assessment of Neuropathic Symptoms and Signs (LANSS) is a screening tool developed to identify chronic neuropathic pain. The aim of this study was the Portuguese language translation and linguistic adaptation of the LANSS pain scale, its semantic validation, internal consistency, and temporal stability, as well as its validity and discriminative power. The LANSS Portuguese version was applied to 165 consecutive patients attending the pain clinic: 103 fulfilled the clinical criteria for the diagnosis of pain of neuropathic origin and the remaining 62 fulfilled the criteria for nociceptive pain. The scale proved to be an internally consistent (Cronbach's alpha = 0.78) and reliable instrument with good test-retest stability (r = 0.7). The cross-cultural version is a reliable and valid instrument for the differentiation of this type of pain. Its usage is recommended. © 2013 World Institute of Pain.

  1. LEED-IV study of the rutile TiO2(110)-1x2 surface with a Ti-interstitial added-row reconstruction

    International Nuclear Information System (INIS)

    Blanco-Rey, M.; Mendez, J.; Lopez, M. F.; Roman, E.; Martin-Gago, J. A.; Andres, P. L. de; Abad, J.; Rogero, C.

    2007-01-01

    Upon sputtering and annealing in UHV at ∼1000 K, the rutile TiO2(110) surface undergoes a 1x1→1x2 phase transition. The resulting 1x2 surface is Ti rich, formed by strands of double Ti rows as seen in scanning tunneling microscopy images, but its detailed structure and composition have been subject to debate in the literature for years. Recently, Park et al. [Phys. Rev. Lett. 96, 226105 (2006)] have proposed a model where Ti atoms are located on interstitial sites with Ti2O stoichiometry. This model, when analyzed using LEED-IV data [Phys. Rev. Lett. 96, 055502 (2006)], does not yield an agreement between theory and experiment as good as the previous best fit for Onishi and Iwasawa's model for the long-range 1x2 reconstruction. Therefore, the Ti2O3 added row remains the preferred model from the point of view of low-energy electron diffraction.

  2. Charged-particle thermonuclear reaction rates: I. Monte Carlo method and statistical distributions

    International Nuclear Information System (INIS)

    Longland, R.; Iliadis, C.; Champagne, A.E.; Newton, J.R.; Ugalde, C.; Coc, A.; Fitzgerald, R.

    2010-01-01

    A method based on Monte Carlo techniques is presented for evaluating thermonuclear reaction rates. We begin by reviewing commonly applied procedures and point out that reaction rates that have been reported up to now in the literature have no rigorous statistical meaning. Subsequently, we associate each nuclear physics quantity entering in the calculation of reaction rates with a specific probability density function, including Gaussian, lognormal and chi-squared distributions. Based on these probability density functions the total reaction rate is randomly sampled many times until the required statistical precision is achieved. This procedure results in a median (Monte Carlo) rate which agrees under certain conditions with the commonly reported recommended 'classical' rate. In addition, we present at each temperature a low rate and a high rate, corresponding to the 0.16 and 0.84 quantiles of the cumulative reaction rate distribution. These quantities are in general different from the statistically meaningless 'minimum' (or 'lower limit') and 'maximum' (or 'upper limit') reaction rates which are commonly reported. Furthermore, we approximate the output reaction rate probability density function by a lognormal distribution and present, at each temperature, the lognormal parameters μ and σ. The values of these quantities will be crucial for future Monte Carlo nucleosynthesis studies. Our new reaction rates, appropriate for bare nuclei in the laboratory, are tabulated in the second paper of this issue (Paper II). The nuclear physics input used to derive our reaction rates is presented in the third paper of this issue (Paper III). In the fourth paper of this issue (Paper IV) we compare our new reaction rates to previous results.
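The Monte Carlo procedure described here — sampling each input quantity from its probability density function and reading the 0.50, 0.16, and 0.84 quantiles off the resulting rate distribution — can be sketched in a few lines. The toy rate below is built from two hypothetical lognormal resonance contributions, not the paper's nuclear physics inputs.

```python
import random

random.seed(42)

N_SAMPLES = 20_000

# Toy inputs: two resonance contributions, each lognormal, specified by
# (mu, sigma): median exp(mu), factor uncertainty exp(sigma). Hypothetical.
inputs = [(0.0, 0.3), (-0.5, 0.4)]

rates = []
for _ in range(N_SAMPLES):
    # Total rate = sum of independently sampled contributions.
    rates.append(sum(random.lognormvariate(mu, sigma) for mu, sigma in inputs))
rates.sort()

low    = rates[int(0.16 * N_SAMPLES)]   # "low rate"  (0.16 quantile)
median = rates[int(0.50 * N_SAMPLES)]   # recommended (Monte Carlo) rate
high   = rates[int(0.84 * N_SAMPLES)]   # "high rate" (0.84 quantile)
print(f"low = {low:.3f}, median = {median:.3f}, high = {high:.3f}")
```

The median plays the role of the recommended rate, and the low/high quantiles replace the statistically meaningless minimum/maximum limits; the paper's final step fits a lognormal distribution to this output sample to obtain the parameters μ and σ.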

  3. Lessons Learned From The 200 West Pump And Treatment Facility Construction Project At The US DOE Hanford Site - A Leadership For Energy And Environmental Design (LEED) Gold-Certified Facility

    International Nuclear Information System (INIS)

    Dorr, Kent A.; Ostrom, Michael J.; Freeman-Pollard, Jhivaun R.

    2012-01-01

    CH2M Hill Plateau Remediation Company (CHPRC) designed, constructed, commissioned, and began operation of the largest groundwater pump and treatment facility in the U.S. Department of Energy's (DOE) nationwide complex. This one-of-a-kind groundwater pump and treatment facility, located at the Hanford Nuclear Reservation Site (Hanford Site) in Washington State, was built in an accelerated manner with American Recovery and Reinvestment Act (ARRA) funds and has attained Leadership in Energy and Environmental Design (LEED) GOLD certification, which makes it the first non-administrative building in the DOE Office of Environmental Management complex to earn such an award. There were many contractual, technical, configuration management, quality, safety, and LEED challenges associated with the design, procurement, construction, and commissioning of this $95 million, 52,000 ft groundwater pump and treatment facility. This paper will present the Project and LEED accomplishments, as well as Lessons Learned by CHPRC when additional ARRA funds were used to accelerate design, procurement, construction, and commissioning of the 200 West Groundwater Pump and Treatment (2W P&T) Facility to meet DOE's mission of treating contaminated groundwater at the Hanford Site with a new facility by June 28, 2012

  4. Lessons Learned From The 200 West Pump And Treatment Facility Construction Project At The US DOE Hanford Site - A Leadership For Energy And Environmental Design (LEED) Gold-Certified Facility

    Energy Technology Data Exchange (ETDEWEB)

    Dorr, Kent A. [CH2M HILL Plateau Remediation Company, Richland, WA (United States); Ostrom, Michael J. [CH2M HILL Plateau Remediation Company, Richland, WA (United States); Freeman-Pollard, Jhivaun R. [CH2M HILL Plateau Remediation Company, Richland, WA (United States)

    2012-11-14

    CH2M Hill Plateau Remediation Company (CHPRC) designed, constructed, commissioned, and began operation of the largest groundwater pump and treatment facility in the U.S. Department of Energy's (DOE) nationwide complex. This one-of-a-kind groundwater pump and treatment facility, located at the Hanford Nuclear Reservation Site (Hanford Site) in Washington State, was built in an accelerated manner with American Recovery and Reinvestment Act (ARRA) funds and has attained Leadership in Energy and Environmental Design (LEED) GOLD certification, which makes it the first non-administrative building in the DOE Office of Environmental Management complex to earn such an award. There were many contractual, technical, configuration management, quality, safety, and LEED challenges associated with the design, procurement, construction, and commissioning of this $95 million, 52,000 ft groundwater pump and treatment facility. This paper will present the Project and LEED accomplishments, as well as Lessons Learned by CHPRC when additional ARRA funds were used to accelerate design, procurement, construction, and commissioning of the 200 West Groundwater Pump and Treatment (2W P&T) Facility to meet DOE's mission of treating contaminated groundwater at the Hanford Site with a new facility by June 28, 2012.

  5. Efficiency analysis in the application of LEED-ND indicators in the arid zone of northern Mexico, case study: Parajes del Sur, Ciudad Juarez, Chihuahua

    Energy Technology Data Exchange (ETDEWEB)

    Pena Barrera, Leticia [Universidad Autonoma de Ciudad Juarez, Chihuahua (Mexico)

    2009-01-15

    This article presents an analysis of the urban design of a residential subdivision using the indicators of Leadership in Energy and Environmental Design for Neighborhood Development (LEED-ND). The advantages in terms of impact are established and the limitations are pointed out, evaluating the efficiency of the indicators in improving performance and energy savings. Based on the analysis of the subdivision under study, some sound urban design solutions emerge that ought to be incorporated into the standards currently in force. Nevertheless, follow-up on the same developer's other projects shows that the proposals are adopted not as a planning strategy of their own but only to fulfil the requested requirements, yielding a result with smaller impact, and as an index for offering housing alternatives in the city oriented toward sustainable development.

  6. Energy Statistics

    International Nuclear Information System (INIS)

    Anon.

    1994-01-01

    For the years 1992 and 1993, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period. The tables and figures shown in this publication are: Changes in the volume of GNP and energy consumption; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Price development of principal oil products; Fuel prices for power production; Total energy consumption by source; Electricity supply; Energy imports by country of origin in 1993; Energy exports by recipient country in 1993; Consumer prices of liquid fuels; Consumer prices of hard coal and natural gas, prices of indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer and Excise taxes and turnover taxes included in consumer prices of some energy sources

  7. Statistical Optics

    Science.gov (United States)

    Goodman, Joseph W.

    2000-07-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson, The Statistical Analysis of Time Series; T. S. Arthanari & Yadolah Dodge, Mathematical Programming in Statistics; Emil Artin, Geometric Algebra; Norman T. J. Bailey, The Elements of Stochastic Processes with Applications to the Natural Sciences; Robert G. Bartle, The Elements of Integration and Lebesgue Measure; George E. P. Box & Norman R. Draper, Evolutionary Operation: A Statistical Method for Process Improvement; George E. P. Box & George C. Tiao, Bayesian Inference in Statistical Analysis; R. W. Carter, Finite Groups of Lie Type: Conjugacy Classes and Complex Characters; R. W. Carter, Simple Groups of Lie Type; William G. Cochran & Gertrude M. Cox, Experimental Designs, Second Edition; Richard Courant, Differential and Integral Calculus, Volume I; Richard Courant, Differential and Integral Calculus, Volume II; Richard Courant & D. Hilbert, Methods of Mathematical Physics, Volume I; Richard Courant & D. Hilbert, Methods of Mathematical Physics, Volume II; D. R. Cox, Planning of Experiments; Harold S. M. Coxeter, Introduction to Geometry, Second Edition; Charles W. Curtis & Irving Reiner, Representation Theory of Finite Groups and Associative Algebras; Charles W. Curtis & Irving Reiner, Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I; Charles W. Curtis & Irving Reiner, Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II; Cuthbert Daniel, Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition; Bruno de Finetti, Theory of Probability, Volume I; Bruno de Finetti, Theory of Probability, Volume 2; W. Edwards Deming, Sample Design in Business Research

  8. Statistical utilitarianism

    OpenAIRE

    Pivato, Marcus

    2013-01-01

    We show that, in a sufficiently large population satisfying certain statistical regularities, it is often possible to accurately estimate the utilitarian social welfare function, even if we only have very noisy data about individual utility functions and interpersonal utility comparisons. In particular, we show that it is often possible to identify an optimal or close-to-optimal utilitarian social choice using voting rules such as the Borda rule, approval voting, relative utilitarianism, or a...
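    Since the abstract mentions the Borda rule among the voting rules considered, a minimal Borda-count sketch may help; the ballots and the `borda_winner` helper are invented for illustration and are not from the paper:

    ```python
    # Minimal Borda count: each voter ranks the alternatives (best first),
    # and an alternative in position p on an m-long ballot earns (m - 1 - p)
    # points. The alternative with the highest total wins.

    def borda_winner(ballots):
        """ballots: list of rankings (best first). Returns the winning alternative."""
        scores = {}
        for ranking in ballots:
            m = len(ranking)
            for pos, alt in enumerate(ranking):
                scores[alt] = scores.get(alt, 0) + (m - 1 - pos)
        return max(scores, key=scores.get)

    ballots = [["a", "b", "c"], ["b", "a", "c"], ["a", "c", "b"]]
    print(borda_winner(ballots))  # a  (scores: a=5, b=3, c=1)
    ```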

  9. Experimental statistics

    CERN Document Server

    Natrella, Mary Gibbons

    1963-01-01

    Formulated to assist scientists and engineers engaged in army ordnance research and development programs, this well-known and highly regarded handbook is a ready reference for advanced undergraduate and graduate students as well as for professionals seeking engineering information and quantitative data for designing, developing, constructing, and testing equipment. Topics include characterizing and comparing the measured performance of a material, product, or process; general considerations in planning experiments; statistical techniques for analyzing extreme-value data; use of transformations

  10. Statistical dependence of clinical data on the chosen treatment of patients with a multivessel coronary artery disease.

    Science.gov (United States)

    Walichiewicz, P; Wodniecki, J; Szczurek-Katański, K; Jacheć, W; Nowalany-Kozielska, E; Trzeciak, P; Janik, J

    2001-01-01

    In this study we tried to determine which clinical data are connected with the choice of treatment in patients with multivessel coronary artery disease. The data of 137 patients with multivessel coronary artery disease were analysed retrospectively. The patients were divided into three groups: treated conservatively, CABG, and PTCA. Multivessel coronary artery disease was recognised when there were atherosclerotic changes in more than 2 vessels of not less than 2 mm in diameter. Patients with previous CABG or left main coronary artery disease were excluded. The data were analysed by several methods: analysis of variance, correlation analysis, discriminant functions, the chi-square test, and Student's t-test. Of statistical significance for treatment decision making in multivessel coronary artery disease were: the state of the left anterior descending artery below the first diagonal branch, the state of the first diagonal branch and of the peripheral parts of the left anterior descending artery and right coronary artery, the systolic function of the antero-lateral, apical and phrenic segments of the left ventricle, the global left ventricular ejection fraction in angiography and echocardiography, local systolic disturbances of the left ventricle observed in echocardiography, and the coexistence of symptoms of heart failure as well as unstable angina. Treatment decision making will always depend not only on diagnostic procedures but also on all the clinical data about the patient and the experience of the cooperating cardiology and surgery centres.

  11. Energy statistics

    International Nuclear Information System (INIS)

    Anon.

    1989-01-01

    World data from the United Nation's latest Energy Statistics Yearbook, first published in our last issue, are completed here. The 1984-86 data were revised and 1987 data added for world commercial energy production and consumption, world natural gas plant liquids production, world LP-gas production, imports, exports, and consumption, world residual fuel oil production, imports, exports, and consumption, world lignite production, imports, exports, and consumption, world peat production and consumption, world electricity production, imports, exports, and consumption (Table 80), and world nuclear electric power production

  12. Validity and reliability of the Spanish-language version of the self-administered Leeds Assessment of Neuropathic Symptoms and Signs (S-LANSS) pain scale.

    Science.gov (United States)

    López-de-Uralde-Villanueva, I; Gil-Martínez, A; Candelas-Fernández, P; de Andrés-Ares, J; Beltrán-Alacreu, H; La Touche, R

    2016-12-08

    The self-administered Leeds Assessment of Neuropathic Symptoms and Signs (S-LANSS) scale is a tool designed to identify patients with pain with neuropathic features. To assess the validity and reliability of the Spanish-language version of the S-LANSS scale. Our study included a total of 182 patients with chronic pain to assess the convergent and discriminant validity of the S-LANSS; the sample was increased to 321 patients to evaluate construct validity and reliability. The validated Spanish-language version of the ID-Pain questionnaire was used as the criterion variable. All participants completed the ID-Pain, the S-LANSS, and the Numerical Rating Scale for pain. Discriminant validity was evaluated by analysing sensitivity, specificity, and the area under the receiver operating characteristic curve (AUC). Construct validity was assessed with factor analysis and by comparing the odds ratio of each S-LANSS item to the total score. Convergent validity and reliability were evaluated with Pearson's r and Cronbach's alpha, respectively. The optimal cut-off point for S-LANSS was ≥12 points (AUC=.89; sensitivity=88.7; specificity=76.6). Factor analysis yielded one factor; furthermore, all items contributed significantly to the positive total score on the S-LANSS (P<.05). The S-LANSS showed a significant correlation with ID-Pain (r=.734, α=.71). The Spanish-language version of the S-LANSS is valid and reliable for identifying patients with chronic pain with neuropathic features. Copyright © 2016 Sociedad Española de Neurología. Publicado por Elsevier España, S.L.U. All rights reserved.
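    The discriminant-validity figures above (sensitivity and specificity at a cutoff) come from a standard confusion-matrix calculation; a minimal sketch, with invented scores and labels rather than the study's data:

    ```python
    # Sensitivity and specificity of a screening score at a given cutoff
    # (scores >= cutoff are classified as positive). The S-LANSS paper's
    # optimal cutoff was >= 12; the scores/labels below are invented.

    def sensitivity_specificity(scores, has_condition, cutoff):
        """Return (sensitivity, specificity) for the rule score >= cutoff."""
        tp = sum(1 for s, y in zip(scores, has_condition) if y and s >= cutoff)
        fn = sum(1 for s, y in zip(scores, has_condition) if y and s < cutoff)
        tn = sum(1 for s, y in zip(scores, has_condition) if not y and s < cutoff)
        fp = sum(1 for s, y in zip(scores, has_condition) if not y and s >= cutoff)
        return tp / (tp + fn), tn / (tn + fp)

    scores      = [15, 13, 9, 12, 7, 5, 14, 11]
    neuropathic = [True, True, True, True, False, False, False, False]
    sens, spec = sensitivity_specificity(scores, neuropathic, cutoff=12)
    print(sens, spec)  # 0.75 0.75
    ```

    Sweeping the cutoff over its range and plotting sensitivity against 1 - specificity is what produces the receiver operating characteristic curve whose area (AUC) the abstract reports.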

  13. Development of the Japanese Version of the Leeds Assessment of the Neuropathic Symptoms and Signs Pain Scale: Diagnostic Utility in a Clinical Setting.

    Science.gov (United States)

    Isomura, Tatsuya; Sumitani, Masahiko; Matsudaira, Ko; Kawaguchi, Mika; Inoue, Reo; Hozumi, Jun; Tanaka, Takeyuki; Oshima, Hirofumi; Mori, Kanto; Taketomi, Shuji; Inui, Hiroshi; Tahara, Keitaro; Yamagami, Ryota; Hayakawa, Kazuhiro

    2017-07-01

    We aimed to assess the diagnostic utility of the linguistically validated Japanese version of the Leeds Assessment of Neuropathic Symptoms and Signs Pain Scale (LANSS-J) as a screening tool for neuropathic pain in the clinical setting. Patients with neuropathic pain or nociceptive pain who were 20 to 85 years of age were included. Sensitivity and specificity using the original cutoff value of 12 were assessed to evaluate the diagnostic utility of the LANSS-J. Sensitivity and specificity with possible cutoff values were calculated, along with area under the receiver operating characteristic curve. We then evaluated agreement regarding assessment of the LANSS-J by two investigators. We used the intraclass correlation coefficient (ICC) for the total score and Cohen's kappa coefficient for each item. Data for patients with neuropathic pain (n = 30) and those with nociceptive pain (n = 29) were analyzed. With a cutoff of 12, the sensitivity was 63.3% (19/30) and the specificity 93.1% (27/29). Sensitivity improved substantially with a cutoff of ≤ 11 (≥ 83.3%, 25/30). High specificity (93.1%, 27/29) was sustained with a cutoff of 9 to 12. The ICC for the total score was 0.85, indicating sufficient agreement. Kappa coefficients ranged from 0.68 to 0.84. The LANSS-J is a valid screening tool for detecting neuropathic pain. Our results suggest that employing the original cutoff value provides high specificity, although a lower cutoff value of 10 or 11 (with its high specificity maintained) may be more beneficial when pain attributed to neuropathic mechanisms is suspected in Japanese patients. © 2016 World Institute of Pain.
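    The per-item agreement reported above uses Cohen's kappa, which corrects raw agreement for agreement expected by chance; a sketch with invented ratings (the `cohens_kappa` helper is illustrative, not the authors' code):

    ```python
    # Cohen's kappa for two raters scoring the same items:
    # kappa = (observed agreement - expected agreement) / (1 - expected agreement),
    # where expected agreement comes from each rater's marginal label frequencies.

    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        """Chance-corrected agreement between two raters over the same items."""
        n = len(rater_a)
        observed = sum(1 for a, b in zip(rater_a, rater_b) if a == b) / n
        ca, cb = Counter(rater_a), Counter(rater_b)
        expected = sum(ca[k] * cb[k] for k in set(ca) | set(cb)) / n ** 2
        return (observed - expected) / (1 - expected)

    # Two raters scoring the same 8 binary items (invented data):
    a = [1, 1, 0, 1, 0, 0, 1, 0]
    b = [1, 1, 0, 1, 0, 1, 1, 0]
    print(cohens_kappa(a, b))  # 0.75
    ```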

  14. National Statistical Commission and Indian Official Statistics*

    Indian Academy of Sciences (India)

    IAS Admin

    a good collection of official statistics of that time. With more .... statistical agencies and institutions to provide details of statistical activities .... ing several training programmes. .... ful completion of Indian Statistical Service examinations, the.

  15. Current developments in the study of association between qualitative variables.

    Directory of Open Access Journals (Sweden)

    C Antonelli

    1996-02-01

    Full Text Available The most common procedure for analyzing contingency table data is the chi-square statistic. The early development of chi-square analysis of contingency tables is credited to Pearson (1904) and Fisher (1929), and it was subsequently expanded by Yates, Mantel-Haenszel, etc. In this paper some developments of the chi-square statistic are outlined, particularly as a statistical test for the null hypothesis of independence, for subdividing contingency tables, and as a chi-square test for linear association between ordinal variables.
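    As a concrete illustration of the statistic the paper discusses, here is a minimal Pearson chi-square computation for a contingency table; the `chi_square_statistic` helper and the 2x2 table are invented for demonstration:

    ```python
    # Pearson's chi-square statistic for a contingency table, testing the
    # null hypothesis of independence: sum over cells of (O - E)^2 / E,
    # where E = (row total * column total) / grand total.

    def chi_square_statistic(table):
        """Compute Pearson's chi-square for a list-of-lists contingency table."""
        row_totals = [sum(row) for row in table]
        col_totals = [sum(col) for col in zip(*table)]
        grand_total = sum(row_totals)
        chi2 = 0.0
        for i, row in enumerate(table):
            for j, observed in enumerate(row):
                expected = row_totals[i] * col_totals[j] / grand_total
                chi2 += (observed - expected) ** 2 / expected
        return chi2

    observed = [[30, 10],
                [20, 40]]
    print(round(chi_square_statistic(observed), 3))  # 16.667
    ```

    The statistic is then compared against a chi-square distribution with (rows - 1)(columns - 1) degrees of freedom, here 1.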

  16. [Characteristics of Hospitalized Mentally ill Spanish Migrants in Germany - Results of a Statistical Reanalysis].

    Science.gov (United States)

    Valdés-Stauber, J; Valdés-Stauber, M A

    2015-10-01

    To draft a clinical profile of mentally ill first-generation Spanish immigrants in Germany treated in a special setting in their native language, and to identify possible correlations between the time of onset of a mental disorder and migration, as well as between degree of utilization and clinical and care variables. Statistical reanalysis of individual data (n = 100) from a previously published descriptive study with aggregated data corresponding to 15 variables. Correlations are calculated using the chi-square test as well as Fisher's exact test. Multivariate regression and logistic models were conducted. In addition to the explained variance of the models (R(2)), analyses of residuals as well as post-hoc power analyses (1-β) were performed. A quarter of the sample (26%) was mentally ill before migration; most of the patients received treatment very late (about 10 years after onset) and became chronically ill. Half of the sample shows relevant somatic comorbidity and long average inpatient stays (54 days). In 16% of treated cases, repatriation had to be organized. The degree of chronicity correlates with mental illness prior to migration. Severe mood disorders and psychoses occur long after migration; addictions and neurotic disorders are equally distributed over time. Migration cannot be set in a causal relationship with the development of mental disorders, although there is a positive correlation between affective disorders and the duration of the migration status. Chronicity is related to onset of the disease before migration. The sample is relatively homogeneous (one nationality, first generation), but loses epidemiological representativeness (not related to a catchment area). © Georg Thieme Verlag KG Stuttgart · New York.

  17. Power, effects, confidence, and significance: an investigation of statistical practices in nursing research.

    Science.gov (United States)

    Gaskin, Cadeyrn J; Happell, Brenda

    2014-05-01

    To (a) assess the statistical power of nursing research to detect small, medium, and large effect sizes; (b) estimate the experiment-wise Type I error rate in these studies; and (c) assess the extent to which (i) a priori power analyses, (ii) effect sizes (and interpretations thereof), and (iii) confidence intervals were reported. Statistical review. Papers published in the 2011 volumes of the 10 highest ranked nursing journals, based on their 5-year impact factors. Papers were assessed for statistical power, control of experiment-wise Type I error, reporting of a priori power analyses, reporting and interpretation of effect sizes, and reporting of confidence intervals. The analyses were based on 333 papers, from which 10,337 inferential statistics were identified. The median power to detect small, medium, and large effect sizes was .40 (interquartile range [IQR]=.24-.71), .98 (IQR=.85-1.00), and 1.00 (IQR=1.00-1.00), respectively. The median experiment-wise Type I error rate was .54 (IQR=.26-.80). A priori power analyses were reported in 28% of papers. Effect sizes were routinely reported for Spearman's rank correlations (100% of papers in which this test was used), Poisson regressions (100%), odds ratios (100%), Kendall's tau correlations (100%), Pearson's correlations (99%), logistic regressions (98%), structural equation modelling/confirmatory factor analyses/path analyses (97%), and linear regressions (83%), but were reported less often for two-proportion z tests (50%), analyses of variance/analyses of covariance/multivariate analyses of variance (18%), t tests (8%), Wilcoxon's tests (8%), Chi-squared tests (8%), and Fisher's exact tests (7%), and not reported for sign tests, Friedman's tests, McNemar's tests, multi-level models, and Kruskal-Wallis tests. Effect sizes were infrequently interpreted. Confidence intervals were reported in 28% of papers. The use, reporting, and interpretation of inferential statistics in nursing research need substantial
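    The experiment-wise (family-wise) Type I error rates reported above follow from a standard calculation: running m independent tests each at per-test level alpha inflates the probability of at least one false positive. A minimal sketch (the 15-test example is illustrative, chosen only because it lands near the reported median of .54):

    ```python
    # Experiment-wise (family-wise) Type I error rate for m independent
    # hypothesis tests each run at significance level alpha:
    # P(at least one Type I error) = 1 - (1 - alpha)^m.

    def experimentwise_error(alpha, m):
        """Probability of at least one Type I error across m independent tests."""
        return 1 - (1 - alpha) ** m

    # A paper running 15 tests at alpha = .05 already has a family-wise
    # rate of about .54 under independence:
    print(round(experimentwise_error(0.05, 15), 2))  # 0.54
    ```

    This is also the motivation for corrections such as Bonferroni, which divides alpha by m to hold the family-wise rate near the nominal level.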

  18. Lessons Learned from the 200 West Pump and Treatment Facility Construction Project at the US DOE Hanford Site - A Leadership for Energy and Environmental Design (LEED) Gold-Certified Facility - 13113

    Energy Technology Data Exchange (ETDEWEB)

    Dorr, Kent A.; Freeman-Pollard, Jhivaun R.; Ostrom, Michael J. [CH2M HILL Plateau Remediation Company, P.O. Box 1600, MSIN R4-41, 99352 (United States)

    2013-07-01

    CH2M Hill Plateau Remediation Company (CHPRC) designed, constructed, commissioned, and began operation of the largest groundwater pump and treatment facility in the U.S. Department of Energy's (DOE) nationwide complex. This one-of-a-kind groundwater pump and treatment facility, located at the Hanford Nuclear Reservation Site (Hanford Site) in Washington State, was built to an accelerated schedule with American Recovery and Reinvestment Act (ARRA) funds. There were many contractual, technical, configuration management, quality, safety, and Leadership in Energy and Environmental Design (LEED) challenges associated with the design, procurement, construction, and commissioning of this $95 million, 52,000 ft groundwater pump and treatment facility to meet DOE's mission objective of treating contaminated groundwater at the Hanford Site with a new facility by June 28, 2012. The project team's successful integration of the project's core values and green energy technology throughout design, procurement, construction, and start-up of this complex, first-of-its-kind Bio Process facility resulted in successful achievement of DOE's mission objective, as well as attainment of LEED GOLD certification (Figure 1), which makes this Bio Process facility the first non-administrative building in the DOE Office of Environmental Management complex to earn such an award. (authors)

  19. Lessons Learned from the 200 West Pump and Treatment Facility Construction Project at the US DOE Hanford Site - A Leadership for Energy and Environmental Design (LEED) Gold-Certified Facility - 13113

    International Nuclear Information System (INIS)

    Dorr, Kent A.; Freeman-Pollard, Jhivaun R.; Ostrom, Michael J.

    2013-01-01

    CH2M Hill Plateau Remediation Company (CHPRC) designed, constructed, commissioned, and began operation of the largest groundwater pump and treatment facility in the U.S. Department of Energy's (DOE) nationwide complex. This one-of-a-kind groundwater pump and treatment facility, located at the Hanford Nuclear Reservation Site (Hanford Site) in Washington State, was built to an accelerated schedule with American Recovery and Reinvestment Act (ARRA) funds. There were many contractual, technical, configuration management, quality, safety, and Leadership in Energy and Environmental Design (LEED) challenges associated with the design, procurement, construction, and commissioning of this $95 million, 52,000 ft groundwater pump and treatment facility to meet DOE's mission objective of treating contaminated groundwater at the Hanford Site with a new facility by June 28, 2012. The project team's successful integration of the project's core values and green energy technology throughout design, procurement, construction, and start-up of this complex, first-of-its-kind Bio Process facility resulted in successful achievement of DOE's mission objective, as well as attainment of LEED GOLD certification (Figure 1), which makes this Bio Process facility the first non-administrative building in the DOE Office of Environmental Management complex to earn such an award. (authors)

  20. Lessons Learned from the 200 West Pump and Treatment Facility Construction Project at the US DOE Hanford Site - A Leadership for Energy and Environmental Design (LEED) Gold-Certified Facility

    Energy Technology Data Exchange (ETDEWEB)

    Dorr, Kent A.; Ostrom, Michael J.; Freeman-Pollard, Jhivaun R.

    2013-01-11

    CH2M Hill Plateau Remediation Company (CHPRC) designed, constructed, commissioned, and began operation of the largest groundwater pump and treatment facility in the U.S. Department of Energy’s (DOE) nationwide complex. This one-of-a-kind groundwater pump and treatment facility, located at the Hanford Nuclear Reservation Site (Hanford Site) in Washington State, was built to an accelerated schedule with American Recovery and Reinvestment Act (ARRA) funds. There were many contractual, technical, configuration management, quality, safety, and Leadership in Energy and Environmental Design (LEED) challenges associated with the design, procurement, construction, and commissioning of this $95 million, 52,000 ft groundwater pump and treatment facility to meet DOE’s mission objective of treating contaminated groundwater at the Hanford Site with a new facility by June 28, 2012. The project team’s successful integration of the project’s core values and green energy technology throughout design, procurement, construction, and start-up of this complex, first-of-its-kind Bio Process facility resulted in successful achievement of DOE’s mission objective, as well as attainment of LEED GOLD certification, which makes this Bio Process facility the first non-administrative building in the DOE Office of Environmental Management complex to earn such an award.

  1. Childhood Cancer Statistics

    Science.gov (United States)

    Childhood Cancer Statistics – Graphs and Infographics: Number of Diagnoses, Incidence Rates ...

  2. MQSA National Statistics

    Science.gov (United States)

    ... Standards Act and Program; MQSA Insights; MQSA National Statistics ... but should level off with time. Archived Scorecard Statistics: 2018, 2017, 2016 ...

  3. State Transportation Statistics 2014

    Science.gov (United States)

    2014-12-15

    The Bureau of Transportation Statistics (BTS) presents State Transportation Statistics 2014, a statistical profile of transportation in the 50 states and the District of Columbia. This is the 12th annual edition of State Transportation Statistics, a ...

  4. Renyi statistics in equilibrium statistical mechanics

    International Nuclear Information System (INIS)

    Parvan, A.S.; Biro, T.S.

    2010-01-01

    The Renyi statistics in the canonical and microcanonical ensembles is examined, both in general and in particular for the ideal gas. In the microcanonical ensemble the Renyi statistics is equivalent to the Boltzmann-Gibbs statistics. Using the exact analytical results for the ideal gas, it is shown that in the canonical ensemble, in the thermodynamic limit, the Renyi statistics is also equivalent to the Boltzmann-Gibbs statistics. Furthermore, it satisfies the requirements of equilibrium thermodynamics, i.e. the thermodynamic potential of the statistical ensemble is a first-degree homogeneous function of its extensive variables of state. We conclude that the Renyi statistics arrives at the same thermodynamic relations as those stemming from the Boltzmann-Gibbs statistics in this limit.
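    For reference, the Renyi entropy that underlies these statistics, and its reduction to the Boltzmann-Gibbs (Shannon) form as the Renyi parameter q → 1 (a standard identity, not a result of the paper itself), for probabilities p_i:

    ```latex
    S^{R}_{q} = \frac{1}{1-q}\,\ln\!\Big(\sum_{i} p_{i}^{\,q}\Big),
    \qquad
    \lim_{q \to 1} S^{R}_{q} = -\sum_{i} p_{i}\ln p_{i} = S^{BG}
    ```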

  5. Sampling, Probability Models and Statistical Reasoning: Statistical Inference

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 1; Issue 5. Sampling, Probability Models and Statistical Reasoning Statistical Inference. Mohan Delampady V R Padmawar. General Article Volume 1 Issue 5 May 1996 pp 49-58 ...

  6. antibodies against Herpes simplex virus (HSV)

    African Journals Online (AJOL)

    Chi-square analysis was used to determine the association of infection with ... tibody. No statistical association existed between the prevalence of HSV-1&-2 IgG antibodies and the socio-demographic variables ... concern, established by the widespread of genital HSV .... Chi-square test was employed to define relationships.

  7. Spatial and statistical methods for correlating the interaction between groundwater contamination and tap water exposure in karst regions

    Science.gov (United States)

    Padilla, I. Y.; Rivera, V. L.; Macchiavelli, R. E.; Torres Torres, N. I.

    2016-12-01

    Groundwater systems in karst regions are highly vulnerable to contamination and have an enormous capacity to store and rapidly convey pollutants to potential exposure zones over long periods of time. Contaminants in karst aquifers used for drinking water purposes can, therefore, enter distribution lines and the tap water point of use. This study applies spatial and statistical analytical methods to assess potential correlations between contaminants in a karst groundwater system in northern Puerto Rico and exposure in the tap water. It focuses on chlorinated volatile organic compounds (CVOC) and phthalates because of their ubiquitous presence in the environment and their potential public health impacts. The work integrates historical data collected from regulatory agencies and current field measurements involving groundwater and tap water sampling and analysis. Contaminant distribution and cluster analyses are performed with Geographic Information System technology. Correlations between detection frequencies and contaminant concentrations in source groundwater and the tap water point of use are assessed using Pearson's chi-square and t-test analyses. Although results indicate that correlations are contaminant-specific, detection frequencies are generally higher for total CVOC in groundwater than tap water samples, but greater for phthalates in tap water than groundwater samples. Spatial analysis shows widespread distribution of CVOC and phthalates in both groundwater and tap water, suggesting that contamination comes from multiple sources. Spatial correlation analysis indicates that association between tap water and groundwater contamination depends on the source and type of contaminants, spatial location, and time. A full description of the correlations may, however, need to take into consideration variable anthropogenic interventions.
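    The t-test mentioned above for comparing concentrations between source groundwater and tap water can be sketched generically; this pooled-variance two-sample form and the invented values are illustrative only, not the study's actual data or tooling:

    ```python
    # Pooled-variance (equal-variance) two-sample t statistic:
    # t = (mean_x - mean_y) / sqrt(s_p^2 * (1/n_x + 1/n_y)),
    # where s_p^2 pools the two sample variances.

    import math

    def two_sample_t(x, y):
        """Two-sample t statistic with pooled variance."""
        nx, ny = len(x), len(y)
        mx, my = sum(x) / nx, sum(y) / ny
        vx = sum((v - mx) ** 2 for v in x) / (nx - 1)
        vy = sum((v - my) ** 2 for v in y) / (ny - 1)
        sp2 = ((nx - 1) * vx + (ny - 1) * vy) / (nx + ny - 2)
        return (mx - my) / math.sqrt(sp2 * (1 / nx + 1 / ny))

    # Invented concentration samples (e.g. ug/L) for two sources:
    groundwater = [1.0, 2.0, 3.0]
    tap_water   = [2.0, 3.0, 4.0]
    print(round(two_sample_t(groundwater, tap_water), 3))  # -1.225
    ```

    The statistic is referred to a t distribution with n_x + n_y - 2 degrees of freedom to obtain a p-value.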

  8. The foundations of statistics

    CERN Document Server

    Savage, Leonard J

    1972-01-01

    Classic analysis of the foundations of statistics and development of personal probability, one of the greatest controversies in modern statistical thought. Revised edition. Calculus, probability, statistics, and Boolean algebra are recommended.

  9. State Transportation Statistics 2010

    Science.gov (United States)

    2011-09-14

    The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2010, a statistical profile of transportation in the 50 states and the District of Col...

  10. State Transportation Statistics 2012

    Science.gov (United States)

    2013-08-15

    The Bureau of Transportation Statistics (BTS), a part of the U.S. Department of Transportation's (USDOT) Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2012, a statistical profile of transportation ...

  11. Adrenal Gland Tumors: Statistics

    Science.gov (United States)

    Adrenal Gland Tumor: Statistics. Approved by the Cancer.Net Editorial Board, 03/ ... primary adrenal gland tumor is very uncommon. Exact statistics are not available for this type of tumor ...

  12. State transportation statistics 2009

    Science.gov (United States)

    2009-01-01

    The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2009, a statistical profile of transportation in the 50 states and the District ...

  13. State Transportation Statistics 2011

    Science.gov (United States)

    2012-08-08

    The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2011, a statistical profile of transportation in the 50 states and the District of Col...

  14. Neuroendocrine Tumor: Statistics

    Science.gov (United States)

    Neuroendocrine Tumor: Statistics. Approved by the Cancer.Net Editorial Board, 01/ ... the body. It is important to remember that statistics on the survival rates for people with a ...

  15. State Transportation Statistics 2013

    Science.gov (United States)

    2014-09-19

    The Bureau of Transportation Statistics (BTS), a part of the U.S. Department of Transportation's (USDOT) Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2013, a statistical profile of transportatio...

  16. BTS statistical standards manual

    Science.gov (United States)

    2005-10-01

    The Bureau of Transportation Statistics (BTS), like other federal statistical agencies, establishes professional standards to guide the methods and procedures for the collection, processing, storage, and presentation of statistical data. Standards an...

  17. Statistical Examination of the Resolution of a Block-Scale Urban Drainage Model

    Science.gov (United States)

    Goldstein, A.; Montalto, F. A.; Digiovanni, K. A.

    2009-12-01

    Stormwater drainage models are used by cities to plan retention systems, prevent combined sewage overflows, and design for development. These models aggregate subcatchments and ignore small pipelines, providing a coarse representation of the sewage network. This study evaluates the importance of resolution by comparing two models, developed at the neighborhood scale, that predict the total quantity and peak flow of runoff against runoff observed at the site. The low and high resolution models were designed for a 2.6 ha block in the Bronx, NYC in the EPA Stormwater Management Model (SWMM), using a single catchment and separate subcatchments based on surface cover, respectively. The surface covers represented included sidewalks, streets, buildings, and backyards. Characteristics of the physical surfaces and the infrastructure in the high resolution model were determined from site visits, sewer pipe maps, aerial photographs, and GIS datasets provided by the NYC Department of City Planning. Since the low resolution model was depicted at a coarser scale, generalizations were assumed about the overall average characteristics of the catchment. Rainfall and runoff data were monitored over a four-month period during the summer rainy season. A total of 53 rainfall events were recorded, but only 29 storms produced enough runoff to be evaluated in the simulations. To determine which model was more accurate at predicting the observed runoff, three characteristics of each storm were compared: peak runoff, total runoff, and time to peak. Two statistical tests were used to determine the significance of the results: the percent difference for each storm and the overall Chi-squared goodness-of-fit statistic for both the low and high resolution models. These tests evaluate whether there is a statistically significant difference depending on the resolution of the stormwater model. 
The scale of representation is being evaluated because it could have a profound impact on
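    The two comparisons named in this record, per-storm percent difference and a Pearson chi-squared goodness-of-fit statistic over all storms, can be sketched in a few lines of Python. The storm volumes below are invented for illustration; they are not the study's data.

```python
def percent_difference(observed, modeled):
    """Percent difference of a modeled value relative to the observed value."""
    return 100.0 * (modeled - observed) / observed

def chi_square_gof(observed, expected):
    """Pearson chi-squared goodness-of-fit statistic: sum((O - E)^2 / E)."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical total-runoff volumes (m^3) for five storms.
observed = [12.0, 30.5, 8.2, 45.0, 19.9]
low_res  = [15.1, 28.0, 10.0, 40.2, 22.5]   # single-catchment (low resolution) model
high_res = [12.8, 30.0, 8.9, 44.1, 20.6]    # per-surface-cover (high resolution) model

# A smaller statistic means a closer fit to the observed runoff.
chi2_low = chi_square_gof(observed, low_res)
chi2_high = chi_square_gof(observed, high_res)
print(round(chi2_low, 3), round(chi2_high, 3))
```

    Comparing each statistic against a chi-squared critical value (here with four degrees of freedom, one per storm minus one) would then indicate whether a model's deviation from the observations is statistically significant.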

  18. Adrenal and nephrogenic hypertension: an image quality study of low tube voltage, low-concentration contrast media combined with adaptive statistical iterative reconstruction.

    Science.gov (United States)

    Li, Zhen; Li, Qiong; Shen, Yaqi; Li, Anqin; Li, Haojie; Liang, Lili; Hu, Yao; Hu, Xuemei; Hu, Daoyu

    2016-09-01

    The aim of this study was to investigate the effect of using low tube voltage, low-concentration contrast media and adaptive statistical iterative reconstruction (ASIR) for reducing the radiation and iodine contrast doses in adrenal and nephrogenic hypertension patients. A total of 148 hypertension patients with suspected adrenal lesions or renal artery stenoses were assigned to two groups. Group A (n=74) underwent a low tube voltage, low molecular weight dextran enhanced multi-detector row spiral CT (MDCT) (80 kVp, 270 mg I/mL contrast agent), and the raw data were reconstructed with standard filtered back projection (FBP) and with ASIR at four levels of blending (20%, 40%, 60% and 80%, respectively). The control group (Group B, n=74) underwent conventional MDCT (120 kVp, 370 mg I/mL contrast agent), and the data were reconstructed with FBP. The CT values, standard deviation (SD), signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR) were measured in the renal vessels, normal adrenal tissue, adrenal neoplasms and subcutaneous fat. The volume CT dose index (CTDIvol) and dose length product (DLP) were recorded, and an effective dose (ED) was obtained. Two-tailed independent t-tests, paired Chi-square tests and Kappa consistency tests were used for statistical analysis of the data. The CTDIvol, DLP and total iodine dose in group A were decreased by 47.8%, 49.0% and 26.07%, respectively, compared to group B. Low-concentration contrast media with 60% ASIR provides similar enhancement and image quality with a reduced radiation dose and contrast iodine dose. © 2016 John Wiley & Sons Ltd.
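    Of the tests listed in this record, the Kappa consistency test is the least routine to compute by hand. A minimal pure-Python sketch of Cohen's kappa follows, using an invented 2x2 agreement table rather than the study's data:

```python
def cohens_kappa(table):
    """Cohen's kappa from a square agreement table (rows: rater 1, cols: rater 2)."""
    n = sum(sum(row) for row in table)
    # Observed agreement: fraction of cases on the diagonal.
    p_observed = sum(table[i][i] for i in range(len(table))) / n
    # Expected agreement under chance, from the marginal totals.
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    p_expected = sum(r * c for r, c in zip(row_totals, col_totals)) / n ** 2
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical table: two readers rating 80 images as acceptable/unacceptable.
table = [[40, 5],
         [4, 31]]
print(round(cohens_kappa(table), 3))  # a kappa near 0.77 indicates substantial agreement
```

    Kappa corrects the raw percent agreement for the agreement expected by chance, which is why it is preferred over simple accuracy for consistency checks between readers.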

  19. Statistics in Schools

    Science.gov (United States)

    Information Statistics in Schools Educate your students about the value and everyday use of statistics. The Statistics in Schools program provides resources for teaching and learning with real life data. Explore the site for standards-aligned, classroom-ready activities. Statistics in Schools Math Activities History

  20. Transport Statistics - Transport - UNECE

    Science.gov (United States)

    Sustainable Energy Statistics Trade Transport Themes UNECE and the SDGs Climate Change Gender Ideas 4 Change UNECE Weekly Videos UNECE Transport Areas of Work Transport Statistics Transport Transport Statistics About us Terms of Reference Meetings and Events Meetings Working Party on Transport Statistics (WP.6

  1. Generalized quantum statistics

    International Nuclear Information System (INIS)

    Chou, C.

    1992-01-01

    In the paper, a non-anyonic generalization of quantum statistics is presented, in which Fermi-Dirac statistics (FDS) and Bose-Einstein statistics (BES) appear as two special cases. The new quantum statistics, which is characterized by the dimension of its single particle Fock space, contains three consistent parts, namely the generalized bilinear quantization, the generalized quantum mechanical description and the corresponding statistical mechanics

  2. National Statistical Commission and Indian Official Statistics

    Indian Academy of Sciences (India)

    Author Affiliations. T J Rao1. C. R. Rao Advanced Institute of Mathematics, Statistics and Computer Science (AIMSCS) University of Hyderabad Campus Central University Post Office, Prof. C. R. Rao Road Hyderabad 500 046, AP, India.

  3. Statistics For Dummies

    CERN Document Server

    Rumsey, Deborah

    2011-01-01

    The fun and easy way to get down to business with statistics Stymied by statistics? No fear ? this friendly guide offers clear, practical explanations of statistical ideas, techniques, formulas, and calculations, with lots of examples that show you how these concepts apply to your everyday life. Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more.Tracks to a typical first semester statistics cou

  4. Industrial statistics with Minitab

    CERN Document Server

    Cintas, Pere Grima; Llabres, Xavier Tort-Martorell

    2012-01-01

    Industrial Statistics with MINITAB demonstrates the use of MINITAB as a tool for performing statistical analysis in an industrial context. This book covers introductory industrial statistics, exploring the most commonly used techniques alongside those that serve to give an overview of more complex issues. A plethora of examples in MINITAB are featured along with case studies for each of the statistical techniques presented. Industrial Statistics with MINITAB: Provides comprehensive coverage of user-friendly practical guidance to the essential statistical methods applied in industry.Explores

  5. Assessing the Effectiveness of Statistical Classification Techniques in Predicting Future Employment of Participants in the Temporary Assistance for Needy Families Program

    Science.gov (United States)

    Montoya, Isaac D.

    2008-01-01

    Three classification techniques (Chi-square Automatic Interaction Detection [CHAID], Classification and Regression Tree [CART], and discriminant analysis) were tested to determine their accuracy in predicting Temporary Assistance for Needy Families program recipients' future employment. Technique evaluation was based on proportion of correctly…

  6. Recreational Boating Statistics 2012

    Data.gov (United States)

    Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...

  7. Recreational Boating Statistics 2013

    Data.gov (United States)

    Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...

  8. Statistical data analysis handbook

    National Research Council Canada - National Science Library

    Wall, Francis J

    1986-01-01

    It must be emphasized that this is not a text book on statistics. Instead it is a working tool that presents data analysis in clear, concise terms which can be readily understood even by those without formal training in statistics...

  9. CMS Program Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — The CMS Office of Enterprise Data and Analytics has developed CMS Program Statistics, which includes detailed summary statistics on national health care, Medicare...

  10. Recreational Boating Statistics 2011

    Data.gov (United States)

    Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...

  11. Uterine Cancer Statistics

    Science.gov (United States)

    ... Doing AMIGAS Stay Informed Cancer Home Uterine Cancer Statistics Language: English (US) Español (Spanish) Recommend on Facebook ... the most commonly diagnosed gynecologic cancer. U.S. Cancer Statistics Data Visualizations Tool The Data Visualizations tool makes ...

  12. Tuberculosis Data and Statistics

    Science.gov (United States)

    ... Advisory Groups Federal TB Task Force Data and Statistics Language: English (US) Español (Spanish) Recommend on Facebook ... Set) Mortality and Morbidity Weekly Reports Data and Statistics Decrease in Reported Tuberculosis Cases MMWR 2010; 59 ( ...

  13. National transportation statistics 2011

    Science.gov (United States)

    2011-04-01

    Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety reco...

  14. National Transportation Statistics 2008

    Science.gov (United States)

    2009-01-08

    Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record...

  15. Mental Illness Statistics

    Science.gov (United States)

    ... News & Events About Us Home > Health Information Share Statistics Research shows that mental illnesses are common in ... of mental illnesses, such as suicide and disability. Statistics Top ı cs Mental Illness Any Anxiety Disorder ...

  16. School Violence: Data & Statistics

    Science.gov (United States)

    ... Social Media Publications Injury Center School Violence: Data & Statistics Recommend on Facebook Tweet Share Compartir The first ... Vehicle Safety Traumatic Brain Injury Injury Response Data & Statistics (WISQARS) Funded Programs Press Room Social Media Publications ...

  17. Caregiver Statistics: Demographics

    Science.gov (United States)

    ... You are here Home Selected Long-Term Care Statistics Order this publication Printer-friendly version What is ... needs and services are wide-ranging and complex, statistics may vary from study to study. Sources for ...

  18. Aortic Aneurysm Statistics

    Science.gov (United States)

    ... Summary Coverdell Program 2012-2015 State Summaries Data & Statistics Fact Sheets Heart Disease and Stroke Fact Sheets ... Roadmap for State Planning Other Data Resources Other Statistic Resources Grantee Information Cross-Program Information Online Tools ...

  19. Alcohol Facts and Statistics

    Science.gov (United States)

    ... Standard Drink? Drinking Levels Defined Alcohol Facts and Statistics Print version Alcohol Use in the United States: ... 1238–1245, 2004. PMID: 15010446 National Center for Statistics and Analysis. 2014 Crash Data Key Findings (Traffic ...

  20. National Transportation Statistics 2009

    Science.gov (United States)

    2010-01-21

    Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record, ...

  1. National transportation statistics 2010

    Science.gov (United States)

    2010-01-01

    National Transportation Statistics presents statistics on the U.S. transportation system, including its physical components, safety record, economic performance, the human and natural environment, and national security. This is a large online documen...

  2. Statistics for Finance

    DEFF Research Database (Denmark)

    Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard

    Statistics for Finance develops students’ professional skills in statistics with applications in finance. Developed from the authors’ courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics...

  3. Principles of applied statistics

    National Research Council Canada - National Science Library

    Cox, D. R; Donnelly, Christl A

    2011-01-01

    .... David Cox and Christl Donnelly distil decades of scientific experience into usable principles for the successful application of statistics, showing how good statistical strategy shapes every stage of an investigation...

  4. Applying contemporary statistical techniques

    CERN Document Server

    Wilcox, Rand R

    2003-01-01

    Applying Contemporary Statistical Techniques explains why traditional statistical methods are often inadequate or outdated when applied to modern problems. Wilcox demonstrates how new and more powerful techniques address these problems far more effectively, making these modern robust methods understandable, practical, and easily accessible.* Assumes no previous training in statistics * Explains how and why modern statistical methods provide more accurate results than conventional methods* Covers the latest developments on multiple comparisons * Includes recent advanc

  5. Interactive statistics with ILLMO

    NARCIS (Netherlands)

    Martens, J.B.O.S.

    2014-01-01

    Progress in empirical research relies on adequate statistical analysis and reporting. This article proposes an alternative approach to statistical modeling that is based on an old but mostly forgotten idea, namely Thurstone modeling. Traditional statistical methods assume that either the measured

  6. Ethics in Statistics

    Science.gov (United States)

    Lenard, Christopher; McCarthy, Sally; Mills, Terence

    2014-01-01

    There are many different aspects of statistics. Statistics involves mathematics, computing, and applications to almost every field of endeavour. Each aspect provides an opportunity to spark someone's interest in the subject. In this paper we discuss some ethical aspects of statistics, and describe how an introduction to ethics has been…

  7. Youth Sports Safety Statistics

    Science.gov (United States)

    ... 6):794-799. 31 American Heart Association. CPR statistics. www.heart.org/HEARTORG/CPRAndECC/WhatisCPR/CPRFactsandStats/CPR%20Statistics_ ... Mental Health Services Administration, Center for Behavioral Health Statistics and Quality. (January 10, 2013). The DAWN Report: ...

  8. Impact of Students' Financial Strength on their Academic Performance

    African Journals Online (AJOL)

    Toshiba

    The statistical tests we applied are Chi-square, Phi ..... advise/encourage looking for finance through the most adequate source .... The data collected was subjected to the statistical techniques described .... Bivariate and Multivariate Analysis.

  9. Statistics for Research

    CERN Document Server

    Dowdy, Shirley; Chilko, Daniel

    2011-01-01

    Praise for the Second Edition "Statistics for Research has other fine qualities besides superior organization. The examples and the statistical methods are laid out with unusual clarity by the simple device of using special formats for each. The book was written with great care and is extremely user-friendly."-The UMAP Journal Although the goals and procedures of statistical research have changed little since the Second Edition of Statistics for Research was published, the almost universal availability of personal computers and statistical computing application packages have made it possible f

  10. Statistics in a nutshell

    CERN Document Server

    Boslaugh, Sarah

    2013-01-01

    Need to learn statistics for your job? Want help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference for anyone new to the subject. Thoroughly revised and expanded, this edition helps you gain a solid understanding of statistics without the numbing complexity of many college texts. Each chapter presents easy-to-follow descriptions, along with graphics, formulas, solved examples, and hands-on exercises. If you want to perform common statistical analyses and learn a wide range of techniques without getting in over your head, this is your book.

  11. Statistics & probability for dummies

    CERN Document Server

    Rumsey, Deborah J

    2013-01-01

    Two complete eBooks for one low price! Created and compiled by the publisher, this Statistics I and Statistics II bundle brings together two math titles in one, e-only bundle. With this special bundle, you'll get the complete text of the following two titles: Statistics For Dummies, 2nd Edition  Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tra

  12. Nonparametric statistical inference

    CERN Document Server

    Gibbons, Jean Dickinson

    2010-01-01

    Overall, this remains a very fine book suitable for a graduate-level course in nonparametric statistics. I recommend it for all people interested in learning the basic ideas of nonparametric statistical inference.-Eugenia Stoimenova, Journal of Applied Statistics, June 2012… one of the best books available for a graduate (or advanced undergraduate) text for a theory course on nonparametric statistics. … a very well-written and organized book on nonparametric statistics, especially useful and recommended for teachers and graduate students.-Biometrics, 67, September 2011This excellently presente

  13. Business statistics for dummies

    CERN Document Server

    Anderson, Alan

    2013-01-01

    Score higher in your business statistics course? Easy. Business statistics is a common course for business majors and MBA candidates. It examines common data sets and the proper way to use such information when conducting research and producing informational reports such as profit and loss statements, customer satisfaction surveys, and peer comparisons. Business Statistics For Dummies tracks to a typical business statistics course offered at the undergraduate and graduate levels and provides clear, practical explanations of business statistical ideas, techniques, formulas, and calculations, w

  14. Lectures on algebraic statistics

    CERN Document Server

    Drton, Mathias; Sullivant, Seth

    2009-01-01

    How does an algebraic geometer studying secant varieties further the understanding of hypothesis tests in statistics? Why would a statistician working on factor analysis raise open problems about determinantal varieties? Connections of this type are at the heart of the new field of "algebraic statistics". In this field, mathematicians and statisticians come together to solve statistical inference problems using concepts from algebraic geometry as well as related computational and combinatorial techniques. The goal of these lectures is to introduce newcomers from the different camps to algebraic statistics. The introduction will be centered around the following three observations: many important statistical models correspond to algebraic or semi-algebraic sets of parameters; the geometry of these parameter spaces determines the behaviour of widely used statistical inference procedures; computational algebraic geometry can be used to study parameter spaces and other features of statistical models.

  15. Statistics for economics

    CERN Document Server

    Naghshpour, Shahdad

    2012-01-01

    Statistics is the branch of mathematics that deals with real-life problems. As such, it is an essential tool for economists. Unfortunately, the way you and many other economists learn the concept of statistics is not compatible with the way economists think and learn. The problem is worsened by the use of mathematical jargon and complex derivations. Here's a book that proves none of this is necessary. All the examples and exercises in this book are constructed within the field of economics, thus eliminating the difficulty of learning statistics with examples from fields that have no relation to business, politics, or policy. Statistics is, in fact, not more difficult than economics. Anyone who can comprehend economics can understand and use statistics successfully within this field, including you! This book utilizes Microsoft Excel to obtain statistical results, as well as to perform additional necessary computations. Microsoft Excel is not the software of choice for performing sophisticated statistical analy...

  16. Baseline Statistics of Linked Statistical Data

    NARCIS (Netherlands)

    Scharnhorst, Andrea; Meroño-Peñuela, Albert; Guéret, Christophe

    2014-01-01

    We are surrounded by an ever increasing ocean of information, everybody will agree to that. We build sophisticated strategies to govern this information: design data models, develop infrastructures for data sharing, building tool for data analysis. Statistical datasets curated by National

  17. Statistical Physics An Introduction

    CERN Document Server

    Yoshioka, Daijiro

    2007-01-01

    This book provides a comprehensive presentation of the basics of statistical physics. The first part explains the essence of statistical physics and how it provides a bridge between microscopic and macroscopic phenomena, allowing one to derive quantities such as entropy. Here the author avoids going into details such as Liouville’s theorem or the ergodic theorem, which are difficult for beginners and unnecessary for the actual application of the statistical mechanics. In the second part, statistical mechanics is applied to various systems which, although they look different, share the same mathematical structure. In this way readers can deepen their understanding of statistical physics. The book also features applications to quantum dynamics, thermodynamics, the Ising model and the statistical dynamics of free spins.

  18. Statistical symmetries in physics

    International Nuclear Information System (INIS)

    Green, H.S.; Adelaide Univ., SA

    1994-01-01

    Every law of physics is invariant under some group of transformations and is therefore the expression of some type of symmetry. Symmetries are classified as geometrical, dynamical or statistical. At the most fundamental level, statistical symmetries are expressed in the field theories of the elementary particles. This paper traces some of the developments from the discovery of Bose statistics, one of the two fundamental symmetries of physics. A series of generalizations of Bose statistics is described. A supersymmetric generalization accommodates fermions as well as bosons, and further generalizations, including parastatistics, modular statistics and graded statistics, accommodate particles with properties such as 'colour'. A factorization of elements of ggl(n_b, n_f) can be used to define truncated boson operators. A general construction is given for q-deformed boson operators, and explicit constructions of the same type are given for various 'deformed' algebras. A summary is given of some of the applications and potential applications. 39 refs., 2 figs

  19. The statistical stability phenomenon

    CERN Document Server

    Gorban, Igor I

    2017-01-01

    This monograph investigates violations of statistical stability of physical events, variables, and processes and develops a new physical-mathematical theory taking into consideration such violations – the theory of hyper-random phenomena. There are five parts. The first describes the phenomenon of statistical stability and its features, and develops methods for detecting violations of statistical stability, in particular when data is limited. The second part presents several examples of real processes of different physical nature and demonstrates the violation of statistical stability over broad observation intervals. The third part outlines the mathematical foundations of the theory of hyper-random phenomena, while the fourth develops the foundations of the mathematical analysis of divergent and many-valued functions. The fifth part contains theoretical and experimental studies of statistical laws where there is violation of statistical stability. The monograph should be of particular interest to engineers...

  20. Equilibrium statistical mechanics

    CERN Document Server

    Jackson, E Atlee

    2000-01-01

    Ideal as an elementary introduction to equilibrium statistical mechanics, this volume covers both classical and quantum methodology for open and closed systems. Introductory chapters familiarize readers with probability and microscopic models of systems, while additional chapters describe the general derivation of the fundamental statistical mechanics relationships. The final chapter contains 16 sections, each dealing with a different application, ordered according to complexity, from classical through degenerate quantum statistical mechanics. Key features include an elementary introduction t

  1. Applied statistics for economists

    CERN Document Server

    Lewis, Margaret

    2012-01-01

    This book is an undergraduate text that introduces students to commonly-used statistical methods in economics. Using examples based on contemporary economic issues and readily-available data, it not only explains the mechanics of the various methods, it also guides students to connect statistical results to detailed economic interpretations. Because the goal is for students to be able to apply the statistical methods presented, online sources for economic data and directions for performing each task in Excel are also included.

  2. Mineral industry statistics 1975

    Energy Technology Data Exchange (ETDEWEB)

    1978-01-01

    Production, consumption and marketing statistics are given for solid fuels (coal, peat), liquid fuels and gases (oil, natural gas), iron ore, bauxite and other minerals quarried in France, in 1975. Also accident statistics are included. Production statistics are presented of the Overseas Departments and territories (French Guiana, New Caledonia, New Hebrides). An account of modifications in the mining field in 1975 is given. Concessions, exploitation permits, and permits solely for prospecting for mineral products are discussed. (In French)

  3. Lectures on statistical mechanics

    CERN Document Server

    Bowler, M G

    1982-01-01

    Anyone dissatisfied with the almost ritual dullness of many 'standard' texts in statistical mechanics will be grateful for the lucid explanation and generally reassuring tone. Aimed at securing firm foundations for equilibrium statistical mechanics, topics of great subtlety are presented transparently and enthusiastically. Very little mathematical preparation is required beyond elementary calculus and prerequisites in physics are limited to some elementary classical thermodynamics. Suitable as a basis for a first course in statistical mechanics, the book is an ideal supplement to more convent

  4. Introduction to Statistics

    Directory of Open Access Journals (Sweden)

    Mirjam Nielen

    2017-01-01

    Always wondered why research papers often present rather complicated statistical analyses? Or wondered how to properly analyse the results of a pragmatic trial from your own practice? This talk will give an overview of basic statistical principles and focus on the why of statistics, rather than on the how. This is a podcast of Mirjam's talk at the Veterinary Evidence Today conference, Edinburgh November 2, 2016. 

  5. Equilibrium statistical mechanics

    CERN Document Server

    Mayer, J E

    1968-01-01

    The International Encyclopedia of Physical Chemistry and Chemical Physics, Volume 1: Equilibrium Statistical Mechanics covers the fundamental principles and the development of theoretical aspects of equilibrium statistical mechanics. Statistical mechanical is the study of the connection between the macroscopic behavior of bulk matter and the microscopic properties of its constituent atoms and molecules. This book contains eight chapters, and begins with a presentation of the master equation used for the calculation of the fundamental thermodynamic functions. The succeeding chapters highlight t

  6. Contributions to statistics

    CERN Document Server

    Mahalanobis, P C

    1965-01-01

    Contributions to Statistics focuses on the processes, methodologies, and approaches involved in statistics. The book is presented to Professor P. C. Mahalanobis on the occasion of his 70th birthday. The selection first offers information on the recovery of ancillary information and combinatorial properties of partially balanced designs and association schemes. Discussions focus on combinatorial applications of the algebra of association matrices, sample size analogy, association matrices and the algebra of association schemes, and conceptual statistical experiments. The book then examines latt

  7. Statistics in a Nutshell

    CERN Document Server

    Boslaugh, Sarah

    2008-01-01

    Need to learn statistics as part of your job, or want some help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference that's perfect for anyone with no previous background in the subject. This book gives you a solid understanding of statistics without being too simple, yet without the numbing complexity of most college texts. You get a firm grasp of the fundamentals and a hands-on understanding of how to apply them before moving on to the more advanced material that follows. Each chapter presents you with easy-to-follow descriptions illustrat

  8. Understanding Computational Bayesian Statistics

    CERN Document Server

    Bolstad, William M

    2011-01-01

    A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic

  9. Annual Statistical Supplement, 2002

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2002 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  10. Annual Statistical Supplement, 2010

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2010 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  11. Annual Statistical Supplement, 2007

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2007 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  12. Annual Statistical Supplement, 2001

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2001 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  13. Annual Statistical Supplement, 2016

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2016 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  14. Annual Statistical Supplement, 2011

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2011 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  15. Annual Statistical Supplement, 2005

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2005 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  16. Annual Statistical Supplement, 2015

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2015 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  17. Annual Statistical Supplement, 2003

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2003 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  18. Annual Statistical Supplement, 2017

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2017 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  19. Annual Statistical Supplement, 2008

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2008 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  20. Annual Statistical Supplement, 2014

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2014 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  1. Annual Statistical Supplement, 2004

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2004 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  2. Annual Statistical Supplement, 2000

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2000 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  3. Annual Statistical Supplement, 2009

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2009 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  4. Annual Statistical Supplement, 2006

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2006 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  5. Principles of statistics

    CERN Document Server

    Bulmer, M G

    1979-01-01

    There are many textbooks which describe current methods of statistical analysis, while neglecting related theory. There are equally many advanced textbooks which delve into the far reaches of statistical theory, while bypassing practical applications. But between these two approaches is an unfilled gap, in which theory and practice merge at an intermediate level. Professor M. G. Bulmer's Principles of Statistics, originally published in 1965, was created to fill that need. The new, corrected Dover edition of Principles of Statistics makes this invaluable mid-level text available once again fo

  6. 100 statistical tests

    CERN Document Server

    Kanji, Gopal K

    2006-01-01

    This expanded and updated Third Edition of Gopal K. Kanji's best-selling resource on statistical tests covers all the most commonly used tests with information on how to calculate and interpret results with simple datasets. Each entry begins with a short summary statement about the test's purpose, and contains details of the test objective, the limitations (or assumptions) involved, a brief outline of the method, a worked example, and the numerical calculation. 100 Statistical Tests, Third Edition is the one indispensable guide for users of statistical materials and consumers of statistical information at all levels and across all disciplines.
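
    As a flavor of the worked examples such a handbook contains, here is a sketch of one of the most common entries, the chi-square goodness-of-fit test, using SciPy; the counts are invented for illustration.

```python
from scipy import stats

# Invented worked example: are 60 rolls of a die consistent with fairness?
observed = [5, 8, 9, 8, 10, 20]     # counts for faces 1..6
expected = [10] * 6                 # fair die: equal expected counts

# Pearson chi-square statistic and its p-value.
chi2, p = stats.chisquare(observed, f_exp=expected)
print(f"chi-square = {chi2:.2f}, p = {p:.4f}")
# Reject fairness at the 5% level when p < 0.05.
```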

  7. Statistical distribution sampling

    Science.gov (United States)

    Johnson, E. S.

    1975-01-01

    Determining the distribution of statistics by sampling was investigated. Characteristic functions, the quadratic regression problem, and the differential equations for the characteristic functions are analyzed.
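
    The sampling approach mentioned above can be sketched in a few lines: simulate many samples, compute the statistic on each, and study the resulting empirical distribution. The exponential population and sample size below are arbitrary illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Determine the distribution of a statistic by sampling. The statistic
# here is the mean of n = 30 draws from an Exponential(1) population.
n, reps = 30, 10_000
means = rng.exponential(scale=1.0, size=(reps, n)).mean(axis=1)

# By the central limit theorem, the sampling distribution is roughly
# normal with mean 1 and standard deviation 1/sqrt(n).
print(means.mean(), means.std())
```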

  8. The number of degrees of freedom for statistical distribution of s wave reduced neutron width for several nuclei

    International Nuclear Information System (INIS)

    Zhixiang, Z.

    1983-01-01

    The least squares fit has been performed using the chi-squared distribution function for all available evaluated data on the s-wave reduced neutron widths of several nuclei, and the number of degrees of freedom and the average value have been obtained. Missing weak s-wave resonances and extra p-wave levels, where present, have been taken into account. For 75 As and 103 Rh, the s-wave population has been separated by Bayes' theorem before making the fit. The results thus obtained are consistent with the Porter-Thomas distribution, i.e., a chi-squared distribution with ν=1, as one would expect. It has not been found in this work that the number of degrees of freedom for the distribution of s-wave reduced neutron widths might be greater than one, as reported by H.C. Sharma et al. (1976) at the international conference on interactions of neutrons with nuclei. (Auth.)
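
    This is not the paper's actual fit, but the underlying idea can be sketched: Porter-Thomas corresponds to a chi-squared distribution with one degree of freedom, and the effective number of degrees of freedom can be estimated from a sample of reduced widths. A simulated, maximum-likelihood version follows (the paper itself performs a least-squares fit to evaluated data).

```python
import numpy as np
from scipy import stats
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(2)

# Simulated reduced widths (not the paper's evaluated data): Porter-Thomas
# says they follow a chi-squared distribution with nu = 1 degree of freedom.
widths = rng.chisquare(df=1, size=5000)

def neg_log_lik(nu):
    # Negative log-likelihood of a chi-squared(nu) model for the widths.
    return -np.sum(stats.chi2.logpdf(widths, df=nu))

# Recover nu by a one-dimensional maximum-likelihood search.
res = minimize_scalar(neg_log_lik, bounds=(0.2, 5.0), method="bounded")
print(res.x)  # estimated nu, expected to be close to 1
```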

  9. Fermi–Dirac Statistics

    Indian Academy of Sciences (India)

    IAS Admin

    Fermi–Dirac Statistics: Derivation and Consequences. S Chaturvedi and Shyamal Biswas. Keywords: Pauli exclusion principle, Fermi–Dirac statistics, identical and indistinguishable particles, Fermi gas. Subhash Chaturvedi is at the University of Hyderabad; his current research interests include phase space descriptions.

  10. Generalized interpolative quantum statistics

    International Nuclear Information System (INIS)

    Ramanathan, R.

    1992-01-01

    A generalized interpolative quantum statistics is presented by conjecturing a certain reordering of phase space due to the presence of possible exotic objects other than bosons and fermions. Such an interpolation achieved through a Bose-counting strategy predicts the existence of an infinite quantum Boltzmann-Gibbs statistics akin to the one discovered by Greenberg recently

  11. Handbook of Spatial Statistics

    CERN Document Server

    Gelfand, Alan E

    2010-01-01

    Offers an introduction detailing the evolution of the field of spatial statistics. This title focuses on the three main branches of spatial statistics: continuous spatial variation (point referenced data); discrete spatial variation, including lattice and areal unit data; and, spatial point patterns.

  12. Statistics 101 for Radiologists.

    Science.gov (United States)

    Anvari, Arash; Halpern, Elkan F; Samir, Anthony E

    2015-10-01

    Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
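
    A minimal sketch of the diagnostic-test measures the review describes, computed from a hypothetical 2x2 table; all counts are invented for illustration.

```python
# Hypothetical 2x2 table of test result vs. disease status
# (all counts invented for illustration).
tp, fp, fn, tn = 90, 30, 10, 170

sensitivity = tp / (tp + fn)              # P(test+ | disease)
specificity = tn / (tn + fp)              # P(test- | no disease)
accuracy = (tp + tn) / (tp + fp + fn + tn)
lr_pos = sensitivity / (1 - specificity)  # positive likelihood ratio
lr_neg = (1 - sensitivity) / specificity  # negative likelihood ratio

print(sensitivity, specificity, accuracy, lr_pos, lr_neg)
```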

  13. Statistical tables 2003

    International Nuclear Information System (INIS)

    2003-01-01

    The energy statistical table is a selection of statistical data for energies and countries from 1997 to 2002. It covers petroleum, natural gas, coal and electric power: production, external trade, consumption per sector, the 2002 energy accounting, and graphs of long-term forecasts. (A.L.B.)

  14. Bayesian statistical inference

    Directory of Open Access Journals (Sweden)

    Bruno De Finetti

    2017-04-01

    Full Text Available This work was translated into English and published in the volume: Bruno De Finetti, Induction and Probability, Biblioteca di Statistica, eds. P. Monari, D. Cocchi, Clueb, Bologna, 1993.Bayesian statistical Inference is one of the last fundamental philosophical papers in which we can find the essential De Finetti's approach to the statistical inference.

  15. Practical statistics for educators

    CERN Document Server

    Ravid, Ruth

    2014-01-01

    Practical Statistics for Educators, Fifth Edition, is a clear and easy-to-follow text written specifically for education students in introductory statistics courses and in action research courses. It is also a valuable resource and guidebook for educational practitioners who wish to study their own settings.

  16. Thiele. Pioneer in statistics

    DEFF Research Database (Denmark)

    Lauritzen, Steffen Lilholt

    This book studies the brilliant Danish 19th Century astronomer, T.N. Thiele who made important contributions to statistics, actuarial science, astronomy and mathematics. The most important of these contributions in statistics are translated into English for the first time, and the text includes...

  17. Applied Statistics with SPSS

    Science.gov (United States)

    Huizingh, Eelko K. R. E.

    2007-01-01

    Accessibly written and easy to use, "Applied Statistics Using SPSS" is an all-in-one self-study guide to SPSS and do-it-yourself guide to statistics. What is unique about Eelko Huizingh's approach is that this book is based around the needs of undergraduate students embarking on their own research project, and its self-help style is designed to…

  18. Cancer Statistics Animator

    Science.gov (United States)

    This tool allows users to animate cancer trends over time by cancer site and cause of death, race, and sex. Provides access to incidence, mortality, and survival. Select the type of statistic, variables, format, and then extract the statistics in a delimited format for further analyses.

  19. Energy statistics yearbook 2002

    International Nuclear Information System (INIS)

    2005-01-01

    The Energy Statistics Yearbook 2002 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-sixth in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities

  20. Advances in statistics

    Science.gov (United States)

    Howard Stauffer; Nadav Nur

    2005-01-01

    The papers included in the Advances in Statistics section of the Partners in Flight (PIF) 2002 Proceedings represent a small sample of statistical topics of current importance to Partners In Flight research scientists: hierarchical modeling, estimation of detection probabilities, and Bayesian applications. Sauer et al. (this volume) examines a hierarchical model...

  1. Energy statistics yearbook 2001

    International Nuclear Information System (INIS)

    2004-01-01

    The Energy Statistics Yearbook 2001 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-fifth in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities

  2. Energy statistics yearbook 2000

    International Nuclear Information System (INIS)

    2002-01-01

    The Energy Statistics Yearbook 2000 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-third in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities

  3. Temperature dependent anomalous statistics

    International Nuclear Information System (INIS)

    Das, A.; Panda, S.

    1991-07-01

    We show that the anomalous statistics which arises in 2 + 1 dimensional Chern-Simons gauge theories can become temperature dependent in the most natural way. We analyze and show that a statistics-changing phase transition can happen in these theories only as T → ∞. (author). 14 refs

  4. Introduction to Bayesian statistics

    CERN Document Server

    Bolstad, William M

    2017-01-01

    There is a strong upsurge in the use of Bayesian methods in applied statistical analysis, yet most introductory statistics texts only present frequentist methods. Bayesian statistics has many important advantages that students should learn about if they are going into fields where statistics will be used. In this Third Edition, four newly-added chapters address topics that reflect the rapid advances in the field of Bayesian statistics. The author continues to provide a Bayesian treatment of introductory statistical topics, such as scientific data gathering, discrete random variables, robust Bayesian methods, and Bayesian approaches to inference for discrete random variables, binomial proportion, Poisson, normal mean, and simple linear regression. In addition, newly-developing topics in the field are presented in four new chapters: Bayesian inference with unknown mean and variance; Bayesian inference for Multivariate Normal mean vector; Bayesian inference for Multiple Linear Regression Model; and Computati...

  5. Understanding advanced statistical methods

    CERN Document Server

    Westfall, Peter

    2013-01-01

    Contents: Introduction: Probability, Statistics, and Science; Reality, Nature, Science, and Models; Statistical Processes: Nature, Design and Measurement, and Data; Models; Deterministic Models; Variability; Parameters; Purely Probabilistic Statistical Models; Statistical Models with Both Deterministic and Probabilistic Components; Statistical Inference; Good and Bad Models; Uses of Probability Models; Random Variables and Their Probability Distributions; Introduction; Types of Random Variables: Nominal, Ordinal, and Continuous; Discrete Probability Distribution Functions; Continuous Probability Distribution Functions; Some Calculus: Derivatives and Least Squares; More Calculus: Integrals and Cumulative Distribution Functions; Probability Calculation and Simulation; Introduction; Analytic Calculations, Discrete and Continuous Cases; Simulation-Based Approximation; Generating Random Numbers; Identifying Distributions; Introduction; Identifying Distributions from Theory Alone; Using Data: Estimating Distributions via the Histogram; Quantiles: Theoretical and Data-Based Estimate...

  6. Statistics at a glance.

    Science.gov (United States)

    Ector, Hugo

    2010-12-01

    I still remember my first book on statistics: "Elementary statistics with applications in medicine and the biological sciences" by Frederick E. Croxton. For me, it was the start of a pursuit of understanding statistics in daily life and in medical practice. It was the first volume in a long row of books. In his introduction, Croxton claims that "nearly everyone involved in any aspect of medicine needs to have some knowledge of statistics". The reality is that for many clinicians, statistics are limited to a "P value". They have never had the opportunity to learn concise and clear descriptions of the key statistical methods. I have experienced how some authors can describe difficult methods in clearly understandable language. Others fail completely. As a teacher, I tell my students that life is impossible without a basic knowledge of statistics. This feeling has resulted in an annual seminar of 90 minutes. This tutorial is the summary of that seminar. It is a summary and a transcription of the best pages I have detected.

  7. Statistical mechanics in JINR

    International Nuclear Information System (INIS)

    Tonchev, N.; Shumovskij, A.S.

    1986-01-01

    The history of investigations, conducted at the JINR in the field of statistical mechanics, beginning with the fundamental works by Bogolyubov N.N. on superconductivity microscopic theory is presented. Ideas, introduced in these works and methods developed in them, have largely determined the ways for developing statistical mechanics in the JINR and Hartree-Fock-Bogolyubov variational principle has become an important method of the modern nucleus theory. A brief review of the main achievements, connected with the development of statistical mechanics methods and their application in different fields of physical science is given

  8. The nature of statistics

    CERN Document Server

    Wallis, W Allen

    2014-01-01

    Focusing on everyday applications as well as those of scientific research, this classic of modern statistical methods requires little to no mathematical background. Readers develop basic skills for evaluating and using statistical data. Lively, relevant examples include applications to business, government, social and physical sciences, genetics, medicine, and public health. "W. Allen Wallis and Harry V. Roberts have made statistics fascinating." - The New York Times. "The authors have set out, with considerable success, to write a text which would be of interest and value to the student who,

  9. AP statistics crash course

    CERN Document Server

    D'Alessio, Michael

    2012-01-01

    AP Statistics Crash Course - Gets You a Higher Advanced Placement Score in Less Time Crash Course is perfect for the time-crunched student, the last-minute studier, or anyone who wants a refresher on the subject. AP Statistics Crash Course gives you: Targeted, Focused Review - Study Only What You Need to Know Crash Course is based on an in-depth analysis of the AP Statistics course description outline and actual Advanced Placement test questions. It covers only the information tested on the exam, so you can make the most of your valuable study time. Our easy-to-read format covers: exploring da

  10. Statistical deception at work

    CERN Document Server

    Mauro, John

    2013-01-01

    Written to reveal statistical deceptions often thrust upon unsuspecting journalists, this book views the use of numbers from a public perspective. Illustrating how the statistical naivete of journalists often nourishes quantitative misinformation, the author's intent is to make journalists more critical appraisers of numerical data so that in reporting them they do not deceive the public. The book frequently uses actual reported examples of misused statistical data reported by mass media and describes how journalists can avoid being taken in by them. Because reports of survey findings seldom g

  11. Statistical Group Comparison

    CERN Document Server

    Liao, Tim Futing

    2011-01-01

    An incomparably useful examination of statistical methods for comparisonThe nature of doing science, be it natural or social, inevitably calls for comparison. Statistical methods are at the heart of such comparison, for they not only help us gain understanding of the world around us but often define how our research is to be carried out. The need to compare between groups is best exemplified by experiments, which have clearly defined statistical methods. However, true experiments are not always possible. What complicates the matter more is a great deal of diversity in factors that are not inde

  12. Statistical Pattern Recognition

    CERN Document Server

    Webb, Andrew R

    2011-01-01

    Statistical pattern recognition relates to the use of statistical techniques for analysing data measurements in order to extract information and make justified decisions.  It is a very active area of study and research, which has seen many advances in recent years. Applications such as data mining, web searching, multimedia data retrieval, face recognition, and cursive handwriting recognition, all require robust and efficient pattern recognition techniques. This third edition provides an introduction to statistical pattern theory and techniques, with material drawn from a wide range of fields,

  13. Mineral statistics yearbook 1994

    International Nuclear Information System (INIS)

    1994-01-01

    A summary of mineral production in Saskatchewan was compiled and presented as a reference manual. Statistical information on fuel minerals such as crude oil, natural gas, liquefied petroleum gas and coal, and on industrial and metallic minerals such as potash, sodium sulphate, salt and uranium, was provided in a wide variety of tables. Production statistics and the disposition and value of sales of industrial and metallic minerals were also made available, along with statistical data on the drilling of oil and gas reservoirs and Crown land disposition. figs., tabs

  14. Evolutionary Statistical Procedures

    CERN Document Server

    Baragona, Roberto; Poli, Irene

    2011-01-01

    This proposed text appears to be a good introduction to evolutionary computation for use in applied statistics research. The authors draw from a vast base of knowledge about the current literature in both the design of evolutionary algorithms and statistical techniques. Modern statistical research is on the threshold of solving increasingly complex problems in high dimensions, and the generalization of its methodology to parameters whose estimators do not follow mathematically simple distributions is underway. Many of these challenges involve optimizing functions for which analytic solutions a

  15. Methods of statistical physics

    CERN Document Server

    Akhiezer, Aleksandr I

    1981-01-01

    Methods of Statistical Physics is an exposition of the tools of statistical mechanics, which evaluates the kinetic equations of classical and quantized systems. The book also analyzes the equations of macroscopic physics, such as the equations of hydrodynamics for normal and superfluid liquids and macroscopic electrodynamics. The text gives particular attention to the study of quantum systems. This study begins with a discussion of problems of quantum statistics with a detailed description of the basics of quantum mechanics along with the theory of measurement. An analysis of the asymptotic be

  16. Cancer Data and Statistics Tools

    Science.gov (United States)

    Cancer Statistics Tools. United States Cancer Statistics: Data Visualizations. The ...

  17. Elements of statistical thermodynamics

    CERN Document Server

    Nash, Leonard K

    2006-01-01

    Encompassing essentially all aspects of statistical mechanics that appear in undergraduate texts, this concise, elementary treatment shows how an atomic-molecular perspective yields new insights into macroscopic thermodynamics. 1974 edition.

  18. VA PTSD Statistics

    Data.gov (United States)

    Department of Veterans Affairs — National-level, VISN-level, and/or VAMC-level statistics on the numbers and percentages of users of VHA care form the Northeast Program Evaluation Center (NEPEC)....

  19. Statistical nuclear reactions

    International Nuclear Information System (INIS)

    Hilaire, S.

    2001-01-01

    A review of the statistical model of nuclear reactions is presented. The main relations are described, together with the ingredients necessary to perform practical calculations. In addition, a substantial overview of the width fluctuation correction factor is given. (author)

  20. Plague Maps and Statistics

    Science.gov (United States)

    Plague in the United States. Plague was first introduced ... them at higher risk. Reported Cases of Human Plague - United States, 1970-2016. Since the mid–20th ...

  1. Statistical Measures of Marksmanship

    National Research Council Canada - National Science Library

    Johnson, Richard

    2001-01-01

    .... This report describes objective statistical procedures to measure both rifle marksmanship accuracy, the proximity of an array of shots to the center of mass of a target, and marksmanship precision...

  2. Data and Statistics

    Science.gov (United States)

    Sickle cell ... 1999 through 2002. This drop coincided with the introduction in 2000 of a vaccine that protects against ...

  3. CDC WONDER: Cancer Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — The United States Cancer Statistics (USCS) online databases in WONDER provide cancer incidence and mortality data for the United States for the years since 1999, by...

  4. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  5. On quantum statistical inference

    NARCIS (Netherlands)

    Barndorff-Nielsen, O.E.; Gill, R.D.; Jupp, P.E.

    2003-01-01

    Interest in problems of statistical inference connected to measurements of quantum systems has recently increased substantially, in step with dramatic new developments in experimental techniques for studying small quantum systems. Furthermore, developments in the theory of quantum measurements have

  6. CMS Statistics Reference Booklet

    Data.gov (United States)

    U.S. Department of Health & Human Services — The annual CMS Statistics reference booklet provides a quick reference for summary information about health expenditures and the Medicare and Medicaid health...

  7. Statistical mechanics of superconductivity

    CERN Document Server

    Kita, Takafumi

    2015-01-01

    This book provides a theoretical, step-by-step comprehensive explanation of superconductivity for undergraduate and graduate students who have completed elementary courses on thermodynamics and quantum mechanics. To this end, it adopts the unique approach of starting with the statistical mechanics of quantum ideal gases and successively adding and clarifying elements and techniques indispensable for understanding it. They include the spin-statistics theorem, second quantization, density matrices, the Bloch–De Dominicis theorem, the variational principle in statistical mechanics, attractive interaction, and bound states. Ample examples of their usage are also provided in terms of topics from advanced statistical mechanics such as two-particle correlations of quantum ideal gases, derivation of the Hartree–Fock equations, and Landau’s Fermi-liquid theory, among others. With these preliminaries, the fundamental mean-field equations of superconductivity are derived with maximum mathematical clarity based on ...

  8. Statistical electromagnetics: Complex cavities

    NARCIS (Netherlands)

    Naus, H.W.L.

    2008-01-01

    A selection of the literature on the statistical description of electromagnetic fields and complex cavities is concisely reviewed. Some essential concepts, for example, the application of the central limit theorem and the maximum entropy principle, are scrutinized. Implicit assumptions, biased

  9. Statistics of Extremes

    KAUST Repository

    Davison, Anthony C.; Huser, Raphaël

    2015-01-01

    Statistics of extremes concerns inference for rare events. Often the events have never yet been observed, and their probabilities must therefore be estimated by extrapolation of tail models fitted to available data. Because data concerning the event

  10. Statistics: a Bayesian perspective

    National Research Council Canada - National Science Library

    Berry, Donald A

    1996-01-01

    ...: it is the only introductory textbook based on Bayesian ideas, it combines concepts and methods, it presents statistics as a means of integrating data into the scientific process, it develops ideas...

  11. Infant Statistical Learning

    Science.gov (United States)

    Saffran, Jenny R.; Kirkham, Natasha Z.

    2017-01-01

    Perception involves making sense of a dynamic, multimodal environment. In the absence of mechanisms capable of exploiting the statistical patterns in the natural world, infants would face an insurmountable computational problem. Infant statistical learning mechanisms facilitate the detection of structure. These abilities allow the infant to compute across elements in their environmental input, extracting patterns for further processing and subsequent learning. In this selective review, we summarize findings that show that statistical learning is both a broad and flexible mechanism (supporting learning from different modalities across many different content areas) and input specific (shifting computations depending on the type of input and goal of learning). We suggest that statistical learning not only provides a framework for studying language development and object knowledge in constrained laboratory settings, but also allows researchers to tackle real-world problems, such as multilingualism, the role of ever-changing learning environments, and differential developmental trajectories. PMID:28793812

  12. Transport statistics 1996

    CSIR Research Space (South Africa)

    Shepperson, L

    1997-12-01

    Full Text Available This publication contains transport and related statistics on roads, vehicles, infrastructure, passengers, freight, rail, air, maritime and road traffic, and international comparisons. The information compiled in this publication has been gathered...

  13. Basics of statistical physics

    CERN Document Server

    Müller-Kirsten, Harald J W

    2013-01-01

    Statistics links microscopic and macroscopic phenomena, and requires for this reason a large number of microscopic elements like atoms. The results are values of maximum probability or of averaging. This introduction to statistical physics concentrates on the basic principles, and attempts to explain these in simple terms supplemented by numerous examples. These basic principles include the difference between classical and quantum statistics, a priori probabilities as related to degeneracies, the vital aspect of indistinguishability as compared with distinguishability in classical physics, the differences between conserved and non-conserved elements, the different ways of counting arrangements in the three statistics (Maxwell-Boltzmann, Fermi-Dirac, Bose-Einstein), the difference between maximization of the number of arrangements of elements, and averaging in the Darwin-Fowler method. Significant applications to solids, radiation and electrons in metals are treated in separate chapters, as well as Bose-Eins...
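
    The three ways of counting arrangements contrasted in this book can be made concrete. The sketch below is my own illustration, not taken from the text: it counts the microstates for n indistinguishable bosons (unlimited occupancy), n indistinguishable fermions (at most one per state), and n distinguishable classical particles placed into g single-particle states.

```python
from math import comb

def bose_einstein(n, g):
    # Indistinguishable particles, unlimited occupancy: "stars and bars".
    return comb(g + n - 1, n)

def fermi_dirac(n, g):
    # Indistinguishable particles, at most one per state (Pauli exclusion).
    return comb(g, n)

def maxwell_boltzmann(n, g):
    # Distinguishable classical particles: each independently picks a state.
    return g ** n

# Two particles in three states: the three statistics count differently.
print(bose_einstein(2, 3))      # 6
print(fermi_dirac(2, 3))        # 3
print(maxwell_boltzmann(2, 3))  # 9
```

    Maximizing such arrangement counts subject to fixed particle number and energy is what yields the familiar occupation-number distributions.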

  14. Visuanimation in statistics

    KAUST Repository

    Genton, Marc G.; Castruccio, Stefano; Crippa, Paola; Dutta, Subhajit; Huser, Raphaël; Sun, Ying; Vettori, Sabrina

    2015-01-01

    This paper explores the use of visualization through animations, coined visuanimation, in the field of statistics. In particular, it illustrates the embedding of animations in the paper itself and the storage of larger movies in the online

  15. Playing at Statistical Mechanics

    Science.gov (United States)

    Clark, Paul M.; And Others

    1974-01-01

    Discussed are the applications of counting techniques of a sorting game to distributions and concepts in statistical mechanics. Included are the following distributions: Fermi-Dirac, Bose-Einstein, and most probable. (RH)

  16. Illinois travel statistics, 2008

    Science.gov (United States)

    2009-01-01

    The 2008 Illinois Travel Statistics publication is assembled to provide detailed traffic : information to the different users of traffic data. While most users of traffic data at this level : of detail are within the Illinois Department of Transporta...

  17. Illinois travel statistics, 2009

    Science.gov (United States)

    2010-01-01

    The 2009 Illinois Travel Statistics publication is assembled to provide detailed traffic : information to the different users of traffic data. While most users of traffic data at this level : of detail are within the Illinois Department of Transporta...

  18. Cholesterol Facts and Statistics

    Science.gov (United States)

    ... Deo R, et al. Heart disease and stroke statistics—2017 update: a report from the American Heart ...

  19. Illinois travel statistics, 2010

    Science.gov (United States)

    2011-01-01

    The 2010 Illinois Travel Statistics publication is assembled to provide detailed traffic : information to the different users of traffic data. While most users of traffic data at this level : of detail are within the Illinois Department of Transporta...

  20. EDI Performance Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — This section contains statistical information and reports related to the percentage of electronic transactions being sent to Medicare contractors in the formats...

  1. Information theory and statistics

    CERN Document Server

    Kullback, Solomon

    1968-01-01

    Highly useful text studies logarithmic measures of information and their application to testing statistical hypotheses. Includes numerous worked examples and problems. References. Glossary. Appendix. 1968 2nd, revised edition.
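
    Kullback's logarithmic measure of information survives today as the Kullback-Leibler divergence. As a hedged illustration of the idea (not an excerpt from the book), a minimal discrete implementation:

```python
from math import log

def kl_divergence(p, q):
    """Discrete Kullback-Leibler divergence D(p || q), in nats.
    Measures the information lost when q is used to approximate p."""
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# The divergence is zero iff the distributions coincide, and grows
# as q moves away from p -- the basis of likelihood-ratio testing.
p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, p))  # 0.0
print(kl_divergence(p, q))  # positive
```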

  2. Boating Accident Statistics

    Data.gov (United States)

    Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...

  3. Medicaid Drug Claims Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicaid Drug Claims Statistics CD is a useful tool that conveniently breaks up Medicaid claim counts and separates them by quarter and includes an annual count.

  4. Statistical theory of heat

    CERN Document Server

    Scheck, Florian

    2016-01-01

    Scheck’s textbook starts with a concise introduction to classical thermodynamics, including geometrical aspects. Then a short introduction to probabilities and statistics lays the basis for the statistical interpretation of thermodynamics. Phase transitions, discrete models and the stability of matter are explained in great detail. Thermodynamics has a special role in theoretical physics. Due to the general approach of thermodynamics the field has a bridging function between several areas like the theory of condensed matter, elementary particle physics, astrophysics and cosmology. The classical thermodynamics describes predominantly averaged properties of matter, reaching from few particle systems and state of matter to stellar objects. Statistical Thermodynamics covers the same fields, but explores them in greater depth and unifies classical statistical mechanics with quantum theory of multiple particle systems. The content is presented as two tracks: the fast track for master students, providing the essen...

  5. Record Statistics and Dynamics

    DEFF Research Database (Denmark)

    Sibani, Paolo; Jensen, Henrik J.

    2009-01-01

    with independent random increments. The term record dynamics covers the rather new idea that records may, in special situations, have measurable dynamical consequences. The approach applies to the aging dynamics of glasses and other systems with multiple metastable states. The basic idea is that record sizes...... fluctuations of e.g. the energy are able to push the system past some sort of ‘edge of stability’, inducing irreversible configurational changes, whose statistics then closely follows the statistics of record fluctuations....

  6. Introductory statistical inference

    CERN Document Server

    Mukhopadhyay, Nitis

    2014-01-01

    This gracefully organized text reveals the rigorous theory of probability and statistical inference in the style of a tutorial, using worked examples, exercises, figures, tables, and computer simulations to develop and illustrate concepts. Drills and boxed summaries emphasize and reinforce important ideas and special techniques.Beginning with a review of the basic concepts and methods in probability theory, moments, and moment generating functions, the author moves to more intricate topics. Introductory Statistical Inference studies multivariate random variables, exponential families of dist

  7. Statistical mechanics rigorous results

    CERN Document Server

    Ruelle, David

    1999-01-01

    This classic book marks the beginning of an era of vigorous mathematical progress in equilibrium statistical mechanics. Its treatment of the infinite system limit has not been superseded, and the discussion of thermodynamic functions and states remains basic for more recent work. The conceptual foundation provided by the Rigorous Results remains invaluable for the study of the spectacular developments of statistical mechanics in the second half of the 20th century.

  8. Statistical mechanics of anyons

    International Nuclear Information System (INIS)

    Arovas, D.P.

    1985-01-01

    We study the statistical mechanics of a two-dimensional gas of free anyons - particles which interpolate between Bose-Einstein and Fermi-Dirac character. Thermodynamic quantities are discussed in the low-density regime. In particular, the second virial coefficient is evaluated by two different methods and is found to exhibit a simple, periodic, but nonanalytic behavior as a function of the statistics determining parameter. (orig.)

  9. Fundamentals of statistics

    CERN Document Server

    Mulholland, Henry

    1968-01-01

    Fundamentals of Statistics covers topics on the introduction, fundamentals, and science of statistics. The book discusses the collection, organization and representation of numerical data; elementary probability; the binomial and Poisson distributions; and the measures of central tendency. The text describes measures of dispersion for measuring the spread of a distribution; continuous distributions for measuring on a continuous scale; the properties and use of the normal distribution; and tests involving the normal or Student's t distributions. The use of control charts for sample means; the ranges

  10. Business statistics I essentials

    CERN Document Server

    Clark, Louise

    2014-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Business Statistics I includes descriptive statistics, introduction to probability, probability distributions, sampling and sampling distributions, interval estimation, and hypothesis t

  11. Breakthroughs in statistics

    CERN Document Server

    Johnson, Norman

    This is the third volume of a collection of seminal papers in the statistical sciences written during the past 110 years. These papers have each had an outstanding influence on the development of statistical theory and practice over the last century. Each paper is preceded by an introduction written by an authority in the field providing background information and assessing its influence. Volume III concentrates on articles from the 1980s while including some earlier articles not included in Volumes I and II. Samuel Kotz is Professor of Statistics in the College of Business and Management at the University of Maryland. Norman L. Johnson is Professor Emeritus of Statistics at the University of North Carolina. Also available: Breakthroughs in Statistics Volume I: Foundations and Basic Theory Samuel Kotz and Norman L. Johnson, Editors 1993. 631 pp. Softcover. ISBN 0-387-94037-5 Breakthroughs in Statistics Volume II: Methodology and Distribution Samuel Kotz and Norman L. Johnson, Edi...

  12. Knowledge, Perception and Level of Male Partner Involvement in ...

    African Journals Online (AJOL)

    Knowledge, Perception and Level of Male Partner Involvement in Choice of ... many practice this and the influence of the couple knowledge and perception on male ... Chi-square and binary logistic regression were used for statistical analysis.

  13. Awareness, knowledge and uptake of preconception care among ...

    African Journals Online (AJOL)

    Awareness, knowledge and uptake of preconception care among women in ... care to increase the chances of healthy outcomes of pregnancy for both mother and child. ... Descriptive statistics as well as Chi-square analysis was done to show ...

  14. Utilization of Tortoise (Chelonoides nigra, Quoy and Gaimard, 1824)

    African Journals Online (AJOL)

    Tersor

    were sampled and the data collected were analysed using descriptive statistics and Chi-square. The study ..... A big tortoise costs between 1,500 and ..... Publishing Network Journal of Agriculture ... Malaysia, TRAFFIC Southeast Asia, Petaling.

  15. Knowledge, Attitude and Practice of Health Care Professionals ...

    African Journals Online (AJOL)

    ... of Health Care Professionals towards Voluntary Counseling and Testing for HIV/AIDS ... the chi square test; p value of < 0.05 was considered statistically significant. Multiple logistic regressions were performed to identify predictive variables ...

  16. Maternal Neonatal Outcome in Relation to Placental Location ...

    African Journals Online (AJOL)

    Mubeen

    Journal of Basic and Clinical Reproductive Sciences ・ July - December 2013 ... Background: Placenta, which is the vital link between mother and fetus, ... Georgia, USA) using Chi-square test and Fisher's exact test for determining the statistical.

  17. Samaru Journal-2012- editedx

    African Journals Online (AJOL)


    They use diverse EIS, especially WWW, to perform different tasks of ... Chi-square statistics indicate that there is no significant difference in most of the ... the computers, laptops, mobile phones, Internet .... more important than the work place.

  18. How ophthalmologists and ophthalmologists-in-training in Nigeria ...

    African Journals Online (AJOL)

    Background: Social media has revolutionized the practice of medicine in the areas of communication and information dissemination. Aim: This study ... Simple statistics and comparisons of associated variables were made using Chi-square.

  19. UN Data- Environmental Statistics: Waste

    Data.gov (United States)

    World Wide Human Geography Data Working Group — The Environment Statistics Database contains selected water and waste statistics by country. Statistics on water and waste are based on official statistics supplied...

  20. UN Data: Environment Statistics: Waste

    Data.gov (United States)

    World Wide Human Geography Data Working Group — The Environment Statistics Database contains selected water and waste statistics by country. Statistics on water and waste are based on official statistics supplied...

  1. Conformity and statistical tolerancing

    Science.gov (United States)

    Leblond, Laurent; Pillet, Maurice

    2018-02-01

    Statistical tolerancing was first proposed by Shewhart (Economic Control of Quality of Manufactured Product, 1931; reprinted 1980 by ASQC). In spite of this long history, its use remains moderate. One of the probable reasons for this low utilization is undoubtedly the difficulty for designers to anticipate the risks of this approach. Arithmetic (worst-case) tolerancing allows a simple interpretation: conformity is defined by the presence of the characteristic in an interval. Statistical tolerancing is more complex in its definition: an interval is not sufficient to define conformance. To justify the statistical tolerancing formula used by designers, a tolerance interval should be interpreted as the interval where most of the parts produced should probably be located. This tolerance is justified by considering a conformity criterion for the parts that guarantees low offsets on the latter characteristics. Unlike traditional arithmetic tolerancing, statistical tolerancing requires a sustained exchange of information between design and manufacture to be used safely. This paper proposes a formal definition of conformity, which we apply successively to quadratic and arithmetic tolerancing. We introduce a concept of concavity, which helps us to demonstrate the link between the tolerancing approach and conformity. We use this concept to demonstrate the various acceptable propositions of statistical tolerancing (in the decentring-dispersion space).
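
    The contrast between arithmetic (worst-case) and quadratic (statistical) tolerancing can be illustrated with a standard root-sum-of-squares stack-up. The sketch below is a generic illustration under an independence assumption, not code from the paper:

```python
from math import sqrt

def arithmetic_stack(tolerances):
    # Worst case: individual tolerances simply add.
    return sum(tolerances)

def quadratic_stack(tolerances):
    # Statistical (RSS): independent, centred contributions add in quadrature.
    return sqrt(sum(t * t for t in tolerances))

# Four components at +/-0.1 each: the statistical stack-up is half as wide.
tols = [0.1, 0.1, 0.1, 0.1]
print(arithmetic_stack(tols))  # ~0.4
print(quadratic_stack(tols))   # ~0.2
```

    The RSS result is only valid when the individual characteristics stay centred, which is exactly why statistical tolerancing demands the design-manufacture dialogue the abstract describes.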

  2. Intuitive introductory statistics

    CERN Document Server

    Wolfe, Douglas A

    2017-01-01

    This textbook is designed to give an engaging introduction to statistics and the art of data analysis. The unique scope includes, but also goes beyond, classical methodology associated with the normal distribution. What if the normal model is not valid for a particular data set? This cutting-edge approach provides the alternatives. It is an introduction to the world and possibilities of statistics that uses exercises, computer analyses, and simulations throughout the core lessons. These elementary statistical methods are intuitive. Counting and ranking features prominently in the text. Nonparametric methods, for instance, are often based on counts and ranks and are very easy to integrate into an introductory course. The ease of computation with advanced calculators and statistical software, both of which factor into this text, allows important techniques to be introduced earlier in the study of statistics. This book's novel scope also includes measuring symmetry with Walsh averages, finding a nonp...

  3. Wind energy statistics

    International Nuclear Information System (INIS)

    Holttinen, H.; Tammelin, B.; Hyvoenen, R.

    1997-01-01

    The recording, analyzing and publishing of statistics of wind energy production has been reorganized in cooperation between VTT Energy, the Finnish Meteorological Institute (FMI Energy) and the Finnish Wind Energy Association (STY), supported by the Ministry of Trade and Industry (KTM). VTT Energy has developed a database that contains both monthly data and information on the wind turbines, sites and operators involved. The monthly production figures together with component failure statistics are collected from the operators by VTT Energy, who produces the final wind energy statistics to be published in Tuulensilmae and reported to energy statistics in Finland and abroad (Statistics Finland, Eurostat, IEA). To be able to verify the annual and monthly wind energy potential against the average wind energy climate, a production index is adopted. The index gives the expected wind energy production at various areas in Finland, calculated using real wind speed observations, air density and a power curve for a typical 500 kW wind turbine. FMI Energy has produced the average figures for four weather stations using data from 1985-1996, and produces the monthly figures. (orig.)

  4. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure-theoretical considerations, contributions to theoretical statistics an...

  5. Multivariate Statistical Process Control

    DEFF Research Database (Denmark)

    Kulahci, Murat

    2013-01-01

    As sensor and computer technology continues to improve, it has become a normal occurrence that we are confronted with high-dimensional data sets. As in many areas of industrial statistics, this brings forth various challenges in statistical process control (SPC) and monitoring, for which the aim...... is to identify an “out-of-control” state of a process using control charts in order to reduce the excessive variation caused by so-called assignable causes. In practice, the most common method of monitoring multivariate data is through a statistic akin to Hotelling’s T2. For high-dimensional data with an excessive...... amount of cross correlation, practitioners are often recommended to use latent structure methods such as Principal Component Analysis to summarize the data in only a few linear combinations of the original variables that capture most of the variation in the data. Applications of these control charts...
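
    A Phase-I-style Hotelling T2 computation can be sketched as follows. This is a minimal generic illustration (the function name and data setup are my own, not from the chapter):

```python
import numpy as np

def hotelling_t2(X):
    """Per-observation Hotelling T^2 values for the rows of X (n x p),
    using the sample mean and covariance of X itself (Phase I style)."""
    xbar = X.mean(axis=0)
    S_inv = np.linalg.inv(np.cov(X, rowvar=False))
    d = X - xbar
    # Quadratic form d_i' S^-1 d_i for every row i at once.
    return np.einsum('ij,jk,ik->i', d, S_inv, d)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))   # 200 in-control observations, 3 variables
t2 = hotelling_t2(X)
print(t2.shape)  # (200,)
```

    Observations whose T2 value exceeds a suitable control limit would be flagged as potentially out of control.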

  6. Statistics for Finance

    DEFF Research Database (Denmark)

    Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard

    Statistics for Finance develops students’ professional skills in statistics with applications in finance. Developed from the authors’ courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics...... that rarely connect concepts to data and books on econometrics and time series analysis that do not cover specific problems related to option valuation. The book discusses applications of financial derivatives pertaining to risk assessment and elimination. The authors cover various statistical...... and mathematical techniques, including linear and nonlinear time series analysis, stochastic calculus models, stochastic differential equations, Itō’s formula, the Black–Scholes model, the generalized method-of-moments, and the Kalman filter. They explain how these tools are used to price financial derivatives...

  7. 1992 Energy statistics Yearbook

    International Nuclear Information System (INIS)

    1994-01-01

    The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from annual questionnaires distributed by the United Nations Statistical Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistical Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities

  8. Energy statistics manual

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2010-07-01

    Detailed, complete, timely and reliable statistics are essential to monitor the energy situation at a country level as well as at an international level. Energy statistics on supply, trade, stocks, transformation and demand are indeed the basis for any sound energy policy decision. For instance, the market of oil -- which is the largest traded commodity worldwide -- needs to be closely monitored in order for all market players to know at any time what is produced, traded, stocked and consumed and by whom. In view of the role and importance of energy in world development, one would expect that basic energy information to be readily available and reliable. This is not always the case and one can even observe a decline in the quality, coverage and timeliness of energy statistics over the last few years.

  9. Statistical Engine Knock Control

    DEFF Research Database (Denmark)

    Stotsky, Alexander A.

    2008-01-01

    A new statistical concept of the knock control of a spark ignition automotive engine is proposed. The control aim is associated with the statistical hypothesis test which compares the threshold value to the average value of the maximal amplitude of the knock sensor signal at a given frequency....... Control algorithm which is used for minimization of the regulation error realizes a simple count-up-count-down logic. A new adaptation algorithm for the knock detection threshold is also developed. Confidence interval method is used as the basis for adaptation. A simple statistical model...... which includes generation of the amplitude signals, a threshold value determination and a knock sound model is developed for evaluation of the control concept....
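
    The count-up-count-down logic mentioned in the abstract can be sketched as a toy controller. This is a hedged illustration only; the parameter names and values are hypothetical, not taken from the paper:

```python
def count_up_down(knock_events, step_up=1, step_down=2, start=0, lo=0, hi=10):
    """Count-up-count-down regulation: the counter rises by step_up on each
    knock event and falls by step_down otherwise, clamped to [lo, hi].
    The counter value would drive e.g. the spark-timing retard."""
    level = start
    trace = []
    for knock in knock_events:
        level = level + step_up if knock else level - step_down
        level = max(lo, min(hi, level))
        trace.append(level)
    return trace

events = [True, True, False, True, False, False, False]
print(count_up_down(events))  # [1, 2, 0, 1, 0, 0, 0]
```

    Making the down-step larger than the up-step biases the controller toward quickly releasing retard once knocking stops, one common asymmetric design choice.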

  10. Philosophy of statistics

    CERN Document Server

    Forster, Malcolm R

    2011-01-01

    Statisticians and philosophers of science have many common interests but restricted communication with each other. This volume aims to remedy these shortcomings. It provides state-of-the-art research in the area of philosophy of statistics by encouraging numerous experts to communicate with one another without feeling “restricted” by their disciplines or thinking “piecemeal” in their treatment of issues. A second goal of this book is to present work in the field without bias toward any particular statistical paradigm. Broadly speaking, the essays in this Handbook are concerned with problems of induction, statistics and probability. For centuries, foundational problems like induction have been among philosophers' favorite topics; recently, however, non-philosophers have increasingly taken a keen interest in these issues. This volume accordingly contains papers by both philosophers and non-philosophers, including scholars from nine academic disciplines.

  11. Perception in statistical graphics

    Science.gov (United States)

    VanderPlas, Susan Ruth

    There has been quite a bit of research on statistical graphics and visualization, generally focused on new types of graphics, new software to create graphics, interactivity, and usability studies. Our ability to interpret and use statistical graphics hinges on the interface between the graph itself and the brain that perceives and interprets it, and there is substantially less research on the interplay between graph, eye, brain, and mind than is sufficient to understand the nature of these relationships. The goal of the work presented here is to further explore the interplay between a static graph, the translation of that graph from paper to mental representation (the journey from eye to brain), and the mental processes that operate on that graph once it is transferred into memory (mind). Understanding the perception of statistical graphics should allow researchers to create more effective graphs which produce fewer distortions and viewer errors while reducing the cognitive load necessary to understand the information presented in the graph. Taken together, these experiments should lay a foundation for exploring the perception of statistical graphics. There has been considerable research into the accuracy of numerical judgments viewers make from graphs, and these studies are useful, but it is more effective to understand how errors in these judgments occur so that the root cause of the error can be addressed directly. Understanding how visual reasoning relates to the ability to make judgments from graphs allows us to tailor graphics to particular target audiences. In addition, understanding the hierarchy of salient features in statistical graphics allows us to clearly communicate the important message from data or statistical models by constructing graphics which are designed specifically for the perceptual system.

  12. READING STATISTICS AND RESEARCH

    Directory of Open Access Journals (Sweden)

    Reviewed by Yavuz Akbulut

    2008-10-01

    Full Text Available The book demonstrates the best and most conservative ways to decipher and critique research reports, particularly for social science researchers. In addition, new editions of the book are always better organized, effectively structured and meticulously updated in line with developments in the field of research statistics. Even the most trivial issues are revisited and updated in new editions. For instance, purchasers of the previous editions might check the interpretation of skewness and kurtosis indices in the third edition (p. 34) and in the fifth edition (p. 29) to see how the author revisits every single detail. Theory and practice always go hand in hand in all editions of the book. Re-reading previous editions (e.g. the third edition) before reading the fifth edition gives the impression that the author never stops ameliorating his instructional text writing methods. In brief, “Reading Statistics and Research” is among the best sources showing research consumers how to understand and critically assess the statistical information and research results contained in technical research reports. In this respect, the review written by Mirko Savić in Panoeconomicus (2008, 2, pp. 249-252) will help readers to get a more detailed overview of each chapter. I cordially urge beginning researchers to pick up a highlighter and conduct a detailed reading of the book. A thorough reading of the source will make researchers quite selective in appreciating the harmony between the data analysis, results and discussion sections of typical journal articles. If interested, beginning researchers might begin with this book to grasp the basics of research statistics, and prop up their critical research reading skills with some statistics package applications through the help of Dr. Andy Field’s book, Discovering Statistics using SPSS (second edition, published by Sage in 2005).

  13. Statistics for business

    CERN Document Server

    Waller, Derek L

    2008-01-01

    Statistical analysis is essential to business decision-making and management, but the underlying theory of data collection, organization and analysis is one of the most challenging topics for business students and practitioners. This user-friendly text and CD-ROM package will help you to develop strong skills in presenting and interpreting statistical information in a business or management environment. Based entirely on using Microsoft Excel rather than more complicated applications, it includes a clear guide to using Excel with the key functions employed in the book, a glossary of terms and

  14. Statistics As Principled Argument

    CERN Document Server

    Abelson, Robert P

    2012-01-01

    In this illuminating volume, Robert P. Abelson delves into the too-often dismissed problems of interpreting quantitative data and then presenting them in the context of a coherent story about one's research. Unlike too many books on statistics, this is a remarkably engaging read, filled with fascinating real-life (and real-research) examples rather than with recipes for analysis. It will be of true interest and lasting value to beginning graduate students and seasoned researchers alike. The focus of the book is that the purpose of statistics is to organize a useful argument from quantitative

  15. 1997 statistical yearbook

    International Nuclear Information System (INIS)

    1998-01-01

    The international office of energy information and studies (Enerdata) has published the second edition of its 1997 statistical yearbook, which includes consolidated 1996 data with respect to the previous version from June 1997. The CD-Rom comprises the annual worldwide petroleum, natural gas, coal and electricity statistics from 1991 to 1996, with information about production, external trade, consumption, market shares, sectoral distribution of consumption and energy balance sheets. The world is divided into 12 zones (52 countries available). It also contains energy indicators: production and consumption tendencies, supply and production structures, safety of supplies, energy efficiency, and CO2 emissions. (J.S.)

  16. Einstein's statistical mechanics

    Energy Technology Data Exchange (ETDEWEB)

    Baracca, A; Rechtman S, R

    1985-08-01

    The foundations of equilibrium classical statistical mechanics were laid down in 1902 independently by Gibbs and Einstein. The latter's contribution, developed in three papers published between 1902 and 1904, is usually forgotten and, when not, rapidly dismissed as equivalent to Gibbs's. We review in detail Einstein's ideas on the foundations of statistical mechanics and show that they constitute the beginning of a research program that led Einstein to quantum theory. We also show how these ideas may be used as a starting point for an introductory course on the subject.

  17. Einstein's statistical mechanics

    International Nuclear Information System (INIS)

    Baracca, A.; Rechtman S, R.

    1985-01-01

    The foundations of equilibrium classical statistical mechanics were laid down in 1902 independently by Gibbs and Einstein. The latter's contribution, developed in three papers published between 1902 and 1904, is usually forgotten and, when not, rapidly dismissed as equivalent to Gibbs's. We review in detail Einstein's ideas on the foundations of statistical mechanics and show that they constitute the beginning of a research program that led Einstein to quantum theory. We also show how these ideas may be used as a starting point for an introductory course on the subject. (author)

  18. Elementary Statistics Tables

    CERN Document Server

    Neave, Henry R

    2012-01-01

    This book, designed for students taking a basic introductory course in statistical analysis, is far more than just a book of tables. Each table is accompanied by a careful but concise explanation and useful worked examples. Requiring little mathematical background, Elementary Statistics Tables is thus not just a reference book but a positive and user-friendly teaching and learning aid. The new edition contains a new and comprehensive "teach-yourself" section on a simple but powerful approach, now well-known in parts of industry but less so in academia, to analysing and interpreting process data.

  19. Radiation counting statistics

    Energy Technology Data Exchange (ETDEWEB)

    Suh, M. Y.; Jee, K. Y.; Park, K. K.; Park, Y. J.; Kim, W. H

    1999-08-01

    This report is intended to describe the statistical methods necessary to design and conduct radiation counting experiments and evaluate the data from the experiment. The methods are described for the evaluation of the stability of a counting system and the estimation of the precision of counting data by application of probability distribution models. The methods for the determination of the uncertainty of the results calculated from the number of counts, as well as various statistical methods for the reduction of counting error are also described. (Author). 11 refs., 8 tabs., 8 figs.
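As a minimal sketch of the counting-uncertainty methods such a report covers, the snippet below computes a background-corrected count rate and its standard uncertainty, assuming Poisson counting statistics (the variance of a raw count equals the count itself); the count numbers and times are hypothetical, not taken from the report.

```python
import math

def net_rate(gross_counts, t_gross, bkg_counts, t_bkg):
    """Background-corrected count rate and its 1-sigma uncertainty.

    Assumes Poisson counting statistics: var(N) = N for a raw count N,
    so the rate N/t contributes a variance of N/t**2.
    """
    rate = gross_counts / t_gross - bkg_counts / t_bkg
    sigma = math.sqrt(gross_counts / t_gross**2 + bkg_counts / t_bkg**2)
    return rate, sigma

# Hypothetical measurement: 10 000 gross counts in 100 s, 400 background counts in 100 s
rate, sigma = net_rate(10_000, 100.0, 400, 100.0)
print(f"net rate = {rate:.2f} +/- {sigma:.2f} counts/s")  # prints: net rate = 96.00 +/- 1.02 counts/s
```

The same error-propagation pattern extends to any quantity derived from raw counts, e.g. efficiency-corrected activities.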

  20. Radiation counting statistics

    Energy Technology Data Exchange (ETDEWEB)

    Suh, M. Y.; Jee, K. Y.; Park, K. K. [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-08-01

    This report is intended to describe the statistical methods necessary to design and conduct radiation counting experiments and evaluate the data from the experiments. The methods are described for the evaluation of the stability of a counting system and the estimation of the precision of counting data by application of probability distribution models. The methods for the determination of the uncertainty of the results calculated from the number of counts, as well as various statistical methods for the reduction of counting error are also described. 11 refs., 6 figs., 8 tabs. (Author)

  1. Bayesian statistics an introduction

    CERN Document Server

    Lee, Peter M

    2012-01-01

    Bayesian Statistics is the school of thought that combines prior beliefs with the likelihood of a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee’s book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo based techniques. This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo (RJMCMC), providing a concise account of the way in which the Bayesian approach to statistics develops as well

  2. Search Databases and Statistics

    DEFF Research Database (Denmark)

    Refsgaard, Jan C; Munk, Stephanie; Jensen, Lars J

    2016-01-01

    having strengths and weaknesses that must be considered for the individual needs. These are reviewed in this chapter. Equally critical for generating highly confident output datasets is the application of sound statistical criteria to limit the inclusion of incorrect peptide identifications from database...... searches. Additionally, careful filtering and use of appropriate statistical tests on the output datasets affects the quality of all downstream analyses and interpretation of the data. Our considerations and general practices on these aspects of phosphoproteomics data processing are presented here....

  3. Environmental accounting and statistics

    International Nuclear Information System (INIS)

    Bartelmus, P.L.P.

    1992-01-01

    The objective of sustainable development is to integrate environmental concerns with mainstream socio-economic policies. Integrated policies need to be supported by integrated data. Environmental accounting achieves this integration by incorporating environmental costs and benefits into conventional national accounts. Modified accounting aggregates can thus be used in defining and measuring environmentally sound and sustainable economic growth. Further development objectives need to be assessed by more comprehensive, though necessarily less integrative, systems of environmental statistics and indicators. Integrative frameworks for the different statistical systems in the fields of economy, environment and population would facilitate the provision of comparable data for the analysis of integrated development. (author). 19 refs, 2 figs, 2 tabs

  4. Radiation counting statistics

    International Nuclear Information System (INIS)

    Suh, M. Y.; Jee, K. Y.; Park, K. K.; Park, Y. J.; Kim, W. H.

    1999-08-01

    This report is intended to describe the statistical methods necessary to design and conduct radiation counting experiments and evaluate the data from the experiment. The methods are described for the evaluation of the stability of a counting system and the estimation of the precision of counting data by application of probability distribution models. The methods for the determination of the uncertainty of the results calculated from the number of counts, as well as various statistical methods for the reduction of counting error are also described. (Author). 11 refs., 8 tabs., 8 figs

  5. STATISTIC MODELING OF DRYING KINETHIC OF SPINACH LEAVES USING MICROWAVE AND HOT AIR METHODS

    OpenAIRE

    Mojtaba Nouri; Marzieh Vahdani; Shilan Rashidzadeh; Lukáš Hleba; Mohammad Ali Shariati

    2015-01-01

    The target of this study was to model the drying of spinach leaves using microwave and hot-air dryers. The test was performed with combined treatments of temperature (50°C, 60°C, and 70°C) and microwave power (90, 180, 360, 600 and 900 W) in 3 replications. Sample moisture was measured during drying. All the results were fitted and analyzed with 8 mathematical models on the basis of 3 statistical parameters: the coefficient of determination (R2), chi-square (X2), and root mean square error (RMSE). Results also revealed that temperature and micr...
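As an illustrative sketch of how such fits are scored, the snippet below fits the Page thin-layer model MR = exp(-k·t^n) to hypothetical moisture-ratio data by log-linearisation, then reports R2, reduced chi-square and RMSE as conventionally defined in drying studies. The data values and the choice of the Page model are assumptions for illustration, not the paper's.

```python
import numpy as np

# Hypothetical moisture-ratio observations at drying times (minutes)
t = np.array([0, 10, 20, 30, 45, 60, 90, 120], dtype=float)
mr_obs = np.array([1.00, 0.78, 0.61, 0.48, 0.34, 0.24, 0.12, 0.06])

# Page model: MR = exp(-k * t**n).  Linearise via ln(-ln MR) = ln k + n ln t
mask = (t > 0) & (mr_obs < 1)
slope_intercept = np.polyfit(np.log(t[mask]), np.log(-np.log(mr_obs[mask])), 1)
n, k = slope_intercept[0], np.exp(slope_intercept[1])

mr_fit = np.exp(-k * t**n)

# Goodness-of-fit metrics as commonly defined in thin-layer drying studies
resid = mr_obs - mr_fit
ss_res = np.sum(resid**2)
ss_tot = np.sum((mr_obs - mr_obs.mean()) ** 2)
n_obs, n_par = len(mr_obs), 2
r2 = 1 - ss_res / ss_tot            # coefficient of determination
chi2 = ss_res / (n_obs - n_par)     # reduced chi-square
rmse = np.sqrt(ss_res / n_obs)      # root mean square error
print(f"k={k:.4f}, n={n:.3f}, R2={r2:.4f}, chi2={chi2:.2e}, RMSE={rmse:.4f}")
```

In practice each candidate model is fitted the same way and the one with the highest R2 and lowest chi-square and RMSE is selected.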

  6. Fisher's Contributions to Statistics

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 2; Issue 9. Fisher's Contributions to Statistics. T Krishnan. General Article Volume 2 Issue 9 September 1997 pp 32-37. Fulltext. Click here to view fulltext PDF. Permanent link: https://www.ias.ac.in/article/fulltext/reso/002/09/0032-0037. Author Affiliations.

  7. ASURV: Astronomical SURVival Statistics

    Science.gov (United States)

    Feigelson, E. D.; Nelson, P. I.; Isobe, T.; LaValley, M.

    2014-06-01

    ASURV (Astronomical SURVival Statistics) provides astronomy survival analysis for right- and left-censored data including the maximum-likelihood Kaplan-Meier estimator and several univariate two-sample tests, bivariate correlation measures, and linear regressions. ASURV is written in FORTRAN 77, and is stand-alone and does not call any specialized libraries.

  8. Elementary statistical physics

    CERN Document Server

    Kittel, C

    1965-01-01

    This book is intended to help physics students attain a modest working knowledge of several areas of statistical mechanics, including stochastic processes and transport theory. The areas discussed are among those forming a useful part of the intellectual background of a physicist.

  9. Whither Statistics Education Research?

    Science.gov (United States)

    Watson, Jane

    2016-01-01

    This year marks the 25th anniversary of the publication of a "National Statement on Mathematics for Australian Schools", which was the first curriculum statement this country had including "Chance and Data" as a significant component. It is hence an opportune time to survey the history of the related statistics education…

  10. SAPS, Crime statistics

    African Journals Online (AJOL)

    incidents' refer to 'incidents such as labour disputes and dissatisfaction with service delivery in which violence erupted and SAPS action was required to restore peace and order'.26. It is apparent from both the SAPS statistics and those provided by the Municipal IQ Hotspots. Monitor, that public protests and gatherings are.

  11. Statistical core design

    International Nuclear Information System (INIS)

    Oelkers, E.; Heller, A.S.; Farnsworth, D.A.; Kearfott, K.J.

    1978-01-01

    The report describes the statistical analysis of DNBR thermal-hydraulic margin of a 3800 MWt, 205-FA core under design overpower conditions. The analysis used LYNX-generated data at predetermined values of the input variables whose uncertainties were to be statistically combined. LYNX data were used to construct an efficient response surface model in the region of interest; the statistical analysis was accomplished through the evaluation of core reliability, utilizing propagation of the uncertainty distributions of the inputs. The response surface model was implemented in both analytical error propagation and Monte Carlo techniques. The basic structural units relating to the acceptance criteria are fuel pins. Therefore, the statistical population of pins with minimum DNBR values smaller than specified values is determined. The specified values are designated relative to the most probable and maximum design DNBR values on the power-limiting pin used in present design analysis, so that gains over the present design criteria could be assessed for specified probabilistic acceptance criteria. The results are equivalent to gains ranging from 1.2 to 4.8 percent of rated power, dependent on the acceptance criterion. The corresponding acceptance criteria range from 95 percent confidence that no pin will be in DNB to 99.9 percent of the pins, which are expected to avoid DNB
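The Monte Carlo propagation step can be illustrated with a toy example: a hypothetical quadratic response surface stands in for the LYNX-based model, and input uncertainties are sampled to obtain the distribution of minimum DNBR. The coefficients, input uncertainties and DNBR limit below are invented for illustration, not taken from the report.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical response-surface surrogate: minimum DNBR as a function of two
# normalized input deviations (coefficients invented for illustration)
def min_dnbr(power, flow):
    return 1.30 - 0.25 * power + 0.18 * flow - 0.05 * power * flow

# Monte Carlo propagation of the input uncertainty distributions
power = rng.normal(0.0, 0.02, 100_000)  # relative power uncertainty
flow = rng.normal(0.0, 0.03, 100_000)   # relative flow uncertainty
dnbr = min_dnbr(power, flow)

p_violate = np.mean(dnbr < 1.29)  # fraction of sampled cores below a DNBR limit
print(f"mean DNBR = {dnbr.mean():.4f}, P(DNBR < 1.29) = {p_violate:.4f}")
```

Repeating such a calculation for different acceptance limits is what lets a margin gain be quoted against a probabilistic criterion.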

  12. Illinois forest statistics, 1985.

    Science.gov (United States)

    Jerold T. Hahn

    1987-01-01

    The third inventory of the timber resource of Illinois shows a 1% increase in commercial forest area and a 40% gain in growing-stock volume between 1962 and 1985. Presented are highlights and statistics on area, volume, growth, mortality, removals, utilization, and biomass.

  13. Air Carrier Traffic Statistics.

    Science.gov (United States)

    2013-11-01

    This report contains airline operating statistics for large certificated air carriers based on data reported to U.S. Department of Transportation (DOT) by carriers that hold a certificate issued under Section 401 of the Federal Aviation Act of 1958 a...

  14. Introduction to statistics

    CERN Multimedia

    CERN. Geneva

    2005-01-01

    The three lectures will present an introduction to statistical methods as used in High Energy Physics. As the time will be very limited, the course will seek mainly to define the important issues and to introduce the most widely used tools. Topics will include the interpretation and use of probability, estimation of parameters and testing of hypotheses.

  15. Introduction to Statistics course

    CERN Multimedia

    CERN. Geneva HR-RFA

    2006-01-01

    The four lectures will present an introduction to statistical methods as used in High Energy Physics. As the time will be very limited, the course will seek mainly to define the important issues and to introduce the most widely used tools. Topics will include the interpretation and use of probability, estimation of parameters and testing of hypotheses.

  16. Fisher's Contributions to Statistics

    Indian Academy of Sciences (India)

    of statistics are multifarious, profound and long-lasting. In fact, he can be ... that it is not even possible to mention them all in this short article. ... At that time the term 'likelihood' as oppo- .... Dedicated to the memory of Fisher soon after his death,.

  17. Michigan forest statistics, 1980.

    Science.gov (United States)

    Gerhard K. Raile; W. Brad Smith

    1983-01-01

    The fourth inventory of the timber resource of Michigan shows a 7% decline in commercial forest area and a 27% gain in growing-stock volume between 1966 and 1980. Highlights and statistics are presented on area, volume, growth, mortality, removals, utilization, and biomass.

  18. Air Carrier Traffic Statistics.

    Science.gov (United States)

    2012-07-01

    This report contains airline operating statistics for large certificated air carriers based on data reported to U.S. Department of Transportation (DOT) by carriers that hold a certificate issued under Section 401 of the Federal Aviation Act of 1958 a...

  19. Geometric statistical inference

    International Nuclear Information System (INIS)

    Periwal, Vipul

    1999-01-01

    A reparametrization-covariant formulation of the inverse problem of probability is explicitly solved for finite sample sizes. The inferred distribution is explicitly continuous for finite sample size. A geometric solution of the statistical inference problem in higher dimensions is outlined

  20. Introduction to statistics

    CERN Multimedia

    CERN. Geneva

    2004-01-01

    The three lectures will present an introduction to statistical methods as used in High Energy Physics. As the time will be very limited, the course will seek mainly to define the important issues and to introduce the most widely used tools. Topics will include the interpretation and use of probability, estimation of parameters and testing of hypotheses.

  1. Hemophilia Data and Statistics

    Science.gov (United States)

    ... genetic testing is done to diagnose hemophilia before birth. For the one-third ... rates and hospitalization rates for bleeding complications from hemophilia ...

  2. Statistical learning and prejudice.

    Science.gov (United States)

    Madison, Guy; Ullén, Fredrik

    2012-12-01

    Human behavior is guided by evolutionarily shaped brain mechanisms that make statistical predictions based on limited information. Such mechanisms are important for facilitating interpersonal relationships, avoiding dangers, and seizing opportunities in social interaction. We thus suggest that it is essential for analyses of prejudice and prejudice reduction to take the predictive accuracy and adaptivity of the studied prejudices into account.

  3. The Pleasures of Statistics

    CERN Document Server

    Mosteller, Frederick; Hoaglin, David C; Tanur, Judith M

    2010-01-01

    Includes chapter-length insider accounts of work on the pre-election polls of 1948, statistical aspects of the Kinsey report on sexual behavior in the human male, mathematical learning theory, authorship of the disputed Federalist papers, safety of anesthetics, and an examination of the Coleman report on equality of educational opportunity

  4. Juvenile Court Statistics - 1972.

    Science.gov (United States)

    Office of Youth Development (DHEW), Washington, DC.

    This report is a statistical study of juvenile court cases in 1972. The data demonstrates how the court is frequently utilized in dealing with juvenile delinquency by the police as well as by other community agencies and parents. Excluded from this report are the ordinary traffic cases handled by juvenile court. The data indicate that: (1) in…

  5. Juvenile Court Statistics, 1974.

    Science.gov (United States)

    Corbett, Jacqueline; Vereb, Thomas S.

    This report presents information on juvenile court processing of youth in the U.S. during 1974. It is based on data gathered under the National Juvenile Court Statistical Reporting System. Findings can be summarized as follows: (1) 1,252,700 juvenile delinquency cases, excluding traffic offenses, were handled by courts in the U.S. in 1974; (2) the…

  6. Statistics for Engineers

    International Nuclear Information System (INIS)

    Kim, Jin Gyeong; Park, Jin Ho; Park, Hyeon Jin; Lee, Jae Jun; Jun, Whong Seok; Whang, Jin Su

    2009-08-01

    This book explains statistics for engineers using MATLAB, covering the arrangement and summary of data, probability, probability distributions, sampling distributions, estimation, hypothesis testing, analysis of variance, regression analysis, categorical data analysis, quality assurance (including the concept of control charts, consecutive control charts, breakthrough strategy, and analysis using MATLAB), reliability analysis (including the measurement of reliability and analysis with MATLAB), and Markov chains.

  7. Thermodynamics for Fractal Statistics

    OpenAIRE

    da Cruz, Wellington

    1998-01-01

    We consider the thermodynamic properties of an anyon gas, taking into account the fractal statistics obtained by us recently. This approach describes the anyonic excitations in terms of equivalence classes labeled by a fractal parameter or Hausdorff dimension $h$. An exact equation of state is obtained in the high-temperature and low-temperature limits, for gases with a constant density of states.

  8. Statistical Hadronization and Holography

    DEFF Research Database (Denmark)

    Bechi, Jacopo

    2009-01-01

    In this paper we consider some issues concerning the statistical model of hadronization in a holographic approach. We introduce a Rindler-like horizon in the bulk and understand string breaking as a tunneling event under this horizon. We calculate the hadron spectrum and we get a thermal...

  9. Per Object statistical analysis

    DEFF Research Database (Denmark)

    2008-01-01

    of a specific class in turn, and uses a pair of PPO stages to derive the statistics and then assign them to the objects' Object Variables. It may be that this could all be done in some other, simpler way, but several other ways that were tried did not succeed. The procedure output has been tested against...

  10. Beyond quantum microcanonical statistics

    International Nuclear Information System (INIS)

    Fresch, Barbara; Moro, Giorgio J.

    2011-01-01

    Descriptions of molecular systems usually refer to two distinct theoretical frameworks. On the one hand the quantum pure state, i.e., the wavefunction, of an isolated system is determined to calculate molecular properties and their time evolution according to the unitary Schroedinger equation. On the other hand a mixed state, i.e., a statistical density matrix, is the standard formalism to account for thermal equilibrium, as postulated in the microcanonical quantum statistics. In the present paper an alternative treatment relying on a statistical analysis of the possible wavefunctions of an isolated system is presented. In analogy with the classical ergodic theory, the time evolution of the wavefunction determines the probability distribution in the phase space pertaining to an isolated system. However, this alone cannot account for a well defined thermodynamical description of the system in the macroscopic limit, unless a suitable probability distribution for the quantum constants of motion is introduced. We present a workable formalism assuring the emergence of typical values of thermodynamic functions, such as the internal energy and the entropy, in the large size limit of the system. This allows the identification of macroscopic properties independently of the specific realization of the quantum state. A description of material systems in agreement with equilibrium thermodynamics is then derived without constraints on the physical constituents and interactions of the system. Furthermore, the canonical statistics is recovered in all generality for the reduced density matrix of a subsystem.

  11. Statistical Analysis and validation

    NARCIS (Netherlands)

    Hoefsloot, H.C.J.; Horvatovich, P.; Bischoff, R.

    2013-01-01

    In this chapter guidelines are given for the selection of a few biomarker candidates from a large number of compounds with a relative low number of samples. The main concepts concerning the statistical validation of the search for biomarkers are discussed. These complicated methods and concepts are

  12. Statistics of Local Extremes

    DEFF Research Database (Denmark)

    Larsen, Gunner Chr.; Bierbooms, W.; Hansen, Kurt Schaldemose

    2003-01-01

    A theoretical expression for the probability density function associated with local extremes of a stochastic process is presented. The expression is based on the lower four statistical moments and a bandwidth parameter. The theoretical expression is subsequently verified by comparison with simulated...

  13. The Bayesian Score Statistic

    NARCIS (Netherlands)

    Kleibergen, F.R.; Kleijn, R.; Paap, R.

    2000-01-01

    We propose a novel Bayesian test under a (noninformative) Jeffreys' prior specification. We check whether the fixed scalar value of the so-called Bayesian Score Statistic (BSS) under the null hypothesis is a plausible realization from its known and standardized distribution under the alternative. Unlike

  14. Topics in Statistical Calibration

    Science.gov (United States)

    2014-03-27

    Natural cubic spline ... The calibrate function ... The most basic calibration problem, the one often encountered in more advanced ... A. M. Mood, F. A. Graybill, and D. C. Boes, Introduction to the Theory of Statistics, McGraw-Hill, Auckland, 1974.

  15. On quantum statistical inference

    NARCIS (Netherlands)

    Barndorff-Nielsen, O.E.; Gill, R.D.; Jupp, P.E.

    2001-01-01

    Recent developments in the mathematical foundations of quantum mechanics have brought the theory closer to that of classical probability and statistics. On the other hand, the unique character of quantum physics sets many of the questions addressed apart from those met classically in stochastics.

  16. Statistical mechanics of solitons

    International Nuclear Information System (INIS)

    Bishop, A.

    1980-01-01

    The status of statistical mechanics theory (classical and quantum, statics and dynamics) is reviewed for 1-D soliton or solitary-wave-bearing systems. Primary attention is given to (i) perspective for existing results with evaluation and representative literature guide; (ii) motivation and status report for remaining problems; (iii) discussion of connections with other 1-D topics

  17. Statistics for Learning Genetics

    Science.gov (United States)

    Charles, Abigail Sheena

    This study investigated the knowledge and skills that biology students may need to help them understand statistics/mathematics as it applies to genetics. The data are based on analyses of current representative genetics texts, practicing genetics professors' perspectives, and, more directly, students' perceptions of, and performance in, doing statistically-based genetics problems. This issue is at the emerging edge of modern college-level genetics instruction, and this study attempts to identify key theoretical components for creating a specialized biological statistics curriculum. The goal of this curriculum will be to prepare biology students with the skills for assimilating quantitatively-based genetic processes, increasingly at the forefront of modern genetics. To fulfill this, two college-level classes at two universities were surveyed. One university was located in the northeastern US and the other in the West Indies. There was a sample size of 42 students, and a supplementary interview was administered to 9 selected students. Interviews were also administered to professors in the field in order to gain insight into the teaching of statistics in genetics. Key findings indicated that 55% of students had very little to no background in statistics. Although students did perform well on exams, with 60% of the population receiving an A or B grade, 77% of them did not offer good explanations on a probability question associated with the normal distribution provided in the survey. The scope and presentation of the applicable statistics/mathematics in some of the most used textbooks in genetics teaching, as well as the genetics syllabi used by instructors, do not help the issue. It was found that the textbooks often either did not give effective explanations for students or completely left out certain topics. The omission of certain statistically/mathematically oriented topics was also seen in the genetics syllabi reviewed for this study. Nonetheless

  18. Software Used to Generate Cancer Statistics - SEER Cancer Statistics

    Science.gov (United States)

    Videos that highlight topics and trends in cancer statistics and definitions of statistical terms. Also software tools for analyzing and reporting cancer statistics, which are used to compile SEER's annual reports.

  19. The Statistical Fermi Paradox

    Science.gov (United States)

    Maccone, C.

    In this paper the statistical generalization of the Fermi paradox is provided. The statistics of habitable planets may be based on a set of ten (and possibly more) astrobiological requirements first pointed out by Stephen H. Dole in his book Habitable Planets for Man (1964). The statistical generalization of the original and by now too simplistic Dole equation is provided by replacing a product of ten positive numbers by the product of ten positive random variables. This is denoted the SEH, an acronym standing for “Statistical Equation for Habitables”. The proof in this paper is based on the Central Limit Theorem (CLT) of statistics, stating that the sum of any number of independent random variables, each of which may be arbitrarily distributed, approaches a Gaussian (i.e. normal) random variable (Lyapunov form of the CLT). It is then shown that: 1. The new random variable NHab, yielding the number of habitables (i.e. habitable planets) in the Galaxy, follows the log-normal distribution. By construction, the mean value of this log-normal distribution is the total number of habitable planets as given by the statistical Dole equation. 2. The ten (or more) astrobiological factors are now positive random variables. The probability distribution of each random variable may be arbitrary. The CLT in the so-called Lyapunov or Lindeberg forms (neither of which assumes the factors to be identically distributed) allows for that. In other words, the CLT "translates" into the SEH by allowing an arbitrary probability distribution for each factor. This is both astrobiologically realistic and useful for any further investigations. 3. By applying the SEH it is shown that the (average) distance between any two nearby habitable planets in the Galaxy is inversely proportional to the cubic root of NHab. This distance is denoted by the new random variable D. The relevant probability density function is derived, which was named the "Maccone distribution" by Paul Davies in
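The CLT mechanism behind the SEH can be checked numerically: the log of a product of independent positive factors is a sum of independent terms, so its skewness shrinks toward the Gaussian value of zero as factors are multiplied in. The ten uniformly distributed factors below are a made-up stand-in for Dole's astrobiological requirements, purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
n_factors, n_samples = 10, 200_000

# Ten positive, arbitrarily distributed factors (hypothetical uniform ranges,
# purely illustrative of the SEH construction)
factors = rng.uniform(0.1, 2.0, size=(n_samples, n_factors))
n_hab = factors.prod(axis=1)

def skewness(x):
    z = (x - x.mean()) / x.std()
    return np.mean(z**3)

# log of the product is a sum of independent terms; by the CLT its skewness
# shrinks (roughly by a factor 1/sqrt(10)) relative to a single factor's log
s_one = skewness(np.log(factors[:, 0]))
s_sum = skewness(np.log(n_hab))
print(f"skew of log(single factor): {s_one:.3f}")
print(f"skew of log(product of 10): {s_sum:.3f}")
```

With more factors the product's log approaches normality ever more closely, which is exactly the log-normal claim of the SEH.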

  20. Bose and his statistics

    International Nuclear Information System (INIS)

    Venkataraman, G.

    1992-01-01

    Treating the radiation gas as a classical gas, Einstein derived Planck's law of radiation by considering the dynamic equilibrium between atoms and radiation. Dissatisfied with this treatment, S.N. Bose derived Planck's law in another, original way. He treated the problem in generality: he counted how many cells were available for the photon gas in phase space and distributed the photons into these cells. In this manner of distribution, there were three radically new ideas: the indistinguishability of particles, the spin of the photon (with only two possible orientations) and the nonconservation of photon number. This gave rise to a new discipline of quantum statistical mechanics. The physics underlying Bose's discovery, its significance and its role in the development of the concept of the ideal gas, the spin-statistics theorem and spin particles are described. The book has been written in a simple and direct language in an informal style, aiming to stimulate the curiosity of the reader. (M.G.B.)

  1. Visuanimation in statistics

    KAUST Repository

    Genton, Marc G.

    2015-04-14

    This paper explores the use of visualization through animations, coined visuanimation, in the field of statistics. In particular, it illustrates the embedding of animations in the paper itself and the storage of larger movies in the online supplemental material. We present results from statistics research projects using a variety of visuanimations, ranging from exploratory data analysis of image data sets to spatio-temporal extreme event modelling; these include a multiscale analysis of classification methods, the study of the effects of a simulated explosive volcanic eruption and an emulation of climate model output. This paper serves as an illustration of visuanimation for future publications in Stat. Copyright © 2015 John Wiley & Sons, Ltd.

  2. Fermions from classical statistics

    International Nuclear Information System (INIS)

    Wetterich, C.

    2010-01-01

    We describe fermions in terms of a classical statistical ensemble. The states τ of this ensemble are characterized by a sequence of values one or zero or a corresponding set of two-level observables. Every classical probability distribution can be associated to a quantum state for fermions. If the time evolution of the classical probabilities p_τ amounts to a rotation of the wave function q_τ(t) = ±√(p_τ(t)), we infer the unitary time evolution of a quantum system of fermions according to a Schrödinger equation. We establish how such classical statistical ensembles can be mapped to Grassmann functional integrals. Quantum field theories for fermions arise for a suitable time evolution of classical probabilities for generalized Ising models.
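    The mapping described, from classical probabilities p_τ to amplitudes q_τ(t) = ±√(p_τ(t)) whose time evolution is a rotation, can be illustrated in a toy two-state example (a sketch under my own assumptions, not Wetterich's construction):

```python
import math

# Two hypothetical classical states with probabilities p_tau.
p = [0.8, 0.2]
q = [math.sqrt(x) for x in p]  # amplitudes q_tau = sqrt(p_tau)

# A rotation of the amplitude vector stands in for unitary time evolution.
theta = 0.3
q_rot = [math.cos(theta) * q[0] - math.sin(theta) * q[1],
         math.sin(theta) * q[0] + math.cos(theta) * q[1]]

# Squaring the rotated amplitudes yields new, still-valid probabilities:
# the rotation preserves the norm, so they sum to 1.
p_new = [x * x for x in q_rot]
print(sum(p_new))  # ~1.0
```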

  3. Applied statistical thermodynamics

    CERN Document Server

    Lucas, Klaus

    1991-01-01

    The book guides the reader from the foundations of statistical thermodynamics, including the theory of intermolecular forces, to modern computer-aided applications in chemical engineering and physical chemistry. The approach is new. The foundations of quantum and statistical mechanics are presented in a simple way and their applications to the prediction of fluid phase behavior of real systems are demonstrated. A particular effort is made to introduce the reader to explicit formulations of intermolecular interaction models and to show how these models influence the properties of fluid systems. The established methods of statistical mechanics - computer simulation, perturbation theory, and numerical integration - are discussed in a style appropriate for newcomers and are extensively applied. Numerous worked examples illustrate how practical calculations should be carried out.

  4. Diffeomorphic Statistical Deformation Models

    DEFF Research Database (Denmark)

    Hansen, Michael Sass; Hansen, Mads Fogtman; Larsen, Rasmus

    2007-01-01

    In this paper we present a new method for constructing diffeomorphic statistical deformation models in arbitrary dimensional images with a nonlinear generative model and a linear parameter space. Our deformation model is a modified version of the diffeomorphic model introduced by Cootes et al. ... The modifications ensure that no boundary restriction has to be enforced on the parameter space to prevent folds or tears in the deformation field. For straightforward statistical analysis, principal component analysis and sparse methods, we assume that the parameters for a class of deformations lie on a linear ... with ground truth in the form of manual expert annotations, and compared to Cootes's model. We anticipate applications in unconstrained diffeomorphic synthesis of images, e.g. for tracking, segmentation, registration or classification purposes. ...

  5. Statistics for experimentalists

    CERN Document Server

    Cooper, B E

    2014-01-01

    Statistics for Experimentalists aims to provide experimental scientists with a working knowledge of statistical methods and search approaches to the analysis of data. The book first elaborates on probability and continuous probability distributions. Discussions focus on properties of continuous random variables and normal variables, independence of two random variables, central moments of a continuous distribution, prediction from a normal distribution, binomial probabilities, and multiplication of probabilities and independence. The text then examines estimation and tests of significance. Topics include estimators and estimates, expected values, minimum variance linear unbiased estimators, sufficient estimators, methods of maximum likelihood and least squares, and the test of significance method. The manuscript ponders on distribution-free tests, Poisson process and counting problems, correlation and function fitting, balanced incomplete randomized block designs and the analysis of covariance, and experiment...

  6. Statistical theory and inference

    CERN Document Server

    Olive, David J

    2014-01-01

    This text is for a one-semester graduate course in statistical theory and covers minimal and complete sufficient statistics, maximum likelihood estimators, the method of moments, bias and mean square error, uniform minimum variance estimators and the Cramer-Rao lower bound, an introduction to large sample theory, likelihood ratio tests and uniformly most powerful tests and the Neyman-Pearson Lemma. A major goal of this text is to make these topics much more accessible to students by using the theory of exponential families. Exponential families, indicator functions and the support of the distribution are used throughout the text to simplify the theory. More than 50 "brand name" distributions are used to illustrate the theory with many examples of exponential families, maximum likelihood estimators and uniformly minimum variance unbiased estimators. There are many homework problems with over 30 pages of solutions.
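    One of the central tools the text covers, maximum likelihood estimation, can be illustrated in a few lines. For an exponential family such as the Exponential(λ) distribution, the MLE has the closed form λ̂ = 1/x̄, a standard result; the sample data below are invented for the example:

```python
from statistics import mean

# Invented sample, nominally drawn from an Exponential(lambda = 2.5) law.
sample = [0.21, 0.55, 0.10, 0.83, 0.36, 0.47, 0.29, 0.62, 0.18, 0.39]

# For Exp(lambda), the log-likelihood n*log(lambda) - lambda*sum(x)
# is maximized at lambda_hat = 1 / (sample mean).
lambda_hat = 1.0 / mean(sample)
print(lambda_hat)  # 1 / 0.4 = 2.5
```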

  7. Classical and statistical thermodynamics

    CERN Document Server

    Rizk, Hanna A

    2016-01-01

    This is a textbook of thermodynamics for the student who seeks thorough training in science or engineering. Systematic and thorough treatment of the fundamental principles, rather than presentation of a large mass of facts, has been stressed. The book includes some of the historical and humanistic background of thermodynamics, without interrupting the continuity of the analytical treatment. For a clearer and more profound understanding of thermodynamics this book is highly recommended. In this respect, the author believes that a sound grounding in classical thermodynamics is an essential prerequisite for the understanding of statistical thermodynamics, and a book encompassing these two broad branches of thermodynamics is in fact unprecedented. As a work dealing systematically with the two main branches of thermodynamics, namely classical thermodynamics and statistical thermodynamics, together with several important indexes under one cover, this treatise is eminently useful.

  8. Evaluating statistical cloud schemes

    OpenAIRE

    Grützun, Verena; Quaas, Johannes; Morcrette, Cyril J.; Ament, Felix

    2015-01-01

    Statistical cloud schemes with prognostic probability distribution functions have become more important in atmospheric modeling, especially since they are in principle scale adaptive and capture cloud physics in more detail. While in theory the schemes have a great potential, their accuracy is still questionable. High-resolution three-dimensional observational data of water vapor and cloud water, which could be used for testing them, are missing. We explore the potential of ground-based re...

  9. 1979 DOE statistical symposium

    Energy Technology Data Exchange (ETDEWEB)

    Gardiner, D.A.; Truett, T. (comps. and eds.)

    1980-09-01

    The 1979 DOE Statistical Symposium was the fifth in the series of annual symposia designed to bring together statisticians and other interested parties who are actively engaged in helping to solve the nation's energy problems. The program included presentations of technical papers centered around exploration and disposal of nuclear fuel, general energy-related topics, and health-related issues, and workshops on model evaluation, risk analysis, analysis of large data sets, and resource estimation.

  10. Statistics I essentials

    CERN Document Server

    Milewski, Emil G

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, each of these concise, comprehensive study guides summarizes the essentials of the field covered. Essentials are helpful when preparing for exams or doing homework, and remain a lasting reference source for students, teachers, and professionals. Topics covered in Statistics I include frequency distributions, numerical methods of describing data, measures of variability, parameters of distributions, probability theory, and distributions.
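    Two of the topics listed, frequency distributions and measures of variability, can be computed with the Python standard library alone; the data set here is invented for the illustration:

```python
from collections import Counter
from statistics import mean, median, pstdev, pvariance

# Invented sample of ten observations.
data = [2, 3, 3, 5, 5, 5, 7, 8, 8, 9]

# Frequency distribution: value -> number of occurrences.
freq = Counter(data)
print(sorted(freq.items()))

# Numerical methods of describing data.
print(mean(data), median(data))        # 5.5 and 5.0

# Measures of variability (population variance and standard deviation).
print(pvariance(data), pstdev(data))   # 5.25 and its square root
```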

  11. Journey Through Statistical Mechanics

    Science.gov (United States)

    Yang, C. N.

    2013-05-01

    My first involvement with statistical mechanics and the many body problem was when I was a student at The National Southwest Associated University in Kunming during the war. At that time Professor Wang Zhu-Xi had just come back from Cambridge, England, where he was a student of Fowler, and his thesis was on phase transitions, a hot topic at that time, and still a very hot topic today...

  12. The Generalized Quantum Statistics

    OpenAIRE

    Hwang, WonYoung; Ji, Jeong-Young; Hong, Jongbae

    1999-01-01

    The concept of wavefunction reduction should be introduced to standard quantum mechanics in any physical process where effective reduction of the wavefunction occurs, as well as in measurement processes. When the overlap is negligible, each particle obeys Maxwell-Boltzmann statistics even if the particles are in principle described by a totally symmetrized wavefunction [P.R. Holland, The Quantum Theory of Motion, Cambridge University Press, 1993, p. 293]. We generalize the conjecture. That is, par...

  13. 1979 DOE statistical symposium

    International Nuclear Information System (INIS)

    Gardiner, D.A.; Truett, T.

    1980-09-01

    The 1979 DOE Statistical Symposium was the fifth in the series of annual symposia designed to bring together statisticians and other interested parties who are actively engaged in helping to solve the nation's energy problems. The program included presentations of technical papers centered around exploration and disposal of nuclear fuel, general energy-related topics, and health-related issues, and workshops on model evaluation, risk analysis, analysis of large data sets, and resource estimation

  14. Symmetry and statistics

    International Nuclear Information System (INIS)

    French, J.B.

    1974-01-01

    The concepts of statistical behavior and symmetry are presented from the point of view of many body spectroscopy. Remarks are made on methods for the evaluation of moments, particularly widths, for the purpose of giving a feeling for the types of mathematical structures encountered. Applications involving ground state energies, spectra, and level densities are discussed. The extent to which Hamiltonian eigenstates belong to irreducible representations is mentioned. (4 figures, 1 table) (U.S.)

  15. Reading Statistics And Research

    OpenAIRE

    Akbulut, Yavuz (reviewer)

    2008-01-01

    The book demonstrates the best and most conservative ways to decipher and critique research reports, particularly for social science researchers. In addition, new editions of the book are always better organized, effectively structured and meticulously updated in line with developments in the field of research statistics. Even the most trivial issues are revisited and updated in new editions. For instance, purchasers of previous editions might check the interpretation of skewness and ku...

  16. 2002 energy statistics

    International Nuclear Information System (INIS)

    2003-01-01

    This report has 12 chapters. The first chapter covers world energy reserves; the second, world primary energy production and consumption. The remaining chapters cover world energy prices; energy reserves in Turkey; Turkey's primary energy production and consumption; Turkey's energy balance tables; Turkey's primary energy reserves, production, consumption, imports and exports; sectoral energy consumption; Turkey's secondary electricity plants; Turkey's energy investments; and Turkey's energy prices. The report gives world and Turkey statistics on energy.

  17. Statistical and theoretical research

    International Nuclear Information System (INIS)

    Anon.

    1983-01-01

    Significant accomplishments include the creation of field designs to detect population impacts, new census procedures for small mammals, and methods for designing studies to determine where and how much of a contaminant is present over certain landscapes. A book describing these statistical methods is currently being written and will apply to a variety of environmental contaminants, including radionuclides. PNL scientists also have devised an analytical method for predicting the success of field experiments on wild populations. Two highlights of current research are the discoveries that populations of free-roaming horse herds can double in four years and that grizzly bear populations may be substantially smaller than once thought. As stray horses become a public nuisance at DOE and other large Federal sites, it is important to determine their number. Similar statistical theory can be readily applied to other situations where wild animals are a problem of concern to other government agencies. Another book, on statistical aspects of radionuclide studies, is written specifically for researchers in radioecology.

  18. Statistics for lawyers

    CERN Document Server

    Finkelstein, Michael O

    2015-01-01

    This classic text, first published in 1990, is designed to introduce law students, law teachers, practitioners, and judges to the basic ideas of mathematical probability and statistics as they have been applied in the law. The third edition includes over twenty new sections, including the addition of timely topics, like New York City police stops, exonerations in death-sentence cases, projecting airline costs, and new material on various statistical techniques such as the randomized response survey technique, rare-events meta-analysis, competing risks, and negative binomial regression. The book consists of sections of exposition followed by real-world cases and case studies in which statistical data have played a role. The reader is asked to apply the theory to the facts, to calculate results (a hand calculator is sufficient), and to explore legal issues raised by quantitative findings. The authors' calculations and comments are given in the back of the book. As with previous editions, the cases and case stu...

  19. Testing statistical hypotheses

    CERN Document Server

    Lehmann, E L

    2005-01-01

    The third edition of Testing Statistical Hypotheses updates and expands upon the classic graduate text, emphasizing optimality theory for hypothesis testing and confidence sets. The principal additions include a rigorous treatment of large sample optimality, together with the requisite tools. In addition, an introduction to the theory of resampling methods such as the bootstrap is developed. The sections on multiple testing and goodness of fit testing are expanded. The text is suitable for Ph.D. students in statistics and includes over 300 new problems out of a total of more than 760. E.L. Lehmann is Professor of Statistics Emeritus at the University of California, Berkeley. He is a member of the National Academy of Sciences and the American Academy of Arts and Sciences, and the recipient of honorary degrees from the University of Leiden, The Netherlands and the University of Chicago. He is the author of Elements of Large-Sample Theory and (with George Casella) he is also the author of Theory of Point Estimat...

  20. Overdispersion in nuclear statistics

    International Nuclear Information System (INIS)

    Semkow, Thomas M.

    1999-01-01

    The modern statistical distribution theory is applied to the development of the overdispersion theory in ionizing-radiation statistics for the first time. The physical nuclear system is treated as a sequence of binomial processes, each depending on a characteristic probability, such as the probability of decay, detection, etc. The probabilities fluctuate in the course of a measurement, and the physical reasons for this are discussed. If the average values of the probabilities change from measurement to measurement, as in the random Lexis binomial sampling scheme, the resulting distribution is overdispersed. The generating functions and probability distribution functions are derived, followed by a moment analysis. The Poisson and Gaussian limits are also given. The distribution functions belong to a family of generalized hypergeometric factorial moment distributions by Kemp and Kemp, and can serve as likelihood functions for statistical estimation. An application to radioactive decay with detection is described and working formulae are given, including a procedure for testing counting data for overdispersion. More complex experiments in nuclear physics (such as solar neutrino experiments) can be handled by this model, as can the problem of distinguishing between source and background.
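    The binomial-with-fluctuating-probability picture can be sketched numerically. The simulation below (my own illustration; function names and parameter values are invented, not from the paper) draws counting measurements with a fixed versus a randomly fluctuating detection probability and compares the index of dispersion, variance/mean, which is near 1 for Poisson-like data and exceeds 1 when overdispersion is present:

```python
import random
from statistics import mean, pvariance

random.seed(42)

def counts(n_meas, trials, p_draw):
    """Simulate n_meas counting measurements of `trials` decays,
    with the detection probability drawn afresh by p_draw() each time."""
    out = []
    for _ in range(n_meas):
        p = p_draw()
        out.append(sum(random.random() < p for _ in range(trials)))
    return out

# Constant probability: behaves like a Poisson process (dispersion ~ 1).
fixed = counts(2000, 500, lambda: 0.02)
# Lexis-type fluctuation of the probability between measurements.
mixed = counts(2000, 500, lambda: random.uniform(0.01, 0.03))

def dispersion(xs):
    """Index of dispersion: sample variance divided by sample mean."""
    return pvariance(xs) / mean(xs)

print(dispersion(fixed))  # close to 1
print(dispersion(mixed))  # noticeably above 1: overdispersed
```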