Components of the Pearson-Fisher chi-squared statistic
Directory of Open Access Journals (Sweden)
G. D. Raynery
2002-01-01
interpretation of the corresponding test statistic components has not previously been investigated. This paper provides the necessary details, as well as an overview of the decomposition options available, and revisits two published examples.
International Nuclear Information System (INIS)
Hofland, G.S.; Barton, C.C.
1990-01-01
The computer program FREQFIT is designed to perform regression and statistical chi-squared goodness of fit analysis on one-dimensional or two-dimensional data. The program features an interactive user dialogue, numerous help messages, an option for screen or line printer output, and the flexibility to use practically any commercially available graphics package to create plots of the program's results. FREQFIT is written in Microsoft QuickBASIC, for IBM-PC compatible computers. A listing of the QuickBASIC source code for the FREQFIT program, a user manual, and sample input data, output, and plots are included. 6 refs., 1 fig
Yuan, Ke-Hai
2008-01-01
In the literature of mean and covariance structure analysis, noncentral chi-square distribution is commonly used to describe the behavior of the likelihood ratio (LR) statistic under alternative hypothesis. Due to the inaccessibility of the rather technical literature for the distribution of the LR statistic, it is widely believed that the…
Ugoni, Antony; Walker, Bruce F.
1995-01-01
The Chi square test is a statistical test which measures the association between two categorical variables. A working knowledge of tests of this nature is important for the chiropractor and osteopath in order to be able to critically appraise the literature.
A DoS/DDoS Attack Detection System Using Chi-Square Statistic Approach
Directory of Open Access Journals (Sweden)
Fang-Yie Leu
2010-04-01
Full Text Available Nowadays, users can easily access and download network attack tools from the Internet, which often provide friendly interfaces and easily operated features. Therefore, even a naive hacker can launch a large-scale DoS or DDoS attack to prevent a system, i.e., the victim, from providing Internet services. In this paper, we propose an agent-based intrusion detection architecture, which is a distributed detection system, to detect DoS/DDoS attacks by invoking a statistical approach that compares source IP addresses' normal and current packet statistics to discriminate whether there is a DoS/DDoS attack. It first collects all source IPs' packet statistics so as to create their normal packet distribution. Once some IPs' current packet distribution suddenly changes, very often it is an attack. Experimental results show that this approach can effectively detect DoS/DDoS attacks.
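The detection idea described above can be sketched in a few lines: compare a source IP's current per-type packet counts against its recorded normal distribution with a chi-square measure, and flag a large deviation. This is a minimal illustration, not the paper's implementation; the bin layout, counts, and threshold are invented for the example.

```python
# Hedged sketch: chi-square divergence of current packet counts from a
# recorded baseline distribution. All counts and the threshold are
# illustrative assumptions, not values from the paper.

def chi_square_divergence(normal_counts, current_counts):
    """Chi-square distance of current bin counts from the baseline shape,
    with expected counts rescaled to the current traffic volume."""
    total_normal = sum(normal_counts)
    total_current = sum(current_counts)
    stat = 0.0
    for n, c in zip(normal_counts, current_counts):
        expected = n / total_normal * total_current  # baseline shape, current volume
        if expected > 0:
            stat += (c - expected) ** 2 / expected
    return stat

baseline = [100, 80, 60, 40]   # packets per type under normal operation
calm     = [52, 41, 29, 20]    # similar shape -> small statistic
flood    = [900, 20, 15, 10]   # sudden surge in one type -> large statistic

THRESHOLD = 10.0               # assumed detection threshold
print(chi_square_divergence(baseline, calm) > THRESHOLD)   # False: no alarm
print(chi_square_divergence(baseline, flood) > THRESHOLD)  # True: alarm
```

Because the expected counts are rescaled to the current total, the statistic responds to changes in the shape of the distribution rather than mere growth in volume.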
Directory of Open Access Journals (Sweden)
Antônio Fernando Branco Costa
2005-08-01
Full Text Available It is standard practice to use joint charts in process control, one designed to detect shifts in the mean and the other to detect changes in the variance of the process. In this paper, we propose the use of a single chart to control both mean and variance. Based on the noncentral chi-square statistic, the single chart is faster in detecting shifts in the mean and increases in variance than its competitor, the joint "Xbar" and R charts. The noncentral chi-square statistic can also be used with the EWMA procedure, particularly in the detection of small mean shifts, accompanied or not by slight increases in variance.
Chi-square test and its application in hypothesis testing
Directory of Open Access Journals (Sweden)
Rakesh Rana
2015-01-01
Full Text Available In medical research, there are studies which often collect data on categorical variables that can be summarized as a series of counts. These counts are commonly arranged in a tabular format known as a contingency table. The chi-square test statistic can be used to evaluate whether there is an association between the rows and columns in a contingency table. More specifically, this statistic can be used to determine whether there is any difference between the study groups in the proportions of the risk factor of interest. The chi-square test and the logic of hypothesis testing were developed by Karl Pearson. This article describes in detail what a chi-square test is, on which type of data it is used, the assumptions associated with its application, how to calculate it manually, and how to use an online calculator to obtain the chi-square statistic and its associated P-value.
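The calculation described in the abstract can be written out directly: for each cell, compare the observed count to the count expected under independence (row total × column total / grand total). The following sketch uses invented counts for a hypothetical 2×2 exposure/outcome table.

```python
# Hedged sketch: Pearson chi-square statistic for an r x c contingency
# table, computed from first principles. The counts are hypothetical.

def chi_square_stat(table):
    """Pearson chi-square statistic for a table of observed counts."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_totals[i] * col_totals[j] / grand  # expected under independence
            stat += (obs - exp) ** 2 / exp
    return stat

observed = [[20, 30],   # exposed:   outcome yes / outcome no
            [10, 40]]   # unexposed: outcome yes / outcome no
print(round(chi_square_stat(observed), 4))  # → 4.7619
```

With 1 degree of freedom the 5% critical value is 3.84, so this hypothetical table would indicate a significant association.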
Chi-squared goodness of fit tests with applications
Balakrishnan, N; Nikulin, MS
2013-01-01
Chi-Squared Goodness of Fit Tests with Applications provides a thorough and complete context for the theoretical basis and implementation of Pearson's monumental contribution and its wide applicability for chi-squared goodness of fit tests. The book is ideal for researchers and scientists conducting statistical analysis in the processing of experimental data, as well as for students and practitioners with a good mathematical background who use statistical methods. The historical context, especially Chapter 7, provides great insight into the importance of this subject, with an authoritative author team.
Biermann, Carol
1988-01-01
Described is a study designed to introduce students to the behavior of common invertebrate animals and to the use of the chi-square statistical technique. Discusses activities with snails, pill bugs, and mealworms. Provides an abbreviated chi-square table and instructions for performing the experiments and statistical tests. (CW)
Chi-squared: A simpler evaluation function for multiple-instance learning
National Research Council Canada - National Science Library
McGovern, Amy; Jensen, David
2003-01-01
...) but finds the best concept using the chi-square statistic. This approach is simpler than diverse density and allows us to search more extensively by using properties of the contingency table to prune in a guaranteed manner...
Applied noncentral Chi-squared distribution in CFAR detection of hyperspectral projected images
Li, Zhiyong; Chen, Dong; Shi, Gongtao; Yang, Guopeng; Wang, Gang
2015-10-01
In this paper, the noncentral chi-squared distribution is applied in Constant False Alarm Rate (CFAR) detection on hyperspectral projected images to distinguish anomaly points from the background. Usually, the processing of hyperspectral anomaly detectors can be considered a linear projection. These operators are linear transforms, and their results are quadratic forms arising from the transform of the spectral vector. In general, the chi-squared distribution would be the natural choice to describe the statistical characteristics of such a projected image. However, because of the strong correlation among the bands, the standard central chi-squared distribution often cannot fit the stochastic characteristics of the projected images precisely. In this paper, we use a noncentral chi-squared distribution to approximate the projected image of subspace-based anomaly detectors. First, the statistical model of the projected multivariate data is analysed and a noncentral chi-squared distribution is deduced. Then, the approach for calculating the parameters is introduced. Finally, aerial hyperspectral images are used to verify the effectiveness of the proposed method in tightly modeling the statistical distribution of the projected image.
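The key fact behind this modeling choice can be checked by simulation: a sum of k squared Gaussians with nonzero means follows a noncentral chi-square law with mean k + λ, where λ is the sum of squared means. The sketch below is a generic illustration of that property, not the paper's estimator; the dimension and means are invented.

```python
# Hedged illustration: a quadratic form built from shifted unit Gaussians
# has mean k + lambda (noncentral chi-square). Values are synthetic.

import random

def simulate_noncentral_chi2_mean(k, means, n=100_000, seed=7):
    """Monte Carlo estimate of E[sum of k squared N(mu_i, 1) variables]."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        total += sum(rng.gauss(mu, 1.0) ** 2 for mu in means)
    return total / n

k = 3
means = [1.0, 0.5, 0.0]          # nonzero means play the role of the shift
lam = sum(m * m for m in means)  # noncentrality parameter lambda = 1.25
print(round(simulate_noncentral_chi2_mean(k, means), 2))  # close to k + lam = 4.25
```

When the means are all zero (λ = 0) the simulation reduces to the central chi-square case with mean k, which is why the central distribution underfits once band correlation shifts the projection.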
SPECT electronic collimation resolution enhancement using chi-square minimization
International Nuclear Information System (INIS)
Durkee, J.W. Jr.; Antich, P.P.; Tsyganov, E.N.; Constantinescu, A.; Fernando, J.L.; Kulkarni, P.V.; Smith, B.J.; Arbique, G.M.; Lewis, M.A.; Nguyen, T.; Raheja, A.; Thambi, G.; Parkey, R.W.
1998-01-01
An electronic collimation technique is developed which utilizes the chi-square goodness-of-fit measure to filter scattered gammas incident upon a medical imaging detector. In this data mining technique, Compton kinematic expressions are used as the chi-square fitting templates for measured energy-deposition data involving multiple-interaction scatter sequences. Fit optimization is conducted using the Davidon variable metric minimization algorithm to simultaneously determine the best-fit gamma scatter angles and their associated uncertainties, with the uncertainty associated with the first scatter angle corresponding to the angular resolution precision for the source. The methodology requires no knowledge of materials and geometry. This pattern recognition application enhances the ability to select those gammas that will provide the best resolution for input to reconstruction software. Illustrative computational results are presented for a conceptual truncated-ellipsoid polystyrene position-sensitive fibre head-detector Monte Carlo model using a triple Compton scatter gamma sequence assessment for a 99mTc point source. A filtration rate of 94.3% is obtained, resulting in an estimated sensitivity approximately three orders of magnitude greater than a high-resolution mechanically collimated device. The technique improves the nominal single-scatter angular resolution by up to approximately 24 per cent as compared with the conventional analytic electronic collimation measure. (author)
Chi-square-based scoring function for categorization of MEDLINE citations.
Kastrin, A; Peterlin, B; Hristovski, D
2010-01-01
Text categorization has been used in biomedical informatics for identifying documents containing relevant topics of interest. We developed a simple method that uses a chi-square-based scoring function to determine the likelihood that a MEDLINE citation contains a genetics-relevant topic. Our procedure requires the construction of a genetic-domain and a nongenetic-domain document corpus. We used the MeSH descriptors assigned to MEDLINE citations for this categorization task, comparing the frequencies of MeSH descriptors between the two corpora by applying the chi-square test. A MeSH descriptor was considered a positive indicator if its relative observed frequency in the genetic-domain corpus was greater than its relative observed frequency in the nongenetic-domain corpus. The output of the proposed method is a list of scores for all the citations, with the highest scores given to those citations containing MeSH descriptors typical of the genetic domain. Validation was done on a set of 734 manually annotated MEDLINE citations; the method achieved a predictive accuracy of 0.87, with 0.69 recall and 0.64 precision. We evaluated the method by comparing it to three machine-learning algorithms (support vector machines, decision trees, naïve Bayes). Although the differences were not statistically significant, the results showed that our chi-square scoring performs as well as the compared machine-learning algorithms. We suggest that chi-square scoring is an effective solution for helping to categorize MEDLINE citations. The algorithm is implemented in the BITOLA literature-based discovery support system as a preprocessor for the gene symbol disambiguation process.
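The descriptor-scoring step described above can be sketched as a 2×2 chi-square per descriptor, signed by the direction of over-representation. This is a toy reconstruction under stated assumptions, not the BITOLA implementation; all counts and descriptor roles are invented.

```python
# Hedged miniature of chi-square-based descriptor scoring: compare a
# descriptor's frequency in two corpora and keep it as a positive indicator
# only when it is over-represented in the genetic-domain corpus.
# All counts below are invented for illustration.

def chi2_2x2(a, b, c, d):
    """Shortcut chi-square for the 2x2 table [[a, b], [c, d]]:
    rows = genetic / nongenetic corpus, cols = with / without descriptor."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

def score_descriptor(gen_with, gen_total, non_with, non_total):
    a, b = gen_with, gen_total - gen_with
    c, d = non_with, non_total - non_with
    stat = chi2_2x2(a, b, c, d)
    over_represented = gen_with / gen_total > non_with / non_total
    return stat if over_represented else -stat  # sign encodes the direction

print(score_descriptor(80, 200, 20, 200) > 0)  # True: positive indicator
print(score_descriptor(10, 200, 90, 200) > 0)  # False: negative indicator
```

A citation's score would then be an aggregate (for example, a sum) of the signed scores of its assigned descriptors, so citations rich in genetic-domain descriptors rank highest.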
Measures of effect size for chi-squared and likelihood-ratio goodness-of-fit tests.
Johnston, Janis E; Berry, Kenneth J; Mielke, Paul W
2006-10-01
A fundamental shift in editorial policy for psychological journals was initiated when the fourth edition of the Publication Manual of the American Psychological Association (1994) placed emphasis on reporting measures of effect size. This paper presents measures of effect size for the chi-squared and likelihood-ratio goodness-of-fit test statistics.
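One widely reported effect-size measure for a chi-squared goodness-of-fit test is Cohen's w = √(χ²/N); the sketch below computes it for invented counts under a uniform null. This illustrates the general idea of the paper's topic, not its specific proposed measures.

```python
# Hedged sketch: Cohen's w effect size for a goodness-of-fit test,
# w = sqrt(chi2 / N). The observed counts are invented.

import math

def cohens_w(observed, expected_props):
    n = sum(observed)
    chi2 = sum((o - n * p) ** 2 / (n * p)
               for o, p in zip(observed, expected_props))
    return math.sqrt(chi2 / n)

observed = [30, 25, 45]               # category counts, N = 100
expected = [1 / 3, 1 / 3, 1 / 3]      # uniform null hypothesis
print(round(cohens_w(observed, expected), 3))  # ≈ 0.255
```

Unlike the raw chi-squared statistic, w does not grow with sample size, which is what makes it usable as a reportable effect size alongside the p-value.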
Long, Michael A; Berry, Kenneth J; Mielke, Paul W
2010-10-01
Monte Carlo resampling methods to obtain probability values for chi-squared and likelihood-ratio test statistics for multiway contingency tables are presented. A resampling algorithm provides random arrangements of cell frequencies in a multiway contingency table, given fixed marginal frequency totals. Probability values are obtained from the proportion of resampled test statistic values equal to or greater than the observed test statistic value.
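The resampling scheme described above can be sketched for a two-way table: shuffling one variable's labels across observations preserves both sets of marginal totals, and the p-value is the proportion of resampled statistics at least as large as the observed one. This is an illustrative reconstruction, not the authors' algorithm; the table counts are invented.

```python
# Hedged sketch: Monte Carlo p-value for a two-way contingency table with
# fixed margins, obtained by shuffling one categorical label. Counts invented.

import random

def chi2_of(table):
    rt = [sum(r) for r in table]
    ct = [sum(c) for c in zip(*table)]
    g = sum(rt)
    return sum((table[i][j] - rt[i] * ct[j] / g) ** 2 / (rt[i] * ct[j] / g)
               for i in range(len(rt)) for j in range(len(ct)))

def resampled_p_value(table, n_resamples=2000, seed=1):
    rng = random.Random(seed)
    obs_stat = chi2_of(table)
    rows, cols = [], []
    for i, row in enumerate(table):           # expand cells into observations
        for j, count in enumerate(row):
            rows.extend([i] * count)
            cols.extend([j] * count)
    hits = 0
    for _ in range(n_resamples):
        rng.shuffle(cols)                     # both margins stay fixed
        resampled = [[0] * len(table[0]) for _ in table]
        for i, j in zip(rows, cols):
            resampled[i][j] += 1
        if chi2_of(resampled) >= obs_stat:
            hits += 1
    return hits / n_resamples

table = [[12, 3], [4, 11]]                    # strong hypothetical association
print(resampled_p_value(table) < 0.05)        # True: association detected
```

For small cell counts this avoids relying on the asymptotic chi-squared distribution, which is exactly the situation the resampling approach targets.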
One implementation of the Chi Square Test with SPSS
Tinoco Gómez, Oscar
2014-01-01
This article illustrates the use of the SPSS statistical software applied to the chi-square test to prove independence between two variables. The application evaluates the impact generated by the educational web page of the Faculty of Administrative Sciences of the National University Federico Villarreal with respect to the use of some information and communication technology tools in the professional training process.
Directory of Open Access Journals (Sweden)
Maysa Sacramento de Magalhães
2011-06-01
Full Text Available Production processes have been monitored by control charts since their inception by Shewhart (1924). This surveillance is useful in improving the production process due to increased stabilization of the process, and consequently standardization of the output. Control charts keep track of a few key quality characteristics of the outcome of the production process, by means of univariate or multivariate charts. Small improvements in control chart methodology can have a significant economic impact on the production process. In this investigation, we propose the monitoring of a single variable by means of a variable-parameter noncentral chi-square control chart. The design of the chart is accomplished by optimizing a cost function. We use a simulated annealing optimization tool, due to the difficulty classical gradient-based optimization techniques have in handling the optimization of the cost function. The results show some of the drawbacks of using this model.
Calibration of Self-Efficacy for Conducting a Chi-Squared Test of Independence
Zimmerman, Whitney Alicia; Goins, Deborah D.
2015-01-01
Self-efficacy and knowledge, both concerning the chi-squared test of independence, were examined in education graduate students. Participants rated statements concerning self-efficacy and completed a related knowledge assessment. After completing a demographic survey, participants completed the self-efficacy and knowledge scales a second time…
False star detection and isolation during star tracking based on improved chi-square tests.
Zhang, Hao; Niu, Yanxiong; Lu, Jiazhen; Yang, Yanqiang; Su, Guohua
2017-08-01
The star sensor is a precise attitude measurement device for a spacecraft. Star tracking is the main and key working mode for a star sensor. However, during star tracking, false stars become an inevitable interference for star sensor applications, which may result in degraded measurement accuracy. A false star detection and isolation algorithm for star tracking, based on improved chi-square tests, is proposed in this paper. Two estimations are established based on a Kalman filter and a priori information, respectively. False star detection is performed by applying the global state chi-square test in the Kalman filter, and false star isolation is achieved using a local state chi-square test. Semi-physical experiments under different trajectories with various false stars were designed for verification. Experimental results show that various false stars can be detected and isolated from navigation stars during star tracking, and that the attitude measurement accuracy is hardly influenced by false stars. The proposed algorithm is shown to have excellent performance in terms of speed, stability, and robustness.
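The core gating idea behind such chi-square tests can be shown in one dimension: normalize the squared innovation (measurement minus prediction) by its variance and reject the measurement when it exceeds a chi-square critical value. This is a deliberately simplified scalar sketch, not the paper's filter; the predicted position, variance, and threshold are invented.

```python
# Hedged 1-D sketch of an innovation chi-square test for fault detection:
# a measurement is rejected when its normalized squared innovation exceeds
# a chi-square threshold. All numeric values are illustrative.

def innovation_chi2(measurement, prediction, innovation_var):
    nu = measurement - prediction          # innovation (residual)
    return nu * nu / innovation_var        # ~ chi-square(1) if consistent

THRESHOLD = 6.63        # chi-square critical value, 1 dof, alpha = 0.01
pred, var = 10.0, 0.5   # assumed predicted star position and innovation variance

print(innovation_chi2(10.3, pred, var) <= THRESHOLD)  # True: kept as navigation star
print(innovation_chi2(14.0, pred, var) <= THRESHOLD)  # False: isolated as false star
```

In the full algorithm the same idea is applied to the whole state vector (global test) and to individual measurement components (local test), which is what allows detection and isolation to be separated.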
Practical Statistics for Particle Physics Analyses: Chi-Squared and Goodness of Fit (2/4)
CERN. Geneva; Moneta, Lorenzo
2016-01-01
This will be a 4-day series of 2-hour sessions as part of CERN's Academic Training Course. Each session will consist of a 1-hour lecture followed by one hour of practical computing, which will have exercises based on that day's lecture. While it is possible to follow just the lectures or just the computing exercises, we highly recommend that, because of the way this course is designed, participants come to both parts. In order to follow the hands-on exercise sessions, students need to bring their own laptops. The exercises will be run on a dedicated CERN Web notebook service, SWAN (swan.cern.ch), which is open to everybody holding a CERN computing account. The requirement to use the SWAN service is to have a CERN account and also access to CERNBox, the shared storage service at CERN. New users are invited to activate CERNBox beforehand by simply connecting to https://cernbox.cern.ch. A basic prior knowledge of ROOT and C++ is also recommended for participation in the practical session....
Pg Haji Mohd Ariffin, Ak Muhamad Amirul Irfan
2015-01-01
This paper presents the project I was tasked with while attending a three-month Summer Programme at CERN. The project specification is to analyse the results of weekly data produced by the Compact Muon Solenoid (CMS) in the form of histograms. CMS is a multi-purpose detector designed to operate at the Large Hadron Collider (LHC) at CERN. It will yield head-on collisions of two proton (ion) beams of 7 TeV (2.75 TeV per nucleon) each, with a design luminosity of 10^34 cm^-2 s^-1. A comparison of the results is then made using two methods, namely the Kolmogorov-Smirnov statistic test and the Chi-squared test. These tests are further elaborated in the subsequent paragraphs. To execute this project, I first had to study basic computer programming, in particular C++ and the ROOT basics. This was important to ensure the given tasks could be resolved within the given time. A program was subsequently written to provide histogram output and calculation of the Kolmogorov-Smirnov test and Ch...
Jet pairing algorithm for the 6-jet Higgs channel via energy chi-square criterion
International Nuclear Information System (INIS)
Magallanes, J.B.; Arogancia, D.C.; Gooc, H.C.; Vicente, I.C.M.; Bacala, A.M.; Miyamoto, A.; Fujii, K.
2002-01-01
Study and discovery of the Higgs bosons at the JLC (Joint Linear Collider) is one of the tasks of the ACFA (Asian Committee for Future Accelerators)-JLC Group. The mode of Higgs production at the JLC is e⁺e⁻ → Z⁰H⁰. In this paper, studies are concentrated on the Higgsstrahlung process and the selection of its signals by finding the right jet-pairing algorithm for the 6-jet final state at 300 GeV, assuming that the Higgs boson mass is 120 GeV and the luminosity is 500 fb⁻¹. The total decay width Γ(H⁰ → all) and the efficiency of the signals at the JLC are studied utilizing the 6-jet channel. Out of the 91,500 Higgsstrahlung events, 4,174 6-jet events are selected. The PYTHIA Monte Carlo generator generates the 6-jet Higgsstrahlung channel according to the Standard Model. The generated events are then simulated by Quick Simulator using the JLC parameters. After tagging all 6 quarks corresponding to the 6-jet final state of the Higgsstrahlung, the mean energies of the Z, H, and W's are obtained. From this information, the event energy chi-square is defined, and it is found that the correct combinations generally have smaller values. This criterion can be used to find the correct jet-pairing algorithm and as one of the cuts against background signals later on. Other chi-square definitions are also proposed. (S. Funahashi)
Association between litterers' profile and littering behavior: A chi-square approach
Asmui, Mas'udah; Zaki, Suhanom Mohd; Wahid, Sharifah Norhuda Syed; Mokhtar, Noorsuraya Mohd; Harith, Siti Suhaila
2017-05-01
Littering is not a new problem, yet it remains a persistent one. Solutions have been discussed for a long time; however, the issue still remains unresolved. Littering is commonly associated with littering behavior and awareness, and littering behavior is normally influenced by the litterer's profile, such as gender, family income, education level, and age. Jengka Street market, located in Pahang, is popularly known as a trade market. It offers a diversity of wet and dry goods and attracts local residents and tourists. This study analyzes the association between litterers' profiles and littering behavior. Littering behavior is measured based on factors of trash bin facilities, awareness campaigns, and public littering behavior. 114 respondents were involved in this study, of whom 62 (54.39%) are female aged more than 18 years old, and the majority of these female respondents are diploma holders. In addition, 78.95% of the respondents have a family income below RM3,000.00 per month. Based on the data analysis, it was found that first-time visitors littered more than frequent visitors, that a lack of trash bin facilities contributes to littering behavior, and that there is a significant association between litterers' age and littering behavior, using the chi-square approach.
Directory of Open Access Journals (Sweden)
Yang Zu
2015-07-01
Full Text Available This paper studies the asymptotic normality for the kernel deconvolution estimator when the noise distribution is logarithmic chi-square; both identical and independently distributed observations and strong mixing observations are considered. The dependent case of the result is applied to obtain the pointwise asymptotic distribution of the deconvolution volatility density estimator in discrete-time stochastic volatility models.
Directory of Open Access Journals (Sweden)
Yelda ŞENER
2014-12-01
Full Text Available The purpose of this study is to explain the effect of word-of-mouth marketing on hospital preference, using data provided by 223 inpatients in a teaching and research hospital. For this purpose, the word-of-mouth marketing process is evaluated in terms of the level of intimacy between the patient and the person providing information about the hospital, the level of expertise of both the patient and the information provider with respect to the hospital and its services, the patient's perceived level of risk regarding hospitals and services, and the impact of the provided information on the patient being treated in the hospital. After evaluation by frequency distributions, the impact of these factors on word-of-mouth marketing is demonstrated by descriptive statistics, chi-square analysis, and Pearson's correlation analysis. As a result of this study, it is concluded that word-of-mouth marketing has a significant impact on patients' preference for the teaching and research hospital.
International Nuclear Information System (INIS)
Althuwaynee, Omar F; Pradhan, Biswajeet; Ahmad, Noordin
2014-01-01
This article uses a methodology based on chi-squared automatic interaction detection (CHAID), a multivariate method with an automatic classification capacity for analysing large numbers of landslide conditioning factors. This algorithm was developed to overcome the subjectivity of manually categorizing the scale data of landslide conditioning factors, and to predict a rainfall-induced landslide susceptibility map for Kuala Lumpur city and surrounding areas using a geographic information system (GIS). The main objective of this article is to use the CHAID method to perform the best classification fit for each conditioning factor and then combine it with logistic regression (LR). The LR model was used to find the corresponding coefficients of the best-fitting function that assesses the optimal terminal nodes. A cluster pattern of landslide locations was extracted in a previous study using the nearest neighbor index (NNI), which was then used to identify the range of clustered landslide locations. Clustered locations were used as model training data with 14 landslide conditioning factors, such as topographically derived parameters, lithology, NDVI, and land use and land cover maps. The Pearson chi-squared value was used to find the best classification fit between the dependent variable and the conditioning factors. Finally, the relationships between conditioning factors were assessed and the landslide susceptibility map (LSM) was produced. The area under the curve (AUC) was used to test the model's reliability and prediction capability with the training and validation landslide locations, respectively. This study proved the efficiency and reliability of the decision tree (DT) model in landslide susceptibility mapping. It also provided a valuable scientific basis for spatial decision making in planning and urban management studies.
Lin, Jyh-Jiuan; Chang, Ching-Hui; Pal, Nabendu
2015-01-01
To test the mutual independence of two qualitative variables (or attributes), it is a common practice to follow the Chi-square tests (Pearson's as well as likelihood ratio test) based on data in the form of a contingency table. However, it should be noted that these popular Chi-square tests are asymptotic in nature and are useful when the cell frequencies are "not too small." In this article, we explore the accuracy of the Chi-square tests through an extensive simulation study and then propose their bootstrap versions that appear to work better than the asymptotic Chi-square tests. The bootstrap tests are useful even for small-cell frequencies as they maintain the nominal level quite accurately. Also, the proposed bootstrap tests are more convenient than the Fisher's exact test which is often criticized for being too conservative. Finally, all test methods are applied to a few real-life datasets for demonstration purposes.
Chi-Square Discriminators for Transiting Planet Detection in Kepler Data
Seader, Shawn; Tenenbaum, Peter; Jenkins, Jon M.; Burke, Christopher J.
2013-01-01
The Kepler spacecraft observes a host of target stars to detect transiting planets. Requiring a 7.1 sigma detection in twelve quarters of data yields over 100,000 detections, many of which are false alarms. After a second cut is made on a robust detection statistic, some 50,000 or more targets still remain. These false alarms waste resources as they propagate through the remainder of the software pipeline and so a method to discriminate against them is crucial in maintaining the desired sensi...
Brace, Jordan Campbell; Savalei, Victoria
2017-09-01
A Monte Carlo simulation study was conducted to investigate Type I error rates and power of several corrections for nonnormality to the normal-theory chi-square difference test in the context of evaluating measurement invariance via structural equation modeling. Studied statistics include the uncorrected difference test, D_ML; Satorra and Bentler's (2001) original correction, D_SB1; Satorra and Bentler's (2010) strictly positive correction, D_SB10; and a hybrid procedure, D_SBH (Asparouhov & Muthén, 2013). Multiple-group data were generated from confirmatory factor analytic population models invariant on all parameters, or lacking invariance on residual variances, indicator intercepts, or factor loadings. Conditions varied in terms of the number of indicators associated with each factor in the population model, the location of noninvariance (if any), sample size, sample size ratio in the 2 groups, and the nature of nonnormality. Type I error rates and power of corrected statistics were evaluated for a series of 4 nested invariance models. Overall, the strictly positive correction, D_SB10, is the best and most consistently performing statistic, as it was found to be much less sensitive than the original correction, D_SB1, to model size and sample evenness.
Al-sharif, Abubakr A. A.; Pradhan, Biswajeet; Zulhaidi Mohd Shafri, Helmi; Mansor, Shattri
2014-06-01
Urban expansion is a spatial phenomenon that reflects the increased importance of metropolises. Remotely sensed data and GIS have been widely used to study and analyze the process of urban expansion and its patterns. The capital of Libya (Tripoli) was selected for this study in order to examine its urban growth patterns. Four satellite images of the study area from different dates (1984, 1996, 2002 and 2010) were used to conduct this research. The main goal of this work is to identify and analyze the urban sprawl of the Tripoli metropolitan area. The urban expansion intensity index (UEII) and a degree-of-freedom test were used to analyze and assess urban expansion in the study area. The results show that Tripoli has sprawled urban expansion patterns, a high urban expansion intensity index, and urban development with a high degree of freedom according to its urban expansion history during the period 1984-2010. Moreover, the novel hypothesis proposed for zone division resulted in very good insight into the direction of urban expansion and the effect of distance from the central business district (CBD).
Ye, Fang; Chen, Zhi-Hua; Chen, Jie; Liu, Fang; Zhang, Yong; Fan, Qin-Ying; Wang, Lin
2016-05-20
In the past decades, studies on infant anemia have mainly focused on rural areas of China. With the increasing heterogeneity of the population in recent years, available information on infant anemia is inconclusive in large cities of China, especially with regard to comparisons between native residents and the floating population. This population-based cross-sectional study was implemented to determine the anemic status of infants as well as the risk factors in a representative downtown area of Beijing. As useful methods for building a predictive model, chi-squared automatic interaction detection (CHAID) decision tree analysis and logistic regression analysis were introduced to explore risk factors of infant anemia. A total of 1091 infants aged 6-12 months together with their parents/caregivers living at Heping Avenue Subdistrict of Beijing were surveyed from January 1, 2013 to December 31, 2014. The prevalence of anemia was 12.60%, with a range of 3.47%-40.00% across subgroup characteristics. The CHAID decision tree model demonstrated multilevel interaction among risk factors through stepwise pathways to detect anemia. Besides the three predictors identified by the logistic regression model (maternal anemia during pregnancy, exclusive breastfeeding in the first 6 months, and floating population), the CHAID decision tree analysis also identified a fourth risk factor, maternal educational level, with higher overall classification accuracy and a larger area under the receiver operating characteristic curve. The anemic status of infants in a metropolis is complex and should be carefully considered by basic health care practitioners. CHAID decision tree analysis demonstrated better performance in the hierarchical analysis of a population with great heterogeneity. The risk factors identified by this study might be meaningful for the early detection and prompt treatment of infant anemia in large cities.
Directory of Open Access Journals (Sweden)
Francisco A Bosco
2012-12-01
Since the foundations of population genetics, the notion of genetic equilibrium (in close analogy to classical mechanics) has been associated with the Hardy-Weinberg (HW) principle, and equilibrium is currently identified by stating that the HW axioms hold if appropriate chi-square values (p < 0.05) are observed in experiments. Here we show, by numerical experiments with the genetic system of one locus/two alleles, that when large ensembles of populations are considered the chi-square test is not decisive and may lead to false negatives in random-mating populations and false positives in nonrandom-mating populations. This result confirms the logical statement that statistical tests cannot be used to deduce whether a genetic population is under the HW conditions. Furthermore, we show that under the HW conditions populations of any size evolve in time according to what can be identified as neutral dynamics, for which the very notion of equilibrium is unattainable for any practical purpose. Therefore, under the HW conditions the identification of equilibrium properties needs a different approach and the use of more appropriate concepts. We also show that by relaxing the condition of random mating, the dynamics acquires all the characteristics of asymptotically stable equilibrium. As a consequence, our results show that the question of equilibrium in genetic systems should be approached in close analogy to non-equilibrium statistical physics, and its observability should be focused on dynamical quantities such as the typical decay properties of the allelic autocorrelation function in time. In this perspective, one should abandon the classical notion of genetic equilibrium and its relation to the HW proportions, and open investigations in the direction of searching for unifying general principles of population genetic transformations capable of taking these systems into consideration in their full complexity.
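The ensemble experiments described above can be mimicked in miniature: simulate many populations that truly satisfy random mating and count how often the standard HW chi-square test rejects at p < 0.05. This is a sketch under assumed parameter values (population size 500, allele frequency 0.3), not the authors' code.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(0)

def hw_chi2_p(n, p_allele):
    """Sample one random-mating population and return the HW chi-square p-value."""
    freqs = [p_allele**2, 2*p_allele*(1 - p_allele), (1 - p_allele)**2]
    obs = rng.multinomial(n, freqs)                 # (AA, Aa, aa) counts
    p_hat = (2*obs[0] + obs[1]) / (2*n)             # allele frequency from sample
    exp = n * np.array([p_hat**2, 2*p_hat*(1 - p_hat), (1 - p_hat)**2])
    stat = ((obs - exp)**2 / exp).sum()
    return chi2.sf(stat, df=1)   # df = 3 classes - 1 - 1 estimated parameter

# Across many populations genuinely under HW, roughly 5% are still
# flagged at p < 0.05 -- purely by the test's construction.
pvals = np.array([hw_chi2_p(500, 0.3) for _ in range(2000)])
rate = (pvals < 0.05).mean()
print(f"false-positive rate: {rate:.3f}")
```

The nonzero rejection rate on populations that are, by construction, under HW conditions illustrates why a significant chi-square alone cannot establish or refute HW equilibrium.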
Energy Technology Data Exchange (ETDEWEB)
Manungu Kiveni, Joseph [Syracuse Univ., NY (United States)
2012-12-01
This dissertation describes the results of a WIMP search using CDMS II data sets accumulated at the Soudan Underground Laboratory in Minnesota. Results from the original analysis of these data were published in 2009; two events were observed in the signal region with an expected leakage of 0.9 events. Further investigation revealed an issue with the ionization-pulse reconstruction algorithm, leading to a software upgrade and a subsequent reanalysis of the data. As part of the reanalysis, I applied an advanced discrimination technique to better distinguish (potential) signal events from backgrounds using a 5-dimensional chi-square method. This data-analysis technique combines the event information recorded for each WIMP-search event to derive a background-discrimination parameter capable of reducing the expected background to less than one event, while maintaining high efficiency for signal events. Furthermore, optimizing the cut positions of this 5-dimensional chi-square parameter for the 14 viable germanium detectors yields an improved expected sensitivity to WIMP interactions relative to previous CDMS results. This dissertation describes my improved (and optimized) discrimination technique and the results obtained from a blind application to the reanalyzed CDMS II WIMP-search data.
Directory of Open Access Journals (Sweden)
Manoel Vitor de Souza Veloso
2016-04-01
The current study employs Monte Carlo simulation to build a significance test indicating the principal components that best discriminate outliers. Different sample sizes were generated by multivariate normal distribution with different numbers of variables and correlation structures. Pearson's and Yates's chi-square distance corrections were provided for each sample size. Pearson's correction showed the best performance. As the number of variables increased, significance probabilities in favor of hypothesis H0 were reduced. To illustrate the proposed method, it was applied to a multivariate time series of sales volume rates in the state of Minas Gerais, obtained in different market segments.
Hawkins, Donovan Lee
2005-01-01
In this thesis I present a software framework for use on the ATLAS muon CSC readout driver. This C++ framework uses plug-in Decoders incorporating hand-optimized assembly language routines to perform sparsification and data formatting. The software is designed with both flexibility and performance in mind, and runs on a custom 9U VME board using Texas Instruments TMS320C6203 digital signal processors. I describe the requirements of the software, the methods used in its design, and the results of testing the software with simulated data. I also present modifications to a chi-squared analysis of the Standard Model and Four Down Quark Model (FDQM) originally done by Dr. Dennis Silverman. The addition of four new experiments to the analysis has little effect on the Standard Model but provides important new restrictions on the FDQM. The method used to incorporate these new experiments is presented, and the consequences of their addition are reviewed.
Adeniyi, D A; Wei, Z; Yang, Y
2018-01-30
A wealth of data is available within the health care system; however, effective analysis tools for exploring the hidden patterns in these datasets are lacking. To alleviate this limitation, this paper proposes a simple but promising hybrid predictive model that suitably combines chi-square distance measurement with the case-based reasoning technique. The study presents the realization of an automated risk calculator and death prediction in some life-threatening ailments using a chi-square case-based reasoning (χ²CBR) model. The proposed predictive engine is capable of reducing runtime and speeding up the execution process through the use of a critical χ² distribution value. This work also showcases the development of a novel feature selection method referred to as the frequent-item-based rule (FIBR) method. This FIBR method is used for selecting the best features for the proposed χ²CBR model at the preprocessing stage of the predictive procedure. The proposed risk calculator is implemented as an in-house PHP program running on an XAMPP/Apache HTTP server; data acquisition and case-base development are implemented using MySQL. Performance comparison between our system and the NBY, ED-KNN, ANN, SVM, Random Forest and traditional CBR techniques shows that the quality of predictions produced by our system outperforms the baseline methods studied. The results of our experiment show that the precision rate and predictive quality of our system are in most cases equal to or greater than 70%, and that the proposed system executes faster than the baseline methods studied. Therefore, the proposed risk calculator is capable of providing useful, consistent, fast, accurate and efficient risk level prediction to both patients and physicians at any time, online and on a real-time basis.
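One plausible form of the chi-square distance used for case retrieval in such a χ²CBR engine is sketched below. The abstract does not give the exact definition or feature encoding, so both the distance form and the toy case base are assumptions.

```python
import numpy as np

def chi2_distance(x, y, eps=1e-12):
    """Chi-square distance between two nonnegative feature vectors.
    This is one common form for histogram-like features; the paper's
    exact definition may differ."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return 0.5 * np.sum((x - y)**2 / (x + y + eps))

# Retrieve the nearest stored case for a query (toy, invented case base)
cases = np.array([[3, 1, 0, 2],
                  [1, 4, 2, 0],
                  [0, 0, 5, 1]])
query = np.array([2, 1, 1, 2])

dists = [chi2_distance(query, c) for c in cases]
best = int(np.argmin(dists))
print("nearest case index:", best)  # → 0
```

In a CBR loop, the retrieved nearest case's outcome (risk level) would be reused, optionally after adaptation, as the prediction for the query.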
Energy Technology Data Exchange (ETDEWEB)
Conover, W.J. [Texas Tech Univ., Lubbock, TX (United States); Cox, D.D. [Rice Univ., Houston, TX (United States); Martz, H.F. [Los Alamos National Lab., NM (United States)
1997-12-01
When using parametric empirical Bayes estimation methods for estimating the binomial or Poisson parameter, the validity of the assumed beta or gamma conjugate prior distribution is an important diagnostic consideration. Chi-square goodness-of-fit tests of the beta or gamma prior hypothesis are developed for use when the binomial sample sizes or Poisson exposure times vary. Nine examples illustrate the application of the methods, using real data from such diverse applications as the loss of feedwater flow rates in nuclear power plants, the probability of failure to run on demand and the failure rates of the high pressure coolant injection systems at US commercial boiling water reactors, the probability of failure to run on demand of emergency diesel generators in US commercial nuclear power plants, the rate of failure of aircraft air conditioners, baseball batting averages, the probability of testing positive for toxoplasmosis, and the probability of tumors in rats. The tests are easily applied in practice by means of corresponding Mathematica® computer programs which are provided.
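A generic chi-square goodness-of-fit check of a gamma hypothesis can be sketched as follows. The data here are simulated stand-ins for failure-rate-like observations, and the sketch omits the paper's key extension to varying binomial sample sizes or Poisson exposure times.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical failure-rate-like data (simulated, not from the paper)
data = rng.gamma(shape=2.0, scale=1.5, size=300)

# Fit a gamma distribution (location fixed at 0, as for a rate parameter)
shape, loc, scale = stats.gamma.fit(data, floc=0)

# Equal-probability cells under the fitted gamma
k = 10
edges = stats.gamma.ppf(np.linspace(0, 1, k + 1), shape, scale=scale)
obs, _ = np.histogram(data, bins=edges)
exp = np.full(k, len(data) / k)

stat = ((obs - exp)**2 / exp).sum()
# df = cells - 1 - number of estimated parameters (shape and scale)
p = stats.chi2.sf(stat, df=k - 1 - 2)
print(f"chi2 = {stat:.2f}, p = {p:.3f}")
```

Equal-probability cells keep every expected count at n/k, which avoids the sparse-cell problems of fixed-width binning.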
International Nuclear Information System (INIS)
Conover, W.J.; Cox, D.D.; Martz, H.F.
1997-12-01
When using parametric empirical Bayes estimation methods for estimating the binomial or Poisson parameter, the validity of the assumed beta or gamma conjugate prior distribution is an important diagnostic consideration. Chi-square goodness-of-fit tests of the beta or gamma prior hypothesis are developed for use when the binomial sample sizes or Poisson exposure times vary. Nine examples illustrate the application of the methods, using real data from such diverse applications as the loss of feedwater flow rates in nuclear power plants, the probability of failure to run on demand and the failure rates of the high pressure coolant injection systems at US commercial boiling water reactors, the probability of failure to run on demand of emergency diesel generators in US commercial nuclear power plants, the rate of failure of aircraft air conditioners, baseball batting averages, the probability of testing positive for toxoplasmosis, and the probability of tumors in rats. The tests are easily applied in practice by means of corresponding Mathematica® computer programs which are provided.
International Nuclear Information System (INIS)
Garo Balian, H.; Eddy, N.W.
1977-01-01
A careful experimenter knows that in order to choose the best curve fits of peaks from a gamma-ray spectrum for such purposes as energy or intensity calibration, half-life determination, etc., the application of the normalized chi-squared test, χ²_N = χ²/(n − m), is insufficient. One must normally verify the goodness-of-fit with plots, detailed scans of residuals, etc. Because of different techniques of application, variations in backgrounds, in peak sizes and shapes, etc., quotation of the χ²_N value associated with an individual peak fit conveys very little information unless accompanied by considerable ancillary data. (This is not to say that the traditional χ² formula should not be used as the source of the normal equations in the least-squares fitting procedure. But after the fitting, it is unreliable as a criterion for comparison with other fits.) The authors present a formula designated figure-of-merit (FOM) which greatly improves on the uncertainty and fluctuations of the χ²_N formula. An FOM value of less than 2.5% indicates a good fit (in the authors' judgement) irrespective of background conditions and variations in peak sizes and shapes. Furthermore, the authors feel the FOM formula is less subject to fluctuations resulting from different techniques of application. (Auth.)
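The normalized chi-squared the authors criticize is simply the weighted sum of squared residuals divided by the degrees of freedom n − m. A toy computation (all values invented; the FOM formula itself is not given in the abstract and is not reproduced here):

```python
import numpy as np

def reduced_chi2(y_obs, y_fit, sigma, n_params):
    """Normalized chi-square, chi²/(n - m), for a fitted peak --
    the quantity the paper argues is insufficient on its own."""
    resid = (np.asarray(y_obs) - np.asarray(y_fit)) / np.asarray(sigma)
    return np.sum(resid**2) / (len(y_obs) - n_params)

# Invented channel counts around a peak, and a hypothetical fitted curve
y_obs = np.array([10., 14., 22., 15., 9.])
y_fit = np.array([9.5, 14.8, 21.2, 15.6, 9.1])
sigma = np.sqrt(y_obs)          # Poisson counting errors

print(round(reduced_chi2(y_obs, y_fit, sigma, n_params=3), 3))
```

A value near 1 is conventionally read as a good fit, but, as the abstract stresses, the same χ²_N can arise from very different residual patterns, which is why residual scans remain necessary.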
International Nuclear Information System (INIS)
Al-sharif, Abubakr A A; Pradhan, Biswajeet; Shafri, Helmi Zulhaidi Mohd; Mansor, Shattri
2014-01-01
Urban expansion is a spatial phenomenon that reflects the increased importance of metropolises. Remotely sensed data and GIS have been widely used to study and analyze the process of urban expansion and its patterns. The capital of Libya (Tripoli) was selected for this study to examine its urban growth patterns. Four satellite images of the study area from different dates (1984, 1996, 2002 and 2010) were used to conduct this research. The main goal of this work is to identify and analyze the urban sprawl of the Tripoli metropolitan area. The urban expansion intensity index (UEII) and the degree-of-freedom test were used to analyze and assess urban expansion in the study area. The results show that Tripoli has sprawled urban expansion patterns and a high urban expansion intensity index, and that its urban development had a high degree of freedom over its urban expansion history during the period 1984-2010. Moreover, the novel hypothesis proposed for zone division provided good insight into the direction of urban expansion and the effect of distance from the central business district (CBD).
International Nuclear Information System (INIS)
1988-01-01
Before 26 April 1986 few people in the west had heard of Chernobyl. Then Chernobyl experienced the world's worst nuclear power station accident. In the wake of the disaster radioactivity fell on Britain and much of Europe. There was confusion and rumour on the television, in the papers and amongst ordinary people. What would the effect of the Chernobyl accident be? Was it safe to go out of doors? Was it safe to eat fresh vegetables? What was a safe level of radiation? What was a becquerel, a milliSievert or any of the other scientific terms with which we were bombarded by scientists and other experts? This booklet sets out to help answer these questions by looking at a hypothetical disaster at the nuclear power station at Heysham, near Morecambe in Lancashire. Using this scenario it shows what the worst consequences of a nuclear accident might be for the citizens of Leeds. It also explains in a straightforward way the meaning of many technical terms which will help you to understand the advice and comments of experts and to make your own judgement of what they say. (author)
Energy Efficiency: Comparison between GREENSHIP and LEED
Baharuddin; Rahim, Ramli
2011-01-01
This paper compares energy efficiency in two green building rating tools, GREENSHIP and LEED. The study was carried out by comparing the energy performance standards and the energy calculation methods of both rating tools. GREENSHIP uses the OTTV (overall thermal transfer value) to measure the efficiency of energy use in the building design, while LEED uses the ASHRAE standard for the baseline building. The result shows that the energy standard used in the LEED rating tool is more stringent.
Directory of Open Access Journals (Sweden)
Johan Mardini
2017-07-01
Since information has become one of the most valuable assets of organizations, it must be safeguarded through different protection strategies in order to prevent intrusive access or any kind of incident that causes its deterioration or misuse. For precisely this reason, this article evaluates the efficiency of a network intrusion detection model using sensitivity, specificity, precision and accuracy metrics, through a simulation process based on the DARPA NSL-KDD dataset, with the most relevant features selected using CHI SQUARE. The classifier is a neural network that uses an unsupervised learning algorithm based on growing hierarchical self-organizing maps (GHSOM). With this, the network's two-class traffic was classified automatically. The results show that the GHSOM classifier used with the CHI SQUARE technique achieves its best precision, sensitivity, specificity and accuracy with 15 features.
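Chi-square feature ranking of the kind applied to NSL-KDD can be sketched with scikit-learn. The data below are synthetic stand-ins (random nonnegative features with three artificially informative columns), and the GHSOM classifier itself is not reproduced.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, chi2

rng = np.random.default_rng(0)
# Synthetic nonnegative feature matrix standing in for NSL-KDD features
X = rng.integers(0, 10, size=(200, 30)).astype(float)
y = rng.integers(0, 2, size=200)          # binary class: normal vs attack
X[y == 1, :3] += 6                        # make features 0-2 informative

# Keep the 15 features with the highest chi-square scores, as in the article
selector = SelectKBest(chi2, k=15).fit(X, y)
top = np.sort(selector.get_support(indices=True))
print("selected features:", top)
```

The reduced 15-feature matrix `selector.transform(X)` would then be fed to the downstream classifier (GHSOM in the article).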
LEED Credit Review System and Optimization Model for Pursuing LEED Certification
Directory of Open Access Journals (Sweden)
Jin Ouk Choi
2015-09-01
Incorporating sustainability in construction can result in desirable building attributes and project life cycle. The Leadership in Energy and Environmental Design (LEED®) Rating System helps project teams make the right green building decisions for their projects through a process. However, in current practice, project teams do not have a systematic procedure or tool for choosing the LEED credits appropriate for a particular project. The researchers have developed a tool that supports the LEED integrative process during a charrette, and an optimization model that can be utilized to assist project teams in determining which credits to pursue for LEED certification, taking into account the potential benefits associated with any LEED credit. The tool enables owners to incorporate sustainability in construction by helping project teams make the right green building decisions for their projects through an integrated procedure.
Griffiths, Dawn
2009-01-01
Wouldn't it be great if there were a statistics book that made histograms, probability distributions, and chi-square analysis more enjoyable than going to the dentist? Head First Statistics brings this typically dry subject to life, teaching you everything you want and need to know about statistics through engaging, interactive, and thought-provoking material, full of puzzles, stories, quizzes, visual aids, and real-world examples. Whether you're a student, a professional, or just curious about statistical analysis, Head First's brain-friendly formula helps you get a firm grasp of statistics.
Practical statistics for particle physicists
CERN. Geneva
2006-01-01
Learning to love the error matrix: introductory remarks; conditional probability; statistical and systematic errors; combining results; binomial, Poisson and 1-D Gaussian; 2-D Gaussian and the error matrix; understanding the covariance; using the error matrix; estimating the error matrix; combining correlated measurements. Parameter determination by likelihood (do's and don'ts): introduction to likelihood; error estimate; simple examples: (1) Breit-Wigner, (2) lifetime; binned and unbinned likelihood; several parameters; extended maximum likelihood; common misapprehensions: normalisation, the delta(lnL) = 1/2 rule and coverage, integrating the likelihood, unbinned L_max as goodness of fit, the Punzi effect. Chi-squared and hypothesis testing: basic idea; error estimates; several parameters; correlated errors on y; errors on x and y; goodness of fit; degrees of freedom; why assympt...
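The canonical worked example of the chi-squared lecture, a weighted straight-line fit, can be written out directly from the two normal equations. The data points and errors below are invented toy values assuming Gaussian uncertainties.

```python
import numpy as np

# Toy measurements with equal Gaussian errors (invented values)
x = np.array([1., 2., 3., 4., 5.])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
sigma = np.full_like(y, 0.2)

w = 1 / sigma**2
# Analytic solution of the normal equations for y = a + b*x
S, Sx, Sy = w.sum(), (w * x).sum(), (w * y).sum()
Sxx, Sxy = (w * x * x).sum(), (w * x * y).sum()
delta = S * Sxx - Sx**2
a = (Sxx * Sy - Sx * Sxy) / delta
b = (S * Sxy - Sx * Sy) / delta

# Goodness of fit: chi-squared per degree of freedom (n points - 2 parameters)
chi2 = (w * (y - a - b * x)**2).sum()
print(f"a = {a:.3f}, b = {b:.3f}, chi2/dof = {chi2/(len(x)-2):.2f}")
```

A chi2/dof near 1 indicates residuals consistent with the quoted errors; values far above 1 signal a bad model or underestimated errors, values far below 1 suggest overestimated errors.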
Milewski, Emil G
2012-01-01
REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As the name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams or doing homework, and will remain a lasting reference source for students, teachers, and professionals. Statistics II discusses sampling theory, statistical inference, independent and dependent variables, correlation theory, experimental design, count data, the chi-square test, and time series.
Cost analysis of LEED certified United States navy buildings
Kirar, Carl V.
2011-01-01
CIVINS (Civilian Institutions) Thesis document. A study completed at UW-Madison in 2010 reviewed the energy consumption of US Navy buildings that earned Leadership in Energy and Environmental Design (LEED) certification from the United States Green Building Council (USGBC). The research compared LEED-certified buildings to commercial counterparts within the US Navy inventory against Executive Order (EO) 13423. The EO mandated that all federal agencies meet a 30 percent reduction of...
Reliability of contemporary data-acquisition techniques for LEED analysis
International Nuclear Information System (INIS)
Noonan, J.R.; Davis, H.L.
1980-10-01
It is becoming clear that one of the principal limitations in LEED structure analysis is the quality of the experimental I-V profiles. This limitation is discussed, and data-acquisition procedures are described which, for simple systems, seem to enhance the quality of agreement between the results of theoretical model calculations and experimental LEED spectra. By employing such procedures to obtain data from Cu(100), excellent agreement between computed and measured profiles has been achieved. 7 figures
Do LEED-certified buildings save energy? Yes, but...
Energy Technology Data Exchange (ETDEWEB)
Newsham, Guy R.; Mancini, Sandra; Birt, Benjamin J. [National Research Council Canada - Institute for Research in Construction, Ottawa (Canada)
2009-08-15
We conducted a re-analysis of data supplied by the New Buildings Institute and the US Green Buildings Council on measured energy use data from 100 LEED-certified commercial and institutional buildings. These data were compared to the energy use of the general US commercial building stock. We also examined energy use by LEED certification level, and by energy-related credits achieved in the certification process. On average, LEED buildings used 18-39% less energy per floor area than their conventional counterparts. However, 28-35% of LEED buildings used more energy than their conventional counterparts. Further, the measured energy performance of LEED buildings had little correlation with certification level of the building, or the number of energy credits achieved by the building at design time. Therefore, at a societal level, green buildings can contribute substantial energy savings, but further work needs to be done to define green building rating schemes to ensure more consistent success at the individual building level. Note, these findings should be considered as preliminary, and the analyses should be repeated when longer data histories from a larger sample of green buildings are available. (author)
Designing healthy communities: A walkability analysis of LEED-ND
Directory of Open Access Journals (Sweden)
Adriana A. Zuniga-Teran
2016-12-01
Prevailing city design in many countries has created sedentary societies that depend on automobile use. Consequently, architects, urban designers, and land planners have developed new urban design theories, which have been incorporated into the Leadership in Energy and Environmental Design for Neighborhood Development (LEED-ND) certification system. The LEED-ND includes design elements that improve human well-being by facilitating walking and biking, a concept known as walkability. Despite these positive developments, relevant research findings from other fields of study have not been fully integrated into the LEED-ND. In Zuniga-Teran (2015), relevant walkability research findings from multiple disciplines were organized into a walkability framework (WF) that groups design elements related to physical activity into nine categories: connectivity, land use, density, traffic safety, surveillance, parking, experience, greenspace, and community. In this study, we analyze walkability in the LEED-ND through the lens of the nine WF categories. Through quantitative and qualitative analyses, we identify gaps and strengths in the LEED-ND and propose potential enhancements to this certification system that reflect more comprehensively what is known about enhancing walkability through neighborhood design. This work seeks to facilitate the translation of research into practice, which can ultimately lead to more active and healthier societies.
Cameron, Isobel M; Scott, Neil W; Adler, Mats; Reid, Ian C
2014-12-01
It is important for clinical practice and research that measurement scales of well-being and quality of life exhibit only minimal differential item functioning (DIF). DIF occurs where different groups of people endorse items in a scale to different extents after being matched on the intended scale attribute. We investigate the equivalence or otherwise of common methods of assessing DIF. Three methods of measuring age- and sex-related DIF (ordinal logistic regression, Rasch analysis and the Mantel χ² procedure) were applied to Hospital Anxiety and Depression Scale (HADS) data pertaining to a sample of 1,068 patients consulting primary care practitioners. Three items were flagged by all three approaches as having either age- or sex-related DIF with a consistent direction of effect; a further three items identified did not meet stricter criteria for important DIF using at least one method. When applying strict criteria for significant DIF, ordinal logistic regression was slightly less sensitive. Ordinal logistic regression, Rasch analysis and contingency table methods yielded consistent results when identifying DIF in the HADS depression and HADS anxiety scales. Regardless of the methods applied, investigators should use a combination of statistical significance, magnitude of the DIF effect and investigator judgement when interpreting the results.
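The contingency-table approach mentioned above is commonly implemented as a Mantel-Haenszel chi-square across 2x2 tables stratified by total scale score. The sketch below uses a binary-item simplification with invented stratum counts; the HADS items are actually ordinal, so the study's Mantel procedure is the ordered-category generalization of this.

```python
import numpy as np
from scipy.stats import chi2

def mantel_haenszel_chi2(strata):
    """Mantel-Haenszel chi-square (with continuity correction) across
    score-matched 2x2 tables -- a common screen for uniform DIF."""
    A = E = V = 0.0
    for t in strata:                       # each t = [[a, b], [c, d]]
        (a, b), (c, d) = t
        n = a + b + c + d
        A += a                             # observed count in cell (group 1, endorsed)
        E += (a + b) * (a + c) / n         # expected under no association
        V += (a + b) * (c + d) * (a + c) * (b + d) / (n**2 * (n - 1))
    stat = (abs(A - E) - 0.5)**2 / V
    return stat, chi2.sf(stat, df=1)

# Hypothetical strata matched on total score: rows = group (e.g. men/women),
# cols = item endorsed yes/no. Counts are invented for illustration.
strata = [np.array([[20, 30], [15, 35]]),
          np.array([[40, 20], [30, 30]]),
          np.array([[25, 10], [18, 17]])]

stat, p = mantel_haenszel_chi2(strata)
print(f"MH chi2 = {stat:.2f}, p = {p:.3f}")
```

A significant statistic here means one group endorses the item more often than score-matched members of the other group, which is the operational definition of DIF used by all three methods in the abstract.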
Energy Technology Data Exchange (ETDEWEB)
Ogletree, D.F.
1986-11-01
LEED multiple scattering theory is briefly summarized, and aspects of electron scattering with particular significance to experimental measurements, such as electron beam coherence, instrument response and phonon scattering, are analyzed. Diffuse LEED experiments are discussed. New techniques that enhance the power of LEED are described, including a real-time video image digitizer applied to LEED intensity measurements, along with computer programs to generate I-V curves. The first electron counting LEED detector, using a "wedge and strip" position sensitive anode and digital electronics, is described. This instrument uses picoampere incident beam currents, and its sensitivity is limited only by statistics and counting times. Structural results on new classes of surface systems are presented. The structure of the c(4 x 2) phase of carbon monoxide adsorbed on Pt(111) has been determined, showing that carbon monoxide molecules adsorb in both top and bridge sites, 1.85 ± 0.10 Å and 1.55 ± 0.10 Å above the metal surface, respectively. The structure of an incommensurate graphite overlayer on Pt(111) is analyzed. The graphite layer is 3.70 ± 0.05 Å above the metal surface, with intercalated carbon atoms located 1.25 ± 0.10 Å above hollow sites supporting it. The (2√3 x 4)-rectangular phase of benzene and carbon monoxide coadsorbed on Pt(111) is analyzed. Benzene molecules adsorb in bridge sites parallel to and 2.10 ± 0.10 Å above the surface. The carbon ring is expanded, with an average C-C bond length of 1.72 ± 0.15 Å. The carbon monoxide molecules also adsorb in bridge sites. The structure of the (√3 x √3) reconstruction on the (111) face of the α-CuAl alloy has been determined.
Automated fenestration allocation as complying with LEED rating system
Directory of Open Access Journals (Sweden)
Hazem Mohamed Talaat El Daly
2014-12-01
Windows can be allocated automatically, with the help of certain well-known heuristic algorithms and simulation programs, to comply with the LEED rating system by achieving the required daylight levels with minimum solar radiation inside a particular building. This research presents a design method based on simulation techniques, aided by heuristic algorithms, within a parametric design that automatically allocates windows to comply with LEED. At the end of the research, a small project is discussed to evaluate the design process.
Quantum chi-squared and goodness of fit testing
Energy Technology Data Exchange (ETDEWEB)
Temme, Kristan [IQIM, California Institute of Technology, Pasadena, California 91125 (United States); Verstraete, Frank [Fakultät für Physik, Universität Wien, Boltzmanngasse 5, 1090 Wien, Austria and Faculty of Science, Ghent University, B-9000 Ghent (Belgium)
2015-01-15
A quantum mechanical hypothesis test is presented for the hypothesis that a certain setup produces a given quantum state. Although the classical and the quantum problems are very much related to each other, the quantum problem is much richer due to the additional optimization over the measurement basis. A goodness of fit test for i.i.d. quantum states is developed and a max-min characterization for the optimal measurement is introduced. We find the quantum measurement which leads both to the maximal Pitman and Bahadur efficiencies, and determine the associated divergence rates. We discuss the relationship of the quantum goodness of fit test to the problem of estimating multiple parameters from a density matrix. These problems are found to be closely related, and we show that the largest error of an optimal strategy, determined by the smallest eigenvalue of the Fisher information matrix, is given by the divergence rate of the goodness of fit test.
Residuals and the Residual-Based Statistic for Testing Goodness of Fit of Structural Equation Models
Foldnes, Njal; Foss, Tron; Olsson, Ulf Henning
2012-01-01
The residuals obtained from fitting a structural equation model are crucial ingredients in obtaining chi-square goodness-of-fit statistics for the model. The authors present a didactic discussion of the residuals, obtaining a geometrical interpretation by recognizing the residuals as the result of oblique projections. This sheds light on the…
Statistical power of likelihood ratio and Wald tests in latent class models with covariates
Gudicha, D.W.; Schmittmann, V.D.; Vermunt, J.K.
2017-01-01
This paper discusses power and sample-size computation for likelihood ratio and Wald testing of the significance of covariate effects in latent class models. For both tests, asymptotic distributions can be used; that is, the test statistic can be assumed to follow a central Chi-square under the null
Roberts, James S.
Stone and colleagues (C. Stone, R. Ankenman, S. Lane, and M. Liu, 1993; C. Stone, R. Mislevy and J. Mazzeo, 1994; C. Stone, 2000) have proposed a fit index that explicitly accounts for the measurement error inherent in an estimated theta value, here called χ²_i*. The elements of this statistic are natural…
Cost Analysis of Leed Certified United States Navy Buildings
2011-08-03
Employee productivity, investors seeking more socially conscious investments, and reputational issues have been forcing the real estate sector towards more efficient building techniques.
Performance or marketing benefits? The case of LEED certification.
Matisoff, Daniel C; Noonan, Douglas S; Mazzolini, Anna M
2014-01-01
Green building adoption is driven by both performance-based and marketing-based benefits. Performance-based benefits are those that improve performance or lower the operating costs of the building or of building users. Marketing benefits stem from the consumer response to green certification. This study illustrates the relative importance of the marketing-based benefits that accrue to Leadership in Energy and Environmental Design (LEED) buildings through green signaling mechanisms, specifically those related to the certification itself. Of course, all participants in the LEED certification scheme seek marketing benefits, but even among LEED participants the interest in green signaling is pronounced. The green signaling mechanism that occurs at the certification thresholds shifts building patterns from just below to just above the threshold level, and motivates builders to cluster buildings just above each threshold. Results are consistent across subsamples, though nonprofit organizations appear to build greener buildings and engage in more green signaling than for-profit entities. Using nonparametric regression discontinuity, signaling across different building types is observed. The marketing benefits of LEED certification drive organizations to build "greener" buildings by upgrading buildings at the thresholds to reach certification levels.
Nonparametric statistics for social and behavioral sciences
Kraska-MIller, M
2013-01-01
Introduction to Research in Social and Behavioral Sciences; Basic Principles of Research; Planning for Research; Types of Research Designs; Sampling Procedures; Validity and Reliability of Measurement Instruments; Steps of the Research Process; Introduction to Nonparametric Statistics; Data Analysis; Overview of Nonparametric Statistics and Parametric Statistics; Overview of Parametric Statistics; Overview of Nonparametric Statistics; Importance of Nonparametric Methods; Measurement Instruments; Analysis of Data to Determine Association and Agreement; Pearson Chi-Square Test of Association and Independence; Contingency…
Critical review of LEED system for rating sustainability of architecture of commercial interiors
Directory of Open Access Journals (Sweden)
Stevanović Sanja
2010-01-01
Full Text Available The LEED rating system for sustainability of architecture has gained large marketing potential in the USA and has become one of the main ways American builders are attacking ecological challenges. In this paper the LEED rating system for commercial interiors is critically reviewed, pointing out its positive impact (a focus on the integrated design process) and its negative ones (low thresholds for the highest ratings, and a tendency to gain a LEED rating with projects that barely pass the thresholds, largely neglecting the principles of energy efficiency). Based on a few prominent LEED Platinum examples, the beginnings of a LEED style of designing interiors in historical landmark buildings are pointed out as well.
Achieving LEED credit for ergonomics: Laying the foundation.
Lynch, Mallory
2014-01-01
Despite guidance from the United States Green Building Council (USGBC) on the requirements for earning a Leadership in Energy and Environmental Design (LEED) ergonomics credit in the Innovation in Design and Innovation in Operations categories, few projects have received the credit. The University of California, Berkeley ergonomics program, Ergonomics@Work, has aligned its ergonomics strategy with those of the USGBC and LEED to achieve the ergonomics credit in several new buildings. This article describes the steps needed to obtain the credit and highlights the opportunities it creates to partner with the project team to promote ergonomics. As a profession, it is up to ergonomists to create the road map that incorporates ergonomics into green building design.
Rumpel-Leede Phenomenon in a Hypertensive Lady on Amlodipine
Viswanathan, Stalin
2014-01-01
We describe a 60-year-old hypertensive lady who developed the Rumpel-Leede phenomenon following the use of a tourniquet to obtain a blood sample. History revealed that she was on amlodipine therapy and that spontaneous sun-exposure-related purpura had often been seen since amlodipine was prescribed. Examinations and investigations yielded normal results. She refused consent for a skin biopsy. Symptoms resolved after amlodipine was substituted with enalapril and dihydrochlorothiazide, without any further recurrence. PMID:24959504
A value-for-money solution in Leeds
Energy Technology Data Exchange (ETDEWEB)
Gaunt, M. [United Leeds Teaching Hospitals NHS Trust, Leeds (United Kingdom)
1998-05-01
A contract energy services scheme is described which supplies all the electric power, heating and chilling needs of the United Leeds Hospital and the University of Leeds campus. In order to meet current needs, a major expansion of capacity and reconfiguration of an existing GSC, built as a joint venture between Leeds General Infirmary and the University in the 1970s, was required. The estimated capital investment for the project was Pound 6.5 M. The decision to develop the project as an energy services scheme was taken in view of the technical complexity, which required project management and engineering skills not available in either the Hospital or the University. It has been successfully implemented and is meeting expectations in terms of both service delivery and savings. The Hospital and University have avoided the need to obtain and invest capital themselves, the combination of more energy-efficient equipment and better use of existing capacity has reduced revenue costs, and management time has been reduced. Over the lifetime of the 20-year contract, savings of Pound 700,000 per annum on average are expected. (UK)
Directory of Open Access Journals (Sweden)
Shoukkathali Anzar
2017-01-01
Full Text Available Objective: The Self-administered Leeds Assessment of Neuropathic Symptoms and Signs (S-LANSS) is a 7-item self-report scale developed to identify pain which is of predominantly neuropathic origin. The aim of this study was to develop a Malayalam version of the S-LANSS and to test its validity and reliability in chronic pain patients. Methodology: We enrolled 101 Malayalam-speaking chronic pain patients who visited the Division of Palliative Medicine, Regional Cancer Centre, Thiruvananthapuram, Kerala, India. The translated version of S-LANSS was constructed by standard means. Fifty-one neuropathic pain and fifty nociceptive pain patients were identified by an independent pain physician and were administered the new pain scale by a palliative care nurse who was blinded to the diagnosis. The "gold standard" diagnosis is the one the physician makes after clinical examination. The scale's validity, sensitivity, specificity, and positive and negative predictive values were determined. Results: Fifty-one neuropathic pain and fifty nociceptive pain patients were given the Malayalam version of the S-LANSS pain scale for validity testing. Agreement by Cohen's kappa was 0.743 (chi-square test P < 0.001), with sensitivity 89.58%, specificity 84.91%, positive predictive value 84.31%, negative predictive value 90.00%, accuracy 87.13%, and likelihood ratio 5.94. Conclusion: The Malayalam version of the S-LANSS pain scale is a validated screening tool for identifying neuropathic pain in chronic pain patients in Malayalam-speaking regions.
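The validation metrics reported in studies like the one above all derive from a single 2x2 confusion matrix of screening result versus gold-standard diagnosis. A minimal stdlib sketch (the counts passed in at the bottom are hypothetical, not the study's actual table):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Validation metrics for a binary screening tool against a
    gold-standard diagnosis, from the 2x2 confusion matrix counts:
    tp/fn = gold-standard positives screened positive/negative,
    fp/tn = gold-standard negatives screened positive/negative."""
    total = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)          # positive predictive value
    npv = tn / (tn + fn)          # negative predictive value
    accuracy = (tp + tn) / total
    # Cohen's kappa: observed agreement corrected for chance agreement
    p_observed = accuracy
    p_chance = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / total ** 2
    kappa = (p_observed - p_chance) / (1 - p_chance)
    return sensitivity, specificity, ppv, npv, accuracy, kappa

# Hypothetical counts for illustration only
print(diagnostic_metrics(tp=43, fp=8, fn=5, tn=45))
```

The likelihood ratio quoted in the abstract is sensitivity / (1 - specificity), computable from the same two quantities.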
Hayslett, H T
1991-01-01
Statistics covers the basic principles of statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses, and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population, are explained. The text the
Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.
Citrate anticoagulation in the ICU: the Leeds experience.
Trumper, Charlotte
2016-09-08
Continuous renal replacement therapy (CRRT) is widely used in the management of critically ill patients with acute kidney injury. It requires effective anticoagulation of the extracorporeal blood circuit. Although heparin is the most commonly prescribed anticoagulant, there are issues associated with heparin, and there has been increasing interest in regional citrate anticoagulation as an alternative. In 2013, The Leeds Teaching Hospitals NHS Trust switched from heparin to citrate anticoagulant for CRRT in intensive care units (ICUs) across the Trust. This article examines the reasons for the switch, the implementation of citrate and the impact of this quality-improvement project in terms of patient outcome data and feedback from the ICU nursing team.
International Nuclear Information System (INIS)
2005-01-01
For the years 2004 and 2005 the figures shown in the tables of Energy Review are partly preliminary. The annual statistics published in Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time-series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004.) The applied energy units and conversion coefficients are shown in the back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuels use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity GWh, Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes, precautionary stock fees and oil pollution fees
A Study on the LEED Energy Simulation Process Using BIM
Directory of Open Access Journals (Sweden)
Han-Soo Ryu
2016-02-01
Full Text Available In domestic and international environmentally friendly certification systems, energy-related credits occupy a high ratio of the total certification score. The Leadership in Energy and Environmental Design (LEED) system is a certification system developed by the US Green Building Council (USGBC) in order to assess the environmental friendliness of buildings. The energy-related credits are approximately 30% of the total, and the energy simulation ratio specifically is the highest among single credits, at 20%. In this research, an energy simulation process using Building Information Modeling (BIM), based on the energy simulation case performed for the A-Tower LEED certification, is proposed. It places an emphasis on the verification process, which was lacking in previous research. The architectural geometry modeled through the BIM tool is converted to gbXML, and in this process the geometry is verified through interference-check functions, the gbXML Viewer and the FZKViewer. The energy simulation is performed after the verification procedure. The geometry verification process in the A-Tower project is presented throughout this paper. In conclusion, an improved process is proposed for the productivity and reliability of energy simulation.
International Nuclear Information System (INIS)
2000-01-01
For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g., Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
1999-01-01
For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
2001-01-01
For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
2003-01-01
For the year 2002, part of the figures shown in the tables of the Energy Review are preliminary. The annual statistics of the Energy Review also include historical time-series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuels use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supply and total consumption of electricity GWh, Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees on energy products
International Nuclear Information System (INIS)
2000-01-01
For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g., Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 2000, Energy exports by recipient country in January-June 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
2004-01-01
For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time-series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuels use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity GWh, Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees
Radiofrequency electromagnetic fields in the Cookridge area of Leeds
Fuller, K; Judd, P M; Lowe, A J; Shaw, J
2002-01-01
On 8 and 9 May 2002 representatives of the National Radiological Protection Board (NRPB) performed a radiofrequency electromagnetic field survey in the Cookridge area of Leeds in order to assess exposure to radio signals from transmitters mounted on a water tower, a lattice tower and a radio station tower. Guidelines on limiting exposure to radio signals have been published by NRPB and the International Commission on Non-Ionizing Radiation Protection (ICNIRP). These guidelines are designed to prevent established adverse effects on human health. During this survey, the total exposures due to all radio signals from 30 MHz to 18000 MHz (18 GHz) were measured. This frequency range was chosen as it includes mobile phone base station transmissions, which are at around 900 and 1800 MHz, and super high frequency (SHF) transmissions from most of the large microwave dish antennas mounted on the towers. In addition, other major sources of radiofrequency electromagnetic fields in the environment such as broadcast radio...
Using volcano plots and regularized-chi statistics in genetic association studies.
Li, Wentian; Freudenberg, Jan; Suh, Young Ju; Yang, Yaning
2014-02-01
Labor-intensive experiments are typically required to identify the causal disease variants from a list of disease-associated variants in the genome. For designing such experiments, candidate variants are ranked by the strength of their genetic association with the disease. However, the two commonly used measures of genetic association, the odds ratio (OR) and the p-value, may rank variants in different orders. To integrate these two measures into a single analysis, here we transfer the volcano plot methodology from gene expression analysis to genetic association studies. In its original setting, a volcano plot is a scatter plot of fold-change against the t-test statistic (or -log of the p-value), with the latter being more sensitive to sample size. In genetic association studies, the OR and Pearson's chi-square statistic (or equivalently its square root, chi, or the standardized log(OR)) can be analogously used in a volcano plot, allowing for their visual inspection. Moreover, the geometric interpretation of these plots leads to an intuitive method for filtering results by a combination of both the OR and the chi-square statistic, which we term "regularized-chi". This method selects associated markers by a smooth curve in the volcano plot instead of the right-angled lines that correspond to independent cutoffs for the OR and the chi-square statistic. The regularized-chi incorporates relatively more signal from variants with lower minor allele frequencies than the chi-square statistic does. As rare variants tend to have stronger functional effects, regularized-chi is better suited to the task of prioritizing candidate genes. Copyright © 2013 Elsevier Ltd. All rights reserved.
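The two axes of such a volcano plot can be computed directly from a 2x2 table of allele counts in cases and controls. The sketch below computes the log odds ratio and the signed chi statistic; the product of their magnitudes stands in as one plausible smooth combined criterion, an illustrative assumption rather than the paper's exact regularization (the counts are also invented):

```python
import math

def volcano_coords(a, b, c, d):
    """a, b: minor/major allele counts in cases; c, d: in controls.
    Returns (log odds ratio, chi), where chi is the signed square
    root of Pearson's chi-square statistic for the 2x2 table."""
    n = a + b + c + d
    log_or = math.log((a * d) / (b * c))
    # Shortcut form of Pearson's chi-square for a 2x2 table
    chi_sq = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    return log_or, math.copysign(math.sqrt(chi_sq), log_or)

def combined_score(a, b, c, d):
    """Illustrative smooth filter: markers pass if |log OR| * |chi|
    exceeds a threshold, i.e. a curve in the volcano plane rather
    than two independent right-angled cutoffs."""
    log_or, chi = volcano_coords(a, b, c, d)
    return abs(log_or) * abs(chi)

# Hypothetical marker: 30/70 alleles in cases, 10/90 in controls
print(volcano_coords(30, 70, 10, 90))
```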
Tellinghuisen, Joel
2008-01-01
The method of least squares is probably the most powerful data analysis tool available to scientists. Toward a fuller appreciation of that power, this work begins with an elementary review of statistics fundamentals, and then progressively increases in sophistication as the coverage is extended to the theory and practice of linear and nonlinear least squares. The results are illustrated in application to data analysis problems important in the life sciences. The review of fundamentals includes the role of sampling and its connection to probability distributions, the Central Limit Theorem, and the importance of finite variance. Linear least squares are presented using matrix notation, and the significance of the key probability distributions (Gaussian, chi-square, and t) is illustrated with Monte Carlo calculations. The meaning of correlation is discussed, including its role in the propagation of error. When the data themselves are correlated, special methods are needed for the fitting, as they are also when fitting with constraints. Nonlinear fitting gives rise to nonnormal parameter distributions, but the 10% Rule of Thumb suggests that such problems will be insignificant when the parameter is sufficiently well determined. Illustrations include calibration with linear and nonlinear response functions, the dangers inherent in fitting inverted data (e.g., Lineweaver-Burk equation), an analysis of the reliability of the van't Hoff analysis, the problem of correlated data in the Guggenheim method, and the optimization of isothermal titration calorimetry procedures using the variance-covariance matrix for experiment design. The work concludes with illustrations on assessing and presenting results.
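The weighted linear least-squares machinery reviewed above, including the variance-covariance matrix it emphasizes, can be made concrete with a short closed-form sketch for a straight-line fit (the data points at the bottom are invented for illustration):

```python
def weighted_line_fit(x, y, w):
    """Weighted least-squares fit of y = a + b*x.
    Returns (a, b) and the 2x2 variance-covariance matrix of (a, b),
    assuming weights w_i = 1/sigma_i^2."""
    S = sum(w)
    Sx = sum(wi * xi for wi, xi in zip(w, x))
    Sy = sum(wi * yi for wi, yi in zip(w, y))
    Sxx = sum(wi * xi * xi for wi, xi in zip(w, x))
    Sxy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
    delta = S * Sxx - Sx * Sx
    a = (Sxx * Sy - Sx * Sxy) / delta   # intercept
    b = (S * Sxy - Sx * Sy) / delta     # slope
    cov = [[Sxx / delta, -Sx / delta],  # var(a), cov(a, b)
           [-Sx / delta, S / delta]]    # cov(a, b), var(b)
    return (a, b), cov

# Invented data, unit weights
(a, b), cov = weighted_line_fit([0.0, 1.0, 2.0, 3.0],
                                [1.1, 2.9, 5.2, 6.8],
                                [1.0] * 4)
```

The off-diagonal covariance term is what enters the propagation-of-error expressions the abstract mentions.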
Universidad de Leeds - Gran Bretaña
Directory of Open Access Journals (Sweden)
Chamberlin, -
1977-11-01
Full Text Available Located 1,500 m from the center of the city, the Leeds University complex actively participates in city life. Designed in the 60's and built later on, this architectonic complex is outstanding because it offers an «ideal» city, perfectly integrated in the «real» city and conditioned to its own needs, to a great extent. In the beginning, the challenge of converting this university complex, with a capacity for 10,000 students, into an architectonically attractive urban center met with difficulties referring to the traffic and parking problems corresponding to a city as large as the one projected; this obstacle was overcome by adequate organization of underground and overhead traffic arteries, which reserved large garden areas exclusively for pedestrians, freeing them from traffic congestion and offering the pleasant and relaxed atmosphere required. The large «campus» is sub-divided into different garden areas, connected one to the other, and in the center of each one there is a varied and complementary architecture, which breaks with the conventional monolithic style.
Study of the local structure of binary surfaces by electron diffraction (XPS, LEED)
Gereová, Katarína
2006-01-01
A study of the local structure of a binary surface using ultra-thin films of cerium deposited on a Pd(111) single-crystal surface is presented. X-ray photoelectron spectroscopy and diffraction (XPS, XPD), angle-resolved UV photoemission spectroscopy (ARUPS) and low energy electron diffraction (LEED) were used for our investigations. The LEED and X-ray-excited photoemission intensity results represent the surface geometrical structure. As well, mapping of ultra-violet photoelectron intensities as a...
Cornillon, Pierre-Andre; Husson, Francois; Jegou, Nicolas; Josse, Julie; Kloareg, Maela; Matzner-Lober, Eric; Rouviere, Laurent
2012-01-01
An Overview of R: Main Concepts; Installing R; Work Session; Help; R Objects; Functions; Packages; Exercises. Preparing Data: Reading Data from File; Exporting Results; Manipulating Variables; Manipulating Individuals; Concatenating Data Tables; Cross-Tabulation; Exercises. R Graphics: Conventional Graphical Functions; Graphical Functions with lattice; Exercises. Making Programs with R: Control Flows; Predefined Functions; Creating a Function; Exercises. Statistical Methods: Introduction to the Statistical Methods; A Quick Start with R; Installing R; Opening and Closing R; The Command Prompt; Attribution, Objects, and Function; Selection; Other Rcmdr Package; Importing (or Inputting) Data; Graphs; Statistical Analysis; Hypothesis Test; Confidence Intervals for a Mean; Chi-Square Test of Independence; Comparison of Two Means; Testing Conformity of a Proportion; Comparing Several Proportions; The Power of a Test; Regression; Simple Linear Regression; Multiple Linear Regression; Partial Least Squares (PLS) Regression; Analysis of Variance and Covariance; One-Way Analysis of Variance; Multi-Way Analysis of Varian...
Statistical insights from Romanian data on higher education
Directory of Open Access Journals (Sweden)
Andreea Ardelean
2015-09-01
Full Text Available This paper aims to use cluster analysis to make a comparative analysis, at the regional level, of Romanian higher education. The evolution of higher education in the post-communist period is also presented, using quantitative traits. Although the focus is on university education, references to education as a whole are included by way of comparison. Then, to highlight the importance of higher education, the chi-square test is applied to check whether there is an association between statistical regions and the education level of the unemployed.
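The association check described above is the standard Pearson chi-square test of independence on a contingency table of regions by education levels. A stdlib sketch (the region-by-education counts are invented for illustration):

```python
def chi_square_independence(table):
    """Pearson chi-square statistic and degrees of freedom for an
    r x c contingency table (here: rows = regions, columns =
    education levels of the unemployed)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    stat = sum(
        (obs - row_totals[i] * col_totals[j] / n) ** 2
        / (row_totals[i] * col_totals[j] / n)   # (O - E)^2 / E
        for i, row in enumerate(table)
        for j, obs in enumerate(row)
    )
    dof = (len(row_totals) - 1) * (len(col_totals) - 1)
    return stat, dof

# Hypothetical counts: 3 regions x 2 education levels
stat, dof = chi_square_independence([[120, 30], [90, 45], [60, 55]])
```

The statistic is then compared with the chi-square distribution on `dof` degrees of freedom to decide whether region and education level are associated.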
A Statistical Toolkit for Data Analysis
International Nuclear Information System (INIS)
Donadio, S.; Guatelli, S.; Mascialino, B.; Pfeiffer, A.; Pia, M.G.; Ribon, A.; Viarengo, P.
2006-01-01
The present project aims to develop an open-source and object-oriented software toolkit for statistical data analysis. Its statistical testing component contains a variety of goodness-of-fit tests, from chi-squared to Kolmogorov-Smirnov, to less well known but generally much more powerful tests such as Anderson-Darling, Goodman, Fisz-Cramer-von Mises, Kuiper and Tiku. Thanks to the component-based design and the usage of standard abstract interfaces for data analysis, this tool can be used by other data analysis systems or integrated into experimental software frameworks. The Toolkit has been released and is downloadable from the web. In this paper we describe the statistical details of the algorithms, the computational features of the Toolkit, and the code validation.
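The toolkit itself is not shown here, but two of the goodness-of-fit tests it names, chi-squared and Kolmogorov-Smirnov, can be tried side by side with SciPy (assuming NumPy/SciPy are available; the simulated sample is illustrative):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
sample = rng.normal(size=500)   # simulated data, truly standard normal

# Kolmogorov-Smirnov: compare the empirical CDF with a standard normal
ks_stat, ks_p = stats.kstest(sample, "norm")

# Chi-squared goodness of fit: bin the sample and compare observed
# bin counts with those expected under the same standard normal
counts, edges = np.histogram(sample, bins=10)
expected = np.diff(stats.norm.cdf(edges))
expected *= counts.sum() / expected.sum()   # totals must match for the test
chi_stat, chi_p = stats.chisquare(counts, f_exp=expected)
```

KS works on the unbinned sample while the chi-squared test loses information to binning, one reason the abstract calls some of the alternatives "generally much more powerful".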
Directory of Open Access Journals (Sweden)
Dalton Richard
2007-01-01
Full Text Available Abstract Background In the United Kingdom (UK), there is an extensive market for the class 'A' drug heroin. Many heroin users spend time in prison. People addicted to heroin often require prescribed medication when attempting to cease their drug use. The most commonly used detoxification agents in UK prisons are buprenorphine, dihydrocodeine and methadone. However, national guidelines do not state a detoxification drug of choice. Indeed, there is a paucity of research evaluating the most effective treatment for opiate detoxification in prisons. This study seeks to address this paucity by evaluating routinely used interventions amongst drug-using prisoners within UK prisons. Methods/Design The Leeds Evaluation of Efficacy of Detoxification Study (LEEDS) Prisons Pilot Study will use randomised controlled trial methodology to compare the open use of buprenorphine and dihydrocodeine for opiate detoxification, given in the context of routine care, within HMP Leeds. Prisoners who are eligible and give informed consent will be entered into the trial. The primary outcome measure will be abstinence status at five days post detoxification, as determined by a urine test. Secondary outcomes during the detoxification and then at one, three and six months post detoxification will be recorded.
Directory of Open Access Journals (Sweden)
Peng Wu
2017-12-01
Full Text Available Leadership in Energy and Environmental Design (LEED) is one of the most widely recognized green building rating systems. With more than 20% of projects certified in non-United States (US) countries, LEED's global impact has been increasing, and it is critically important for developers and regulatory authorities to understand LEED's performance at the country level to facilitate global implementation. This study therefore aims to investigate the credit achievement pattern of LEED 2009, one of the most fully developed versions of LEED, using 4021 certified projects in the US, China, Turkey, and Brazil. The results show that significant differences can be identified in most rating categories, including sustainable sites, water efficiency, energy and atmosphere, indoor environmental quality, and innovation in design. Using a post hoc analysis, country-specific credit allocation patterns are also identified to help developers understand existing country-specific green building practices. In addition, it is found that achievement of regional priority credits is unbalanced. The study offers a useful reference and benchmark for international developers and contractors to understand the regional variations of LEED 2009, and for regulatory authorities, such as the U.S. Green Building Council, to improve the rating system, especially in designing regional priority credits.
Wu, Wei
2010-01-01
Building information modeling (BIM) and green building are currently two major trends in the architecture, engineering and construction (AEC) industry. This research recognizes the market demand for better solutions to achieve green building certification such as LEED in the United States. It proposes a new strategy based on the integration of BIM…
Academic Training: Practical Statistics for Particle Physicists
2006-01-01
2006-2007 ACADEMIC TRAINING PROGRAMME LECTURE SERIES 9, 10, 11, 12, 13 October from 11:00 to 12:00 - Main Auditorium, bldg. 500 (TH Auditorium, bldg. 4, 3rd floor, on 13 October) Practical Statistics for Particle Physicists L. LYONS, University of Oxford, GB Lecture 1: Learning to love the error matrix. Introductory remarks. Conditional probability. Statistical and systematic errors. Combining results. Binomial, Poisson and 1-D Gaussian. 2-D Gaussian and the error matrix. Understanding the covariance. Using the error matrix. Estimating the error matrix. Combining correlated measurements. Lecture 2: Parameter determination by likelihood: Do's and don'ts. Introduction to likelihood. Error estimate. Simple examples: (1) Breit-Wigner, (2) lifetime. Binned and unbinned likelihood. Several parameters. Extended maximum likelihood. Common misapprehensions: normalisation, the delta(lnL) = 1/2 rule and coverage, integrating the likelihood, unbinned L_max as goodness of fit, the Punzi effect. Lecture 3: Chi-squared and hypothesis test...
Shaikh, Masood Ali
2016-04-01
Statistical tests help infer meaningful conclusions from studies conducted and data collected. This descriptive study analyzed the types of statistical tests used and the statistical software utilized for the analyses reported in the original articles published in 2014 by the three Medline-indexed journals of Pakistan. Cumulatively, 466 original articles were published in 2014. The most frequently reported statistical tests across all three journals were bivariate parametric and non-parametric tests, i.e., those involving comparisons between two groups, e.g., the Chi-square test, t-test, and various types of correlation. Cumulatively, 201 (43.1%) articles used these tests. SPSS was the primary choice for statistical analysis, being used exclusively in 374 (80.3%) original articles. There has been a substantial increase in the number of articles published, and in the sophistication of the statistical tests used, in the Pakistani Medline-indexed journals in 2014 compared to 2007.
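The Chi-square test counted most often in that survey is simple enough to compute by hand. The following pure-Python sketch, with hypothetical counts that are not from the article, shows the Pearson statistic for a 2x2 contingency table compared against the 5% critical value.

```python
# Pearson chi-square test of independence for a 2x2 table (toy counts).

def chi_square_2x2(table):
    """Return the Pearson chi-square statistic for a 2x2 table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical group-by-outcome counts
stat = chi_square_2x2([[30, 10], [20, 40]])
# stat ≈ 16.67, well above the df = 1, 5%-level critical value of 3.841,
# so independence would be rejected for these illustrative counts.
```

In practice one would read the p-value from the chi-square distribution with (rows-1)(cols-1) degrees of freedom rather than comparing to a single critical value.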
Yeandle, Sue
2016-03-14
This article explores developments in the support available to older people and carers (i.e., caregivers) in the city of Leeds, United Kingdom, and examines changes in provision during a period characterized by unprecedented resource constraint and new developments in national-local governance. Using documentary evidence, official statistics, and findings from recent studies led by the author, the article highlights the effects of these changes on service planning and delivery and the approaches taken by local actors to mitigate their impact. The statistical data show a marked decline in some types of services for older people during a 5-year period in which the city council took steps to mobilize citizens and develop new services and system improvements. The analysis uses theories of social quality as a framework for examining the complex picture of change in service provision. It concludes that although citizen involvement and consultations exerted a positive influence in delivering support to some older people and carers, research over a longer timescale is needed to show whether these changes are adequate to protect older people and carers from the effects of ongoing budgetary constraints.
Data analysis using the Gnu R system for statistical computation
Energy Technology Data Exchange (ETDEWEB)
Simone, James; /Fermilab
2011-07-01
R is a language system for statistical computation. It is widely used in statistics, bioinformatics, machine learning, data mining, quantitative finance, and the analysis of clinical drug trials. Among the advantages of R are: it has become the standard language for developing statistical techniques, it is being actively developed by a large and growing global user community, it is open source software, it is highly portable (Linux, OS-X and Windows), it has a built-in documentation system, it produces high quality graphics and it is easily extensible with over four thousand extension library packages available covering statistics and applications. This report gives a very brief introduction to R with some examples using lattice QCD simulation results. It then discusses the development of R packages designed for chi-square minimization fits for lattice n-pt correlation functions.
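Chi-square minimization of the kind the report's R packages perform can be illustrated with a minimal weighted least-squares fit. The linear model, synthetic data, and error bars below are illustrative assumptions, not the report's lattice correlation functions.

```python
# Minimal chi-square fit: minimize chi^2 = sum((y_i - a - b*x_i)/sigma_i)^2
# analytically for a straight-line model. Data are synthetic.

def chisq_linear_fit(x, y, sigma):
    """Weighted least-squares fit of y = a + b*x; returns (a, b, chi2)."""
    w = [1.0 / s ** 2 for s in sigma]
    S = sum(w)
    Sx = sum(wi * xi for wi, xi in zip(w, x))
    Sy = sum(wi * yi for wi, yi in zip(w, y))
    Sxx = sum(wi * xi * xi for wi, xi in zip(w, x))
    Sxy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
    delta = S * Sxx - Sx ** 2
    a = (Sxx * Sy - Sx * Sxy) / delta
    b = (S * Sxy - Sx * Sy) / delta
    chi2 = sum(wi * (yi - a - b * xi) ** 2
               for wi, xi, yi in zip(w, x, y))
    return a, b, chi2

x = [0.0, 1.0, 2.0, 3.0]
y = [1.1, 2.9, 5.2, 6.8]          # roughly y = 1 + 2x with noise
sigma = [0.2, 0.2, 0.2, 0.2]      # assumed uniform errors
a, b, chi2 = chisq_linear_fit(x, y, sigma)
# a = 1.09, b = 1.94 for this data; chi2 ≈ 2.05 with 2 degrees of freedom
```

A correlator fit would replace the linear model with, e.g., A*exp(-m*t) and minimize numerically, but the chi-square objective is the same.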
Younis Ahmed, Hamid M.
2003-01-01
The structures formed by adsorbing thin-film platinum, formic acid and oxygen on Cu{100} single crystals are investigated by quantitative low-energy electron diffraction (LEED) and Temperature Programmed Reaction Spectroscopy (TPRS). Symmetrized Automated Tensor LEED (SATLEED) calculations are used to determine the structure of the formed surface alloys and overlayers. TPRS was used to probe the surface reactivity of the systems studied, while surface composition was obtained using Auger Electro...
An institutional approach: education for sustainable development at the University of Leeds
Purvis, M; Young, CW; Marsh, C; Clarke, J
2013-01-01
Central to the strategic vision of the University of Leeds is the reaffirmation of the University’s commitment to provide an exceptional student experience centred on inspirational learning and teaching, grounded in world-class research. A key component of this vision is a major curriculum enhancement project. This chapter outlines the intent of this project, which reinforces existing provision that challenges undergraduate students to broaden their academic horizons and develop their capacit...
Directory of Open Access Journals (Sweden)
Asli Pelin Gurgun
2018-02-01
Full Text Available Compared to other categories, the Energy and Atmosphere category contributes the most to the maximum obtainable points in the Leadership in Energy and Environmental Design (LEED) certification system. The objective of the study was to identify the extent to which project teams take advantage of the credits in the Energy and Atmosphere category of LEED. This study analyzes the performance of practitioners in achieving points in the Energy and Atmosphere credits of LEED-New Construction (NC) 2009 for 1500 buildings that received LEED certification in the US. For a better understanding of the credit patterns, the differences in the performance of practitioners are investigated relative to certification levels and project ownership. Achievement in credits is calculated in terms of percent of maximum points (PMP), since the maximum achievable points differ for each credit. Practitioners' achievements in the credits were ranked as follows: (1) enhanced commissioning, (2) optimized energy performance, (3) enhanced refrigerant management, (4) green power, (5) measurement and verification, and (6) on-site renewable energy. The largest achievement differences were observed in the on-site renewable energy credit. Concerning building ownership, investors were found to mostly optimize energy efficiency and on-site renewable energy, but to mostly skip enhanced refrigerant management. Performance in the measurement and verification credit was similar for all owner types, whereas investors performed differently from corporations and government agencies in the enhanced commissioning credit. Practitioners who recognize these priorities and differences are expected to be better positioned to make sustainability-related decisions in building design and construction.
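The PMP normalization described above is a simple ratio. A minimal sketch with invented numbers, not data from the study, shows why it makes credits with different point ceilings comparable.

```python
# Percent of maximum points (PMP): mean achieved points for a credit,
# across projects, divided by that credit's maximum. Numbers are
# hypothetical, not the study's data.

def pmp(points_awarded, max_points):
    """PMP for one credit over a list of projects, as a percentage."""
    return 100.0 * sum(points_awarded) / (max_points * len(points_awarded))

# e.g. optimized energy performance (max 19 points in LEED-NC 2009)
# versus green power (max 2 points)
energy = pmp([10, 12, 8, 14], max_points=19)   # about 57.9%
green_power = pmp([2, 0, 2, 2], max_points=2)  # 75.0%
```

Raw point totals would make the 19-point credit look dominant; PMP shows the 2-point credit is actually achieved more completely in this toy sample.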
LEED structural analysis of GaAs(001)-c(4X4) surface
Czech Academy of Sciences Publication Activity Database
Romanyuk, Olexandr; Jiříček, Petr; Cukr, Miroslav; Bartoš, Igor
566-568, - (2004), s. 89-93 ISSN 0039-6028 R&D Projects: GA AV ČR IAA1010108 Institutional research plan: CEZ:AV0Z1010914 Keywords: electron-solid interactions * low energy electron diffraction (LEED) * molecular beam epitaxy (MBE) * surface relaxation and reconstruction * gallium arsenide * low index single crystal scattering * diffraction Subject RIV: BM - Solid Matter Physics; Magnetism Impact factor: 2.168, year: 2004
The association between the geography of fast food outlets and childhood obesity rates in Leeds, UK.
Fraser, Lorna K; Edwards, Kimberley L
2010-11-01
To analyse the association between childhood overweight and obesity and the density and proximity of fast food outlets in relation to the child's residential postcode. This was an observational study using individual-level height/weight data and geographic information systems methodology. Leeds, in West Yorkshire, UK; this area consists of 476 lower super-output areas. Children aged 3-14 years who lived within the Leeds metropolitan boundaries (n=33,594). The number of fast food outlets per area and the distance to the nearest fast food outlet from the child's home address. The weight status of the child: overweight, obese or neither. 27.1% of the children were overweight or obese, with 12.6% classified as obese. There is a significant positive correlation between the density of fast food outlets and higher deprivation. A higher density of fast food outlets was significantly associated (p=0.02) with the child being obese (or overweight/obese) in the generalised estimating equation model, which also included sex, age and deprivation. No significant association was found between distance to the nearest fast food outlet and overweight or obese status. There is a positive relationship between the density of fast food outlets per area and the obesity status of children in Leeds. There is also a significant association between fast food outlet density and areas of higher deprivation. Copyright © 2010 Elsevier Ltd. All rights reserved.
Indoor environmental quality differences between office types in LEED-certified buildings in the US
Energy Technology Data Exchange (ETDEWEB)
Lee, Young S. [School of Planning, Design, and Construction, Michigan State University, East Lansing, MI 48823 (United States); Guerin, Denise A. [College of Design, University of Minnesota, Twin Cities, MN 55108 (United States)
2010-05-15
The study compared IAQ, thermal quality, and lighting quality between five different office types in LEED-certified buildings in relation to employees' environmental satisfaction and their job performance. The aim was to inform workplace design so that workers in each specific office environment are provided with appropriate settings for these IEQ criteria when organizations comply with LEED standards. The five office types were private enclosed, private shared, open-plan with high cubicles (over 5'), open-plan with low cubicles (lower than 5'), and open-plan with no partitions (bullpen). The study found that IAQ enhanced workers' job performance more in enclosed private offices than in either high-cubicle or low-cubicle offices. All four other office types showed higher satisfaction with the amount of light and the visual comfort of light, as well as greater job-performance enhancement due to lighting quality, than high cubicles. There was no difference in thermal quality between the five office types, and IAQ and lighting quality did not differ between enclosed private, enclosed shared, and bullpen office types. The findings suggest careful workplace design that considers partition height in LEED-certified buildings, to improve employees' environmental satisfaction and job performance. (author)
Investigation of reordered (001) Au surfaces by positive ion channeling spectroscopy, LEED and AES
International Nuclear Information System (INIS)
Appleton, B.R.; Noggle, T.S.; Miller, J.W.; Schow, O.E. III; Zehner, D.M.; Jenkins, L.H.; Barrett, J.H.
1974-01-01
As a consequence of the channeling phenomenon of positive ions in single crystals, the yield of ions Rutherford scattered from an oriented single crystal surface is dependent on the density of surface atoms exposed to the incident ion beam. Thus, the positive ion channeling spectroscopy (PICS) technique should provide a useful tool for studying reordered surfaces. This possibility was explored by examining the surfaces of epitaxially grown thin Au single crystals with the combined techniques of LEED-AES and PICS. The LEED and AES investigations showed that when the (001) surface was sputter cleaned in ultra-high vacuum, the normal (1 x 1) symmetry of the (001) surfaces reordered into a structure which gave a complex (5 x 20) LEED pattern. The yield and energy distributions of 1 MeV He ions scattered from the Au surfaces were used to determine the number of effective monolayers contributing to the normal and reordered surfaces. These combined measurements were used to characterize the nature of the reordered surface. The general applicability of the PICS technique for investigations of surface and near surface regions is discussed
Directory of Open Access Journals (Sweden)
Jose Manuel Diaz-Sarachaga
2018-02-01
Full Text Available The unstoppable world population growth is increasing the concentration of people in urban settlements and the number of megacities, especially in developing countries, where urbanization exacerbates social and economic inequalities. Green rating systems have been launched during the last decades to facilitate the assessment of sustainable development in terms of building and infrastructure, including the evaluation of sustainable urban development through the study of communities. This article assesses two of the most renowned sustainable rating systems through the prism of economy, environment and society, and the international actions undertaken toward the promotion of sustainable development worldwide, in order to determine their effectiveness for assessing urban development in poorer nations. Hence, Leadership in Energy and Environmental Design for Neighbourhood Development (LEED-ND) and Envision, both from the United States, were chosen as representatives of the building and infrastructure fields, respectively, while the Sustainable Development Goals (SDGs) and the New Urban Agenda (Habitat III) were the benchmarks selected to define the sustainability aspects required to evaluate their potential application in less developed countries. The absence of metrics in the New Urban Agenda led to relating its commitments to the SDGs, which revealed that the prerequisites and credits included in LEED-ND and Envision focus mainly on managerial and environmental aspects and disregard the economic and social dimensions. Consequently, the premises under which LEED-ND and Envision were developed must be updated and complemented with the two latest guidelines recently adopted by the United Nations in the field of urban and sustainable development.
2011-01-01
Background Interest is growing in physical activity-friendly community designs, but few tests exist of communities explicitly designed to be walkable. We test whether students living in a new urbanist community that is also a pilot LEED-ND (Leadership in Energy and Environmental Design-Neighborhood Development) community have greater accelerometer-measured moderate-to-vigorous physical activity (MVPA) across particular time periods compared to students from other communities. We test various time/place periods to see if the data best conform to one of three explanations for MVPA. Environmental effects suggest that MVPA occurs when individuals are exposed to activity-friendly settings; selection effects suggest that walkable community residents prefer MVPA, which leads to both their choice of a walkable community and their high levels of MVPA; catalyst effects occur when walking to school creates more MVPA, beyond the school commute, on schooldays but not weekends. Methods Fifth graders (n = 187) were sampled from two schools representing three communities: (1) a walkable community, Daybreak, designed with new urbanist and LEED-ND pilot design standards; (2) a mixed community (where students lived in a less walkable community but attended the walkable school so that part of the route to school was walkable), and (3) a less walkable community. Selection threats were addressed through controlling for parental preferences for their child to walk to school as well as comparing in-school MVPA for the walkable and mixed groups. Results Minutes of MVPA were tested with 3 × 2 (Community by Gender) analyses of covariance (ANCOVAs). Community walkability related to more MVPA during the half hour before and after school and, among boys only, more MVPA after school. Boys were more active than girls, except during the half hour after school. Students from the mixed and walkable communities--who attended the same school--had similar in-school MVPA levels, and community groups
Statistical analysis of surgical pathology data using the R program.
Cuff, Justin; Higgins, John P T
2012-05-01
An understanding of statistics is essential for analysis of many types of data including data sets typically reported in surgical pathology research papers. Fortunately, a relatively small number of statistical tests apply to data relevant to surgical pathologists. An understanding of when to apply these tests would greatly benefit surgical pathologists who read and/or write papers. In this review, we show how the publicly available statistical program R can be used to analyze recently published surgical pathology papers to replicate the p-values and survival curves presented in these papers. Areas covered include: T-test, chi-square and Fisher exact tests of proportionality, Kaplan-Meier survival curves, the log rank test, and Cox proportional hazards.
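One of the tests the review replicates with R, the Fisher exact test, can be computed directly from the hypergeometric distribution. The pure-Python sketch below uses hypothetical counts, not data from any of the cited papers.

```python
# Two-sided Fisher exact test for a 2x2 table [[a, b], [c, d]],
# summing hypergeometric probabilities of tables at least as extreme
# as the observed one. Counts are illustrative only.
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher exact p-value for a 2x2 contingency table."""
    r1, r2, c1, n = a + b, c + d, a + c, a + b + c + d

    def table_prob(x):  # probability that the top-left cell equals x
        return comb(r1, x) * comb(r2, c1 - x) / comb(n, c1)

    p_obs = table_prob(a)
    lo, hi = max(0, c1 - r2), min(r1, c1)
    return sum(table_prob(x) for x in range(lo, hi + 1)
               if table_prob(x) <= p_obs + 1e-12)

# Hypothetical staining-result-by-diagnosis counts
p = fisher_exact_2x2(8, 2, 1, 5)
# p ≈ 0.035: a significant association at the 5% level for this toy table
```

The chi-square test would be the large-sample alternative; Fisher's test is preferred when expected cell counts are small, as here.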
Computation of the Percentage Points of the Chi-Square Distribution
1977-04-01
Learning Word Embeddings with Chi-Square Weights for Healthcare Tweet Classification
Directory of Open Access Journals (Sweden)
Sicong Kuang
2017-08-01
Full Text Available Twitter is a popular source for monitoring healthcare information and public disease. However, tweets contain much noise: even when appropriate keywords appear in a tweet, they do not guarantee that it is truly health-related, so traditional keyword-based classification is largely ineffective. Word embedding algorithms have proved useful in many natural language processing (NLP) tasks. We introduce two algorithms based on an existing word embedding learning algorithm, the continuous bag-of-words model (CBOW), and apply them to the task of recognizing healthcare-related tweets. In the CBOW model, the vector representation of words is learned from their contexts. To simplify the computation, the context is represented by an average of all words inside the context window. However, not all words in the context window contribute equally to the prediction of the target word: greedily incorporating all of them limits the contribution of the useful semantic words and brings noisy or irrelevant words into the learning process. Existing weighted CBOW models base their weights on pre-defined syntactic rules while ignoring the task for which the embedding is learned. We propose learning weights based on the words' relative importance in the classification task; the intuition is that such learned weights place more emphasis on words that contribute more to the later task. We evaluate the embeddings learned from our algorithms on two healthcare-related datasets. The experimental results demonstrate that embeddings learned from the proposed algorithms outperform existing techniques by a relative accuracy improvement of over 9%.
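The chi-square weighting idea can be sketched as scoring each word by the chi-square statistic of its presence-by-class contingency table, so that discriminative words receive larger weights. The scoring formula below is the standard 2x2 shortcut; the toy counts, and the exact mapping from scores to CBOW weights, are assumptions rather than the paper's method.

```python
# Chi-square relevance score for a word in binary tweet classification.
# n11 = health tweets containing the word, n10 = health tweets without it,
# n01 = other tweets containing it, n00 = other tweets without it.

def chi_square_word(n11, n10, n01, n00):
    """Chi-square statistic of the word's 2x2 presence-by-class table."""
    n = n11 + n10 + n01 + n00
    num = n * (n11 * n00 - n10 * n01) ** 2
    den = (n11 + n01) * (n11 + n10) * (n10 + n00) * (n01 + n00)
    return num / den if den else 0.0

# Hypothetical counts: 'flu' in 40/50 health tweets, 5/50 other tweets
score_flu = chi_square_word(40, 10, 5, 45)      # large: discriminative
# 'today' split evenly across classes: uninformative
score_today = chi_square_word(25, 25, 25, 25)   # exactly 0.0
```

Normalizing such scores within each context window would give the per-word weights used in place of the uniform average of plain CBOW.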
Results from the Cryogenic Dark Matter Search Using a Chi Squared Analysis
Energy Technology Data Exchange (ETDEWEB)
Sander, Joel [UC, Santa Barbara
2007-12-01
Most of the mass-energy density of the universe remains undetected and is only understood through its effects on visible, baryonic matter. Visible, baryonic matter accounts for only about half of a percent of the universe's total mass-energy budget, while the remainder remains dark, or undetected. About a quarter of the dark mass-energy density of the universe is comprised of massive particles that do not interact via the strong or electromagnetic forces. If these particles interact via the weak force, they are termed weakly interacting massive particles, or WIMPs, and their interactions with baryonic matter could be detectable. The CDMS II experiment attempts to detect WIMP interactions in the Soudan Underground Laboratory using germanium and silicon detectors. A WIMP can interact with a detector nucleus, causing the nucleus to recoil. A nuclear recoil is distinguished from background electron recoils by comparing the deposited ionization and phonon energies. Electron recoils occurring near detector surfaces are more difficult to reject. This thesis describes the results of a χ2 analysis designed to reject events occurring near detector surfaces. Because no WIMP signal was observed, separate limits on the WIMP cross section are set with the germanium and silicon detectors under standard astrophysical assumptions.
Directory of Open Access Journals (Sweden)
Sheard Laura
2004-04-01
Full Text Available Abstract Background Heroin is a semi-synthetic opioid with an extensive illicit market, leading to large numbers of people becoming addicted. Heroin users often present to community treatment services requesting detoxification, and in the UK various agents are used to control symptoms of withdrawal. Dissatisfaction with methadone detoxification has led to the use of clonidine, lofexidine, buprenorphine and dihydrocodeine; however, there remains limited evaluative research. In Leeds, a city of 700,000 people in the North of England, dihydrocodeine is the detoxification agent of choice. Sublingual buprenorphine, however, is being introduced. The comparative value of these two drugs for helping people successfully and comfortably withdraw from heroin has never been assessed in a randomised trial. Additionally, there is a paucity of research evaluating interventions among drug users in the primary care setting. This study seeks to address this by randomising drug users presenting in primary care to receive either dihydrocodeine or buprenorphine. Methods/design The Leeds Evaluation of Efficacy of Detoxification Study (LEEDS) project is a pragmatic randomised trial which will compare the open use of buprenorphine with dihydrocodeine for illicit opiate detoxification in the UK primary care setting. The LEEDS project will involve consenting adults and will be run in specialist general practice surgeries throughout Leeds. The primary outcome will be the result of urine opiate screening at the end of the detoxification regimen. Adverse effects and limited follow-up data to three and six months will also be acquired.
National Research Council Canada - National Science Library
Thomas, Benjamin J
2008-01-01
.... Taking this vision into account, the individual credits that comprise LEED are designed to reward design teams for employing sustainable design strategies that reduce the total environmental impact...
Which statistics should tropical biologists learn?
Loaiza Velásquez, Natalia; González Lutz, María Isabel; Monge-Nájera, Julián
2011-09-01
Tropical biologists study the richest and most endangered biodiversity on the planet, and in these times of climate change and mega-extinctions, the need for efficient, good-quality research is more pressing than in the past. However, the statistical component in research published by tropical authors sometimes suffers from poor-quality data collection, mediocre or bad experimental design, and a rigid and outdated view of data analysis. To suggest improvements in their statistical education, we listed all the statistical tests and other quantitative analyses used in two leading tropical journals, the Revista de Biología Tropical and Biotropica, during one year. The 12 most frequent tests in the articles were: Analysis of Variance (ANOVA), Chi-Square Test, Student's T Test, Linear Regression, Pearson's Correlation Coefficient, Mann-Whitney U Test, Kruskal-Wallis Test, Shannon's Diversity Index, Tukey's Test, Cluster Analysis, Spearman's Rank Correlation Test and Principal Component Analysis. We conclude that statistical education for tropical biologists must abandon the old syllabus based on the mathematical side of statistics and concentrate on the correct selection of these and other procedures and tests, on their biological interpretation, and on the use of reliable and friendly freeware. Their time will be better spent understanding and protecting tropical ecosystems than trying to learn the mathematical foundations of statistics: in most cases, a well-designed one-semester course should be enough for their basic requirements.
Directory of Open Access Journals (Sweden)
Razieh Nilforooshan
2013-10-01
Full Text Available This paper presents ongoing research aimed at investigating the efficacy of computer animations in improving college students' learning of building sustainability concepts and practices. The use of animations in educational contexts is not new; however, scientific evidence that supports their effectiveness as educational materials is still limited. This paper reports an experiment that explored the impact of an educational digital animation, called "LEED-ERS", on college students' learning of the Leadership in Energy and Environmental Design (LEED) rating system. Specifically, the animation focused on the LEED category of Sustainable Sites. Results of a study with 68 students show that viewing the animation led to an increase in subjects' declarative knowledge of 15%. Compared to traditional learning methods (e.g., reading assignments with static images), viewing the animation led to significantly higher declarative knowledge gains.
International Nuclear Information System (INIS)
Koestner, R.J.
1982-08-01
There have only been a few Low Energy Electron Diffraction (LEED) intensity analyses carried out to determine the structure of molecules adsorbed on metal surfaces; most surface crystallography studies have concentrated on the structure of clean unreconstructed or atomic-adsorbate-covered transition metal faces. The few molecular adsorption systems already investigated by dynamical LEED are CO on Ni(100), Cu(100) and Pd(100), as well as C2H2 and C2H4 adsorbed on Pt(111). The emphasis of this thesis research has been to extend the applicability of LEED crystallography to the more complicated unit cells found in molecular overlayers on transition metals or in the reconstructed surfaces of clean transition metals.
Academic Reading Strategies used by Leeds Metropolitan University Graduates: A Case Study
Directory of Open Access Journals (Sweden)
Samira Sohail
2015-12-01
Full Text Available Academic reading is different from other forms of reading because it is complex and discipline-specific. It involves a measured, challenging, and multifaceted process in which students are dynamically engaged with a range of reading strategies. Academic reading improvement is possible, provided students work on it; there are no short cuts or remedies which will cure reading problems. Reading improvement is hard work and a difficult task, but it is rewarding as well. This study examined the selection and use of academic reading strategies by the undergraduate and postgraduate students studying at Leeds Metropolitan University, Headingley Campus, Leeds. A quantitative study was carried out to investigate three aspects of academic reading strategies: (a) efficiency, (b) interacting with texts, and (c) critical reading strategies. The results of this survey suggest that the participants on balance have proficient reading skills, but a significant number of participants have ineffective reading strategies and bad reading habits. Recommendations and suggestions have been put forward to improve academic reading strategies and for further research.
Rodrigues, David; Prada, Marília; Gaspar, Rui; Garrido, Margarida V; Lopes, Diniz
2018-02-01
The use of emoticons and emoji is increasingly popular across a variety of new platforms of online communication. They have also become popular as stimulus materials in scientific research. However, the assumption that emoji/emoticon users' interpretations always correspond to the developers'/researchers' intended meanings might be misleading. This article presents subjective norms of emoji and emoticons provided by everyday users. The Lisbon Emoji and Emoticon Database (LEED) comprises 238 stimuli: 85 emoticons and 153 emoji (collected from iOS, Android, Facebook, and Emojipedia). The sample included 505 Portuguese participants recruited online. Each participant evaluated a random subset of 20 stimuli on seven dimensions: aesthetic appeal, familiarity, visual complexity, concreteness, valence, arousal, and meaningfulness. Participants were additionally asked to attribute a meaning to each stimulus. The norms obtained include quantitative descriptive results (means, standard deviations, and confidence intervals) and a meaning analysis for each stimulus. We also examined the correlations between the dimensions and tested for differences between emoticons and emoji, as well as between the two major operating systems, Android and iOS. The LEED constitutes a readily available normative database (available at www.osf.io/nua4x ) with potential applications to different research domains.
Characterization of Si(112) and In/Si(112) studied by SPA-LEED
Energy Technology Data Exchange (ETDEWEB)
Hoecker, Jan; Speckmann, Moritz; Schmidt, Thomas; Falta, Jens [Institute of Solid State Physics, University of Bremen, 28359 Bremen (Germany)
2010-07-01
High-index surfaces are of strong interest in today's research because of the possibility of growing low-dimensional structures. It has, for instance, already been shown that the adsorption of Ga can induce the formation of 1D metal chains on Si(112) (cf. Snijders et al., PRB 72, 2005). In this work we investigated the clean Si(112) surface and the adsorption of In on Si(112), to establish an analogy to Ga/Si(112), using spot profile analyzing low energy electron diffraction (SPA-LEED). By means of reciprocal space mapping we determined the bare Si(112) surface to be decomposed into alternating (5 5 12) and (111) facets in the [1 anti 10] direction, with (2 x 1) and (7 x 7) reconstructions, respectively (cf. Baski et al., Surf. Sci. 392, 1997). With SPA-LEED we were able to observe the decreasing intensity of the facet spots in situ while depositing In on Si(112), revealing the smoothening of the surface due to the In deposition. At saturation coverage we found a (3.x x 1) reconstruction, where x depends on the deposition temperature and changes from x=7 at 400 °C to x=5 at 500 °C. This leads us to the assumption that the reconstruction is not incommensurate but a mixture of (3 x 1) and (4 x 1) building blocks, which is very similar to the superstructure of Ga on Si(112).
International Nuclear Information System (INIS)
Burnell, S.A.; Hamilton, D.J.
1984-01-01
A set of specialised phantoms for the non-invasive testing of the image intensifiers and associated television equipment fitted to fluoroscopic x-ray machines has been developed in the Medical Physics Department of the University of Leeds. The Radiation Control Section of the South Australian Health Commission has acquired a set of the Leeds Test Tools for use in its program of inspecting and testing diagnostic x-ray equipment. The tools and their use are described, and some preliminary results for South Australia are given.
Energy Technology Data Exchange (ETDEWEB)
Guo, Genliang; George, S.A.; Lindsey, R.P.
1997-08-01
Thirty-six sets of surface lineaments and fractures mapped from satellite images and/or aerial photos from parts of the Mid-continent and Colorado Plateau regions were collected, digitized, and statistically analyzed in order to obtain the probability distribution functions of natural fractures for characterizing naturally fractured reservoirs. The orientations and lengths of the surface linear features were calculated using the digitized coordinates of the two end points of each individual linear feature. The spacing data of the surface linear features within an individual set were obtained using a new analytical sampling technique. Statistical analyses were then performed to find the best-fit probability distribution functions for the orientation, length, and spacing of each data set. Twenty-five hypothesized probability distribution functions were fitted to each data set. A chi-square goodness-of-fit test was used to rank the significance of each fit; the distribution providing the lowest chi-square goodness-of-fit value was considered the best-fit distribution. The orientations of surface linear features were best fitted by triangular, normal, or logistic distributions; the lengths were best fitted by Pearson VI, Pearson V, lognormal2, or extreme-value distributions; and the spacing data were best fitted by lognormal2, Pearson VI, or lognormal distributions. These probability functions can be used to stochastically characterize naturally fractured reservoirs.
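The ranking step described above, computing a Pearson chi-square statistic for each candidate distribution and taking the lowest value as the best fit, can be sketched as follows. This is an illustrative reconstruction, not the study's FREQFIT code: the sample data, bin edges, and the two candidate distributions are placeholders.

```python
import math
import random

def normal_cdf(x, mu, sigma):
    # CDF of N(mu, sigma^2) via the error function
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def uniform_cdf(x, a, b):
    return min(1.0, max(0.0, (x - a) / (b - a)))

def chi_square_stat(data, cdf, edges):
    # Pearson chi-square: sum of (observed - expected)^2 / expected over bins
    n = len(data)
    stat = 0.0
    for lo, hi in zip(edges, edges[1:]):
        observed = sum(1 for x in data if lo <= x < hi)
        expected = n * (cdf(hi) - cdf(lo))
        if expected > 0:
            stat += (observed - expected) ** 2 / expected
    return stat

# Placeholder "orientation" sample in degrees; the study used digitized lineament data.
random.seed(1)
sample = [random.gauss(45.0, 10.0) for _ in range(500)]
edges = [0, 20, 30, 40, 50, 60, 70, 100]

mu = sum(sample) / len(sample)
sigma = (sum((x - mu) ** 2 for x in sample) / (len(sample) - 1)) ** 0.5
candidates = {
    "normal": lambda x: normal_cdf(x, mu, sigma),
    "uniform": lambda x: uniform_cdf(x, min(sample), max(sample)),
}

# Rank candidates: the lowest chi-square value is taken as the best-fit distribution.
ranked = sorted(candidates, key=lambda name: chi_square_stat(sample, candidates[name], edges))
best_fit = ranked[0]
```

The same loop extends to the 25 hypothesized distributions of the study by adding entries to `candidates`.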
Energy Technology Data Exchange (ETDEWEB)
Zerres, Eberhard
2011-07-01
Good planning is essential for achieving certification in the LEED platinum category. In the construction of a new administration building in Ratingen (Federal Republic of Germany), many details were considered, down to the use of ecologically sound building materials, and these details served that goal very directly.
Energy Technology Data Exchange (ETDEWEB)
Paetz, Christian
2012-07-01
In the western Canadian city of Victoria, a new residential and commercial quarter is rising on the site of a closed-down harbour. The ambitious goal: all 26 buildings of this project shall achieve LEED certification at the highest level, platinum.
On the contraction of the W(001)-(1x1) surface using LEED intensity analysis
International Nuclear Information System (INIS)
Read, M.N.; Russell, G.J.
1979-01-01
Three recent independent attempts at determining the W(001)-(1x1) surface structure by LEED beam intensity analysis have yielded contractions of the topmost layer spacing of 6±6%, 11±2%, and 4.4±3% normal to the surface plane. The authors investigate possible reasons for the discrepancies by comparing the published experimental and theoretical profiles of these workers as well as their own. Their main conclusions are that the direct comparison of experimental data from different investigators shows deviations comparable to the changes in the calculated profiles for various surface contractions, and that the deviations between intensity profiles calculated using different (but still realistic) assumed scattering potentials are likewise comparable to the changes in the calculated profiles for various surface contractions. (Auth.)
Directory of Open Access Journals (Sweden)
S. Rick Fedrizzi
2014-06-01
Full Text Available This paper outlines the key aspects of sustainability in the building sector, focusing on the LEED® certification system as a "universal" support tool for the construction, management, and assessment of sustainable buildings. The first part of the paper describes the rapid diffusion of LEED certification in the recent past as a direct consequence of this rating tool's ability to adapt both to specific building types and to the climatic and morphological diversity of sites. The second part presents and analyses the economic and financial aspects of sustainable buildings, with reference both to the applicable valuation methodologies and to data from the literature. Starting from international experience with sustainability, it then describes the Italian situation, highlighting market perception and future development opportunities.
The UPD of copper on Pt(100): a first quantitative structure determination by LEED
Aberdam, D.; Gauthier, Y.; Durand, R.; Faure, R.
1994-04-01
The adsorption of copper on platinum, obtained by electrochemical underpotential deposition (UPD), is a complex phenomenon. As observed by cyclic voltammetry, the underpotential is not limited to a narrow, well defined range of electrochemical potentials, but is spread out over a wide range of potentials. On the Pt(100) surface, a lack of reversibility of adsorption and desorption occurs, and a gradual change in the voltammogram shape takes place under potential cycling. In this paper, we describe a first "ex situ" low energy electron diffraction (LEED) structure investigation of that system, for a copper coverage of about 2/3 of a monolayer. The main result is that copper clusters into 2D islands with p(1 × 1) structure and a density of one Cu per Pt atom.
“THE LEEDS IDEA”: AN HISTORICAL ACCOUNT OF THE SPONDARTHRITIS CONCEPT
Directory of Open Access Journals (Sweden)
J.M.H. Moll
2011-09-01
Full Text Available SUMMARY In the 1960s, Professor Verna Wright became increasingly interested in possible relationships between certain seronegative “variants of rheumatoid arthritis”, as they were then generally known. At the Rheumatism Research Unit, a department within the division of medicine at Leeds University, he gathered around him a succession of research workers, whom he inspired to study aspects of these relationships. The focus was on family studies, as it was thought that genetic factors could be important. The striking association previously noted between sacroiliitis or full-blown ankylosing spondylitis and several of the disorders to be studied - e.g., psoriatic arthritis, ulcerative colitis, and the arthritis associated with Crohn’s disease - was to be central for each of these studies. As a provisional collective name for these possibly related conditions, the term “spondarthritides” was chosen. These were the days before HLA B27, and so the research tools were simply clinical, radiological (for sacroiliitis) and serological (for rheumatoid factor). The research programme confirmed not only links between the primary disorders and ankylosing spondylitis, but also links between the disorders themselves. Over subsequent years, the spondarthritis concept (dubbed by some “The Leeds Idea”) has gained further strength from HLA studies internationally, and membership of the group of conditions fulfilling spondarthritis criteria has grown substantially. It is hoped that this now consolidated framework of spondylitis-related entities will pave the way for further research, with exciting prospects of gene-based prevention and/or cure through the increasing sophistication of molecular biology. Key words: Seronegative spondarthritides, psoriatic arthritis, ankylosing spondylitis
Statistical power of likelihood ratio and Wald tests in latent class models with covariates.
Gudicha, Dereje W; Schmittmann, Verena D; Vermunt, Jeroen K
2017-10-01
This paper discusses power and sample-size computation for likelihood ratio and Wald testing of the significance of covariate effects in latent class models. For both tests, asymptotic distributions can be used; that is, the test statistic can be assumed to follow a central Chi-square under the null hypothesis and a non-central Chi-square under the alternative hypothesis. Power or sample-size computation using these asymptotic distributions requires specification of the non-centrality parameter, which in practice is rarely known. We show how to calculate this non-centrality parameter using a large simulated data set from the model under the alternative hypothesis. A simulation study is conducted evaluating the adequacy of the proposed power analysis methods, determining the key study design factor affecting the power level, and comparing the performance of the likelihood ratio and Wald test. The proposed power analysis methods turn out to perform very well for a broad range of conditions. Moreover, apart from effect size and sample size, an important factor affecting the power is the class separation, implying that when class separation is low, rather large sample sizes are needed to achieve a reasonable power level.
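The power computation the abstract describes reduces to evaluating the tail of a noncentral chi-square distribution at the central chi-square critical value. A minimal Monte Carlo sketch follows; the noncentrality value 7.85 is a textbook choice giving roughly 80% power at one degree of freedom, not a number from the paper.

```python
import random

CHI2_CRIT_1DF_05 = 3.841  # 95th percentile of a central chi-square with 1 df

def ncx2_tail(lam, df, crit, reps=200_000, seed=7):
    # Monte Carlo estimate of P(X > crit) for X ~ noncentral chi-square(df, lam).
    # X is generated as (Z1 + sqrt(lam))^2 + Z2^2 + ... + Zdf^2 with Zi ~ N(0, 1).
    rng = random.Random(seed)
    hits = 0
    for _ in range(reps):
        x = (rng.gauss(0.0, 1.0) + lam ** 0.5) ** 2
        for _ in range(df - 1):
            x += rng.gauss(0.0, 1.0) ** 2
        if x > crit:
            hits += 1
    return hits / reps

# With lam = 0 the tail probability is just the test's alpha level (~0.05 here).
alpha = ncx2_tail(0.0, 1, CHI2_CRIT_1DF_05)
# lam = 7.85 is the classic noncentrality yielding ~80% power at df = 1, alpha = .05.
power = ncx2_tail(7.85, 1, CHI2_CRIT_1DF_05)
```

In the paper's approach, `lam` would be estimated from the test statistic computed on a large simulated data set under the alternative hypothesis, then plugged into a tail evaluation like this one.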
Appleton, K; House, A; Dowell, A
1998-03-01
The past seven years have seen rapid changes in general practice in the United Kingdom (UK), commencing with the 1990 contract. During the same period, concern about the health and morale of general practitioners (GPs) has increased and a recruitment crisis has developed. To determine levels of psychological symptoms, job satisfaction, and subjective ill health in GPs and their relationship to practice characteristics, and to compare levels of job satisfaction since the introduction of the 1990 GP contract with those found before 1990. Postal questionnaire survey of all GP principals on the Leeds Health Authority list. The main outcome measures included quantitative measures of practice characteristics, job satisfaction, mental health (General Health Questionnaire), and general physical health. Qualitative statements about work conditions, job satisfaction, and mental health were collected. A total of 285/406 GPs (70%) returned the questionnaires. One hundred and forty-eight (52%) scored 3 or more on the General Health Questionnaire (GHQ-12), which indicates a high level of psychological symptoms. One hundred and sixty GPs (56%) felt that work had affected their recent physical health. Significant associations were found between GHQ-12 scores, total job satisfaction scores, and GPs' perceptions that work had affected their physical health. Problems with physical and mental health were associated with several aspects of workload, including list size, number of sessions worked per week, amount of time spent on call, and use of deputizing services. In the qualitative part of the survey, GPs reported overwork and excessive hours, paperwork and administration, recent National Health Service (NHS) changes, and the 1990 GP contract as the most stressful aspects of their work. Fifty-two per cent of GPs in Leeds who responded showed high levels of psychological symptoms. Job satisfaction was lower than in a national survey conducted in 1987, and GPs expressed the least
National Research Council Canada - National Science Library
Thomas, Benjamin J
2008-01-01
The Leadership in Energy and Environmental Design (LEED) Building Assessment System is a performance-based tool for determining the environmental impact of a facility from the whole-building perspective...
Structural and electronic analysis of Hf on Si(1 1 1) surface studied by XPS, LEED and XPD
Energy Technology Data Exchange (ETDEWEB)
Carazzolle, M.F. [Experimentelle Physik 1, Universitaet Dortmund, Otto-Hahn-Str. 4, D44221 Dortmund (Germany); Instituto de Fisica, Universidade Estadual de Campinas, C.P. 6165, 13083-970 Campinas, SP (Brazil)], E-mail: mcarazzo@ifi.unicamp.br; Schuermann, M.; Fluechter, C.R.; Weier, D. [Experimentelle Physik 1, Universitaet Dortmund, Otto-Hahn-Str. 4, D44221 Dortmund (Germany); Berges, U. [Experimentelle Physik 1, Universitaet Dortmund, Otto-Hahn-Str. 4, D44221 Dortmund (Germany); DELTA, Universitaet Dortmund, Maria-Goeppert-Mayer-Str. 2, D44227 Dortmund (Germany); Siervo, A. de [Laboratorio Nacional de Luz Sincrotron, C.P. 6192, 13084-971 Campinas, SP (Brazil); Landers, R. [Instituto de Fisica, Universidade Estadual de Campinas, C.P. 6165, 13083-970 Campinas, SP (Brazil); Laboratorio Nacional de Luz Sincrotron, C.P. 6192, 13084-971 Campinas, SP (Brazil); Kleiman, G.G. [Instituto de Fisica, Universidade Estadual de Campinas, C.P. 6165, 13083-970 Campinas, SP (Brazil); Westphal, C. [Experimentelle Physik 1, Universitaet Dortmund, Otto-Hahn-Str. 4, D44221 Dortmund (Germany); DELTA, Universitaet Dortmund, Maria-Goeppert-Mayer-Str. 2, D44227 Dortmund (Germany)
2007-05-15
In this work, we present a systematic electronic and structural study of Hf-silicide formation upon annealing on the Si(1 1 1) surface. The electronic structure and surface composition were determined by X-ray photoelectron spectroscopy (XPS) and angle-resolved X-ray photoelectron spectroscopy (ARXPS). To determine the atomic structure of the surface alloy we used low energy electron diffraction (LEED) and X-ray photoelectron diffraction (XPD). It was possible to verify that, after annealing at 600 °C, there is alloy formation, and that after 700 °C the Hf diffusion process is predominant. Using LEED and XPD measurements we detected ordered island formation occurring simultaneously with alloy formation.
Analysis of non-spherical grid geometry for distortion-free LEED apparatus with micro channel plate
International Nuclear Information System (INIS)
Okano, Tatsuo; Ohsaki, Akihiko; Sakurai, Makoto; Honda, Tohru; Tuzi, Yutaka
1985-01-01
A design of a non-spherical grid structure for a distortion-free LEED apparatus with a micro channel plate (MCP) is described. The grid structure is treated as an interface between two electrostatic potentials. The potential interface refracts the diffracted electrons so that the LEED patterns can be projected onto the MCP just like those observed on a spherical fluorescent screen. The shape of the potential interface is described by a differential equation and numerically calculated for several conditions. The most appropriate geometry is determined by the ease of mechanical construction. The effect of the energy distribution of the diffracted electrons is numerically estimated, and the resulting deviation is shown to be negligibly small for most applications. (author)
Directory of Open Access Journals (Sweden)
Maxwell Grady
2018-02-01
Full Text Available PLEASE, the Python Low-energy Electron Analysis SuitE, provides an open source and cross-platform graphical user interface (GUI for rapid analysis and visualization of low energy electron microscopy (LEEM data sets. LEEM and the associated technique, selected area micro-spot low energy electron diffraction (μ-LEED, are powerful tools for analysis of the surface structure for many novel materials. Specifically, these tools are uniquely suited for the characterization of two-dimensional materials. PLEASE offers a user-friendly point-and-click method for extracting intensity-voltage curves from LEEM and LEED data sets. Analysis of these curves provides insight into the atomic structure of the target material surface with unparalleled resolution.
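The core point-and-click operation described above, integrating intensity in a small window of each image in the energy stack to build an I(V) curve, can be illustrated with NumPy. This is a conceptual sketch on synthetic data, not the PLEASE API; the array shapes and function name are assumptions.

```python
import numpy as np

# Hypothetical LEEM-IV data set: one image per beam energy (synthetic, not real LEEM data).
energies = np.linspace(4.0, 20.0, 9)          # beam energies in eV
rng = np.random.default_rng(0)
stack = rng.random((energies.size, 64, 64))   # (energy, row, col) image stack

def extract_iv(stack, row, col, half_window=2):
    # Average the intensity in a small square window around the selected pixel
    # at every energy, yielding one I(V) curve for that surface region.
    patch = stack[:, row - half_window:row + half_window + 1,
                     col - half_window:col + half_window + 1]
    return patch.mean(axis=(1, 2))

iv_curve = extract_iv(stack, 32, 32)          # one intensity value per energy
```

Comparing such curves against dynamical LEED calculations is what links the measured intensities to the atomic structure of the surface.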
2006-01-01
[Tabulated military construction project records, partially recoverable: entries list project description, installation (e.g. Fort McPherson, Fort Gillem), project manager, project status, district SSD POC, and sustainability ratings such as SPiRiT (Actual): 61 (Gold) and LEED (Estimated): 31 (Silver).]
Directory of Open Access Journals (Sweden)
Bibby John
2010-05-01
Full Text Available Abstract Background The National Institute for Health Research (NIHR) has funded nine Collaborations for Leadership in Applied Health Research and Care (CLAHRCs). Each CLAHRC is a partnership between higher education institutions (HEIs) and the NHS in nine UK regional health economies. The CLAHRC for Leeds, York, and Bradford comprises two 'research themes' and three 'implementation themes.' One of these implementation themes is Translating Research into Practice in Leeds and Bradford (TRiPLaB). TRiPLaB aims to develop, implement, and evaluate methods for inducing and sustaining the uptake of research knowledge into practice in order to improve the quality of health services for the people of Leeds and Bradford. Methods TRiPLaB is built around a three-stage, sequential approach using separate, longitudinal case studies conducted with collaborating NHS organisations. TRiPLaB will select robust innovations to implement, conduct a theory-informed exploration of the local context using a variety of data collection and analytic methods, and synthesise the information collected to identify the key factors influencing the uptake and adoption of the targeted innovations. This synthesis will inform the development of tailored, multifaceted interventions designed to increase the translation of research findings into practice. Mixed research methods, including time series analysis, quasi-experimental comparison, and qualitative process evaluation, will be used to evaluate the impact of the implementation strategies deployed. Conclusion TRiPLaB is a theory-informed, systematic, mixed-methods approach to developing and evaluating tailored implementation strategies aimed at increasing the translation of research-based findings into practice in one UK health economy. Through active collaboration with its local NHS, TRiPLaB aims to improve the quality of health services for the people of Leeds and Bradford and to contribute to research knowledge regarding the
Using Mathematica to build Non-parametric Statistical Tables
Directory of Open Access Journals (Sweden)
Gloria Perez Sainz de Rozas
2003-01-01
Full Text Available In this paper, I present computational procedures to obtain statistical tables: the tables of the asymptotic distribution and the exact distribution of the Kolmogorov-Smirnov statistic Dn for one population, the table of the distribution of the runs statistic R, the table of the distribution of the Wilcoxon signed-rank statistic W+, and the table of the distribution of the Mann-Whitney statistic Ux, using Mathematica, Version 3.9 under Windows 98. This is an interesting question because many statistical packages give only the asymptotic significance level in statistical tests, and with these procedures one can easily calculate the exact significance levels and the left-tail and right-tail probabilities of non-parametric distributions. I have used Mathematica to make these calculations because one can use its symbolic language to solve recursion relations. It is very easy to generate the format of the tables, and it is possible to obtain any table of the mentioned non-parametric distributions to any precision, not only for the standard parameters most used in statistics, and without transcription mistakes. Furthermore, using similar procedures, we can generate tables for the following distribution functions: Binomial, Poisson, Hypergeometric, Normal, Chi-square, T-Student, F-Snedecor, Geometric, Gamma and Beta.
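As an illustration of the recursion-based approach, the exact null distribution of the Wilcoxon signed-rank statistic W+ can be built from a counting recursion over subset sums. This is a Python sketch of the same kind of computation the paper performs in Mathematica, not the paper's own code.

```python
def wilcoxon_counts(n):
    # c[w] = number of subsets of the ranks {1, ..., n} whose elements sum to w;
    # under H0 every subset of signs is equally likely, so c gives the exact
    # distribution of W+ once divided by 2**n.
    max_w = n * (n + 1) // 2
    c = [1] + [0] * max_w
    for i in range(1, n + 1):
        for w in range(max_w, i - 1, -1):   # descend so each rank is used at most once
            c[w] += c[w - i]
    return c

def wilcoxon_right_tail(n, w):
    # Exact right-tail probability P(W+ >= w) under the null hypothesis
    c = wilcoxon_counts(n)
    return sum(c[w:]) / 2 ** n
```

For n = 3 the counts over w = 0..6 are [1, 1, 1, 2, 1, 1, 1], so the exact right-tail probability P(W+ >= 6) is 1/8 = 0.125, with no asymptotic approximation involved; this is exactly the kind of exact significance level the tables provide.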
LEED, Its Efficacy and Fallacy in a Regional Context—An Urban Heat Island Case in California
Directory of Open Access Journals (Sweden)
Min Ho Shin
2017-09-01
Full Text Available The use of energy in the building sector has increased rapidly over the past two decades. Accordingly, various building assessment methods have been developed in green building practice. However, questions remain as to how positively green buildings affect their regional surroundings. This study investigates the possible relationship between LEED-certified buildings and the urban heat island effect. Using GIS with spatial regression, the study found that constructing a LEED building could lower the temperature of the surrounding environment within a 30-m boundary by 0.35 °C. A higher certification level, such as Gold or Platinum, increased the lowering effect to 0.48 °C, while a lower certification level, such as Certified or Silver, had a lowering effect of 0.26 °C. Although LEED has gained a substantial amount of interest and skepticism at the same time, the study results could be a potential sign that the Sustainable Sites credits or energy-efficient materials play a positive role in lowering the temperature.
Al-Ghamdi, Sami G; Bilec, Melissa M
2015-04-07
This research investigates the relationship between energy use, geographic location, life cycle environmental impacts, and Leadership in Energy and Environmental Design (LEED). The researchers studied worldwide variations in building energy use and associated life cycle impacts in relation to the LEED rating systems. A Building Information Model (BIM) of a reference 43,000 ft² office building was developed and situated in 400 locations worldwide while making relevant changes to the energy model to meet reference codes, such as ASHRAE 90.1. Life cycle environmental and human health impacts from the buildings' energy consumption were then calculated. The results revealed considerable variations between U.S. and international sites (ranging from 394 to 911 tons CO2 equiv). The variations indicate that location-specific results, when paired with life cycle assessment, can be an effective means to achieve a better understanding of the possible adverse environmental impacts of building energy consumption in the context of green building rating systems. Looking at these factors in combination and using a systems approach may allow rating systems like LEED to continue to drive market transformation toward sustainable development, while taking into consideration both energy sources and building efficiency.
Directory of Open Access Journals (Sweden)
Carolin Schneider
2013-12-01
Full Text Available The Language Centre at the University of Leeds concentrates on the full range of language training and preparation courses, both for pre-sessional and for current university students. These courses relate both to the learning of English and of foreign languages. The Self-Access Area constitutes the Language Centre’s resource library for language learning materials and supports learners on Language Centre and other modern language courses, as well as independent language learners from across the university. Catering for approximately 11,000 users, the Self-Access Area opens, on average, for 46 hours per week, with evening and Saturday opening times during term time and exam weeks. Among the services that the Self-Access Area provides are a wide range of language learning resources in print and various audiovisual formats, induction tours, an up-to-date online library catalogue and a social media presence. As part of the Language Centre, the Self-Access Area team is connected with staff and students across the university. The service also offers a range of opportunities which encourage human interaction both amongst language learners and between learners and specialists. It also acts as a flexible social and study space. The main initiatives or valuable ‘accessories’ for language learning to be discussed in this brief overview are: Language learning advising; Language exchange; and Conversation sessions
Titanium dioxide surfaces and interfaces studied using ESDIAD, LEED and STM
Cocks, I D
1998-01-01
...resolved into two contributions: H atoms bonded at the oxide substrate, and the rupture of the C-H bonds of the acetate. It is proposed that acetates are bridge bonded with five-fold coordinated Ti4+ ions, with their molecular plane perpendicular to the surface. Decomposition of acetate at room temperature occurs under electron beam radiation, desorbing CH2CO and CH3/CH4. Adsorption of benzoic acid at the TiO2(110) surface is dissociative, forming benzoate and surface hydroxyls. Adsorbed benzoate is bonded with the five-fold coordinated Ti4+ cations, forming a pseudo (2x1) overlayer at a saturation coverage of 0.5 ML. Attractive interactions between benzoate aromatic rings lead to the formation of dimerised benzoate rows along the [001] direction. TiO2 surfaces have been studied by electron stimulated desorption ion angular distribution (ESDIAD), low energy electron diffraction (LEED) and scanning tunnelling microscopy (STM). The TiO2(100) surface was stu...
Reynolds, E H; Broussolle, E
2018-02-01
It is well-established that Guillaume-Benjamin-Amand Duchenne de Boulogne (1806-1875) and Jean-Martin Charcot (1825-1893) were the founding fathers of Parisian and French neurology during the second half of the 19th century, although much more is known about Charcot than about his "master" Duchenne. In Britain, Thomas Clifford Allbutt (1836-1925) was Leeds' most distinguished physician of the 19th century, eventually becoming Regius Professor of Physic at Cambridge. Allbutt's 1860-1861 year of postgraduate study in Paris and his friendship with Duchenne profoundly influenced his own contributions to nervous system and mental diseases, partly in collaboration with his colleague James Crichton-Browne (1840-1938) at the nearby West Riding Lunatic Asylum in Wakefield, Yorkshire. The present report briefly recalls the careers of Duchenne and Allbutt, and also presents a unique account by Allbutt of Duchenne in action at the height of his powers, investigating and defining the previously uncharted field of neuromuscular diseases with the aid of his localized electrization techniques. This account is discussed in relation to Duchenne's personality and pioneering neurological achievements, the origins of French neurology, and the development of Anglo-French neurological relationships during the 19th century. Interestingly, both Duchenne and Crichton-Browne separately made important and much-appreciated contributions to the third major book by Charles Darwin (1809-1882), The Expression of the Emotions in Man and Animals, published in 1872.
Carbon Footprint of Housing in the Leeds City Region - A Best Practice Scenario Analysis
Energy Technology Data Exchange (ETDEWEB)
Barrett, John; Dawkins, Elena (Stockholm Environment Inst. (Sweden))|(Univ. of York, Heslington, York YO10 5DD (United Kingdom))
2008-06-15
The Stockholm Environment Institute (SEI) was commissioned by the Environment Agency to carry out a carbon footprint analysis of the housing sector, using the Leeds City Region (LCR) as an example. The aim was to determine the housing sector's ability to meet the 80 per cent by 2050 carbon reduction challenge. The study relates specifically to LCR, but its findings will help planning and development teams make the right decisions and gain the resources necessary to meet carbon budgets at regional and local levels. With a growing population and an additional 263,000 housing units to be built within LCR by 2026, the housing sector would need to reduce its expected total carbon dioxide emissions by 38 million tonnes between 2010 and 2026 to be on track for 80 per cent savings in 2050. The report outlines the most detailed analysis to date of the measures required to deliver a growth-based regional housing strategy alongside reducing carbon emissions. If the city region's new and existing housing is to attain the levels of energy efficiency necessary to deliver these carbon savings, big changes will be required in the way we build, maintain and run our homes over the next 20 years. There are pockets of good practice already in the region, and the study shows that by combining innovative measures on construction standards, improvements to existing housing, low and zero carbon technologies and changing behaviour of householders, LCR can achieve the necessary savings to meet its carbon budget.
Strategic energy planning within local authorities in the UK: A study of the city of Leeds
International Nuclear Information System (INIS)
Bale, Catherine S.E.; Foxon, Timothy J.; Hannon, Matthew J.; Gale, William F.
2012-01-01
This paper considers the development of a strategic energy body in a local authority in the UK and looks at the perceived need for, and possible roles of, such a body. Historically, energy provision and management has not usually been a strategic priority for UK local authorities. Yet energy considerations are implicit in key local authority responsibilities such as transport, waste management, planning, and the provision of housing services. In addition, recent UK central government policies support the move to localism and provide incentives for low-carbon energy generation. A study was undertaken to assess the potential (including both the perceived benefits and actual capacity to deliver) for Leeds City Council to develop a strategic body to execute delivery of city-level energy decision-making. We examine the perceived benefits to a range of main stakeholders, using data drawn from interviews with managers responsible for low-carbon and renewable energy projects across the city. Through participant observation we explore the capacity of a local authority to deliver a strategic energy body, and we briefly examine the possible forms of delivery. We conclude with recommendations for national policy that would enable the development of strategic energy bodies across local governments in the UK. - Highlights: ► Strategic energy planning is currently not a priority for UK local authorities. ► We present an empirical study of strategic energy planning in local authorities. ► Results from stakeholder interviews suggest support for a strategic energy body. ► We identify the capacity barriers to implementing a strategic energy body. ► We make recommendations for ways forward and support needed from national policy.
The Leeds food preference questionnaire after mild sleep restriction - A small feasibility study.
Leenaars, Cathalijn H C; Zant, Janneke C; Aussems, Audrey; Faatz, Vivian; Snackers, Daphne; Kalsbeek, Andries
2016-02-01
Besides the increased sedentary lifestyle and increased caloric intake, changes in dietary composition may play an important role in the increased prevalence of obesity. Because inadequate sleep could be a risk factor in the aetiology of obesity, reliable methods for assessing food intake and food choice after sleep restriction are needed. We translated the Leeds food preference questionnaire (LFPQ), addressing preferences for sweet/savoury tastes and low-fat/high-fat foods, into Dutch, and tested it in 15 mildly sleep-restricted psychology students. The participants completed the LFPQ in our laboratory on two separate occasions, approximately one week apart. Sleep on the preceding night was not controlled, but mild sleep restriction was confirmed by a short sleep latency test (sSLT) or a short maintenance of wakefulness test (sMWT). Each participant completed the sSLT and sMWT once, just before the LFPQ, in a cross-over design randomised for the first test. Differences were present in preferences for food items from the different categories (sweet/savoury and low-fat/high-fat). The choice frequencies for the various food categories were comparable on both occasions (p=0.27), as were the choice frequencies for individual items. The LFPQ is easily implemented under mildly sleep-restricted conditions, and translation is straightforward. Future studies using the LFPQ after sleep restriction could elucidate whether restricting sleep for longer periods affects food choice, which could underlie increases in obesity risk.
Do we need statistics when we have linguistics?
Directory of Open Access Journals (Sweden)
Cantos Gómez Pascual
2002-01-01
Full Text Available Statistics is known to be a quantitative approach to research. However, most of the research done in the fields of language and linguistics is of a different kind, namely qualitative. Succinctly, qualitative analysis differs from quantitative analysis in that in the former no attempt is made to assign frequencies, percentages and the like to the linguistic features found or identified in the data. In quantitative research, linguistic features are classified and counted, and even more complex statistical models are constructed in order to explain these observed facts. In qualitative research, however, we use the data only for identifying and describing features of language usage and for providing real occurrences/examples of particular phenomena. In this paper, we shall try to show how quantitative methods and statistical techniques can supplement qualitative analyses of language. We shall attempt to present some mathematical and statistical properties of natural languages, and introduce some of the quantitative methods which are of the most value in working empirically with texts and corpora, illustrating the various issues with numerous examples and moving from the most basic descriptive techniques (frequency counts and percentages) to decision-taking techniques (chi-square and z-score) and to more sophisticated statistical language models (Type-Token/Lemma-Token/Lemma-Type formulae, cluster analysis and discriminant function analysis).
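As a minimal illustration of the decision-taking techniques mentioned above, a chi-square test can compare a word's frequency across two corpora. The counts below are invented for the sketch, not drawn from the paper:

```python
from scipy.stats import chi2_contingency

# Hypothetical counts: occurrences of one target word vs. all other
# tokens in two 100,000-token corpora (illustrative values only).
table = [[120, 99880],   # corpus A: target word, other tokens
         [45, 99955]]    # corpus B: target word, other tokens

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4g}")
```

A small p-value here would suggest the word's relative frequency genuinely differs between the two corpora rather than by sampling accident.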
Directory of Open Access Journals (Sweden)
Walaa S. E. Ismaee
2016-07-01
Full Text Available The paper discusses the recent introduction of the LEED system into the Italian context in order to assess its role in promoting a sustainable building process there, pointing out its potentials on the one hand and its gaps and limitations on the other, and suggests means for its future development. The study discusses the application of LEED as a ‘Sustainable Project management tool’ to guide sustainable building performance. This requires investigating its structure, tools and assessment criteria, along with its benchmarks and references. It also discusses the application of LEED as a ‘Sustainable building Certification and market tool’, investigating the role and value of the LEED certification in the Italian green market. The research method comprises three parts. The first part is a comparative analysis of LEED categories against Italian national initiatives for sustainability. The comparison showed that most LEED categories are already mandated by national norms and directives, but they may differ in stringency, creating some areas where the LEED system takes precedence as well as some drawbacks; this streamlines the adaptation of the LEED system to the Italian context. The second part investigates LEED projects’ market analysis. The results showed that the shift towards a sustainable building process is occurring slowly and on a vertical scale, focusing on some building sectors rather than others. Its market diffusion in the Italian context faces challenges regarding the insufficient availability of green materials and products satisfying its requirements, as well as the high soft cost of the sustainability tests and expertise required. The third part presents a practical review, citing the methodology and results of a survey conducted by the researchers in mid-2012, composed of a web-based questionnaire and interviews among a sample of LEED professionals in Italy. The result shows that the LEED system needs
Energy Technology Data Exchange (ETDEWEB)
Fontolan, Juliana A.; Biral, Antonio Renato P., E-mail: fontolanjuliana@gmail.com.br, E-mail: biral@ceb.unicamp.br [Hospital das Clinicas (CEB/UNICAMP), Campinas, SP (Brazil). Centro de Engenharia Biomedica
2013-07-01
It is known that the distribution of time intervals between random, unrelated events follows the Poisson distribution. This work aims to study the distribution of time intervals between events resulting from the radioactive decay of atoms present in environments at UNICAMP where activities involving the use of ionizing radiation are performed. The proposal is to survey the interval distributions of these events at different locations across the university using a Geiger-Mueller tube. In a next step, the distributions obtained are evaluated using non-parametric statistics (chi-square and Kolmogorov-Smirnov tests). For analyses involving correlations we intend to use the ANOVA (Analysis of Variance) statistical tool. Measurements were performed at six different places within the Campinas campus, using a Geiger-Mueller counter in count mode with a time window of 20 seconds. Using the chi-square and Kolmogorov-Smirnov tests in the EXCEL program, it was observed that the distributions do indeed follow a Poisson distribution. Finally, the next step is to perform the correlation analyses using the ANOVA statistical tool.
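The goodness-of-fit step described above can be sketched as follows, using simulated counts in place of the Geiger-Mueller data (the rate lam=4.0 and the sample size are assumptions for the sketch):

```python
import numpy as np
from scipy.stats import poisson, chisquare

rng = np.random.default_rng(0)
# Simulated 20-second-window counts standing in for Geiger-Mueller data
counts = rng.poisson(lam=4.0, size=500)

lam_hat = counts.mean()                        # estimate the Poisson rate
k = np.arange(11)                              # cells 0..9, with >=10 lumped
observed = np.bincount(np.minimum(counts, 10), minlength=11)
expected = poisson.pmf(k, lam_hat) * counts.size
expected[-1] = poisson.sf(9, lam_hat) * counts.size  # lump the right tail

# One parameter (lambda) was estimated from the data, hence ddof=1
stat, p = chisquare(observed, expected, ddof=1)
print(f"chi-square = {stat:.2f}, p = {p:.3f}")
```

Lumping the right tail keeps the expected cell frequencies away from zero, which the chi-square approximation requires.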
International Nuclear Information System (INIS)
Cowen, A.R.
1986-01-01
Over the preceding decade the Leeds Radiological Imaging Group have developed a range of test objects with which to assess the performance of radiological imaging systems. The types of imaging equipment which can be assessed include X-ray image intensifier television systems, small-format 100mm/105mm fluorography systems and radiographic screen-film combinations. We have recently extended our interest to the evaluation of digital radiological imaging equipment including digital subtraction fluorography and digital (greyscale) radiographic imaging systems. These test objects were initially developed for the purpose of evaluating imaging performance under laboratory conditions but they have also proved useful under field (clinical) conditions. (author)
Osteolisis tibial secundaria a un implante ligamentoso de Leeds-Keio: Presentación de un caso
Zafra, M.A.; Ballester, J.; Román, Manuel; Carpintero Benítez, Pedro
2002-01-01
Until a few years ago, the Leeds-Keio artificial ligament was used in many cases for the reconstruction of anterior cruciate ligament injuries. Today it is no longer used for this type of injury because of the poor medium- and long-term results observed. Nevertheless, its use is indicated in other injuries, such as reconstruction of the extensor mechanism of the knee and instabilities of the shoulder and spine. We present a case of massive osteolysis of the tibial plat...
Understanding Statistics - Cancer Statistics
Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.
International Nuclear Information System (INIS)
Kleijnen, J.P.C.; Helton, J.C.
1999-01-01
Procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses are described and illustrated. These procedures attempt to detect increasingly complex patterns in scatterplots and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. A sequence of example analyses with a large model for two-phase fluid flow illustrates how the individual procedures can differ in the variables that they identify as having effects on particular model outcomes. The example analyses indicate that the use of a sequence of procedures is a good analysis strategy and provides some assurance that an important effect is not overlooked
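The first procedures in the sequence above can be sketched on synthetic scatterplot data; the relationship below is invented for illustration and is not drawn from the two-phase flow model:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.uniform(0, 1, 200)
y = np.sin(3 * x) + rng.normal(0, 0.2, 200)   # nonlinear trend plus noise

r, p_r = stats.pearsonr(x, y)        # (i) linear relationship
rho, p_rho = stats.spearmanr(x, y)   # (ii) monotonic relationship

# (iii) trend in central tendency: Kruskal-Wallis across x-quintiles
bins = np.quantile(x, [0.2, 0.4, 0.6, 0.8])
groups = [y[np.digitize(x, bins) == g] for g in range(5)]
h, p_h = stats.kruskal(*groups)

print(f"Pearson r={r:.2f}, Spearman rho={rho:.2f}, Kruskal-Wallis H={h:.1f}")
```

As the paper notes, the procedures can disagree: a relationship the correlation coefficient misses may still show up as a strong trend in the quintile medians.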
Susanti, Yuliana; Zukhronah, Etik; Pratiwi, Hasih; Respatiwulan; Sri Sulistijowati, H.
2017-11-01
To achieve food resilience in Indonesia, food diversification by exploring the potential of local foods is required. Corn is an alternative staple food of Javanese society; for that reason, corn production needs to be improved by considering the influencing factors. CHAID (Chi-squared Automatic Interaction Detection) and CRT (Classification and Regression Trees) are data mining methods which can be used to classify the influencing variables. The present study seeks to dig up information on the potential local availability of corn in regencies and cities on Java Island. CHAID analysis yields four classifications with an accuracy of 78.8%, while CRT analysis yields seven classifications with an accuracy of 79.6%.
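A CRT-style classification can be sketched with a standard CART implementation; scikit-learn's `DecisionTreeClassifier` is used here as a stand-in for CRT, and the data and predictors are synthetic, not the study's regency-level variables:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
# Synthetic stand-ins for regional predictors (e.g. harvested area,
# rainfall, fertilizer use) and a binary availability class.
X = rng.normal(size=(200, 3))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, 200) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
print(f"classification accuracy: {tree.score(X_te, y_te):.2f}")
```

The accuracy figures reported by the study (78.8% and 79.6%) would be obtained this way, by scoring the fitted tree on held-out observations.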
Brickham, Dana M.
2012-01-01
People with alcohol abuse/dependence disabilities are often faced with a complex recovery process due to the exacerbating and chronic aspects of their condition. Vocational rehabilitation for people with alcohol abuse/dependence can help individuals access and maintain employment, and through employment can enhance physical and psychological…
Nikooienejad, Amir; Johnson, Valen E.
2018-01-01
Uniformly most powerful Bayesian tests (UMPBTs) are an objective class of Bayesian hypothesis tests that can be considered the Bayesian counterpart of classical uniformly most powerful tests. Unfortunately, UMPBTs have only been exposed for application in one-parameter exponential family models. The purpose of this article is to describe methodology for deriving UMPBTs for a larger class of tests. Specifically, we introduce sufficient conditions for the existence of UMPBTs and propose a unifi...
Single-layer ZnS supported on Au(111): A combined XPS, LEED, STM and DFT study
Deng, Xingyi; Sorescu, Dan C.; Lee, Junseok
2017-04-01
A single layer of ZnS, consisting of one atomic layer of the ZnS(111) plane, has been grown on Au(111) and characterized using X-ray photoelectron spectroscopy (XPS), low energy electron diffraction (LEED) and scanning tunneling microscopy (STM). While the LEED measurement indicates a coincidence structure of ZnS-(3×3)/Au(111)-(4×4), high resolution STM images reveal hexagonal unit cells of 6.7×6.7 Å² and 11.6×11.6 Å², corresponding to √3 and 3 times the unit cell of the ideal zincblende ZnS-(1×1), respectively, depending on the tunneling conditions. Calculations based on density functional theory (DFT) indicate a significantly reconstructed non-planar structure of the ZnS single layer on Au(111), with 2/3 of the S anions located nearly in the plane of the Zn cations and the remaining 1/3 of the S anions protruding above the Zn plane. The calculated STM image shows characteristics similar to those of the experimental STM image. Additionally, the DFT calculations reveal the different bonding nature of the S anions in the ZnS single layer supported on Au(111).
ADOÇÃO DA CERTIFICAÇÃO LEED EM MEIOS DE HOSPEDAGEM: ESVERDEANDO A HOTELARIA?
Directory of Open Access Journals (Sweden)
Mirna de Lima Medeiros
2012-03-01
Full Text Available The research intended to analyze the adoption process of the green certification “Leadership in Energy and Environmental Design” (LEED) by the hotel-sector establishments that have already adopted it. To this end, a bibliographical review was conducted, with secondary data gathered from journals, institutional sites and documents, and primary data gathered by means of semi-structured interviews carried out with people responsible at the certified hotels and at the entity responsible for the certification in Brazil (Green Building Council Brazil). There were 21 interviewees, 2 from GBC Brazil and 19 from lodging establishments (31% of those certified). For data analysis, the content analysis technique was used with the aid of the ATLAS.ti software. The results permitted identification of the chronology of the certification processes and the profile of the hotel categories that adopt the LEED program. Beyond that, the interviews enabled discussion of the initial motivations for seeking the certification, as well as the advantages and obstacles perceived regarding its adoption.
Energy Technology Data Exchange (ETDEWEB)
Sabapathy, Ashwin; Ragavan, Santhosh K.V.; Vijendra, Mahima; Nataraja, Anjana G. [Enzen Global Solutions Pvt Ltd, 90, Hosur Road, Madiwala, Bangalore 560 068 (India)
2010-11-15
This paper provides a summary of an energy benchmarking study that uses performance data of a sample of Information Technology facilities in Bangalore. Information provided by the sample of occupiers was used to develop an Energy Performance Index (EPI) and an Annual Average hourly Energy Performance Index (AAhEPI), which takes into account the variations in operation hours and days for these facilities. The EPI and AAhEPI were modelled to identify the factors that influence energy efficiency. Employment density, size of facility, operating hours per week, type of chiller and age of facility were found to be significant factors in regression models with EPI and AAhEPI as dependent variables. Employment density, size of facility and operating hours per week were standardised and used in a separate regression analysis. Parameter estimates from this regression were used to normalize the EPI and AAhEPI for variance in the independent variables. Three benchmark ranges - the bottom third, middle third and top third - were developed for the two normalised indices. The normalised EPI and AAhEPI of LEED rated building, which were also part of the sample, indicate that, on average, LEED rated buildings outperform the other buildings. (author)
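The normalization step, removing the modelled effect of the standardized drivers from the index, can be sketched as follows; all coefficients and data are invented for illustration and are not the study's estimates:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 60
# Hypothetical standardized drivers: employment density, facility size,
# weekly operating hours (as z-scores), plus a noisy EPI-like index.
X = rng.normal(size=(n, 3))
epi = 120 + X @ np.array([15.0, -8.0, 10.0]) + rng.normal(0, 5, n)

# Ordinary least squares with an intercept column
A = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(A, epi, rcond=None)

# Normalize: subtract the modelled effect of the drivers, keeping the mean
epi_norm = epi - X @ beta[1:]
print(f"raw spread {epi.std():.1f}, normalized spread {epi_norm.std():.1f}")
```

After normalization the remaining spread reflects differences in efficiency rather than differences in occupancy, size, or operating hours, which is what makes the bottom/middle/top-third benchmark ranges comparable across facilities.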
Feiten, F.; Kuhlenbeck, H.; Freund, H.
2016-01-01
The (0001) surface of vanadium sesquioxide, V2O3, is terminated by vanadyl groups under standard ultra high vacuum preparation conditions. Reduction with electrons results in a chemically highly active surface with a well-defined LEED pattern indicating a high degree of order. In this work we report the first quantitative structure determination of a reduced V2O3(0001) surface. We identify two distinct surface phases by STM, one well ordered and one less well ordered. I/V-LEED shows the order...
Potential errors and misuse of statistics in studies on leakage in endodontics.
Lucena, C; Lopez, J M; Pulgar, R; Abalos, C; Valderrama, M J
2013-04-01
To assess the quality of the statistical methodology used in studies of leakage in Endodontics, and to compare the results found using appropriate versus inappropriate inferential statistical methods. The search strategy used the descriptors 'root filling' 'microleakage', 'dye penetration', 'dye leakage', 'polymicrobial leakage' and 'fluid filtration' for the time interval 2001-2010 in journals within the categories 'Dentistry, Oral Surgery and Medicine' and 'Materials Science, Biomaterials' of the Journal Citation Report. All retrieved articles were reviewed to find potential pitfalls in statistical methodology that may be encountered during study design, data management or data analysis. The database included 209 papers. In all the studies reviewed, the statistical methods used were appropriate for the category attributed to the outcome variable, but in 41% of the cases, the chi-square test or parametric methods were inappropriately selected subsequently. In 2% of the papers, no statistical test was used. In 99% of cases, a statistically 'significant' or 'not significant' effect was reported as a main finding, whilst only 1% also presented an estimation of the magnitude of the effect. When the appropriate statistical methods were applied in the studies with originally inappropriate data analysis, the conclusions changed in 19% of the cases. Statistical deficiencies in leakage studies may affect their results and interpretation and might be one of the reasons for the poor agreement amongst the reported findings. Therefore, more effort should be made to standardize statistical methodology. © 2012 International Endodontic Journal.
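One common guard against the inappropriate selection of the chi-square test that the paper describes is to check the expected cell counts first and fall back to an exact test when they are small; the leakage counts below are invented for the sketch:

```python
import numpy as np
from scipy.stats import chi2_contingency, fisher_exact

# Hypothetical 2x2 outcome (leak / no leak) for two root-filling materials
table = np.array([[1, 19],
                  [7, 13]])

chi2, p, dof, expected = chi2_contingency(table)
if expected.min() < 5:
    # Small expected counts: the chi-square approximation is unreliable,
    # so use Fisher's exact test instead.
    odds, p = fisher_exact(table)
    print(f"Fisher's exact test: p = {p:.3f}")
else:
    print(f"Chi-square test: p = {p:.3f}")
```

The "expected count of at least 5 per cell" rule of thumb is exactly the kind of check whose omission the review found in a large share of the leakage studies.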
A simulation study for comparing testing statistics in response-adaptive randomization.
Gu, Xuemin; Lee, J Jack
2010-06-05
Response-adaptive randomization is able to assign more patients in a comparative clinical trial to the tentatively better treatment. However, due to the adaptation in patient allocation, the samples to be compared are no longer independent. At large sample sizes, many asymptotic properties of test statistics derived for independent-sample comparison are still applicable in adaptive randomization, provided that the patient allocation ratio converges to an appropriate target asymptotically. However, the small-sample properties of commonly used test statistics in response-adaptive randomization are not fully studied. Simulations are systematically conducted to characterize the statistical properties of eight test statistics in six response-adaptive randomization methods at six allocation targets with sample sizes ranging from 20 to 200. Since adaptive randomization is usually not recommended for sample sizes of less than 30, the present paper focuses on the case with a sample of 30 to give general recommendations regarding test statistics for contingency tables in response-adaptive randomization at small sample sizes. Among all asymptotic test statistics, Cook's correction to the chi-square test (TMC) is the best at attaining the nominal size of the hypothesis test. The Williams correction to the log-likelihood ratio test (TML) gives a slightly inflated type I error and higher power as compared with TMC, but it is more robust against imbalance in patient allocation. TMC and TML are usually the two test statistics with the highest power in different simulation scenarios. When focusing on TMC and TML, the generalized drop-the-loser urn (GDL) and sequential estimation-adjusted urn (SEU) have the best ability to attain the correct size of the hypothesis test, respectively. Among all sequential methods that can target different allocation ratios, GDL has the lowest variation and the highest overall power at all allocation ratios. The performance of different adaptive randomization
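The kind of small-sample behaviour discussed above can be probed by simulation. The sketch below estimates the empirical type I error of the Yates-corrected chi-square test for a balanced trial of 30 patients; the allocation scheme and trial count are illustrative, not the paper's simulation design:

```python
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(2)
n, p_true, alpha = 30, 0.5, 0.05
trials, rejections = 2000, 0

# Simulate equal-allocation trials with identical success rates in both
# arms and count how often the corrected chi-square test rejects H0.
for _ in range(trials):
    a = rng.binomial(n // 2, p_true)
    b = rng.binomial(n - n // 2, p_true)
    table = np.array([[a, n // 2 - a], [b, n - n // 2 - b]])
    if (table.sum(axis=0) == 0).any():
        continue  # degenerate table, test undefined
    _, p, _, _ = chi2_contingency(table)
    rejections += p < alpha

print(f"empirical type I error ~ {rejections / trials:.3f}")
```

For response-adaptive designs the binomial draws above would be replaced by the urn scheme under study (e.g. GDL or SEU), which is exactly how the paper's comparison of test statistics proceeds.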
Statistical Analysis And Treatment Of Accident Black Spots: A Case Study Of Nandyal Mandal
Sudharshan Reddy, B.; Vishnu Vardhan Reddy, L.; Sreenivasa Reddy, G., Dr
2017-08-01
Background: Increased economic activity has raised consumption levels across the country, creating scope for growth in travel and transportation. The increase in vehicles over the last 10 years has put a lot of pressure on the existing roads, ultimately resulting in road accidents. Nandyal Mandal is located in the Kurnool district of Andhra Pradesh and is well developed in both the agricultural and industrial sectors after Kurnool. The 567 accidents that occurred at 143 locations over the last seven years show the severity of the accident problem in the Nandyal Mandal, and there is a need to improve the accident black spots to reduce accidents. Methods: Accident data for the last seven years (2010-2016) were collected from police stations. The Weighted Severity Index (WSI), a scientific method, was used to identify the accident black spots. Statistical analysis of the collected data was carried out using the chi-square test to determine the independence of accidents from other attributes, and a chi-square goodness-of-fit test was conducted to test whether the accidents occur by chance or follow a pattern. Results: WSI values were determined for the 143 locations, and the locations with high WSI values were treated as accident black spots. Five black spots were taken up for field study; after field observations and interaction with the public, improvements were suggested for these black spots. There is no relationship between the severity of accidents and attributes such as month, season, day, hour of day or age group, except for type of vehicle. Road accidents are distributed throughout the year, across months and seasons, but are not distributed uniformly throughout the day.
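A chi-square goodness-of-fit test of the kind described, here asking whether the 567 accidents are uniformly distributed across seasons, can be sketched as follows. The seasonal split below is invented for illustration; only the total matches the study:

```python
from scipy.stats import chisquare

# Hypothetical accident counts by season (totalling the study's 567)
observed = [150, 139, 132, 146]

# H0: accidents are uniform across the four seasons
stat, p = chisquare(observed)
print(f"chi-square = {stat:.2f}, p = {p:.3f}")
```

A large p-value means uniformity cannot be rejected, i.e. accidents occur throughout the year rather than clustering in one season, which matches the study's conclusion for months and seasons.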
Czech Academy of Sciences Publication Activity Database
Romanyuk, Olexandr; Hattori, K.; Someta, M.; Daimon, H.
2014-01-01
Vol. 90, No. 15 (2014), "155305-1"-"155305-9". ISSN 1098-0121. Grant - others: AVČR(CZ) M100101201; Murata Science Foundation(JP) Project No. 00295. Institutional support: RVO:68378271. Keywords: iron silicide * LEED I-V * DFT * STM * surface reconstruction * surface states. Subject RIV: BM - Solid Matter Physics; Magnetism. Impact factor: 3.736, year: 2014
National Research Council Canada - National Science Library
Carpenter, Deanna S
2005-01-01
1 CD-ROM; 4 3/4 in.; 484 KB. Abstract: This research study focused on determining the effects that the two major contract delivery methods had on the LEED score of projects over the design and construction time horizon...
Energy Provisions of the ICC-700, LEED for Homes, and ENERGY STAR Mapped to the 2009 IECC
Energy Technology Data Exchange (ETDEWEB)
Britt, Michelle L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sullivan, Robin S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kora, Angela R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Makela, Eric J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Makela, Erin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)
2011-05-01
This document provides the results of a comparison of building energy efficient elements of the ICC-700 National Green Building Standard, LEED for Homes, and ENERGY STAR versions 2, 2.5, and 3.0 to the 2009 International Energy Conservation Code (2009 IECC). This comparison will provide a tool for states and local municipalities as they consider adoption of these programs. The comparison is presented in a series of appendices. The first appendix provides a summary chart that visually represents the comprehensive comparison of the programs to the 2009 IECC topic areas. Next there are a series of individual tables (one appendix for each program) that include the specific program mapping to the 2009 IECC elements with comments that briefly discuss how well the elements mapped. Finally, a comprehensive table is included that shows all five of the programs mapped to the 2009 IECC elements to allow a detailed comparison.
Directory of Open Access Journals (Sweden)
Ian Law
2013-04-01
Full Text Available Despite increasing understanding of, information about and official commitment to challenge these patterns, racist hostility and violence continue to have an enduring presence in urban and rural life in the UK. This indicates the paradoxical nature of this racial crisis and challenges for antiracism as a political project. This paper charts how these issues play out at the local level through an examination of a five year process from problem identification through to research, response, action and aftermath from 2006 to 2012 in the city of Leeds, UK, with a focus on two predominantly white working class social housing estates in the city. We explore how embedded tensions and antagonisms can begin to be challenged, while examining how the contemporary climate of austerity and cuts in services, together with prevailing post-racial thinking, make the likelihood of such concerted action in the UK increasingly remote.
Directory of Open Access Journals (Sweden)
Murat Kucukvar
2016-01-01
Full Text Available The current waste management literature lacks a comprehensive LCA of the recycling of construction materials that considers both process and supply chain-related impacts as a whole. Furthermore, no previous work has addressed an optimization-based decision support framework that provides a quantifiable understanding of the potential savings and implications associated with recycling construction materials from a life cycle perspective. The aim of this research is to present a multi-criteria optimization model developed to propose economically sound and environmentally benign construction waste management strategies for a LEED-certified university building. First, an economic input-output-based hybrid life cycle assessment model is built to quantify the total environmental impacts of various waste management options: recycling, conventional landfilling and incineration. After quantifying the net environmental pressures associated with these waste treatment alternatives, a compromise programming model is utilized to determine the optimal recycling strategy, considering environmental and economic impacts simultaneously. The analysis results show that recycling of ferrous and non-ferrous metals contributed significantly to reductions in the total carbon footprint of waste management. On the other hand, recycling of asphalt and concrete increased the overall carbon footprint due to high fuel consumption and emissions during the crushing process. Based on the multi-criteria optimization results, 100% recycling of ferrous and non-ferrous metals, cardboard, plastic and glass is suggested to maximize the environmental and economic savings simultaneously. We believe that the results of this research will facilitate better decision making in treating construction and demolition waste for LEED-certified green buildings by combining the results of environmental LCA with multi-objective optimization modeling.
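A multi-criteria optimization step of this kind can be sketched as a weighted-sum linear program over recycling fractions; all coefficients below are invented for illustration and are not the paper's LCA results:

```python
import numpy as np
from scipy.optimize import linprog

# Toy trade-off: choose recycling fractions x in [0, 1] for three material
# streams (metals, cardboard, asphalt/concrete), balancing carbon vs. cost.
carbon = np.array([-5.0, -3.0, 1.5])   # kg CO2e per kg recycled (neg = saving)
cost = np.array([0.10, 0.05, 0.20])    # cost per kg recycled

# Normalize both objectives and minimize a weighted sum (w sets the trade-off)
w = 0.5
c = w * carbon / np.abs(carbon).max() + (1 - w) * cost / cost.max()
res = linprog(c, bounds=[(0, 1)] * 3)
print("optimal recycling fractions:", res.x)
```

With these illustrative coefficients the optimum recycles the first two streams fully and leaves the third at zero, mirroring the paper's finding that metal recycling cuts the carbon footprint while asphalt/concrete recycling increases it.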
DEFF Research Database (Denmark)
Moore, Rod; Bering, Peter
2017-01-01
Abstract Background: Survey quality, in particular sampling, coverage, and issues of representativity, is important for valid and reliable conclusions from epidemiological data. Dental anxiety (DA) still challenges dental clinicians since it is synonymous with care avoidance. Accurate estimates of DA are important for public health. Aims were to 1) assess demographic representativity (age/gender) of a 2013-14 web survey and a 1992-93 telephone survey about DA in Danish adults aged 16-80 yr using government statistics; 2) assess DA frequency and characteristics from web survey data (N = 701); and 3) compare web results with 1993 results. Method: The Dental Anxiety Scale (DAS) measured DA, while other items revealed gender, age, education, dentist avoidance, and three types of negative dentist behaviors. Analyses used frequencies, chi-square, odds ratios (OR) and ANOVAs. Results: Samples from
Cosmic Statistics of Statistics
Szapudi, I.; Colombi, S.; Bernardeau, F.
1999-01-01
The errors on statistics measured in finite galaxy catalogs are exhaustively investigated. The theory of errors on factorial moments by Szapudi & Colombi (1996) is applied to cumulants via a series expansion method. All results are subsequently extended to the weakly non-linear regime. Together with previous investigations this yields an analytic theory of the errors for moments and connected moments of counts in cells from highly nonlinear to weakly nonlinear scales. The final analytic formu...
Improving the Crossing-SIBTEST Statistic for Detecting Non-uniform DIF.
Chalmers, R Philip
2017-08-22
This paper demonstrates that, after applying a simple modification to Li and Stout's (Psychometrika 61(4):647-677, 1996) CSIBTEST statistic, an improved variant of the statistic could be realized. It is shown that this modified version of CSIBTEST has a more direct association with the SIBTEST statistic presented by Shealy and Stout (Psychometrika 58(2):159-194, 1993). In particular, the asymptotic sampling distributions and general interpretation of the effect size estimates are the same for SIBTEST and the new CSIBTEST. Given the more natural connection to SIBTEST, it is shown that Li and Stout's hypothesis testing approach is insufficient for CSIBTEST; thus, an improved hypothesis testing procedure is required. Based on the presented arguments, a new chi-squared-based hypothesis testing approach is proposed for the modified CSIBTEST statistic. Positive results from a modest Monte Carlo simulation study strongly suggest the original CSIBTEST procedure and randomization hypothesis testing approach should be replaced by the modified statistic and hypothesis testing method.
International Nuclear Information System (INIS)
Lim, Gyeong Hui
2008-03-01
This book consists of 15 chapters covering: basic concepts and the meaning of statistical thermodynamics; Maxwell-Boltzmann statistics; ensembles; thermodynamic functions and fluctuations; statistical dynamics of independent-particle systems; ideal molecular systems; chemical equilibrium and chemical reaction rates in ideal gas mixtures; classical statistical thermodynamics; the ideal lattice model; lattice statistics and non-ideal lattice models; imperfect gas theory and the theory of liquids; the theory of solutions; statistical thermodynamics of interfaces; statistical thermodynamics of macromolecular systems; and quantum statistics.
South, J; Kime, NH
2008-01-01
The rise in childhood obesity is a major public health challenge and a national priority for health action. Obesity is associated with many illnesses and is directly related to increased mortality and lower life expectancy. The Children’s Plan recognises child obesity as one of the most serious challenges for children and links it to a number of poor outcomes, physical, social and psychological (Department for Children, Schools and Families 2007). ‘Can’t wait to be healthy’- Leeds Childhood O...
STATISTICS, Program System for Statistical Analysis of Experimental Data
International Nuclear Information System (INIS)
Helmreich, F.
1991-01-01
1 - Description of problem or function: The package is composed of 83 routines, the most important of which are the following: BINDTR: Binomial distribution; HYPDTR: Hypergeometric distribution; POIDTR: Poisson distribution; GAMDTR: Gamma distribution; BETADTR: Beta-1 and Beta-2 distributions; NORDTR: Normal distribution; CHIDTR: Chi-square distribution; STUDTR : Distribution of 'Student's T'; FISDTR: Distribution of F; EXPDTR: Exponential distribution; WEIDTR: Weibull distribution; FRAKTIL: Calculation of the fractiles of the normal, chi-square, Student's, and F distributions; VARVGL: Test for equality of variance for several sample observations; ANPAST: Kolmogorov-Smirnov test and chi-square test of goodness of fit; MULIRE: Multiple linear regression analysis for a dependent variable and a set of independent variables; STPRG: Performs a stepwise multiple linear regression analysis for a dependent variable and a set of independent variables. At each step, the variable entered into the regression equation is the one which has the greatest amount of variance between it and the dependent variable. Any independent variable can be forced into or deleted from the regression equation, irrespective of its contribution to the equation. LTEST: Tests the hypotheses of linearity of the data. SPRANK: Calculates the Spearman rank correlation coefficient. 2 - Method of solution: VARVGL: The Bartlett's Test, the Cochran's Test and the Hartley's Test are performed in the program. MULIRE: The Gauss-Jordan method is used in the solution of the normal equations. STPRG: The abbreviated Doolittle method is used to (1) determine variables to enter into the regression, and (2) complete regression coefficient calculation. 3 - Restrictions on the complexity of the problem: VARVGL: The Hartley's Test is only performed if the sample observations are all of the same size
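The FRAKTIL routine's fractile calculations correspond to inverse-CDF (`ppf`) lookups in modern libraries; a brief sketch for the four distributions the routine covers:

```python
from scipy import stats

# 95% fractiles of the normal, chi-square, Student's t and F distributions,
# analogous to the package's FRAKTIL routine (degrees of freedom are examples).
q = 0.95
print(stats.norm.ppf(q))               # standard normal
print(stats.chi2.ppf(q, df=10))        # chi-square, 10 df
print(stats.t.ppf(q, df=10))           # Student's t, 10 df
print(stats.f.ppf(q, dfn=5, dfd=10))   # F, 5 and 10 df
```

The same `stats` module also provides the distribution functions (BINDTR, POIDTR, NORDTR, etc.) and goodness-of-fit tests (ANPAST's Kolmogorov-Smirnov and chi-square) that the package implements.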
Directory of Open Access Journals (Sweden)
Stephen Verderber
2011-03-01
Full Text Available Hurricane Katrina displaced nearly one million citizens from the New Orleans metro region in 2005. Five years after the catastrophe, in August of 2010, more than 150,000 citizens remained scattered across the United States; Katrina was the largest diaspora in the nation’s history. Homes damaged or destroyed by Katrina’s devastation numbered more than 125,000. An award-winning case study is presented of a unique partnership forged between academia, a local social service agency, professional architectural and engineering firms, and a national humanitarian aid organization whose mission is to provide affordable housing for homeless persons in transition. This collaboration resulted in a sustainable design/build project that originated in a research-based university design studio. The facility is a 38-bed family shelter for homeless mothers and their children seeking to rebuild their lives in post-Katrina New Orleans. The site for this 4,400 facility did not flood when the city’s federally built levee system failed in 2005. This case study is presented from inception through programming and design, construction, occupancy, and the post-occupancy assessment of the completed building. This facility is the first LEED-certified (Silver) building in New Orleans. Project limitations, lessons learned, and recommendations for future initiatives of this type are discussed, particularly in the context of any inner urban community coping with the aftermath of an urban disaster.
Grady, Maxwell; Diaconescu, Bogdan; Valovcin, Darren; Hagelberg, Frank; Pohl, Karsten
2013-03-01
Graphene has aroused tremendous interest due to its remarkable electronic and mechanical properties. Graphene's optical properties, conductance, and the fact that it can be transferred to many substrates make it an ideal candidate for use in nanoelectronic and organic photoelectric devices. The lack of a bandgap, however, poses a serious challenge for implementing graphene as a material for electrical switches, and therefore creative ways of inducing this bandgap are needed. We present an STM/LEED/DFT study of the single-layer graphene on Ru(0001) system in the presence of hydrogen. Structural studies show arrays of moiré superlattices with sizes ranging from 0.9 to 3.0 nm in the presence of hydrogen on the compact surface of ruthenium. First-principles calculations help explain the appearance of these arrays of graphene reconstructions driven by the presence of H at the Ru(0001) interface and, furthermore, predict the appearance of a bandgap with values correlated with the moiré superstructure sizes in the presence of hydrogen. Control over moiré superstructure size can aid future work using graphene as a nanotemplate for self-assembled growth of nanoelectronic devices and organic photovoltaics. This work was supported by the Nanoscale Science and Engineering Center for High-rate Nanomanufacturing (NSF NSEC-425826) and NSF DMR-1006863.
Bedi, K K; Hakeem, A R; Dave, R; Lewington, A; Sanfey, H; Ahmad, N
2015-03-01
The shortage of organ donors is the key rate-limiting factor for organ transplantation in the United Kingdom. Many strategies have been proposed to increase donation; one aims to improve awareness of organ donation and transplantation (ODT) among medical students. This survey investigates the knowledge, perceptions, and attitudes of medical students in the United Kingdom toward ODT and the curriculum content. A 32-item online questionnaire was distributed to 957 medical students at the University of Leeds (October to December 2012). There were 216 (22.6%) respondents. Students were aware of kidney, heart, and liver transplantation (91.6%, 88.8%, and 86.5%, respectively). Awareness of small intestine (36.7%) and islet of Langerhans (33.0%) transplantation was poor. Students understood the term "brain stem death" (82.3%); however, they lacked understanding of the criteria used for brain stem death testing (75.8%). Their perceptions and attitudes were favorable toward ODT; 43.3% of the students were unhappy with their current knowledge, and 87.6% agreed that ODT teaching should be included in the curriculum. Students have a basic understanding of ODT but lack detailed knowledge. They accept its importance and desire further teaching to supplement their current knowledge so as to understand the issues related to ODT. Copyright © 2015 Elsevier Inc. All rights reserved.
Aota, Shigeo; Kikuchi, Shin-Ichi; Ohashi, Hironori; Kitano, Naoko; Hakozaki, Michiyuki; Konno, Shin-Ichi
2017-10-16
Since dislocation after total hip arthroplasty (THA) greatly diminishes a patient's quality of life, revision of the THA is frequently needed. However, it is common for the dislocation not to resolve even after reconstruction, but rather to become intractable. Seventeen patients with dislocated THA, mean age 71 years (range 51-87 years), who underwent revision THA together with soft tissue reinforcement with a Leeds-Keio (LK) ligament were enrolled. The purposes of reinforcement with the LK ligament were to restrict internal rotation of the hip joint and to encourage the formation of fibrous tissue in the posterior acetabular wall to stabilise the femoral head. We determined the success rate of surgical treatment for dislocation, the Harris Hip Score (HHS), and factors associated with recurrent dislocation. There was no recurrent dislocation in 82% of the cases (14 joints) during the mean postoperative follow-up period of 63.5 months (range 15-96 months). The HHS was 82 ± 18 points preoperatively and 82 ± 14 points postoperatively. Recurrent dislocation after this surgical procedure occurred in 2 hips with breakage of the LK ligaments, and intracapsular dislocation occurred in 1 hip with loosening of the LK ligament. Although the risk of recurrent dislocation still exists with this procedure, reinforcement with an LK ligament for dislocated THA may be useful in intractable cases with soft tissue defects around the hip joint.
Directory of Open Access Journals (Sweden)
Yolanda Escalante
2012-09-01
Full Text Available The aims of this study were (i) to compare women's water polo game-related statistics by match outcome (winning and losing teams) and phase (preliminary, classificatory, and semi-final/bronze medal/gold medal), and (ii) to identify characteristics that discriminate performances for each phase. The game-related statistics of the 124 women's matches played in five International Championships (World and European Championships) were analyzed. Differences between winning and losing teams in each phase were determined using the chi-squared statistic. A discriminant analysis was then performed according to context in each of the three phases. It was found that the game-related statistics differentiate the winning from the losing teams in each phase of an international championship. The differentiating variables were both offensive (centre goals, power-play goals, counterattack goals, assists, offensive fouls, steals, blocked shots, and won sprints) and defensive (goalkeeper-blocked shots, goalkeeper-blocked inferiority shots, and goalkeeper-blocked 5-m shots). The discriminant analysis showed the game-related statistics to discriminate performance in all phases: preliminary, classificatory, and final phases (92%, 90%, and 83%, respectively). Two variables were discriminatory by match outcome (winning or losing teams) in all three phases: goals and goalkeeper-blocked shots.
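A chi-squared comparison of winners versus losers of the kind described above can be sketched with a contingency table; the counts below are hypothetical, not the study's data.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical counts of matches in which a team did / did not score a
# counterattack goal, split by match outcome.
table = np.array([[45, 30],    # winners: scored, did not score
                  [20, 55]])   # losers
chi2, p, dof, expected = chi2_contingency(table)
```

A small p-value indicates the variable distinguishes winning from losing teams.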
International Nuclear Information System (INIS)
Gomez, Miryam; Saldarriaga, Julio; Correa, Mauricio; Posada, Enrique; Castrillon M, Francisco Javier
2007-01-01
Sand fields, construction, coal boilers, roads, and biological sources, among others, are factors contributing to air contamination in downtown Valle de Aburrá. The distribution of road-contribution data to total suspended particles, according to the source-receptor model MCF (source correlation modeling), is nearly a gamma distribution. A chi-square goodness-of-fit test is used for the statistical modeling. This goodness-of-fit test also allows estimating the parameters of the distribution using the maximum likelihood method; the expectation-maximization algorithm is used as the convergence criterion. Modeling the mean of the road-contribution data to total suspended particles according to the source-receptor model MCF is straightforward and validates the road contribution factor to the atmospheric pollution of the zone under study.
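The fit-then-test procedure described above (maximum-likelihood gamma fit followed by a chi-square goodness-of-fit check) can be sketched as follows; the data are synthetic stand-ins, not the study's measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic stand-in for road-contribution data
data = rng.gamma(shape=2.0, scale=3.0, size=500)

# Maximum-likelihood fit of a gamma distribution (location fixed at 0)
shape, loc, scale = stats.gamma.fit(data, floc=0)

# Chi-square goodness of fit over k equiprobable bins of the fitted model
k = 10
edges = stats.gamma.ppf(np.linspace(0, 1, k + 1), shape, loc, scale)
edges[0], edges[-1] = 0.0, data.max() + 1.0          # keep edges finite
observed, _ = np.histogram(data, bins=edges)
expected = np.full(k, len(data) / k)
chi2 = ((observed - expected) ** 2 / expected).sum()
dof = k - 1 - 2   # bins minus 1, minus 2 fitted parameters
```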
Statistical analysis of NaOH pretreatment effects on sweet sorghum bagasse characteristics
Putri, Ary Mauliva Hada; Wahyuni, Eka Tri; Sudiyani, Yanni
2017-01-01
We analyze the behavior of sweet sorghum bagasse characteristics before and after NaOH pretreatments by statistical analysis. These characteristics include the percentages of lignocellulosic materials and the degree of crystallinity. We use the chi-square method to obtain the values of the fitted parameters, and then apply Student's t-test to check whether they are significantly different from zero at the 99.73% confidence level (C.L.). We find, in the cases of hemicellulose and lignin, that their percentages decrease significantly after pretreatment. On the other hand, crystallinity does not show similar behavior, as all fitted parameters in this case are consistent with zero. Our statistical result is then cross-examined with observations from X-ray diffraction (XRD) and Fourier Transform Infrared (FTIR) spectroscopy, showing good agreement. This result may indicate that the 10% NaOH pretreatment is not sufficient to change the crystallinity index of the sweet sorghum bagasse.
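A minimal sketch of the significance test described above, checking whether a fitted change is different from zero at a chosen confidence level; the replicate values are hypothetical, not the paper's measurements.

```python
import numpy as np
from scipy import stats

# Hypothetical replicate measurements of the change in lignin percentage
# after NaOH pretreatment (negative values = decrease).
delta = np.array([-4.1, -3.8, -4.5, -3.6, -4.0])

# One-sample t-test of the null hypothesis "mean change is zero"
t_stat, p_value = stats.ttest_1samp(delta, popmean=0.0)

# Significant at the 99.73% confidence level used in the paper?
significant = p_value < (1 - 0.9973)
```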
Statistical evaluation of adding multiple risk factors improves Framingham stroke risk score.
Zhou, Xiao-Hua; Wang, Xiaonan; Duncan, Ashlee; Hu, Guizhou; Zheng, Jiayin
2017-04-14
The Framingham Stroke Risk Score (FSRS) is one of the most well-regarded risk appraisal tools for evaluating an individual's absolute risk of stroke onset. However, several widely accepted risk factors for stroke were not included in the original Framingham model. This study proposed a new model which combines an existing risk model with new risk factors using synthesis analysis, and applied it to the longitudinal Atherosclerosis Risk in Communities (ARIC) data set. Risk factors in the original prediction models and new risk factors in the proposed model are discussed. Three measures (discrimination, calibration, and reclassification) were used to evaluate the performance of the original Framingham model and the new risk prediction model. The modified C-statistic, the Hosmer-Lemeshow test, and the classless and class NRI were the statistical indices used to assess discrimination, calibration, and reclassification, respectively, for the newly developed risk prediction model of stroke onset. The NEW-STROKE (new stroke risk score prediction) model had a higher modified C-statistic and smaller Hosmer-Lemeshow chi-square values after recalibration than the original FSRS model, and the classless NRI and class NRI of the NEW-STROKE model over the original FSRS model were all significantly positive in the overall group. The NEW-STROKE model, integrated with seven literature-derived risk factors, outperformed the original FSRS model in predicting the risk score of stroke. This illustrates that the seven literature-derived risk factors contributed significantly to stroke risk prediction.
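The Hosmer-Lemeshow calibration statistic mentioned above can be sketched as a chi-square over risk deciles; the predicted risks and outcomes below are simulated, not ARIC data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 1000
p_pred = rng.uniform(0.02, 0.4, n)   # hypothetical predicted risks
y = rng.binomial(1, p_pred)          # outcomes drawn from the model itself

# Hosmer-Lemeshow: group by deciles of predicted risk, compare observed
# and expected events with a chi-square statistic.
order = np.argsort(p_pred)
groups = np.array_split(order, 10)
hl = 0.0
for g in groups:
    obs = y[g].sum()                 # observed events in the decile
    exp = p_pred[g].sum()            # expected events in the decile
    n_g = len(g)
    hl += (obs - exp) ** 2 / (exp * (1 - exp / n_g))
dof = 10 - 2
p_value = stats.chi2.sf(hl, dof)
```

Since the outcomes were drawn from the model itself, the statistic should be unremarkable; a well-calibrated model yields a non-significant p-value.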
Statistical analysis of secondary particle distributions in relativistic nucleus-nucleus collisions
Mcguire, Stephen C.
1987-01-01
The use is described of several statistical techniques to characterize structure in the angular distributions of secondary particles from nucleus-nucleus collisions in the energy range 24 to 61 GeV/nucleon. The objective of this work was to determine whether there are correlations between emitted particle intensity and angle that may be used to support the existence of the quark gluon plasma. The techniques include chi-square null hypothesis tests, the method of discrete Fourier transform analysis, and fluctuation analysis. We have also used the method of composite unit vectors to test for azimuthal asymmetry in a data set of 63 JACEE-3 events. Each method is presented in a manner that provides the reader with some practical detail regarding its application. Of those events with relatively high statistics, Fe approaches 0 at 55 GeV/nucleon was found to possess an azimuthal distribution with a highly non-random structure. No evidence of non-statistical fluctuations was found in the pseudo-rapidity distributions of the events studied. It is seen that the most effective application of these methods relies upon the availability of many events or single events that possess very high multiplicities.
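The composite-unit-vector test for azimuthal asymmetry mentioned above can be sketched with a Rayleigh-type statistic; the angles below are simulated, not JACEE event data.

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical azimuthal emission angles (radians) for one event
phi = rng.uniform(0, 2 * np.pi, 200)

# Composite unit vector: sum of unit vectors (cos phi, sin phi).
# Under azimuthal symmetry the resultant length R gives the Rayleigh
# statistic 2*R^2/n, approximately chi-square with 2 degrees of freedom.
n = len(phi)
R = np.hypot(np.cos(phi).sum(), np.sin(phi).sum())
rayleigh = 2.0 * R**2 / n
p_value = np.exp(-rayleigh / 2)      # large-n approximation
```

A small p-value would indicate a preferred azimuthal direction, i.e. non-random structure.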
Statistical Analyses of Scatterplots to Identify Important Factors in Large-Scale Simulations
Energy Technology Data Exchange (ETDEWEB)
Kleijnen, J.P.C.; Helton, J.C.
1999-04-01
The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (1) linear relationships with correlation coefficients, (2) monotonic relationships with rank correlation coefficients, (3) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (4) trends in variability as defined by variances and interquartile ranges, and (5) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from analysis include: (1) Type I errors are unavoidable, (2) Type II errors can occur when inappropriate analysis procedures are used, (3) physical explanations should always be sought for why statistical procedures identify variables as being important, and (4) the identification of important variables tends to be stable for independent Latin hypercube samples.
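Steps (1) and (2) of the pattern-detection sequence above, linear versus monotonic relationships, can be contrasted on a synthetic scatterplot (the data below are illustrative, not from the two-phase flow model):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
x = rng.uniform(0, 1, 200)
y = np.exp(6 * x) + rng.normal(0, 1, 200)   # monotonic but strongly nonlinear

pearson_r, _ = stats.pearsonr(x, y)         # step (1): linear relationship
spearman_rho, _ = stats.spearmanr(x, y)     # step (2): monotonic relationship
```

For a monotonic nonlinear pattern like this, the rank correlation exceeds the linear correlation, which is why the procedures are applied in order of increasing pattern complexity.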
Saavedra, Jose M; Escalantel, Yolanda; Madera, Joaquin; Mansilla, Mirella; García-Hermoso, Antonio
2014-09-01
The aims of this study were (i) to compare water polo game-related statistics by game outcome (winning and losing teams) and margins of victory (close games, unbalanced games, and very unbalanced games), and (ii) to identify characteristics that mark the differences in performances for each group of margin of victory. The game-related statistics of the 308 men's matches played in seven International Championships (Olympic Games, World and European Championships) were analysed. A cluster analysis established three groups (close games, unbalanced games, and very unbalanced games) according to the margin of victory. Differences between game outcomes (winning or losing teams) and margins of victory (close, unbalanced, and very unbalanced games) were determined using the chi-squared statistic, also calculating the effect sizes of the differences. A discriminant analysis was then performed applying the sample-splitting method according to game outcome (winning and losing teams) by margin of victory. It was found that the game-related statistics differentiate the winning from the losing teams in each final score group, with 7 (offensive and defensive) variables differentiating winners from losers in close games, 16 in unbalanced games, and 11 in very unbalanced games. In all three types of game, the game-related statistics were shown to discriminate performance (85% or more), with two variables being discriminatory by game outcome (winning or losing teams) in all three cases: shots and goalkeeper-blocked shots.
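A discriminant analysis of the kind used above can be sketched with Fisher's linear discriminant on two hypothetical game-related statistics (the data are simulated, not the championship matches):

```python
import numpy as np

rng = np.random.default_rng(5)
# Hypothetical per-match statistics (shots, goalkeeper-blocked shots)
winners = rng.normal([30, 10], [3, 2], size=(40, 2))
losers = rng.normal([24, 7], [3, 2], size=(40, 2))

# Fisher's linear discriminant: w proportional to Sw^-1 (m1 - m2)
m1, m2 = winners.mean(axis=0), losers.mean(axis=0)
Sw = np.cov(winners.T) + np.cov(losers.T)      # pooled within-class scatter
w = np.linalg.solve(Sw, m1 - m2)
threshold = w @ (m1 + m2) / 2                  # midpoint decision boundary

# Classify by projecting each match onto w
pred_win = winners @ w > threshold
pred_lose = losers @ w <= threshold
accuracy = (pred_win.sum() + pred_lose.sum()) / 80
```

The reported discrimination rates (85% and above) correspond to this kind of classification accuracy on the discriminant axis.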
Directory of Open Access Journals (Sweden)
Elise eVaumourin
2014-05-01
Full Text Available A growing number of studies are reporting simultaneous infections by parasites in many different hosts. Detecting whether these parasites are significantly associated is important in medicine and epidemiology. Numerous approaches to detect associations are available, but only a few provide statistical tests. Furthermore, they generally test for an overall detection of association and do not identify which parasite is associated with which other one. Here, we developed a new approach, the association screening approach, to detect both the overall pattern and the details of multi-parasite associations. We studied the power of this new approach and of three other known ones (i.e. the generalized chi-square, the network, and the multinomial GLM approaches) to identify parasite associations due either to parasite interactions or to confounding factors. We applied these four approaches to detect associations within two populations of multi-infected hosts: (1) rodents infected with Bartonella sp., Babesia microti and Anaplasma phagocytophilum, and (2) a bovine population infected with Theileria sp. and Babesia sp. We found that the best power is obtained with the screening model and the generalized chi-square test. Differentiating between associations due to confounding factors and those due to parasite interactions was not possible. The screening approach significantly identified associations between Bartonella doshiae and B. microti, and between T. parva, T. mutans and T. velifera. Thus, the screening approach is relevant for testing the overall presence of parasite associations and for identifying the parasite combinations that are significantly over- or under-represented. Whether the associations are due to real biological interactions or to confounding factors should be further investigated. Nevertheless, in the age of genomics and the advent of new technologies, it is a considerable asset to speed up research focusing on the mechanisms driving interactions.
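The core idea of screening for over-represented parasite combinations, comparing an observed co-infection count with its expectation under independence, can be sketched as follows; the counts are hypothetical and this single-pair test stands in for the full multi-parasite screening.

```python
import numpy as np
from scipy import stats

# Hypothetical infection counts in 300 hosts for two parasites A and B
n = 300
n_a, n_b, n_ab = 90, 60, 35        # infected with A, with B, with both

# Expected number of co-infections under independence
p_joint = (n_a / n) * (n_b / n)
expected = n * p_joint             # = 18 here

# Exact binomial test of the observed co-infection count against the
# independence expectation (one cell of the screening idea)
result = stats.binomtest(n_ab, n, p_joint)
```

A small p-value flags the A-B combination as over-represented, the kind of signal the screening approach reports per combination.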
Pestman, Wiebe R
2009-01-01
This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.
To be certain about the uncertainty: Bayesian statistics for 13C metabolic flux analysis.
Theorell, Axel; Leweke, Samuel; Wiechert, Wolfgang; Nöh, Katharina
2017-11-01
13C Metabolic Flux Analysis (13C MFA) remains the most powerful approach to determine intracellular metabolic reaction rates. Decisions on strain engineering and experimentation rely heavily upon the certainty with which these fluxes are estimated. For uncertainty quantification, the vast majority of 13C MFA studies relies on confidence intervals from the paradigm of Frequentist statistics. However, it is well known that the confidence intervals for a given experimental outcome are not uniquely defined. As a result, confidence intervals produced by different methods can be different, but nevertheless equally valid. This is of high relevance to 13C MFA, since practitioners regularly use three different approximate approaches for calculating confidence intervals. By means of a computational study with a realistic model of the central carbon metabolism of E. coli, we provide strong evidence that confidence intervals used in the field depend strongly on the technique with which they were calculated and, thus, their use leads to misinterpretation of the flux uncertainty. To provide a better alternative to confidence intervals in 13C MFA, we demonstrate that credible intervals from the paradigm of Bayesian statistics give more reliable flux uncertainty quantifications, which can be readily computed with high accuracy using Markov chain Monte Carlo. In addition, the widely applied chi-square test, as a means of testing whether the model reproduces the data, is examined closer. © 2017 Wiley Periodicals, Inc.
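Credible intervals from Markov chain Monte Carlo, as advocated above, can be sketched on a toy one-parameter problem; this random-walk Metropolis sampler and the synthetic data are illustrative assumptions, far simpler than a 13C MFA model.

```python
import numpy as np

rng = np.random.default_rng(3)
data = rng.normal(5.0, 1.0, 50)        # synthetic measurements, known sigma = 1

def log_post(mu):
    # Flat prior, so the posterior is proportional to the Gaussian likelihood
    return -0.5 * np.sum((data - mu) ** 2)

# Random-walk Metropolis sampler
samples = []
mu = 0.0
for _ in range(20000):
    prop = mu + rng.normal(0, 0.5)
    if np.log(rng.uniform()) < log_post(prop) - log_post(mu):
        mu = prop
    samples.append(mu)
samples = np.array(samples[2000:])     # discard burn-in

# 95% credible interval directly from posterior percentiles
lo, hi = np.percentile(samples, [2.5, 97.5])
```

Unlike a Frequentist confidence interval, the credible interval is read straight off the posterior sample and is uniquely defined given the prior and data.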
Strategies towards statistically robust interpretations of in situ U–Pb zircon geochronology
Directory of Open Access Journals (Sweden)
Christopher J. Spencer
2016-07-01
Full Text Available Zircon U–Pb geochronology has become a keystone tool across Earth science, arguably providing the gold standard in resolving deep geological time. The development of rapid in situ analysis of zircon (via laser ablation and secondary ionization mass spectrometry) has allowed large amounts of data to be generated in a relatively short time, and such large-volume datasets offer the ability to address a range of geological questions that would otherwise remain intractable (e.g. detrital zircons as a sediment fingerprinting method). The ease of acquisition, while bringing benefit to the Earth science community, has also led to diverse interpretations of geochronological data. In this work we seek to refocus U–Pb zircon geochronology toward best practice by providing a robust, statistically coherent workflow. We discuss a range of data filtering approaches and their inherent limitations (e.g. discordance and the reduced chi-squared, MSWD). We evaluate appropriate mechanisms to calculate the most geologically appropriate age from both 238U/206Pb and 207Pb/206Pb ratios, and demonstrate the crossover position at which chronometric power passes between these ratios. As in situ analytical techniques become progressively more precise, appropriate statistical handling of U–Pb datasets will become increasingly pertinent.
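The MSWD (reduced chi-squared) filter discussed above can be sketched as follows; the ages and uncertainties are hypothetical single-population values, not from a real dataset.

```python
import numpy as np

# Hypothetical zircon ages (Ma) with 1-sigma analytical uncertainties
ages = np.array([251.2, 250.8, 251.5, 250.9, 251.1, 251.4])
sigma = np.array([0.4, 0.5, 0.4, 0.6, 0.5, 0.4])

# Inverse-variance weighted mean
w = 1.0 / sigma**2
mean = np.sum(w * ages) / np.sum(w)

# MSWD (reduced chi-square): ~1 for a single population with correctly
# assessed uncertainties; values >> 1 indicate excess geological scatter.
mswd = np.sum(w * (ages - mean) ** 2) / (len(ages) - 1)
```

An MSWD near or below 1 supports pooling these analyses into a single weighted-mean age; a high MSWD argues for splitting the population or inflating uncertainties.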
Performance analysis of Wald-statistic based network detection methods for radiation sources
Energy Technology Data Exchange (ETDEWEB)
Sen, Satyabrata [ORNL]; Rao, Nageswara S [ORNL]; Wu, Qishi [University of Memphis]; Barry, M. L. [New Jersey Institute of Technology]; Grieme, M. [New Jersey Institute of Technology]; Brooks, Richard R [ORNL]; Cordone, G. [Clemson University]
2016-01-01
There have been increasingly large deployments of radiation detection networks that require computationally fast algorithms to produce prompt results over ad-hoc sub-networks of mobile devices, such as smart-phones. These algorithms are in sharp contrast to complex network algorithms that require all measurements to be sent to powerful central servers. In this work, at individual sensors, we employ Wald-statistic based detection algorithms, which are computationally very fast and are implemented as one of three Z-tests and four chi-square tests. At the fusion center, we apply K-out-of-N fusion to combine the sensors' hard decisions. We characterize the performance of the detection methods by deriving analytical expressions for the distributions of the underlying test statistics, and by analyzing the fusion performance in terms of K, N, and the false-alarm rates of individual detectors. We experimentally validate our methods using measurements from indoor and outdoor characterization tests of the Intelligence Radiation Sensors Systems (IRSS) program. In particular, utilizing the outdoor measurements, we construct two important real-life scenarios, boundary surveillance and portal monitoring, and present the results of our algorithms.
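The K-out-of-N fusion of independent hard decisions described above has a closed binomial form; the sketch below uses hypothetical per-sensor probabilities.

```python
from math import comb

def fusion_detection_prob(p, n, k):
    """Probability that at least k of n independent detectors,
    each firing with probability p, declare a detection."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Example: 5 sensors with per-sensor detection probability 0.8,
# fused with a 2-out-of-5 rule.
p_fused = fusion_detection_prob(0.8, 5, 2)
false_alarm = fusion_detection_prob(0.05, 5, 2)   # per-sensor FAR 0.05
```

The same formula evaluated at the per-sensor false-alarm rate gives the fused false-alarm rate, showing how the choice of K trades detection probability against false alarms.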
Eliazar, Iddo
2017-05-01
The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their 'public relations' for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford's law, and 1/f noise.
Feiten, Felix E; Kuhlenbeck, Helmut; Freund, Hans-Joachim
2016-01-28
The (0001) surface of vanadium sesquioxide, V2O3, is terminated by vanadyl groups under standard ultra high vacuum preparation conditions. Reduction with electrons results in a chemically highly active surface with a well-defined LEED pattern indicating a high degree of order. In this work we report the first quantitative structure determination of a reduced V2O3(0001) surface. We identify two distinct surface phases by STM, one well ordered and one less well ordered. I/V-LEED shows the ordered phase to be terminated by a single vanadium atom per surface unit cell on a quasi-hexagonal oxygen layer with three atoms per two-dimensional unit cell. Furthermore we compare the method of surface reduction via electron bombardment with the deposition of V onto a vanadyl terminated film. The latter procedure was previously proposed to result in a structure with three surface vanadium atoms in the 2D unit cell and we confirm this with simulated STM images.
Acaba, K. J. C.; Cinco, L. D.; Melchor, J. N.
2016-03-01
Daily QC tests performed on screen-film mammography (SFM) equipment are essential to ensure that both the SFM unit and the film processor are working in a consistent manner. The Breast Imaging Unit of USTH-Benavides Cancer Institute has been conducting QC following the test protocols in the IAEA Human Health Series No. 2 manual. However, the availability of the Leeds breast phantom (CRP E13039) in the facility made the task easier. Instead of carrying out separate tests on AEC constancy and light sensitometry, a single exposure of the phantom accomplishes both tests. It was observed that measurements made on mAs output and optical densities (ODs) using the Leeds TOR (MAX) phantom are comparable with those obtained from the usual conduct of tests, taking into account the attenuation characteristic of the phantom. Image quality parameters such as low-contrast and high-contrast details were also evaluated from the phantom image. The authors recognize the usefulness of the phantom in determining technical factors that will help improve detection of the smallest pathological details on breast images. The phantom is also convenient for daily QC monitoring and economical, since fewer films are expended.
Directory of Open Access Journals (Sweden)
Saadia Gamir
2017-06-01
Full Text Available Various newspaper articles report that British ministers, university representatives, exam chiefs and business bodies agree that foreign language skills in primary, secondary and tertiary UK education are in crisis. Lower funding and policy changes have caused language skills deficiencies felt gravely in the business sectors. Funding and support initiatives pledged by policy makers appear to be election-driven, barely outliving newly elected governments. Others blame the secondary school language curriculum for failing to inspire students to take up a language when they reach 13 or 14. Others still argue that severe A-level examination marking deters students from taking up a foreign language at 6th form level, producing fewer prospective language learners for university departments. Community languages are also undervalued, as small-entry languages could soon be axed from GCSE and A-level examinations. In a world increasingly interconnected, it is essential that the importance of language learning be reinstated in all our educational institutions. This paper reviews two decades of the conditions of language provision in the UK in general, with an emphasis on Leeds Beckett University. It also attempts to answer two questions emerging from the author's personal teaching experience and reflections: What are the realities and challenges language teaching faces at Leeds Beckett University? And, how may we support language learners in fulfilling their ambition to acquire the required skills to communicate effectively in this globalised world?
Sadovskii, Michael V
2012-01-01
This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics, all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity, and the modern theory of critical phenomena. Beyond that, attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.
Energy Technology Data Exchange (ETDEWEB)
Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il
2017-05-15
The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
Goodman, Joseph W
2015-01-01
This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems. The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced in this edition.
Szulc, Stefan
1965-01-01
Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field.Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then
Lyons, L.
2016-01-01
Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.
DEFF Research Database (Denmark)
Tryggestad, Kjell
2004-01-01
This study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit by accounting for the significance of the materials and the equipment that enter into the production of statistics. Key words: reversible statistics, diverse materials, constructivism, economics, science, and technology.
Escalante, Yolanda; Saavedra, Jose M.; Tella, Victor; Mansilla, Mirella; García-Hermoso, Antonio; Dominguez, Ana M.
2012-01-01
The aims of this study were (i) to compare women's water polo game-related statistics by match outcome (winning and losing teams) and phase (preliminary, classificatory, and semi-final/bronze medal/gold medal), and (ii) to identify characteristics that discriminate performances for each phase. The game-related statistics of the 124 women's matches played in five International Championships (World and European Championships) were analyzed. Differences between winning and losing teams in each phase were determined using the chi-squared statistic. A discriminant analysis was then performed according to context in each of the three phases. It was found that the game-related statistics differentiate the winning from the losing teams in each phase of an international championship. The differentiating variables were both offensive (centre goals, power-play goals, counterattack goal, assists, offensive fouls, steals, blocked shots, and won sprints) and defensive (goalkeeper-blocked shots, goalkeeper-blocked inferiority shots, and goalkeeper-blocked 5-m shots). The discriminant analysis showed the game-related statistics to discriminate performance in all phases: preliminary, classificatory, and final phases (92%, 90%, and 83%, respectively). Two variables were discriminatory by match outcome (winning or losing teams) in all three phases: goals and goalkeeper-blocked shots. Key points: (1) In the preliminary phase, more than one variable was involved in this differentiation, including both offensive and defensive aspects of the game. (2) The game-related statistics were found to have a high discriminatory power in predicting the result of matches, with shots and goalkeeper-blocked shots being discriminatory variables in all three phases. (3) Knowledge of the characteristics of women's water polo game-related statistics of the winning teams, and their power to predict match outcomes, will allow coaches to take these characteristics into account when planning training and match preparation.
Which statistics should tropical biologists learn?
Directory of Open Access Journals (Sweden)
Natalia Loaiza Velásquez
2011-09-01
Full Text Available Tropical biologists study the richest and most endangered biodiversity on the planet, and in these times of climate change and mega-extinctions, the need for efficient, good-quality research is more pressing than in the past. However, the statistical component in research published by tropical authors sometimes suffers from poor quality in data collection, mediocre or bad experimental design, and a rigid and outdated view of data analysis. To suggest improvements in their statistical education, we listed all the statistical tests and other quantitative analyses used in two leading tropical journals, the Revista de Biología Tropical and Biotropica, during a year. The 12 most frequent tests in the articles were: Analysis of Variance (ANOVA), Chi-Square Test, Student’s t-Test, Linear Regression, Pearson’s Correlation Coefficient, Mann-Whitney U Test, Kruskal-Wallis Test, Shannon’s Diversity Index, Tukey’s Test, Cluster Analysis, Spearman’s Rank Correlation Test, and Principal Component Analysis. We conclude that statistical education for tropical biologists must abandon the old syllabus based on the mathematical side of statistics and concentrate on the correct selection of these and other procedures and tests, on their biological interpretation, and on the use of reliable and friendly freeware. We think that their time will be better spent understanding and protecting tropical ecosystems than trying to learn the mathematical foundations of statistics: in most cases, a well designed one-semester course should be enough for their basic requirements. Rev. Biol. Trop. 59(3): 983-992. Epub 2011 September 01.
International Nuclear Information System (INIS)
Kleijnen, J.P.C.; Helton, J.C.
1999-01-01
The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from analysis include: (i) Type I errors are unavoidable, (ii) Type II errors can occur when inappropriate analysis procedures are used, (iii) physical explanations should always be sought for why statistical procedures identify variables as being important, and (iv) the identification of important variables tends to be stable for independent Latin hypercube samples
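The first three pattern checks named above — linear relationships via correlation coefficients, monotonic relationships via rank correlation, and trends in central tendency via the Kruskal-Wallis statistic — can be sketched on synthetic scatterplot data. The data and the choice of five x-bins below are illustrative assumptions, not from the study:

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr, kruskal

rng = np.random.default_rng(42)
x = rng.uniform(0, 1, 200)                  # sampled input variable
y = np.exp(3 * x) + rng.normal(0, 1, 200)   # monotonic but nonlinear response

r, _ = pearsonr(x, y)                       # (i) linear relationship
rho, _ = spearmanr(x, y)                    # (ii) monotonic relationship

# (iii) trend in central tendency: Kruskal-Wallis across five x-bins
bins = np.array_split(np.argsort(x), 5)
H, p = kruskal(*[y[idx] for idx in bins])
print(f"pearson r = {r:.2f}, spearman rho = {rho:.2f}, KW p = {p:.3g}")
```

For a nonlinear but monotonic relationship like this one, the rank correlation is stronger than the linear correlation, which is exactly why the procedures are applied in order of increasing complexity.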
Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James
2014-01-01
Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.
Wannier, Gregory Hugh
1966-01-01
Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics.Designed for
Blakemore, J S
1962-01-01
Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co
Development of an unbiased statistical method for the analysis of unigenic evolution
Directory of Open Access Journals (Sweden)
Shilton Brian H
2006-03-01
Full Text Available Abstract Background Unigenic evolution is a powerful genetic strategy involving random mutagenesis of a single gene product to delineate functionally important domains of a protein. This method involves selection of variants of the protein which retain function, followed by statistical analysis comparing expected and observed mutation frequencies of each residue. Resultant mutability indices for each residue are averaged across a specified window of codons to identify hypomutable regions of the protein. As originally described, the effect of changes to the length of this averaging window was not fully elucidated. In addition, it was unclear when sufficient functional variants had been examined to conclude that residues conserved in all variants have important functional roles. Results We demonstrate that the length of averaging window dramatically affects identification of individual hypomutable regions and delineation of region boundaries. Accordingly, we devised a region-independent chi-square analysis that eliminates loss of information incurred during window averaging and removes the arbitrary assignment of window length. We also present a method to estimate the probability that conserved residues have not been mutated simply by chance. In addition, we describe an improved estimation of the expected mutation frequency. Conclusion Overall, these methods significantly extend the analysis of unigenic evolution data over existing methods to allow comprehensive, unbiased identification of domains and possibly even individual residues that are essential for protein function.
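The statistical core of such an analysis can be sketched (this is not the authors' code): compare observed per-residue mutation counts against expected counts with a chi-squared test, and estimate the chance that a residue escapes mutation under a Poisson model. All counts below are hypothetical:

```python
import numpy as np
from scipy.stats import chisquare

# Hypothetical observed mutation counts at 10 codons among functional variants,
# versus expected counts from the mutational spectrum (both sum to 40)
observed = np.array([4, 0, 6, 5, 1, 7, 0, 5, 6, 6])
expected = np.array([4, 4, 4, 4, 4, 4, 4, 4, 4, 4])

chi2, p = chisquare(observed, f_exp=expected)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")

# Probability that a residue with zero observed mutations is untouched by
# chance alone: under a Poisson model with expected count lambda per residue,
# P(no hits) = exp(-lambda)
lam = expected[0]
p_chance = np.exp(-lam)
print(f"P(untouched by chance) = {p_chance:.4f}")
```

Residues whose conservation cannot be explained by chance (small `p_chance`) are the candidates for functionally essential positions.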
Serdobolskii, Vadim Ivanovich
2007-01-01
This monograph presents a mathematical theory of statistical models described by an essentially large number of unknown parameters, comparable with the sample size but possibly much larger. In this sense, the proposed theory can be called "essentially multiparametric". It is developed on the basis of the Kolmogorov asymptotic approach, in which the sample size increases along with the number of unknown parameters. This theory opens a way to the solution of central problems of multivariate statistics, which up until now have not been solved. Traditional statistical methods based on the idea of infinite sampling often break down in the solution of real problems and, depending on the data, can be inefficient, unstable and even inapplicable. In this situation, practical statisticians are forced to use various heuristic methods in the hope that they will find a satisfactory solution. The mathematical theory developed in this book presents a regular technique for implementing new, more efficient versions of statistical procedures. ...
Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...
U.S. Department of Health & Human Services — The CMS Center for Strategic Planning produces an annual CMS Statistics reference booklet that provides a quick reference for summary information about health...
DEFF Research Database (Denmark)
Tryggestad, Kjell
2004-01-01
The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political Arithmetic and more recent work within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved...
Energy Technology Data Exchange (ETDEWEB)
Wendelberger, Laura Jean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-08-08
In large datasets, it is time consuming or even impossible to pick out interesting images. Our proposed solution is to find statistics to quantify the information in each image and use those to identify and pick out images of interest.
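A minimal sketch of the idea, assuming simple variance and histogram-entropy statistics as the per-image "information" measures (the report does not specify which statistics were used):

```python
import numpy as np

def interest_score(img):
    """Per-image information statistics: intensity variance and the Shannon
    entropy of the intensity histogram. Higher values flag busier images."""
    img = np.asarray(img, dtype=float)
    hist, _ = np.histogram(img, bins=64)
    p = hist / hist.sum()
    p = p[p > 0]
    entropy = -np.sum(p * np.log2(p))
    return img.var(), entropy

rng = np.random.default_rng(0)
flat = np.full((64, 64), 128.0)            # uninteresting: constant image
busy = rng.integers(0, 256, (64, 64))      # high-information: random texture
print(interest_score(flat), interest_score(busy))
```

Ranking a large collection by such scores lets an analyst triage which images to inspect first.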
MacKenzie, Dana
2004-01-01
The drawbacks of using 19th-century mathematics in physics and astronomy are illustrated. To continue the expansion of knowledge about the cosmos, scientists will have to come to terms with modern statistics. Some researchers have deliberately started importing techniques that are used in medical research. However, physicists need to identify the brand of statistics that will be suitable for them, and make a choice between the Bayesian and the frequentist approach. (Edited abstract).
Lyons, L.
2017-01-01
Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.
Hagell, Peter; Westergren, Albert
Sample size is a major factor in statistical null hypothesis testing, which is the basis for many approaches to testing Rasch model fit. Few sample size recommendations for testing fit to the Rasch model concern the Rasch Unidimensional Measurement Models (RUMM) software, which features chi-square and ANOVA/F-ratio based fit statistics, including Bonferroni and algebraic sample size adjustments. This paper explores the occurrence of Type I errors with RUMM fit statistics, and the effects of algebraic sample size adjustments. Data simulated to fit the Rasch model for 25-item dichotomous scales, with sample sizes ranging from N = 50 to N = 2500, were analysed with and without algebraically adjusted sample sizes. Results suggest the occurrence of Type I errors with N ≤ 500, and that Bonferroni correction as well as downward algebraic sample size adjustment are useful to avoid such errors, whereas upward adjustment of smaller samples falsely signals misfit. Our observations suggest that sample sizes around N = 250 to N = 500 may provide a good balance for the statistical interpretation of the RUMM fit statistics studied here with respect to Type I errors and under the assumption of Rasch model fit within the examined frame of reference (i.e., about 25 item parameters well targeted to the sample).
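The downward algebraic adjustment can be illustrated generically: rescale a chi-square fit statistic from the actual sample size to a nominal one before computing the p-value, since with very large samples even trivial misfit inflates the statistic roughly in proportion to N. The rescaling form and the numbers below are illustrative assumptions, not RUMM's exact implementation:

```python
from scipy.stats import chi2 as chi2_dist

def adjusted_chisq_p(chi2_obs, df, n, n_adjusted):
    """Rescale a chi-square statistic from actual sample size n to a nominal
    size n_adjusted, then compute the p-value (hypothetical adjustment form).
    Downward adjustment (n_adjusted < n) guards against Type I errors driven
    purely by a large sample."""
    chi2_adj = chi2_obs * n_adjusted / n
    return chi2_adj, chi2_dist.sf(chi2_adj, df)

# A statistic that looks "significant" at N = 2500 but not after adjusting to N = 500
chi2_obs, df = 25.0, 9
print(adjusted_chisq_p(chi2_obs, df, n=2500, n_adjusted=2500))  # unadjusted
print(adjusted_chisq_p(chi2_obs, df, n=2500, n_adjusted=500))
```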
Tengeler, Sven; Kaiser, Bernhard; Ferro, Gabriel; Chaussende, Didier; Jaegermann, Wolfram
2018-01-01
The (001) surface of cubic silicon carbide (3C SiC) after cleaning, Ar sputtering and three different wet chemical etching procedures was thoroughly investigated via (angle resolved) XPS, HREELS, and LEED. While Ar sputtering was found to be unsuitable for surface preparation, all three employed wet chemical etching procedures (piranha/NH4F, piranha/HF, and RCA) provide a clean surface. HF as oxide removal agent tends to result in fluorine traces on the sample surface, despite thorough rinsing. All procedures yield a 1 × 1 Si-OH/C-H terminated surface. However, the XPS spectra reveal some differences in the resulting surface states. NH4F for oxide removal produces a flat band situation, whereas the other two procedures result in a slight downward (HF) or upward (RCA) band bending. Because the band bending is small, it can be concluded that the number of unsaturated surface defects is low.
Vidali, G.; Li, W.; Dowben, P. A.; Karimi, M.; Hutchings, C. W.; Lin, J.; Moses, C.; Ila, D.; Dalins, I.
1990-01-01
We used ABS, LEED and angle-resolved photo-electron spectroscopy (ARPES) to study bilayer films of Hg on Cu(001). In the surface temperature range of 180 to 330 K, the first Hg layer forms two ordered phases, a c(2x2) (with coverage ≈ 0.5 of Cu(001)) and a high density (partially commensurate) c(4x4) (coverage ≈ 0.62). ARPES data show that there is little or no dispersion of the 5d band of Hg. ABS data show that this layer is not flat, with in-registry Hg atoms lying about 0.15 below the not-in-registry Hg atoms. From ABS we find that the second layer forms a completely registered c(4x4) phase. From ARPES we obtain that the second layer has an electronic structure, particularly the 5d levels, characteristic of bulk mercury. Preliminary results of calculations of the structure of the bilayer are given.
Schwabl, Franz
2006-01-01
The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechan...
Jana, Madhusudan
2015-01-01
This treatment of statistical mechanics is self-sufficient and written in a lucid manner, keeping in mind the examination system of the universities. The need to study this subject and its relation to thermodynamics is discussed in detail. Starting from the Liouville theorem, statistical mechanics is developed thoroughly. All three types of statistical distribution functions are derived separately, with their range of applications and limitations. Non-interacting ideal Bose and Fermi gases are discussed thoroughly. Properties of liquid He-II and the corresponding models are depicted. White dwarfs and condensed matter physics, and transport phenomena - thermal and electrical conductivity, Hall effect, magnetoresistance, viscosity, diffusion, etc. - are discussed. A basic understanding of the Ising model is given to explain phase transitions. The book ends with detailed coverage of the method of ensembles (namely microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar...
Rohatgi, Vijay K
2003-01-01
Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth
Davidson, Norman
2003-01-01
Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses.Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody
Levine-Wissing, Robin
2012-01-01
All Access for the AP® Statistics Exam Book + Web + Mobile Everything you need to prepare for the Advanced Placement® exam, in a study system built around you! There are many different ways to prepare for an Advanced Placement® exam. What's best for you depends on how much time you have to study and how comfortable you are with the subject matter. To score your highest, you need a system that can be customized to fit you: your schedule, your learning style, and your current level of knowledge. This book, and the online tools that come with it, will help you personalize your AP® Statistics prep
Mandl, Franz
1988-01-01
The Manchester Physics Series General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips Department of Physics and Astronomy, University of Manchester Properties of Matter B. H. Flowers and E. Mendoza Optics Second Edition F. G. Smith and J. H. Thomson Statistical Physics Second Edition F. Mandl Electromagnetism Second Edition I. S. Grant and W. R. Phillips Statistics R. J. Barlow Solid State Physics Second Edition J. R. Hook and H. E. Hall Quantum Mechanics F. Mandl Particle Physics Second Edition B. R. Martin and G. Shaw The Physics of Stars Second Edition A. C. Phillips Computing for Scient
Feist, Amber M.
2013-01-01
Hispanic women who are deaf constitute a heterogeneous group of individuals with varying vocational needs. To understand the unique needs of this population, it is important to analyze how consumer characteristics, presence of public supports, and type of services provided influence employment outcomes for Hispanic women who are deaf. The purpose…
Indian Academy of Sciences (India)
Resonance – Journal of Science Education, Volume 4, Issue 10. Statistical Computing – Understanding Randomness and Random Numbers. Sudhakar Kunte. Series Article, October 1999, pp. 16-21.
Schrödinger, Erwin
1952-01-01
Nobel Laureate's brilliant attempt to develop a simple, unified standard method of dealing with all cases of statistical thermodynamics - classical, quantum, Bose-Einstein, Fermi-Dirac, and more.The work also includes discussions of Nernst theorem, Planck's oscillator, fluctuations, the n-particle problem, problem of radiation, much more.
International Nuclear Information System (INIS)
Guha, S.; Taylor, J.H.
1996-01-01
It is critical that summary statistics on background data, or background levels, be computed based on standardized and defensible statistical methods because background levels are frequently used in subsequent analyses and comparisons performed by separate analysts over time. The final background for naturally occurring radionuclide concentrations in soil at a RCRA facility, and the associated statistical methods used to estimate these concentrations, are presented. The primary objective is to describe, via a case study, the statistical methods used to estimate 95% upper tolerance limits (UTL) on radionuclide background soil data sets. A 95% UTL on background samples can be used as a screening level concentration in the absence of definitive soil cleanup criteria for naturally occurring radionuclides. The statistical methods are based exclusively on EPA guidance. This paper includes an introduction, a discussion of the analytical results for the radionuclides and a detailed description of the statistical analyses leading to the determination of 95% UTLs. Soil concentrations reported are based on validated data. Data sets are categorized as surficial soil; samples collected at depths from zero to one-half foot; and deep soil, samples collected from 3 to 5 feet. These data sets were tested for statistical outliers and underlying distributions were determined by using the chi-squared test for goodness-of-fit. UTLs for the data sets were then computed based on the percentage of non-detects and the appropriate best-fit distribution (lognormal, normal, or non-parametric). For data sets containing greater than approximately 50% nondetects, nonparametric UTLs were computed
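For a lognormally distributed background data set, a 95% UTL (95% coverage with 95% confidence) can be sketched using the standard normal-theory tolerance factor, which comes from the noncentral t distribution. The function name and the synthetic data below are ours, not from the case study:

```python
import numpy as np
from scipy.stats import norm, nct

def utl_95_95_lognormal(data):
    """95% upper tolerance limit (95% coverage, 95% confidence) for
    lognormally distributed concentrations: compute the normal-theory UTL
    on the log scale, then back-transform."""
    logs = np.log(data)
    n = len(logs)
    # tolerance factor k from the noncentral t distribution
    k = nct.ppf(0.95, df=n - 1, nc=norm.ppf(0.95) * np.sqrt(n)) / np.sqrt(n)
    return np.exp(logs.mean() + k * logs.std(ddof=1))

rng = np.random.default_rng(1)
background = rng.lognormal(mean=0.0, sigma=0.5, size=20)  # synthetic soil data
print(f"95/95 UTL = {utl_95_95_lognormal(background):.3f}")
```

A site measurement above this limit would then warrant further investigation, since fewer than 5% of true background values should exceed it (with 95% confidence).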
International Nuclear Information System (INIS)
Anon.
1994-01-01
For the years 1992 and 1993, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period. The tables and figures shown in this publication are: Changes in the volume of GNP and energy consumption; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Price development of principal oil products; Fuel prices for power production; Total energy consumption by source; Electricity supply; Energy imports by country of origin in 1993; Energy exports by recipient country in 1993; Consumer prices of liquid fuels; Consumer prices of hard coal and natural gas, prices of indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer and Excise taxes and turnover taxes included in consumer prices of some energy sources
Goodman, Joseph W.
2000-07-01
The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research
Reducing vulnerability of modified LSB algorithm to a chosen statistic attacks
Directory of Open Access Journals (Sweden)
Kamil Kaczyński
2014-12-01
Full Text Available Abstract. The LSB algorithm is one of the most studied steganographic algorithms. There are several types of attacks that can detect the fact of conducting covert communication — the chi-square attack and RS analysis. This paper presents a modification of the LSB algorithm which introduces fewer changes to the carrier than the original LSB algorithm. The modified algorithms use a compression function, which significantly hinders the detection process. This paper also includes a description of the main steganalytic methods along with their application to the proposed modification of the LSB algorithm. Keywords: steganography, cyclic code, error correction codes, LSB, BCH, chi-square, steganalysis
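The chi-square attack mentioned above (Westfeld–Pfitzmann) tests whether the histogram counts within each pair of values (2i, 2i+1) have been equalized, as LSB replacement tends to do. A sketch on synthetic pixel data — the cover distribution below is an artificial stand-in for a natural image histogram:

```python
import numpy as np
from scipy.stats import chi2

def chi_square_attack(pixels, bins=256):
    """Westfeld-Pfitzmann chi-square attack: LSB embedding equalizes the
    histogram counts within each pair of values (2i, 2i+1). Returns a
    p-value that is close to 1 when the pair counts look equalized,
    i.e. when embedding is likely."""
    hist, _ = np.histogram(pixels, bins=bins, range=(0, bins))
    even, odd = hist[0::2], hist[1::2]
    expected = (even + odd) / 2.0
    mask = expected > 0                               # skip empty pairs
    stat = np.sum((even[mask] - expected[mask]) ** 2 / expected[mask])
    return chi2.sf(stat, df=mask.sum() - 1)

rng = np.random.default_rng(7)
# Synthetic cover whose even/odd counts are unbalanced (as in natural images),
# and a stego version with fully randomized least significant bits
cover = 2 * rng.integers(0, 128, 100_000) + (rng.random(100_000) < 0.7)
stego = (cover & ~1) | rng.integers(0, 2, cover.size)
p_cover, p_stego = chi_square_attack(cover), chi_square_attack(stego)
print(p_cover, p_stego)
```

Modifications like the one proposed in the paper aim to keep the pair counts unbalanced (cover-like), so that this statistic stays uninformative.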
Charged-particle thermonuclear reaction rates: I. Monte Carlo method and statistical distributions
International Nuclear Information System (INIS)
Longland, R.; Iliadis, C.; Champagne, A.E.; Newton, J.R.; Ugalde, C.; Coc, A.; Fitzgerald, R.
2010-01-01
A method based on Monte Carlo techniques is presented for evaluating thermonuclear reaction rates. We begin by reviewing commonly applied procedures and point out that reaction rates that have been reported up to now in the literature have no rigorous statistical meaning. Subsequently, we associate each nuclear physics quantity entering in the calculation of reaction rates with a specific probability density function, including Gaussian, lognormal and chi-squared distributions. Based on these probability density functions the total reaction rate is randomly sampled many times until the required statistical precision is achieved. This procedure results in a median (Monte Carlo) rate which agrees under certain conditions with the commonly reported recommended 'classical' rate. In addition, we present at each temperature a low rate and a high rate, corresponding to the 0.16 and 0.84 quantiles of the cumulative reaction rate distribution. These quantities are in general different from the statistically meaningless 'minimum' (or 'lower limit') and 'maximum' (or 'upper limit') reaction rates which are commonly reported. Furthermore, we approximate the output reaction rate probability density function by a lognormal distribution and present, at each temperature, the lognormal parameters μ and σ. The values of these quantities will be crucial for future Monte Carlo nucleosynthesis studies. Our new reaction rates, appropriate for bare nuclei in the laboratory, are tabulated in the second paper of this issue (Paper II). The nuclear physics input used to derive our reaction rates is presented in the third paper of this issue (Paper III). In the fourth paper of this issue (Paper IV) we compare our new reaction rates to previous results.
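The sampling scheme can be sketched for a single hypothetical narrow resonance: draw each nuclear input from its probability density function, build the resulting rate distribution, report the 0.16/0.50/0.84 quantiles, and approximate the output by a lognormal with parameters μ and σ. All numerical inputs below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
n_samples = 10_000

# Hypothetical inputs: a resonance strength (lognormal uncertainty) and a
# resonance energy (Gaussian uncertainty) in a narrow-resonance rate factor
T9 = 0.1                                                          # temperature in GK
wg = rng.lognormal(mean=np.log(1e-3), sigma=0.3, size=n_samples)  # strength, MeV
Er = rng.normal(0.15, 0.01, size=n_samples)                       # energy, MeV
rate = wg * np.exp(-11.605 * Er / T9)    # rate ∝ ωγ · exp(-11.605 Er / T9)

lo, med, hi = np.quantile(rate, [0.16, 0.50, 0.84])
mu, sigma = np.log(rate).mean(), np.log(rate).std()   # lognormal approximation
print(f"low/median/high rate: {lo:.3e} / {med:.3e} / {hi:.3e}")
print(f"lognormal parameters: mu = {mu:.2f}, sigma = {sigma:.2f}")
```

The low and high rates are the 0.16 and 0.84 quantiles of the sampled distribution, and the (μ, σ) pair summarizes it for downstream nucleosynthesis studies, exactly in the spirit of the method described.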
International Nuclear Information System (INIS)
Anon.
1989-01-01
World data from the United Nation's latest Energy Statistics Yearbook, first published in our last issue, are completed here. The 1984-86 data were revised and 1987 data added for world commercial energy production and consumption, world natural gas plant liquids production, world LP-gas production, imports, exports, and consumption, world residual fuel oil production, imports, exports, and consumption, world lignite production, imports, exports, and consumption, world peat production and consumption, world electricity production, imports, exports, and consumption (Table 80), and world nuclear electric power production
Sheffield, Scott
2009-01-01
In recent years, statistical mechanics has been increasingly recognized as a central domain of mathematics. Major developments include the Schramm-Loewner evolution, which describes two-dimensional phase transitions, random matrix theory, renormalization group theory and the fluctuations of random surfaces described by dimers. The lectures contained in this volume present an introduction to recent mathematical progress in these fields. They are designed for graduate students in mathematics with a strong background in analysis and probability. This book will be of particular interest to graduate students and researchers interested in modern aspects of probability, conformal field theory, percolation, random matrices and stochastic differential equations.
Lundquist, Marjorie
2009-03-01
Since 1999 over 425 in-custody deaths have occurred in the USA after law enforcement officers (LEOs) used an M26 or X26 Taser, causing Amnesty International and the ACLU to call for a moratorium on Taser use until its physiological effects on people have been better studied. A person's Taser dose is defined as the total duration (in seconds) of all Taser shocks received by that person during a given incident. Utilizing the concept of Taser dose for these deaths, TASER International's claim of Taser safety can be treated as a null hypothesis and its validity scientifically tested. Such a test using chi-square as the test statistic is presented. It shows that the null hypothesis should be rejected; i.e., model M26 and X26 Tasers are capable of producing lethal effects non-electrically and so have played a causal or contributory role in a great many of the in-custody deaths following their use. This implies that the Taser is a lethal weapon, and that LEOs have not been adequately trained in its safe use!
Walichiewicz, P; Wodniecki, J; Szczurek-Katański, K; Jacheć, W; Nowalany-Kozielska, E; Trzeciak, P; Janik, J
2001-01-01
In this study we tried to check which clinical data are connected with the choice of treatment in patients with multivessel coronary artery disease. The data of 137 patients with multivessel coronary artery disease were analysed retrospectively. The patients were divided into three groups: treated conservatively, CABG, and PTCA. Multivessel coronary artery disease was recognised when there were atherosclerotic changes in 2 or more vessels of not less than 2 mm in diameter. Patients with previous CABG or left main coronary artery disease were excluded. The data were analysed by means of several methods: variance analysis, correlation analysis, discriminant functions, the chi-square test and Student's t-test. Of statistical significance for treatment decision making in multivessel coronary artery disease were: the state of the left anterior descending artery below the first diagonal branch, the state of the first diagonal branch and peripheral parts of the left anterior descending artery and right coronary artery, the systolic function of the antero-lateral, apical and diaphragmatic segments of the left ventricle, the global left ventricular ejection fraction in angiography and echocardiography, local systolic disturbances of the left ventricle observed in echocardiography, and the coexistence of symptoms of heart failure as well as unstable angina. Treatment decision making will always depend not only on diagnostic procedures but also on all the clinical data about the patient and the experience of the cooperating cardiology and surgery centres.
Clarkson, Garry; Gendron, Guillaume; Libération Newspaper
2016-01-01
Libération (the newspaper founded in Paris by Jean-Paul Sartre and Serge July in 1973) visited North Yorkshire and Leeds to cover the Brexit European Referendum debate. Photographs © Garry Clarkson: Leeds Councillors Dan Cohen and Neil Buckly, a slaughterhouse worker, Steve Hanson, and the environment around Follifoot village, Yorkshire, chosen to show quintessential 'middle England'. http://www.liberation.fr/planete/2016/05/17/brexit-et-au-milieu-coule-l-angleterre_1453266
Directory of Open Access Journals (Sweden)
Peter R. J. North
2013-03-01
Radiative transfer models predicting the bidirectional reflectance factor (BRF) of leaf canopies are powerful tools that relate biophysical parameters such as leaf area index (LAI), fractional vegetation cover (fV), and the fraction of photosynthetically active radiation absorbed by the green parts of the vegetation canopy (fAPAR) to remotely sensed reflectance data. One of the most successful approaches to biophysical parameter estimation is the inversion of detailed radiative transfer models through the construction of Look-Up Tables (LUTs). The solution of the inverse problem requires additional information on canopy structure, soil background, and leaf properties, and the relationships between these parameters and the measured reflectance data are often nonlinear. The commonly used approach for optimization of a solution is based on minimization of the least squares estimate between model and observations (referred to as a cost function or distance; here we will also use the terms “statistical distance”, “divergence”, or “metric”, which are common in the statistical literature). This paper investigates how least-squares minimization and alternative distances affect the solution to the inverse problem. The paper provides a comprehensive list of different cost functions from the statistical literature, which can be divided into three major classes: information measures, M-estimates, and minimum contrast methods. We found that, for the conditions investigated, Least Square Estimation (LSE) is not an optimal statistical distance for the estimation of biophysical parameters. Our results indicate that other statistical distances, such as the two power measures, Hellinger, Pearson chi-squared, Arimoto, and Koenker–Basset distances, result in better estimates of biophysical parameters than LSE; in some cases the parameter estimation was improved by 15%.
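As a rough illustration of the comparison the abstract describes, the sketch below implements three of the named distances (LSE, squared Hellinger, and the Pearson chi-squared distance) and uses each to select the best-matching entry from a toy Look-Up Table. All reflectance values and LAI entries are invented for illustration; the paper's actual models and data are not reproduced here.

```python
import math

def lse(obs, mod):
    # Least-squares estimate: sum of squared differences.
    return sum((o - m) ** 2 for o, m in zip(obs, mod))

def hellinger(obs, mod):
    # Squared Hellinger distance between non-negative reflectance vectors.
    return sum((math.sqrt(o) - math.sqrt(m)) ** 2 for o, m in zip(obs, mod))

def pearson_chi2(obs, mod):
    # Pearson chi-squared distance: residuals weighted by the modelled value.
    return sum((o - m) ** 2 / m for o, m in zip(obs, mod))

def invert(observation, lut, distance):
    # Pick the LUT entry (candidate parameter value) minimising the chosen distance.
    return min(lut, key=lambda params: distance(observation, lut[params]))

# Toy LUT: hypothetical LAI value -> modelled BRF in three wavebands.
lut = {0.5: [0.05, 0.30, 0.20], 2.0: [0.04, 0.45, 0.35], 4.0: [0.03, 0.55, 0.50]}
obs = [0.04, 0.44, 0.36]

for d in (lse, hellinger, pearson_chi2):
    print(d.__name__, invert(obs, lut, d))  # all three pick LAI = 2.0 here
```

On this toy example all metrics agree; the paper's point is that on real, noisy data the choice of distance changes which LUT entry wins.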
International Nuclear Information System (INIS)
Dorr, Kent A.; Ostrom, Michael J.; Freeman-Pollard, Jhivaun R.
2012-01-01
CH2M Hill Plateau Remediation Company (CHPRC) designed, constructed, commissioned, and began operation of the largest groundwater pump and treatment facility in the U.S. Department of Energy's (DOE) nationwide complex. This one-of-a-kind groundwater pump and treatment facility, located at the Hanford Nuclear Reservation Site (Hanford Site) in Washington State, was built in an accelerated manner with American Recovery and Reinvestment Act (ARRA) funds and has attained Leadership in Energy and Environmental Design (LEED) GOLD certification, which makes it the first non-administrative building in the DOE Office of Environmental Management complex to earn such an award. There were many contractual, technical, configuration management, quality, safety, and LEED challenges associated with the design, procurement, construction, and commissioning of this $95 million, 52,000 ft groundwater pump and treatment facility. This paper will present the Project and LEED accomplishments, as well as Lessons Learned by CHPRC when additional ARRA funds were used to accelerate design, procurement, construction, and commissioning of the 200 West Groundwater Pump and Treatment (2W P&T) Facility to meet DOE's mission of treating contaminated groundwater at the Hanford Site with a new facility by June 28, 2012.
Energy Technology Data Exchange (ETDEWEB)
Dorr, Kent A. [CH2M HILL Plateau Remediation Company, Richland, WA (United States); Ostrom, Michael J. [CH2M HILL Plateau Remediation Company, Richland, WA (United States); Freeman-Pollard, Jhivaun R. [CH2M HILL Plateau Remediation Company, Richland, WA (United States)
2012-11-14
CH2M Hill Plateau Remediation Company (CHPRC) designed, constructed, commissioned, and began operation of the largest groundwater pump and treatment facility in the U.S. Department of Energy's (DOE) nationwide complex. This one-of-a-kind groundwater pump and treatment facility, located at the Hanford Nuclear Reservation Site (Hanford Site) in Washington State, was built in an accelerated manner with American Recovery and Reinvestment Act (ARRA) funds and has attained Leadership in Energy and Environmental Design (LEED) GOLD certification, which makes it the first non-administrative building in the DOE Office of Environmental Management complex to earn such an award. There were many contractual, technical, configuration management, quality, safety, and LEED challenges associated with the design, procurement, construction, and commissioning of this $95 million, 52,000 ft groundwater pump and treatment facility. This paper will present the Project and LEED accomplishments, as well as Lessons Learned by CHPRC when additional ARRA funds were used to accelerate design, procurement, construction, and commissioning of the 200 West Groundwater Pump and Treatment (2W P&T) Facility to meet DOE's mission of treating contaminated groundwater at the Hanford Site with a new facility by June 28, 2012.
Energy Technology Data Exchange (ETDEWEB)
Hoffer, Saskia [Univ. of California, Berkeley, CA (United States)
2002-01-01
Electron-based surface probing techniques can provide detailed information about surface structure or chemical composition in vacuum environments. The development of new surface techniques has made possible in situ molecular-level studies of solid-gas interfaces and, more recently, solid-liquid interfaces. The aim of this dissertation is two-fold. First, by using novel sample preparation, Low Energy Electron Diffraction (LEED) and other traditional ultra high vacuum (UHV) techniques are shown to provide new information on the insulator/vacuum interface. The surface structure of the classic insulator NaCl has been determined using these methods. Second, sum frequency generation (SFG), a surface-specific vibrational spectroscopy, was used to study both the biopolymer/air and electrode/electrolyte interfaces. The surface structure and composition of polyetherurethane-silicone copolymers were determined in air using SFG, atomic force microscopy (AFM), and X-ray photoelectron spectroscopy (XPS). SFG studies of the electrode (platinum, gold and copper)/electrolyte interface were performed as a function of applied potential in an electrochemical cell.
Rollin, Henry R; Reynolds, Edward H
2018-01-01
In the late-eighteenth and nineteenth centuries, a more humane approach to the care of the insane in Britain was catalyzed in part by the illness of King George III. The Reform Movement envisaged "moral" treatment in asylums in pleasant rural environments, but these aspirations were overwhelmed by industrialization, urbanization, and the scale of the need, such that most asylums became gigantic institutions for chronic insanity. Three institutions in Yorkshire remained beacons of enlightenment in the general gloom of Victorian alienism: the Retreat in York founded and developed by the Quaker Tuke family; the West Riding Lunatic Asylum in Wakefield led by Sir James Crichton-Browne, which initiated research into brain and mental diseases; and the Leeds Medical School and Wakefield axis associated with Sir Thomas Clifford Allbutt, which pioneered teaching of mental diseases and, later, the first Chair of Psychiatry. Three other Yorkshiremen who greatly influenced nineteenth-century "neuropsychiatry" in Britain and abroad were Thomas Laycock in York and Edinburgh, and Henry Maudsley and John Hughlings Jackson in London.
Santana, Jamilly C. V.; Santos, Victor S.; Gurgel, Ricardo Q.; Santana, Julianne C. V.; Reis, Francisco P.; Cuevas, Luis E.; Feitosa, Vera L. C.
2016-01-01
Neuropathic pain (NP) often occurs during the course of leprosy, and screening tools to differentiate NP from non-NP are often used. However, their performance varies in different settings. The most frequently used scales are the Douleur Neuropathique in 4 questions (DN4) and the Leeds assessment of neuropathic symptoms and signs (LANSS) questionnaires. Thus, we conducted a study to evaluate the agreement between DN4 and LANSS questionnaires to classify NP in 195 leprosy patients attending two reference centers in Sergipe, Brazil. The DN4 and LANSS classified 166 and 110 patients, respectively, as having NP. One hundred and seven (54.8%) were classified as NP by both questionnaires; 59 (30.2%) solely by the DN4 questionnaire and three (1.5%) solely by the LANSS. The agreement of the questionnaires was 66.2% (weak agreement, Kappa = 0.30). Although both questionnaires identified a high proportion of NP, the development of more robust instruments is necessary to ensure the accuracy of diagnosis of leprosy patients classified as having NP. PMID:27458041
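The weak agreement (Kappa = 0.30) can be reproduced from the counts given in the abstract. The sketch below assumes the remaining patients (195 − 107 − 59 − 3 = 26) were classified as non-NP by both questionnaires; that completion of the 2×2 table is an inference, not stated explicitly in the abstract.

```python
# 2x2 agreement table reconstructed from the abstract:
# 107 classified NP by both, 59 by DN4 only, 3 by LANSS only, remainder by neither.
both, dn4_only, lanss_only, n = 107, 59, 3, 195
neither = n - both - dn4_only - lanss_only  # assumed: negative on both scales

po = (both + neither) / n                    # observed agreement
dn4_yes = (both + dn4_only) / n              # marginal: DN4-positive rate
lanss_yes = (both + lanss_only) / n          # marginal: LANSS-positive rate
pe = dn4_yes * lanss_yes + (1 - dn4_yes) * (1 - lanss_yes)  # chance agreement
kappa = (po - pe) / (1 - pe)
print(round(kappa, 2))  # ≈ 0.30, the weak agreement reported
```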
Sherief, Tamer I; Naguib, Ashraf M; Sefton, Graham K
2005-08-01
This study was carried out to assess the results of reconstructing a deficient extensor mechanism in the presence of a total knee replacement (TKR) using the Leeds-Keio Connective Tissue Prosthesis (L-K CTP). The L-K CTP is available as flat polyester tapes in an open-weave structure. It was used to reinforce and reconstruct the deficient extensor mechanism in three patients who had undergone, or were about to undergo, total knee replacement. Two cases had extensor mechanism deficiency as a complication following total knee replacement, while the third had extensor mechanism deficiency at the time of the primary knee replacement. The average follow-up was 2 years (range, 12 to 48 months). All three cases showed good results, with no extension lag and a good range of movement at follow-up. The use of the L-K CTP for reconstruction of the knee extensor mechanism offers a good option for the management of the uncommon but difficult problem of extensor mechanism deficiency in patients with a total knee replacement.
Directory of Open Access Journals (Sweden)
Jasmin Kientzel
2011-12-01
Voluntary environmental programs (VEPs) have become increasingly popular around the world to address energy efficiency issues that mandatory building codes have not been able to tackle. Even though the utility of voluntary schemes is widely debated, they have become a de facto reality for many professionals in the building and construction sector. One topic that is neglected, however, in both academic and policy discussions is how professionals (architects, engineers, real estate developers, etc.) perceive the rise of voluntary rating schemes. In order to fill this gap in the literature, the present study investigates beliefs underlying adoption behavior regarding one of the most prominent voluntary assessment and certification programs in the U.S. building industry, the Leadership in Energy and Environmental Design (LEED) scheme. In this paper, an elicitation study based on 14 semi-structured interviews with building professionals in the North East of the United States was conducted to analyze this question. Building on the Reasoned Action Approach, this paper shows that, in addition to more conventional factors such as financial calculations and marketing aspects, understanding the beliefs held by building professionals offers important insights into their decisions to work with voluntary environmental assessment and rating programs.
Energy Technology Data Exchange (ETDEWEB)
Pena Barrera, Leticia [Universidad Autonoma de Ciudad Juarez, Chihuahua (Mexico)
2009-01-15
This article presents an analysis of the urban design of a residential development using the indicators of Leadership in Energy and Environmental Design for Neighborhood Development (LEED-ND). The advantages in terms of impact are established and the limitations pointed out, evaluating the efficiency of applying the indicators to improve performance and energy saving. Based on the analysis of the development under study, some sound urban design solutions are identified that should be incorporated into the standards in force. Nevertheless, following the same company through other developments shows that the proposals are adopted not as a planning strategy of its own but only to fulfill the requirements requested, yielding a result with smaller impact, and as an index that may offer housing alternatives in the city tending toward sustainable development.
Barbosa, Fabiano Timbó; de Souza, Diego Agra
2010-01-01
Statistical analysis is necessary for adequate evaluation of an original article by the reader, allowing him or her to better visualize and comprehend the results. The objective of the present study was to determine the frequency of the adequate use of statistical tests in original articles published in the Revista Brasileira de Anestesiologia from January 2008 to December 2009. Original articles published in the Revista Brasileira de Anestesiologia between January 2008 and December 2009 were selected. The use of statistical tests was deemed appropriate when the selection of the tests was adequate for continuous and categorical variables and for parametric and non-parametric tests, the correction factor was described when the use of multiple comparisons was reported, and the specific use of a statistical test for analysis of one variable was mentioned. Seventy-six original articles, containing a total of 179 statistical tests, were selected. The most frequently used statistical tests were: Chi-square 20.11%, Student t test 19.55%, ANOVA 10.05%, and Fisher exact test 9.49%. The frequency of the adequate use of statistical tests was 56.42% (95% CI 49.16% to 63.68%), erroneous use 13.41% (95% CI 8.42% to 18.40%), and an inconclusive result 30.16% (95% CI 23.44% to 36.88%). The frequency of adequate use of statistical tests in the articles published by the Revista Brasileira de Anestesiologia between January 2008 and December 2009 was 56.42%. Copyright © 2010 Elsevier Editora Ltda. All rights reserved.
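The reported confidence intervals are consistent with a normal-approximation (Wald) interval; a minimal sketch, assuming the denominator is the 179 statistical tests examined:

```python
import math

def wald_ci(p, n, z=1.96):
    # Normal-approximation (Wald) 95% confidence interval for a proportion.
    se = math.sqrt(p * (1 - p) / n)
    return p - z * se, p + z * se

# 56.42% adequate use; assumed denominator: the 179 tests examined.
lo, hi = wald_ci(0.5642, 179)
print(f"{lo:.2%} to {hi:.2%}")  # ≈ 49.16% to 63.68%, matching the reported interval
```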
Current developments in the study of the association between qualitative variables [Attuali sviluppi nello studio dell'associazione tra variabili di tipo qualitativo].
Directory of Open Access Journals (Sweden)
C Antonelli
1996-02-01
The most common procedure for analyzing contingency table data uses the chi-square statistic. The early development of chi-square analysis of contingency tables is credited to Pearson (1904) and Fisher (1929), and was successively expanded by Yates, Mantel-Haenszel, and others. In this paper some developments of the chi-square function are outlined, particularly its use as a statistical test for the null hypothesis of independence, for subdividing contingency tables, and as a test for linear association between ordinal variables.
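A minimal sketch of the Pearson chi-squared statistic for a contingency table, computed from first principles (the example table is invented for illustration):

```python
def pearson_chi2(table):
    # Pearson's chi-squared statistic for an r x c contingency table,
    # with expected counts derived from the marginal totals.
    n = sum(sum(row) for row in table)
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_tot[i] * col_tot[j] / n  # expected count under independence
            chi2 += (obs - exp) ** 2 / exp
    df = (len(row_tot) - 1) * (len(col_tot) - 1)
    return chi2, df

chi2, df = pearson_chi2([[10, 20], [30, 40]])
print(round(chi2, 3), df)  # 0.794 1
```

The statistic is then compared against the chi-squared distribution with `df` degrees of freedom to test the null hypothesis of independence.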
Isomura, Tatsuya; Sumitani, Masahiko; Matsudaira, Ko; Kawaguchi, Mika; Inoue, Reo; Hozumi, Jun; Tanaka, Takeyuki; Oshima, Hirofumi; Mori, Kanto; Taketomi, Shuji; Inui, Hiroshi; Tahara, Keitaro; Yamagami, Ryota; Hayakawa, Kazuhiro
2017-07-01
We aimed to assess the diagnostic utility of the linguistically validated Japanese version of the Leeds Assessment of Neuropathic Symptoms and Signs Pain Scale (LANSS-J) as a screening tool for neuropathic pain in the clinical setting. Patients with neuropathic pain or nociceptive pain who were 20 to 85 years of age were included. Sensitivity and specificity using the original cutoff value of 12 were assessed to evaluate the diagnostic utility of the LANSS-J. Sensitivity and specificity with possible cutoff values were calculated, along with area under the receiver operating characteristic curve. We then evaluated agreement regarding assessment of the LANSS-J by two investigators. We used the intraclass correlation coefficient (ICC) for the total score and Cohen's kappa coefficient for each item. Data for patients with neuropathic pain (n = 30) and those with nociceptive pain (n = 29) were analyzed. With a cutoff of 12, the sensitivity was 63.3% (19/30) and the specificity 93.1% (27/29). Sensitivity improved substantially with a cutoff of ≤ 11 (≥ 83.3%, 25/30). High specificity (93.1%, 27/29) was sustained with a cutoff of 9 to 12. The ICC for the total score was 0.85, indicating sufficient agreement. Kappa coefficients ranged from 0.68 to 0.84. The LANSS-J is a valid screening tool for detecting neuropathic pain. Our results suggest that employing the original cutoff value provides high specificity, although a lower cutoff value of 10 or 11 (with its high specificity maintained) may be more beneficial when pain attributed to neuropathic mechanisms is suspected in Japanese patients. © 2016 World Institute of Pain.
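The reported sensitivity and specificity at the original cutoff of 12 follow directly from the counts in the abstract:

```python
def sens_spec(tp, fn, tn, fp):
    # Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP).
    return tp / (tp + fn), tn / (tn + fp)

# Counts from the abstract: 19 of 30 neuropathic-pain patients screened
# positive; 27 of 29 nociceptive-pain patients screened negative.
sens, spec = sens_spec(tp=19, fn=11, tn=27, fp=2)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}")  # 63.3% and 93.1%
```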
Fairhurst, Caroline; Böhnke, Jan R; Gabe, Rhian; Croudace, Tim J; Tober, Gillian; Raistrick, Duncan
2014-11-01
To examine the relationship between three outcome measures used by a specialist addiction service (UK): the Leeds Dependence Questionnaire (LDQ), the Social Satisfaction Questionnaire (SSQ) and the 10-item Clinical Outcomes in Routine Evaluation (CORE-10). A clinical sample of 715 service user records was extracted from a specialist addiction service (2011) database. The LDQ (dependence), SSQ (social satisfaction) and CORE-10 (psychological distress) were routinely administered at the start of treatment and again between 3 and 12 months post-treatment. A mixed pre/post-treatment dataset of 526 service users was subjected to exploratory factor analysis. Parallel Analysis and the Hull method were used to suggest the most parsimonious factor solution. Exploratory factor analysis with three factors accounted for 66.2% of the total variance but Parallel Analysis supported two factors as sufficient to account for observed correlations among items. In the two-factor solution, LDQ items and nine of the 10 CORE-10 items loaded on the first factor >0.41, and the SSQ items on factor 2 with loadings >0.63. A two dimensional summary appears sufficient and clinically meaningful. Among specialist addiction service users, social satisfaction appears to be a unique construct of addiction and is not the same as variation due to psychological distress or dependence. Our interpretation of the findings is that dependence is best thought of as a specific psychological condition subsumed under the construct psychological distress. © 2014 The Authors. Drug and Alcohol Review published by Wiley Publishing Asia Pty Ltd on behalf of Australasian Professional Society on Alcohol and other Drugs.
Spanos, Konstantinos; Lachanas, Vasileios A; Chan, Philip; Bargiota, Alexandra; Giannoukas, Athanasios D
2015-01-01
One of the diagnostic tools for neuropathic pain (NP) relies on screening questionnaires, including the Leeds Assessment of Neuropathic Symptoms and Signs (LANSS) questionnaire. We aimed to apply and validate the LANSS questionnaire in a Greek population, and to assess any correlation between the LANSS score and visual analog pain scales. A prospective instrument validation study of the LANSS was conducted at the University Hospital of Larissa on 70 patients (35 NP and 35 nociceptive pain) from April 2015 to June 2015. Visual analog pain scales (VAS-ADL, impact of pain on daily living activities; VAS-INT, pain intensity) were also assessed and correlated with the LANSS scale. The mean age of the NP and nociceptive pain groups was 67.11±10.05 and 39.14±17.07 years, respectively. The mean LANSS score was 12.84 (±9.27) in the initial test and 12.54 (±9.41) in the retest evaluation. Cronbach's alpha was 0.895 and 0.901 at the initial and retest examinations, respectively, both values indicating good internal consistency. The NP group had a significantly higher LANSS score than the nociceptive pain group (21.34 [±1.39] vs 4.34 [±4.86]). The sensitivity of the LANSS questionnaire to distinguish neuropathic from nociceptive pain was 94.29% (95% CI: 80.81-99.13%), while its specificity was 88.57% (95% CI: 73.24-96.73%). A significant correlation was noticed between the total LANSS score and VAS-ADL (initial r=0.248, p<0.05; retest evaluation r=0.288, p<0.05). The LANSS score is a reliable and valuable instrument to assess neuropathic pain in diabetic patients and to differentiate it from nociceptive pain in a Greek population. In diabetic patients the LANSS score is associated with impact on daily activities and potentially with quality of life. Copyright © 2015 Elsevier Inc. All rights reserved.
Valdés-Stauber, J; Valdés-Stauber, M A
2015-10-01
To draft a clinical profile of mentally ill first-generation Spanish immigrants in Germany treated in a special setting in their native language, and to identify possible correlations between the time of onset of a mental disorder and migration, as well as between the degree of utilization and clinical and care variables. Statistical reanalysis of individual data (n = 100) from a previously published descriptive study with aggregated data corresponding to 15 variables. Correlations are calculated using the chi-square test as well as Fisher's exact test. Multivariate regression and logistic models were conducted. In addition to the explained variance of the models (R(2)), analyses of residuals as well as post-hoc power analyses (1-β) were performed. A quarter of the sample (26%) was mentally ill before migration; most of the patients received treatment very late (about 10 years after onset) and became chronically ill. Half of the sample shows relevant somatic comorbidity and long average inpatient stays (54 days). In 16% of treated cases, repatriation had to be organized. The degree of chronicity correlates with mental illness prior to migration. Severe mood disorders and psychoses occur long after migration; addictions and neurotic disorders are equally distributed over time. Migration cannot be set in a causal relationship with the development of mental disorders, although there is a positive correlation between affective disorders and the duration of the migration status. Chronicity is related to an outbreak of the disease before migration. The sample is relatively homogeneous (one nationality, first generation), but loses epidemiological representativeness (not related to a catchment area). © Georg Thieme Verlag KG Stuttgart · New York.
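For small subgroups like those analysed here, Fisher's exact test is the usual complement to chi-square. A minimal one-sided implementation from first principles, on an invented 2×2 table (not data from the study):

```python
from math import comb

def fisher_exact_one_sided(a, b, c, d):
    # One-sided Fisher's exact p-value for the 2x2 table [[a, b], [c, d]]:
    # the probability, with all margins fixed, of observing a table at
    # least as extreme (a' <= a) under the hypergeometric distribution.
    n = a + b + c + d
    r1, c1 = a + b, a + c
    def p_table(x):
        return comb(c1, x) * comb(n - c1, r1 - x) / comb(n, r1)
    return sum(p_table(x) for x in range(0, a + 1))

p = fisher_exact_one_sided(1, 9, 11, 3)
print(round(p, 5))  # ≈ 0.00138
```

In practice scipy.stats.fisher_exact is used instead; this sketch only shows what that test computes.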
Bichler, Andrea; Neumaier, Arnold; Hofmann, Thilo
2014-11-01
Microbial contamination of groundwater used for drinking water can affect public health and is of major concern to local water authorities and water suppliers. Potential hazards need to be identified in order to protect raw water resources. We propose a non-parametric data mining technique for exploring the presence of total coliforms (TC) in a groundwater abstraction well and its relationship to readily available, continuous time series of hydrometric monitoring parameters (seven year records of precipitation, river water levels, and groundwater heads). The original monitoring parameters were used to create an extensive generic dataset of explanatory variables by considering different accumulation or averaging periods, as well as temporal offsets of the explanatory variables. A classification tree based on the Chi-Squared Automatic Interaction Detection (CHAID) recursive partitioning algorithm revealed statistically significant relationships between precipitation and the presence of TC in both a production well and a nearby monitoring well. Different secondary explanatory variables were identified for the two wells. Elevated water levels and short-term water table fluctuations in the nearby river were found to be associated with TC in the observation well. The presence of TC in the production well was found to relate to elevated groundwater heads and fluctuations in groundwater levels. The generic variables created proved useful for increasing significance levels. The tree-based model was used to predict the occurrence of TC on the basis of hydrometric variables.
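CHAID's core idea is to choose, at each node, the explanatory variable whose cross-tabulation with the target has the largest chi-squared statistic. The sketch below shows only that split-selection step, on invented counts (rows = predictor yes/no, columns = TC present/absent); it omits CHAID's category merging and Bonferroni adjustment.

```python
def chi2_2x2(a, b, c, d):
    # Pearson chi-squared statistic for the 2x2 table [[a, b], [c, d]].
    n = a + b + c + d
    exp = [(a + b) * (a + c) / n, (a + b) * (b + d) / n,
           (c + d) * (a + c) / n, (c + d) * (b + d) / n]
    return sum((o - e) ** 2 / e for o, e in zip((a, b, c, d), exp))

# Hypothetical candidate predictors with counts (TC+, TC-) by predictor yes/no.
candidates = {
    "heavy_rain_prev_72h": (18, 12, 7, 63),   # TC strongly tied to rainfall
    "river_level_high":    (12, 18, 13, 57),  # weaker association
}
scores = {name: chi2_2x2(*t) for name, t in candidates.items()}
best = max(scores, key=scores.get)
print(best)  # the variable a CHAID-style tree would split on first
```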
Sampling, Probability Models and Statistical Reasoning: Statistical Inference
Indian Academy of Sciences (India)
Mohan Delampady, V R Padmawar. General Article. Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp. 49–58.
Statistics for dental researchers: descriptive statistics
Mohammad Reza Baneshi PhD; Amir Reza Ghassemi DDS; Arash Shahravan DDS, MS
2012-01-01
Descriptive statistics is the process of summarizing raw data gathered from a research study and creating useful statistics, which help the better understanding of the data. According to the types of variables, which consist of qualitative and quantitative variables, some descriptive statistics have been introduced. Frequency percentage is used for qualitative data, and mean, median, mode, standard deviation, standard error, variance, and range are some of the statistics which are used for quantitative data...
National Statistical Commission and Indian Official Statistics*
Indian Academy of Sciences (India)
IAS Admin
The National Statistical Commission, which came into existence in July 2006, is mandated, among its functions, to exercise statistical co-ordination between Ministries, Departments and other agencies of the Central government ... between the Directorate General of Commercial Intelligence and Statistics ... in some states do not play a nodal role in the coordination of statistical ...
National Statistical Commission and Indian Official Statistics*
Indian Academy of Sciences (India)
IAS Admin
Advanced Institute of Maths, Stats and Computer Science, UoH Campus, Hyderabad. His research interests include theory and practice of sample surveys. ... other agencies of the Central government; and to exercise statistical audit over the statistical activities to ensure quality and integrity of the statistical products.
International Nuclear Information System (INIS)
Dorr, Kent A.; Freeman-Pollard, Jhivaun R.; Ostrom, Michael J.
2013-01-01
CH2M Hill Plateau Remediation Company (CHPRC) designed, constructed, commissioned, and began operation of the largest groundwater pump and treatment facility in the U.S. Department of Energy's (DOE) nationwide complex. This one-of-a-kind groundwater pump and treatment facility, located at the Hanford Nuclear Reservation Site (Hanford Site) in Washington State, was built to an accelerated schedule with American Recovery and Reinvestment Act (ARRA) funds. There were many contractual, technical, configuration management, quality, safety, and Leadership in Energy and Environmental Design (LEED) challenges associated with the design, procurement, construction, and commissioning of this $95 million, 52,000 ft groundwater pump and treatment facility to meet DOE's mission objective of treating contaminated groundwater at the Hanford Site with a new facility by June 28, 2012. The project team's successful integration of the project's core values and green energy technology throughout design, procurement, construction, and start-up of this complex, first-of-its-kind Bio Process facility resulted in successful achievement of DOE's mission objective, as well as attainment of LEED GOLD certification (Figure 1), which makes this Bio Process facility the first non-administrative building in the DOE Office of Environmental Management complex to earn such an award. (authors)
Energy Technology Data Exchange (ETDEWEB)
Dorr, Kent A.; Ostrom, Michael J.; Freeman-Pollard, Jhivaun R.
2013-01-11
CH2M Hill Plateau Remediation Company (CHPRC) designed, constructed, commissioned, and began operation of the largest groundwater pump and treatment facility in the U.S. Department of Energy’s (DOE) nationwide complex. This one-of-a-kind groundwater pump and treatment facility, located at the Hanford Nuclear Reservation Site (Hanford Site) in Washington State, was built to an accelerated schedule with American Recovery and Reinvestment Act (ARRA) funds. There were many contractual, technical, configuration management, quality, safety, and Leadership in Energy and Environmental Design (LEED) challenges associated with the design, procurement, construction, and commissioning of this $95 million, 52,000 ft groundwater pump and treatment facility to meet DOE’s mission objective of treating contaminated groundwater at the Hanford Site with a new facility by June 28, 2012. The project team’s successful integration of the project’s core values and green energy technology throughout design, procurement, construction, and start-up of this complex, first-of-its-kind Bio Process facility resulted in successful achievement of DOE’s mission objective, as well as attainment of LEED GOLD certification, which makes this Bio Process facility the first non-administrative building in the DOE Office of Environmental Management complex to earn such an award.
Energy Technology Data Exchange (ETDEWEB)
Dorr, Kent A.; Freeman-Pollard, Jhivaun R.; Ostrom, Michael J. [CH2M HILL Plateau Remediation Company, P.O. Box 1600, MSIN R4-41, 99352 (United States)
2013-07-01
CH2M Hill Plateau Remediation Company (CHPRC) designed, constructed, commissioned, and began operation of the largest groundwater pump and treatment facility in the U.S. Department of Energy's (DOE) nationwide complex. This one-of-a-kind groundwater pump and treatment facility, located at the Hanford Nuclear Reservation Site (Hanford Site) in Washington State, was built to an accelerated schedule with American Recovery and Reinvestment Act (ARRA) funds. There were many contractual, technical, configuration management, quality, safety, and Leadership in Energy and Environmental Design (LEED) challenges associated with the design, procurement, construction, and commissioning of this $95 million, 52,000 ft groundwater pump and treatment facility to meet DOE's mission objective of treating contaminated groundwater at the Hanford Site with a new facility by June 28, 2012. The project team's successful integration of the project's core values and green energy technology throughout design, procurement, construction, and start-up of this complex, first-of-its-kind Bio Process facility resulted in successful achievement of DOE's mission objective, as well as attainment of LEED GOLD certification (Figure 1), which makes this Bio Process facility the first non-administrative building in the DOE Office of Environmental Management complex to earn such an award. (authors)
Statistics for dental researchers: descriptive statistics
Directory of Open Access Journals (Sweden)
Mohammad Reza Baneshi PhD
2012-09-01
Full Text Available Descriptive statistics is the process of summarizing raw data gathered from a research study and creating useful statistics, which help the better understanding of data. According to the types of variables, which consist of qualitative and quantitative variables, some descriptive statistics have been introduced. Frequency percentage is used with qualitative data, while mean, median, mode, standard deviation, standard error, variance, and range are some of the statistics used with quantitative data. In health sciences, the majority of continuous variables follow a normal distribution. Skewness and kurtosis are two statistics which help to compare a given distribution with the normal distribution.
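As a minimal illustration of the quantitative summaries listed above, the sketch below computes them with Python's standard `statistics` module; the blood-pressure readings are made-up example data, not from the article.

```python
import statistics

# Hypothetical sample: eight systolic blood pressure readings (mmHg).
data = [118, 122, 130, 118, 125, 140, 118, 127]

print(statistics.mean(data))             # arithmetic mean -> 124.75
print(statistics.median(data))           # middle value of the sorted data -> 123.5
print(statistics.mode(data))             # most frequent value -> 118
print(round(statistics.stdev(data), 2))  # sample standard deviation -> 7.65
print(max(data) - min(data))             # range -> 22
```

Skewness and kurtosis are not in the standard-library module; `scipy.stats.skew` and `scipy.stats.kurtosis` are common choices when SciPy is available.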
Algebraic statistics computational commutative algebra in statistics
Pistone, Giovanni; Wynn, Henry P
2000-01-01
Written by pioneers in this exciting new field, Algebraic Statistics introduces the application of polynomial algebra to experimental design, discrete probability, and statistics. It begins with an introduction to Gröbner bases and a thorough description of their applications to experimental design. A special chapter covers the binary case with new application to coherent systems in reliability and two level factorial designs. The work paves the way, in the last two chapters, for the application of computer algebra to discrete probability and statistical modelling through the important concept of an algebraic statistical model. As the first book on the subject, Algebraic Statistics presents many opportunities for spin-off research and applications and should become a landmark work welcomed by both the statistical community and its relatives in mathematics and computer science.
Adrenal Gland Tumors: Statistics
Neuroendocrine Tumor: Statistics
State transportation statistics 2009
2009-01-01
The Bureau of Transportation Statistics (BTS), part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2009, a statistical profile of transportation in the 50 states and the District of Col...
BTS statistical standards manual
2005-10-01
The Bureau of Transportation Statistics (BTS), like other federal statistical agencies, establishes professional standards to guide the methods and procedures for the collection, processing, storage, and presentation of statistical data. Standards an...
Savage, Leonard J
1972-01-01
Classic analysis of the foundations of statistics and development of personal probability, one of the greatest controversies in modern statistical thought. Revised edition. Calculus, probability, statistics, and Boolean algebra are recommended.
Padilla, I. Y.; Rivera, V. L.; Macchiavelli, R. E.; Torres Torres, N. I.
2016-12-01
Groundwater systems in karst regions are highly vulnerable to contamination and have an enormous capacity to store and rapidly convey pollutants to potential exposure zones over long periods of time. Contaminants in karst aquifers used for drinking water purposes can, therefore, enter distribution lines and the tap water point of use. This study applies spatial and statistical analytical methods to assess potential correlations between contaminants in a karst groundwater system in northern Puerto Rico and exposure in the tap water. It focuses on chlorinated volatile organic compounds (CVOC) and phthalates because of their ubiquitous presence in the environment and their potential public health impacts. The work integrates historical data collected from regulatory agencies and current field measurements involving groundwater and tap water sampling and analysis. Contaminant distribution and cluster analyses are performed with Geographic Information System technology. Correlations between detection frequencies and contaminant concentrations in source groundwater and at the tap water point of use are assessed using Pearson's chi-square and t-test analyses. Although results indicate that correlations are contaminant-specific, detection frequencies are generally higher for total CVOC in groundwater than in tap water samples, but greater for phthalates in tap water than in groundwater samples. Spatial analysis shows widespread distribution of CVOC and phthalates in both groundwater and tap water, suggesting that contamination comes from multiple sources. Spatial correlation analysis indicates that the association between tap water and groundwater contamination depends on the source and type of contaminants, spatial location, and time. A full description of the correlations may, however, need to take into consideration variable anthropogenic interventions.
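The Pearson chi-square test of association used in studies like this can be sketched on a small 2x2 contingency table; the detection counts below are illustrative only, not the study's data.

```python
# Minimal Pearson chi-square for a 2x2 contingency table:
# detections vs. non-detections of a contaminant by sample source.
observed = [[30, 70],   # groundwater: detected, not detected (illustrative)
            [18, 82]]   # tap water:   detected, not detected (illustrative)

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
n = sum(row_totals)

# Expected count under independence: (row total * column total) / n
chi2 = 0.0
for i, row in enumerate(observed):
    for j, obs in enumerate(row):
        exp = row_totals[i] * col_totals[j] / n
        chi2 += (obs - exp) ** 2 / exp

print(round(chi2, 3))  # -> 3.947
```

Where SciPy is available, the same table can be checked against `scipy.stats.chi2_contingency(observed, correction=False)`.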
Was I Right? Testing Observations against Predictions in Mendelian Genetics.
Hursch, Thomas Mercer
1979-01-01
Two statistical tools, the Chi-square and standard error approaches, are compared for use in Mendelian genetics experiments. Although the Chi-square technique is more often used, the standard error approach is to be preferred for both research investigations and student experiments. (BB)
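The chi-square technique referred to above is, for Mendelian data, a goodness-of-fit test against the expected phenotype ratio. A minimal sketch, using made-up classroom counts rather than the paper's data:

```python
# Chi-square goodness-of-fit for a Mendelian 3:1 monohybrid ratio.
observed = [290, 110]                        # dominant, recessive counts (hypothetical)
total = sum(observed)
expected = [total * 3 / 4, total * 1 / 4]    # 3:1 Mendelian expectation

chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
print(round(chi2, 3))  # -> 1.333
```

With one degree of freedom the 5% critical value is 3.841, so these illustrative counts would not lead to rejecting the 3:1 model.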
Generalized quantum statistics
International Nuclear Information System (INIS)
Chou, C.
1992-01-01
In the paper, a non-anyonic generalization of quantum statistics is presented, in which Fermi-Dirac statistics (FDS) and Bose-Einstein statistics (BES) appear as two special cases. The new quantum statistics, which is characterized by the dimension of its single particle Fock space, contains three consistent parts, namely the generalized bilinear quantization, the generalized quantum mechanical description and the corresponding statistical mechanics
Pater, Liana; Miclea, Şerban; Izvercian, Monica
2016-06-01
This paper considers the impact of SMEs' annual turnover upon their marketing activities (in terms of marketing responsibility, strategic planning and budgeting). Empirical results and literature reviews reveal that SME managers tend to engage in planned and profitable marketing activities, depending on their turnover level. Thus, using the data collected from 131 Romanian SME managers, we have applied the chi-square test in order to validate or invalidate three research assumptions (hypotheses), formulated starting from the empirical and literature findings.
Richter, Karoline
2014-01-01
Abstract: This study set out to perform both a quantitative and a qualitative analysis of the bidding process for a building under construction at an IFE (federal educational institution) that seeks LEED environmental certification. To carry out the work, a broader study was first conducted of the environmental certification systems existing around the world and of which are most widespread nationally. Among these, it was determined which certification had criteria that could be adapted to...
Directory of Open Access Journals (Sweden)
Vanessa Gomes da Silva
Full Text Available In the civil construction sector, the pursuit of sustainability is still incipient. This study aims to analyze the application of LEED TM certification to Brazilian developments, understanding the restrictions on its use in a reality different from the original North American reference, and to disseminate the results obtained for two national case studies. The results confirmed that (a) LEED TM certification is not an easy task and, in Brazil, means, in several respects, jumping from a complete absence of reference standards to complying with American ones; and (b) even in advanced centers of Brazilian civil construction, the market is not yet prepared for international "green seals". Despite the existing obstacles and limitations, the arrival of certification schemes in Brazil opens the discussion of subjects never before addressed. Certifications, voluntary initiatives and market-pull instruments play an important role as drivers of the transformation of the construction market in the pursuit of sustainability. It must not be forgotten that these presuppose a prior base, composed of research and development and of the transfer of knowledge and technology to the market, in order to develop fully. Rushing toward certification without preparing the market should be avoided; otherwise, there is a risk of weakening the transformative role of certifications.
Coronary Heart Disease Risk Profile among University of Kwa-Zulu ...
African Journals Online (AJOL)
indirectly estimate VO2 max). Descriptive statistics, which included mode, mean, frequency and percentages, and inferential statistics comprising chi-square (p ≤ 0.05) were employed in the statistical analysis. All participants reported no history of ...
Industrial statistics with Minitab
Cintas, Pere Grima; Llabres, Xavier Tort-Martorell
2012-01-01
Industrial Statistics with MINITAB demonstrates the use of MINITAB as a tool for performing statistical analysis in an industrial context. This book covers introductory industrial statistics, exploring the most commonly used techniques alongside those that serve to give an overview of more complex issues. A plethora of examples in MINITAB are featured along with case studies for each of the statistical techniques presented. Industrial Statistics with MINITAB: Provides comprehensive coverage of user-friendly practical guidance to the essential statistical methods applied in industry. Explores
Rumsey, Deborah
2011-01-01
The fun and easy way to get down to business with statistics. Stymied by statistics? No fear: this friendly guide offers clear, practical explanations of statistical ideas, techniques, formulas, and calculations, with lots of examples that show you how these concepts apply to your everyday life. Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tracks to a typical first semester statistics cou
Li, Zhen; Li, Qiong; Shen, Yaqi; Li, Anqin; Li, Haojie; Liang, Lili; Hu, Yao; Hu, Xuemei; Hu, Daoyu
2016-09-01
The aim of this study was to investigate the effect of using low tube voltage, low-concentration contrast media and adaptive statistical iterative reconstruction (ASIR) for reducing the radiation and iodine contrast doses in adrenal and nephrogenic hypertension patients. A total of 148 hypertension patients who were suspected for adrenal lesions or renal artery stenoses were assigned to two groups. Group A (n=74) underwent a low tube voltage, low molecular weight dextran enhanced multi-detector row spiral CT (MDCT) (80 kVp, 270 mg I/mL contrast agent), and the raw data were reconstructed with standard filtered back projection (FBP) and ASIR at four different levels of blending (20%, 40%, 60% and 80%, respectively). The control group (Group B, n=74) underwent conventional MDCT (120 kVp, 370 mg I/mL contrast agent), and the data were reconstructed with FBP. The CT values, standard deviation (SD), signal-noise-ratio (SNR) and contrast-noise-ratio (CNR) were measured in the renal vessels, normal adrenal tissue, adrenal neoplasms and subcutaneous fat. The volume CT dose index (CTDIvol) and dose length product (DLP) were recorded, and an effective dose (ED) was obtained. Two-tailed independent t-tests, paired chi-square tests and Kappa consistency tests were used for statistical analysis of the data. The CTDIvol, DLP and total iodine dose in group A were decreased by 47.8%, 49.0% and 26.07%, respectively, compared to group B (P < 0.05). Low tube voltage with low-concentration contrast media and 60% ASIR provides similar enhancement and image quality with a reduced radiation dose and contrast iodine dose. © 2016 John Wiley & Sons Ltd.
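The SNR and CNR measures used in CT image-quality studies like this one are simple ratios of region-of-interest (ROI) statistics. Definitions vary between papers; the sketch below uses one common ROI-based convention, with hypothetical CT numbers (HU) rather than the study's measurements.

```python
# Illustrative SNR and CNR computation from ROI statistics (hypothetical values).
mean_vessel = 320.0   # mean CT value in the renal vessel ROI (HU)
mean_fat    = -95.0   # mean CT value in the subcutaneous fat ROI (HU)
sd_noise    = 15.0    # standard deviation (image noise) measured in the fat ROI

snr = mean_vessel / sd_noise                 # signal-to-noise ratio
cnr = (mean_vessel - mean_fat) / sd_noise    # contrast-to-noise ratio

print(round(snr, 2))  # -> 21.33
print(round(cnr, 2))  # -> 27.67
```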
National transportation statistics 2010
2010-01-01
National Transportation Statistics presents statistics on the U.S. transportation system, including its physical components, safety record, economic performance, the human and natural environment, and national security. This is a large online documen...
National transportation statistics 2011
2011-04-01
Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record, ...
U.S. Department of Health & Human Services — The CMS Office of Enterprise Data and Analytics has developed CMS Program Statistics, which includes detailed summary statistics on national health care, Medicare...
DEFF Research Database (Denmark)
Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard
Statistics for Finance develops students’ professional skills in statistics with applications in finance. Developed from the authors’ courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics...
Developments in Statistical Education.
Kapadia, Ramesh
1980-01-01
The current status of statistics education at the secondary level is reviewed, with particular attention focused on the various instructional programs in England. A description and preliminary evaluation of the Schools Council Project on Statistical Education is included. (MP)
Principles of applied statistics
National Research Council Canada - National Science Library
Cox, D. R; Donnelly, Christl A
2011-01-01
.... David Cox and Christl Donnelly distil decades of scientific experience into usable principles for the successful application of statistics, showing how good statistical strategy shapes every stage of an investigation...
Statistical data analysis handbook
National Research Council Canada - National Science Library
Wall, Francis J
1986-01-01
It must be emphasized that this is not a textbook on statistics. Instead it is a working tool that presents data analysis in clear, concise terms which can be readily understood even by those without formal training in statistics...
Applying contemporary statistical techniques
Wilcox, Rand R
2003-01-01
Applying Contemporary Statistical Techniques explains why traditional statistical methods are often inadequate or outdated when applied to modern problems. Wilcox demonstrates how new and more powerful techniques address these problems far more effectively, making these modern robust methods understandable, practical, and easily accessible.* Assumes no previous training in statistics * Explains how and why modern statistical methods provide more accurate results than conventional methods* Covers the latest developments on multiple comparisons * Includes recent advanc
Lenard, Christopher; McCarthy, Sally; Mills, Terence
2014-01-01
There are many different aspects of statistics. Statistics involves mathematics, computing, and applications to almost every field of endeavour. Each aspect provides an opportunity to spark someone's interest in the subject. In this paper we discuss some ethical aspects of statistics, and describe how an introduction to ethics has been…
Fisher's Contributions to Statistics
Indian Academy of Sciences (India)
T Krishnan received his Ph.D. from the Indian Statistical Institute. He joined the faculty of ISI in 1965 and has been with the Institute ever since. He is at present a professor in the Applied Statistics, Surveys and Computing Division of the Institute. Krishnan's research interests are in Statistical Pattern Recognition ...
Indian Academy of Sciences (India)
IAS Admin
Dirac statistics, identical and indistinguishable particles, Fermi gas. ... They obey Fermi–Dirac statistics. In contrast, those with integer spin such as photons, mesons, 7Li atoms are called bosons and they obey Bose–Einstein statistics. ... hypothesis (which later was extended as the third law of thermodynamics) was ...
Nonparametric statistical inference
Gibbons, Jean Dickinson
2010-01-01
Overall, this remains a very fine book suitable for a graduate-level course in nonparametric statistics. I recommend it for all people interested in learning the basic ideas of nonparametric statistical inference.-Eugenia Stoimenova, Journal of Applied Statistics, June 2012… one of the best books available for a graduate (or advanced undergraduate) text for a theory course on nonparametric statistics. … a very well-written and organized book on nonparametric statistics, especially useful and recommended for teachers and graduate students.-Biometrics, 67, September 2011This excellently presente
Statistics & probability for dummies
Rumsey, Deborah J
2013-01-01
Two complete eBooks for one low price! Created and compiled by the publisher, this Statistics I and Statistics II bundle brings together two math titles in one, e-only bundle. With this special bundle, you'll get the complete text of the following two titles: Statistics For Dummies, 2nd Edition Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tra
Dowdy, Shirley; Chilko, Daniel
2011-01-01
Praise for the Second Edition "Statistics for Research has other fine qualities besides superior organization. The examples and the statistical methods are laid out with unusual clarity by the simple device of using special formats for each. The book was written with great care and is extremely user-friendly."-The UMAP Journal Although the goals and procedures of statistical research have changed little since the Second Edition of Statistics for Research was published, the almost universal availability of personal computers and statistical computing application packages have made it possible f
Boslaugh, Sarah
2013-01-01
Need to learn statistics for your job? Want help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference for anyone new to the subject. Thoroughly revised and expanded, this edition helps you gain a solid understanding of statistics without the numbing complexity of many college texts. Each chapter presents easy-to-follow descriptions, along with graphics, formulas, solved examples, and hands-on exercises. If you want to perform common statistical analyses and learn a wide range of techniques without getting in over your head, this is your book.
Business statistics for dummies
Anderson, Alan
2013-01-01
Score higher in your business statistics course? Easy. Business statistics is a common course for business majors and MBA candidates. It examines common data sets and the proper way to use such information when conducting research and producing informational reports such as profit and loss statements, customer satisfaction surveys, and peer comparisons. Business Statistics For Dummies tracks to a typical business statistics course offered at the undergraduate and graduate levels and provides clear, practical explanations of business statistical ideas, techniques, formulas, and calculations, w
Naghshpour, Shahdad
2012-01-01
Statistics is the branch of mathematics that deals with real-life problems. As such, it is an essential tool for economists. Unfortunately, the way you and many other economists learn the concept of statistics is not compatible with the way economists think and learn. The problem is worsened by the use of mathematical jargon and complex derivations. Here's a book that proves none of this is necessary. All the examples and exercises in this book are constructed within the field of economics, thus eliminating the difficulty of learning statistics with examples from fields that have no relation to business, politics, or policy. Statistics is, in fact, not more difficult than economics. Anyone who can comprehend economics can understand and use statistics successfully within this field, including you! This book utilizes Microsoft Excel to obtain statistical results, as well as to perform additional necessary computations. Microsoft Excel is not the software of choice for performing sophisticated statistical analy...
National Statistical Commission and Indian Official Statistics
Indian Academy of Sciences (India)
T J Rao, C. R. Rao Advanced Institute of Mathematics, Statistics and Computer Science (AIMSCS), University of Hyderabad Campus, Central University Post Office, Prof. C. R. Rao Road, Hyderabad 500 046, AP, India. Resonance – Journal of Science Education.
Baseline Statistics of Linked Statistical Data
Scharnhorst, Andrea; Meroño-Peñuela, Albert; Guéret, Christophe
2014-01-01
We are surrounded by an ever increasing ocean of information; everybody will agree to that. We build sophisticated strategies to govern this information: design data models, develop infrastructures for data sharing, build tools for data analysis. Statistical datasets curated by National
Montoya, Isaac D.
2008-01-01
Three classification techniques (Chi-square Automatic Interaction Detection [CHAID], Classification and Regression Tree [CART], and discriminant analysis) were tested to determine their accuracy in predicting Temporary Assistance for Needy Families program recipients' future employment. Technique evaluation was based on proportion of correctly…
Statistical Physics An Introduction
Yoshioka, Daijiro
2007-01-01
This book provides a comprehensive presentation of the basics of statistical physics. The first part explains the essence of statistical physics and how it provides a bridge between microscopic and macroscopic phenomena, allowing one to derive quantities such as entropy. Here the author avoids going into details such as Liouville's theorem or the ergodic theorem, which are difficult for beginners and unnecessary for the actual application of statistical mechanics. In the second part, statistical mechanics is applied to various systems which, although they look different, share the same mathematical structure. In this way readers can deepen their understanding of statistical physics. The book also features applications to quantum dynamics, thermodynamics, the Ising model and the statistical dynamics of free spins.
The statistical stability phenomenon
Gorban, Igor I
2017-01-01
This monograph investigates violations of statistical stability of physical events, variables, and processes and develops a new physical-mathematical theory taking into consideration such violations – the theory of hyper-random phenomena. There are five parts. The first describes the phenomenon of statistical stability and its features, and develops methods for detecting violations of statistical stability, in particular when data is limited. The second part presents several examples of real processes of different physical nature and demonstrates the violation of statistical stability over broad observation intervals. The third part outlines the mathematical foundations of the theory of hyper-random phenomena, while the fourth develops the foundations of the mathematical analysis of divergent and many-valued functions. The fifth part contains theoretical and experimental studies of statistical laws where there is violation of statistical stability. The monograph should be of particular interest to engineers...
Casati, Michele
2014-05-01
The assertion that solar activity may play a significant role in triggering large volcanic eruptions has been, and continues to be, discussed by many geophysicists. Numerous scientific papers have established a possible correlation between these events and the electromagnetic coupling between the Earth and the Sun, but none of them has been able to highlight a possible statistically significant relationship between large volcanic eruptions and any of the series, such as geomagnetic activity, solar wind, or sunspot number. In our research, we compare the 148 volcanic eruptions with index VEI4 and the 37 major historical volcanic eruptions equal to or greater than index VEI5, recorded from 1610 to 2012, with the corresponding sunspot numbers. Taking as the threshold value a monthly sunspot number of 46 (recorded during the great Krakatoa eruption of August 1883, historical index VEI6), we note some possible relationships and conduct a statistical test. • Of the 31 historical large volcanic eruptions with index VEI5+, recorded between 1610 and 1955, 29 were recorded when the SSN<46. The remaining 2 eruptions were recorded not when the SSN<46, but during the solar maxima of the solar cycle of the year 1739 and of solar cycle No. 14 (the Shikotsu eruption of 1739 and Ksudach in 1907). • Of the 8 historical large volcanic eruptions with index VEI6+, recorded from 1610 to the present, 7 were recorded with SSN<46 and, more specifically, within the three known large solar minima: Maunder (1645-1710), Dalton (1790-1830), and the solar minima that occurred between 1880 and 1920. As the only exception, we note the eruption of Pinatubo in June 1991, recorded at the solar maximum of cycle 22. • Of the 6 major historical volcanic eruptions with index VEI5+ recorded after 1955, 5 were recorded not during periods of low solar activity, but during the solar maxima of cycles 19, 21 and 22. The significance tests, conducted with the chi-square statistic χ² = 7.782, detect a
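The kind of test described above can be sketched as a chi-square goodness-of-fit: do eruption counts in low- versus high-activity months depart from what the time spent in each state would predict? The counts and the 70% time fraction below are invented for illustration, so the resulting statistic is not the paper's value.

```python
# Hypothetical chi-square test: eruptions under low vs. high solar activity,
# against an expectation proportional to the time spent in each state.
eruptions_low, eruptions_high = 29, 2   # observed eruption counts (illustrative)
p_low = 0.70                            # assumed fraction of months with SSN < 46

total = eruptions_low + eruptions_high
expected = [total * p_low, total * (1 - p_low)]
observed = [eruptions_low, eruptions_high]

chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
print(round(chi2, 3))  # -> 8.186
```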
Applied statistics for economists
Lewis, Margaret
2012-01-01
This book is an undergraduate text that introduces students to commonly-used statistical methods in economics. Using examples based on contemporary economic issues and readily-available data, it not only explains the mechanics of the various methods, it also guides students to connect statistical results to detailed economic interpretations. Because the goal is for students to be able to apply the statistical methods presented, online sources for economic data and directions for performing each task in Excel are also included.
Directory of Open Access Journals (Sweden)
Mirjam Nielen
2017-01-01
Full Text Available Always wondered why research papers often present rather complicated statistical analyses? Or wondered how to properly analyse the results of a pragmatic trial from your own practice? This talk will give an overview of basic statistical principles and focus on the why of statistics, rather than on the how. This is a podcast of Mirjam's talk at the Veterinary Evidence Today conference, Edinburgh, November 2, 2016.
Equilibrium statistical mechanics
Jackson, E Atlee
2000-01-01
Ideal as an elementary introduction to equilibrium statistical mechanics, this volume covers both classical and quantum methodology for open and closed systems. Introductory chapters familiarize readers with probability and microscopic models of systems, while additional chapters describe the general derivation of the fundamental statistical mechanics relationships. The final chapter contains 16 sections, each dealing with a different application, ordered according to complexity, from classical through degenerate quantum statistical mechanics. Key features include an elementary introduction t
Optimization techniques in statistics
Rustagi, Jagdish S
1994-01-01
Statistics help guide us to optimal decisions under uncertainty. A large variety of statistical problems are essentially solutions to optimization problems. The mathematical techniques of optimization are fundamental to statistical theory and practice. In this book, Jagdish Rustagi provides full-spectrum coverage of these methods, ranging from classical optimization and Lagrange multipliers, to numerical techniques using gradients or direct search, to linear, nonlinear, and dynamic programming using the Kuhn-Tucker conditions or the Pontryagin maximal principle. Variational methods and optimiza
Equilibrium statistical mechanics
Mayer, J E
1968-01-01
The International Encyclopedia of Physical Chemistry and Chemical Physics, Volume 1: Equilibrium Statistical Mechanics covers the fundamental principles and the development of theoretical aspects of equilibrium statistical mechanics. Statistical mechanics is the study of the connection between the macroscopic behavior of bulk matter and the microscopic properties of its constituent atoms and molecules. This book contains eight chapters, and begins with a presentation of the master equation used for the calculation of the fundamental thermodynamic functions. The succeeding chapters highlight t
Mahalanobis, P C
1965-01-01
Contributions to Statistics focuses on the processes, methodologies, and approaches involved in statistics. The book is presented to Professor P. C. Mahalanobis on the occasion of his 70th birthday. The selection first offers information on the recovery of ancillary information and combinatorial properties of partially balanced designs and association schemes. Discussions focus on combinatorial applications of the algebra of association matrices, sample size analogy, association matrices and the algebra of association schemes, and conceptual statistical experiments. The book then examines latt
Annual Statistical Supplement, 2001
Social Security Administration — The Annual Statistical Supplement, 2001 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2011
Social Security Administration — The Annual Statistical Supplement, 2011 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2003
Social Security Administration — The Annual Statistical Supplement, 2003 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2015
Social Security Administration — The Annual Statistical Supplement, 2015 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2000
Social Security Administration — The Annual Statistical Supplement, 2000 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2005
Social Security Administration — The Annual Statistical Supplement, 2005 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2014
Social Security Administration — The Annual Statistical Supplement, 2014 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2009
Social Security Administration — The Annual Statistical Supplement, 2009 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2017
Social Security Administration — The Annual Statistical Supplement, 2017 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2008
Social Security Administration — The Annual Statistical Supplement, 2008 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2010
Social Security Administration — The Annual Statistical Supplement, 2010 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2016
Social Security Administration — The Annual Statistical Supplement, 2016 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2004
Social Security Administration — The Annual Statistical Supplement, 2004 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2002
Social Security Administration — The Annual Statistical Supplement, 2002 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2007
Social Security Administration — The Annual Statistical Supplement, 2007 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2006
Social Security Administration — The Annual Statistical Supplement, 2006 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Bulmer, M G
1979-01-01
There are many textbooks which describe current methods of statistical analysis, while neglecting related theory. There are equally many advanced textbooks which delve into the far reaches of statistical theory, while bypassing practical applications. But between these two approaches is an unfilled gap, in which theory and practice merge at an intermediate level. Professor M. G. Bulmer's Principles of Statistics, originally published in 1965, was created to fill that need. The new, corrected Dover edition of Principles of Statistics makes this invaluable mid-level text available once again fo
Fundamental statistical theories
International Nuclear Information System (INIS)
Demopoulos, W.
1976-01-01
Einstein argued that since quantum mechanics is not a fundamental theory it cannot be regarded as in any sense final. The pure statistical states of the quantum theory are not dispersion-free. In this sense, the theory is significantly statistical. The problem investigated in this paper is to determine under what conditions a significantly statistical theory is correctly regarded as fundamental. The solution developed in this paper is that a statistical theory is fundamental only if it is complete; moreover, the quantum theory is complete. (B.R.H.)
Plague: Maps and Statistics — Plague in the United States ...
Valley Fever (Coccidioidomycosis) Statistics
Understanding Computational Bayesian Statistics
Bolstad, William M
2011-01-01
A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic
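The central idea described above, drawing samples from a posterior whose density is known only up to its normalizing constant, can be sketched with a random-walk Metropolis sampler. This is a minimal illustration, not code from the book; the target density, step size, and chain length are values chosen for the example:

```python
import math
import random

def metropolis(log_unnorm, start, step, n, burn=1000, seed=0):
    """Random-walk Metropolis: samples a density known only up to a constant."""
    rng = random.Random(seed)
    x, lp = start, log_unnorm(start)
    draws = []
    for i in range(n + burn):
        prop = x + rng.gauss(0.0, step)
        lp_prop = log_unnorm(prop)
        # Accept with probability min(1, target(prop) / target(x))
        if math.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        if i >= burn:
            draws.append(x)
    return draws

# Toy target: unnormalized standard-normal log-density, -x^2 / 2
draws = metropolis(lambda x: -0.5 * x * x, start=0.0, step=1.0, n=20000)
mean = sum(draws) / len(draws)                            # close to 0
second_moment = sum(x * x for x in draws) / len(draws)    # close to 1
```

Bayesian inferences (posterior means, intervals) are then simple summaries of `draws`, which is exactly the workflow the abstract describes.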
Kanji, Gopal K
2006-01-01
This expanded and updated Third Edition of Gopal K. Kanji's best-selling resource on statistical tests covers all the most commonly used tests with information on how to calculate and interpret results with simple datasets. Each entry begins with a short summary statement about the test's purpose, and contains details of the test objective, the limitations (or assumptions) involved, a brief outline of the method, a worked example, and the numerical calculation. 100 Statistical Tests, Third Edition is the one indispensable guide for users of statistical materials and consumers of statistical information at all levels and across all disciplines.
Boslaugh, Sarah
2008-01-01
Need to learn statistics as part of your job, or want some help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference that's perfect for anyone with no previous background in the subject. This book gives you a solid understanding of statistics without being too simple, yet without the numbing complexity of most college texts. You get a firm grasp of the fundamentals and a hands-on understanding of how to apply them before moving on to the more advanced material that follows. Each chapter presents you with easy-to-follow descriptions illustrat
Shasha, Dennis
2010-01-01
Statistics is the activity of inferring results about a population given a sample. Historically, statistics books assume an underlying distribution to the data (typically, the normal distribution) and derive results under that assumption. Unfortunately, in real life, one cannot normally be sure of the underlying distribution. For that reason, this book presents a distribution-independent approach to statistics based on a simple computational counting idea called resampling. This book explains the basic concepts of resampling, then systematically presents the standard statistical measures along
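The distribution-independent resampling idea the abstract describes can be illustrated with a percentile bootstrap confidence interval; the dataset and the choice of statistic (the sample mean) are hypothetical:

```python
import random

def bootstrap_ci(data, stat, n_boot=5000, alpha=0.05, seed=1):
    """Percentile bootstrap: resample with replacement, read off empirical quantiles.
    No assumption about the underlying distribution is needed."""
    rng = random.Random(seed)
    n = len(data)
    reps = sorted(stat([data[rng.randrange(n)] for _ in range(n)])
                  for _ in range(n_boot))
    return reps[int(alpha / 2 * n_boot)], reps[int((1 - alpha / 2) * n_boot) - 1]

# Hypothetical measurements; a 95% CI for the population mean
sample = [2.1, 2.4, 1.9, 3.0, 2.7, 2.2, 2.8, 2.5, 2.3, 2.6]
lo, hi = bootstrap_ci(sample, lambda xs: sum(xs) / len(xs))
```

The same function works unchanged for medians, trimmed means, or any other statistic, which is the appeal of the counting-based approach.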
Energy statistics yearbook 2000
International Nuclear Information System (INIS)
2002-01-01
The Energy Statistics Yearbook 2000 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-third in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities
Energy statistics yearbook 2001
International Nuclear Information System (INIS)
2004-01-01
The Energy Statistics Yearbook 2001 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-fifth in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities
DEFF Research Database (Denmark)
Lauritzen, Steffen Lilholt
This book studies the brilliant Danish 19th Century astronomer, T.N. Thiele who made important contributions to statistics, actuarial science, astronomy and mathematics. The most important of these contributions in statistics are translated into English for the first time, and the text includes c...
Huizingh, Eelko K. R. E.
2007-01-01
Accessibly written and easy to use, "Applied Statistics Using SPSS" is an all-in-one self-study guide to SPSS and do-it-yourself guide to statistics. What is unique about Eelko Huizingh's approach is that this book is based around the needs of undergraduate students embarking on their own research project, and its self-help style is designed to…
This tool allows users to animate cancer trends over time by cancer site and cause of death, race, and sex. Provides access to incidence, mortality, and survival. Select the type of statistic, variables, format, and then extract the statistics in a delimited format for further analyses.
Principles of medical statistics
National Research Council Canada - National Science Library
Feinstein, Alvan R
2002-01-01
... or limited attention. They are then offered a simple, superficial account of the most common doctrines and applications of statistical theory. The "get-it-over-with-quickly" approach has been encouraged and often necessitated by the short time given to statistics in modern biomedical education. The curriculum is supposed to provide fundament...
Statistics 101 for Radiologists.
Anvari, Arash; Halpern, Elkan F; Samir, Anthony E
2015-10-01
Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
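The diagnostic-accuracy measures the review discusses (sensitivity, specificity, accuracy, likelihood ratios) all follow from a 2x2 table of test result against disease status. The counts below are invented for illustration:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard accuracy measures from a 2x2 table of test result vs. disease status."""
    sens = tp / (tp + fn)              # P(test positive | disease present)
    spec = tn / (tn + fp)              # P(test negative | disease absent)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
        "LR+": sens / (1 - spec),      # positive likelihood ratio
        "LR-": (1 - sens) / spec,      # negative likelihood ratio
    }

# Invented counts: 90 true positives, 20 false positives,
# 10 false negatives, 180 true negatives
m = diagnostic_metrics(tp=90, fp=20, fn=10, tn=180)
# sensitivity = 0.90, specificity = 0.90, LR+ = 9.0
```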
Statistical Engine Knock Control
DEFF Research Database (Denmark)
Stotsky, Alexander A.
2008-01-01
A new statistical concept of the knock control of a spark ignition automotive engine is proposed. The control aim is associated with the statistical hypothesis test which compares the threshold value to the average value of the maximal amplitude of the knock sensor signal at a given frequency...
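The comparison the abstract outlines, testing whether the average maximal knock-sensor amplitude exceeds a threshold, might be sketched as a one-sided test. The amplitude window, threshold, noise level, and critical value below are all assumptions for illustration, not the paper's actual parameters:

```python
import math

def knock_detected(amplitudes, threshold, sigma, z_crit=2.33):
    """One-sided test of H0 'no knock': reject when the standardized mean of the
    maximal knock-sensor amplitudes exceeds z_crit (roughly the 1% level)."""
    n = len(amplitudes)
    mean = sum(amplitudes) / n
    z = (mean - threshold) / (sigma / math.sqrt(n))
    return z > z_crit

# Hypothetical window of maximal amplitudes at the monitored frequency
window = [1.9, 2.1, 2.3, 2.0, 2.2, 2.4, 2.1, 2.2]
detected = knock_detected(window, threshold=1.5, sigma=0.3)  # mean 2.15 >> 1.5
```

Averaging over a window before testing suppresses the cycle-to-cycle variability of individual knock events, which is the point of a statistical rather than a per-cycle decision.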
Practical statistics simply explained
Langley, Dr Russell A
1971-01-01
For those who need to know statistics but shy away from math, this book teaches how to extract truth and draw valid conclusions from numerical data using logic and the philosophy of statistics rather than complex formulae. Lucid discussion of averages and scatter, investigation design, more. Problems with solutions.
Energy statistics yearbook 2002
International Nuclear Information System (INIS)
2005-01-01
The Energy Statistics Yearbook 2002 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-sixth in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities
Enhancing statistical literacy
Droogers, M.J.S.; Drijvers, P.H.M.
2017-01-01
Current secondary school statistics curricula focus on procedural knowledge and pay too little attention to statistical reasoning. As a result, students are not able to apply their knowledge to practice. In addition, education often targets the average student, which may lead to gifted students
Hodgson, Ted; Andersen, Lyle; Robison-Cox, Jim; Jones, Clain
2004-01-01
Water quality experiments, especially the use of macroinvertebrates as indicators of water quality, offer an ideal context for connecting statistics and science. In the STAR program for secondary students and teachers, water quality experiments were also used as a context for teaching statistics. In this article, we trace one activity that uses…
Bayesian statistical inference
Directory of Open Access Journals (Sweden)
Bruno De Finetti
2017-04-01
Full Text Available This work was translated into English and published in the volume: Bruno De Finetti, Induction and Probability, Biblioteca di Statistica, eds. P. Monari, D. Cocchi, Clueb, Bologna, 1993. Bayesian Statistical Inference is one of the last fundamental philosophical papers in which we can find De Finetti's essential approach to statistical inference.
Practical statistics for educators
Ravid, Ruth
2014-01-01
Practical Statistics for Educators, Fifth Edition, is a clear and easy-to-follow text written specifically for education students in introductory statistics courses and in action research courses. It is also a valuable resource and guidebook for educational practitioners who wish to study their own settings.
Introduction to Bayesian statistics
Bolstad, William M
2017-01-01
There is a strong upsurge in the use of Bayesian methods in applied statistical analysis, yet most introductory statistics texts only present frequentist methods. Bayesian statistics has many important advantages that students should learn about if they are going into fields where statistics will be used. In this Third Edition, four newly-added chapters address topics that reflect the rapid advances in the field of Bayesian statistics. The author continues to provide a Bayesian treatment of introductory statistical topics, such as scientific data gathering, discrete random variables, robust Bayesian methods, and Bayesian approaches to inference for discrete random variables, binomial proportion, Poisson, normal mean, and simple linear regression. In addition, newly-developing topics in the field are presented in four new chapters: Bayesian inference with unknown mean and variance; Bayesian inference for a Multivariate Normal mean vector; Bayesian inference for a Multiple Linear Regression Model; and Computati...
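One of the introductory topics listed, Bayesian inference for a binomial proportion, reduces to a one-line conjugate update. This sketch assumes the standard Beta-prior treatment, not necessarily the book's exact notation:

```python
def beta_binomial_update(a, b, successes, failures):
    """Conjugate update: a Beta(a, b) prior combined with binomial data
    gives a Beta(a + successes, b + failures) posterior."""
    return a + successes, b + failures

# Uniform Beta(1, 1) prior, then observe 7 successes in 10 trials
a, b = beta_binomial_update(1, 1, successes=7, failures=3)
post_mean = a / (a + b)   # posterior mean 8/12 = 0.666...
```

The posterior mean sits between the prior mean (0.5) and the observed frequency (0.7), shrinking toward the data as the sample grows.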
Understanding advanced statistical methods
Westfall, Peter
2013-01-01
Introduction: Probability, Statistics, and Science; Reality, Nature, Science, and Models; Statistical Processes: Nature, Design and Measurement, and Data; Models; Deterministic Models; Variability; Parameters; Purely Probabilistic Statistical Models; Statistical Models with Both Deterministic and Probabilistic Components; Statistical Inference; Good and Bad Models; Uses of Probability Models. Random Variables and Their Probability Distributions: Introduction; Types of Random Variables: Nominal, Ordinal, and Continuous; Discrete Probability Distribution Functions; Continuous Probability Distribution Functions; Some Calculus-Derivatives and Least Squares; More Calculus-Integrals and Cumulative Distribution Functions. Probability Calculation and Simulation: Introduction; Analytic Calculations, Discrete and Continuous Cases; Simulation-Based Approximation; Generating Random Numbers. Identifying Distributions: Introduction; Identifying Distributions from Theory Alone; Using Data: Estimating Distributions via the Histogram; Quantiles: Theoretical and Data-Based Estimate...
Ichimaru, Setsuo
2004-01-01
Plasma physics is an integral part of statistical physics, complete with its own basic theories. Designed as a two-volume set, Statistical Plasma Physics is intended for advanced undergraduate and beginning graduate courses on plasma and statistical physics, and as such, its presentation is self-contained and should be read without difficulty by those with backgrounds in classical mechanics, electricity and magnetism, quantum mechanics, and statistics. Major topics include: plasma phenomena in nature, kinetic equations, plasmas and dielectric media, electromagnetic properties of Vlasov plasmas in thermodynamic equilibria, transient processes, and instabilities. Statistical Plasma Physics, Volume II, treats subjects in the field of condensed plasma physics, with applications to condensed matter physics, atomic physics, nuclear physics, and astrophysics. The aim of this book is to elucidate a number of basic topics in physics of dense plasmas that interface with condensed matter physics, atomic physics, nuclear...
Directory of Open Access Journals (Sweden)
Royce Melanie
2005-05-01
Full Text Available Abstract Background The commonly used five-year survival rates are not adequate to represent statistical cure. In the present study, we established the minimum number of years of follow-up required to estimate the statistical cure rate, by using a lognormal distribution of the survival time of those who died of their cancer. We introduced the term threshold year: the follow-up time by which the survival data of patients dying from the specific cancer are mostly covered, leaving less than 2.25% uncovered. This is close enough to cure from that specific cancer. Methods Data from the Surveillance, Epidemiology and End Results (SEER) database were tested to determine whether the survival times of cancer patients who died of their disease followed the lognormal distribution, using a minimum chi-square method. Patients diagnosed from 1973-1992 in the registries of Connecticut and Detroit were chosen so that a maximum of 27 years was allowed for follow-up to 1999. A total of 49 specific organ sites were tested. The parameters of those lognormal distributions were found for each cancer site. The cancer-specific survival rates at the threshold years were compared with the longest available Kaplan-Meier survival estimates. Results The cancer-specific survival times of patients who died of their disease at 42 of the 49 sites were verified to follow different lognormal distributions. The threshold years validated for statistical cure varied across cancer sites, from 2.6 years for pancreatic cancer to 25.2 years for cancer of the salivary gland. At the threshold year, the statistical cure rates estimated for 40 cancer sites matched the actuarial long-term survival rates estimated by the Kaplan-Meier method within six percentage points. For two cancer sites, breast and thyroid, the threshold years were so long that the cancer-specific survival rates could not yet be obtained, because the SEER data do not provide
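Under the lognormal assumption described above, a follow-up horizon leaving about 2.25% of deaths uncovered corresponds to the mu + 2*sigma quantile on the log scale, since the upper z = 2 tail of a normal distribution is about 2.275%. A minimal sketch with invented survival times (the paper's actual fit uses a minimum chi-square method on SEER data, not this moment fit):

```python
import math
import statistics

def threshold_year(survival_times):
    """Fit a lognormal by moments of the log survival times; exp(mu + 2*sigma)
    leaves roughly the upper 2.25% of deaths uncovered (the z = 2 tail)."""
    logs = [math.log(t) for t in survival_times]
    mu = statistics.mean(logs)
    sigma = statistics.pstdev(logs)
    return math.exp(mu + 2.0 * sigma)

# Invented survival times (years) of patients who died of their cancer
times = [0.5, 1.2, 2.0, 3.5, 1.8, 0.9, 4.2, 2.6, 1.5, 3.0]
ty = threshold_year(times)   # horizon covering ~97.75% of these deaths
```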
International Nuclear Information System (INIS)
Zhixiang, Z.
1983-01-01
The least squares fit has been performed using the chi-squared distribution function for all available evaluated data for the s-wave reduced neutron widths of several nuclei. The number of degrees of freedom and the average value have been obtained. The missing levels of weak s-wave resonances and extra p-wave levels have been taken into account, if any. For ⁷⁵As and ¹⁰³Rh, the s-wave population has been separated by Bayes' theorem before making the fit. The results thus obtained are consistent with the Porter-Thomas distribution, i.e., a chi-squared distribution with ν=1, as one would expect. It has not been found in this work that the number of degrees of freedom for the distribution of s-wave reduced neutron widths might be greater than one, as reported by H.C. Sharma et al. (1976) at the international conference on interactions of neutrons with nuclei. (Auth.)
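The Porter-Thomas expectation, reduced widths following a chi-squared distribution with one degree of freedom, can be checked with a simple method-of-moments estimate: for a chi-squared variable with ν degrees of freedom scaled to unit mean, the ratio variance/mean² equals 2/ν. This simulation is illustrative only and is not the least-squares fitting procedure used in the paper:

```python
import random

def estimate_dof(widths):
    """Method-of-moments estimate of the chi-squared degrees of freedom nu:
    for chi2_nu scaled to its mean, variance / mean^2 = 2 / nu."""
    n = len(widths)
    mean = sum(widths) / n
    var = sum((w - mean) ** 2 for w in widths) / n
    return 2.0 * mean * mean / var

# Porter-Thomas: reduced widths are squares of Gaussian amplitudes, so nu = 1
rng = random.Random(42)
widths = [rng.gauss(0.0, 1.0) ** 2 for _ in range(20000)]
nu_hat = estimate_dof(widths)   # close to 1
```

Missing weak levels inflate the apparent mean width and bias such an estimate, which is why the paper corrects for missed s-wave resonances and intruding p-wave levels before fitting.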
International Nuclear Information System (INIS)
Tonchev, N.; Shumovskij, A.S.
1986-01-01
The history of investigations conducted at the JINR in the field of statistical mechanics is presented, beginning with N.N. Bogolyubov's fundamental works on the microscopic theory of superconductivity. Ideas introduced and methods developed in these works have largely determined the development of statistical mechanics at the JINR, and the Hartree-Fock-Bogolyubov variational principle has become an important method of modern nuclear theory. A brief review of the main achievements connected with the development of statistical mechanics methods and their application in different fields of physical science is given.
Statistics a complete introduction
Graham, Alan
2013-01-01
Statistics: A Complete Introduction is the most comprehensive yet easy-to-use introduction to using Statistics. Written by a leading expert, this book will help you if you are studying for an important exam or essay, or if you simply want to improve your knowledge. The book covers all the key areas of Statistics including graphs, data interpretation, spreadsheets, regression, correlation and probability. Everything you will need is here in this one book. Each chapter includes not only an explanation of the knowledge and skills you need, but also worked examples and test questions.
Statistical Pattern Recognition
Webb, Andrew R
2011-01-01
Statistical pattern recognition relates to the use of statistical techniques for analysing data measurements in order to extract information and make justified decisions. It is a very active area of study and research, which has seen many advances in recent years. Applications such as data mining, web searching, multimedia data retrieval, face recognition, and cursive handwriting recognition, all require robust and efficient pattern recognition techniques. This third edition provides an introduction to statistical pattern theory and techniques, with material drawn from a wide range of fields,
D'Alessio, Michael
2012-01-01
AP Statistics Crash Course - Gets You a Higher Advanced Placement Score in Less Time Crash Course is perfect for the time-crunched student, the last-minute studier, or anyone who wants a refresher on the subject. AP Statistics Crash Course gives you: Targeted, Focused Review - Study Only What You Need to Know Crash Course is based on an in-depth analysis of the AP Statistics course description outline and actual Advanced Placement test questions. It covers only the information tested on the exam, so you can make the most of your valuable study time. Our easy-to-read format covers: exploring da
Evolutionary Statistical Procedures
Baragona, Roberto; Poli, Irene
2011-01-01
This proposed text appears to be a good introduction to evolutionary computation for use in applied statistics research. The authors draw from a vast base of knowledge about the current literature in both the design of evolutionary algorithms and statistical techniques. Modern statistical research is on the threshold of solving increasingly complex problems in high dimensions, and the generalization of its methodology to parameters whose estimators do not follow mathematically simple distributions is underway. Many of these challenges involve optimizing functions for which analytic solutions a
Statistical physics; Physique statistique
Energy Technology Data Exchange (ETDEWEB)
Couture, L.; Zitoun, R. [Universite Pierre et Marie Curie, 75 - Paris (France)
1992-12-31
The basis of statistical physics is presented. The statistical models of Maxwell-Boltzmann, of Bose-Einstein and of Fermi-Dirac and their particular fields of application are presented. Statistical theory is applied in different areas of physics: gas characteristics, paramagnetism, thermal properties of crystals and electronic properties of solids. A whole chapter is dedicated to helium and its characteristics, such as superfluidity; another deals with superconductivity. Superconductivity is presented both experimentally and theoretically. The Meissner effect and the Josephson effect are described, and the framework of BCS theory is drawn. (A.C.)
Wallis, W Allen
2014-01-01
Focusing on everyday applications as well as those of scientific research, this classic of modern statistical methods requires little to no mathematical background. Readers develop basic skills for evaluating and using statistical data. Lively, relevant examples include applications to business, government, social and physical sciences, genetics, medicine, and public health. ""W. Allen Wallis and Harry V. Roberts have made statistics fascinating."" - The New York Times ""The authors have set out with considerable success, to write a text which would be of interest and value to the student who,
Mauro, John
2013-01-01
Written to reveal statistical deceptions often thrust upon unsuspecting journalists, this book views the use of numbers from a public perspective. Illustrating how the statistical naivete of journalists often nourishes quantitative misinformation, the author's intent is to make journalists more critical appraisers of numerical data so that in reporting them they do not deceive the public. The book frequently uses actual reported examples of misused statistical data reported by mass media and describes how journalists can avoid being taken in by them. Because reports of survey findings seldom g
Methods of statistical physics
Akhiezer, Aleksandr I
1981-01-01
Methods of Statistical Physics is an exposition of the tools of statistical mechanics, which evaluates the kinetic equations of classical and quantized systems. The book also analyzes the equations of macroscopic physics, such as the equations of hydrodynamics for normal and superfluid liquids and macroscopic electrodynamics. The text gives particular attention to the study of quantum systems. This study begins with a discussion of problems of quantum statistics with a detailed description of the basics of quantum mechanics along with the theory of measurement. An analysis of the asymptotic be
Liao, Tim Futing
2011-01-01
An incomparably useful examination of statistical methods for comparisonThe nature of doing science, be it natural or social, inevitably calls for comparison. Statistical methods are at the heart of such comparison, for they not only help us gain understanding of the world around us but often define how our research is to be carried out. The need to compare between groups is best exemplified by experiments, which have clearly defined statistical methods. However, true experiments are not always possible. What complicates the matter more is a great deal of diversity in factors that are not inde
Saffran, Jenny R.; Kirkham, Natasha Z.
2017-01-01
Perception involves making sense of a dynamic, multimodal environment. In the absence of mechanisms capable of exploiting the statistical patterns in the natural world, infants would face an insurmountable computational problem. Infant statistical learning mechanisms facilitate the detection of structure. These abilities allow the infant to compute across elements in their environmental input, extracting patterns for further processing and subsequent learning. In this selective review, we summarize findings that show that statistical learning is both a broad and flexible mechanism (supporting learning from different modalities across many different content areas) and input specific (shifting computations depending on the type of input and goal of learning). We suggest that statistical learning not only provides a framework for studying language development and object knowledge in constrained laboratory settings, but also allows researchers to tackle real-world problems, such as multilingualism, the role of ever-changing learning environments, and differential developmental trajectories. PMID:28793812
School Violence: Data & Statistics
The first step in preventing school violence is to understand the extent and nature ...
Medicaid Drug Claims Statistics
U.S. Department of Health & Human Services — The Medicaid Drug Claims Statistics CD is a useful tool that conveniently breaks up Medicaid claim counts and separates them by quarter and includes an annual count.
CMS Statistics Reference Booklet
U.S. Department of Health & Human Services — The annual CMS Statistics reference booklet provides a quick reference for summary information about health expenditures and the Medicare and Medicaid health...
U.S. Department of Health & Human Services — This section contains statistical information and reports related to the percentage of electronic transactions being sent to Medicare contractors in the formats...
Müller-Kirsten, Harald J W
2013-01-01
Statistics links microscopic and macroscopic phenomena, and requires for this reason a large number of microscopic elements like atoms. The results are values of maximum probability or of averaging. This introduction to statistical physics concentrates on the basic principles, and attempts to explain these in simple terms supplemented by numerous examples. These basic principles include the difference between classical and quantum statistics, a priori probabilities as related to degeneracies, the vital aspect of indistinguishability as compared with distinguishability in classical physics, the differences between conserved and non-conserved elements, the different ways of counting arrangements in the three statistics (Maxwell-Boltzmann, Fermi-Dirac, Bose-Einstein), the difference between maximization of the number of arrangements of elements, and averaging in the Darwin-Fowler method. Significant applications to solids, radiation and electrons in metals are treated in separate chapters, as well as Bose-Eins...
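The three ways of counting arrangements named in this abstract differ, for a single-particle energy level, only by a constant in the denominator of the mean occupation number. A minimal sketch of that comparison (the function name and the choice of units, with energies measured relative to the chemical potential in units of kT, are illustrative assumptions, not taken from the book):

```python
import math

def mean_occupancy(energy, mu, kT, statistics):
    """Mean occupation number of a single-particle level of the given
    energy, under one of the three counting statistics.

    statistics: 'MB' (Maxwell-Boltzmann), 'FD' (Fermi-Dirac) or
                'BE' (Bose-Einstein).
    """
    a = {"MB": 0.0, "FD": 1.0, "BE": -1.0}[statistics]
    return 1.0 / (math.exp((energy - mu) / kT) + a)

# Far above the chemical potential ((energy - mu) >> kT) all three
# statistics converge to the classical Boltzmann factor; at
# energy = mu the Fermi-Dirac occupancy is exactly 1/2.
for stats in ("MB", "FD", "BE"):
    print(stats, mean_occupancy(10.0, 0.0, 1.0, stats))
```

The Pauli suppression (FD) and Bose enhancement (BE) relative to the classical case are visible directly in the sign of the denominator term.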
Illinois travel statistics, 2008
2009-01-01
The 2008 Illinois Travel Statistics publication is assembled to provide detailed traffic information to the different users of traffic data. While most users of traffic data at this level of detail are within the Illinois Department of Transporta...
Illinois travel statistics, 2010
2011-01-01
The 2010 Illinois Travel Statistics publication is assembled to provide detailed traffic information to the different users of traffic data. While most users of traffic data at this level of detail are within the Illinois Department of Transporta...
Illinois travel statistics, 2009
2010-01-01
The 2009 Illinois Travel Statistics publication is assembled to provide detailed traffic information to the different users of traffic data. While most users of traffic data at this level of detail are within the Illinois Department of Transporta...
Statistics: a Bayesian perspective
National Research Council Canada - National Science Library
Berry, Donald A
1996-01-01
...: it is the only introductory textbook based on Bayesian ideas, it combines concepts and methods, it presents statistics as a means of integrating data into the significant process, it develops ideas...
CSIR Research Space (South Africa)
Shepperson, L
1997-12-01
Full Text Available This publication contains transport and related statistics on roads, vehicles, infrastructure, passengers, freight, rail, air, maritime and road traffic, and international comparisons. The information compiled in this publication has been gathered...
Information theory and statistics
Kullback, Solomon
1968-01-01
Highly useful text studies logarithmic measures of information and their application to testing statistical hypotheses. Includes numerous worked examples and problems. References. Glossary. Appendix. 1968 2nd, revised edition.
Statistical Measures of Marksmanship
National Research Council Canada - National Science Library
Johnson, Richard
2001-01-01
.... This report describes objective statistical procedures to measure both rifle marksmanship accuracy, the proximity of an array of shots to the center of mass of a target, and marksmanship precision...
Statistical mechanics of superconductivity
Kita, Takafumi
2015-01-01
This book provides a theoretical, step-by-step comprehensive explanation of superconductivity for undergraduate and graduate students who have completed elementary courses on thermodynamics and quantum mechanics. To this end, it adopts the unique approach of starting with the statistical mechanics of quantum ideal gases and successively adding and clarifying elements and techniques indispensable for understanding it. They include the spin-statistics theorem, second quantization, density matrices, the Bloch–De Dominicis theorem, the variational principle in statistical mechanics, attractive interaction, and bound states. Ample examples of their usage are also provided in terms of topics from advanced statistical mechanics such as two-particle correlations of quantum ideal gases, derivation of the Hartree–Fock equations, and Landau’s Fermi-liquid theory, among others. With these preliminaries, the fundamental mean-field equations of superconductivity are derived with maximum mathematical clarity based on ...
U.S. Department of Health & Human Services — The United States Cancer Statistics (USCS) online databases in WONDER provide cancer incidence and mortality data for the United States for the years since 1999, by...
Scheck, Florian
2016-01-01
Scheck’s textbook starts with a concise introduction to classical thermodynamics, including geometrical aspects. Then a short introduction to probabilities and statistics lays the basis for the statistical interpretation of thermodynamics. Phase transitions, discrete models and the stability of matter are explained in great detail. Thermodynamics has a special role in theoretical physics. Due to the general approach of thermodynamics the field has a bridging function between several areas like the theory of condensed matter, elementary particle physics, astrophysics and cosmology. The classical thermodynamics describes predominantly averaged properties of matter, reaching from few particle systems and state of matter to stellar objects. Statistical Thermodynamics covers the same fields, but explores them in greater depth and unifies classical statistical mechanics with quantum theory of multiple particle systems. The content is presented as two tracks: the fast track for master students, providing the essen...
Statistics For Neuroscientists
Directory of Open Access Journals (Sweden)
Subbakrishna D.K
2000-01-01
Full Text Available The role statistical methods play in medicine in the interpretation of empirical data is well recognized by researchers. With modern computing facilities and software packages there is little need for familiarity with the computational details of statistical calculations. However, for the researcher to understand whether these calculations are valid and appropriate, it is necessary that the user is aware of the rudiments of the statistical methodology. Also, it needs to be emphasized that no amount of advanced analysis can be a substitute for a properly planned and executed study. An attempt is made in this communication to discuss some of the theoretical issues that are important for the valid analysis and interpretation of the precious data that are gathered. The article summarises some of the basic statistical concepts, followed by illustrations from live data generated from various research projects from the department of Neurology of this Institute.
Elements of statistical thermodynamics
Nash, Leonard K
2006-01-01
Encompassing essentially all aspects of statistical mechanics that appear in undergraduate texts, this concise, elementary treatment shows how an atomic-molecular perspective yields new insights into macroscopic thermodynamics. 1974 edition.
Ehrlichiosis: Statistics and Epidemiology
Holman RC, McQuiston JH, Krebs JW, Swerdlow DL. Epidemiology of human ehrlichiosis and anaplasmosis in the United ...
Anaplasmosis: Statistics and Epidemiology
Holman RC, McQuiston JH, Krebs JW, Swerdlow DL. Epidemiology of human ehrlichiosis and anaplasmosis in the United ...
Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...
Probability and Statistical Inference
Prosper, Harrison B.
2006-01-01
These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.
Michael, A. J.
2012-12-01
Detecting trends in the rate of sporadic events is a problem for earthquakes and other natural hazards such as storms, floods, or landslides. I use synthetic events to judge the tests used to address this problem in seismology and consider their application to other hazards. Recent papers have analyzed the record of magnitude ≥7 earthquakes since 1900 and concluded that the events are consistent with a constant rate Poisson process plus localized aftershocks (Michael, GRL, 2011; Shearer and Stark, PNAS, 2012; Daub et al., GRL, 2012; Parsons and Geist, BSSA, 2012). Each paper removed localized aftershocks and then used a different suite of statistical tests to test the null hypothesis that the remaining data could be drawn from a constant rate Poisson process. The methods include KS tests between event times or inter-event times and predictions from a Poisson process, the autocorrelation function on inter-event times, and two tests on the number of events in time bins: the Poisson dispersion test and the multinomial chi-square test. The range of statistical tests gives us confidence in the conclusions, which are robust with respect to the choice of tests and parameters. But which tests are optimal and how sensitive are they to deviations from the null hypothesis? The latter point was raised by Dimer (arXiv, 2012), who suggested that the lack of consideration of Type 2 errors prevents these papers from being able to place limits on the degree of clustering and rate changes that could be present in the global seismogenic process. I produce synthetic sets of events that deviate from a constant rate Poisson process using a variety of statistical simulation methods including Gamma distributed inter-event times and random walks. The sets of synthetic events are examined with the statistical tests described above. Preliminary results suggest that with 100 to 1000 events, a data set that does not reject the Poisson null hypothesis could have a variability that is 30% to
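The Poisson dispersion test mentioned in this abstract can be sketched in a few lines: bin the event times, compare the variance of the bin counts to their mean, and refer the result to a chi-square distribution. The binning choices, rate, and helper names below are illustrative assumptions, not the procedure of any of the cited papers:

```python
import random
from statistics import mean, variance

def poisson_dispersion(counts):
    """Index-of-dispersion statistic for binned event counts.

    Under a constant-rate Poisson process, (n - 1) * s^2 / xbar is
    approximately chi-square with n - 1 degrees of freedom, so the
    ratio D / (n - 1) should be close to 1 for Poisson data.
    """
    n = len(counts)
    return (n - 1) * variance(counts) / mean(counts)

def bin_counts(event_times, bin_width, t_max):
    """Count events falling in consecutive time bins of equal width."""
    counts = [0] * int(t_max / bin_width)
    for t in event_times:
        if t < t_max:
            counts[int(t / bin_width)] += 1
    return counts

# Synthetic catalog: a constant-rate Poisson process (rate 5 events
# per unit time), i.e. exponential inter-event times.
random.seed(1)
t, times = 0.0, []
while t < 200.0:
    t += random.expovariate(5.0)
    times.append(t)

counts = bin_counts(times, 1.0, 200.0)
ratio = poisson_dispersion(counts) / (len(counts) - 1)
print(round(ratio, 3))  # close to 1 under the Poisson null
```

Clustered catalogs (e.g. with aftershock sequences left in) inflate the variance of the bin counts and push the ratio well above 1, which is what the test detects.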
Statistical Engine Knock Control
DEFF Research Database (Denmark)
Stotsky, Alexander A.
2008-01-01
A new statistical concept of the knock control of a spark ignition automotive engine is proposed. The control aim is associated with the statistical hypothesis test which compares the threshold value to the average value of the maximal amplitude of the knock sensor signal at a given frequency...... which includes generation of the amplitude signals, a threshold value determination and a knock sound model is developed for evaluation of the control concept....
Business statistics I essentials
Clark, Louise
2014-01-01
REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Business Statistics I includes descriptive statistics, introduction to probability, probability distributions, sampling and sampling distributions, interval estimation, and hypothesis t
Statistical mechanics rigorous results
Ruelle, David
1999-01-01
This classic book marks the beginning of an era of vigorous mathematical progress in equilibrium statistical mechanics. Its treatment of the infinite system limit has not been superseded, and the discussion of thermodynamic functions and states remains basic for more recent work. The conceptual foundation provided by the Rigorous Results remains invaluable for the study of the spectacular developments of statistical mechanics in the second half of the 20th century.
Introductory statistical inference
Mukhopadhyay, Nitis
2014-01-01
This gracefully organized text reveals the rigorous theory of probability and statistical inference in the style of a tutorial, using worked examples, exercises, figures, tables, and computer simulations to develop and illustrate concepts. Drills and boxed summaries emphasize and reinforce important ideas and special techniques.Beginning with a review of the basic concepts and methods in probability theory, moments, and moment generating functions, the author moves to more intricate topics. Introductory Statistical Inference studies multivariate random variables, exponential families of dist
Goldstein, Harvey
2011-01-01
This book provides a clear introduction to this important area of statistics. The author provides wide coverage of different kinds of multilevel models, and how to interpret different statistical methodologies and algorithms applied to such models. This 4th edition reflects the growth and interest in this area and is updated to include new chapters on multilevel models with mixed response types, smoothing and multilevel data, models with correlated random effects and modeling with variance.
European environmental statistics handbook
Energy Technology Data Exchange (ETDEWEB)
Newman, O.; Foster, A. [comps.] [Manchester Business School, Manchester (United Kingdom). Library and Information Service
1993-12-31
This book is a compilation of statistical materials on environmental pollution drawn from governmental and private sources. It is divided into ten chapters: air, water and land - monitoring statistics; cities, regions and nations; costs, budgets and expenditures - costs of pollution and its control, including air pollution; effects; general industry and government data; laws and regulations; politics and opinion - including media coverage; pollutants and wastes; pollution control industry; and tools, methods and solutions. 750 tabs.
Johnson, Norman
This is the third volume of a collection of seminal papers in the statistical sciences written during the past 110 years. These papers have each had an outstanding influence on the development of statistical theory and practice over the last century. Each paper is preceded by an introduction written by an authority in the field providing background information and assessing its influence. Volume III concentrates on articles from the 1980s while including some earlier articles not included in Volumes I and II. Samuel Kotz is Professor of Statistics in the College of Business and Management at the University of Maryland. Norman L. Johnson is Professor Emeritus of Statistics at the University of North Carolina. Also available: Breakthroughs in Statistics Volume I: Foundations and Basic Theory Samuel Kotz and Norman L. Johnson, Editors 1993. 631 pp. Softcover. ISBN 0-387-94037-5 Breakthroughs in Statistics Volume II: Methodology and Distribution Samuel Kotz and Norman L. Johnson, Edi...
UN Data- Environmental Statistics: Waste
World Wide Human Geography Data Working Group — The Environment Statistics Database contains selected water and waste statistics by country. Statistics on water and waste are based on official statistics supplied...
International Nuclear Information System (INIS)
Imbo, T.D.; March-Russel, J.
1990-01-01
We investigate the allowed spectrum of statistics for n identical spinless particles on an arbitrary closed two-manifold M, by using a powerful topological approach to the study of quantum kinematics. On a surface of genus g≥1 statistics other than Bose or Fermi can only be obtained by utilizing multi-component state vectors transforming as an irreducible unitary representation of the fundamental group of the n-particle configuration space. These multi-component (or nonscalar) quantizations allow the possibility of fractional statistics, as well as other exotic, nonfractional statistics some of whose properties we discuss. On an orientable surface of genus g≥0 only anyons with rational statistical parameter θ/π=p/q are allowed, and their number is restricted to be sq-g+1 (s ∈ Z). For nonorientable surfaces only θ=0, π are allowed. Finally, we briefly comment on systems of spinning particles and make a comparison with the results for solitons in the O(3)-invariant nonlinear sigma model with space manifold M. (orig.)
Conformity and statistical tolerancing
Leblond, Laurent; Pillet, Maurice
2018-02-01
Statistical tolerancing was first proposed by Shewhart (Economic Control of Quality of Manufactured Product, 1931; reprinted 1980 by ASQC). In spite of this long history, its use remains moderate. One of the probable reasons for this low utilization is undoubtedly the difficulty for designers to anticipate the risks of this approach. The arithmetic tolerance (worst case) allows a simple interpretation: conformity is defined by the presence of the characteristic in an interval. Statistical tolerancing is more complex in its definition: an interval is not sufficient to define conformance. To justify the statistical tolerancing formula used by designers, a tolerance interval should be interpreted as the interval where most of the parts produced should probably be located. This tolerance is justified by considering a conformity criterion of the parts guaranteeing low offsets on the latter characteristics. Unlike traditional arithmetic tolerancing, statistical tolerancing requires a sustained exchange of information between design and manufacture to be used safely. This paper proposes a formal definition of conformity, which we apply successively to the quadratic and arithmetic tolerancing. We introduce a concept of concavity, which helps us to demonstrate the link between tolerancing approach and conformity. We use this concept to demonstrate the various acceptable propositions of statistical tolerancing (in the space decentring, dispersion).
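In the simplest stack-up case, the contrast between arithmetic (worst-case) and quadratic (statistical) tolerancing reduces to linear versus root-sum-square addition of the component tolerances. A minimal sketch under the usual independence and centring assumptions (function names and numeric values are illustrative, not from the paper):

```python
import math

def arithmetic_tolerance(tolerances):
    """Worst-case (arithmetic) stack-up: contributions add linearly."""
    return sum(tolerances)

def statistical_tolerance(tolerances):
    """Quadratic (statistical) stack-up: independent, centred
    contributions add in quadrature (root sum of squares)."""
    return math.sqrt(sum(t * t for t in tolerances))

# Five components, each toleranced at +/- 0.1 mm.
parts = [0.1] * 5
print(round(arithmetic_tolerance(parts), 4))   # 0.5
print(round(statistical_tolerance(parts), 4))  # 0.2236
```

The quadratic stack is always the smaller of the two, which is exactly why statistical tolerancing permits wider component tolerances but, as the abstract stresses, only under process assumptions (low decentring, controlled dispersion) that design and manufacture must agree on.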
Intuitive introductory statistics
Wolfe, Douglas A
2017-01-01
This textbook is designed to give an engaging introduction to statistics and the art of data analysis. The unique scope includes, but also goes beyond, classical methodology associated with the normal distribution. What if the normal model is not valid for a particular data set? This cutting-edge approach provides the alternatives. It is an introduction to the world and possibilities of statistics that uses exercises, computer analyses, and simulations throughout the core lessons. These elementary statistical methods are intuitive. Counting and ranking features prominently in the text. Nonparametric methods, for instance, are often based on counts and ranks and are very easy to integrate into an introductory course. The ease of computation with advanced calculators and statistical software, both of which factor into this text, allows important techniques to be introduced earlier in the study of statistics. This book's novel scope also includes measuring symmetry with Walsh averages, finding a nonp...
1992 Energy statistics Yearbook
International Nuclear Information System (INIS)
1994-01-01
The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from annual questionnaires distributed by the United Nations Statistical Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistical Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities
Forster, Malcolm R
2011-01-01
Statisticians and philosophers of science have many common interests but restricted communication with each other. This volume aims to remedy these shortcomings. It provides state-of-the-art research in the area of philosophy of statistics by encouraging numerous experts to communicate with one another without feeling "restricted” by their disciplines or thinking "piecemeal” in their treatment of issues. A second goal of this book is to present work in the field without bias toward any particular statistical paradigm. Broadly speaking, the essays in this Handbook are concerned with problems of induction, statistics and probability. For centuries, foundational problems like induction have been among philosophers' favorite topics; recently, however, non-philosophers have increasingly taken a keen interest in these issues. This volume accordingly contains papers by both philosophers and non-philosophers, including scholars from nine academic disciplines.
Statistics and social critique
Directory of Open Access Journals (Sweden)
Alain Desrosières
2014-07-01
Full Text Available This paper focuses on the history of the uses of statistics as a tool for social critique. Whereas nowadays they are very often conceived as being in the hands of the powerful, there are many historical cases when they were, on the contrary, used to oppose the authority. The author first illustrates the theory of Ted Porter according to which quantification might be a “tool of weakness”. He then addresses the fact that statistics were used in the context of labour and living conditions, thus being a resource for the lower class of society (and presenting the theory of statistics of Pelloutier, an anarchist activist). Finally comes the question of the conditions of success of these counter-propositions, discussed on the examples of the new random experiments in public policies, and of the measure of the richest 1% of persons.
Per Object statistical analysis
DEFF Research Database (Denmark)
2008-01-01
This RS code is to do Object-by-Object analysis of each Object's sub-objects, e.g. statistical analysis of an object's individual image data pixels. Statistics, such as percentiles (so-called "quartiles") are derived by the process, but the return of that can only be a Scene Variable, not an Object...... an analysis of the values of the object's pixels in MS-Excel. The shell of the procedure could also be used for purposes other than just the derivation of Object - Sub-object statistics, e.g. rule-based assignment processes....... Variable. This procedure was developed in order to be able to export objects as ESRI shape data with the 90-percentile of the Hue of each object's pixels as an item in the shape attribute table. This procedure uses a sub-level single pixel chessboard segmentation, loops for each of the objects...
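The per-object percentile computation described here (e.g. the 90th percentile of each object's Hue pixels) can be sketched outside any GIS package; the linear-interpolation percentile definition, the object names, and the pixel values below are illustrative assumptions:

```python
def percentile(values, p):
    """p-th percentile (0-100) with linear interpolation between
    order statistics (the common 'linear' definition)."""
    vals = sorted(values)
    if len(vals) == 1:
        return float(vals[0])
    k = (len(vals) - 1) * p / 100.0
    lo = int(k)
    hi = min(lo + 1, len(vals) - 1)
    return vals[lo] + (vals[hi] - vals[lo]) * (k - lo)

# Hypothetical objects mapped to the hue values of their pixels.
objects = {
    "object_1": [10, 20, 30, 40, 50, 60, 70, 80, 90, 100],
    "object_2": [5, 5, 6, 7, 200],
}
# One summary value per object, ready to write to an attribute table.
hue_p90 = {name: percentile(hues, 90) for name, hues in objects.items()}
```

Collapsing each object's pixel distribution to a single percentile is what makes the result exportable as one attribute per shape record.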
DEFF Research Database (Denmark)
Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard
Statistics for Finance develops students’ professional skills in statistics with applications in finance. Developed from the authors’ courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics that rarely connect concepts to data and books on econometrics and time series analysis that do not cover specific problems related to option valuation. The book discusses applications of financial derivatives pertaining to risk assessment and elimination. The authors cover various statistical and mathematical techniques, including linear and nonlinear time series analysis, stochastic calculus models, stochastic differential equations, Itō’s formula, the Black–Scholes model, the generalized method-of-moments, and the Kalman filter. They explain how these tools are used to price financial derivatives...
Energy Technology Data Exchange (ETDEWEB)
NONE
2010-07-01
Detailed, complete, timely and reliable statistics are essential to monitor the energy situation at a country level as well as at an international level. Energy statistics on supply, trade, stocks, transformation and demand are indeed the basis for any sound energy policy decision. For instance, the market of oil -- which is the largest traded commodity worldwide -- needs to be closely monitored in order for all market players to know at any time what is produced, traded, stocked and consumed and by whom. In view of the role and importance of energy in world development, one would expect basic energy information to be readily available and reliable. This is not always the case and one can even observe a decline in the quality, coverage and timeliness of energy statistics over the last few years.
Statistical inferences in phylogeography
DEFF Research Database (Denmark)
Nielsen, Rasmus; Beaumont, Mark A
2009-01-01
can randomly lead to multiple different genealogies. Likewise, the same gene trees can arise under different demographic models. This problem has led to the emergence of many statistical methods for making phylogeographic inferences. A popular phylogeographic approach based on nested clade analysis...... is challenged by the fact that a certain amount of the interpretation of the data is left to the subjective choices of the user, and it has been argued that the method performs poorly in simulation studies. More rigorous statistical methods based on coalescence theory have been developed. However, these methods...... may also be challenged by computational problems or poor model choice. In this review, we will describe the development of statistical methods in phylogeographic analysis, and discuss some of the challenges facing these methods....
Multivariate Statistical Process Control
DEFF Research Database (Denmark)
Kulahci, Murat
2013-01-01
As sensor and computer technology continues to improve, it becomes a normal occurrence that we confront with high dimensional data sets. As in many areas of industrial statistics, this brings forth various challenges in statistical process control (SPC) and monitoring for which the aim...... is to identify “out-of-control” state of a process using control charts in order to reduce the excessive variation caused by so-called assignable causes. In practice, the most common method of monitoring multivariate data is through a statistic akin to the Hotelling’s T2. For high dimensional data with excessive...... in conjunction with image data are plagued with various challenges beyond the usual ones encountered in current applications. In this presentation we will introduce the basic ideas of SPC and the multivariate control charts commonly used in industry. We will further discuss the challenges the practitioners...
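The Hotelling's T2 monitoring statistic mentioned above compares a new multivariate observation with the mean and covariance of in-control reference data. A minimal sketch (the simulated data, seed, and function name are illustrative, not from the presentation):

```python
import numpy as np

def hotelling_t2(X, x_new):
    """Hotelling's T^2 distance of a new observation x_new from the
    mean of reference data X (rows = observations, cols = variables),
    scaled by the inverse sample covariance matrix."""
    xbar = X.mean(axis=0)
    S = np.cov(X, rowvar=False)          # p x p sample covariance
    diff = x_new - xbar
    return float(diff @ np.linalg.solve(S, diff))

# In-control reference data: 200 observations of 3 variables.
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 3))

in_control = hotelling_t2(X, np.zeros(3))          # near the mean
out_of_control = hotelling_t2(X, np.full(3, 4.0))  # far from the mean
print(in_control < out_of_control)  # True
```

In practice the statistic is compared against a control limit derived from an F distribution; the high-dimensional difficulties the abstract raises stem from estimating and inverting S when the number of variables approaches the number of observations.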
Probability and Bayesian statistics
1987-01-01
This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...
Perception in statistical graphics
VanderPlas, Susan Ruth
There has been quite a bit of research on statistical graphics and visualization, generally focused on new types of graphics, new software to create graphics, interactivity, and usability studies. Our ability to interpret and use statistical graphics hinges on the interface between the graph itself and the brain that perceives and interprets it, and there is substantially less research on the interplay between graph, eye, brain, and mind than is sufficient to understand the nature of these relationships. The goal of the work presented here is to further explore the interplay between a static graph, the translation of that graph from paper to mental representation (the journey from eye to brain), and the mental processes that operate on that graph once it is transferred into memory (mind). Understanding the perception of statistical graphics should allow researchers to create more effective graphs which produce fewer distortions and viewer errors while reducing the cognitive load necessary to understand the information presented in the graph. Taken together, these experiments should lay a foundation for exploring the perception of statistical graphics. There has been considerable research into the accuracy of numerical judgments viewers make from graphs, and these studies are useful, but it is more effective to understand how errors in these judgments occur so that the root cause of the error can be addressed directly. Understanding how visual reasoning relates to the ability to make judgments from graphs allows us to tailor graphics to particular target audiences. In addition, understanding the hierarchy of salient features in statistical graphics allows us to clearly communicate the important message from data or statistical models by constructing graphics which are designed specifically for the perceptual system.
READING STATISTICS AND RESEARCH
Directory of Open Access Journals (Sweden)
Reviewed by Yavuz Akbulut
2008-10-01
Full Text Available The book demonstrates the best and most conservative ways to decipher and critique research reports, particularly for social science researchers. In addition, new editions of the book are always better organized, effectively structured and meticulously updated in line with the developments in the field of research statistics. Even the most trivial issues are revisited and updated in new editions. For instance, purchasers of the previous editions might check the interpretation of skewness and kurtosis indices in the third edition (p. 34) and in the fifth edition (p. 29) to see how the author revisits every single detail. Theory and practice always go hand in hand in all editions of the book. Re-reading previous editions (e.g. the third edition) before reading the fifth edition gives the impression that the author never stops ameliorating his instructional text writing methods. In brief, “Reading Statistics and Research” is among the best sources showing research consumers how to understand and critically assess the statistical information and research results contained in technical research reports. In this respect, the review written by Mirko Savić in Panoeconomicus (2008, 2, pp. 249-252) will help the readers to get a more detailed overview of each chapter. I cordially urge beginning researchers to pick up a highlighter to conduct a detailed reading of the book. A thorough reading of the source will make researchers quite selective in appreciating the harmony between the data analysis, results and discussion sections of typical journal articles. If interested, beginning researchers might begin with this book to grasp the basics of research statistics, and prop up their critical research reading skills with some statistics package applications through the help of Dr. Andy Field’s book, Discovering Statistics using SPSS (second edition published by Sage in 2005).
Diffeomorphic Statistical Deformation Models
DEFF Research Database (Denmark)
Hansen, Michael Sass; Hansen, Mads Fogtmann; Larsen, Rasmus
2007-01-01
In this paper we present a new method for constructing diffeomorphic statistical deformation models in arbitrary dimensional images with a nonlinear generative model and a linear parameter space. Our deformation model is a modified version of the diffeomorphic model introduced by Cootes et al. The modifications ensure that no boundary restriction has to be enforced on the parameter space to prevent folds or tears in the deformation field. For straightforward statistical analysis, principal component analysis and sparse methods, we assume that the parameters for a class of deformations lie on a linear...
Statistics As Principled Argument
Abelson, Robert P
2012-01-01
In this illuminating volume, Robert P. Abelson delves into the too-often dismissed problems of interpreting quantitative data and then presenting them in the context of a coherent story about one's research. Unlike too many books on statistics, this is a remarkably engaging read, filled with fascinating real-life (and real-research) examples rather than with recipes for analysis. It will be of true interest and lasting value to beginning graduate students and seasoned researchers alike. The focus of the book is that the purpose of statistics is to organize a useful argument from quantitative
Energy Technology Data Exchange (ETDEWEB)
Suh, M. Y.; Jee, K. Y.; Park, K. K. [Korea Atomic Energy Research Institute, Taejon (Korea)
1999-08-01
This report is intended to describe the statistical methods necessary to design and conduct radiation counting experiments and evaluate the data from the experiments. The methods are described for the evaluation of the stability of a counting system and the estimation of the precision of counting data by application of probability distribution models. The methods for the determination of the uncertainty of the results calculated from the number of counts, as well as various statistical methods for the reduction of counting error are also described. 11 refs., 6 figs., 8 tabs. (Author)
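The Poisson model invoked above makes the basic precision estimates easy to state: the variance of a count equals its mean, and the uncertainties of independent counts add in quadrature. A minimal sketch (function name and numbers are illustrative, not from the report):

```python
import math

def net_rate_uncertainty(gross, background, t_live):
    """Net count rate and its standard uncertainty under a Poisson model.

    The variance of a Poisson count equals its mean, so sigma(N) ~ sqrt(N);
    the gross- and background-count uncertainties add in quadrature.
    """
    net_rate = (gross - background) / t_live
    sigma = math.sqrt(gross + background) / t_live
    return net_rate, sigma

# 10000 gross counts, 400 background counts in a 100 s live time (made up)
rate, sigma = net_rate_uncertainty(10000, 400, 100.0)
relative = sigma / rate   # fractional uncertainty of the net rate
```

The 1/sqrt(N) behaviour of `relative` is the usual argument for longer counting times when the error budget is dominated by counting statistics.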
Bayesian statistics an introduction
Lee, Peter M
2012-01-01
Bayesian Statistics is the school of thought that combines prior beliefs with the likelihood of a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee’s book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo based techniques. This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo (RJMCMC), providing a concise account of the way in which the Bayesian approach to statistics develops as well
Waller, Derek L
2008-01-01
Statistical analysis is essential to business decision-making and management, but the underlying theory of data collection, organization and analysis is one of the most challenging topics for business students and practitioners. This user-friendly text and CD-ROM package will help you to develop strong skills in presenting and interpreting statistical information in a business or management environment. Based entirely on using Microsoft Excel rather than more complicated applications, it includes a clear guide to using Excel with the key functions employed in the book, a glossary of terms and
Environmental accounting and statistics
International Nuclear Information System (INIS)
Bartelmus, P.L.P.
1992-01-01
The objective of sustainable development is to integrate environmental concerns with mainstream socio-economic policies. Integrated policies need to be supported by integrated data. Environmental accounting achieves this integration by incorporating environmental costs and benefits into conventional national accounts. Modified accounting aggregates can thus be used in defining and measuring environmentally sound and sustainable economic growth. Further development objectives need to be assessed by more comprehensive, though necessarily less integrative, systems of environmental statistics and indicators. Integrative frameworks for the different statistical systems in the fields of economy, environment and population would facilitate the provision of comparable data for the analysis of integrated development. (author). 19 refs, 2 figs, 2 tabs
Applied nonparametric statistical methods
Sprent, Peter
2007-01-01
While preserving the clear, accessible style of previous editions, Applied Nonparametric Statistical Methods, Fourth Edition reflects the latest developments in computer-intensive methods that deal with intractable analytical problems and unwieldy data sets. Reorganized and with additional material, this edition begins with a brief summary of some relevant general statistical concepts and an introduction to basic ideas of nonparametric or distribution-free methods. Designed experiments, including those with factorial treatment structures, are now the focus of an entire chapter. The text also e
Neave, Henry R
2012-01-01
This book, designed for students taking a basic introductory course in statistical analysis, is far more than just a book of tables. Each table is accompanied by a careful but concise explanation and useful worked examples. Requiring little mathematical background, Elementary Statistics Tables is thus not just a reference book but a positive and user-friendly teaching and learning aid. The new edition contains a new and comprehensive "teach-yourself" section on a simple but powerful approach, now well-known in parts of industry but less so in academia, to analysing and interpreting process data
Computational statistical mechanics
Hoover, WG
1991-01-01
Computational Statistical Mechanics describes the use of fast computers to simulate the equilibrium and nonequilibrium properties of gases, liquids, and solids at, and away from, equilibrium. The underlying theory is developed from basic principles and illustrated by applying it to the simplest possible examples. Thermodynamics, based on the ideal gas thermometer, is related to Gibbs' statistical mechanics through the use of Nosé-Hoover heat reservoirs. These reservoirs use integral feedback to control temperature. The same approach is carried through to the simulation and anal
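The integral-feedback idea behind Nosé-Hoover reservoirs can be sketched in a few lines: a friction variable accumulates the deviation of the kinetic energy from its target and feeds it back into the equations of motion. The toy integrator below (explicit Euler, a single harmonic oscillator, invented parameters) only illustrates the feedback loop, not the book's production simulations:

```python
def nose_hoover_oscillator(T=1.0, Q=1.0, dt=0.001, steps=20_000):
    """Integral-feedback thermostat on a single harmonic oscillator.

    The friction variable zeta integrates the deviation of the kinetic
    energy v*v from the target temperature T; positive zeta damps the
    motion, negative zeta pumps energy back in.
    """
    x, v, zeta = 1.0, 0.0, 0.0
    ke_sum = 0.0
    for _ in range(steps):
        # explicit Euler step -- adequate only for illustration
        x += v * dt
        v += (-x - zeta * v) * dt
        zeta += (v * v - T) / Q * dt
        ke_sum += v * v
    return x, v, zeta, ke_sum / steps

x, v, zeta, mean_vv = nose_hoover_oscillator()
```

A single thermostatted oscillator is famously non-ergodic, so `mean_vv` need not converge to `T` here; the point is only the feedback mechanism itself.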
International Nuclear Information System (INIS)
Suh, M. Y.; Jee, K. Y.; Park, K. K.; Park, Y. J.; Kim, W. H.
1999-08-01
This report is intended to describe the statistical methods necessary to design and conduct radiation counting experiments and evaluate the data from the experiment. The methods are described for the evaluation of the stability of a counting system and the estimation of the precision of counting data by application of probability distribution models. The methods for the determination of the uncertainty of the results calculated from the number of counts, as well as various statistical methods for the reduction of counting error are also described. (Author). 11 refs., 8 tabs., 8 figs
Bernstein, Ira; Samuels, Ellery; Woo, Ada; Hagge, Sarah L
2013-01-01
The National Council Licensure Examination (NCLEX) program has evaluated differential item functioning (DIF) using the Mantel-Haenszel (M-H) chi-square statistic. Since a Rasch model is assumed, DIF implies a difference in item difficulty between a reference group, e.g., White applicants, and a focal group, e.g., African-American applicants. The National Council of State Boards of Nursing (NCSBN) is planning to change the statistic used to evaluate DIF on the NCLEX from M-H to the separate calibration t-test (t). In actuality, M-H and t should yield identical results in large samples if the assumptions of the Rasch model hold (Linacre and Wright, 1989, also see Smith, 1996). However, as is true throughout statistics, "how large is large" is undefined, so it is quite possible that systematic differences exist in relatively smaller samples. This paper compares M-H and t in four sets of computer simulations. Three simulations used a ten-item test with nine fair items and one potentially containing DIF. To address instability that may result from a ten-item test, the fourth used a 30-item test with 29 fair items and one potentially containing DIF. Depending upon the simulation, the magnitude of population DIF (0, .5, 1.0, and 1.5 z-score units), the ability difference between the focal and reference group (-1, 0, and 1 z-score units), the focal group size (0, 10, 20, 40, 50, 80, 160, and 1000), and the reference group size (500 and 1000) were varied. The results were that: (a) differences in estimated DIF between the M-H and t statistics are generally small, (b) t tends to estimate lower chance probabilities than M-H with small sample sizes, (c) neither method is likely to detect DIF, especially when it is of slight magnitude in small focal group sizes, and (d) M-H does marginally better than t at detecting DIF but this improvement is also limited to very small focal group sizes.
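The Mantel-Haenszel statistic compared in these simulations combines, over matched score strata, 2x2 tables of item success for the reference and focal groups. A sketch of the computation (the example tables are invented, not the NCLEX data):

```python
def mantel_haenszel_chi2(strata):
    """Continuity-corrected Mantel-Haenszel chi-square for one item.

    Each stratum is a 2x2 table (a, b, c, d) = (reference correct,
    reference incorrect, focal correct, focal incorrect), with strata
    usually defined by matched total-score levels.
    """
    sum_a = sum_e = sum_v = 0.0
    for a, b, c, d in strata:
        n = a + b + c + d
        n_ref, n_foc = a + b, c + d        # group sizes in the stratum
        m1, m0 = a + c, b + d              # correct / incorrect margins
        sum_a += a
        sum_e += n_ref * m1 / n            # expected a under no DIF
        sum_v += n_ref * n_foc * m1 * m0 / (n * n * (n - 1))
    return (abs(sum_a - sum_e) - 0.5) ** 2 / sum_v

# invented tables in which the reference group outperforms the focal group
chi2 = mantel_haenszel_chi2([(80, 20, 40, 60), (75, 25, 35, 65)])
```

Under the null of no DIF the statistic is approximately chi-square with one degree of freedom, which is what makes the small-sample comparison with the separate-calibration t-test interesting.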
DEFF Research Database (Denmark)
Larsen, Gunner Chr.; Bierbooms, W.; Hansen, Kurt Schaldemose
2003-01-01
A theoretical expression for the probability density function associated with local extremes of a stochastic process is presented. The expression is based on the lower four statistical moments and a bandwidth parameter. The theoretical expression is subsequently verified by comparison with simulated...
Statistical learning and prejudice.
Madison, Guy; Ullén, Fredrik
2012-12-01
Human behavior is guided by evolutionarily shaped brain mechanisms that make statistical predictions based on limited information. Such mechanisms are important for facilitating interpersonal relationships, avoiding dangers, and seizing opportunities in social interaction. We thus suggest that it is essential for analyses of prejudice and prejudice reduction to take the predictive accuracy and adaptivity of the studied prejudices into account.
Beyond quantum microcanonical statistics
International Nuclear Information System (INIS)
Fresch, Barbara; Moro, Giorgio J.
2011-01-01
Descriptions of molecular systems usually refer to two distinct theoretical frameworks. On the one hand the quantum pure state, i.e., the wavefunction, of an isolated system is determined to calculate molecular properties and their time evolution according to the unitary Schroedinger equation. On the other hand a mixed state, i.e., a statistical density matrix, is the standard formalism to account for thermal equilibrium, as postulated in the microcanonical quantum statistics. In the present paper an alternative treatment relying on a statistical analysis of the possible wavefunctions of an isolated system is presented. In analogy with the classical ergodic theory, the time evolution of the wavefunction determines the probability distribution in the phase space pertaining to an isolated system. However, this alone cannot account for a well defined thermodynamical description of the system in the macroscopic limit, unless a suitable probability distribution for the quantum constants of motion is introduced. We present a workable formalism assuring the emergence of typical values of thermodynamic functions, such as the internal energy and the entropy, in the large size limit of the system. This allows the identification of macroscopic properties independently of the specific realization of the quantum state. A description of material systems in agreement with equilibrium thermodynamics is then derived without constraints on the physical constituents and interactions of the system. Furthermore, the canonical statistics is recovered in all generality for the reduced density matrix of a subsystem.
Swiss electricity statistics 1982
International Nuclear Information System (INIS)
1983-01-01
The Swiss Department of Energy has published electricity statistics for 1982. This report presents them in tabular form. The tables are classified under the following headings: important reference numbers, Swiss electricity review, production of electrical energy, use of electrical energy, load diagrams and coping with user requirements, import and export of energy 1982, possible building of power stations before 1989, finance, appendix
Simple Statistics: - Summarized!
Blai, Boris, Jr.
Statistics are an essential tool for making sound decisions. The field is concerned with probability distribution models, testing of hypotheses, significance tests and other means of determining the correctness of deductions and the most likely outcome of decisions. Measures of central tendency include the mean, median and mode. A second…
Minnesota forest statistics, 1990.
Patrick D. Miles; Chung M. Chen
1992-01-01
The fifth inventory of Minnesota's forests reports 51.0 million acres of land, of which 16.7 million acres are forested. This bulletin presents statistical highlights and contains detailed tables of forest area, as well as timber volume, growth, removals, mortality, and ownership.
Kleibergen, F.R.; Kleijn, R.; Paap, R.
2000-01-01
We propose a novel Bayesian test under a (noninformative) Jeffreys' prior specification. We check whether the fixed scalar value of the so-called Bayesian Score Statistic (BSS) under the null hypothesis is a plausible realization from its known and standardized distribution under the alternative. Unlike
African Journals Online (AJOL)
'perverse incentive' to under-record violent crime, particularly the various forms of assault. In effect this has rendered the SAPS statistics for inter-personal. * Gould and Burger are senior researchers in the Crime and Justice Programme of the ISS ...
International Monetary Fund
1995-01-01
This paper provides statistical data of macroeconomic flows, national accounts, production, and employment, combined public sector, financial sector, and external sector. They are listed as follows: gross domestic product by expenditure, mining reserves and production, investment in petroleum exploration, consumer prices, public sector employment, operations of the central government, monetary surveys, selected interest rates, open market bills, balance of payments, exports by principal produ...
W. Brad Smith; Mark F. Golitz
1988-01-01
The third inventory of Indiana's timber resource shows that timberland area in Indiana climbed from 3.9 to 4.3 million acres between 1967 and 1986, an increase of more than 10%. During the same period growing-stock volume increased 43%. Highlights and statistics are presented on area, volume, growth, mortality, and removals.
International Nuclear Information System (INIS)
Oelkers, E.; Heller, A.S.; Farnsworth, D.A.; Kearfott, K.J.
1978-01-01
The report describes the statistical analysis of DNBR thermal-hydraulic margin of a 3800 MWt, 205-FA core under design overpower conditions. The analysis used LYNX-generated data at predetermined values of the input variables whose uncertainties were to be statistically combined. LYNX data were used to construct an efficient response surface model in the region of interest; the statistical analysis was accomplished through the evaluation of core reliability, utilizing propagation of the uncertainty distributions of the inputs. The response surface model was implemented in both the analytical error propagation and Monte Carlo techniques. The basic structural units relating to the acceptance criteria are fuel pins. Therefore, the statistical population of pins with minimum DNBR values smaller than specified values is determined. The specified values are designated relative to the most probable and maximum design DNBR values on the power-limiting pin used in present design analysis, so that gains over the present design criteria could be assessed for specified probabilistic acceptance criteria. The results are equivalent to gains ranging from 1.2 to 4.8 percent of rated power, dependent on the acceptance criterion. The corresponding acceptance criteria range from 95 percent confidence that no pin will be in DNB to 99.9 percent of the pins expected to avoid DNB
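The propagation step described above can be illustrated in miniature: replace the LYNX-fitted response surface with an invented quadratic in two standardized inputs, sample the input uncertainties, and estimate the probability that minimum DNBR falls below a limit. Every numeric value here is a placeholder, not from the report:

```python
import random

def min_dnbr_surface(x1, x2):
    """Stand-in quadratic response surface for minimum DNBR.

    The real analysis fitted such a surface to LYNX data; these
    coefficients are invented purely for illustration.
    """
    return 2.0 - 0.3 * x1 - 0.2 * x2 + 0.05 * x1 * x2

def prob_below(limit, n=50_000, seed=1):
    """Monte Carlo propagation: sample standardized input uncertainties
    and count how often the surface predicts DNBR below the limit."""
    rng = random.Random(seed)
    hits = sum(
        min_dnbr_surface(rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)) < limit
        for _ in range(n))
    return hits / n

p_design = prob_below(2.0)   # limit at the surface's central value
p_strict = prob_below(1.0)   # far stricter limit, a much rarer event
```

The analytical error-propagation route mentioned in the abstract would instead linearize the surface and combine the input variances algebraically; Monte Carlo trades that algebra for sampling.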
Indian Academy of Sciences (India)
Fermi–Dirac Statistics. Subhash Chaturvedi and Shyamal Biswas. Resonance – Journal of Science Education, Volume 19, Issue 1. School of Physics, University of Hyderabad, C R Rao Road, Gachibowli, Hyderabad 500 046, India.
International Nuclear Information System (INIS)
Banerjee, Rabin; Majhi, Bibhas Ranjan
2010-01-01
Starting from the definition of entropy used in statistical mechanics we show that it is proportional to the gravity action. For a stationary black hole this entropy is expressed as S=E/2T, where T is the Hawking temperature and E is shown to be the Komar energy. This relation is also compatible with the generalized Smarr formula for mass.
Search Databases and Statistics
DEFF Research Database (Denmark)
Refsgaard, Jan C; Munk, Stephanie; Jensen, Lars J
2016-01-01
searches. Additionally, careful filtering and use of appropriate statistical tests on the output datasets affects the quality of all downstream analyses and interpretation of the data. Our considerations and general practices on these aspects of phosphoproteomics data processing are presented here....
Mosteller, Frederick; Hoaglin, David C; Tanur, Judith M
2010-01-01
Includes chapter-length insider accounts of work on the pre-election polls of 1948, statistical aspects of the Kinsey report on sexual behavior in the human male, mathematical learning theory, authorship of the disputed Federalist papers, safety of anesthetics, and an examination of the Coleman report on equality of educational opportunity
Elementary statistical physics
Kittel, C
1965-01-01
This book is intended to help physics students attain a modest working knowledge of several areas of statistical mechanics, including stochastic processes and transport theory. The areas discussed are among those forming a useful part of the intellectual background of a physicist.
Statistics for Learning Genetics
Charles, Abigail Sheena
2012-01-01
This study investigated the knowledge and skills that biology students may need to help them understand statistics/mathematics as it applies to genetics. The data are based on analyses of current representative genetics texts, practicing genetics professors' perspectives, and more directly, students' perceptions of, and performance in, doing…
Fisher's Contributions to Statistics
Indian Academy of Sciences (India)
research workers of various disciplines in designing their studies and in analysing data thereof. He is also called upon to advise organisations like the ... such visual aids. It is believed that this situation helped him develop a keen geometrical sense. Fisher's contributions to statistics have also given rise to a number of bitter ...
Topics in Statistical Calibration
2014-03-27
type of garden cress called nasturtium. The response is the weight of the plant in milligrams (mg) after three weeks of growth, and the predictor is the...7 (1):1–26, 1979. B. Efron. The bootstrap and Markov chain Monte Carlo. Journal of Biopharmaceutical Statistics, 21(6):1052–1062, 2011. B. Efron and
Air Carrier Traffic Statistics.
2013-11-01
This report contains airline operating statistics for large certificated air carriers based on data reported to U.S. Department of Transportation (DOT) by carriers that hold a certificate issued under Section 401 of the Federal Aviation Act of 1958 a...
Statistically Valid Planting Trials
C. B. Briscoe
1961-01-01
More than 100 million tree seedlings are planted each year in Latin America, and at least ten times that many should be planted. Rational control and development of a program of such magnitude require establishing and interpreting carefully planned trial plantings which will yield statistically valid answers to real and important questions. Unfortunately, many...
Geometric statistical inference
International Nuclear Information System (INIS)
Periwal, Vipul
1999-01-01
A reparametrization-covariant formulation of the inverse problem of probability is explicitly solved for finite sample sizes. The inferred distribution is explicitly continuous for finite sample size. A geometric solution of the statistical inference problem in higher dimensions is outlined
On quantum statistical inference
Barndorff-Nielsen, O.E.; Gill, R.D.; Jupp, P.E.
2001-01-01
Recent developments in the mathematical foundations of quantum mechanics have brought the theory closer to that of classical probability and statistics. On the other hand, the unique character of quantum physics sets many of the questions addressed apart from those met classically in stochastics.
Whither Statistics Education Research?
Watson, Jane
2016-01-01
This year marks the 25th anniversary of the publication of a "National Statement on Mathematics for Australian Schools", which was the first curriculum statement this country had including "Chance and Data" as a significant component. It is hence an opportune time to survey the history of the related statistics education…
Statistical Hadronization and Holography
DEFF Research Database (Denmark)
Bechi, Jacopo
2009-01-01
In this paper we consider some issues of the statistical model of hadronization in a holographic approach. We introduce a Rindler-like horizon in the bulk and we understand the string breaking as a tunneling event under this horizon. We calculate the hadron spectrum and we get a thermal, a...
ISSN 2073-9990 East Cent. Afr. J. surg. (Online)
African Journals Online (AJOL)
Duodenal perforations ... Keywords: gastroduodenal perforations, perforated peptic ulcer disease, gastric perforations ... predetermined outcomes using Chi-square and Fisher's exact tests were used to detect statistical significance ...
Software Used to Generate Cancer Statistics - SEER Cancer Statistics
Videos that highlight topics and trends in cancer statistics and definitions of statistical terms, as well as software tools for analyzing and reporting cancer statistics, which are used to compile SEER's annual reports.
Statistical theory and inference
Olive, David J
2014-01-01
This text is for a one semester graduate course in statistical theory and covers minimal and complete sufficient statistics, maximum likelihood estimators, method of moments, bias and mean square error, uniform minimum variance estimators and the Cramer-Rao lower bound, an introduction to large sample theory, likelihood ratio tests and uniformly most powerful tests and the Neyman Pearson Lemma. A major goal of this text is to make these topics much more accessible to students by using the theory of exponential families. Exponential families, indicator functions and the support of the distribution are used throughout the text to simplify the theory. More than 50 "brand name" distributions are used to illustrate the theory with many examples of exponential families, maximum likelihood estimators and uniformly minimum variance unbiased estimators. There are many homework problems with over 30 pages of solutions.
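As a small companion to the text's treatment of maximum likelihood and the Cramer-Rao lower bound, the following simulation (parameters invented for illustration) checks that the MLE of a Poisson mean, the sample mean, has variance close to the bound lam/n:

```python
import math
import random
import statistics

def poisson_draw(lam, rng):
    """One Poisson variate via Knuth's multiplication method."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

# The Poisson distribution is an exponential family, and the MLE of its
# mean is the sample mean, whose variance attains the bound lam / n.
rng = random.Random(0)
lam, n, reps = 4.0, 50, 2000
mles = [statistics.fmean(poisson_draw(lam, rng) for _ in range(n))
        for _ in range(reps)]
crlb = lam / n
mle_var = statistics.variance(mles)
```

The empirical variance of the replicated MLEs should hover around `crlb`, which is the kind of efficiency statement the Cramer-Rao chapter formalizes.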
Computer intensive statistical methods
Yakowitz, S.
The special session “Computer-Intensive Statistical Methods” was held in morning and afternoon parts at the 1985 AGU Fall Meeting in San Francisco, Calif. Its mission was to provide a forum for hydrologists and statisticians who are active in bringing unconventional, algorithmic-oriented statistical techniques to bear on problems of hydrology. Statistician Emanuel Parzen (Texas A&M University, College Station, Tex.) opened the session by relating recent developments in quantile estimation methods and showing how properties of such methods can be used to advantage to categorize runoff data previously analyzed by I. Rodriguez-Iturbe (Universidad Simon Bolivar, Caracas, Venezuela). Statistician Eugene Schuster (University of Texas, El Paso) discussed recent developments in nonparametric density estimation which enlarge the framework for convenient incorporation of prior and ancillary information. These extensions were motivated by peak annual flow analysis. Mathematician D. Myers (University of Arizona, Tucson) gave a brief overview of “kriging” and outlined some recently developed methodology.
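Nonparametric density estimation of the kind Schuster discussed is easy to sketch: a Gaussian kernel density estimate with Silverman's rule-of-thumb bandwidth (the kernel and bandwidth rule here are standard defaults, not necessarily those presented in the talk):

```python
import math

def gaussian_kde(data, bandwidth=None):
    """Gaussian kernel density estimate; returns a callable density.

    With no bandwidth given, uses Silverman's rule of thumb,
    h = 1.06 * sd * n**(-1/5).
    """
    n = len(data)
    if bandwidth is None:
        mean = sum(data) / n
        sd = math.sqrt(sum((x - mean) ** 2 for x in data) / (n - 1))
        bandwidth = 1.06 * sd * n ** (-0.2)
    h = bandwidth
    norm = n * h * math.sqrt(2.0 * math.pi)

    def density(x):
        # average of Gaussian bumps centered at the observations
        return sum(math.exp(-0.5 * ((x - xi) / h) ** 2) for xi in data) / norm

    return density

f = gaussian_kde([-2.0, -1.0, 0.0, 1.0, 2.0])
```

For peak-flow work one would typically estimate the density of annual maxima and read off exceedance probabilities from its integral.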
Tighe, Brian
2009-03-01
We study the statistics of contact forces in the force network ensemble, a minimal model of jammed granular media that emphasizes the role of vector force balance. We show that the force probability distribution can be calculated analytically by way of an analogy to equilibrium ensemble methods. In two dimensions the large force tail decays asymptotically as a Gaussian, distinct from earlier predictions, due to the existence of a conserved quantity related to the presence of local vector force balance. We confirm our predictions with highly accurate statistical sampling -- we sample the force distribution over more than 40 decades -- permitting unambiguous confrontation of theory with numerics. We show how the conserved quantity arises naturally within the context of any constant stress ensemble.
Classical and statistical thermodynamics
Rizk, Hanna A
2016-01-01
This is a textbook of thermodynamics for the student who seeks thorough training in science or engineering. Systematic and thorough treatment of the fundamental principles, rather than presentation of a large mass of facts, has been stressed. The book includes some of the historical and humanistic background of thermodynamics, but without affecting the continuity of the analytical treatment. For a clearer and more profound understanding of thermodynamics this book is highly recommended. In this respect, the author believes that a sound grounding in classical thermodynamics is an essential prerequisite for the understanding of statistical thermodynamics. Such a book comprising the two wide branches of thermodynamics is in fact unprecedented. As a work dealing systematically with the two main branches of thermodynamics, namely classical thermodynamics and statistical thermodynamics, together with some important indexes under only one cover, this treatise is eminently useful.
General and Statistical Thermodynamics
Tahir-Kheli, Raza
2012-01-01
This textbook explains general and statistical thermodynamics in full. It begins with an introductory statistical mechanics course, deriving all the important formulae meticulously and explicitly, without mathematical shortcuts. The main part of the book deals with the careful discussion of the concepts and laws of thermodynamics, the van der Waals, Kelvin and Clausius theories, ideal and real gases, thermodynamic potentials, phonons and all the related aspects. To elucidate the concepts introduced and to provide practical problem-solving support, numerous carefully worked examples are of great value for students. The text is clearly written and punctuated with many interesting anecdotes. This book is written as a main textbook for upper undergraduate students attending a course on thermodynamics.
Applied statistical thermodynamics
Lucas, Klaus
1991-01-01
The book guides the reader from the foundations of statistical thermodynamics, including the theory of intermolecular forces, to modern computer-aided applications in chemical engineering and physical chemistry. The approach is new. The foundations of quantum and statistical mechanics are presented in a simple way and their applications to the prediction of fluid phase behavior of real systems are demonstrated. A particular effort is made to introduce the reader to explicit formulations of intermolecular interaction models and to show how these models influence the properties of fluid systems. The established methods of statistical mechanics - computer simulation, perturbation theory, and numerical integration - are discussed in a style appropriate for newcomers and are extensively applied. Numerous worked examples illustrate how practical calculations should be carried out.
Genton, Marc G.
2015-04-14
This paper explores the use of visualization through animations, coined visuanimation, in the field of statistics. In particular, it illustrates the embedding of animations in the paper itself and the storage of larger movies in the online supplemental material. We present results from statistics research projects using a variety of visuanimations, ranging from exploratory data analysis of image data sets to spatio-temporal extreme event modelling; these include a multiscale analysis of classification methods, the study of the effects of a simulated explosive volcanic eruption and an emulation of climate model output. This paper serves as an illustration of visuanimation for future publications in Stat. Copyright © 2015 John Wiley & Sons, Ltd.
Fermions from classical statistics
International Nuclear Information System (INIS)
Wetterich, C.
2010-01-01
We describe fermions in terms of a classical statistical ensemble. The states τ of this ensemble are characterized by a sequence of values one or zero, or a corresponding set of two-level observables. Every classical probability distribution can be associated to a quantum state for fermions. If the time evolution of the classical probabilities p_τ amounts to a rotation of the wave function q_τ(t) = ±√(p_τ(t)), we infer the unitary time evolution of a quantum system of fermions according to a Schroedinger equation. We establish how such classical statistical ensembles can be mapped to Grassmann functional integrals. Quantum field theories for fermions arise for a suitable time evolution of classical probabilities for generalized Ising models.
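The key bookkeeping step, identifying amplitudes q_τ = ±√(p_τ) and letting the time evolution rotate them, can be checked numerically in miniature: any rotation of the amplitude vector preserves Σ q_τ² = 1, so the evolved amplitudes again define a valid probability distribution. This tiny example illustrates only that bookkeeping, not Wetterich's full construction:

```python
import math

# Amplitudes from a classical probability distribution over three states
p = [0.5, 0.3, 0.2]
q = [math.sqrt(x) for x in p]          # q_tau = sqrt(p_tau)

# A rotation in the (q_0, q_1) plane, standing in for the time evolution
theta = 0.7
q_new = [math.cos(theta) * q[0] - math.sin(theta) * q[1],
         math.sin(theta) * q[0] + math.cos(theta) * q[1],
         q[2]]

# Squaring the rotated amplitudes gives a new, still-normalized
# probability distribution: rotations preserve sum(q_tau**2) = 1.
p_new = [x * x for x in q_new]
```

The squares p_new are automatically nonnegative and sum to one, which is why an orthogonal evolution of q induces a legitimate evolution of the classical probabilities.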
1979 DOE statistical symposium
International Nuclear Information System (INIS)
Gardiner, D.A.; Truett, T.
1980-09-01
The 1979 DOE Statistical Symposium was the fifth in the series of annual symposia designed to bring together statisticians and other interested parties who are actively engaged in helping to solve the nation's energy problems. The program included presentations of technical papers centered around exploration and disposal of nuclear fuel, general energy-related topics, and health-related issues, and workshops on model evaluation, risk analysis, analysis of large data sets, and resource estimation
1979 DOE statistical symposium
Energy Technology Data Exchange (ETDEWEB)
Gardiner, D.A.; Truett T. (comps. and eds.)
1980-09-01
The 1979 DOE Statistical Symposium was the fifth in the series of annual symposia designed to bring together statisticians and other interested parties who are actively engaged in helping to solve the nation's energy problems. The program included presentations of technical papers centered around exploration and disposal of nuclear fuel, general energy-related topics, and health-related issues, and workshops on model evaluation, risk analysis, analysis of large data sets, and resource estimation.
Asymptotics in Quantum Statistics
Gill, Richard D.
2004-01-01
Observations or measurements taken of a quantum system (a small number of fundamental particles) are inherently random. If the state of the system depends on unknown parameters, then the distribution of the outcome depends on these parameters too, and statistical inference problems result. Often one has a choice of what measurement to take, corresponding to different experimental set-ups or settings of measurement apparatus. This leads to a design problem--which measurement is best for a give...
Information in statistical physics
Balian, R.
2005-01-01
We review with a tutorial scope the information theory foundations of quantum statistical physics. Only a small proportion of the variables that characterize a system at the microscopic scale can be controlled, for both practical and theoretical reasons, and a probabilistic description involving the observers is required. The criterion of maximum von Neumann entropy is then used for making reasonable inferences. It means that no spurious information is introduced besides the known data. Its o...
International Nuclear Information System (INIS)
2003-01-01
This report has 12 chapters. The first chapter covers world energy reserves; the second covers world primary energy production and consumption. Other chapters include: world energy prices; energy reserves in Turkey; Turkey's primary energy production and consumption; Turkey's energy balance tables; Turkey's primary energy reserves, production, consumption, imports and exports; sectoral energy consumption; Turkey's secondary electricity plants; Turkey's energy investments; and Turkey's energy prices. This report gives world and Turkey energy statistics.
Milewski, Emil G
2012-01-01
REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams or doing homework, and will remain a lasting reference source for students, teachers, and professionals. Statistics I covers frequency distributions, numerical methods of describing data, measures of variability, parameters of distributions, probability theory, and distributions.
READING STATISTICS AND RESEARCH
Reviewed by Yavuz Akbulut
2008-01-01
The book demonstrates the best and most conservative ways to decipher and critique research reports particularly for social science researchers. In addition, new editions of the book are always better organized, effectively structured and meticulously updated in line with the developments in the field of research statistics. Even the most trivial issues are revisited and updated in new editions. For instance, purchaser of the previous editions might check the interpretation of skewness and ku...
Statistical and theoretical research
International Nuclear Information System (INIS)
Anon.
1983-01-01
Significant accomplishments include the creation of field designs to detect population impacts, new census procedures for small mammals, and methods for designing studies to determine where, and in what amounts, a contaminant is present over certain landscapes. A book describing these statistical methods is currently being written and will apply to a variety of environmental contaminants, including radionuclides. PNL scientists also have devised an analytical method for predicting the success of field experiments on wild populations. Two highlights of current research are the discoveries that populations of free-roaming horse herds can double in four years and that grizzly bear populations may be substantially smaller than once thought. As stray horses become a public nuisance at DOE and other large Federal sites, it is important to determine their number. Similar statistical theory can be readily applied to other situations where wild animals are a problem of concern to other government agencies. Another book, on statistical aspects of radionuclide studies, is written specifically for researchers in radioecology.
Finkelstein, Michael O
2015-01-01
This classic text, first published in 1990, is designed to introduce law students, law teachers, practitioners, and judges to the basic ideas of mathematical probability and statistics as they have been applied in the law. The third edition includes over twenty new sections, including the addition of timely topics, like New York City police stops, exonerations in death-sentence cases, projecting airline costs, and new material on various statistical techniques such as the randomized response survey technique, rare-events meta-analysis, competing risks, and negative binomial regression. The book consists of sections of exposition followed by real-world cases and case studies in which statistical data have played a role. The reader is asked to apply the theory to the facts, to calculate results (a hand calculator is sufficient), and to explore legal issues raised by quantitative findings. The authors' calculations and comments are given in the back of the book. As with previous editions, the cases and case stu...
Testing statistical hypotheses
Lehmann, E L
2005-01-01
The third edition of Testing Statistical Hypotheses updates and expands upon the classic graduate text, emphasizing optimality theory for hypothesis testing and confidence sets. The principal additions include a rigorous treatment of large sample optimality, together with the requisite tools. In addition, an introduction to the theory of resampling methods such as the bootstrap is developed. The sections on multiple testing and goodness of fit testing are expanded. The text is suitable for Ph.D. students in statistics and includes over 300 new problems out of a total of more than 760. E.L. Lehmann is Professor of Statistics Emeritus at the University of California, Berkeley. He is a member of the National Academy of Sciences and the American Academy of Arts and Sciences, and the recipient of honorary degrees from the University of Leiden, The Netherlands and the University of Chicago. He is the author of Elements of Large-Sample Theory and (with George Casella) he is also the author of Theory of Point Estimat...
National Center for Health Statistics
Beginning statistics with data analysis
Mosteller, Frederick; Rourke, Robert EK
2013-01-01
This introduction to the world of statistics covers exploratory data analysis, methods for collecting data, formal statistical inference, and techniques of regression and analysis of variance. 1983 edition.
Statistical Inference at Work: Statistical Process Control as an Example
Bakker, Arthur; Kent, Phillip; Derry, Jan; Noss, Richard; Hoyles, Celia
2008-01-01
To characterise statistical inference in the workplace this paper compares a prototypical type of statistical inference at work, statistical process control (SPC), with a type of statistical inference that is better known in educational settings, hypothesis testing. Although there are some similarities between the reasoning structure involved in…
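The contrast the paper draws can be made concrete: a Shewhart-style control chart flags individual observations outside limits derived from in-control history, rather than formally rejecting a null hypothesis as in classical testing. A minimal sketch, with hypothetical process measurements:

```python
import statistics

def control_limits(samples, sigma_mult=3.0):
    """Shewhart-style control limits from historical in-control data."""
    mean = statistics.fmean(samples)
    sd = statistics.stdev(samples)
    return mean - sigma_mult * sd, mean + sigma_mult * sd

# Hypothetical in-control measurements of a process characteristic.
history = [10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 9.9, 10.0, 10.1, 9.9]
lo, hi = control_limits(history)

# New observations are flagged for investigation, not formally "rejected"
# as in a hypothesis test.
new_obs = [10.0, 10.1, 11.5]
flags = [x < lo or x > hi for x in new_obs]
print(flags)  # → [False, False, True]
```

The three-sigma multiplier is the conventional SPC default; nothing here depends on a stated significance level, which is one of the differences in reasoning structure the paper examines.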
Official Statistics and Statistics Education: Bridging the Gap
Directory of Open Access Journals (Sweden)
Gal Iddo
2017-03-01
Full Text Available This article aims to challenge official statistics providers and statistics educators to ponder on how to help non-specialist adult users of statistics develop those aspects of statistical literacy that pertain to official statistics. We first document the gap in the literature in terms of the conceptual basis and educational materials needed for such an undertaking. We then review skills and competencies that may help adults to make sense of statistical information in areas of importance to society. Based on this review, we identify six elements related to official statistics about which non-specialist adult users should possess knowledge in order to be considered literate in official statistics: (1) the system of official statistics and its work principles; (2) the nature of statistics about society; (3) indicators; (4) statistical techniques and big ideas; (5) research methods and data sources; and (6) awareness and skills for citizens’ access to statistical reports. Based on this ad hoc typology, we discuss directions that official statistics providers, in cooperation with statistics educators, could take in order to (1) advance the conceptualization of skills needed to understand official statistics, and (2) expand educational activities and services, specifically by developing a collaborative digital textbook and a modular online course, to improve public capacity for understanding of official statistics.
Who Needs Statistics? | Poster
You may know the feeling. You have collected a lot of new data on an important experiment. Now you are faced with multiple groups of data, a sea of numbers, and a deadline for submitting your paper to a peer-reviewed journal. And you are not sure which data are relevant, or even the best way to present them. The statisticians at Data Management Services (DMS) know how to help. This small group of experts provides a wide array of statistical and mathematical consulting services to the scientific community at NCI at Frederick and NCI-Bethesda.
Statistical mechanics of learning
Engel, Andreas
2001-01-01
The effort to build machines that are able to learn and undertake tasks such as data mining, image processing and pattern recognition has led to the development of artificial neural networks in which learning from examples may be described and understood. The contribution to this subject made over the past decade by researchers applying the techniques of statistical mechanics is the subject of this book. The authors provide a coherent account of various important concepts and techniques that are currently only found scattered in papers, supplement this with background material in mathematics and physics, and include many examples and exercises.
Nanotechnology and statistical inference
Vesely, Sara; Vesely, Leonardo; Vesely, Alessandro
2017-08-01
We discuss some problems that arise when applying statistical inference to data with the aim of disclosing new functionalities. A predictive model analyzes the data taken from experiments on a specific material to assess the likelihood that another product, with similar structure and properties, will exhibit the same functionality. It doesn't have much predictive power if variability occurs as a consequence of a specific, non-linear behavior. We exemplify our discussion on some experiments with biased dice.
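The biased-dice experiments mentioned above are naturally analyzed with a Pearson chi-squared goodness-of-fit test against the fair-die hypothesis. A sketch with hypothetical counts; the critical value is the standard 5% point for 5 degrees of freedom:

```python
# Hypothetical counts from 600 rolls of a suspect die.
observed = [80, 90, 100, 110, 100, 120]
expected = [sum(observed) / 6] * 6  # fair-die expectation: 100 per face

# Pearson X^2 statistic: sum of (O - E)^2 / E over the six faces.
chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
print(round(chi2, 1))  # → 10.0

# Critical value for 5 degrees of freedom at the 5% level (standard table).
CRIT_5DF_05 = 11.07
print(chi2 > CRIT_5DF_05)  # → False: no evidence of bias at this level
```

With these particular counts the deviations are not large enough to reject fairness, illustrating the paper's point that apparent variability need not imply a real effect.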
Statistics in biomedical research
Directory of Open Access Journals (Sweden)
González-Manteiga, Wenceslao
2007-06-01
Full Text Available The discipline of biostatistics is nowadays a fundamental scientific component of biomedical, public health and health services research. Traditional and emerging areas of application include clinical trials research, observational studies, physiology, imaging, and genomics. The present article reviews the current situation of biostatistics, considering the statistical methods traditionally used in biomedical research, as well as the ongoing development of new methods in response to the new problems arising in medicine. Clearly, the successful application of statistics in biomedical research requires appropriate training of biostatisticians. This training should aim to give due consideration to emerging new areas of statistics, while at the same time retaining full coverage of the fundamentals of statistical theory and methodology. In addition, it is important that students of biostatistics receive formal training in relevant biomedical disciplines, such as epidemiology, clinical trials, molecular biology, genetics, and neuroscience.
Monin, A S
2007-01-01
""If ever a field needed a definitive book, it is the study of turbulence; if ever a book on turbulence could be called definitive, it is this book."" - ScienceWritten by two of Russia's most eminent and productive scientists in turbulence, oceanography, and atmospheric physics, this two-volume survey is renowned for its clarity as well as its comprehensive treatment. The first volume begins with an outline of laminar and turbulent flow. The remainder of the book treats a variety of aspects of turbulence: its statistical and Lagrangian descriptions, shear flows near surfaces and free turbulenc
Energy Technology Data Exchange (ETDEWEB)
NONE
2010-07-01
The IEA produced its first handy, pocket-sized summary of key energy data in 1997. This new edition responds to the enormously positive reaction to the book since then. Key World Energy Statistics produced by the IEA contains timely, clearly-presented data on supply, transformation and consumption of all major energy sources. The interested businessman, journalist or student will have at his or her fingertips the annual Canadian production of coal, the electricity consumption in Thailand, the price of diesel oil in Spain and thousands of other useful energy facts. It exists in different formats to suit our readers' requirements.
Nonparametric statistical inference
Gibbons, Jean Dickinson
2014-01-01
Thoroughly revised and reorganized, the fourth edition presents in-depth coverage of the theory and methods of the most widely used nonparametric procedures in statistical analysis and offers example applications appropriate for all areas of the social, behavioral, and life sciences. The book presents new material on the quantiles, the calculation of exact and simulated power, multiple comparisons, additional goodness-of-fit tests, methods of analysis of count data, and modern computer applications using MINITAB, SAS, and STATXACT. It includes tabular guides for simplified applications of tests and finding P values and confidence interval estimates.
Functional and Operatorial Statistics
Dabo-niang, Sophie
2008-01-01
An increasing number of statistical problems and methods involve infinite-dimensional aspects. This is due to the progress of technologies which allow us to store more and more information while modern instruments are able to collect data much more effectively due to their increasingly sophisticated design. This evolution directly concerns statisticians, who have to propose new methodologies while taking into account such high-dimensional data (e.g. continuous processes, functional data, etc.). The numerous applications (micro-arrays, paleo-ecological data, radar waveforms, spectrometric curv
Statistics of extragalactic supernovae
International Nuclear Information System (INIS)
Maza, J.; van den Bergh, S.
1976-01-01
It is shown that supernovae of Type II are concentrated in spiral arms whereas those of Type I show no preference for spiral-arm regions. Rediscussion of available supernova statistics suggests that Tammann may have overestimated the dependence of supernova frequency on galaxy inclination. A study of the distribution of supernovae in elliptical galaxies indicates that the supernova rate per unit luminosity may be highest among (metal-poor) stars in the halos of E galaxies. All galaxies in which supernovae are known to have occurred have been classified on the DDO system
On quantum statistical inference
DEFF Research Database (Denmark)
Barndorff-Nielsen, Ole Eiler; Gill, Richard D.; Jupp, Peter E.
Recent developments in the mathematical foundations of quantum mechanics have brought the theory closer to that of classical probability and statistics. On the other hand, the unique character of quantum physics sets many of the questions addressed apart from those met classically in stochastics. ... Furthermore, concurrent advances in experimental techniques and in the theory of quantum computation have led to a strong interest in questions of quantum information, in particular in the sense of the amount of information about unknown parameters in given observational data or accessible through various...
International petroleum statistics report
Energy Technology Data Exchange (ETDEWEB)
NONE
1995-10-01
The International Petroleum Statistics Report is a monthly publication that provides current international oil data. This report presents data on international oil production, demand, imports, exports and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). Section 2 presents an oil supply/demand balance for the world, in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries.
[Comment on] Statistical discrimination
Chinn, Douglas
In the December 8, 1981, issue of Eos, a news item reported the conclusion of a National Research Council study that sexual discrimination against women with Ph.D.'s exists in the field of geophysics. Basically, the item reported that even when allowances are made for motherhood the percentage of female Ph.D.'s holding high university and corporate positions is significantly lower than the percentage of male Ph.D.'s holding the same types of positions. The sexual discrimination conclusion, based only on these statistics, assumes that there are no basic psychological differences between men and women that might cause different populations in the employment group studied. Therefore, the reasoning goes, after taking into account possible effects from differences related to anatomy, such as women stopping their careers in order to bear and raise children, the statistical distributions of positions held by male and female Ph.D.'s ought to be very similar to one another. Any significant differences between the distributions must be caused primarily by sexual discrimination.
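The reasoning above, that under a no-difference assumption the two distributions of positions held should be similar, is exactly the kind of claim a chi-squared test of independence addresses. A sketch with purely hypothetical counts (not the NRC study's data), using the usual expected-count formula row total x column total / grand total:

```python
# Hypothetical counts: rows = (men, women), cols = (high position, other).
table = [[120, 380], [30, 270]]

row = [sum(r) for r in table]           # row totals
col = [sum(c) for c in zip(*table)]     # column totals
total = sum(row)

# Pearson X^2 for independence: sum of (O - E)^2 / E over the four cells,
# where E = row_total * col_total / grand_total.
chi2 = sum(
    (table[i][j] - row[i] * col[j] / total) ** 2 / (row[i] * col[j] / total)
    for i in range(2) for j in range(2)
)
print(round(chi2, 1))  # → 24.1
print(chi2 > 3.84)     # → True: 5% critical value, 1 degree of freedom
```

As the letter argues, a significant statistic only shows the distributions differ; attributing the difference to discrimination requires the additional assumptions discussed above.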
Statistical clumped isotope signatures
Röckmann, T.; Popa, M. E.; Krol, M. C.; Hofmann, M. E. G.
2016-01-01
High precision measurements of molecules containing more than one heavy isotope may provide novel constraints on element cycles in nature. These so-called clumped isotope signatures are reported relative to the random (stochastic) distribution of heavy isotopes over all available isotopocules of a molecule, which is the conventional reference. When multiple indistinguishable atoms of the same element are present in a molecule, this reference is calculated from the bulk (≈average) isotopic composition of the involved atoms. We show here that this referencing convention leads to apparent negative clumped isotope anomalies (anti-clumping) when the indistinguishable atoms originate from isotopically different populations. Such statistical clumped isotope anomalies must occur in any system where two or more indistinguishable atoms of the same element, but with different isotopic composition, combine in a molecule. The size of the anti-clumping signal is closely related to the difference of the initial isotope ratios of the indistinguishable atoms that have combined. Therefore, a measured statistical clumped isotope anomaly, relative to an expected (e.g. thermodynamical) clumped isotope composition, may allow assessment of the heterogeneity of the isotopic pools of atoms that are the substrate for formation of molecules. PMID:27535168
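The sign and magnitude of such statistical anomalies can be sketched numerically. Assuming a symmetric molecule whose two indistinguishable positions are filled from pools with different heavy-isotope fractions (the fractions below are hypothetical, and singly-substituted bookkeeping is ignored for simplicity), the doubly substituted abundance falls short of the stochastic reference computed from the bulk composition:

```python
# Hypothetical heavy-isotope fractions of the two pools feeding the two
# indistinguishable positions of a symmetric molecule X2.
p1, p2 = 0.010, 0.020

# Actual probability of the doubly substituted isotopocule.
actual = p1 * p2

# Conventional stochastic reference, built from the bulk (average) composition.
p_bulk = (p1 + p2) / 2
reference = p_bulk ** 2

# Apparent clumped anomaly relative to the stochastic reference (per mil).
delta = (actual / reference - 1) * 1000
print(round(delta, 1))  # → -111.1, i.e. apparent "anti-clumping"
```

Since p1*p2 <= ((p1+p2)/2)^2 always, the anomaly is never positive, and it grows with the difference between the two pools' isotope ratios, matching the paper's statement that the signal tracks the heterogeneity of the substrate pools.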
Statistical methods for forecasting
Abraham, Bovas
2009-01-01
The Wiley-Interscience Paperback Series consists of selected books that have been made more accessible to consumers in an effort to increase global appeal and general circulation. With these new unabridged softcover volumes, Wiley hopes to extend the lives of these works by making them available to future generations of statisticians, mathematicians, and scientists. "This book, it must be said, lives up to the words on its advertising cover: 'Bridging the gap between introductory, descriptive approaches and highly advanced theoretical treatises, it provides a practical, intermediate level discussion of a variety of forecasting tools, and explains how they relate to one another, both in theory and practice.' It does just that!" -Journal of the Royal Statistical Society. "A well-written work that deals with statistical methods and models that can be used to produce short-term forecasts, this book has wide-ranging applications. It could be used in the context of a study of regression, forecasting, and time series ...
International Nuclear Information System (INIS)
2004-01-01
This comprehensive report by the Swiss Federal Office of Energy (SFOE) presents statistics on energy production and consumption in Switzerland in 2003. Facts and figures are presented in tables and diagrams. First of all, a general overview of Swiss energy consumption is presented that includes details on the shares taken by the various energy carriers involved and their development during the period reviewed. The report also includes graphical representations of energy usage in various sectors such as households, trade and industry, transport and the services sector. Also, economic data on energy consumption is presented. A second chapter takes a look at energy flows from producers to consumers and presents an energy balance for Switzerland in the form of tables and an energy-flow diagram. The individual energy sources and the import, export and storage of energy carriers are discussed as is the conversion between various forms and categories of energy. Details on the consumption of energy, its growth over the years up to 2003 and energy use in various sectors are presented. Also, the Swiss energy balance with reference to the use of renewable forms of energy such as solar energy, biomass, wastes and ambient heat is discussed and figures are presented on the contribution of renewables to heating and the generation of electrical power. The third chapter provides data on the individual energy carriers and the final chapter looks at economical and ecological aspects. An appendix provides information on the methodology used in collecting the statistics and on data available in the Swiss cantons
International Nuclear Information System (INIS)
Swiss Federal Office of Energy, Berne
2003-01-01
This comprehensive report by the Swiss Federal Office of Energy (SFOE) presents statistics on energy production and consumption in Switzerland in 2002. Facts and figures are presented in tables and diagrams. First of all, a general overview of Swiss energy consumption is presented that includes details on the shares taken by the various energy carriers involved and their development during the period reviewed. The report also includes graphical representations of energy usage in various sectors such as households, trade and industry, transport and the services sector. Also, economic data on energy consumption is presented. A second chapter takes a look at energy flows from producers to consumers and presents an energy balance for Switzerland in the form of tables and an energy-flow diagram. The individual energy sources and the import, export and storage of energy carriers are discussed as is the conversion between various forms and categories of energy. Details on the consumption of energy, its growth over the years up to 2002 and energy use in various sectors are presented. Also, the Swiss energy balance with reference to the use of renewable forms of energy such as solar energy, biomass, wastes and ambient heat is discussed and figures are presented on the contribution of renewables to heating and the generation of electrical power. The third chapter provides data on the individual energy carriers and the final chapter looks at economical and ecological aspects. An appendix provides information on the methodology used in collecting the statistics and on data available in the Swiss cantons
International petroleum statistics report
Energy Technology Data Exchange (ETDEWEB)
NONE
1997-05-01
The International Petroleum Statistics Report is a monthly publication that provides current international oil data. This report is published for the use of Members of Congress, Federal agencies, State agencies, industry, and the general public. Publication of this report is in keeping with responsibilities given the Energy Information Administration in Public Law 95-91. The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995.
Experimental Mathematics and Computational Statistics
Energy Technology Data Exchange (ETDEWEB)
Bailey, David H.; Borwein, Jonathan M.
2009-04-30
The field of statistics has long been noted for techniques to detect patterns and regularities in numerical data. In this article we explore connections between statistics and the emerging field of 'experimental mathematics'. These include applications of experimental mathematics in statistics, as well as statistical methods applied to computational mathematics.
Industrial commodity statistics yearbook 2001. Production statistics (1992-2001)
International Nuclear Information System (INIS)
2003-01-01
This is the thirty-fifth in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to Industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title The Growth of World Industry and the next eight editions under the title Yearbook of Industrial Statistics. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. The statistics refer to the ten-year period 1992-2001 for about 200 countries and areas
Industrial commodity statistics yearbook 2000. Production statistics (1991-2000)
International Nuclear Information System (INIS)
2002-01-01
This is the thirty-third in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to Industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title The Growth of World Industry and the next eight editions under the title Yearbook of Industrial Statistics. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. Most of the statistics refer to the ten-year period 1991-2000 for about 200 countries and areas
Industrial commodity statistics yearbook 2002. Production statistics (1993-2002)
International Nuclear Information System (INIS)
2004-01-01
This is the thirty-sixth in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to Industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title 'The Growth of World Industry' and the next eight editions under the title 'Yearbook of Industrial Statistics'. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. The statistics refer to the ten-year period 1993-2002 for about 200 countries and areas
Petroleum 2006. Statistical elements
International Nuclear Information System (INIS)
2007-06-01
This document gathers in 5 parts, the main existing statistical data about petroleum industry in France and in the rest of the world, together with an insight into other energy sources: 1 - petroleum in the French economy (petroleum and other energies, petroleum and transports, petroleum and energy in the industry, the residential and tertiary sectors, environment: 42 pages); 2 - the French petroleum industry (exploration, production, foreign trade, transports, refining, storage, petrochemistry: 66 pages); 3 - the French market of petroleum products (evolution of sales by product and detail by region for the past year: 38 pages); 4 - prices and taxes of petroleum products (world prices and rates for crude and refined products, evolution of freight rates, retail prices and French taxes: 28 pages); 5 - petroleum in the world (world energy production and consumption, detailed petroleum activity by main areas and for the main countries: 112 pages). (J.S.)
Davison, Anthony C.
2015-04-10
Statistics of extremes concerns inference for rare events. Often the events have never yet been observed, and their probabilities must therefore be estimated by extrapolation of tail models fitted to available data. Because data concerning the event of interest may be very limited, efficient methods of inference play an important role. This article reviews this domain, emphasizing current research topics. We first sketch the classical theory of extremes for maxima and threshold exceedances of stationary series. We then review multivariate theory, distinguishing asymptotic independence and dependence models, followed by a description of models for spatial and spatiotemporal extreme events. Finally, we discuss inference and describe two applications. Animations illustrate some of the main ideas. © 2015 by Annual Reviews. All rights reserved.
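The "extrapolation of tail models fitted to available data" described above is typically done with the peaks-over-threshold method: fit a generalized Pareto distribution (GPD) to exceedances over a high threshold, then extrapolate tail probabilities. A minimal sketch, assuming SciPy's `genpareto` and using a synthetic heavy-tailed sample in place of real data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
data = rng.standard_t(df=4, size=10_000)   # synthetic heavy-tailed sample

u = np.quantile(data, 0.95)                # high threshold
exceedances = data[data > u] - u           # peaks over threshold

# Fit a generalized Pareto distribution to the exceedances (xi = shape)
shape, _, scale = stats.genpareto.fit(exceedances, floc=0)

# Extrapolate beyond the data: P(X > x) = p_u * P(GPD exceeds x - u)
p_u = exceedances.size / data.size         # empirical P(X > u)
x = u + 3.0                                # a level beyond the threshold
tail_prob = p_u * stats.genpareto.sf(x - u, shape, loc=0, scale=scale)
print(f"threshold={u:.2f}  xi={shape:.2f}  P(X>{x:.2f})~{tail_prob:.2e}")
```

The shape parameter xi controls how heavy the fitted tail is; the extrapolation step is exactly where the "efficient methods of inference" the abstract mentions matter, since the exceedance sample is small.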
Energy Technology Data Exchange (ETDEWEB)
Gouvea, Andre de; Murayama, Hitoshi
2003-10-30
'Anarchy' is the hypothesis that there is no fundamental distinction among the three flavors of neutrinos. It describes the mixing angles as random variables, drawn from well-defined probability distributions dictated by the group Haar measure. We perform a Kolmogorov-Smirnov (KS) statistical test to verify whether anarchy is consistent with all neutrino data, including the new result presented by KamLAND. We find a KS probability for Nature's choice of mixing angles equal to 64%, quite consistent with the anarchical hypothesis. In turn, assuming that anarchy is indeed correct, we compute lower bounds on |U_e3|^2, the remaining unknown 'angle' of the leptonic mixing matrix.
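The KS machinery used here can be reproduced in miniature. The sketch below (SciPy assumed) is a toy single-angle case, not the paper's full three-angle analysis: it uses the fact that under the Haar measure the solar-type mixing angle has sin^2(theta) uniformly distributed, draws angles accordingly, and runs a KS test against Uniform(0, 1):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Under the Haar measure, sin^2(theta_12) is uniform on [0, 1]
# (single-angle toy case); draw angles consistent with that.
theta = np.arcsin(np.sqrt(rng.uniform(size=500)))

# KS test: compare empirical sin^2(theta) against Uniform(0, 1)
stat, p_value = stats.kstest(np.sin(theta) ** 2, "uniform")
print(f"KS statistic = {stat:.3f}, p-value = {p_value:.2f}")
```

A large p-value (as expected here, since the sample is drawn from the hypothesized distribution) means the data are consistent with the anarchical hypothesis; the paper's 64% figure is this kind of probability computed for the actual measured angles.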
Statistical finite element analysis.
Khalaji, Iman; Rahemifar, Kaamran; Samani, Abbas
2008-01-01
A novel technique is introduced for tissue deformation and stress analysis. Compared to the conventional Finite Element method, this technique is orders of magnitude faster and yet still very accurate. The proposed technique uses preprocessed data obtained from FE analyses of a number of similar objects in a Statistical Shape Model framework. This technique takes advantage of the fact that the body organs have limited variability, especially in terms of their geometry. As such, it is well suited for calculating tissue displacements of body organs. The proposed technique can be applied in many biomedical applications such as image guided surgery, or virtual reality environment development where tissue behavior is simulated for training purposes.
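The core idea, precomputed FE solutions compressed into a low-dimensional statistical model, can be sketched with PCA over displacement fields. All data below is synthetic and hypothetical; a real pipeline would use registered FE solutions for actual organ geometries:

```python
import numpy as np

# Toy statistical shape/deformation model: FE displacement fields for
# similar organs are compressed into a few principal modes, and a new
# case is approximated as mean + weighted modes (synthetic stand-in data).
rng = np.random.default_rng(1)
n_cases, n_dof = 50, 300                  # training FE solutions, mesh DOFs
U = rng.normal(size=(n_cases, n_dof))     # stand-in for FE displacement fields

mean = U.mean(axis=0)
X = U - mean
_, s, Vt = np.linalg.svd(X, full_matrices=False)   # principal modes
k = 5                                     # retain the dominant modes
modes = Vt[:k]

# Reconstruct one field from only k coefficients instead of a full FE solve
coeffs = (U[0] - mean) @ modes.T
approx = mean + coeffs @ modes
err = np.linalg.norm(U[0] - approx) / np.linalg.norm(U[0])
print(f"relative reconstruction error with {k} modes: {err:.2f}")
```

With purely random training data, a few modes capture little variance, so the error here is large; real organ geometries have strong low-rank structure, which is exactly the "limited variability" the abstract relies on for its speedup.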
Applied multivariate statistical analysis
Härdle, Wolfgang Karl
2015-01-01
Focusing on high-dimensional applications, this 4th edition presents the tools and concepts used in multivariate data analysis in a style that is also accessible for non-mathematicians and practitioners. It surveys the basic principles and emphasizes both exploratory and inferential statistics; a new chapter on Variable Selection (Lasso, SCAD and Elastic Net) has also been added. All chapters include practical exercises that highlight applications in different multivariate data analysis fields: in quantitative financial studies, where the joint dynamics of assets are observed; in medicine, where recorded observations of subjects in different locations form the basis for reliable diagnoses and medication; and in quantitative marketing, where consumers’ preferences are collected in order to construct models of consumer behavior. All of these examples involve high to ultra-high dimensions and represent a number of major fields in big data analysis. The fourth edition of this book on Applied Multivariate ...
Statistical physics "Beyond equilibrium"
Energy Technology Data Exchange (ETDEWEB)
Ecke, Robert E [Los Alamos National Laboratory
2009-01-01
The scientific challenges of the 21st century will increasingly involve competing interactions, geometric frustration, spatial and temporal intrinsic inhomogeneity, nanoscale structures, and interactions spanning many scales. We will focus on a broad class of emerging problems that will require new tools in non-equilibrium statistical physics and that will find application in new material functionality, in predicting complex spatial dynamics, and in understanding novel states of matter. Our work will encompass materials under extreme conditions involving elastic/plastic deformation, competing interactions, intrinsic inhomogeneity, frustration in condensed matter systems, scaling phenomena in disordered materials from glasses to granular matter, quantum chemistry applied to nano-scale materials, soft-matter materials, and spatio-temporal properties of both ordinary and complex fluids.
Statistical and quantitative research
International Nuclear Information System (INIS)
Anon.
1984-01-01
Environmental impacts may escape detection if the statistical tests used to analyze data from field studies are inadequate or the field design is not appropriate. To alleviate this problem, PNL scientists are doing theoretical research which will provide the basis for new sampling schemes or better methods to analyze and present data. Such efforts have resulted in recommendations about the optimal size of study plots, sampling intensity, field replication, and program duration. Costs associated with any of these factors can be substantial if, for example, attention is not paid to the adequacy of a sampling scheme. In the study of dynamics of large-mammal populations, the findings are sometimes surprising. For example, the survival of a grizzly bear population may hinge on the loss of one or two adult females per year
International Nuclear Information System (INIS)
Arnold, V.I.
2006-03-01
To describe the topological structure of a real smooth function one associates to it the graph, formed by the topological variety, whose points are the connected components of the level hypersurface of the function. For a Morse function, such a graph is a tree. Generically, it has T triple vertices, T + 2 endpoints, 2T + 2 vertices and 2T + 1 arrows. The main goal of the present paper is to study the statistics of the graphs, corresponding to T triple points: what is the growth rate of the number φ(T) of different graphs? Which part of these graphs is representable by the polynomial functions of corresponding degree? A generic polynomial of degree n has at most (n - 1)^2 critical points on R^2, corresponding to 2T + 2 = (n - 1)^2 + 1, that is to T = 2k(k - 1) saddle-points for degree n = 2k.
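The counts quoted above are easy to verify numerically; a quick arithmetic check of the tree counts and of the relation T = 2k(k - 1):

```python
# Check the counts for Morse-function graphs: a tree with T triple vertices
# has T + 2 endpoints, hence 2T + 2 vertices and 2T + 1 edges ("arrows"),
# and a generic polynomial of degree n = 2k gives T = 2k(k - 1) saddles.
for k in range(1, 6):
    n = 2 * k
    # 2T + 2 = (n - 1)^2 + 1  =>  T = ((n - 1)^2 - 1) / 2
    T = ((n - 1) ** 2 - 1) // 2
    assert T == 2 * k * (k - 1), (k, T)
    vertices, edges = 2 * T + 2, 2 * T + 1
    assert vertices == edges + 1          # tree property: |V| = |E| + 1
    print(f"n={n}: T={T}, vertices={vertices}, edges={edges}")
```

For example, degree n = 10 (k = 5) gives T = ((9)^2 - 1)/2 = 40 = 2*5*4, confirming the formula.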
Statistical physics of vaccination
Wang, Zhen; Bauch, Chris T.; Bhattacharyya, Samit; d'Onofrio, Alberto; Manfredi, Piero; Perc, Matjaž; Perra, Nicola; Salathé, Marcel; Zhao, Dawei
2016-12-01
Historically, infectious diseases caused considerable damage to human societies, and they continue to do so today. To help reduce their impact, mathematical models of disease transmission have been studied to help understand disease dynamics and inform prevention strategies. Vaccination, one of the most important preventive measures of modern times, is of great interest both theoretically and empirically. And in contrast to traditional approaches, recent research increasingly explores the pivotal implications of individual behavior and heterogeneous contact patterns in populations. Our report reviews the developmental arc of theoretical epidemiology with emphasis on vaccination, as it led from classical models assuming homogeneously mixing (mean-field) populations and ignoring human behavior, to recent models that account for behavioral feedback and/or population spatial/social structure. Many of the methods used originated in statistical physics, such as lattice and network models, and their associated analytical frameworks. Similarly, the feedback loop between vaccinating behavior and disease propagation forms a coupled nonlinear system with analogs in physics. We also review the new paradigm of digital epidemiology, wherein sources of digital data such as online social media are mined for high-resolution information on epidemiologically relevant individual behavior. Armed with the tools and concepts of statistical physics, and further assisted by new sources of digital data, models that capture nonlinear interactions between behavior and disease dynamics offer a novel way of modeling real-world phenomena, and can help improve health outcomes. We conclude the review by discussing open problems in the field and promising directions for future research.
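The mean-field baseline the review starts from can be sketched compactly. Below is a minimal SIR model with a vaccinated fraction p, integrated by forward Euler; beta, gamma and p are illustrative values, not taken from the paper:

```python
# Minimal homogeneously mixing SIR model with vaccination: a fraction p
# is vaccinated (placed directly in R), reducing the effective
# reproduction number from beta/gamma to (beta/gamma) * (1 - p).
beta, gamma, p = 0.3, 0.1, 0.4       # transmission, recovery, vaccine uptake
dt, steps = 0.1, 2000
S, I, R = 1.0 - p - 1e-3, 1e-3, p    # vaccinated individuals start in R

for _ in range(steps):
    new_inf = beta * S * I * dt      # S -> I
    rec = gamma * I * dt             # I -> R
    S, I, R = S - new_inf, I + new_inf - rec, R + rec

R0_eff = beta / gamma * (1 - p)      # effective reproduction number
print(f"effective R0 = {R0_eff:.2f}, final R fraction = {R:.3f} (incl. vaccinated)")
```

With these numbers R0_eff = 3 * 0.6 = 1.8 > 1, so an outbreak still occurs; raising p above 1 - 1/R0 ≈ 0.67 would push R0_eff below 1 (herd immunity). The review's subject is what happens when p is not a fixed parameter but feeds back on perceived disease risk.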
Energy Technology Data Exchange (ETDEWEB)
NONE
2005-07-01
The 2003 reporting to the ISAG comprises 403 plants owned by 273 enterprises. In 2002, reports covered 407 plants owned by 296 enterprises. Waste generation in 2003 is compared to targets from 2008 in the government's Waste Strategy 2005-2008. The following can be said to summarise waste generation in 2003: 1) In 2003, total reported waste arisings amounted to 12,835,000 tonnes, which is 270,000 tonnes, or 2 per cent, less than in 2002. 2) If amounts of residues from coal-fired power plants are excluded from statistics, waste arisings in 2003 were 11,597,000 tonnes, which is a 2 per cent increase from 2002. 3) If amounts of residues from coal-fired power plants and waste from the building and construction sector are excluded from statistics, total waste generation in 2003 amounted to 7,814,000 tonnes, which is 19,000 tonnes, or 1 per cent, less than in 2002. In other words, there has been a fall in total waste arisings, if residues and waste from building and construction are excluded. 4) The overall rate of recycling amounted to 66 per cent, which is one percentage point above the overall recycling target of 65 per cent for 2008. In 2002 the total rate of recycling was 64 per cent. 5) The total amount of waste led to incineration amounted to 26 per cent, plus an additional 1 per cent left in temporary storage to be incinerated at a later time. The 2008 target for incineration is 26 per cent. These are the same percentage figures as applied to incineration and storage in 2002. 6) The total amount of waste led to landfills amounted to 8 per cent, which is one percentage point below the overall landfill target of a maximum of 9 per cent landfilling in 2008. In 2002, 9 per cent was led to landfill. 7) The targets for treatment of waste from individual sectors are still not being met: too little waste from households and the service sector is being recycled, and too much waste from industry is being led to landfill. (au)
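The year-on-year comparison in point 1 can be reproduced directly; a quick arithmetic check in Python, with the figures copied from the abstract:

```python
# Cross-check the headline 2003 waste figures quoted above (tonnes).
total_2003 = 12_835_000
drop = 270_000                          # "270,000 tonnes, or 2 per cent, less"
total_2002 = total_2003 + drop          # implied 2002 total
pct_drop = drop / total_2002 * 100
print(f"implied 2002 total: {total_2002:,} t; drop: {pct_drop:.1f} %")
```

The implied 2002 total is 13,105,000 tonnes and the drop is about 2.1 per cent, consistent with the rounded "2 per cent" in the text.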